Recent works:
   - Sampled softmax with random Fourier features, NeurIPS 2019 [arXiv]
   - AdaCliP: Adaptive Clipping for Private SGD, [arXiv]
   - Learning a compressed sensing measurement matrix via gradient unrolling, ICML 2019 [arXiv]
   - Stochastic Negative Mining for learning with large output spaces, AISTATS 2019 [arXiv]
Recent works:
   - cpSGD: Communication-efficient and differentially-private distributed SGD, NeurIPS 2018 spotlight [arXiv]
   - Heated-up softmax embedding [arXiv:1809.04157]
   - Loss decomposition for fast learning in large output spaces, ICML 2018 [PDF]
   - Multiscale quantization for fast similarity search, NIPS 2017 [PDF]
   - Lattice rescoring strategies for long short term memory language models in speech recognition, ASRU 2017 [arXiv]
   - Learning spread-out local feature descriptors, ICCV 2017 [PDF]
   - On binary embedding using circulant matrices, JMLR [PDF]
• A Google Research blog post was released on communication-efficient Federated Learning [Google blog] [Forbes]:

Federated Learning enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on device... We developed a novel way to reduce upload communication costs up to 100x by compressing updates using random rotations and quantization.
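The compression idea mentioned above can be illustrated with a minimal NumPy sketch: rotate the client's update by a random orthogonal matrix shared with the server (via a common seed), quantize each coordinate to a few bits with unbiased stochastic rounding, and undo the rotation server-side. This is an illustrative simplification, not the exact algorithm from the paper; the function names, the QR-based rotation, and the 4-bit setting are all assumptions for demonstration.

```python
import numpy as np

def random_rotation(d, seed=0):
    # A shared seed lets client and server derive the same orthogonal matrix.
    # (Illustrative: the paper uses structured rotations for efficiency.)
    rng = np.random.default_rng(seed)
    q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    return q

def quantize(v, bits=4, seed=1):
    # Uniform quantization over [min, max] with stochastic rounding,
    # which keeps the quantizer unbiased in expectation.
    lo, hi = v.min(), v.max()
    levels = 2 ** bits - 1
    scaled = (v - lo) / (hi - lo) * levels
    floor = np.floor(scaled)
    rng = np.random.default_rng(seed)
    q = floor + (rng.random(v.shape) < (scaled - floor))
    return q, lo, hi, levels

def dequantize(q, lo, hi, levels):
    return lo + q / levels * (hi - lo)

# Client side: rotate the update, then quantize; only q (a few bits per
# coordinate) plus the scalars lo/hi need to be uploaded.
d = 64
x = np.random.default_rng(2).standard_normal(d)   # the model update
R = random_rotation(d)
q, lo, hi, levels = quantize(R @ x)

# Server side: dequantize and apply the inverse (transpose) rotation.
x_hat = R.T @ dequantize(q, lo, hi, levels)
err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
```

Rotating before quantizing spreads the signal's energy evenly across coordinates, which shrinks the dynamic range each coordinate must cover and therefore the quantization error for a fixed bit budget.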

• The paper summarizing the quantization approach used in Federated Learning, Distributed mean estimation with limited communication, is accepted at ICML 2017 [arXiv]
• Learning discriminative and transformation covariant local feature detectors accepted at CVPR 2017
• Named an outstanding reviewer for NIPS 2016
Recent works:
   - Orthogonal random features accepted as a full oral presentation at NIPS 2016 [arXiv]
   - Distributed mean estimation with limited communication [arXiv]
   - Federated Learning: strategies for improving communication efficiency [arXiv]
   - Learning battery consumption of mobile devices [pdf]
Co-organized workshop and conference:
   - NIPS 2015 Workshop on Learning and Privacy with Incomplete Data and Weak Supervision.
   - International Conference on Multimedia Retrieval (ICMR) 2016.
I defended my PhD and joined Google Research in June 2015.
Recently accepted papers:
   - NIPS 2015 (Spotlight): Spherical Random Features for Polynomial Kernels
   - ICCV 2015: An Exploration of Parameter Redundancy in Deep Networks with Circulant Projections
   - ICCV 2015: Fast Orthogonal Projection Based on Kronecker Product
Recent ongoing works available on arXiv:
   - arXiv:1503.03893: Compact Nonlinear Maps and Circulant Extensions
   - arXiv:1503.00591: Deep Transfer Network: Unsupervised Domain Adaptation
   - arXiv:1502.03436: Fast Neural Networks with Circulant Projections
   - arXiv:1402.5902: On Learning from Label Proportions (v2)
Photo credit: Eileen Barroso

About Me

Felix X. Yu is a Research Scientist at Google, New York, currently working on large-scale machine learning. He received his Ph.D. from the Department of Electrical Engineering, Columbia University, in 2015, and his B.S. from the Department of Electronic Engineering, Tsinghua University, China, in 2010. [Google Research] [Google Scholar] [LinkedIn]

Research Interests

Large-scale machine learning:
    - Distributed learning with communication efficiency and differential privacy
    - Learning with large output spaces
    - Fast and memory-efficient neural networks
    - Computer vision applications

Foundations of machine learning:
    - Structured matrices
    - Random features, dimensionality reduction, and binary embedding