• A Google Research blog post was released on communication-efficient Federated Learning [Google blog] [Forbes]:

Federated Learning enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on device... We developed a novel way to reduce upload communication costs up to 100x by compressing updates using random rotations and quantization.

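As a rough illustration of the rotate-then-quantize idea in the quote above, below is a minimal numpy sketch of distributed mean estimation with a shared random rotation and unbiased stochastic quantization. It is not the actual implementation: the dense QR-based rotation, the 1-bit quantizer, and all names are illustrative assumptions (the papers describe structured, Hadamard-based rotations and k-level quantization).

    import numpy as np

    def quantize(v, rng, levels=2):
        # Unbiased stochastic quantization of v onto `levels` evenly
        # spaced values between min(v) and max(v).
        lo, hi = v.min(), v.max()
        step = (hi - lo) / (levels - 1)
        x = (v - lo) / step
        round_up = rng.random(v.shape) < (x - np.floor(x))
        return lo + (np.floor(x) + round_up) * step

    d, n_clients = 1024, 10
    rng = np.random.default_rng(0)
    # Shared random rotation (dense here only for illustration; a structured rotation is cheaper).
    R, _ = np.linalg.qr(rng.standard_normal((d, d)))
    updates = [rng.standard_normal(d) for _ in range(n_clients)]

    # Each client sends a rotated, coarsely quantized update; the server
    # averages the received vectors and applies the inverse rotation.
    compressed = [quantize(R @ u, rng) for u in updates]
    estimate = R.T @ np.mean(compressed, axis=0)

    true_mean = np.mean(updates, axis=0)
    print(np.linalg.norm(estimate - true_mean) / np.linalg.norm(true_mean))

Using more quantization levels trades upload size for estimation error; the rotation spreads the signal across coordinates, so coarse quantization loses less.
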
• The paper summarizing the quantization approach used in Federated Learning, Distributed mean estimation with limited communication, was accepted at ICML 2017 [arXiv]
• Learning discriminative and transformation covariant local feature detectors accepted at CVPR 2017
• Named an outstanding reviewer for NIPS 2016
Recent work:
   - Orthogonal random features accepted as a full oral presentation at NIPS 2016 [arXiv]
   - Distributed mean estimation with limited communication [arXiv]
   - Federated Learning: strategies for improving communication efficiency [arXiv]
   - Learning battery consumption of mobile devices [pdf]
Co-organized a workshop and a conference:
   - NIPS 2015 Workshop on Learning and Privacy with Incomplete Data and Weak Supervision.
   - International Conference on Multimedia Retrieval (ICMR) 2016.
I defended my PhD and joined Google Research in June 2015.
Recently accepted papers:
   - NIPS 2015 (Spotlight): Spherical Random Features for Polynomial Kernels
   - ICCV 2015: An Exploration of Parameter Redundancy in Deep Networks with Circulant Projections
   - ICCV 2015: Fast Orthogonal Projection Based on Kronecker Product
Recent ongoing work available on arXiv:
   - arXiv:1503.03893: Compact Nonlinear Maps and Circulant Extensions
   - arXiv:1503.00591: Deep Transfer Network: Unsupervised Domain Adaptation
   - arXiv:1502.03436: Fast Neural Networks with Circulant Projections
   - arXiv:1402.5902: On Learning from Label Proportions (v2)
Photo credit: Eileen Barroso

About Me

Felix X. Yu is a Research Scientist at Google, New York. He is currently working on large-scale machine learning for/on mobile devices. Felix received his Ph.D. from the Department of Electrical Engineering, Columbia University, in 2015, and his B.S. from the Department of Electronic Engineering, Tsinghua University, China, in 2010.

[Google Research] [Google Scholar] [LinkedIn]


Research Interests

Large-scale machine learning for/on mobile:
    Fast and memory-efficient neural networks
    Distributed optimization
    Computer vision applications

Foundations of machine learning:
    Structured matrices
    Random features, dimensionality reduction, binary embedding