Federated Evaluation of On-device Personalization

10/22/2019
by Kangkang Wang et al.

Federated learning is a distributed, on-device computation framework that enables training global models without exporting sensitive user data to servers. In this work, we describe methods to extend the federation framework to evaluate strategies for personalization of global models. We present tools to analyze the effects of personalization and evaluate conditions under which personalization yields desirable models. We report on our experiments personalizing a language model for a virtual keyboard for smartphones with a population of tens of millions of users. We show that a significant fraction of users benefit from personalization.
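The core evaluation idea in the abstract, fine-tuning a shared global model on each user's device and measuring what fraction of users end up with a better model, can be illustrated with a minimal, self-contained simulation. This is not the paper's actual method or data: the linear classifier, client distributions, and hyperparameters below are all illustrative assumptions.

```python
import math
import random

def accuracy(w, b, data):
    # Fraction of examples classified correctly by sign(w*x + b).
    return sum((w * x + b > 0) == y for x, y in data) / len(data)

def fine_tune(w, b, data, lr=0.1, epochs=20):
    # A few epochs of logistic-loss SGD on the client's local data,
    # standing in for on-device personalization of the global model.
    for _ in range(epochs):
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            g = p - y
            w -= lr * g * x
            b -= lr * g
    return w, b

random.seed(0)
gw, gb = 1.0, 0.0  # a global model trained elsewhere (fixed here)

improved = 0
num_clients = 50
for _ in range(num_clients):
    # Each client draws data from a slightly shifted distribution,
    # so the global decision boundary is suboptimal for some clients.
    shift = random.gauss(0.0, 1.0)
    data = [(random.gauss(shift, 1.0),) for _ in range(40)]
    data = [(x, 1 if x > shift else 0) for (x,) in data]
    train, test = data[:30], data[30:]

    before = accuracy(gw, gb, test)          # global model, held-out data
    pw, pb = fine_tune(gw, gb, train)        # personalize on local data
    after = accuracy(pw, pb, test)           # personalized model
    if after > before:
        improved += 1

print(f"fraction of clients improved: {improved / num_clients:.2f}")
```

In the federated setting this comparison runs on-device, and only the aggregate pre- versus post-personalization metrics leave the phone; no raw user data is exported.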

Related research

- Federated Learning for Mobile Keyboard Prediction (11/08/2018)
- Federated Learning of N-gram Language Models (10/08/2019)
- Multi-Center Federated Learning (05/03/2020)
- Zero-Shot Federated Learning with New Classes for Audio Classification (06/18/2021)
- Federated Self-Training for Semi-Supervised Audio Recognition (07/14/2021)
- Certified Robustness in Federated Learning (06/06/2022)
- Federated Learning Of Out-Of-Vocabulary Words (03/26/2019)
