
Federated Learning for Mobile Keyboard Prediction

by Andrew Hard, et al.

We train a recurrent neural network language model using a distributed, on-device learning framework called federated learning for the purpose of next-word prediction in a virtual keyboard for smartphones. Server-based training using stochastic gradient descent is compared with training on client devices using the Federated Averaging algorithm. The federated algorithm, which enables training on a higher-quality dataset for this use case, is shown to achieve better prediction recall. This work demonstrates the feasibility and benefit of training language models on client devices without exporting sensitive user data to servers. The federated learning environment gives users greater control over their data and simplifies the task of incorporating privacy by default with distributed training and aggregation across a population of client devices.
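The abstract contrasts centralized server-side SGD with on-device training aggregated by the Federated Averaging algorithm. The following is a minimal sketch of that aggregation idea on a toy linear model in NumPy; the paper itself trains a recurrent language model, and the client data, learning rate, and round counts here are illustrative assumptions, not values from the paper.

```python
import numpy as np

def local_train(w, X, y, lr=0.1, epochs=5):
    """Client-side step: a few full-batch gradient descent epochs
    on this client's private data (toy least-squares objective)."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_averaging(w, client_data, rounds=20):
    """Server-side loop: each round, every client trains locally
    from the current global weights, then the server replaces the
    global model with the example-count-weighted average of the
    returned client weights. Raw data never leaves the clients."""
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in client_data:
            updates.append(local_train(w, X, y))
            sizes.append(len(y))
        total = float(sum(sizes))
        w = sum(n * u for n, u in zip(sizes, updates)) / total
    return w
```

In this sketch only model weights travel between clients and server, which is the property the paper relies on to avoid exporting sensitive user data; production systems additionally sample a subset of eligible devices per round and apply secure aggregation.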


Federated Learning for Emoji Prediction in a Mobile Keyboard

We show that a word-level recurrent neural network can predict emoji fro...

Federated Learning of N-gram Language Models

We propose algorithms to train production-quality n-gram language models...

Federated Evaluation of On-device Personalization

Federated learning is a distributed, on-device computation framework tha...

Device Heterogeneity in Federated Learning: A Superquantile Approach

We propose a federated learning framework to handle heterogeneous client...

Federated Learning Of Out-Of-Vocabulary Words

We demonstrate that a character-level recurrent neural network is able t...

Federated Learning-Based Risk-Aware Decision to Mitigate Fake Task Impacts on Crowdsensing Platforms

Mobile crowdsensing (MCS) leverages distributed and non-dedicated sensin...

Blockchain-based Federated Learning for Device Failure Detection in Industrial IoT

Device failure detection is one of the most essential problems in industrial...