Training Production Language Models without Memorizing User Data

by Swaroop Ramaswamy et al.

This paper presents the first consumer-scale next-word prediction (NWP) model trained with Federated Learning (FL) using the Differentially Private Federated Averaging (DP-FedAvg) technique. Prior work has built practical FL infrastructure, including demonstrations that language models can be trained on mobile devices using such infrastructure, and has shown in simulations on a public corpus that NWP models can be trained with user-level differential privacy via the DP-FedAvg algorithm. Nevertheless, training production-quality NWP models with DP-FedAvg in a real-world production environment on a heterogeneous fleet of mobile phones raises numerous challenges: for instance, the coordinating central server must track which devices are available at the start of each round and sample devices uniformly at random from that pool, all while keeping the sample secret. Unlike all prior privacy-focused FL work of which we are aware, we demonstrate for the first time the deployment of a differentially private mechanism for training a production neural network in FL, as well as the instrumentation of the production training infrastructure to perform an end-to-end empirical measurement of unintended memorization.
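The core server-side step of DP-FedAvg can be summarized as: clip each sampled user's model delta to a fixed L2 norm bound, average the clipped deltas, and add Gaussian noise calibrated to that bound. The sketch below illustrates this with NumPy; the function and parameter names (`dp_fedavg_round`, `clip_norm`, `noise_multiplier`) are illustrative assumptions, not identifiers from the paper or its production system.

```python
import numpy as np

def dp_fedavg_round(user_updates, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One server-side aggregation step of DP-FedAvg (illustrative sketch).

    Each user's model delta is clipped to an L2 norm bound so that no
    single user can dominate the aggregate; the clipped deltas are
    averaged and Gaussian noise proportional to the clip bound is added.
    """
    rng = rng or np.random.default_rng()
    clipped = []
    for delta in user_updates:
        norm = np.linalg.norm(delta)
        scale = min(1.0, clip_norm / max(norm, 1e-12))  # shrink only if over the bound
        clipped.append(np.asarray(delta) * scale)
    avg = np.mean(clipped, axis=0)
    # Per-user sensitivity of the average is clip_norm / n, so the noise
    # standard deviation is noise_multiplier * clip_norm / n.
    n = len(user_updates)
    noise = rng.normal(0.0, noise_multiplier * clip_norm / n, size=avg.shape)
    return avg + noise
```

With `noise_multiplier=0.0` the function reduces to plain clipped averaging, which makes the clipping behavior easy to check in isolation; in the deployed system described above, the uniform-at-random device sampling and secrecy of the sample are additional requirements handled by the coordinating server, not shown here.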
