TOFU: Towards Obfuscated Federated Updates by Encoding Weight Updates into Gradients from Proxy Data

01/21/2022
by   Isha Garg, et al.

Advances in Federated Learning and an abundance of user data have enabled rich collaborative learning between multiple clients without sharing user data. Learning is aggregated by a central server in the form of weight updates. However, this comes at the cost of repeated, expensive communication between the clients and the server, and raises concerns about compromised user privacy: the inversion of gradients into the data that generated them is termed data leakage. Encryption techniques can counter this leakage, but at added expense. To address these challenges of communication efficiency and privacy, we propose TOFU, a novel algorithm that generates proxy data encoding each client's weight update in its gradients. This proxy data is shared instead of the weight updates. Since input data is of far lower dimensional complexity than weights, this encoding lets us send much less data per communication round. Additionally, the proxy data resembles noise, so even perfect reconstruction by a data-leakage attack would invert the decoded gradients into unrecognizable noise, enhancing privacy. We show that TOFU enables learning with less than 1% and 7% accuracy drops on MNIST and CIFAR-10, respectively. This drop can be recovered via a few rounds of expensive encrypted gradient exchange, enabling learning to near-full accuracy in a federated setup while being 4x and 6.6x more communication efficient than the standard Federated Averaging algorithm on MNIST and CIFAR-10, respectively.
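The encoding step lends itself to a short sketch. Below is a minimal, hypothetical PyTorch rendering of the idea: a client that already holds a weight update `target_update` optimizes a small batch of noise-like proxy inputs and soft labels so that the gradient of an ordinary training loss on that batch aligns with the update. The function name, the cosine-similarity objective, and all hyperparameters here are illustrative assumptions, not the paper's exact formulation.

```python
# Hedged sketch of TOFU-style encoding: learn a proxy batch whose
# gradients approximate a known weight update. Names and objective
# are assumptions for illustration only.
import torch
import torch.nn.functional as F

def encode_update_as_proxy(model, target_update, n_proxy=8, in_dim=784,
                           n_classes=10, steps=500, lr=0.1):
    """Optimize a small proxy batch (x, y) so that the gradient of the
    loss on that batch, taken at the current shared weights, aligns
    with target_update (a list of tensors shaped like the parameters)."""
    x = torch.randn(n_proxy, in_dim, requires_grad=True)     # noise-like proxy inputs
    y = torch.randn(n_proxy, n_classes, requires_grad=True)  # learnable soft labels
    opt = torch.optim.Adam([x, y], lr=lr)
    params = [p for p in model.parameters() if p.requires_grad]
    flat_target = torch.cat([u.flatten() for u in target_update])

    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(model(x), y.softmax(dim=1))
        # create_graph=True lets us differentiate through the gradient.
        grads = torch.autograd.grad(loss, params, create_graph=True)
        flat_grad = torch.cat([g.flatten() for g in grads])
        # Maximize alignment between the proxy gradient and the true update.
        obj = 1.0 - F.cosine_similarity(flat_grad, flat_target, dim=0)
        obj.backward()  # only x and y are stepped; model weights stay fixed
        opt.step()
    return x.detach(), y.detach()
```

On the server side, decoding would amount to recomputing the gradient of the same loss on the received proxy batch at the shared weights and applying it as the client's update; since the proxy inputs resemble noise, a gradient-inversion attack can at best recover that noise rather than real user data.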
