Multi-hop Federated Private Data Augmentation with Sample Compression

07/15/2019
by Eunjeong Jeong, et al.

On-device machine learning (ML) makes a tremendous amount of user data accessible for training while keeping each user's local data private rather than storing it in a central entity. However, guaranteeing this privacy inevitably costs each device some data quality or learning performance, especially when its training dataset is non-IID. In this paper, we propose a data augmentation framework using a generative model: multi-hop federated augmentation with sample compression (MultFAug). A multi-hop protocol speeds up the end-to-end over-the-air transmission of seed samples by enhancing the transport capacity. The relaying devices also strengthen privacy preservation, since the origin of each seed sample is hidden among the participants. For further privatization at the individual-sample level, the devices compress their data samples: each device sparsifies its samples before transmission, which reduces the sample size and thus the communication payload. This preprocessing also strengthens the privacy of each sample, acting as an input perturbation. Numerical evaluations show that the proposed framework significantly improves the privacy guarantee, transmission delay, and local training performance by adjusting the number of hops and the compression rate.
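The abstract names two mechanisms (per-sample sparsification and multi-hop relaying) without specifying their details. The sketch below illustrates both under assumed specifics that are not from the paper: magnitude-based top-k sparsification as the compression step (with `compression_rate` taken as the fraction of entries zeroed out), and a relay chain that re-shuffles the seed pool at each hop so the final receiver cannot attribute a sample to its originating device. The function names and the top-k policy are illustrative assumptions, not MultFAug's exact scheme.

```python
import numpy as np


def sparsify(sample, compression_rate):
    """Zero out all but the largest-magnitude entries of a sample.

    `compression_rate` is interpreted here as the fraction of entries
    dropped; higher rates mean smaller payloads and stronger input
    perturbation. This top-k policy is an illustrative assumption.
    """
    flat = sample.flatten()
    keep = max(1, int(round(len(flat) * (1.0 - compression_rate))))
    idx = np.argsort(np.abs(flat))[-keep:]   # indices of the `keep` largest magnitudes
    mask = np.zeros_like(flat)
    mask[idx] = 1.0
    return (flat * mask).reshape(sample.shape)


def relay_chain(seed_samples, num_hops, rng):
    """Pass the seed pool through `num_hops` relays, each of which
    re-mixes the pool so sample origins are hidden among participants."""
    pool = list(seed_samples)
    for _ in range(num_hops):
        rng.shuffle(pool)                    # each relay shuffles before forwarding
    return pool
```

With `compression_rate=0.5` on a 4-entry sample, `sparsify` keeps the two largest-magnitude entries and zeroes the rest; `relay_chain` only reorders the pool, so its contents are preserved while the origin of each sample is obscured.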


Related research

- Communication-Efficient On-Device Machine Learning: Federated Distillation and Augmentation under Non-IID Private Data (11/28/2018). On-device machine learning (ML) enables the training process to exploit ...
- XOR Mixup: Privacy-Preserving Data Augmentation for One-Shot Federated Learning (06/09/2020). User-generated data distributions are often imbalanced across devices an...
- Sparsified Privacy-Masking for Communication-Efficient and Privacy-Preserving Federated Learning (08/01/2020). Federated learning has received significant interests recently due to it...
- AirMixML: Over-the-Air Data Mixup for Inherently Privacy-Preserving Edge Machine Learning (05/02/2021). Wireless channels can be inherently privacy-preserving by distorting the...
- Privacy Preserving Distributed Machine Learning with Federated Learning (04/25/2020). Edge computing and distributed machine learning have advanced to a level...
