OCTOPUS: Overcoming Performance and Privatization Bottlenecks in Distributed Learning

05/03/2021
by Shuo Wang, et al.

The diversity and volume of data gathered from distributed devices such as mobile phones can enhance the success and robustness of machine learning algorithms. Federated learning enables distributed participants to collaboratively learn a commonly shared model while holding data locally. However, it also incurs expensive communication and suffers from the heterogeneity of distributed data sources and the lack of access to global data. In this paper, we investigate a practical distributed learning scenario where multiple downstream tasks (e.g., classifiers) are learned from dynamically updated and non-iid distributed data sources, efficiently and with local privatization. We introduce a new distributed learning scheme that addresses communication overhead via latent compression, leveraging global data while privatizing local data without the additional cost of encryption or perturbation. The scheme divides learning into (1) informative feature encoding, which extracts and transmits a compressed latent-space representation of the local data at each node to reduce communication overhead, and (2) downstream tasks centralized at the server, which are trained on the encoded codes gathered from all nodes to reduce computing and storage overhead. In addition, a disentanglement strategy is applied to privatize sensitive components of the local data. Extensive experiments are conducted on image and speech datasets. The results demonstrate that downstream tasks trained on the compact latent representations achieve accuracy comparable to centralized learning while privatizing local data.
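As a rough illustration of the two-stage workflow described in the abstract, the sketch below (in PyTorch; all module names, dimensions, and training details are hypothetical stand-ins, not the authors' implementation) shows a node-side encoder that compresses local samples into compact latent codes, and a server-side classifier trained centrally on the pooled codes it receives.

```python
# Hypothetical sketch of the two-stage scheme: nodes transmit only compact latent
# codes, and the server trains the downstream task on the pooled codes.
import torch
import torch.nn as nn

LATENT_DIM = 32  # illustrative latent size; the paper's actual settings may differ

class NodeEncoder(nn.Module):
    """Runs locally on each node; maps raw inputs to compact latent codes."""
    def __init__(self, in_dim=784, latent_dim=LATENT_DIM):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class ServerClassifier(nn.Module):
    """Trained centrally on the latent codes gathered from all nodes."""
    def __init__(self, latent_dim=LATENT_DIM, num_classes=10):
        super().__init__()
        self.head = nn.Linear(latent_dim, num_classes)

    def forward(self, z):
        return self.head(z)

# --- Stage 1: each node encodes its local data and transmits only the codes ---
encoder = NodeEncoder()
node_batches = [torch.randn(64, 784) for _ in range(3)]      # stand-in for non-iid local data
node_labels  = [torch.randint(0, 10, (64,)) for _ in range(3)]

with torch.no_grad():
    transmitted_codes = [encoder(x) for x in node_batches]    # only LATENT_DIM floats per sample leave a node

# --- Stage 2: the server trains the downstream task on the pooled codes ---
codes, labels = torch.cat(transmitted_codes), torch.cat(node_labels)
clf = ServerClassifier()
opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for _ in range(5):
    opt.zero_grad()
    loss = loss_fn(clf(codes), labels)
    loss.backward()
    opt.step()
```

Only the low-dimensional codes cross the network, which is the mechanism the scheme relies on to cut communication cost; the disentanglement step that privatizes sensitive components of the local data is omitted from this sketch for brevity.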


