SLSGD: Secure and Efficient Distributed On-device Machine Learning

03/16/2019
by Cong Xie, et al.

We consider distributed on-device learning with limited communication and security requirements. We propose a new robust distributed optimization algorithm with efficient communication and attack tolerance. The proposed algorithm has provable convergence and robustness under non-IID settings. Empirical results show that the proposed algorithm stabilizes convergence and tolerates data poisoning on a small number of workers.
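The abstract does not spell out the update rule, so the sketch below is only a rough illustration of the kind of algorithm described: local SGD on each worker, a coordinate-wise trimmed-mean aggregation on the server to bound the influence of poisoned workers, and a moving-average mixing step for stability under non-IID data. The function names, the toy least-squares objective, and the hyperparameters (alpha, trim) are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def local_sgd(x, data, lr=0.1, steps=10, rng=None):
    """Hypothetical local worker update: a few SGD steps on a
    least-squares objective (a stand-in for the on-device model)."""
    rng = rng or np.random.default_rng()
    A, b = data  # A: (n_samples, dim), b: (n_samples,)
    for _ in range(steps):
        i = rng.integers(len(b))
        grad = (A[i] @ x - b[i]) * A[i]  # per-sample gradient
        x = x - lr * grad
    return x

def trimmed_mean(updates, trim=1):
    """Coordinate-wise trimmed mean: per coordinate, drop the `trim`
    largest and `trim` smallest values, then average the rest.
    Requires len(updates) > 2 * trim."""
    U = np.sort(np.stack(updates), axis=0)
    return U[trim:len(updates) - trim].mean(axis=0)

def one_round(x_global, worker_data, alpha=0.5, trim=1):
    """One communication round: broadcast the global model, run local
    training on every worker, then aggregate robustly."""
    local_models = [local_sgd(x_global.copy(), d) for d in worker_data]
    x_agg = trimmed_mean(local_models, trim=trim)
    # Moving-average mixing keeps the global model stable under non-IID data.
    return (1 - alpha) * x_global + alpha * x_agg
```

The trimming step is what gives the attack tolerance: with `trim = f`, up to `f` workers can return arbitrarily corrupted models without moving any coordinate of the aggregate outside the range of the honest workers' values, which matches the abstract's claim of tolerating data poisoning on a small number of workers.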

Related research

03/16/2019 · Practical Distributed Learning: Secure Machine Learning with Communication-Efficient Local Updates
Federated learning on edge devices poses new challenges arising from wor...

08/30/2019 · GADMM: Fast and Communication Efficient Framework for Distributed Machine Learning
When the data is distributed across multiple servers, efficient data exc...

03/10/2019 · Asynchronous Federated Optimization
Federated learning enables training on a massive number of edge devices....

08/20/2021 · Federated Distributionally Robust Optimization for Phase Configuration of RISs
In this article, we study the problem of robust reconfigurable intellige...

02/26/2019 · On Maintaining Linear Convergence of Distributed Learning and Optimization under Limited Communication
In parallel and distributed machine learning multiple nodes or processor...

06/30/2019 · Network-accelerated Distributed Machine Learning Using MLFabric
Existing distributed machine learning (DML) systems focus on improving t...

05/18/2015 · Graph Partitioning via Parallel Submodular Approximation to Accelerate Distributed Machine Learning
Distributed computing excels at processing large scale data, but the com...
