Practical Distributed Learning: Secure Machine Learning with Communication-Efficient Local Updates

03/16/2019
by Cong Xie et al.

Federated learning on edge devices poses new challenges, such as misbehaving workers and privacy requirements. We propose a new robust federated optimization algorithm with provable convergence and robustness under non-IID settings. Empirical results show that the proposed algorithm stabilizes convergence and tolerates data poisoning on a small number of workers.
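The abstract does not spell out the algorithm itself, but the title points at a familiar recipe: workers run several local SGD steps between communication rounds, and the server combines the resulting models with a robust aggregation rule. Below is a minimal sketch of that recipe, assuming a coordinate-wise trimmed mean as the robust aggregator and a least-squares loss on synthetic data; the function names, the data, and the trimmed-mean choice are all illustrative assumptions here, not the paper's verified procedure.

```python
import numpy as np

def local_update(w, X, y, lr=0.01, local_steps=10):
    """One worker: several SGD steps on its own data before communicating.
    A least-squares loss is used purely for illustration."""
    w = w.copy()
    rng = np.random.default_rng()
    for _ in range(local_steps):
        i = rng.integers(len(X))
        g = (X[i] @ w - y[i]) * X[i]   # gradient of 0.5 * (x.w - y)^2
        w -= lr * g
    return w

def trimmed_mean(updates, trim_k):
    """Server: coordinate-wise trimmed mean. Dropping the trim_k largest and
    trim_k smallest values in every coordinate keeps the average meaningful
    even if up to trim_k workers send poisoned updates."""
    stacked = np.sort(np.stack(updates), axis=0)   # shape: (n_workers, dim)
    return stacked[trim_k : len(updates) - trim_k].mean(axis=0)

# One communication round over synthetic, non-identical worker data
# (different means per worker mimic the non-IID setting).
rng = np.random.default_rng(0)
n_workers, dim = 10, 5
w_global = np.zeros(dim)
worker_data = [(rng.normal(loc=k, size=(50, dim)), rng.normal(size=50))
               for k in range(n_workers)]
updates = [local_update(w_global, X, y) for X, y in worker_data]
w_global = trimmed_mean(updates, trim_k=2)
print(w_global)
```

The two ingredients map onto the title: running local_steps > 1 cuts the number of communication rounds, and the trimmed mean supplies tolerance to a small number of poisoned workers. Again, this is a sketch of the general pattern under the stated assumptions, not the paper's exact method.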


Related research

03/16/2019 · SLSGD: Secure and Efficient Distributed On-device Machine Learning
We consider distributed on-device learning with limited communication an...

03/10/2019 · Asynchronous Federated Optimization
Federated learning enables training on a massive number of edge devices....

11/04/2021 · TEE-based Selective Testing of Local Workers in Federated Learning Systems
This paper considers a federated learning system composed of a central c...

08/20/2021 · Federated Distributionally Robust Optimization for Phase Configuration of RISs
In this article, we study the problem of robust reconfigurable intellige...

08/06/2019 · Motivating Workers in Federated Learning: A Stackelberg Game Perspective
Due to the large size of the training data, distributed learning approac...

12/09/2020 · Accurate and Fast Federated Learning via IID and Communication-Aware Grouping
Federated learning has emerged as a new paradigm of collaborative machin...

12/05/2022 · Adaptive Configuration for Heterogeneous Participants in Decentralized Federated Learning
Data generated at the network edge can be processed locally by leveragin...
