A simplified convergence theory for Byzantine resilient stochastic gradient descent

08/25/2022
by   Lindon Roberts, et al.

In distributed learning, a central server trains a model according to updates provided by nodes holding local data samples. In the presence of one or more malicious nodes sending incorrect information (a Byzantine adversary), standard algorithms for model training such as stochastic gradient descent (SGD) fail to converge. In this paper, we present a simplified convergence theory for the generic Byzantine Resilient SGD method originally proposed by Blanchard et al. [NeurIPS 2017]. Compared to the existing analysis, we show convergence to a stationary point in expectation under standard assumptions on the (possibly nonconvex) objective function and flexible assumptions on the stochastic gradients.
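The key idea of the generic method is to replace the server's plain average of worker gradients with a robust aggregation rule before taking the SGD step. Below is a minimal NumPy sketch, not the paper's code, of one such rule: Krum, the rule Blanchard et al. proposed alongside the framework. The toy objective, worker counts, noise levels, and step size are illustrative assumptions.

```python
import numpy as np

def krum(gradients, f):
    """Krum aggregation rule (Blanchard et al., NeurIPS 2017).

    Each worker gradient is scored by the summed squared distances to
    its n - f - 2 nearest neighbours; the lowest-scoring gradient is
    returned. Tolerates up to f Byzantine workers when n > 2f + 2.
    """
    n = len(gradients)
    assert n > 2 * f + 2, "Krum requires n > 2f + 2 workers"
    G = np.stack(gradients)  # (n, d) matrix of worker gradients
    # Pairwise squared Euclidean distances between worker gradients.
    dists = np.sum((G[:, None, :] - G[None, :, :]) ** 2, axis=-1)
    scores = np.empty(n)
    for i in range(n):
        d = np.sort(np.delete(dists[i], i))  # distances to the others
        scores[i] = d[: n - f - 2].sum()     # closest n - f - 2 neighbours
    return G[int(np.argmin(scores))]

# Toy round on f(x) = 0.5 * ||x||^2, whose gradient at x is x itself:
# 8 honest workers send noisy stochastic gradients, 2 Byzantine workers
# send arbitrary large vectors, and the server takes one robust SGD step.
rng = np.random.default_rng(0)
x = rng.normal(size=5)
honest = [x + 0.1 * rng.normal(size=5) for _ in range(8)]
byzantine = [100.0 * rng.normal(size=5) for _ in range(2)]
x = x - 0.1 * krum(honest + byzantine, f=2)
```

Because Krum selects a single gradient that sits close to a majority of the others, an arbitrary vector from a Byzantine worker cannot be selected, whereas a plain average would be dragged arbitrarily far by even one such vector.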


Related research

- Phocas: dimensional Byzantine-resilient stochastic gradient descent (05/23/2018)
  We propose a novel robust aggregation rule for distributed synchronous S...

- Federated Variance-Reduced Stochastic Gradient Descent with Robustness to Byzantine Attacks (12/29/2019)
  This paper deals with distributed finite-sum optimization for learning o...

- The Hidden Vulnerability of Distributed Learning in Byzantium (02/22/2018)
  While machine learning is going through an era of celebrated success, co...

- Byzantine-Resilient High-Dimensional SGD with Local Iterations on Heterogeneous Data (06/22/2020)
  We study stochastic gradient descent (SGD) with local iterations in the ...

- Data Encoding for Byzantine-Resilient Distributed Optimization (07/05/2019)
  We study distributed optimization in the presence of Byzantine adversari...

- BRIDGE: Byzantine-resilient Decentralized Gradient Descent (08/21/2019)
  Decentralized optimization techniques are increasingly being used to lea...
- Convergence of Online Adaptive and Recurrent Optimization Algorithms (05/12/2020)
  We prove local convergence of several notable gradient descent algorithms...
