Zeno: Byzantine-suspicious stochastic gradient descent

05/25/2018
by Cong Xie, et al.

We propose Zeno, a new robust aggregation rule for distributed synchronous Stochastic Gradient Descent (SGD) under a general Byzantine failure model. The key idea is to suspect workers that are potentially malicious and to rank candidate gradients with a preference mechanism. This allows us to generalize beyond past work: the number of malicious workers can be arbitrarily large, and we require only the weakest assumption on honest workers, namely that at least one worker is honest. We prove the convergence of SGD under these conditions. Empirical results show that Zeno outperforms existing approaches under various attacks.
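As a rough illustration of the ranking idea (a sketch, not the paper's exact interface), the Python snippet below scores each candidate gradient by the loss decrease a trial step along it is estimated to produce on a small validation batch, minus a penalty on its magnitude, then averages the highest-scoring candidates. The names zeno_aggregate and val_loss and the defaults for gamma, rho, and b are illustrative assumptions.

    import numpy as np

    def zeno_aggregate(grads, x, val_loss, gamma=0.1, rho=5e-4, b=4):
        # Suspicion score for each candidate gradient g: estimated loss
        # decrease after a trial step x - gamma * g, penalized by the
        # squared magnitude of g. Higher means more trustworthy.
        # (A sketch of the paper's validation-based scoring idea.)
        base = val_loss(x)
        scores = [base - val_loss(x - gamma * g) - rho * float(np.dot(g, g))
                  for g in grads]
        # Keep the (n - b) highest-scoring candidates and average them;
        # requires b < len(grads).
        keep = np.argsort(scores)[::-1][: len(grads) - b]
        return np.mean([grads[i] for i in keep], axis=0)

    # Toy usage: 4 honest workers, 4 Byzantine workers sending noise.
    rng = np.random.default_rng(0)
    x = np.zeros(3)
    val_loss = lambda w: float(np.sum((w - 1.0) ** 2))  # stand-in loss
    honest = [2.0 * (x - 1.0) + 0.1 * rng.normal(size=3) for _ in range(4)]
    byzantine = [100.0 * rng.normal(size=3) for _ in range(4)]
    update = zeno_aggregate(honest + byzantine, x, val_loss, b=4)

Because the score is computed from a server-side validation estimate rather than from agreement among workers, the rule does not need a majority of honest workers, which is consistent with the abstract's claim that a single honest worker suffices.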

Related research

Zeno++: robust asynchronous SGD with arbitrary number of Byzantine workers (03/17/2019)
We propose Zeno++, a new robust asynchronous Stochastic Grad...

Anytime Stochastic Gradient Descent: A Time to Hear from all the Workers (10/06/2018)
In this paper, we focus on approaches to parallelizing stochastic gradie...

Proof-of-Learning: Definitions and Practice (03/09/2021)
Training machine learning (ML) models typically involves expensive itera...

Fall of Empires: Breaking Byzantine-tolerant SGD by Inner Product Manipulation (03/10/2019)
Recently, new defense techniques have been developed to tolerate Byzanti...

Befriending The Byzantines Through Reputation Scores (06/24/2020)
We propose two novel stochastic gradient descent algorithms, ByGARS and ...

On the Optimal Batch Size for Byzantine-Robust Distributed Learning (05/23/2023)
Byzantine-robust distributed learning (BRDL), in which computing devices...

Byzantine-Resilient Stochastic Gradient Descent for Distributed Learning: A Lipschitz-Inspired Coordinate-wise Median Approach (09/10/2019)
In this work, we consider the resilience of distributed algorithms based...
