High Dimensional Distributed Gradient Descent with Arbitrary Number of Byzantine Attackers

07/25/2023
by Puning Zhao, et al.

Robust distributed learning under Byzantine failures has attracted extensive research interest in recent years. However, most existing methods suffer from the curse of dimensionality, which becomes increasingly serious as modern machine learning models grow more complex. In this paper, we design a new method that is suitable for high dimensional problems under an arbitrary number of Byzantine attackers. The core of our design is a direct high dimensional semi-verified mean estimation method. Our idea is to first identify a subspace. The components of the mean value perpendicular to this subspace are estimated from the gradient vectors uploaded by the worker machines, while the components within this subspace are estimated using an auxiliary dataset. We then use the new estimator as the aggregator in distributed learning. Our theoretical analysis shows that the new method achieves minimax optimal statistical rates. In particular, the dependence on dimensionality is significantly improved compared with previous works.
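To make the subspace-splitting idea concrete, here is a minimal sketch of such a semi-verified aggregator: trust the auxiliary data inside a suspicious subspace, and the worker gradients outside it. The function name `semi_verified_mean`, the choice of the top-k singular directions as the subspace, and the coordinate-wise median used off the subspace are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch of the subspace-splitting aggregation idea (illustrative only).
import numpy as np

def semi_verified_mean(worker_grads, aux_grads, k):
    """Combine possibly corrupted worker gradients with a small trusted sample.

    worker_grads: (n, d) gradients uploaded by the worker machines
    aux_grads:    (m, d) gradients computed on the trusted auxiliary dataset
    k:            dimension of the subspace in which workers are not trusted
    """
    # Pick the subspace where the worker gradients spread the most; this is
    # where Byzantine workers can bias the average the most.
    centered = worker_grads - worker_grads.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    V = vt[:k].T                                  # (d, k) orthonormal basis

    # Inside the subspace: estimate from the auxiliary (verified) data.
    aux_mean = aux_grads.mean(axis=0)
    inside = V @ (V.T @ aux_mean)

    # Outside the subspace: a robust estimate from the workers
    # (coordinate-wise median as a placeholder aggregator).
    med = np.median(worker_grads, axis=0)
    outside = med - V @ (V.T @ med)

    return inside + outside

# Usage inside a distributed gradient-descent step (sketch):
# theta -= lr * semi_verified_mean(worker_grads, aux_grads, k=5)
```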


