Robust Gradient Descent via Moment Encoding with LDPC Codes

by Raj Kumar Maity, et al.

This paper considers the problem of implementing large-scale gradient descent algorithms in a distributed computing setting in the presence of straggling processors. To mitigate the effect of the stragglers, it has previously been proposed to encode the data with an erasure-correcting code and decode at the master server at the end of the computation. We instead propose to encode the second moment of the data with a low-density parity-check (LDPC) code. The iterative decoding algorithms for LDPC codes have very low computational overhead, and the number of decoding iterations automatically adjusts to the number of stragglers in the system. We show that, under a random straggler model, the proposed moment-encoding-based gradient descent method can be viewed as a stochastic gradient descent method, which allows us to obtain convergence guarantees for the proposed solution. Furthermore, the proposed moment-encoding-based method is shown to outperform existing schemes in a real distributed computing setup.
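To make the moment-encoding idea concrete, here is a minimal, self-contained sketch of the core mechanism. It is not the paper's exact LDPC construction: the data layout, the small hand-picked parity sets, and all variable names are illustrative assumptions. For a least-squares objective, the gradient is (X^T X)w - X^T y, so the master only needs the rows of the second moment M = X^T X; the sketch encodes those rows with a simple systematic sparse-parity code and recovers straggler-erased rows with an iterative peeling decoder, whose iteration count grows with the number of erasures.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: f(w) = 0.5 * ||X w - y||^2, whose gradient
# is (X^T X) w - X^T y. Only the second moment M = X^T X is encoded.
n_samples, d = 50, 8
X = rng.normal(size=(n_samples, d))
y = rng.normal(size=n_samples)
M = X.T @ X          # second moment, encoded row by row
b = X.T @ y

# Systematic encoding (illustrative, not the paper's LDPC code):
# d "data" workers each hold one row of M; each parity worker holds
# the sum of a small, sparse subset of rows.
parity_sets = [[0, 1, 2], [3, 4, 5], [6, 7, 1], [2, 5, 0]]
parity_rows = [sum(M[i] for i in s) for s in parity_sets]

# Simulate stragglers: two data workers never respond.
stragglers = {1, 5}
received = {i: M[i] for i in range(d) if i not in stragglers}

# Peeling decoder: repeatedly find a parity check with exactly one
# missing member and solve for it. More stragglers -> more iterations,
# mirroring the adaptive decoding cost described in the abstract.
changed = True
while changed and len(received) < d:
    changed = False
    for s, p in zip(parity_sets, parity_rows):
        missing = [i for i in s if i not in received]
        if len(missing) == 1:
            known = sum(received[i] for i in s if i != missing[0])
            received[missing[0]] = p - known
            changed = True

# Recovered moment matrix lets the master take an exact gradient step
# without waiting for the stragglers.
M_rec = np.vstack([received[i] for i in range(d)])
w = rng.normal(size=d)
grad = M_rec @ w - b
print(np.allclose(grad, M @ w - b))
```

Because the moment matrix is fixed across iterations, this encoding is done once up front, and every subsequent gradient step reuses the same coded storage.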


Related papers:

Data Encoding for Byzantine-Resilient Distributed Optimization

We study distributed optimization in the presence of Byzantine adversari...

Approximate Gradient Coding with Optimal Decoding

In distributed optimization problems, a technique called gradient coding...

Distributed Stochastic Gradient Descent Using LDGM Codes

We consider a distributed learning problem in which the computation is c...

Coded Iterative Computing using Substitute Decoding

In this paper, we propose a new coded computing technique called "substi...

Gradient Descent Bit-Flipping Decoding with Momentum

In this paper, we propose a Gradient Descent Bit-Flipping (GDBF) decodin...

Homomorphically encrypted gradient descent algorithms for quadratic programming

In this paper, we evaluate the different fully homomorphic encryption sc...
