Gradient Descent Bit-Flipping Decoding with Momentum

04/05/2022
by Valentin Savin, et al.

In this paper, we propose a Gradient Descent Bit-Flipping (GDBF) decoding with momentum, which considers past updates to provide inertia to the decoding process. We show that GDBF or randomized GDBF decoders with momentum may closely approach the floating-point Belief-Propagation decoding performance, and even outperform it in the error-floor region, especially for graphs with high connectivity degree.
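As a rough illustration of the idea, the sketch below implements a standard GDBF decoder (inversion function as in Wadayama et al.: channel term plus bipolar check sums) and adds an assumed exponential-momentum term that accumulates past inversion metrics with weight `gamma`. The function name, the `gamma` parameter, and the exact momentum recursion are illustrative assumptions, not the paper's precise formulation.

```python
import numpy as np

def gdbf_momentum(H, y, gamma=0.5, max_iter=100):
    """Sketch of Gradient Descent Bit-Flipping decoding with momentum.

    H     : (m, n) binary parity-check matrix (0/1 entries)
    y     : length-n received vector (bipolar soft values, +/-)
    gamma : momentum weight (assumed form of the inertia term)
    """
    x = np.sign(y).astype(int)           # initial hard decision in {+1, -1}
    mom = np.zeros(H.shape[1])           # accumulated past inversion metrics
    for _ in range(max_iter):
        # bipolar check values: product of incident bits for each row of H
        checks = np.array([np.prod(x[H[r] == 1]) for r in range(H.shape[0])])
        if np.all(checks == 1):
            break                        # all parity checks satisfied
        # inversion (energy) function per bit: channel term + check terms
        delta = x * y + H.T @ checks
        # momentum: current metric plus decayed past metrics (assumed recursion)
        mom = delta + gamma * mom
        # flip every bit attaining the minimum metric
        x[mom == mom.min()] *= -1
    return x
```

For instance, on the length-3 repetition code with one unreliable position, `gdbf_momentum(np.array([[1, 1, 0], [0, 1, 1]]), np.array([0.9, -0.2, 1.1]))` recovers the all-ones codeword in one flip. A randomized GDBF variant, as mentioned in the abstract, would additionally perturb the flip decision (e.g. flip each minimal-metric bit with some probability).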


