Benchmarking Decoupled Neural Interfaces with Synthetic Gradients

12/22/2017
by Ekaba Bisong, et al.

Artificial Neural Networks are a class of learning systems modeled after biological neural function, with an interesting penchant for Hebbian learning, that is, "neurons that fire together, wire together". However, unlike their natural counterparts, artificial neural networks exhibit a close and stringent coupling between the modules of neurons in the network. This coupling, or locking, imposes a strict and inflexible structure on the network that prevents layers from updating their weights until a full forward and backward pass has occurred. Such a constraint, though it may have sufficed for a while, is no longer feasible in the era of very-large-scale machine learning and the growing need to parallelize the learning process across multiple computing infrastructures. To address this problem, synthetic gradients (SG) with decoupled neural interfaces (DNI) have been introduced as a viable alternative to the backpropagation algorithm. This paper performs a speed benchmark to compare the speed and accuracy of SG-DNI against a standard neural interface using a multilayer perceptron (MLP). SG-DNI shows good promise: it not only captures the learning problem, it is also over 3-fold faster due to its asynchronous learning capabilities.
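The mechanism being benchmarked pairs a layer with a small auxiliary model that predicts the gradient of the loss with respect to that layer's activations, so the layer can update immediately instead of waiting for the full forward and backward pass. The sketch below illustrates this idea in PyTorch under simple assumptions (a single hidden layer, a linear synthetic-gradient module, and an MSE loss); the module and variable names are illustrative and are not taken from the paper's code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical components: a hidden layer, the "head" of the network,
# and a linear synthetic-gradient (SG) module that predicts dL/dh from h.
layer = nn.Linear(10, 20)
head = nn.Linear(20, 1)
sg_module = nn.Linear(20, 20)

opt_layer = torch.optim.SGD(layer.parameters(), lr=0.01)
opt_head = torch.optim.SGD(head.parameters(), lr=0.01)
opt_sg = torch.optim.SGD(sg_module.parameters(), lr=0.01)

x = torch.randn(32, 10)
y = torch.randn(32, 1)

# 1) Forward through the hidden layer and update it immediately using the
#    predicted (synthetic) gradient -- the layer is decoupled from the rest
#    of the network and does not wait for the true backward pass.
h = torch.relu(layer(x))
synthetic_grad = sg_module(h.detach())
h.backward(synthetic_grad.detach())        # inject the predicted gradient
opt_layer.step()
opt_layer.zero_grad()

# 2) Later (possibly on a different worker), the true gradient at the layer
#    boundary becomes available; it trains the head as usual.
h_boundary = h.detach().requires_grad_(True)
loss = F.mse_loss(head(h_boundary), y)
loss.backward()
opt_head.step()
opt_head.zero_grad()

# 3) The SG module is regressed onto the true gradient it tried to predict.
sg_loss = F.mse_loss(sg_module(h.detach()), h_boundary.grad.detach())
sg_loss.backward()
opt_sg.step()
opt_sg.zero_grad()
```

In the full SG-DNI setting each decoupled module has its own synthetic-gradient predictor, which is what permits the asynchronous, parallel weight updates whose speed the benchmark measures.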


