Feedback-Gated Rectified Linear Units

01/06/2023, by Marco Kemmerling et al.

Feedback connections play a prominent role in the human brain but have received little attention in artificial neural network research. Here, a biologically inspired feedback mechanism that gates rectified linear units is proposed. On the MNIST dataset, autoencoders with feedback converge faster, perform better, and are more robust to noise than their counterparts without feedback. Some benefits, although less pronounced and less consistent, can also be observed when networks with feedback are applied to the CIFAR-10 dataset.
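The abstract describes gating ReLU activations with a feedback signal. As the exact gating rule is not given here, the following is a minimal sketch under an illustrative assumption: feedback from a higher layer is projected through a (hypothetical) weight matrix `W_fb` and squashed to a sigmoid gate in [0, 1], which multiplicatively modulates the standard ReLU output.

```python
import numpy as np

def feedback_gated_relu(x, feedback, W_fb):
    """Hypothetical feedback-gated ReLU.

    The feedforward pre-activation x passes through a standard ReLU,
    then is multiplied by a gate in [0, 1] derived from a feedback
    signal (e.g. activations of a higher layer). The multiplicative
    sigmoid gate is an illustrative assumption, not necessarily the
    paper's exact mechanism.
    """
    gate = 1.0 / (1.0 + np.exp(-(feedback @ W_fb)))  # sigmoid gate from feedback
    return np.maximum(x, 0.0) * gate                 # gated ReLU output

# Toy usage with random data (shapes are arbitrary choices)
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))            # feedforward pre-activations
fb = rng.normal(size=(4, 16))          # feedback from a higher layer
W_fb = rng.normal(size=(16, 8)) * 0.1  # hypothetical feedback weights
out = feedback_gated_relu(x, fb, W_fb)
```

Because the gate lies in (0, 1), the output is always non-negative and never exceeds the plain ReLU response, so feedback can only attenuate, not amplify, a unit here; a learned gate could of course be parameterized differently.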


