Convolutional neural networks with extra-classical receptive fields

10/27/2018
by Brian Hu, et al.

Convolutional neural networks (CNNs) have had great success in many real-world applications and have also been used to model visual processing in the brain. However, these networks are quite brittle: small changes in the input image can dramatically change a network's output prediction. In contrast to what is known from biology, these networks largely rely on feedforward connections, ignoring the influence of recurrent connections. They also focus on supervised rather than unsupervised learning. To address these issues, we combine traditional supervised learning via backpropagation with a specialized unsupervised learning rule to learn lateral connections between neurons within a convolutional neural network. These connections have been shown to optimally integrate information from the surround, generating extra-classical receptive fields for the neurons in our newly proposed model (CNNEx). Models with optimal lateral connections are more robust to noise and achieve better performance on noisy versions of the MNIST and CIFAR-10 datasets. Resistance to noise can be further improved by combining our model with additional regularization techniques such as dropout and weight decay. Although the image statistics of MNIST and CIFAR-10 differ greatly, the same unsupervised learning rule generalized to both datasets. Our results demonstrate the potential usefulness of combining supervised and unsupervised learning techniques and suggest that the integration of lateral connections into convolutional neural networks is an important area of future research.
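The abstract does not spell out the unsupervised learning rule, so the following is only a minimal sketch of the general idea: lateral weights between units in one layer are estimated from activity correlations (a Hebbian-style rule, assumed here for illustration, not the paper's actual CNNEx rule), and each neuron's response is then modulated by its laterally connected neighbors, mimicking surround integration.

```python
import numpy as np

# Hypothetical sketch: activations of one layer's neurons across stimuli
# (8 neurons x 100 input samples), standing in for CNN feature responses.
rng = np.random.default_rng(0)
acts = rng.standard_normal((8, 100))

# Unsupervised, Hebbian-style estimate of lateral weights from activity
# correlations. The true CNNEx rule is not specified in the abstract;
# this is an assumed placeholder.
L = (acts @ acts.T) / acts.shape[1]
np.fill_diagonal(L, 0.0)   # no self-connections
L *= 0.1                   # small gain to keep the dynamics stable

# Surround integration: one recurrent step in which each neuron's
# response is modulated by its laterally connected neighbors.
modulated = acts + L @ acts

print(modulated.shape)  # (8, 100)
```

In a full model, the feedforward weights would still be trained with backpropagation on the supervised task, while the lateral weights `L` would be updated by the unsupervised rule, which is the combination the paper proposes.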

