A new role for circuit expansion for learning in neural networks

08/19/2020
by Julia Steinberg, et al.

Many sensory pathways in the brain rely on sparsely active populations of neurons downstream of the input stimuli. The biological purpose of this expanded structure is unclear, but it may be that expansion increases the expressive power of a neural network. In this work, we show that expanding a neural network can improve its generalization performance even when the expanded structure is pruned after the learning period. To study this setting we use a teacher-student framework in which a perceptron teacher network generates labels corrupted by small amounts of noise. We then train a student network that is structurally matched to the teacher and that could achieve optimal accuracy if given the teacher's synaptic weights. We find that sparse expansion of a student perceptron's input both increases its capacity and improves its generalization performance when learning a noisy rule from a teacher perceptron, even when the expansion is pruned after learning. We find similar behavior when the expanded units are stochastic and uncorrelated with the input, and we analyze this network in the mean-field limit. By solving the mean-field equations, we show that the generalization error of the stochastic expanded student network continues to drop as the size of the network increases. The improvement in generalization occurs despite the increased complexity of the student network relative to the teacher it is trying to learn. We show that this effect is closely related to the addition of slack variables in artificial neural networks and suggest possible implications for artificial and biological neural networks.
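To make the setup concrete, below is a minimal numerical sketch of the experiment the abstract describes: a teacher perceptron generates noisy labels, a student learns on a sparsely expanded input, and the expansion is pruned after learning. All dimensions, the noise level, the sparsity, the random projection used to realize the expansion, and the use of the classic perceptron learning rule are illustrative assumptions, not values or algorithms taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative sizes and noise level (assumptions, not from the paper) ---
N = 100        # input dimension shared by teacher and pruned student
M = 400        # number of expanded units added to the student
P = 500        # number of training examples
noise = 0.1    # probability of flipping a teacher label

# Teacher: a perceptron whose weights define the rule to be learned.
w_teacher = rng.standard_normal(N)

# Training data: inputs and teacher labels corrupted by label noise.
X = rng.standard_normal((P, N))
y = np.sign(X @ w_teacher)
y[rng.random(P) < noise] *= -1

# Sparse expansion: each expanded unit reads a small random subset of the
# inputs through a fixed random projection (one simple way to realize an
# expansion; the paper also considers stochastic units uncorrelated with
# the input).
sparsity = 0.05
mask = rng.random((N, M)) < sparsity
proj = rng.standard_normal((N, M)) * mask
H = np.sign(X @ proj)                  # expanded-unit activities
X_expanded = np.hstack([X, H])         # student sees inputs + expanded units

# Student: a perceptron over the expanded representation, trained here with
# the classic perceptron rule as a stand-in for the learning procedure.
w_student = np.zeros(N + M)
for _ in range(100):
    for mu in rng.permutation(P):
        if np.sign(X_expanded[mu] @ w_student) != y[mu]:
            w_student += y[mu] * X_expanded[mu]

# Pruning: after learning, discard the expanded units and keep only the
# weights on the original N inputs, matching the teacher's architecture.
w_pruned = w_student[:N]

# Generalization error of the pruned student against noise-free teacher labels.
X_test = rng.standard_normal((5000, N))
y_test = np.sign(X_test @ w_teacher)
err = np.mean(np.sign(X_test @ w_pruned) != y_test)
print(f"pruned-student generalization error: {err:.3f}")
```

Rerunning the sketch with M = 0 gives an unexpanded baseline, so the effect described in the abstract would show up as a lower pruned-student error for M > 0 under these assumed settings.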
