The Neural Race Reduction: Dynamics of Abstraction in Gated Networks

07/21/2022
by Andrew M. Saxe, et al.

Our theoretical understanding of deep learning has not kept pace with its empirical success. While network architecture is known to be critical, we do not yet understand its effect on learned representations and network behavior, or how this architecture should reflect task structure. In this work, we begin to address this gap by introducing the Gated Deep Linear Network framework, which schematizes how pathways of information flow impact learning dynamics within an architecture. Crucially, because of the gating, these networks can compute nonlinear functions of their input. We derive an exact reduction and, for certain cases, exact solutions to the dynamics of learning. Our analysis demonstrates that the learning dynamics in structured networks can be conceptualized as a neural race with an implicit bias towards shared representations, which then govern the model's ability to systematically generalize, multi-task, and transfer. We validate our key insights on naturalistic datasets and with relaxed assumptions. Taken together, our work gives rise to general hypotheses relating neural architecture to learning and provides a mathematical approach towards understanding the design of more complex architectures and the role of modularity and compositionality in solving real-world problems. The code and results are available at https://www.saxelab.org/gated-dln .
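To make the core idea concrete, here is a minimal sketch of the kind of gated deep linear network the abstract describes. The pathway structure, gating rule, and dimensions below are illustrative assumptions, not the paper's actual model: two input-to-hidden pathways are switched on or off by a context-dependent binary gate, so the map is linear within any fixed gating pattern, yet nonlinear overall because the active linear map changes with context.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumed for illustration).
d_in, d_hidden, d_out = 4, 3, 2

# Two candidate first-layer pathways and a shared second layer.
W1_a = rng.normal(size=(d_hidden, d_in))  # pathway A weights
W1_b = rng.normal(size=(d_hidden, d_in))  # pathway B weights
W2 = rng.normal(size=(d_out, d_hidden))   # shared readout weights

def gdln_forward(x, context):
    # Hypothetical gating rule: the context selects which pathway
    # is active. Gates multiply pathway outputs, so every active
    # computation is purely linear in x.
    g_a = 1.0 if context == "A" else 0.0
    g_b = 1.0 - g_a
    h = g_a * (W1_a @ x) + g_b * (W1_b @ x)
    return W2 @ h

x = rng.normal(size=d_in)
y_a = gdln_forward(x, "A")  # linear map W2 @ W1_a applied to x
y_b = gdln_forward(x, "B")  # a different linear map on the same input
```

Because different contexts route the same input through different weight matrices, the input-to-output function is piecewise linear, and hence nonlinear as a whole, even though every pathway is a product of linear maps.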


Related research

Globally Gated Deep Linear Networks (10/31/2022)
Recently proposed Gated Linear Networks present a tractable nonlinear ne...

The Implicit Bias of Gradient Descent on Generalized Gated Linear Networks (02/05/2022)
Understanding the asymptotic behavior of gradient-descent training of de...

Gated Complex Recurrent Neural Networks (06/21/2018)
Complex numbers have long been favoured for digital signal processing, y...

Layer Dynamics of Linearised Neural Nets (04/24/2019)
Despite the phenomenal success of deep learning in recent years, there r...

Neural Mechanics: Symmetry and Broken Conservation Laws in Deep Learning Dynamics (12/08/2020)
Predicting the dynamics of neural network parameters during training is ...

A mathematical theory of semantic development in deep neural networks (10/23/2018)
An extensive body of empirical research has revealed remarkable regulari...

The Three Stages of Learning Dynamics in High-Dimensional Kernel Methods (11/13/2021)
To understand how deep learning works, it is crucial to understand the t...
