Spontaneous Symmetry Breaking in Neural Networks

10/17/2017
by Ricky Fok et al.

We propose a framework, based on field theory, for understanding the unprecedented performance and robustness of deep neural networks. Correlations between the weights within a layer can be described by symmetries in that layer, and networks generalize better when such symmetries are broken, reducing the redundancy of the weights. Using a two-parameter field theory, we find that the network can break these symmetries on its own towards the end of training, in a process known in physics as spontaneous symmetry breaking. This corresponds to the network improving its own generalization without any user-specified symmetry-breaking layers, relying instead on communication between adjacent layers. In the layer-decoupling limit applicable to residual networks (He et al., 2015), we show that the remnant symmetries that survive the non-linear layers are spontaneously broken. The combined Lagrangian for the non-linear and weight layers bears striking similarities to that of a scalar field in quantum field theory. Using results from quantum field theory, we show that our framework explains many experimentally observed phenomena, such as training to zero error on random labels (Zhang et al., 2017), the information bottleneck, the phase transition out of it, and the explosion of gradient variance (Shwartz-Ziv & Tishby, 2017), shattered gradients (Balduzzi et al., 2017), and more.
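For reference, the textbook scalar field theory that the abstract compares against can be written as below; this is a standard expression from quantum field theory, not an equation taken from the paper itself:

\mathcal{L} = \tfrac{1}{2}\,\partial_\mu \phi\,\partial^\mu \phi - \tfrac{1}{2} m^2 \phi^2 - \tfrac{\lambda}{4}\phi^4

When m^2 < 0, the minima of the potential sit at \phi = \pm\sqrt{-m^2/\lambda} rather than at \phi = 0, so the \phi \to -\phi symmetry of the Lagrangian is spontaneously broken by the ground state. The abstract's claim is that an analogous mechanism breaks weight-space symmetries late in training.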


Related research

02/04/2020 · A Deep Conditioning Treatment of Neural Networks
We study the role of depth in training randomly initialized overparamete...

07/04/2019 · A Quantum Field Theory of Representation Learning
Continuous symmetries and their breaking play a prominent role in contem...

05/06/2021 · Noether's Learning Dynamics: The Role of Kinetic Symmetry Breaking in Deep Learning
In nature, symmetry governs regularities, while symmetry breaking brings...

01/26/2022 · Theory of self-resonance after inflation. II. Quantum mechanics and particle-antiparticle asymmetry
We further develop a theory of self-resonance after inflation in a large...

02/22/2019 · Capacity allocation through neural network layers
Capacity analysis has been recently introduced as a way to analyze how l...

07/02/2020 · Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization?
Deep neural networks are typically initialized with random weights, with...

03/01/2017 · Understanding Synthetic Gradients and Decoupled Neural Interfaces
When training neural networks, the use of Synthetic Gradients (SG) allow...
