Associative Memory in Iterated Overparameterized Sigmoid Autoencoders

06/30/2020
by Yibo Jiang, et al.

Recent work showed that overparameterized autoencoders can be trained to implement associative memory via iterative maps: when every eigenvalue of the trained network's input-output Jacobian has norm strictly below one, stored examples become attractors of the iteration. Here, we theoretically analyze this phenomenon for sigmoid networks by leveraging recent developments in deep learning theory, especially the correspondence between training neural networks in the infinite-width limit and performing kernel regression with the Neural Tangent Kernel (NTK). We show that overparameterized sigmoid autoencoders can have attractors in the NTK limit, both when trained on a single example and, under certain conditions, on multiple examples. In particular, for multiple training examples, we find that the norm of the largest Jacobian eigenvalue drops below one as the input norm increases, giving rise to associative memory.
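The retrieval mechanism described in the abstract can be sketched with a toy example. The network and weights below are hypothetical (a small one-hidden-layer sigmoid map, not the paper's trained autoencoder): a stored example is made an exact fixed point, the input-output Jacobian there has spectral radius below one, and iterating the map pulls a corrupted input back to the stored example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

d, h = 4, 32                      # input dimension, hidden width
W1 = 0.5 * rng.standard_normal((h, d))
b1 = 0.1 * rng.standard_normal(h)
W2 = 0.3 * rng.standard_normal((d, h)) / np.sqrt(h)

x_star = rng.standard_normal(d)   # the "stored" training example
# Choose the output bias so x_star is an exact fixed point of f.
b2 = x_star - W2 @ sigmoid(W1 @ x_star + b1)

def f(x):
    """One-hidden-layer sigmoid map f(x) = W2 * sigmoid(W1 x + b1) + b2."""
    return W2 @ sigmoid(W1 @ x + b1) + b2

# Input-output Jacobian at the fixed point:
# J = W2 diag(s * (1 - s)) W1, where s = sigmoid(W1 x* + b1).
s = sigmoid(W1 @ x_star + b1)
J = W2 @ np.diag(s * (1.0 - s)) @ W1
rho = max(abs(np.linalg.eigvals(J)))
print(f"spectral radius at x*: {rho:.3f}")   # below one, so x* is an attractor

# Iterate the map from a corrupted input; it converges back to x*.
x = x_star + 0.5 * rng.standard_normal(d)
for _ in range(100):
    x = f(x)
print(f"recovery error: {np.linalg.norm(x - x_star):.2e}")
```

Because the weights are small and the sigmoid's derivative is bounded by 1/4, the Jacobian's eigenvalue norms stay well below one here; the paper's point is that trained overparameterized autoencoders land in this contractive regime under the stated conditions.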


Related research

- Neural Tangent Kernel: Convergence and Generalization in Neural Networks (06/20/2018): At initialization, artificial neural networks (ANNs) are equivalent to G...
- Stochastic Neural Networks with Infinite Width are Deterministic (01/30/2022): This work theoretically studies stochastic neural networks, a main type ...
- Eigenvalues of Autoencoders in Training and at Initialization (01/27/2022): In this paper, we investigate the evolution of autoencoders near their i...
- Tensor Programs IIb: Architectural Universality of Neural Tangent Kernel Training Dynamics (05/08/2021): Yang (2020a) recently showed that the Neural Tangent Kernel (NTK) at ini...
- Overparameterized Neural Networks Can Implement Associative Memory (09/26/2019): Identifying computational mechanisms for memorization and retrieval is a...
- Neural Tangent Kernel of Matrix Product States: Convergence and Applications (11/28/2021): In this work, we study the Neural Tangent Kernel (NTK) of Matrix Product...
- Beyond Double Ascent via Recurrent Neural Tangent Kernel in Sequential Recommendation (09/08/2022): Overfitting has long been considered a common issue to large neural netw...
