Associative Memory in Iterated Overparameterized Sigmoid Autoencoders

06/30/2020
by Yibo Jiang, et al.

Recent work showed that overparameterized autoencoders can be trained to implement associative memory via iterated maps, when the trained input-output Jacobian of the network has all of its eigenvalue norms strictly below one. Here, we theoretically analyze this phenomenon for sigmoid networks by leveraging recent developments in deep learning theory, especially the correspondence between training neural networks in the infinite-width limit and performing kernel regression with the Neural Tangent Kernel (NTK). We find that overparameterized sigmoid autoencoders can have attractors in the NTK limit, both when trained on a single example and, under certain conditions, when trained on multiple examples. In particular, for multiple training examples, we find that the norm of the largest Jacobian eigenvalue drops below one as the input norm increases, leading to associative memory.
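The mechanism in the abstract is easy to probe numerically. Below is a minimal NumPy sketch, not code from the paper: it trains a toy one-hidden-layer sigmoid autoencoder on a single example, evaluates the spectral radius of the input-output Jacobian at that example (the attractor condition is that all eigenvalue norms lie below one), and then iterates the network as a map to retrieve the example from a corrupted input. The network sizes, learning rate, and step counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer sigmoid autoencoder: f(x) = W2 @ sigmoid(W1 @ x).
# Dimensions are illustrative: d-dimensional input, wide hidden layer.
d, h = 4, 64
x_star = rng.normal(size=d)  # the single training example to memorize

W1 = rng.normal(size=(h, d)) / np.sqrt(d)
W2 = rng.normal(size=(d, h)) / np.sqrt(h)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def f(x):
    return W2 @ sigmoid(W1 @ x)

# Plain gradient descent on the squared reconstruction loss at x_star.
lr = 0.05
for _ in range(5000):
    a = sigmoid(W1 @ x_star)
    err = W2 @ a - x_star  # f(x_star) - x_star
    gW2 = np.outer(err, a)
    gW1 = np.outer((W2.T @ err) * a * (1.0 - a), x_star)
    W2 -= lr * gW2
    W1 -= lr * gW1

# Input-output Jacobian at the training point:
# J = W2 @ diag(sigmoid'(W1 @ x_star)) @ W1, with sigmoid' = a * (1 - a).
a = sigmoid(W1 @ x_star)
J = W2 @ np.diag(a * (1.0 - a)) @ W1
rho = np.max(np.abs(np.linalg.eigvals(J)))
print(f"spectral radius at x_star: {rho:.3f}")  # < 1 is the attractor condition

# Retrieval as an iterated map: start from a corrupted input and iterate f.
# If training has converged and rho < 1, the iterates settle near x_star.
x = x_star + 0.3 * rng.normal(size=d)
for _ in range(200):
    x = f(x)
print("recovered x_star:", np.allclose(x, x_star, atol=1e-2))
```

The Jacobian factorization used above follows from the chain rule for this architecture; for deeper networks or the multi-example regime analyzed in the paper, the Jacobian would be evaluated at each training point and the same spectral-radius check applied.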

Related Research

06/20/2018
Neural Tangent Kernel: Convergence and Generalization in Neural Networks
At initialization, artificial neural networks (ANNs) are equivalent to G...

01/30/2022
Stochastic Neural Networks with Infinite Width are Deterministic
This work theoretically studies stochastic neural networks, a main type ...

01/27/2022
Eigenvalues of Autoencoders in Training and at Initialization
In this paper, we investigate the evolution of autoencoders near their i...

05/08/2021
Tensor Programs IIb: Architectural Universality of Neural Tangent Kernel Training Dynamics
Yang (2020a) recently showed that the Neural Tangent Kernel (NTK) at ini...

09/26/2019
Overparameterized Neural Networks Can Implement Associative Memory
Identifying computational mechanisms for memorization and retrieval is a...

07/01/2021
Implicit Acceleration and Feature Learning in Infinitely Wide Neural Networks with Bottlenecks
We analyze the learning dynamics of infinitely wide neural networks with...

09/08/2022
Beyond Double Ascent via Recurrent Neural Tangent Kernel in Sequential Recommendation
Overfitting has long been considered a common issue to large neural netw...