Dense Associative Memory for Pattern Recognition

06/03/2016
by Dmitry Krotov, et al.

A model of associative memory is studied that stores and reliably retrieves many more patterns than the number of neurons in the network. We propose a simple duality between this dense associative memory and the neural networks commonly used in deep learning. On the associative memory side of this duality, a family of models can be constructed that smoothly interpolates between two limiting cases: one limit is referred to as the feature-matching mode of pattern recognition, and the other as the prototype regime. On the deep learning side of the duality, this family corresponds to feedforward neural networks with one hidden layer and various activation functions, which transmit the activities of the visible neurons to the hidden layer. This family of activation functions includes logistic functions, rectified linear units, and rectified polynomials of higher degrees. The proposed duality makes it possible to apply energy-based intuition from associative memory to analyze the computational properties of neural networks with unusual activation functions: the higher rectified polynomials, which until now have not been used in deep learning. The utility of the dense memories is illustrated for two test cases: the logical gate XOR and the recognition of handwritten digits from the MNIST data set.
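As a rough illustration of the retrieval dynamics described above, the sketch below implements a dense associative memory with the rectified-polynomial interaction function F_n(x) = max(x, 0)^n and the energy E = -Σ_μ F_n(ξ_μ·σ), updating each spin toward the state of lower energy. The network size, number of stored patterns, and polynomial degree here are illustrative choices, not values from the paper.

```python
import numpy as np

def rect_poly(x, n):
    # Rectified polynomial interaction function F_n(x) = max(x, 0)^n
    return np.maximum(x, 0.0) ** n

def update(sigma, patterns, n):
    # One full sweep: set each spin to the sign that lowers the
    # energy E = -sum_mu F_n(xi_mu . sigma)
    new = sigma.copy()
    for i in range(len(sigma)):
        plus, minus = new.copy(), new.copy()
        plus[i], minus[i] = 1, -1
        e_plus = rect_poly(patterns @ plus, n).sum()
        e_minus = rect_poly(patterns @ minus, n).sum()
        new[i] = 1 if e_plus >= e_minus else -1
    return new

rng = np.random.default_rng(0)
N, K, n = 100, 10, 3          # neurons, stored patterns, polynomial degree
patterns = rng.choice([-1, 1], size=(K, N))

# Corrupt the first stored pattern by flipping 10 random bits, then retrieve it
probe = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1
for _ in range(5):
    probe = update(probe, patterns, n)
print(int((probe == patterns[0]).sum()), "of", N, "bits recovered")
```

With a higher degree n, each energy minimum sharpens, which is what allows such networks to store many more patterns than neurons; setting n = 1 recovers behavior close to the classical Hopfield model.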
