A relativistic extension of Hopfield neural networks via the mechanical analogy

01/05/2018
by   Adriano Barra, et al.

We propose a modification of the cost function of the Hopfield model whose salient features emerge in its Taylor expansion: it yields interactions beyond the pairwise level with alternating signs, suggesting a unified framework for handling both deep learning and network pruning. Our analysis relies heavily on the Hamilton-Jacobi correspondence, which relates the statistical model to a mechanical system. In this picture, our model is the relativistic extension of the original Hopfield model (whose cost function is a quadratic form in the Mattis magnetization, mimicking the non-relativistic Hamiltonian of a free particle). We focus on the low-storage regime and solve the model analytically by exploiting the mechanical analogy, obtaining a complete characterization of the free energy and the associated self-consistency equations in the thermodynamic limit. On the numerical side, we test the performance of our proposal with Monte Carlo simulations, showing that the stability of spurious states (which limits the capabilities of the standard Hebbian construction) is significantly reduced thanks to the unlearning contributions present in this extended framework.
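As a rough illustration of the idea described above (a sketch, not the paper's actual code), the snippet below contrasts the classical quadratic Hopfield cost with a relativistic-style cost proportional to the square root of (1 + m^2) in the Mattis magnetization m, and checks that its Taylor expansion produces higher-order terms with alternating signs (the negative quartic term playing the role of an unlearning contribution). Variable names and the single-pattern setup are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000                        # number of binary spins
xi = rng.choice([-1, 1], N)     # one stored pattern (low-storage regime)
sigma = rng.choice([-1, 1], N)  # a random spin configuration

# Mattis magnetization: overlap between configuration and pattern
m = np.dot(xi, sigma) / N

# Classical Hopfield cost: quadratic in m (non-relativistic free particle)
H_classical = -N * m**2 / 2

# Relativistic-style cost: -N * sqrt(1 + m^2) (relativistic free particle)
H_relativistic = -N * np.sqrt(1 + m**2)

# Taylor expansion sqrt(1 + m^2) = 1 + m^2/2 - m^4/8 + m^6/16 - ...
# i.e. the pairwise Hebbian term plus higher-order (many-spin)
# interactions with alternating signs.
coeffs = [0.5, -0.125, 0.0625]  # coefficients of m^2, m^4, m^6
H_approx = -N * (1 + sum(c * m**(2 * (k + 1))
                         for k, c in enumerate(coeffs)))

print(H_classical, H_relativistic, H_approx)
```

For a typical random configuration, m is of order 1/sqrt(N), so the truncated expansion agrees with the exact relativistic cost to high accuracy; the sign-alternating higher-order terms become relevant near the retrieval states where |m| approaches 1.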


