
Hopfield Neural Network Flow: A Geometric Viewpoint

by Abhishek Halder, et al.
University of California, Santa Cruz
University of California, Berkeley

We provide gradient flow interpretations for the continuous-time, continuous-state Hopfield neural network (HNN). The ordinary and stochastic differential equations associated with the HNN were introduced in the literature as analog optimizers, and were reported to exhibit good performance in numerical experiments. In this work, we point out that the deterministic HNN can be transcribed into Amari's natural gradient descent, and thereby uncover the explicit relation between the underlying Riemannian metric and the activation functions. By exploiting an equivalence between natural gradient descent and mirror descent, we show how the choice of activation function governs the geometry of the HNN dynamics. For the stochastic HNN, we show that the so-called “diffusion machine”, while not a gradient flow itself, induces a gradient flow when lifted to the space of probability measures. We characterize this infinite-dimensional flow as the gradient descent of a certain free energy with respect to a Wasserstein metric that depends on the geodesic distance on the ground manifold. Furthermore, we demonstrate how this gradient flow interpretation can be used for fast computation via recently developed proximal algorithms.
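The first claim can be illustrated numerically. In the sketch below, the weights `W`, bias `b`, step size, and the choice of a tanh activation are illustrative assumptions, not values from the paper: for activation states v = tanh(u), the activation derivative is 1 − v², so the natural gradient step rescales the Euclidean gradient of the Hopfield energy coordinate-wise by that factor, which is one concrete instance of the activation-dependent Riemannian metric described above.

```python
import numpy as np

# Hedged sketch: deterministic Hopfield dynamics as natural gradient
# descent. W, b, eta, and the tanh activation are made-up example
# choices for illustration only.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
W = 0.5 * (A + A.T)            # symmetric synaptic weight matrix
b = rng.standard_normal(n)

def energy(v):
    # Hopfield energy on activation states v in (-1, 1)^n
    return -0.5 * v @ W @ v - b @ v

def grad_energy(v):
    return -(W @ v) - b

# For v = tanh(u), the activation derivative is 1 - v**2, so the
# inverse Riemannian metric is diagonal with entries (1 - v**2):
# the natural gradient step rescales the gradient coordinate-wise.
v = 0.1 * rng.standard_normal(n)
eta = 0.05                     # forward-Euler step size
energies = [energy(v)]
for _ in range(200):
    v = v - eta * (1.0 - v**2) * grad_energy(v)
    energies.append(energy(v))

# With a small step, the energy is non-increasing along the flow
# and the state stays inside the open box (-1, 1)^n.
print(energies[0], energies[-1])
```

The diagonal rescaling also slows the update as any coordinate approaches ±1, which is how the discretized flow respects the box constraint imposed by the activation function.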

