
Hopfield Neural Network Flow: A Geometric Viewpoint

08/04/2019
by   Abhishek Halder, et al.
University of California Santa Cruz

We provide gradient flow interpretations for the continuous-time continuous-state Hopfield neural network (HNN). The ordinary and stochastic differential equations associated with the HNN were introduced in the literature as analog optimizers, and were reported to exhibit good performance in numerical experiments. In this work, we point out that the deterministic HNN can be transcribed into Amari's natural gradient descent, and thereby uncover the explicit relation between the underlying Riemannian metric and the activation functions. By exploiting an equivalence between the natural gradient descent and the mirror descent, we show how the choice of activation function governs the geometry of the HNN dynamics. For the stochastic HNN, we show that the so-called “diffusion machine”, while not a gradient flow itself, induces a gradient flow when lifted to the space of probability measures. We characterize this infinite-dimensional flow as the gradient descent of a certain free energy with respect to a Wasserstein metric that depends on the geodesic distance on the ground manifold. Furthermore, we demonstrate how this gradient flow interpretation can be used for fast computation via recently developed proximal algorithms.
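The natural-gradient reading of the deterministic HNN is easy to check numerically: writing v = σ(u) for the output of the standard dynamics u̇ = −u + Wσ(u) + b, one gets v̇ = −diag(σ′(u)) ∇_v E, so the classical Hopfield energy E decreases along trajectories whenever W is symmetric. Below is a minimal sketch of that Lyapunov property (the network size, random weights, step size, and tanh activation are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
W = (A + A.T) / 2                  # symmetric weights, as the energy argument requires
b = rng.standard_normal(n)
u = 0.1 * rng.standard_normal(n)   # internal state; the output is v = tanh(u)

def energy(u):
    """Hopfield energy E(v) = -v'Wv/2 - b'v + sum_i int_0^{v_i} arctanh(s) ds."""
    v = np.tanh(u)
    barrier = np.sum(v * np.arctanh(v) + 0.5 * np.log1p(-v**2))
    return -0.5 * v @ W @ v - b @ v + barrier

dt, steps = 1e-3, 5000
energies = [energy(u)]
for _ in range(steps):
    u = u + dt * (-u + W @ np.tanh(u) + b)   # explicit Euler step of the HNN ODE
    energies.append(energy(u))

diffs = np.diff(energies)
# Along the flow, dE/dt = -sum_i sigma'(u_i) (du_i/dt)^2 <= 0, so the
# discrete energies should be (numerically) non-increasing.
print(energies[0], energies[-1], diffs.max())
```

In this coordinate form, diag(σ′(u))⁻¹ plays the role of the activation-dependent Riemannian metric that the abstract's natural gradient descent refers to.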
