
Mirrorless Mirror Descent: A More Natural Discretization of Riemannian Gradient Flow

by Suriya Gunasekar, et al.
Toyota Technological Institute at Chicago

We present a direct (primal only) derivation of Mirror Descent as a "partial" discretization of gradient flow on a Riemannian manifold where the metric tensor is the Hessian of the Mirror Descent potential function. We argue that this discretization is more faithful to the geometry than Natural Gradient Descent, which is obtained by a "full" forward Euler discretization. This view helps shed light on the relationship between the methods and allows generalizing Mirror Descent to any Riemannian geometry, even when the metric tensor is not a Hessian, and thus there is no "dual."
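The contrast between the two discretizations can be made concrete with a small numerical sketch (not from the paper; the toy objective and step size are assumptions). Using the entropy potential psi(w) = sum(w log w) on the positive orthant, the metric tensor is the Hessian diag(1/w). Natural Gradient Descent takes a forward Euler step of the Riemannian gradient flow in the primal coordinates, while Mirror Descent takes the Euler step in the mirror (dual) coordinates grad psi(w) = log(w) + 1 and maps back:

```python
import numpy as np

# Toy objective f(w) = 0.5 * ||w - target||^2 (an assumption for
# illustration; any smooth objective would do).
def grad_f(w, target):
    return w - target

def mirror_descent_step(w, g, eta):
    # "Partial" discretization: forward Euler in the dual coordinates
    # z = grad psi(w) = log(w) + 1, then map back via exp(z - 1).
    # Simplifies to the multiplicative update w * exp(-eta * g).
    return np.exp((np.log(w) + 1.0) - eta * g - 1.0)

def natural_gradient_step(w, g, eta):
    # "Full" forward Euler discretization of the Riemannian gradient
    # flow w' = -H(w)^{-1} grad f(w), with H(w) = diag(1/w).
    return w - eta * w * g

target = np.array([0.5, 1.5, 2.0])
w_md = np.ones(3)
w_ngd = np.ones(3)
for _ in range(2000):
    w_md = mirror_descent_step(w_md, grad_f(w_md, target), 0.05)
    w_ngd = natural_gradient_step(w_ngd, grad_f(w_ngd, target), 0.05)
```

Both updates agree to first order in the step size, but the Mirror Descent iterate stays on the positive orthant for any step size, whereas the Natural Gradient step can leave it — one sense in which the partial discretization is more faithful to the geometry.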



