Mirrorless Mirror Descent: A More Natural Discretization of Riemannian Gradient Flow

04/02/2020
by Suriya Gunasekar, et al. (Toyota Technological Institute at Chicago)

We present a direct (primal only) derivation of Mirror Descent as a "partial" discretization of gradient flow on a Riemannian manifold where the metric tensor is the Hessian of the Mirror Descent potential function. We argue that this discretization is more faithful to the geometry than Natural Gradient Descent, which is obtained by a "full" forward Euler discretization. This view helps shed light on the relationship between the methods and allows generalizing Mirror Descent to any Riemannian geometry, even when the metric tensor is not a Hessian, and thus there is no "dual."
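
To make the comparison concrete, here is one standard way to write the objects the abstract refers to; the notation (loss L, potential phi, step size eta) is ours, and this is a sketch of the usual formulations rather than a reproduction of the paper's derivation. The Riemannian gradient flow with metric tensor \(\nabla^2 \phi\) is

\[ \dot{w}(t) = -\big(\nabla^2 \phi(w(t))\big)^{-1} \nabla L(w(t)). \]

A full forward Euler step freezes the entire right-hand side at the current iterate \(w_k\), giving Natural Gradient Descent,

\[ w_{k+1} = w_k - \eta \,\big(\nabla^2 \phi(w_k)\big)^{-1} \nabla L(w_k), \]

whereas rewriting the flow via the chain rule as \(\tfrac{d}{dt}\nabla\phi(w(t)) = -\nabla L(w(t))\) and freezing only the driving term \(\nabla L\) at \(w_k\) integrates exactly over one step to the Mirror Descent update

\[ \nabla\phi(w_{k+1}) = \nabla\phi(w_k) - \eta \,\nabla L(w_k). \]

In this sense only part of the dynamics is discretized, while the geometry encoded by \(\nabla^2\phi\) is handled exactly.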


Related research

05/22/2018  Natural gradient in Wasserstein statistical manifold
We study the Wasserstein natural gradient in parametric statistical mode...

08/04/2019  Hopfield Neural Network Flow: A Geometric Viewpoint
We provide gradient flow interpretations for the continuous-time continu...

02/25/2023  Achieving High Accuracy with PINNs via Energy Natural Gradients
We propose energy natural gradient descent, a natural gradient method wi...

11/16/2021  Thoughts on the Consistency between Ricci Flow and Neural Network Behavior
The Ricci flow is a partial differential equation for evolving the metri...

04/24/2017  A Neural Network model with Bidirectional Whitening
We present here a new model and algorithm which performs an efficient Na...

05/18/2023  Deep Metric Tensor Regularized Policy Gradient
Policy gradient algorithms are an important family of deep reinforcement...

12/14/2022  Mechanics of geodesics in Information geometry
In this article we attempt to formulate Riemannian and Randers-Finsler m...