From Optimization to Sampling Through Gradient Flows

02/22/2023
by N. Garcia Trillos, et al.

This article overviews how gradient flows, and discretizations thereof, are useful to design and analyze optimization and sampling algorithms. The interplay between optimization, sampling, and gradient flows is an active research area; our goal is to provide an accessible and lively introduction to some core ideas, emphasizing that gradient flows uncover the conceptual unity behind many optimization and sampling algorithms, and that they give a rich mathematical framework for their rigorous analysis.
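
As a concrete illustration of the unity the abstract refers to, the sketch below contrasts gradient descent, a forward-Euler discretization of the gradient flow dx/dt = -grad f(x), with the unadjusted Langevin algorithm, a discretization of the Langevin diffusion whose law follows the Wasserstein gradient flow of KL(. || pi) for pi proportional to exp(-f). This is a minimal NumPy sketch based on these standard facts, not code from the article; the quadratic target, step size, and iteration counts are illustrative assumptions.

```python
# Minimal sketch (not from the article): an optimizer and a sampler obtained by
# discretizing gradient flows driven by the same potential f.
# Assumptions: toy target f(x) = 0.5 * x^T A x (so pi is N(0, inv(A))),
# step size h and iteration counts are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[3.0, 0.5], [0.5, 1.0]])   # SPD matrix defining the toy potential

def grad_f(x):
    # Gradient of f(x) = 0.5 * x^T A x; the target density is pi ∝ exp(-f)
    return A @ x

h = 0.1                                  # Euler step size
x_opt = np.array([5.0, -3.0])            # optimization iterate
x_smp = np.array([5.0, -3.0])            # sampling iterate
samples = []

for k in range(1000):
    # Optimization: forward-Euler step of the gradient flow dx/dt = -grad f(x).
    x_opt = x_opt - h * grad_f(x_opt)
    # Sampling: same drift plus Gaussian noise (unadjusted Langevin algorithm),
    # discretizing the diffusion whose law is the Wasserstein gradient flow
    # of KL(. || pi).
    x_smp = x_smp - h * grad_f(x_smp) + np.sqrt(2 * h) * rng.standard_normal(2)
    samples.append(x_smp.copy())

samples = np.array(samples[200:])        # crude burn-in
print("gradient descent iterate (minimizer of f is 0):", x_opt)
print("Langevin sample covariance (target is inv(A)):\n", np.cov(samples.T))
```

The two update rules share the same drift term; adding noise scaled to the step size turns the optimizer into a sampler, which is the kind of structural parallel the article develops and analyzes rigorously.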


Related research

08/25/2019 · Normalizing Flows: Introduction and Ideas
Normalizing Flows are generative models which produce tractable distribu...

08/09/2018 · Policy Optimization as Wasserstein Gradient Flows
Policy optimization is a core component of reinforcement learning (RL), ...

11/14/2020 · Self Normalizing Flows
Efficient gradient computation of the Jacobian determinant term is a cor...

01/10/2020 · An eikonal equation approach to thermodynamics and the gradient flows in information geometry
We can consider the equations of states in thermodynamics as the general...

10/06/2020 · Optimizing Deep Neural Networks via Discretization of Finite-Time Convergent Flows
In this paper, we investigate in the context of deep neural networks, th...

07/12/2023 · Embracing the chaos: analysis and diagnosis of numerical instability in variational flows
In this paper, we investigate the impact of numerical instability on the...
