Stochastic optimization on matrices and a graphon McKean-Vlasov limit

10/02/2022
by Zaid Harchaoui et al.

We consider stochastic gradient descent on suitable functions defined on the space of large symmetric matrices, namely functions that are invariant under permuting the rows and columns by the same permutation. We establish deterministic limits of these random curves as the dimension of the matrices goes to infinity while the entries remain bounded. Under a “small noise” assumption, the limit is shown to be the gradient flow of functions on graphons whose existence was established in arXiv:2111.09459. We also consider limits of stochastic gradient descent with added, properly scaled, reflected Brownian noise. The limiting curve of graphons is characterized by a family of stochastic differential equations with reflection and can be thought of as an extension of the classical McKean-Vlasov limit for interacting diffusions. The proofs introduce a family of infinite-dimensional exchangeable arrays of reflected diffusions and a novel notion of propagation of chaos for large matrices of interacting diffusions.
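To make the setup concrete, here is a minimal NumPy sketch of the kind of dynamics the abstract describes: gradient descent on a permutation-invariant function of a symmetric matrix with entries constrained to [0, 1], plus symmetric Gaussian noise and mirror reflection at the boundary. The objective (a triangle-density analogue, trace(A^3)/n^3), the n^2 gradient rescaling, and all function names are illustrative assumptions, not the paper's actual construction; the classical McKean-Vlasov analogue would be dX_t = b(X_t, Law(X_t)) dt + dB_t for a single coordinate.

```python
import numpy as np

def triangle_density(A):
    """Permutation-invariant objective t(A) = trace(A^3) / n^3."""
    n = A.shape[0]
    return np.trace(A @ A @ A) / n**3

def grad_triangle_density(A):
    """Entrywise gradient of trace(A^3)/n^3 for symmetric A: 3 A^2 / n^3."""
    n = A.shape[0]
    return 3.0 * (A @ A) / n**3

def reflect(A):
    """Mirror entries that leave [0, 1] back into the interval."""
    A = np.abs(A)                  # reflect at 0
    return 1.0 - np.abs(1.0 - A)   # reflect at 1

def noisy_matrix_sgd(n=30, steps=200, lr=0.05, sigma=0.05, seed=0):
    """Noisy gradient descent over symmetric matrices with entries in [0, 1]."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n, n))
    A = (A + A.T) / 2                       # symmetric start, entries in [0, 1]
    for _ in range(steps):
        Z = rng.normal(size=(n, n))
        Z = (Z + Z.T) / np.sqrt(2)          # symmetric Gaussian noise
        # n^2 rescales the entrywise gradient to the graphon (O(1)) scale
        step = -lr * n**2 * grad_triangle_density(A) + sigma * np.sqrt(lr) * Z
        A = reflect(A + step)
    return A
```

Because the objective, the noise, and the reflection all act symmetrically, the iterates remain symmetric matrices with bounded entries, which is the regime in which the graphon limit of the abstract is taken.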


Related research

08/18/2023 · Path convergence of Markov chains on large graphs
We consider two classes of natural stochastic processes on finite unlabe...

04/03/2023 · High-dimensional scaling limits and fluctuations of online least-squares SGD with smooth covariance
We derive high-dimensional scaling limits and fluctuations for the onlin...

01/28/2022 · The limiting spectral distribution of large dimensional general information-plus-noise type matrices
Let X_n be n × N random complex matrices, R_n and T_n be non-random compl...

02/03/2019 · Quantitative Central Limit Theorems for Discrete Stochastic Processes
In this paper, we establish a generalization of the classical Central Li...

06/09/2023 · Functional Central Limit Theorem for Two Timescale Stochastic Approximation
Two time scale stochastic approximation algorithms emulate singularly pe...

11/18/2021 · Gradient flows on graphons: existence, convergence, continuity equations
Wasserstein gradient flows on probability measures have found a host of ...

04/03/2019 · A Stochastic Interpretation of Stochastic Mirror Descent: Risk-Sensitive Optimality
Stochastic mirror descent (SMD) is a fairly new family of algorithms tha...
