Algorithmic stability is an important notion that has proven powerful fo...
Cyclic and randomized stepsizes are widely used in the deep learning pra...
Heavy-tail phenomena in stochastic gradient descent (SGD) have been repo...
We consider the constrained sampling problem where the goal is to sample...
Recent studies have shown that heavy tails can emerge in stochastic opti...
Recent theoretical studies have shown that heavy tails can emerge in sto...
Understanding generalization in deep learning has been one of the major ...
Recent studies have provided both empirical and theoretical evidence ill...
Gaussian noise injections (GNIs) are a family of simple and widely-used ...
Stochastic gradient Langevin dynamics (SGLD) and stochastic gradient Ham...
In recent years, various notions of capacity and complexity have been pr...
Stochastic gradient Langevin dynamics (SGLD) is a powerful algorithm for ...
Stochastic gradient descent with momentum (SGDm) is one of the most popu...
Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's acc...
Langevin dynamics (LD) has been proven to be a powerful technique for op...
Stochastic gradient Hamiltonian Monte Carlo (SGHMC) is a variant of stoc...