Tractable structured natural gradient descent using local parameterizations

02/15/2021
by Wu Lin, et al.

Natural-gradient descent on structured parameter spaces (e.g., low-rank covariances) is computationally challenging due to complicated inverse Fisher-matrix computations. We address this issue for optimization, inference, and search problems by using local-parameter coordinates. Our method generalizes an existing evolutionary-strategy method, recovers Newton and Riemannian-gradient methods as special cases, and also yields new tractable natural-gradient algorithms for learning flexible covariance structures of Gaussian and Wishart-based distributions. We show results on a range of applications in deep learning, variational inference, and evolution strategies. Our work opens a new direction for scalable structured geometric methods via local parameterizations.
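To make the computational bottleneck concrete, the sketch below (not the paper's method, and using illustrative names and values) runs plain natural-gradient descent, theta <- theta - beta * F^{-1} grad, on the mean of a Gaussian. For the mean parameter of N(mu, Sigma), the Fisher matrix is Sigma^{-1}, so the natural gradient reduces to Sigma @ grad; in general, forming and inverting F costs O(d^3) per step, which is the cost that local parameterizations aim to avoid for structured covariances.

```python
import numpy as np

# Illustrative sketch of natural-gradient descent (NGD), not the paper's
# algorithm. We optimize the mean mu of a Gaussian N(mu, Sigma) with a
# fixed, known covariance Sigma (an assumption made for simplicity).
rng = np.random.default_rng(0)
d = 5
Sigma = np.diag(rng.uniform(0.5, 2.0, size=d))  # fixed covariance (assumed)
target = np.ones(d)

def loss_grad(mu):
    # Gradient of the quadratic loss 0.5 * ||mu - target||^2.
    return mu - target

mu = np.zeros(d)
beta = 0.5
for _ in range(100):
    g = loss_grad(mu)
    # Natural gradient F^{-1} g: here F = inv(Sigma), so F^{-1} g = Sigma @ g.
    # For a general structured parameterization, F has no such closed form
    # and inverting it is the expensive step.
    mu = mu - beta * Sigma @ g

print(np.allclose(mu, target, atol=1e-3))  # True: mu converges to the target
```

The error contracts by the factor (I - beta * Sigma) each step, so with Sigma's diagonal in [0.5, 2.0] and beta = 0.5 the iterates converge geometrically; the point of the sketch is only that each NGD step needs the inverse Fisher, which is cheap here by construction but not in general.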


Related research

- Structured second-order methods via natural gradient descent (07/22/2021)
- Learning Preconditioners on Lie Groups (09/26/2018)
- SLANG: Fast Structured Covariance Approximations for Bayesian Deep Learning with Natural Gradient (11/11/2018)
- The Bayesian Learning Rule (07/09/2021)
- Stochastic Variational Optimization (09/13/2018)
- Bethe Projections for Non-Local Inference (03/04/2015)
- Handling the Positive-Definite Constraint in the Bayesian Learning Rule (02/24/2020)
