Faster Optimization on Sparse Graphs via Neural Reparametrization

05/26/2022
by Nima Dehmamy, et al.

In mathematical optimization, second-order Newton methods generally converge faster than first-order methods, but they require the inverse of the Hessian and are therefore computationally expensive. However, we find that on sparse graphs, graph neural networks (GNNs) can implement an efficient quasi-Newton method that speeds up optimization by a factor of 10-100x. Our method, neural reparametrization, expresses the optimization parameters as the output of a GNN in order to reshape the optimization landscape. Using a precomputed Hessian as the propagation rule, the GNN can effectively exploit second-order information, achieving an effect similar to adaptive gradient methods. Because our method addresses optimization through architecture design, it can be used in conjunction with any optimizer, such as Adam or RMSProp. We demonstrate the method on scientifically relevant problems including heat diffusion, synchronization, and persistent homology.
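
Below is a minimal, self-contained sketch (not the authors' code) of the reparametrization idea in PyTorch: instead of optimizing the parameters x directly, x is expressed as the output of a single graph-propagation layer whose mixing matrix is built from a precomputed Hessian, and an ordinary first-order optimizer such as Adam is applied to the layer's weights. The objective, graph size, and exact propagation rule used here are illustrative assumptions, not taken from the paper.

```python
import torch

n = 100
A = torch.randn(n, n)
H = A @ A.T / n + 0.1 * torch.eye(n)        # stand-in for a precomputed (PSD) Hessian
P = torch.linalg.inv(H)                     # illustrative propagation matrix (quasi-Newton flavor)

def loss_fn(x):
    # Placeholder quadratic objective; in the paper this would be e.g. heat diffusion.
    return 0.5 * x @ H @ x

class Reparam(torch.nn.Module):
    """x = P @ (z W): one linear graph-propagation layer with fixed mixing matrix P."""
    def __init__(self, n, d=16):
        super().__init__()
        self.z = torch.nn.Parameter(torch.randn(n, d))          # learnable node features
        self.W = torch.nn.Parameter(torch.randn(d, 1) / d**0.5) # learnable readout weights

    def forward(self):
        return (P @ (self.z @ self.W)).squeeze(-1)

model = Reparam(n)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)  # any first-order optimizer can be used
for step in range(500):
    opt.zero_grad()
    loss = loss_fn(model())
    loss.backward()
    opt.step()
```

Because the loss only sees x through the fixed matrix P, gradient steps on the learnable weights are effectively preconditioned by P; this is a rough illustration of how reparametrizing through a Hessian-based propagation rule can mimic a quasi-Newton update while still running a standard optimizer.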

