Computing the Newton-step faster than Hessian accumulation

08/02/2021
by Akshay Srinivasan, et al.

Computing the Newton-step of a generic function with N decision variables takes O(N^3) flops. In this paper, we show that, given the computational graph of the function, this bound can be reduced to O(mτ^3), where τ and m are the width and size of a tree-decomposition of the graph. The proposed algorithm generalizes LQR-based nonlinear optimal-control methods to general optimization problems and provides non-trivial gains in iteration complexity even when the Hessian is dense.
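
The following is a minimal sketch, not the paper's algorithm, of the structural idea the abstract refers to in its simplest case: when the tree-decomposition is a chain (as in optimal control / LQR), an objective of the form f(x) = sum_i f_i(x_i, x_{i+1}) with blocks of width τ has a block-tridiagonal Hessian, and the Newton system H dx = -g can be solved by a block elimination sweep in O(Nτ^3) flops rather than the O((Nτ)^3) of a dense factorization. All names, block sizes, and the chain structure here are illustrative assumptions.

```python
import numpy as np

def newton_step_block_tridiag(D, U, g):
    """Solve H dx = -g for a symmetric block-tridiagonal Hessian.

    D : list of N diagonal blocks, each (tau, tau)
    U : list of N-1 super-diagonal blocks, each (tau, tau); H[i, i+1] = U[i]
    g : list of N gradient blocks, each (tau,)
    Cost is O(N * tau^3) instead of O((N * tau)^3) for a dense solve.
    """
    N = len(D)
    S = [None] * N          # Schur complements from the forward sweep
    y = [None] * N          # eliminated right-hand side
    S[0], y[0] = D[0], -g[0]
    for i in range(1, N):
        W = U[i - 1].T @ np.linalg.inv(S[i - 1])
        S[i] = D[i] - W @ U[i - 1]
        y[i] = -g[i] - W @ y[i - 1]
    dx = [None] * N         # backward sweep recovers the Newton step
    dx[N - 1] = np.linalg.solve(S[N - 1], y[N - 1])
    for i in range(N - 2, -1, -1):
        dx[i] = np.linalg.solve(S[i], y[i] - U[i] @ dx[i + 1])
    return np.concatenate(dx)

# Tiny check against a dense solve on a synthetic block-tridiagonal Hessian.
rng = np.random.default_rng(0)
N, tau = 6, 3
D = [np.eye(tau) * 10 + rng.standard_normal((tau, tau)) for _ in range(N)]
D = [0.5 * (Di + Di.T) for Di in D]          # symmetrize diagonal blocks
U = [rng.standard_normal((tau, tau)) for _ in range(N - 1)]
g = [rng.standard_normal(tau) for _ in range(N)]

H = np.zeros((N * tau, N * tau))
for i in range(N):
    H[i*tau:(i+1)*tau, i*tau:(i+1)*tau] = D[i]
    if i + 1 < N:
        H[i*tau:(i+1)*tau, (i+1)*tau:(i+2)*tau] = U[i]
        H[(i+1)*tau:(i+2)*tau, i*tau:(i+1)*tau] = U[i].T

dx_dense = np.linalg.solve(H, -np.concatenate(g))
dx_sweep = newton_step_block_tridiag(D, U, g)
print(np.allclose(dx_dense, dx_sweep))  # True
```

The forward/backward sweep above is the same recursion that underlies Riccati/LQR solvers; the paper's contribution, per the abstract, is extending this kind of structure exploitation from chains to general tree-decompositions of the computational graph.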


Related research

- Structure-Exploiting Newton-Type Method for Optimal Control of Switched Systems (12/14/2021): This study proposes an efficient Newton-type method for the optimal cont...
- Exact and Inexact Subsampled Newton Methods for Optimization (09/27/2016): The paper studies the solution of stochastic optimization problems in wh...
- Nonlinear optimized Schwarz preconditioner for elliptic optimal control problems (04/01/2021): We introduce a domain decomposition-based nonlinear preconditioned itera...
- Trust-Region Algorithms for Training Responses: Machine Learning Methods Using Indefinite Hessian Approximations (07/01/2018): Machine learning (ML) problems are often posed as highly nonlinear and n...
- Provably Efficient Gauss-Newton Temporal Difference Learning Method with Function Approximation (02/25/2023): In this paper, based on the spirit of Fitted Q-Iteration (FQI), we propo...
- Eigen Space of Mesh Distortion Energy Hessian (03/15/2021): Mesh distortion optimization is a popular research topic and has wide ra...
- Efficient Second-Order Plane Adjustment (11/21/2022): Planes are generally used in 3D reconstruction for depth sensors, such a...
