Newton retraction as approximate geodesics on submanifolds

06/26/2020 · by Ruda Zhang, et al.

Efficient approximation of geodesics is crucial for practical algorithms on manifolds. Here we introduce a class of retractions on submanifolds, induced by a foliation of the ambient manifold. They match the projective retraction to the third order and thus match the exponential map to the second order. In particular, we show that Newton retraction (NR) is always more stable than the popular approach known as oblique projection or orthographic retraction: per Kantorovich-type convergence theorems, the superlinear convergence regions of NR include those of the latter. We also show that NR always has a lower computational cost. These favorable properties make NR useful for optimization, sampling, and many other statistical problems on manifolds.
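To make the comparison concrete, here is a minimal numerical sketch (not the authors' code; the function names, tolerances, and sphere example are our own) for a submanifold cut out by equality constraints, M = {x : c(x) = 0} with full-rank Jacobian Dc near M. Newton retraction corrects the stepped point x + v with full Newton updates, y <- y - Dc(y)^+ c(y), re-linearizing at every iterate; orthographic retraction (oblique projection) corrects only along the normal space fixed at the foot point x.

# A minimal sketch (not the paper's code) contrasting Newton retraction (NR)
# with orthographic retraction on M = {x : c(x) = 0}, assuming c is smooth
# with a full-rank Jacobian Dc near M.
import numpy as np

def newton_retraction(c, jac, x, v, tol=1e-12, max_iter=50):
    """Retract x + v onto {y : c(y) = 0} via y <- y - Dc(y)^+ c(y)."""
    y = x + v
    for _ in range(max_iter):
        cy = np.atleast_1d(c(y))
        if np.linalg.norm(cy) < tol:
            break
        J = np.atleast_2d(jac(y))               # k x n Jacobian Dc(y)
        y = y - np.linalg.pinv(J) @ cy          # minimal-norm Newton step
    return y

def orthographic_retraction(c, jac, x, v, tol=1e-12, max_iter=50):
    """Oblique projection: correct x + v only along the normal space at x."""
    N = np.atleast_2d(jac(x)).T                 # n x k basis of the normal space at x
    y = x + v
    for _ in range(max_iter):
        cy = np.atleast_1d(c(y))
        if np.linalg.norm(cy) < tol:
            break
        J = np.atleast_2d(jac(y))
        y = y - N @ np.linalg.solve(J @ N, cy)  # step restricted to range(N)
    return y

# Unit sphere S^2 = {x in R^3 : ||x||^2 - 1 = 0}
c   = lambda x: np.array([x @ x - 1.0])
jac = lambda x: 2.0 * x[None, :]

x = np.array([1.0, 0.0, 0.0])                   # point on the sphere
v = np.array([0.0, 0.5, 0.0])                   # tangent vector at x
print(newton_retraction(c, jac, x, v))          # [0.894.., 0.447.., 0]
print(orthographic_retraction(c, jac, x, v))    # [0.866.., 0.5,    0]

Both calls land back on the sphere here; the abstract's claim is that NR's superlinear convergence region contains that of the orthographic variant, so NR succeeds whenever the oblique projection does.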


Related research

02/27/2017 · A Unifying Framework for Convergence Analysis of Approximate Newton Methods
Many machine learning models are reformulated as optimization problems. ...

07/15/2021 · Newton-LESS: Sparsification without Trade-offs for the Sketched Newton Update
In second-order optimization, a potential bottleneck can be computing th...

04/06/2020 · Deep Neural Network Learning with Second-Order Optimizers – a Practical Study with a Stochastic Quasi-Gauss-Newton Method
Training in supervised deep learning is computationally demanding, and t...

03/23/2020 · Fast Alternating Projections on Manifolds Based on Tangent Spaces
In this paper, we study alternating projections on nontangential manifol...

06/27/2022 · Euclidean distance and maximum likelihood retractions by homotopy continuation
We define a new second-order retraction map for statistical models. We a...

03/23/2021 · The Newton Product of Polynomial Projectors. Part 2: approximation properties
We prove that the Newton product of efficient polynomial projectors is s...

08/26/2021 · Subgradient methods near active manifolds: saddle point avoidance, local convergence, and asymptotic normality
Nonsmooth optimization problems arising in practice tend to exhibit bene...
