Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization

02/17/2020
by   Quoc Tran-Dinh, et al.

We develop two new stochastic Gauss-Newton algorithms for solving a class of stochastic nonconvex compositional optimization problems that frequently arise in practice. We consider both the expectation and finite-sum settings under standard assumptions, and we use both classical stochastic and SARAH estimators to approximate function values and Jacobians. In the expectation case, we establish an O(ε^-2) iteration complexity for reaching a stationary point in expectation and estimate the total number of stochastic oracle calls for both function values and Jacobians, where ε is the desired accuracy. In the finite-sum case, we establish the same iteration complexity and bound the total number of oracle calls with high probability. To the best of our knowledge, this is the first time such global stochastic oracle complexity has been established for stochastic Gauss-Newton methods. We illustrate our theoretical results via numerical examples on both synthetic and real datasets.
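Since only the abstract is available here, the sketch below is illustrative rather than a reproduction of the paper's algorithms. It assumes the standard compositional form min_x φ(F(x)) with F(x) = (1/n) Σ_i F_i(x), fixes the simple outer function φ(u) = ½‖u‖², and takes regularized Gauss-Newton (prox-linear) steps using classical minibatch estimates of F and its Jacobian. The helper names F_and_jac and sgn_step, the affine inner maps F_i(x) = A_i x + b_i, and all dimensions are hypothetical choices made for the example; the SARAH-type estimators mentioned in the abstract are not shown.

```python
import numpy as np

# Minimal illustrative sketch (not the authors' exact method): a stochastic
# Gauss-Newton step for the compositional problem
#     min_x  phi(F(x)),   F(x) = (1/n) * sum_i F_i(x),
# with phi(u) = 0.5 * ||u||^2, so each step solves the regularized subproblem
#     min_d  0.5 * ||F_hat + J_hat d||^2 + 0.5 * rho * ||d||^2
# in closed form. F_hat and J_hat are classical minibatch estimates of the
# inner map and its Jacobian (hypothetical affine F_i(x) = A_i x + b_i).

rng = np.random.default_rng(0)
n, p, q = 1000, 20, 5                # samples, variable dim, inner-map dim
A = rng.standard_normal((n, q, p))
b = rng.standard_normal((n, q))

def F_and_jac(x, idx):
    """Minibatch estimates of F(x) and its Jacobian over indices idx."""
    F_hat = (A[idx] @ x + b[idx]).mean(axis=0)   # shape (q,)
    J_hat = A[idx].mean(axis=0)                  # shape (q, p)
    return F_hat, J_hat

def sgn_step(x, batch_size=64, rho=1.0):
    """One stochastic Gauss-Newton (prox-linear) step with closed-form solve."""
    idx = rng.choice(n, size=batch_size, replace=False)
    F_hat, J_hat = F_and_jac(x, idx)
    # For phi(u) = 0.5 ||u||^2 the direction solves (J^T J + rho I) d = -J^T F.
    d = np.linalg.solve(J_hat.T @ J_hat + rho * np.eye(p), -J_hat.T @ F_hat)
    return x + d

x = np.zeros(p)
for t in range(200):
    x = sgn_step(x)
print("phi(F(x)) =", 0.5 * np.linalg.norm((A @ x + b).mean(axis=0)) ** 2)
```

For a nonsmooth convex outer function (e.g., a norm), the same step would replace the closed-form solve with a prox-linear subproblem solver; the closed-form case above is used only to keep the sketch self-contained.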

Related research

07/05/2016  Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization
"In this paper we study stochastic quasi-Newton methods for nonconvex sto..."

07/08/2019  A Hybrid Stochastic Optimization Framework for Stochastic Composite Nonconvex Optimization
"In this paper, we introduce a new approach to develop stochastic optimiz..."

10/25/2022  An Optimal Stochastic Algorithm for Decentralized Nonconvex Finite-sum Optimization
"This paper studies the synchronized decentralized nonconvex optimization..."

06/19/2021  SAN: Stochastic Average Newton Algorithm for Minimizing Finite Sums
"We present a principled approach for designing stochastic Newton methods..."

01/10/2023  A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization
"In this paper we consider finding an approximate second-order stationary..."