Unified Convergence Theory of Stochastic and Variance-Reduced Cubic Newton Methods

02/23/2023
by El Mahdi Chayti, et al.

We study the well-known Cubic-Newton method in the stochastic setting and propose a general framework for applying variance reduction, which we call the helper framework. Previous work proposed these methods with very large batch sizes (for both gradients and Hessians) and under varied, often strong, assumptions. In this work, we investigate whether such methods can work without large batches, relying on a few simple assumptions that suffice for all our methods. We also study these methods applied to gradient-dominated functions. In the general case, we show improved convergence (compared to first-order methods) to an approximate local minimum; for gradient-dominated functions, we show convergence to approximate global minima.
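To make the setting concrete, below is a minimal sketch of a stochastic cubic Newton step. The names grad_fn, hess_fn, M, lr, and the iteration counts are illustrative assumptions, not taken from the paper; the cubic subproblem is solved approximately by gradient descent, and the paper's helper-based variance reduction is not reproduced here.

```python
import numpy as np

def solve_cubic_subproblem(g, H, M, n_iters=200, lr=1e-2):
    # Approximately minimize the cubic model
    #   m(s) = g^T s + 0.5 * s^T H s + (M / 6) * ||s||^3
    # by gradient descent; note grad m(s) = g + H s + (M / 2) * ||s|| * s.
    s = np.zeros_like(g)
    for _ in range(n_iters):
        grad_m = g + H @ s + 0.5 * M * np.linalg.norm(s) * s
        s = s - lr * grad_m
    return s

def stochastic_cubic_newton(x0, grad_fn, hess_fn, M=10.0, n_steps=50):
    # Basic stochastic cubic Newton loop: draw stochastic gradient and
    # Hessian estimates, solve the cubic model at the current iterate,
    # and take the resulting step.
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        g = grad_fn(x)   # stochastic gradient estimate
        H = hess_fn(x)   # stochastic Hessian estimate
        x = x + solve_cubic_subproblem(g, H, M)
    return x
```

In this sketch, grad_fn and hess_fn would be plain mini-batch estimates; in the paper's helper framework, variance reduction would instead build corrected estimates using an auxiliary "helper" point.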

Related research

01/19/2022 · Variance-Reduced Stochastic Quasi-Newton Methods for Decentralized Learning: Part I
In this work, we investigate stochastic quasi-Newton methods for minimiz...

11/08/2017 · Stochastic Cubic Regularization for Fast Nonconvex Optimization
This paper proposes a stochastic variant of a classic algorithm---the cu...

01/13/2020 · Rapid multi-component phase-split calculations using volume functions and reduction methods
We present a new family of fast and robust methods for the calculation o...

05/03/2022 · Convergence of Stochastic Approximation via Martingale and Converse Lyapunov Methods
This paper is dedicated to Prof. Eduardo Sontag on the occasion of his s...

05/29/2019 · Convergence of Distributed Stochastic Variance Reduced Methods without Sampling Extra Data
Stochastic variance reduced methods have gained a lot of interest recent...

08/21/2023 · A Homogenization Approach for Gradient-Dominated Stochastic Optimization
Gradient dominance property is a condition weaker than strong convexity,...

05/25/2022 · Stochastic Second-Order Methods Provably Beat SGD For Gradient-Dominated Functions
We study the performance of Stochastic Cubic Regularized Newton (SCRN) o...
