On Curvature-aided Incremental Aggregated Gradient Methods

by Hoi-To Wai et al.

This paper studies an acceleration technique for incremental aggregated gradient methods that exploits curvature information for solving strongly convex finite-sum optimization problems. Such problems arise in large-scale learning applications relevant to machine learning systems. The proposed methods utilize a novel curvature-aided gradient tracking technique that produces gradient estimates with the aid of Hessian information gathered during computation. We propose and analyze two curvature-aided methods. The first, the curvature-aided incremental aggregated gradient (CIAG) method, is developed from the standard gradient method and computes an ϵ-optimal solution using O( κ log( 1 / ϵ ) ) iterations for small ϵ; the second, the accelerated CIAG (A-CIAG) method, incorporates Nesterov's acceleration into CIAG and requires O( √(κ) log( 1 / ϵ ) ) iterations for small ϵ, where κ is the problem's condition number. Importantly, these asymptotic convergence rates match those of the full gradient and accelerated full gradient methods, respectively, and are independent of the number of component functions involved. The proposed methods are significantly faster than state-of-the-art methods, especially for large-scale problems with massive amounts of data. The source codes are available at https://github.com/hoitowai/ciag/
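The core idea of curvature-aided gradient tracking is to approximate the full gradient by a first-order Taylor expansion of each component gradient around the iterate at which that component was last visited, using the stored Hessian to correct for staleness. The sketch below is a minimal, simplified reading of that idea (cyclic component selection, a constant step size, and the ridge least-squares test problem are my assumptions for illustration, not the paper's experimental setup):

```python
import numpy as np

def ciag(grad, hess, theta0, n, step, iters):
    """Minimal sketch of a curvature-aided incremental aggregated gradient loop.

    Tracks the full-gradient estimate
        g_hat = sum_i [ grad_i(theta_i) + hess_i(theta_i) @ (theta - theta_i) ]
    where theta_i is the iterate at which component i was last refreshed.
    """
    d = theta0.size
    theta = theta0.copy()
    # Running aggregates over all components (zeros = "not yet visited").
    g_sum = np.zeros(d)        # sum_i grad f_i(theta_i)
    H_sum = np.zeros((d, d))   # sum_i hess f_i(theta_i)
    u_sum = np.zeros(d)        # sum_i hess f_i(theta_i) @ theta_i
    g_i = [np.zeros(d) for _ in range(n)]
    H_i = [np.zeros((d, d)) for _ in range(n)]
    u_i = [np.zeros(d) for _ in range(n)]
    for k in range(iters):
        i = k % n  # cyclic component selection (an assumption for this sketch)
        # Refresh component i's gradient and Hessian at the current iterate.
        gi, Hi = grad(i, theta), hess(i, theta)
        ui = Hi @ theta
        g_sum += gi - g_i[i]
        H_sum += Hi - H_i[i]
        u_sum += ui - u_i[i]
        g_i[i], H_i[i], u_i[i] = gi, Hi, ui
        # Curvature-aided estimate of the full gradient, then a gradient step.
        g_hat = g_sum + H_sum @ theta - u_sum
        theta = theta - step * g_hat
    return theta

# Demo on a tiny ridge-regularized least-squares finite sum (assumed setup):
# f_i(theta) = 0.5 * (a_i^T theta - b_i)^2 + (lam / (2n)) * ||theta||^2
rng = np.random.default_rng(0)
n, d, lam = 20, 5, 0.1
A, b = rng.normal(size=(n, d)), rng.normal(size=n)
grad = lambda i, t: A[i] * (A[i] @ t - b[i]) + lam * t / n
hess = lambda i, t: np.outer(A[i], A[i]) + lam * np.eye(d) / n
L = np.linalg.eigvalsh(A.T @ A + lam * np.eye(d)).max()  # smoothness constant
theta = ciag(grad, hess, np.zeros(d), n, step=1.0 / L, iters=2000)
theta_star = np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)
```

Because each f_i here is quadratic, the Taylor expansion is exact, so after one full pass the tracked estimate equals the true full gradient and the iterates match plain gradient descent; for general strongly convex components the estimate is only approximate, which is where the paper's analysis comes in.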




Curvature-aided Incremental Aggregated Gradient Method

We propose a new algorithm for finite sum optimization which we call the...

SUCAG: Stochastic Unbiased Curvature-aided Gradient Method for Distributed Optimization

We propose and analyze a new stochastic gradient method, which we call S...

General Proximal Incremental Aggregated Gradient Algorithms: Better and Novel Results under General Scheme

The incremental aggregated gradient algorithm is popular in network opti...

FLAG n' FLARE: Fast Linearly-Coupled Adaptive Gradient Methods

We consider first order gradient methods for effectively optimizing a co...

HAMSI: A Parallel Incremental Optimization Algorithm Using Quadratic Approximations for Solving Partially Separable Problems

We propose HAMSI (Hessian Approximated Multiple Subsets Iteration), whic...

Local Saddle Point Optimization: A Curvature Exploitation Approach

Gradient-based optimization methods are the most popular choice for find...

Hybrid Deterministic-Stochastic Methods for Data Fitting

Many structured data-fitting applications require the solution of an opt...

Code Repositories


Companion code for the paper "On Curvature-aided Incremental Aggregated Gradient Methods" (submitted).
