Hybrid Deterministic-Stochastic Methods for Data Fitting

04/13/2011
by Michael P. Friedlander et al.

Many structured data-fitting applications require the solution of an optimization problem involving a sum over a potentially large number of measurements. Incremental gradient algorithms offer inexpensive iterations by sampling a subset of the terms in the sum. These methods can make great progress initially, but often slow as they approach a solution. In contrast, full-gradient methods achieve steady convergence at the expense of evaluating the full objective and gradient on each iteration. We explore hybrid methods that exhibit the benefits of both approaches. Rate-of-convergence analysis shows that by controlling the sample size in an incremental gradient algorithm, it is possible to maintain the steady convergence rates of full-gradient methods. We detail a practical quasi-Newton implementation based on this approach. Numerical experiments illustrate its potential benefits.
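
As a rough illustration of the core idea (not the authors' actual algorithm), the sketch below implements a sampled-gradient method for a least-squares fit in which the sample size grows geometrically until the full gradient is used. The least-squares objective, the initial batch size, the growth factor, and the fixed step size are all assumptions chosen for the example; the paper's method additionally incorporates quasi-Newton curvature information and derives its sample-size control from a rate-of-convergence analysis.

```python
import numpy as np

def hybrid_gradient_descent(A, b, x0, growth=1.1, max_iter=200, seed=0):
    """Minimize f(x) = (1/2m) * ||A x - b||^2 by sampling a growing
    subset of the m residuals on each iteration.

    Early iterations are cheap and stochastic; the sample grows
    geometrically until it covers all m terms, at which point the
    method reduces to full-gradient descent.
    """
    rng = np.random.default_rng(seed)
    m, _ = A.shape
    x = x0.copy()
    batch = max(1, int(0.01 * m))                 # assumed initial sample size
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 / m)  # 1/L for this objective
    for _ in range(max_iter):
        idx = rng.choice(m, size=batch, replace=False)
        # averaged gradient over the sampled subset of terms
        g = A[idx].T @ (A[idx] @ x - b[idx]) / batch
        x -= step * g
        batch = min(m, int(np.ceil(growth * batch)))  # enlarge the sample
    return x

# usage: a small synthetic data-fitting problem
rng = np.random.default_rng(1)
A = rng.standard_normal((1000, 20))
x_true = rng.standard_normal(20)
b = A @ x_true + 0.01 * rng.standard_normal(1000)
x_hat = hybrid_gradient_descent(A, b, np.zeros(20))
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The geometric growth schedule here is only a stand-in for the paper's analysis-driven sample-size control; the point it illustrates is that once the sampling error shrinks fast enough, the iteration inherits the steady convergence behavior of a full-gradient method.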

Related research

- Sub-Sampled Newton Methods II: Local Convergence Rates (01/18/2016)
- Sharpened Lazy Incremental Quasi-Newton Method (05/26/2023)
- Exploiting Local Convergence of Quasi-Newton Methods Globally: Adaptive Sample Size Approach (06/10/2021)
- Faster One-Sample Stochastic Conditional Gradient Method for Composite Convex Minimization (02/26/2022)
- Curvature-aided Incremental Aggregated Gradient Method (10/24/2017)
- Retrospective Approximation for Smooth Stochastic Optimization (03/07/2021)
- On Curvature-aided Incremental Aggregated Gradient Methods (05/31/2018)
