Optimization with First-Order Surrogate Functions

05/14/2013
by Julien Mairal et al.

In this paper, we study optimization methods consisting of iteratively minimizing surrogates of an objective function. By proposing several algorithmic variants and simple convergence analyses, we make two main contributions. First, we provide a unified viewpoint for several first-order optimization techniques such as accelerated proximal gradient, block coordinate descent, or Frank-Wolfe algorithms. Second, we introduce a new incremental scheme that experimentally matches or outperforms state-of-the-art solvers for large-scale optimization problems typically arising in machine learning.
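As a minimal sketch of the surrogate-minimization idea (an assumed illustration, not the paper's exact scheme): for a function f with L-Lipschitz gradient, the quadratic first-order surrogate g(x) = f(x_k) + ∇f(x_k)ᵀ(x − x_k) + (L/2)‖x − x_k‖² majorizes f, and minimizing it at each iterate recovers the gradient-descent step. The helper name `minimize_with_surrogates` and the least-squares example are hypothetical choices for illustration.

```python
import numpy as np

def minimize_with_surrogates(grad_f, L, x0, iters=2000):
    """Iteratively minimize the quadratic first-order surrogate of f.

    The surrogate g(x) = f(x_k) + grad_f(x_k).(x - x_k) + (L/2)||x - x_k||^2
    majorizes f when grad_f is L-Lipschitz; its exact minimizer is the
    gradient step x_{k+1} = x_k - grad_f(x_k) / L.
    """
    x = x0.copy()
    for _ in range(iters):
        x = x - grad_f(x) / L  # minimizer of the current surrogate
    return x

# Example problem: least squares f(x) = 0.5 * ||A x - b||^2,
# whose gradient is A^T (A x - b) with Lipschitz constant lambda_max(A^T A).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
L = np.linalg.eigvalsh(A.T @ A).max()

x_star = minimize_with_surrogates(grad_f, L, np.zeros(5))
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)  # closed-form least-squares solution
```

Swapping in other first-order surrogates (a proximal term for composite objectives, a partial quadratic for block coordinate descent, a linear surrogate over a compact set for Frank-Wolfe) gives the algorithmic variants the abstract refers to.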


06/19/2013
Stochastic Majorization-Minimization Algorithms for Large-Scale Optimization
Majorization-minimization algorithms consist of iteratively minimizing a...

10/24/2015
Fast and Scalable Lasso via Stochastic Frank-Wolfe Methods with a Convergence Guarantee
Frank-Wolfe (FW) algorithms have been often proposed over the last few y...

12/15/2017
Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
We introduce a generic scheme for accelerating gradient-based optimizati...

01/25/2019
Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise
In this paper, we propose a unified view of gradient-based algorithms fo...

06/15/2016
Optimization Methods for Large-Scale Machine Learning
This paper provides a review and commentary on the past, present, and fu...

01/16/2018
Combinatorial Preconditioners for Proximal Algorithms on Graphs
We present a novel preconditioning technique for proximal optimization m...