Optimization with First-Order Surrogate Functions

05/14/2013
by Julien Mairal, et al.

In this paper, we study optimization methods consisting of iteratively minimizing surrogates of an objective function. By proposing several algorithmic variants and simple convergence analyses, we make two main contributions. First, we provide a unified viewpoint for several first-order optimization techniques such as accelerated proximal gradient, block coordinate descent, or Frank-Wolfe algorithms. Second, we introduce a new incremental scheme that experimentally matches or outperforms state-of-the-art solvers for large-scale optimization problems typically arising in machine learning.
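
As a minimal sketch of the surrogate-minimization idea described in the abstract (not the paper's own algorithms): for a lasso-type composite objective 0.5*||Ax - b||^2 + lambda*||x||_1, each iteration minimizes the standard Lipschitz quadratic surrogate of the smooth part plus the untouched l1 term, which reduces to a proximal gradient step. The function names (mm_l1, soft_threshold), the step-size choice, and the synthetic data below are illustrative assumptions.

```python
# Sketch only: majorization-minimization with the standard first-order surrogate
#   g_t(x) = f(x_t) + <grad f(x_t), x - x_t> + (L/2)||x - x_t||^2 + lambda*||x||_1,
# whose exact minimizer is a proximal gradient step. Objective and sizes are
# illustrative assumptions, not taken from the paper.
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (closed-form minimizer of the l1 part)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def mm_l1(A, b, lam, n_iters=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by iteratively minimizing
    a quadratic first-order surrogate built at the current iterate."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part's gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part at x
        # Minimizing the surrogate in closed form = one proximal gradient step.
        x = soft_threshold(x - grad / L, lam / L)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(100)
    x_hat = mm_l1(A, b, lam=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```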


Related research

06/19/2013
Stochastic Majorization-Minimization Algorithms for Large-Scale Optimization
Majorization-minimization algorithms consist of iteratively minimizing a...

10/24/2015
Fast and Scalable Lasso via Stochastic Frank-Wolfe Methods with a Convergence Guarantee
Frank-Wolfe (FW) algorithms have been often proposed over the last few y...

12/15/2017
Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
We introduce a generic scheme for accelerating gradient-based optimizati...

09/19/2023
A Novel Gradient Methodology with Economical Objective Function Evaluations for Data Science Applications
Gradient methods are experiencing a growth in methodological and theoret...

06/15/2016
Optimization Methods for Large-Scale Machine Learning
This paper provides a review and commentary on the past, present, and fu...

01/16/2018
Combinatorial Preconditioners for Proximal Algorithms on Graphs
We present a novel preconditioning technique for proximal optimization m...
