Convex Optimization: Algorithms and Complexity

05/20/2014
by Sébastien Bubeck et al.

This monograph presents the main complexity theorems in convex optimization and their corresponding algorithms. Starting from the fundamental theory of black-box optimization, the material progresses towards recent advances in structural optimization and stochastic optimization. Our presentation of black-box optimization, strongly influenced by Nesterov's seminal book and Nemirovski's lecture notes, includes the analysis of cutting plane methods, as well as (accelerated) gradient descent schemes. We also pay special attention to non-Euclidean settings (relevant algorithms include Frank-Wolfe, mirror descent, and dual averaging) and discuss their relevance in machine learning. We provide a gentle introduction to structural optimization with FISTA (to optimize a sum of a smooth and a simple non-smooth term), saddle-point mirror prox (Nemirovski's alternative to Nesterov's smoothing), and a concise description of interior point methods. In stochastic optimization we discuss stochastic gradient descent, mini-batches, random coordinate descent, and sublinear algorithms. We also briefly touch upon convex relaxation of combinatorial problems and the use of randomness to round solutions, as well as random-walk-based methods.
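As a flavor of the black-box setting discussed above, the sketch below (not taken from the monograph; problem data and function names are illustrative) runs plain gradient descent on a smooth convex quadratic with the classical step size 1/L, where L is the smoothness constant (here, the largest eigenvalue of A):

```python
import numpy as np

def gradient_descent(A, b, x0, iters=500):
    """Minimize f(x) = 0.5 x^T A x - b^T x by gradient descent.

    A is assumed symmetric positive definite, so f is smooth and
    strongly convex; step size 1/L is the textbook choice.
    """
    L = np.linalg.eigvalsh(A).max()  # smoothness constant of f
    x = x0.copy()
    for _ in range(iters):
        grad = A @ x - b             # gradient of f at x
        x = x - grad / L             # fixed step 1/L
    return x

# Illustrative 2-D instance: the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
x_hat = gradient_descent(A, b, np.zeros(2))
x_star = np.linalg.solve(A, b)       # exact minimizer for comparison
print(np.allclose(x_hat, x_star, atol=1e-6))
```

For this well-conditioned instance the iterates converge to the exact minimizer quickly; the monograph's accelerated and non-Euclidean schemes improve on this basic template.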
