Fast Stochastic Composite Minimization and an Accelerated Frank-Wolfe Algorithm under Parallelization

05/25/2022
by Benjamin Dubois-Taine, et al.

We consider the problem of minimizing the sum of two convex functions. One of these functions has Lipschitz-continuous gradients and can be accessed via stochastic oracles, whereas the other is "simple". We provide a Bregman-type algorithm with accelerated convergence in function values to a ball containing the minimum, whose radius depends on problem-dependent constants, including the variance of the stochastic oracle. We further show that this algorithmic setup naturally leads to a variant of Frank-Wolfe achieving acceleration under parallelization. More precisely, when minimizing a smooth convex function on a bounded domain, we show that one can achieve a primal-dual gap of ϵ (in expectation) in Õ(1/√(ϵ)) iterations by accessing only gradients of the original function and a linear maximization oracle, using O(1/√(ϵ)) computing units in parallel. We illustrate this fast convergence on synthetic numerical experiments.
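To make the parallel-oracle setup concrete, here is a minimal sketch of the classical Frank-Wolfe template the abstract builds on: minimizing a smooth convex quadratic over the probability simplex via a linear maximization/minimization oracle (LMO), with a batch of oracle calls dispatched in parallel. The quadratic objective, the simplex LMO, the ThreadPoolExecutor dispatch, and the names frank_wolfe, lmo_simplex, and num_workers are all illustrative assumptions; this is not the paper's accelerated algorithm, which batches O(1/√(ϵ)) oracle calls per iteration at distinct query points.

```python
# A minimal, illustrative Frank-Wolfe loop (not the paper's accelerated method).
# Objective: f(x) = 0.5 * ||A x - b||^2 over the probability simplex.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(20)

def grad(x):
    # Gradient of f(x) = 0.5 * ||A x - b||^2.
    return A.T @ (A @ x - b)

def lmo_simplex(g):
    # Linear minimization oracle over the simplex:
    # argmin_{s in simplex} <g, s> is the vertex e_i with i = argmin_i g_i.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

def frank_wolfe(num_iters=200, num_workers=4):
    x = np.full(A.shape[1], 1.0 / A.shape[1])  # start at the simplex center
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        for t in range(num_iters):
            g = grad(x)
            # Illustrative parallel batch of LMO calls. Here all calls see the
            # same gradient, so they agree; an accelerated scheme would query
            # the oracle at several distinct points in parallel.
            verts = list(pool.map(lmo_simplex, [g] * num_workers))
            s = np.mean(verts, axis=0)
            gap = g @ (x - s)  # Frank-Wolfe (primal-dual) gap
            if gap < 1e-6:
                break
            x = x + 2.0 / (t + 2) * (s - x)  # convex combination stays feasible
    return x, gap

x_star, final_gap = frank_wolfe()
print(f"final FW gap: {final_gap:.2e}")
```

Threads suffice here because the toy LMO is cheap; with an expensive oracle, the same pattern would more plausibly use separate processes or machines, which is the regime where the paper's parallel speedup is meaningful.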


Related research

04/03/2020
Dualize, Split, Randomize: Fast Nonsmooth Optimization Algorithms
We introduce a new primal-dual algorithm for minimizing the sum of three...

01/22/2019
Accelerated Linear Convergence of Stochastic Momentum Methods in Wasserstein Distances
Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's acc...

12/23/2009
Fast Alternating Linearization Methods for Minimizing the Sum of Two Convex Functions
We present in this paper first-order alternating linearization algorithm...

06/25/2018
A DCA-Like Algorithm and its Accelerated Version with Application in Data Visualization
In this paper, we present two variants of DCA (Difference of Convex funct...

11/17/2020
Simple Iterative Methods for Linear Optimization over Convex Sets
We give simple iterative methods for computing approximately optimal pri...

06/29/2019
Conjugate Gradients and Accelerated Methods Unified: The Approximate Duality Gap View
This note provides a novel, simple analysis of the method of conjugate g...

05/04/2021
Thinking Inside the Ball: Near-Optimal Minimization of the Maximal Loss
We characterize the complexity of minimizing max_i∈[N] f_i(x) for convex...
