SEGA: Variance Reduction via Gradient Sketching

09/09/2018
by Filip Hanzely, et al.

We propose SEGA (SkEtched GrAdient method), a randomized first-order optimization method which progressively, over its iterations, builds a variance-reduced estimate of the gradient from random linear measurements (sketches) of the gradient obtained from an oracle. In each iteration, SEGA updates the current estimate of the gradient through a sketch-and-project operation using the information provided by the latest sketch; this estimate is subsequently converted into an unbiased estimate of the true gradient through a random relaxation procedure, and the unbiased estimate is used to perform a gradient step. Unlike standard subspace descent methods, such as coordinate descent, SEGA can be used for optimization problems with a non-separable proximal term. We provide a general convergence analysis and prove linear convergence for strongly convex objectives. In the special case of coordinate sketches, SEGA can be enhanced with techniques such as importance sampling, minibatching, and acceleration, and its rate is, up to a small constant factor, identical to the best-known rate of coordinate descent.
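To make the coordinate-sketch special case concrete, here is a minimal Python sketch of the iteration just described, assuming a partial-derivative oracle for the smooth part f and a generic proximal operator for the regularizer. Each step samples a coordinate i uniformly, forms the unbiased estimate g = h + n·e_i(∇_i f(x) − h_i) by random relaxation, refreshes h_i via sketch-and-project, and takes a proximal gradient step. The function names, quadratic test problem, step size, and ℓ1 prox below are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch of SEGA with uniform single-coordinate sketches.
# Hypothetical helper names; not the authors' reference implementation.
import numpy as np

def sega_coordinate(grad_f, prox, x0, alpha, num_iters, seed=None):
    """grad_f : returns the full gradient of the smooth part f at x
                (only one coordinate is read per iteration, mimicking a
                 partial-derivative oracle)
       prox   : implements prox_{alpha * R} for the regularizer R
    """
    rng = np.random.default_rng(seed)
    n = x0.size
    x = x0.copy()
    h = np.zeros(n)                      # running gradient estimate h^k
    for _ in range(num_iters):
        i = rng.integers(n)              # uniform coordinate sketch e_i
        d_i = grad_f(x)[i]               # sketched measurement: i-th partial derivative
        # Random relaxation with p_i = 1/n gives the unbiased estimate
        # g = h + n * e_i * (d_i - h_i), so that E[g] = grad f(x).
        g = h.copy()
        g[i] += n * (d_i - h[i])
        # Sketch-and-project update of h: match the latest measurement.
        h[i] = d_i
        # Proximal gradient step using the unbiased estimate.
        x = prox(x - alpha * g, alpha)
    return x


# Illustrative use: f(x) = 0.5 ||Ax - b||^2 with an l1 proximal term
# (l1 happens to be separable; it is used here only to keep the prox simple).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((50, 10)), rng.standard_normal(50)
    grad_f = lambda x: A.T @ (A @ x - b)
    soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - 0.1 * t, 0.0)
    x = sega_coordinate(grad_f, soft, np.zeros(10), alpha=1e-3, num_iters=20000)
```

For uniform sampling, E[g] = h + (∇f(x) − h) = ∇f(x), which is the unbiasedness property the analysis relies on; importance sampling would replace the factor n by 1/p_i for the sampled coordinate.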
