Analysis and Implementation of an Asynchronous Optimization Algorithm for the Parameter Server

10/18/2016
by Arda Aytekin, et al.

This paper presents an asynchronous incremental aggregated gradient algorithm and its implementation in a parameter server framework for solving regularized optimization problems. The algorithm can handle both general convex (possibly non-smooth) regularizers and general convex constraints. When the empirical data loss is strongly convex, we establish a linear convergence rate, give explicit expressions for step-size choices that guarantee convergence to the optimum, and bound the associated convergence factors. These expressions depend explicitly on the degree of asynchrony and recover classical results under synchronous operation. Simulations and implementations on commercial compute clouds validate our findings.
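The core idea described in the abstract, a proximal incremental aggregated gradient update driven by possibly stale worker gradients, can be sketched in a serial simulation. The sketch below is an illustrative toy, not the paper's exact scheme: the ℓ1 regularizer, least-squares loss, cyclic update order, bounded-staleness model, and step-size choice are all assumptions made here for demonstration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (handles the non-smooth regularizer).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def piag_l1(A_blocks, b_blocks, lam, step, iters, max_delay, rng):
    """Toy proximal incremental aggregated gradient with simulated staleness.

    Minimizes (1/m) * sum_i 0.5*||A_i x - b_i||^2 + lam*||x||_1, where each
    "worker" i reports the gradient of its block evaluated at a stale iterate.
    """
    m = len(A_blocks)
    n = A_blocks[0].shape[1]
    x = np.zeros(n)
    history = [x.copy()]                      # past iterates, for stale reads
    grads = [A.T @ (A @ x - b) for A, b in zip(A_blocks, b_blocks)]
    agg = sum(grads)                          # running aggregated gradient
    for k in range(iters):
        i = k % m                             # cyclic worker updates (assumption)
        delay = rng.integers(0, min(max_delay, len(history)))
        x_stale = history[-1 - delay]         # bounded-staleness read
        g_new = A_blocks[i].T @ (A_blocks[i] @ x_stale - b_blocks[i])
        agg += g_new - grads[i]               # replace worker i's old gradient
        grads[i] = g_new
        # Prox step on the averaged objective with the (stale) aggregate.
        x = soft_threshold(x - step * agg / m, step * lam)
        history.append(x.copy())
    return x
```

In line with the abstract's message, the usable step size here shrinks with the staleness bound: the demo uses roughly `1 / (L * (max_delay + 1))`, where `L` is the Lipschitz constant of the averaged smooth loss.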


