Stochastic Distributed Optimization under Average Second-order Similarity: Algorithms and Analysis

04/15/2023
by Dachao Lin, et al.

We study finite-sum distributed optimization problems over n clients under the popular δ-similarity condition and μ-strong convexity. We propose two new algorithms, SVRS and AccSVRS, motivated by previous works. The non-accelerated SVRS method combines gradient sliding with variance reduction and achieves a communication complexity of Õ(n + √n·δ/μ), which is superior to existing non-accelerated algorithms. Applying the framework proposed in Katyusha X, we also build a directly accelerated, practical variant named AccSVRS whose Õ(n + n^(3/4)·√(δ/μ)) communication complexity is entirely smoothness-free and improves on existing algorithms in ill-conditioned cases. Furthermore, we prove a nearly matching lower bound that verifies the tightness of our AccSVRS bound.
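The abstract names the ingredients but not the updates, so here is a minimal, hypothetical Python sketch of how variance reduction pairs with cheap single-client inner steps in this setting. The oracle interface `clients`, the step size 1/(δ + μ), the inner-loop length n, and the plain gradient step standing in for the sliding subproblem are all illustrative assumptions, not the paper's actual SVRS procedure.

```python
import numpy as np

def svrs_sketch(clients, x0, mu, delta, n_rounds, rng=None):
    """Hypothetical sketch of an SVRS-style loop (not the authors' code).

    `clients` is a list of per-client gradient oracles g_i(x); the full
    gradient is their average. Each outer round takes one snapshot/full
    gradient (the communication-heavy step), then runs n cheap inner
    steps that each contact a single random client.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    n = len(clients)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_rounds):
        w = x.copy()                                # snapshot point
        full_grad = sum(g(w) for g in clients) / n  # one full-gradient round
        for _ in range(n):
            i = rng.integers(n)
            # Variance-reduced estimate: g_i(x) - g_i(w) + (1/n) Σ_j g_j(w).
            # Under δ-similarity its variance roughly scales with δ·‖x - w‖
            # rather than with a global smoothness constant.
            v = clients[i](x) - clients[i](w) + full_grad
            # Stand-in for the gradient-sliding subproblem: a single step
            # with the assumed step size 1/(δ + μ); SVRS instead solves a
            # small proximal subproblem locally.
            x = x - v / (delta + mu)
    return x

# Toy usage: two 1-D quadratic clients whose gradients differ by a constant,
# so their Hessians agree exactly (a zero-dissimilarity example).
clients = [lambda x: x - 1.0, lambda x: x + 1.0]  # grads of (x-1)^2/2, (x+1)^2/2
print(svrs_sketch(clients, np.array([5.0]), mu=1.0, delta=1.0, n_rounds=20))
```

The point of this structure is communication: only the snapshot line touches all n clients, while each inner step contacts one client, which is how a count like Õ(n + √n·δ/μ) separates the n term from the δ/μ term.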


Related research

09/06/2022 · Faster federated optimization under second-order similarity
Federated learning (FL) is a subfield of machine learning where multiple...

02/09/2022 · Sharper Rates for Separable Minimax and Finite Sum Optimization via Primal-Dual Extragradient Methods
We design accelerated algorithms with improved rates for several fundame...

09/30/2021 · Accelerating Perturbed Stochastic Iterates in Asynchronous Lock-Free Optimization
We show that stochastic acceleration can be achieved under the perturbed...

05/29/2019 · A unified variance-reduced accelerated gradient method for convex optimization
We propose a novel randomized incremental gradient algorithm, namely, VA...

10/28/2022 · GradSkip: Communication-Accelerated Local Gradient Methods with Better Computational Complexity
In this work, we study distributed optimization algorithms that reduce t...

05/20/2018 · Communication-Efficient Projection-Free Algorithm for Distributed Optimization
Distributed optimization has gained a surge of interest in recent years....

09/18/2021 · An Accelerated Variance-Reduced Conditional Gradient Sliding Algorithm for First-order and Zeroth-order Optimization
The conditional gradient algorithm (also known as the Frank-Wolfe algori...
