Q-SHED: Distributed Optimization at the Edge via Hessian Eigenvectors Quantization

05/18/2023
by Nicolò Dal Fabbro, et al.

Edge networks call for communication-efficient (low-overhead) and robust distributed optimization (DO) algorithms. These are desirable qualities for DO frameworks, such as federated edge learning techniques, in the presence of data and system heterogeneity, and in scenarios where inter-node communication is the main bottleneck. Although computationally demanding, Newton-type (NT) methods have recently been advocated as enablers of robust convergence rates in challenging DO problems where edge devices have sufficient computational power. Along these lines, in this work we propose Q-SHED, an original NT algorithm for DO featuring a novel bit-allocation scheme based on incremental Hessian eigenvector quantization. The proposed technique is integrated with the recent SHED algorithm, from which it inherits appealing features such as the small number of required Hessian computations, while being bandwidth-versatile at a bit-resolution level. Our empirical evaluation against competing approaches shows that Q-SHED can reduce the number of communication rounds required for convergence by up to 60%.
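
To make the bit-allocation idea concrete, here is a minimal Python sketch of one plausible reading: devices share their top-k local Hessian eigenpairs, and each eigenvector is quantized at a resolution drawn from a shared bit budget, with more bits spent on directions with larger eigenvalues. This is an assumption-laden toy, not Q-SHED's actual scheme: the helpers allocate_bits and quantize_vector are hypothetical, and the proportional-to-eigenvalue heuristic is our own illustrative choice.

```python
import numpy as np

def top_k_eigenpairs(H, k):
    """Return the k largest eigenvalues and eigenvectors of a symmetric H."""
    eigvals, eigvecs = np.linalg.eigh(H)          # eigh returns ascending order
    idx = np.argsort(eigvals)[::-1][:k]
    return eigvals[idx], eigvecs[:, idx]

def allocate_bits(eigvals, total_bits, b_min=1, b_max=8):
    """Heuristic: spend more bits on eigenvectors with larger eigenvalues."""
    weights = np.abs(eigvals) / np.abs(eigvals).sum()
    return np.clip(np.round(weights * total_bits), b_min, b_max).astype(int)

def quantize_vector(v, n_bits):
    """Uniform scalar quantization of a vector to 2**n_bits levels."""
    levels = 2 ** n_bits - 1
    lo, hi = v.min(), v.max()
    step = (hi - lo) / levels if hi > lo else 1.0
    return np.round((v - lo) / step) * step + lo  # dequantized reconstruction

# Toy example: approximate a 10x10 Hessian from its top-3 quantized eigenpairs.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 10))
H = A @ A.T / 10                                  # symmetric PSD stand-in
vals, vecs = top_k_eigenpairs(H, k=3)
bits = allocate_bits(vals, total_bits=12)
H_approx = np.zeros_like(H)
for i in range(3):
    q = quantize_vector(vecs[:, i], bits[i])      # what a device would transmit
    H_approx += vals[i] * np.outer(q, q)
print("bits per eigenvector:", bits)
print("reconstruction error:", np.linalg.norm(H - H_approx))
```

Varying total_bits trades communication overhead against Hessian-approximation error, which is what the abstract means by being bandwidth-versatile at a bit-resolution level.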

