A Newton-type algorithm for federated learning based on incremental Hessian eigenvector sharing

02/11/2022
by Nicolò Dal Fabbro et al.

There is a growing interest in the decentralized optimization framework that goes under the name of Federated Learning (FL). In particular, much attention is being turned to FL scenarios where the network is strongly heterogeneous in terms of communication resources (e.g., bandwidth) and data distribution. In these cases, communication between local machines (agents) and the central server (Master) is a main concern. In this work, we present an original communication-constrained Newton-type (NT) algorithm designed to accelerate FL in such heterogeneous scenarios. The algorithm is by design robust to non-i.i.d. data distributions, handles heterogeneity of agents' communication resources (CRs), requires only sporadic Hessian computations, and achieves super-linear convergence. This is possible thanks to an incremental strategy, based on a singular value decomposition (SVD) of the local Hessian matrices, which exploits (possibly) outdated second-order information. The proposed solution is thoroughly validated on real datasets by assessing (i) the number of communication rounds required for convergence, (ii) the overall amount of data transmitted, and (iii) the number of local Hessian computations required. On all these metrics, the proposed approach outperforms state-of-the-art techniques such as GIANT and FedNL.
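
As a concrete illustration of the mechanism described in the abstract, the following is a minimal NumPy sketch of incremental Hessian eigenvector sharing. It is not the authors' exact algorithm, just a toy under stated assumptions: each agent eigendecomposes its local Hessian once, shares a few eigenvalue/eigenvector pairs per round according to a heterogeneous per-agent budget, and the master accumulates a running Hessian estimate from which it takes regularized Newton steps. All names and values (budgets, rho, the synthetic problem sizes) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_agents = 20, 4
budgets = [1, 1, 2, 3]   # heterogeneous per-round communication budgets (assumed)

# Synthetic symmetric PSD local Hessians and a global gradient.
local_H = []
for _ in range(n_agents):
    A = rng.standard_normal((d, d))
    local_H.append(A @ A.T / d)
grad = rng.standard_normal(d)
H_avg = sum(local_H) / n_agents          # what the master ideally wants

# Each agent eigendecomposes its local Hessian once (sporadic computation).
eigs = []
for H in local_H:
    vals, vecs = np.linalg.eigh(H)       # eigenvalues in ascending order
    order = np.argsort(vals)[::-1]       # share the strongest directions first
    eigs.append((vals[order], vecs[:, order]))

sent = [0] * n_agents                    # eigenvectors shared so far, per agent
H_hat = np.zeros((d, d))                 # master's running Hessian approximation
rho = 1e-3                               # regularizer for the early, low-rank rounds

for t in range(d):                       # communication rounds
    for i, (vals, vecs) in enumerate(eigs):
        hi = min(sent[i] + budgets[i], d)
        for j in range(sent[i], hi):     # incremental eigenvector sharing
            v = vecs[:, j]
            H_hat += vals[j] * np.outer(v, v) / n_agents
        sent[i] = hi
    # Newton direction from the (possibly partial) Hessian approximation.
    step = np.linalg.solve(H_hat + rho * np.eye(d), grad)
    print(f"round {t:2d}: ||H_hat - H_avg||_F = "
          f"{np.linalg.norm(H_hat - H_avg):.4f}, ||step|| = {np.linalg.norm(step):.3f}")
```

As rounds proceed, H_hat converges to the exact average Hessian and the Newton direction stabilizes, mirroring the incremental, budget-aware sharing idea; the paper additionally handles outdated second-order information, which this toy omits.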

Related research

12/22/2021
FedLGA: Towards System-Heterogeneity of Federated Learning via Local Gradient Approximation
Federated Learning (FL) is a decentralized machine learning architecture...

05/18/2023
Q-SHED: Distributed Optimization at the Edge via Hessian Eigenvectors Quantization
Edge networks call for communication-efficient (low overhead) and robust...

09/06/2021
On Second-order Optimization Methods for Federated Learning
We consider federated learning (FL), where the training data is distribu...

06/11/2023
FedDec: Peer-to-peer Aided Federated Learning
Federated learning (FL) has enabled training machine learning models exp...

06/05/2021
FedNL: Making Newton-Type Methods Applicable to Federated Learning
Inspired by recent work of Islamov et al. (2021), we propose a family of ...

05/24/2022
Federated singular value decomposition for high dimensional data
Federated learning (FL) is emerging as a privacy-aware alternative to cl...

08/17/2022
NET-FLEET: Achieving Linear Convergence Speedup for Fully Decentralized Federated Learning with Heterogeneous Data
Federated learning (FL) has received a surge of interest in recent years...
