Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning

11/02/2021
by Xun Qian, et al.

Recent advances in distributed optimization have shown that Newton-type methods with proper communication compression mechanisms can guarantee fast local rates and low communication cost compared to first-order methods. We discover that the communication cost of these methods can be further reduced, sometimes dramatically so, with a surprisingly simple trick: Basis Learn (BL). The idea is to transform the usual representation of the local Hessians via a change of basis in the space of matrices and to apply compression tools to the new representation. To demonstrate the potential of using custom bases, we design a new Newton-type method (BL1), which reduces communication cost via both the BL technique and a bidirectional compression mechanism. Furthermore, we present two alternative extensions (BL2 and BL3) that support partial participation, to accommodate federated learning applications. We prove local linear and superlinear rates independent of the condition number. Finally, we support our claims with numerical experiments comparing several first- and second-order methods.
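To make the change-of-basis idea concrete, here is a minimal sketch, not the authors' implementation: a device re-expresses its local Hessian in an alternative orthonormal basis of the matrix space, applies a standard Top-K compressor to the resulting coefficients, and the server inverts the change of basis to reconstruct the matrix. The 2D discrete cosine transform below is only a stand-in for a custom basis, and Top-K is an assumed compressor; all function names are illustrative.

```python
# Minimal sketch of the Basis Learn (BL) idea (illustrative, not the
# authors' code): compress the local Hessian in a custom matrix basis
# instead of the standard coordinate (entrywise) basis.
import numpy as np
from scipy.fft import dctn, idctn  # 2D DCT stands in for a custom basis


def top_k(x: np.ndarray, k: int) -> np.ndarray:
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x).ravel(), -k)[-k:]
    out.ravel()[idx] = x.ravel()[idx]
    return out


def compress_hessian(H: np.ndarray, k: int) -> np.ndarray:
    """Device side: change basis, then compress the new representation."""
    coeffs = dctn(H, norm="ortho")  # coefficients of H in the new basis
    return top_k(coeffs, k)         # only k scalars are communicated


def reconstruct_hessian(coeffs: np.ndarray) -> np.ndarray:
    """Server side: invert the change of basis."""
    return idctn(coeffs, norm="ortho")


# Toy demo with a smooth symmetric matrix, which is approximately sparse
# in the DCT basis (a generic random Hessian would not compress this well).
d = 50
i = np.arange(d)
H = np.exp(-np.abs(i[:, None] - i[None, :]) / d)
k = d * d // 10  # communicate ~10% of the entries
H_hat = reconstruct_hessian(compress_hessian(H, k))
print("relative error:", np.linalg.norm(H - H_hat) / np.linalg.norm(H))
```

The point of the sketch is that, for Hessians whose structure is aligned with the chosen basis, the same communication budget of k scalars can yield a much smaller reconstruction error than applying the same compressor to raw matrix entries.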


Related Research

Distributed Newton-Type Methods with Communication Compression and Bernoulli Aggregation (06/07/2022)
Despite their high computation and communication costs, Newton-type meth...

FedNL: Making Newton-Type Methods Applicable to Federated Learning (06/05/2021)
Inspired by recent work of Islamov et al (2021), we propose a family of ...

FLECS-CGD: A Federated Learning Second-Order Framework via Compression and Sketching with Compressed Gradient Differences (10/18/2022)
In the recent paper FLECS (Agafonov et al, FLECS: A Federated Learning S...

On Second-order Optimization Methods for Federated Learning (09/06/2021)
We consider federated learning (FL), where the training data is distribu...

Distributed Second Order Methods with Fast Rates and Compressed Communication (02/14/2021)
We develop several new communication-efficient second-order methods for ...

Bias-Variance Reduced Local SGD for Less Heterogeneous Federated Learning (02/05/2021)
Federated learning is one of the important learning scenarios in distrib...

Uncertainty Principle for Communication Compression in Distributed and Federated Learning and the Search for an Optimal Compressor (02/20/2020)
In order to mitigate the high communication cost in distributed and fede...
