FedNL: Making Newton-Type Methods Applicable to Federated Learning

06/05/2021
by Mher Safaryan, et al.

Inspired by recent work of Islamov et al. (2021), we propose a family of Federated Newton Learn (FedNL) methods, which we believe is a marked step toward making second-order methods applicable to federated learning (FL). In contrast to the aforementioned work, FedNL employs a different Hessian learning technique which i) enhances privacy, as it does not require the training data to be revealed to the coordinating server, ii) makes the method applicable beyond generalized linear models, and iii) provably works with general contractive compression operators for compressing the local Hessians, such as Top-K or Rank-R, which are vastly superior in practice. Notably, we do not need to rely on error feedback for our methods to work with contractive compressors. Moreover, we develop FedNL-PP, FedNL-CR and FedNL-LS, variants of FedNL that support partial participation, globalization via cubic regularization, and globalization via line search, respectively, as well as FedNL-BC, a variant that can further benefit from bidirectional compression of gradients and models, i.e., smart uplink gradient compression and smart downlink model compression. We prove local convergence rates that are independent of the condition number, the number of training data points, and the compression variance. Our communication-efficient Hessian learning technique provably learns the Hessian at the optimum. Finally, we perform a variety of numerical experiments showing that our FedNL methods achieve state-of-the-art communication complexity compared to key baselines.
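
To make the Hessian learning idea concrete, below is a minimal NumPy sketch assuming an update of the form H <- H + alpha * C(local_hessian - H), where C is a Top-K contractive compressor applied to the Hessian residual. The function names, the step size alpha = 1, and the fixed-target toy loop are illustrative assumptions for this sketch, not the paper's implementation.

```python
import numpy as np

def top_k(M, k):
    # Top-K contractive compressor: keep the k largest-magnitude entries
    # of M and zero out the rest. Satisfies the contraction property
    # ||C(M) - M||_F^2 <= (1 - k / M.size) * ||M||_F^2.
    flat = M.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(M.shape)

def hessian_learning_step(H, local_hessian, alpha, k):
    # A device compresses the residual between its true local Hessian and
    # its current estimate H; device and server both apply
    # H <- H + alpha * C(local_hessian - H), so only the compressed
    # residual travels over the network, never the training data.
    return H + alpha * top_k(local_hessian - H, k)

# Toy usage (hypothetical): the estimate H learns a fixed symmetric matrix
# standing in for the local Hessian at the optimum, from compressed
# residuals only.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 10))
target = (A + A.T) / 2
H = np.zeros_like(target)
for _ in range(100):
    H = hessian_learning_step(H, target, alpha=1.0, k=5)
print(np.linalg.norm(H - target))  # ~0: the estimate has learned the Hessian
```

Because Top-K transmits the selected entries exactly, a step size of 1 suffices for this fixed-target toy; with a general contractive compressor the step size would need to be chosen according to the compressor's contraction parameter.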



Related research

06/07/2022 · Distributed Newton-Type Methods with Communication Compression and Bernoulli Aggregation
Despite their high computation and communication costs, Newton-type meth...

11/02/2021 · Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning
Recent advances in distributed optimization have shown that Newton-type ...

02/14/2021 · Distributed Second Order Methods with Fast Rates and Compressed Communication
We develop several new communication-efficient second-order methods for ...

02/11/2022 · A Newton-type algorithm for federated learning based on incremental Hessian eigenvector sharing
There is a growing interest in the decentralized optimization framework ...

06/05/2023 · Improving Accelerated Federated Learning with Compression and Importance Sampling
Federated Learning is a collaborative training framework that leverages ...

06/17/2022 · FedNew: A Communication-Efficient and Privacy-Preserving Newton-Type Method for Federated Learning
Newton-type methods are popular in federated learning due to their fast ...

06/01/2022 · Optimization with access to auxiliary information
We investigate the fundamental optimization question of minimizing a tar...
