FedNew: A Communication-Efficient and Privacy-Preserving Newton-Type Method for Federated Learning

06/17/2022
by Anis Elgabli, et al.

Newton-type methods are popular in federated learning due to their fast convergence. Still, they suffer from two main issues: low communication efficiency and weak privacy, both stemming from the need to send Hessian information from clients to the parameter server (PS). In this work, we introduce a novel framework called FedNew in which no Hessian information is transmitted from clients to the PS, thereby resolving this bottleneck and improving communication efficiency. In addition, FedNew hides the gradient information, yielding a privacy-preserving approach compared with the existing state of the art. The core idea in FedNew is a two-level framework that alternates between updating the inverse Hessian-gradient product using only one alternating direction method of multipliers (ADMM) step and performing the global model update using Newton's method. Although only a single ADMM pass is used to approximate the inverse Hessian-gradient product at each iteration, we develop a novel theoretical approach to show the convergence behavior of FedNew for convex problems. A further significant reduction in communication overhead is achieved by utilizing stochastic quantization. Numerical results on real datasets show the superiority of FedNew over existing methods in terms of communication cost.
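
To make the two-level idea concrete, below is a minimal Python sketch of one such round. It is not the paper's exact algorithm: the helper names (stochastic_quantize, local_admm_step, fednew_round), the uniform quantizer, the penalty parameter rho, and the unit step size are illustrative assumptions. The sketch only shows how a single ADMM pass per client can refine an estimate of the inverse Hessian-gradient product, so that a quantized direction, rather than the Hessian or the raw gradient, is what travels to the PS.

```python
import numpy as np

def stochastic_quantize(v, levels=16):
    """Unbiased per-coordinate stochastic quantization onto `levels` uniform
    levels between min(v) and max(v) (illustrative; the paper's scheme may differ)."""
    vmin, vmax = float(v.min()), float(v.max())
    if vmax == vmin:
        return v.copy()
    step = (vmax - vmin) / (levels - 1)
    scaled = (v - vmin) / step
    lower = np.floor(scaled)
    # Round up with probability equal to the fractional part, so the quantizer is unbiased.
    return vmin + step * (lower + (np.random.rand(*v.shape) < (scaled - lower)))

def local_admm_step(H_i, g_i, lam_i, d_bar, rho=1.0):
    """One ADMM pass on client i: refine a local estimate of the global
    Newton direction without revealing H_i or g_i."""
    n = g_i.shape[0]
    # Primal step: argmin_d 0.5 d'H_i d - g_i'd + lam_i'(d - d_bar) + (rho/2)||d - d_bar||^2.
    d_i = np.linalg.solve(H_i + rho * np.eye(n), g_i - lam_i + rho * d_bar)
    # Dual step: push d_i toward the server-side consensus variable d_bar.
    lam_i = lam_i + rho * (d_i - d_bar)
    return d_i, lam_i

def fednew_round(x, clients, state, rho=1.0, step_size=1.0):
    """One round of the two-level scheme: a single ADMM pass per client to
    approximate the inverse Hessian-gradient product, then a Newton-type
    update of the global model x on the PS side."""
    directions = []
    for i, (grad_fn, hess_fn) in enumerate(clients):
        g_i, H_i = grad_fn(x), hess_fn(x)
        d_i, state["lam"][i] = local_admm_step(H_i, g_i, state["lam"][i],
                                               state["d_bar"], rho)
        # Only a quantized direction estimate leaves the client.
        directions.append(stochastic_quantize(d_i))
    state["d_bar"] = np.mean(directions, axis=0)   # PS aggregates the directions
    return x - step_size * state["d_bar"], state   # Newton-type global update

# Usage sketch on two synthetic quadratic clients (loss 0.5 x'A x - b'x each):
rng = np.random.default_rng(0)
n = 5
problems = [(np.eye(n) + M @ M.T, rng.standard_normal(n))
            for M in (rng.standard_normal((n, n)) for _ in range(2))]
clients = [(lambda x, A=A, b=b: A @ x - b, lambda x, A=A: A) for A, b in problems]
x = np.zeros(n)
state = {"lam": [np.zeros(n) for _ in clients], "d_bar": np.zeros(n)}
for _ in range(100):
    x, state = fednew_round(x, clients, state)
```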

Related research

09/26/2021  AsySQN: Faster Vertical Federated Learning Algorithms with Better Computation Resource Utilization
Vertical federated learning (VFL) is an effective paradigm of training t...

11/25/2020  Distributed Additive Encryption and Quantization for Privacy Preserving Federated Deep Learning
Homomorphic encryption is a very useful gradient protection technique us...

04/22/2022  Federated Learning via Inexact ADMM
One of the crucial issues in federated learning is how to develop effici...

09/12/2022  Communication-Efficient and Privacy-Preserving Feature-based Federated Transfer Learning
Federated learning has attracted growing interest as it preserves the cl...

10/03/2020  Secant Penalized BFGS: A Noise Robust Quasi-Newton Method Via Penalizing The Secant Condition
In this paper, we introduce a new variant of the BFGS method designed to...

04/06/2022  A Hessian inversion-free exact second order method for distributed consensus optimization
We consider a standard distributed consensus optimization problem where ...

06/05/2021  FedNL: Making Newton-Type Methods Applicable to Federated Learning
Inspired by recent work of Islamov et al (2021), we propose a family of ...
