AsySQN: Faster Vertical Federated Learning Algorithms with Better Computation Resource Utilization

09/26/2021
by   Qingsong Zhang, et al.
Vertical federated learning (VFL) is an effective paradigm for privacy-preserving collaborative learning across organizations (e.g., different corporations, companies, and institutions). Stochastic gradient descent (SGD) methods are a popular choice for training VFL models because of their low per-iteration computation. However, existing SGD-based VFL algorithms are communication-expensive due to the large number of communication rounds they require. Meanwhile, most existing VFL algorithms use synchronous computation, which seriously hampers computation resource utilization in real-world applications. To address these challenges of communication and computation resource utilization, we propose an asynchronous stochastic quasi-Newton (AsySQN) framework for VFL, under which three algorithms, i.e., AsySQN-SGD, -SVRG, and -SAGA, are proposed. The proposed AsySQN-type algorithms take descent steps scaled by approximate Hessian information (without calculating the inverse Hessian matrix explicitly), and thus converge much faster than SGD-based methods in practice, dramatically reducing the number of communication rounds. Moreover, the adopted asynchronous computation makes better use of computation resources. We theoretically prove the convergence rates of our proposed algorithms for strongly convex problems. Extensive numerical experiments on real-world datasets demonstrate the lower communication cost and better computation resource utilization of our algorithms compared with state-of-the-art VFL algorithms.
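The kind of quasi-Newton scaling the abstract describes — applying approximate inverse-Hessian information to a stochastic gradient without ever forming or inverting a Hessian matrix — can be illustrated with the classic L-BFGS two-loop recursion. This is a standard technique sketched here for illustration, not the paper's exact AsySQN update:

```python
import numpy as np

def two_loop_direction(grad, s_list, y_list):
    """Scale a (stochastic) gradient by an approximate inverse Hessian
    built from curvature pairs, without forming any matrix.

    s_list[i] = x_{k+1} - x_k (iterate differences)
    y_list[i] = g_{k+1} - g_k (gradient differences)
    Returns approximately H^{-1} @ grad.
    """
    q = grad.copy()
    alphas = []
    # First loop: walk the history from newest to oldest pair.
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / y.dot(s)
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    # Scale by an initial diagonal Hessian approximation gamma * I.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= s.dot(y) / y.dot(y)
    # Second loop: walk the history from oldest to newest pair.
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / y.dot(s)
        b = rho * y.dot(q)
        q += (a - b) * s
    return q
```

A worker would use the returned direction in place of the raw stochastic gradient, e.g. `x -= step_size * two_loop_direction(g, s_hist, y_hist)`; with an empty history the update degenerates to plain SGD. The inverse-Hessian approximation satisfies the secant condition `H @ y = s` for the most recent curvature pair, which is what makes the step Newton-like.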


Related research

03/01/2021 | Secure Bilevel Asynchronous Vertical Federated Learning with Backward Updating
Vertical federated learning (VFL) attracts increasing attention due to t...

06/17/2022 | FedNew: A Communication-Efficient and Privacy-Preserving Newton-Type Method for Federated Learning
Newton-type methods are popular in federated learning due to their fast ...

08/14/2020 | Privacy-Preserving Asynchronous Federated Learning Algorithms for Multi-Party Vertically Collaborative Learning
The privacy-preserving federated learning for vertically partitioned dat...

12/01/2019 | A Quasi-Newton Method Based Vertical Federated Learning Framework for Logistic Regression
Data privacy and security becomes a major concern in building machine le...

03/19/2022 | Desirable Companion for Vertical Federated Learning: New Zeroth-Order Gradient Based Algorithm
Vertical federated learning (VFL) attracts increasing attention due to t...

04/13/2021 | Sample-based and Feature-based Federated Learning via Mini-batch SSCA
Due to the resource consumption for transmitting massive data and the co...

07/29/2022 | Towards Communication-efficient Vertical Federated Learning Training via Cache-enabled Local Updates
Vertical federated learning (VFL) is an emerging paradigm that allows di...
