A Multi-Token Coordinate Descent Method for Semi-Decentralized Vertical Federated Learning

09/18/2023
by   Pedro Valdeira, et al.

Communication efficiency is a major challenge in federated learning (FL). In client-server schemes, the server is a bottleneck; while decentralized setups spread communications across peers, they do not necessarily reduce the total amount due to slower convergence. We propose Multi-Token Coordinate Descent (MTCD), a communication-efficient algorithm for semi-decentralized vertical federated learning that exploits both client-server and client-client communications when each client holds a small subset of the features. Our multi-token method can be seen as a parallel Markov chain (block) coordinate descent algorithm, and it subsumes the purely client-server and purely decentralized setups as special cases. We obtain a convergence rate of 𝒪(1/T) for nonconvex objectives when tokens roam over disjoint subsets of clients, and for convex objectives when they roam over possibly overlapping subsets. Numerical results show that MTCD improves communication efficiency over the state of the art and allows for a tunable amount of parallel communication.
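To make the token dynamics concrete, below is a minimal sketch, not the authors' implementation, of two tokens performing Markov-chain block coordinate descent on a least-squares problem with feature blocks partitioned across clients. Each token random-walks over a disjoint client subset, matching the setting of the 𝒪(1/T) nonconvex rate. The objective, ring topology, step size, and centrally computed residual are all illustrative assumptions; in the actual protocol the residual would be maintained through client-server and client-client messages.

```python
import numpy as np

# Sketch: multi-token block coordinate descent on f(x) = 0.5 * ||A x - b||^2,
# with feature columns (coordinate blocks) partitioned across clients, as in
# vertical FL. Each token walks over its own disjoint subset of clients; the
# visited client takes a gradient step on its coordinate block only.
# All names and constants here are hypothetical, for illustration.

rng = np.random.default_rng(0)

n_samples, n_clients, feats_per_client = 200, 8, 5
n_features = n_clients * feats_per_client
A = rng.standard_normal((n_samples, n_features))
x_true = rng.standard_normal(n_features)
b = A @ x_true + 0.1 * rng.standard_normal(n_samples)

# Client c owns the coordinate block blocks[c] (its local features).
blocks = [np.arange(c * feats_per_client, (c + 1) * feats_per_client)
          for c in range(n_clients)]

# Two tokens roaming over disjoint client subsets (rings, for simplicity).
token_subsets = [list(range(0, 4)), list(range(4, 8))]
positions = [subset[0] for subset in token_subsets]

x = np.zeros(n_features)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # conservative step for least squares

for t in range(2000):
    # Residual A x - b; computed centrally here for brevity, whereas the
    # protocol would keep it up to date via token and server communication.
    r = A @ x - b
    for k, subset in enumerate(token_subsets):
        c = positions[k]                 # client currently holding token k
        cols = blocks[c]
        grad_block = A[:, cols].T @ r    # block gradient, local data only
        x[cols] -= step * grad_block     # coordinate update at client c
        # Markov-chain move: hop to a uniformly random ring neighbor
        # within this token's client subset.
        i = subset.index(c)
        positions[k] = subset[(i + rng.choice([-1, 1])) % len(subset)]

print("final loss:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```

Because the two tokens update disjoint coordinate blocks, their per-round updates commute, which is what allows them to run in parallel without coordination; overlapping subsets would require the convex-case analysis mentioned above.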


