Multi-Tier Federated Learning for Vertically Partitioned Data

02/06/2021
by Anirban Das, et al.

We consider decentralized model training in tiered communication networks. Our network model consists of a set of silos, each holding a vertical partition of the data. Each silo contains a hub and a set of clients, with the silo's vertical data shard partitioned horizontally across its clients. We propose Tiered Decentralized Coordinate Descent (TDCD), a communication-efficient decentralized training algorithm for such two-tiered networks. To reduce communication overhead, the clients in each silo perform multiple local gradient steps before sharing updates with their hub. Each hub then updates its block of model coordinates by averaging its clients' updates, and the hubs exchange intermediate updates with one another. We present a theoretical analysis of our algorithm and show how the convergence rate depends on the number of vertical partitions, the number of local updates, and the number of clients in each hub. We further validate our approach empirically via simulation-based experiments on a variety of datasets with both convex and non-convex objectives.
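The abstract outlines a two-tier update pattern: clients take several local gradient steps on their silo's coordinate block, each hub averages its clients' updates, and hubs exchange intermediate information once per round. The snippet below is a minimal sketch of that pattern on a toy vertically partitioned least-squares problem; all shapes, hyperparameters, and update details are illustrative assumptions, not the paper's TDCD implementation.

```python
import numpy as np

# Toy sketch of a two-tier vertical/horizontal training loop (illustrative only).
rng = np.random.default_rng(0)

n_samples, d1, d2 = 200, 5, 5            # samples; feature dims of silo 1 and silo 2
n_clients, local_steps, rounds, lr = 4, 5, 50, 0.05  # assumed hyperparameters

X = rng.normal(size=(n_samples, d1 + d2))
w_true = rng.normal(size=d1 + d2)
y = X @ w_true + 0.01 * rng.normal(size=n_samples)

# Vertical split: silo 1 holds the first d1 features, silo 2 the rest.
X_silos = [X[:, :d1], X[:, d1:]]
w = [np.zeros(d1), np.zeros(d2)]

# Horizontal split inside each silo: each client holds a disjoint sample shard.
client_idx = np.array_split(np.arange(n_samples), n_clients)

for _ in range(rounds):
    # Hubs exchange intermediate predictions (X_k w_k) once per round.
    partial = [Xk @ wk for Xk, wk in zip(X_silos, w)]
    new_w = []
    for k, (Xk, wk) in enumerate(zip(X_silos, w)):
        other = sum(p for j, p in enumerate(partial) if j != k)  # stale during local steps
        client_ws = []
        for idx in client_idx:                 # each client works on its sample shard
            w_local = wk.copy()
            for _ in range(local_steps):       # multiple local gradient steps
                resid = Xk[idx] @ w_local + other[idx] - y[idx]
                w_local -= lr * Xk[idx].T @ resid / len(idx)
            client_ws.append(w_local)
        new_w.append(np.mean(client_ws, axis=0))  # hub averages its clients' updates
    w = new_w

print("train MSE:", np.mean((X @ np.concatenate(w) - y) ** 2))
```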

Related research

08/19/2021 · Cross-Silo Federated Learning for Multi-Tier Networks with Vertical and Horizontal Data Partitioning
We consider federated learning in tiered communication networks. Our net...

06/16/2022 · Compressed-VFL: Communication-Efficient Learning with Vertically Partitioned Data
We propose Compressed Vertical Federated Learning (C-VFL) for communicat...

05/23/2022 · Semi-Decentralized Federated Learning with Collaborative Relaying
We present a semi-decentralized federated learning algorithm wherein cli...

09/18/2023 · A Multi-Token Coordinate Descent Method for Semi-Decentralized Vertical Federated Learning
Communication efficiency is a major challenge in federated learning (FL)...

01/28/2022 · FedGCN: Convergence and Communication Tradeoffs in Federated Training of Graph Convolutional Networks
Distributed methods for training models on graph datasets have recently ...

08/12/2022 · A Fast Blockchain-based Federated Learning Framework with Compressed Communications
Recently, blockchain-based federated learning (BFL) has attracted intens...

10/10/2022 · On the Performance of Gradient Tracking with Local Updates
We study the decentralized optimization problem where a network of n age...
