FedShuffle: Recipes for Better Use of Local Work in Federated Learning

04/27/2022
by Samuel Horvath, et al.

The practice of applying several local updates before aggregation across clients has been empirically shown to be a successful approach to overcoming the communication bottleneck in Federated Learning (FL). In this work, we propose a general recipe, FedShuffle, that better utilizes the local updates in FL, especially in the heterogeneous regime. Unlike many prior works, FedShuffle does not assume any uniformity in the number of updates per device. Our FedShuffle recipe comprises four simple-yet-powerful ingredients: 1) local shuffling of the data, 2) adjustment of the local learning rates, 3) update weighting, and 4) momentum variance reduction (Cutkosky and Orabona, 2019). We present a comprehensive theoretical analysis of FedShuffle and show that both theoretically and empirically, our approach does not suffer from the objective function mismatch that is present in FL methods which assume homogeneous updates in heterogeneous FL setups, e.g., FedAvg (McMahan et al., 2017). In addition, by combining the ingredients above, FedShuffle improves upon FedNova (Wang et al., 2020), which was previously proposed to solve this mismatch. We also show that FedShuffle with momentum variance reduction can improve upon non-local methods under a Hessian similarity assumption. Finally, through experiments on synthetic and real-world datasets, we illustrate how each of the four ingredients used in FedShuffle helps improve the use of local updates in FL.
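
To make the four ingredients concrete, below is a minimal Python sketch of how they might fit together in a simulated FL loop on a toy least-squares problem. This is an illustration, not the paper's pseudocode: the helper names (`grad_fn`, `local_update`, `aggregate`, `fedshuffle_mvr`), the 1/epochs learning-rate scaling, the data-fraction update weights, and the server-side placement of momentum variance reduction are all assumptions made for the sketch.

```python
import numpy as np

def grad_fn(w, batch):
    # Toy least-squares gradient: batch[:, :-1] are features, batch[:, -1] targets.
    X, t = batch[:, :-1], batch[:, -1]
    return X.T @ (X @ w - t) / len(batch)

def local_update(x, data, lr, epochs, batch_size, seed):
    # One client's local work.
    rng = np.random.default_rng(seed)   # fixed seed so shuffles can be replayed
    y, step, n = x.copy(), lr / epochs, len(data)  # ingredient 2: lr scaled by 1/epochs
    for _ in range(epochs):
        perm = rng.permutation(n)       # ingredient 1: reshuffle, sweep without replacement
        for s in range(0, n, batch_size):
            y -= step * grad_fn(y, data[perm[s:s + batch_size]])
    return y - x                        # model delta ("pseudo-gradient")

def aggregate(x, clients, lr, seed):
    # Ingredient 3: weight each delta by data fraction, not by local steps taken.
    n_total = sum(len(d) for d, _, _ in clients)
    return sum((len(d) / n_total) * local_update(x, d, lr, e, b, seed * 1000 + i)
               for i, (d, e, b) in enumerate(clients))

def fedshuffle_mvr(x0, clients, rounds=60, lr_local=0.5, lr_global=1.0, a=0.1):
    # Ingredient 4: momentum variance reduction (Cutkosky and Orabona, 2019),
    #   d_t = u_t(x_t) + (1 - a) * (d_{t-1} - u_t(x_{t-1})),
    # where both u_t(.) evaluations replay the same shuffles (same seed).
    # Re-evaluating at x_{t-1} doubles local work in this simplified variant;
    # the paper's actual recipe is more refined.
    x_prev = x = x0.copy()
    d = aggregate(x, clients, lr_local, seed=0)
    for t in range(1, rounds):
        u = aggregate(x, clients, lr_local, seed=t)
        u_prev = aggregate(x_prev, clients, lr_local, seed=t)
        d = u + (1 - a) * (d - u_prev)
        x_prev, x = x, x + lr_global * d
    return x

# Heterogeneous toy setup: clients differ in data size, epochs, and batch size.
rng = np.random.default_rng(42)
w_true = np.array([1.0, -2.0, 0.5])

def make_client(n, epochs, batch_size):
    X = rng.normal(size=(n, 3))
    t = X @ w_true + 0.01 * rng.normal(size=n)
    return np.hstack([X, t[:, None]]), epochs, batch_size

clients = [make_client(200, 1, 32), make_client(50, 5, 8), make_client(400, 2, 64)]
print(np.round(fedshuffle_mvr(np.zeros(3), clients), 3))  # recovers ~ w_true
```

Note the design choice in the sketch: because each client's delta is weighted by its data fraction rather than normalized by its step count, clients that run more local epochs do not implicitly reweight the objective, which is the objective-mismatch issue the abstract attributes to methods assuming homogeneous updates.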


