Accelerating Federated Learning with a Global Biased Optimiser

08/20/2021
by Jed Mills, et al.

Federated Learning (FL) is a recent development in the field of machine learning that collaboratively trains models without the training data leaving client devices, to preserve data privacy. In realistic FL settings, the training set is distributed over clients in a highly non-Independent and Identically Distributed (non-IID) fashion, which has been shown extensively to harm FL convergence speed and final model performance. To address this challenge, we propose a novel, generalised approach for incorporating adaptive optimisation techniques into FL with the Federated Global Biased Optimiser (FedGBO) algorithm. FedGBO accelerates FL by employing a set of global biased optimiser values during the client-training phase, which helps to reduce 'client-drift' from non-IID data, whilst also benefiting from adaptive optimisation. We show that the FedGBO update with a generic optimiser can be reformulated as centralised training using biased gradients and optimiser updates, and apply this theoretical framework to prove the convergence of FedGBO using momentum-Stochastic Gradient Descent (SGDm). We also conduct extensive experiments using 4 realistic FL benchmark datasets (CIFAR100, Sent140, FEMNIST, Shakespeare) and 3 popular adaptive optimisers (RMSProp, SGDm, Adam) to compare the performance of state-of-the-art adaptive-FL algorithms. The results demonstrate that FedGBO has highly competitive performance whilst achieving lower communication and computation costs, and provide practical insights into the trade-offs associated with the different adaptive-FL algorithms and optimisers for real-world FL deployments.
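The sketch below is a minimal reading of the abstract, not the authors' exact algorithm: it shows how client training with a shared, global biased optimiser state could look when the generic optimiser is momentum-SGD (SGDm). Each client takes its local steps while the momentum buffer is held at the global value, and the server then averages the model updates and refreshes the global momentum. The toy quadratic objectives and the names run_round and local_grad are purely illustrative; the paper gives the precise local and server update rules.

```python
# Minimal NumPy sketch of the FedGBO idea with momentum-SGD (SGDm), written from
# the abstract alone. Clients take local steps using a fixed set of global
# optimiser values (here, a global momentum buffer); the server then averages
# the model deltas and refreshes the global momentum. All names and the toy
# quadratic objectives are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
DIM, NUM_CLIENTS, LOCAL_STEPS = 10, 5, 20
LR, BETA = 0.05, 0.9                     # client learning rate, momentum coefficient

# Toy non-IID setup: client k minimises f_k(x) = 0.5 * ||x - c_k||^2 with distinct c_k.
client_optima = [rng.normal(loc=float(k), scale=1.0, size=DIM) for k in range(NUM_CLIENTS)]

def local_grad(k, x):
    """Gradient of client k's quadratic objective at x."""
    return x - client_optima[k]

def run_round(x_global, m_global):
    """One FedGBO-style round: local training with the shared (biased) global momentum."""
    deltas, grad_means = [], []
    for k in range(NUM_CLIENTS):
        x, grads = x_global.copy(), []
        for _ in range(LOCAL_STEPS):
            g = local_grad(k, x)
            x = x - LR * (BETA * m_global + g)   # momentum buffer held at its global value
            grads.append(g)
        deltas.append(x - x_global)
        grad_means.append(np.mean(grads, axis=0))
    # Server: average the model deltas, then refresh the global optimiser values.
    x_new = x_global + np.mean(deltas, axis=0)
    m_new = BETA * m_global + np.mean(grad_means, axis=0)
    return x_new, m_new

x, m = np.zeros(DIM), np.zeros(DIM)
for _ in range(100):
    x, m = run_round(x, m)
print("distance to the mean optimum:", np.linalg.norm(x - np.mean(client_optima, axis=0)))
```

The "biased" aspect in this sketch is that every client steps with momentum computed from previous rounds of global information rather than from its own local gradients, which is the mechanism the abstract credits with reducing client-drift under non-IID data.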


Related research

06/04/2021
Local Adaptivity in Federated Learning: Convergence and Consistency
The federated learning (FL) framework trains a machine learning model us...

05/16/2023
Faster Federated Learning with Decaying Number of Local SGD Steps
In Federated Learning (FL) client devices connected over the internet co...

05/26/2022
A Unified Analysis of Federated Learning with Arbitrary Client Participation
Federated learning (FL) faces challenges of intermittent client availabi...

02/21/2023
FedSpeed: Larger Local Interval, Less Communication Round, and Higher Generalization Accuracy
Federated learning is an emerging distributed machine learning framework...

05/02/2023
FedAVO: Improving Communication Efficiency in Federated Learning with African Vultures Optimizer
Federated Learning (FL), a distributed machine learning technique has re...

06/23/2022
Efficient Adaptive Federated Optimization of Federated Learning for IoT
The proliferation of the Internet of Things (IoT) and widespread use of ...

09/04/2023
DRAG: Divergence-based Adaptive Aggregation in Federated learning on Non-IID Data
Local stochastic gradient descent (SGD) is a fundamental approach in ach...
