Fast Convergent Federated Learning with Aggregated Gradients

03/28/2023
by Wenhao Yuan, et al.

Federated Learning (FL) is a novel machine learning framework that enables multiple distributed devices to cooperatively train a shared model coordinated by a central server while keeping private data local. However, non-independent-and-identically-distributed (Non-IID) data samples and frequent communication across participants can significantly slow convergence and increase communication costs. To achieve fast convergence, we ameliorate the conventional local update rule by introducing the aggregated gradients at each local update epoch, and propose an adaptive learning rate algorithm that further takes the deviation between the local and global parameters into consideration. This adaptive learning rate design requires all clients' local information, including local parameters and gradients, which is challenging because there is no communication during the local update epochs. To obtain a decentralized adaptive learning rate for each client, we adopt a mean field approach, introducing two mean field terms to estimate the average local parameters and gradients respectively, so that clients do not need to exchange their local information with each other at each local epoch. Numerical results show that our proposed framework outperforms state-of-the-art FL schemes in both model accuracy and convergence rate on IID and Non-IID datasets.
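To make the idea concrete, below is a minimal client-side sketch of a local update that blends the client's own gradient with a mean-field estimate of the aggregated gradient and adapts the step size using the deviation from the estimated average parameters. All names and the specific step-size rule (`eta0`, `beta`, `lam`, `grad_fn`, the inverse-deviation scaling) are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def local_update(w_local, grad_fn, w_mean_field, g_mean_field,
                 eta0=0.1, beta=0.5, lam=0.01, epochs=5):
    """Illustrative local update with aggregated gradients and mean-field estimates.

    w_local       : client's current parameter vector
    grad_fn       : callable returning the local gradient at a given parameter vector
    w_mean_field  : per-epoch estimates of the average local parameters (list of arrays)
    g_mean_field  : per-epoch estimates of the average local gradients (list of arrays)
    """
    w = w_local.copy()
    for t in range(epochs):
        g_local = grad_fn(w)
        # Aggregated gradient: mix the local gradient with the mean-field estimate,
        # so the client follows a direction closer to the global descent direction.
        g_agg = (1.0 - beta) * g_local + beta * g_mean_field[t]
        # Adaptive learning rate: shrink the step as the local model drifts away
        # from the estimated average parameters (one plausible choice of rule).
        deviation = np.linalg.norm(w - w_mean_field[t])
        eta_t = eta0 / (1.0 + lam * deviation)
        w = w - eta_t * g_agg
    return w

# Illustrative use on a quadratic loss 0.5 * ||w - target||^2
target = np.array([1.0, -2.0])
grad_fn = lambda w: w - target
w_mf = [np.zeros(2)] * 5   # placeholder mean-field parameter trajectory
g_mf = [np.zeros(2)] * 5   # placeholder mean-field gradient trajectory
w_new = local_update(np.array([3.0, 3.0]), grad_fn, w_mf, g_mf)
```

In this sketch the mean-field trajectories would be computed and broadcast by the server at the start of each round, which is what lets each client adapt its learning rate without exchanging local information during the local epochs.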


