Federated Learning in Temporal Heterogeneity

09/17/2023
by Junghwan Lee, et al.

In this work, we explore federated learning under temporal heterogeneity across clients. We observed that a global model trained on fixed-length sequences converges faster than one trained on varying-length sequences. Based on this empirical observation, we propose methods to mitigate temporal heterogeneity for efficient federated learning.
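The setting above can be sketched in code: clients hold sequences of different lengths (temporal heterogeneity), each is forced to a fixed length before local training, and the server aggregates local models with FedAvg-style weighted averaging. This is a minimal illustrative sketch, not the paper's actual method; `pad_or_truncate` and `fedavg` are hypothetical helper names, and the choice of padding as the length-fixing step is an assumption.

```python
# Illustrative sketch only; the helper names and the padding strategy are
# assumptions, not taken from the paper.

def pad_or_truncate(seq, target_len, pad_value=0.0):
    """Force a client's sequence to a fixed length so every local model
    trains on homogeneous input shapes."""
    if len(seq) >= target_len:
        return seq[:target_len]
    return seq + [pad_value] * (target_len - len(seq))

def fedavg(client_weights, client_sizes):
    """Average flat weight vectors, weighted by client sample counts
    (the standard FedAvg aggregation rule)."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    avg = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            avg[i] += (n / total) * w[i]
    return avg

# Clients with varying-length sequences, normalized to length 3.
clients = [[1.0, 2.0], [3.0, 4.0, 5.0, 6.0], [7.0]]
fixed = [pad_or_truncate(s, 3) for s in clients]
print(fixed)   # [[1.0, 2.0, 0.0], [3.0, 4.0, 5.0], [7.0, 0.0, 0.0]]

# Server-side aggregation of two one-parameter client models.
print(fedavg([[1.0], [3.0]], [1, 3]))   # [2.5]
```

With the lengths fixed, every round of local training sees identically shaped batches, which is the condition under which the abstract reports faster convergence of the global model.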


Related research

03/28/2020 · Federated Residual Learning
We study a new form of federated learning where the clients train person...

06/23/2023 · Synthetic data shuffling accelerates the convergence of federated learning under data heterogeneity
In federated learning, data heterogeneity is a critical challenge. A str...

10/01/2022 · Towards Understanding and Mitigating Dimensional Collapse in Heterogeneous Federated Learning
Federated learning aims to train models collaboratively across different...

07/18/2021 · An Experimental Study of Data Heterogeneity in Federated Learning Methods for Medical Imaging
Federated learning enables multiple institutions to collaboratively trai...

02/01/2021 · Curse or Redemption? How Data Heterogeneity Affects the Robustness of Federated Learning
Data heterogeneity has been identified as one of the key features in fed...

04/16/2022 · DRFLM: Distributionally Robust Federated Learning with Inter-client Noise via Local Mixup
Recently, federated learning has emerged as a promising approach for tra...

06/15/2021 · On Large-Cohort Training for Federated Learning
Federated learning methods typically learn a model by iteratively sampli...
