Fairness and Accuracy in Federated Learning

12/18/2020
by Wei Huang, et al.

In the federated learning setting, multiple clients jointly train a model under the coordination of a central server, while the training data remains on the clients to preserve privacy. The uneven distribution of data across devices in a federated network and the limited communication bandwidth between end devices make statistical heterogeneity and expensive communication the major challenges of federated learning. This paper proposes FedFa, an algorithm that improves both fairness and accuracy in federated learning. FedFa introduces an optimization scheme based on a double momentum gradient, which accelerates the convergence rate of the model. It also proposes a weight selection algorithm that combines the information quantities of training accuracy and training frequency to determine aggregation weights, which helps address the unfairness that arises when certain clients are preferred. Our results show that the proposed FedFa algorithm outperforms the baseline algorithms in terms of both accuracy and fairness.
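The abstract only names the two ingredients (a double momentum gradient and weights derived from training accuracy and training frequency), so the following is a minimal sketch of what a server-side round could look like under those ideas. The self-information form `-log(p)`, the mixing factor `alpha`, the momentum coefficient `beta`, and all function names are illustrative assumptions, not the paper's actual update rules.

```python
import numpy as np

def information_weights(accuracies, frequencies, alpha=0.5, eps=1e-12):
    """Combine accuracy- and frequency-based 'information quantities'
    into aggregation weights. The -log(p) self-information form and the
    mixing factor alpha are assumptions made for illustration."""
    acc = np.asarray(accuracies, dtype=float)
    freq = np.asarray(frequencies, dtype=float)
    # Normalize each signal so it can be treated like a probability.
    p_acc = acc / (acc.sum() + eps)
    p_freq = freq / (freq.sum() + eps)
    # Self-information of each client under the two signals:
    # rarely selected or lower-scoring clients receive larger weight,
    # which is one plausible reading of the fairness goal.
    info_acc = -np.log(p_acc + eps)
    info_freq = -np.log(p_freq + eps)
    combined = alpha * info_acc + (1.0 - alpha) * info_freq
    return combined / (combined.sum() + eps)

def server_update(global_w, client_ws, weights, server_momentum,
                  beta=0.9, lr=1.0):
    """One aggregation round with a server-side momentum term (the
    second of the 'double' momenta; client optimizers would carry the
    first). Shapes and hyperparameters are placeholders."""
    # Weighted average of the client models, then a momentum-smoothed step.
    avg_w = sum(w * cw for w, cw in zip(weights, client_ws))
    delta = avg_w - global_w
    server_momentum = beta * server_momentum + delta
    new_global_w = global_w + lr * server_momentum
    return new_global_w, server_momentum

# Tiny usage example with toy 1-D "models" from three clients.
if __name__ == "__main__":
    global_w = np.zeros(4)
    momentum = np.zeros(4)
    client_ws = [np.ones(4) * c for c in (0.5, 1.0, 1.5)]
    w = information_weights(accuracies=[0.90, 0.70, 0.60],
                            frequencies=[10, 3, 1])
    global_w, momentum = server_update(global_w, client_ws, w, momentum)
    print(w, global_w)
```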


research
03/28/2020

Federated Residual Learning

We study a new form of federated learning where the clients train person...
research
05/08/2021

Loss Tolerant Federated Learning

Federated learning has attracted attention in recent years for collabora...
research
02/24/2023

Personalizing Federated Learning with Over-the-Air Computations

Federated edge learning is a promising technology to deploy intelligence...
research
06/28/2021

Weight Divergence Driven Divide-and-Conquer Approach for Optimal Federated Learning from non-IID Data

Federated Learning allows training of data stored in distributed devices...
research
08/03/2023

Hierarchical Federated Learning in Wireless Networks: Pruning Tackles Bandwidth Scarcity and System Heterogeneity

While a practical wireless network has many tiers where end users do not...
research
06/23/2023

Synthetic data shuffling accelerates the convergence of federated learning under data heterogeneity

In federated learning, data heterogeneity is a critical challenge. A str...
research
11/29/2021

SPATL: Salient Parameter Aggregation and Transfer Learning for Heterogeneous Clients in Federated Learning

Efficient federated learning is one of the key challenges for training a...
