More Industry-friendly: Federated Learning with High Efficient Design

12/16/2020
by   Dingwei Li, et al.

Although much progress has been made since Google introduced the paradigm of federated learning (FL), there is still considerable room to optimize its efficiency. In this paper, we propose a highly efficient FL method equipped with a double-head design, aimed at personalization over non-IID datasets, and a gradual model-sharing design for saving communication. Experimental results show that our method achieves more stable accuracy and better communication efficiency across various data distributions than other state-of-the-art (SOTA) methods, making it more industry-friendly.
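A minimal sketch of the double-head idea the abstract describes: each client keeps a shared body that is aggregated across clients (FedAvg-style), plus a private personalization head that never leaves the device. The class and function names, shapes, and the plain averaging step below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def fedavg(bodies):
    """Average the shared body parameters across clients (FedAvg-style)."""
    return np.mean(np.stack(bodies), axis=0)

class DoubleHeadClient:
    # Hypothetical client model: a shared body plus a local head.
    def __init__(self, dim, n_classes, seed):
        rng = np.random.default_rng(seed)
        self.body = rng.standard_normal((dim, dim))        # shared, aggregated
        self.head = rng.standard_normal((dim, n_classes))  # personalized, kept local

def communication_round(clients):
    # Only the bodies are uploaded and averaged; the heads stay on-device,
    # which personalizes the model over non-IID data and also shrinks the
    # per-round upload to the shared parameters alone.
    global_body = fedavg([c.body for c in clients])
    for c in clients:
        c.body = global_body.copy()
    return global_body

clients = [DoubleHeadClient(dim=4, n_classes=3, seed=s) for s in range(3)]
communication_round(clients)

# After the round: bodies are synchronized, heads remain distinct.
assert np.allclose(clients[0].body, clients[1].body)
assert not np.allclose(clients[0].head, clients[1].head)
```

The gradual model-sharing design would further reduce communication by sharing only part of the model in early rounds, but the abstract gives no schedule details, so none are assumed here.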


