One-Shot Federated Learning

02/28/2019
by Neel Guha, et al.

We present one-shot federated learning, where a central server learns a global model over a network of federated devices in a single round of communication. Our approach - drawing on ensemble learning and knowledge aggregation - achieves an average relative gain of 51.5% over local baselines and comes within 90.1% of the performance of the global model. We discuss these methods and identify several promising directions of future work.
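To make the single-round protocol concrete, below is a minimal sketch of the ensemble route to one-shot federated learning: each client trains a model on its own data, sends the fitted model to the server once, and the server aggregates the clients' predictions. This is an illustrative sketch rather than the authors' code; the scikit-learn classifiers, the synthetic data partition, and helper names such as train_local_models and ensemble_predict are assumptions introduced here.

# Minimal sketch of one-shot federated learning via ensembling.
# Assumptions (not from the paper): scikit-learn classifiers, a synthetic
# dataset split across simulated clients, and probability averaging at the server.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def train_local_models(client_datasets):
    # Each client fits a model on its own data; sending these fitted models
    # to the server is the single round of communication.
    models = []
    for X, y in client_datasets:
        models.append(LogisticRegression(max_iter=1000).fit(X, y))
    return models

def ensemble_predict(models, X):
    # Server-side aggregation: average class probabilities across client
    # models and predict the most probable class.
    probs = np.mean([m.predict_proba(X) for m in models], axis=0)
    return probs.argmax(axis=1)

if __name__ == "__main__":
    X, y = make_classification(n_samples=3000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    # Simulate K clients by partitioning the training data.
    K = 5
    client_datasets = list(zip(np.array_split(X_train, K), np.array_split(y_train, K)))
    models = train_local_models(client_datasets)
    preds = ensemble_predict(models, X_test)
    print("ensemble accuracy:", (preds == y_test).mean())

Running the script prints the ensemble's test accuracy. Knowledge aggregation, for example distilling the client ensemble into a single server model, is the other direction the abstract alludes to and is not shown in this sketch.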

Related research

10/19/2020 · From Distributed Machine Learning To Federated Learning: In The View Of Data Privacy And Security
Federated learning is an improved version of distributed machine learnin...

08/11/2022 · A Modified UDP for Federated Learning Packet Transmissions
This paper introduces a Modified User Datagram Protocol (UDP) for Federa...

09/17/2020 · Distilled One-Shot Federated Learning
Current federated learning algorithms take tens of communication rounds ...

10/11/2019 · Central Server Free Federated Learning over Single-sided Trust Social Networks
Federated learning has become increasingly important for modern machine ...

01/07/2020 · FedDANE: A Federated Newton-Type Method
Federated learning aims to jointly learn statistical models over massive...

06/02/2022 · Federated Learning with a Sampling Algorithm under Isoperimetry
Federated learning uses a set of techniques to efficiently distribute th...

02/23/2023 · Data-Free Diversity-Based Ensemble Selection For One-Shot Federated Learning in Machine Learning Model Market
The emerging availability of trained machine learning models has put for...
