On-device Federated Learning with Flower

by Akhil Mathur, et al.

Federated Learning (FL) allows edge devices to collaboratively learn a shared prediction model while keeping their training data on the device, thereby decoupling the ability to do machine learning from the need to store data in the cloud. Despite the algorithmic advancements in FL, the support for on-device training of FL algorithms on edge devices remains poor. In this paper, we present an exploration of on-device FL on various smartphones and embedded devices using the Flower framework. We also evaluate the system costs of on-device FL and discuss how this quantification could be used to design more efficient FL algorithms.
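The collaborative training the abstract describes is typically coordinated with federated averaging (FedAvg): each device trains locally on its own data, and a server combines the resulting model parameters weighted by local dataset size. The sketch below illustrates only that aggregation step in plain NumPy; it is not the Flower API, and the client data sizes are made-up values for illustration.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Combine per-client model parameters into a shared model by
    averaging each parameter tensor, weighted by local dataset size."""
    total = sum(client_sizes)
    return [
        sum(w[i] * (n / total) for w, n in zip(client_weights, client_sizes))
        for i in range(len(client_weights[0]))
    ]

# Two hypothetical clients; each list holds one layer's parameters,
# which never leave the device in a real FL deployment.
client_a = [np.array([1.0, 2.0])]
client_b = [np.array([3.0, 4.0])]

# Client B holds 3x more data, so its parameters get 3x the weight.
avg = fedavg([client_a, client_b], client_sizes=[10, 30])
# avg[0] is [2.5, 3.5]
```

In a real Flower deployment this aggregation runs on the server, while clients only exchange model updates, never raw training data.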
