On-Device Federated Learning via Blockchain and its Latency Analysis

08/12/2018
by Hyesung Kim et al.

In this letter, we propose a block-chained federated learning (BlockFL) architecture, where mobile devices' local learning model updates are exchanged and verified by leveraging blockchain. This enables on-device machine learning without any central coordination, even when each device lacks its own training data samples. We investigate the end-to-end learning completion latency of BlockFL, thereby yielding the optimal block generation rate as well as insights into network scalability and robustness.
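To make the architecture concrete, below is a minimal, self-contained Python sketch of one BlockFL-style round under simplifying assumptions: each model is reduced to a single scalar weight, the miners' cross-verification is a simple sanity filter, and the proof-of-work (PoW) completion delay is drawn from an exponential distribution parameterized by the block generation rate. All names (local_update, cross_verify, blockfl_round, etc.) are illustrative assumptions, not the letter's reference implementation.

import random
import statistics

def local_update(weight, data):
    # One gradient-descent step per device on a toy squared-error objective min_w (w - x)^2.
    lr = 0.1
    grads = [2 * (weight - x) for x in data]
    return weight - lr * statistics.mean(grads)

def cross_verify(updates, bound=1e3):
    # Miners exchange candidate updates and discard obviously malformed ones (toy filter).
    return [u for u in updates if abs(u) < bound]

def proof_of_work(block_generation_rate, rng):
    # PoW completion time modeled as exponential with the given block generation rate.
    return rng.expovariate(block_generation_rate)

def blockfl_round(global_weight, device_data, block_generation_rate, rng):
    # 1. Devices compute local model updates on their own samples.
    updates = [local_update(global_weight, data) for data in device_data]
    # 2. Miners cross-verify the exchanged updates.
    verified = cross_verify(updates)
    # 3. The first miner to finish PoW appends the verified updates as a new block.
    pow_delay = proof_of_work(block_generation_rate, rng)
    block = {"updates": verified, "pow_delay": pow_delay}
    # 4. Every device reads the block and computes the global model update locally,
    #    so no central aggregation server is needed.
    new_global = statistics.mean(block["updates"])
    return new_global, block

if __name__ == "__main__":
    rng = random.Random(0)
    device_data = [[1.0, 1.2], [0.8, 1.1], [0.9, 1.3]]  # toy per-device samples
    w = 0.0
    for r in range(5):
        w, block = blockfl_round(w, device_data, block_generation_rate=2.0, rng=rng)
        print(f"round {r}: global weight {w:.3f}, PoW delay {block['pow_delay']:.2f}s")

In the letter's latency analysis, the block generation rate trades off PoW delay against forking overhead; this toy round only exposes the PoW delay term and omits upload, propagation, and forking effects.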

Related research

04/08/2020
Resource Management for Blockchain-enabled Federated Learning: A Deep Reinforcement Learning Approach
Blockchain-enabled Federated Learning (BFL) enables model updates of Fed...

05/10/2021
Latency Analysis of Consortium Blockchained Federated Learning
A decentralized federated learning architecture is proposed to apply to ...

02/03/2022
End-to-End Latency Analysis and Optimal Block Size of Proof-of-Work Blockchain Applications
Due to the increasing interest in blockchain technology for fostering se...

04/04/2022
ScaleSFL: A Sharding Solution for Blockchain-Based Federated Learning
Blockchain-based federated learning has gained significant interest over...

03/20/2021
Demystifying the Effects of Non-Independence in Federated Learning
Federated Learning (FL) enables statistical models to be built on user-g...

10/28/2021
DFL: High-Performance Blockchain-Based Federated Learning
Many researchers are trying to replace the aggregation server in federat...

11/01/2020
One-Shot Federated Learning with Neuromorphic Processors
Being very low power, the use of neuromorphic processors in mobile devic...
