Wireless Ad Hoc Federated Learning: A Fully Distributed Cooperative Machine Learning

05/24/2022
by Hideya Ochiai, et al.

Federated learning has allowed the training of a global model by aggregating local models trained on local nodes. However, it still relies on a client-server architecture, which could be made further distributed, fully decentralized, or even partially connected and totally opportunistic. In this paper, we propose Wireless Ad Hoc Federated Learning (WAFL), a fully distributed cooperative machine learning scheme organized by nodes that are physically nearby. Each node has a wireless interface and can communicate with others when they are within radio range. The nodes are expected to move with people, vehicles, or robots, producing opportunistic contacts with each other. In WAFL, each node trains a model individually with its local data. When a node encounters others, they exchange their trained models and generate new aggregated models, which are expected to be more general than the locally trained models on Non-IID data. For evaluation, we prepared four static communication networks and two types of dynamic, opportunistic communication networks based on random waypoint mobility and a community-structured environment, and then studied the training process of a fully connected neural network with a 90% Non-IID MNIST dataset. The evaluation results indicate that WAFL allowed the model parameters among the nodes to converge toward generalization, even in opportunistic node-contact scenarios, whereas in the self-training (or lonely training) case they diverged. This model generalization in WAFL contributed to achieving a higher accuracy of 94.7-96.2% compared to 84.7% in the self-training case.
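The abstract describes WAFL's core loop: each node trains on its own Non-IID data and, upon an opportunistic encounter, exchanges its model with the peer and aggregates the two. The sketch below illustrates one such contact event in PyTorch; the pairwise parameter-mixing rule and the coefficient `lam` are simplifying assumptions for illustration, not the paper's exact aggregation formula.

import copy
import torch.nn as nn

def aggregate_on_contact(model_a: nn.Module, model_b: nn.Module, lam: float = 0.5):
    """When two nodes meet, each merges the peer's parameters into its own.

    Illustrative rule (assumption): theta_a <- theta_a + lam * (theta_b - theta_a),
    and symmetrically for node b.
    """
    state_a = copy.deepcopy(model_a.state_dict())
    state_b = copy.deepcopy(model_b.state_dict())
    merged_a, merged_b = {}, {}
    for key in state_a:
        merged_a[key] = state_a[key] + lam * (state_b[key] - state_a[key])
        merged_b[key] = state_b[key] + lam * (state_a[key] - state_b[key])
    model_a.load_state_dict(merged_a)
    model_b.load_state_dict(merged_b)

# Usage: after each opportunistic contact, aggregate and then resume local
# training on each node's own (Non-IID) data until the next encounter.
node_a = nn.Sequential(nn.Flatten(), nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
node_b = copy.deepcopy(node_a)  # common initialization assumed here for simplicity
aggregate_on_contact(node_a, node_b, lam=0.5)

After repeated contacts, this kind of local averaging is what lets the per-node models drift toward a shared, more general set of parameters instead of overfitting to each node's local class distribution.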

Related research

- 11/07/2022: Resilience of Wireless Ad Hoc Federated Learning against Model Poisoning Attacks. "Wireless ad hoc federated learning (WAFL) is a fully decentralized colla..."
- 04/15/2021: D-Cliques: Compensating NonIIDness in Decentralized Federated Learning with Topology. "The convergence speed of machine learning models trained with Federated ..."
- 04/15/2023: SalientGrads: Sparse Models for Communication Efficient and Data Aware Distributed Federated Training. "Federated learning (FL) enables the training of a model leveraging decen..."
- 10/27/2022: Resource Constrained Vehicular Edge Federated Learning with Highly Mobile Connected Vehicles. "This paper proposes a vehicular edge federated learning (VEFL) solution,..."
- 08/17/2020: WAFFLE: Watermarking in Federated Learning. "Creators of machine learning models can use watermarking as a technique ..."
- 04/06/2022: Federated Learning for Distributed Spectrum Sensing in NextG Communication Networks. "NextG networks are intended to provide the flexibility of sharing the sp..."
- 01/30/2020: Learning from Peers at the Wireless Edge. "The last mile connection is dominated by wireless links where heterogene..."
