Optimising Communication Overhead in Federated Learning Using NSGA-II

04/01/2022
by José Ángel Morell, et al.

Federated learning is a training paradigm in which a server-based model is cooperatively trained using local models running on edge devices, while preserving data privacy. These devices exchange information that induces a substantial communication load, which jeopardises operating efficiency. The difficulty of reducing this overhead lies in doing so without degrading the model's accuracy (a contradictory relation). Many works have investigated compressing the pre/mid/post-trained models and reducing the number of communication rounds, but separately, even though both jointly contribute to the communication overhead. Our work aims at optimising communication overhead in federated learning by (i) modelling it as a multi-objective problem and (ii) applying a multi-objective optimisation algorithm (NSGA-II) to solve it. To the best of the authors' knowledge, this is the first work that explores the added value evolutionary computation could bring to solving such a problem and that considers both the neuron and device features together. We perform the experimentation by simulating a server/client architecture with four slave nodes. We investigate both convolutional and fully connected neural networks with 12 and 3 layers, and 887,530 and 33,400 weights, respectively. We conducted the validation on a dataset containing 70,000 images. The experiments show that our proposal can reduce communication by 99% while reaching an accuracy similar to the one obtained by the FedAvg algorithm, which uses 100% of the communications.
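The abstract formulates the trade-off as a bi-objective problem (communication volume vs. model accuracy) solved with NSGA-II. The paper's implementation is not reproduced here, so the following is only a minimal sketch of that formulation, assuming the pymoo library for NSGA-II and a synthetic placeholder in place of the real federated accuracy evaluation; the per-weight binary encoding, variable names, and problem size are illustrative assumptions, not the authors' exact encoding (which also covers device features).

    # Hedged sketch (not the authors' code): bi-objective communication/accuracy
    # trade-off optimised with NSGA-II via pymoo. The federated evaluation is
    # replaced by a synthetic stand-in so the example runs on its own.
    import numpy as np
    from pymoo.algorithms.moo.nsga2 import NSGA2
    from pymoo.core.problem import ElementwiseProblem
    from pymoo.optimize import minimize

    N_VARS = 1_000  # illustrative; one gene per weight (33,400 in the paper's MLP)

    class CommVsAccuracy(ElementwiseProblem):
        """Decision vector: one real value per weight; values > 0.5 mean
        'transmit this weight to the server', otherwise keep it local."""

        def __init__(self):
            super().__init__(n_var=N_VARS, n_obj=2, xl=0.0, xu=1.0)

        def _evaluate(self, x, out, *args, **kwargs):
            mask = x > 0.5
            comm_fraction = mask.mean()  # objective 1: fraction of weights exchanged
            # Objective 2: accuracy loss of the aggregated model. In the paper this
            # would come from an actual federated round; here it is a placeholder
            # that merely penalises transmitting too few weights.
            acc_loss = np.exp(-10.0 * comm_fraction)
            out["F"] = [comm_fraction, acc_loss]

    res = minimize(CommVsAccuracy(), NSGA2(pop_size=40), ("n_gen", 50),
                   seed=1, verbose=False)
    print("Pareto front (communication fraction, accuracy loss):")
    print(res.F)

Each point on the resulting Pareto front corresponds to one candidate exchange mask; the front makes the contradictory relation between communication volume and accuracy explicit and lets a practitioner pick an operating point.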
