Optimising Communication Overhead in Federated Learning Using NSGA-II

04/01/2022
by José Ángel Morell, et al.

Federated learning is a training paradigm in which a server-based model is cooperatively trained using local models running on edge devices, while preserving data privacy. These devices exchange information that induces a substantial communication load, which jeopardises operating efficiency. The difficulty of reducing this overhead lies in doing so without degrading the model's accuracy, since the two objectives conflict. To this end, many works have investigated compressing the pre-, mid-, or post-trained models and reducing the communication rounds, but separately, even though both jointly contribute to the communication overload. Our work aims to optimise communication overhead in federated learning by (i) modelling it as a multi-objective problem and (ii) applying a multi-objective optimisation algorithm (NSGA-II) to solve it. To the best of the authors' knowledge, this is the first work that explores the added value that evolutionary computation could bring to this problem while considering both neuron and device features together. We run our experiments by simulating a server/client architecture with 4 slaves. We investigate both convolutional and fully-connected neural networks with 12 and 3 layers, and 887,530 and 33,400 weights, respectively. We validate on a dataset containing 70,000 images. The experiments show that our proposal can reduce communication by 99% while maintaining an accuracy comparable to that obtained by the FedAvg algorithm, which uses 100% of the communications.
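The core mechanism that lets NSGA-II trade communicated weights against model accuracy is Pareto dominance: a candidate survives if no other candidate is at least as good on every objective and strictly better on one. The sketch below illustrates this building block in pure Python on hypothetical candidates, each an objective vector (fraction of weights sent, validation error); the candidate values are invented for illustration and are not from the paper.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation on all objectives)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical candidates: (fraction of weights communicated, validation error)
candidates = [(1.00, 0.02), (0.10, 0.05), (0.50, 0.03), (0.10, 0.04), (0.01, 0.20)]

front = pareto_front(candidates)
# (0.10, 0.05) is dominated by (0.10, 0.04): same cost, strictly lower error.
```

NSGA-II repeatedly applies this sorting (plus crowding distance) to a population of candidate communication schedules, keeping a diverse front of cost/accuracy trade-offs rather than a single compromise solution.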

Related research

- Multi-objective Evolutionary Federated Learning (12/18/2018)
  Federated learning is an emerging technique used to prevent the leakage ...
- Federated Learning via Plurality Vote (10/06/2021)
  Federated learning allows collaborative workers to solve a machine learn...
- Communication Size Reduction of Federated Learning based on Neural ODE Model (08/19/2022)
  Federated learning is a machine learning method in which data is not agg...
- FedSup: A Communication-Efficient Federated Learning Fatigue Driving Behaviors Supervision Framework (04/25/2021)
  With the proliferation of edge smart devices and the Internet of Vehicle...
- Federated Learning with a Sampling Algorithm under Isoperimetry (06/02/2022)
  Federated learning uses a set of techniques to efficiently distribute th...
- FedMGDA+: Federated Learning meets Multi-objective Optimization (06/20/2020)
  Federated learning has emerged as a promising, massively distributed way...
- Sparser Random Networks Exist: Enforcing Communication-Efficient Federated Learning via Regularization (09/19/2023)
  This work presents a new method for enhancing communication efficiency i...
