Split Federated Learning: Speed up Model Training in Resource-Limited Wireless Networks

05/30/2023
by Songge Zhang, et al.

In this paper, we propose a novel distributed learning scheme, named group-based split federated learning (GSFL), to speed up artificial intelligence (AI) model training. Specifically, GSFL operates in a split-then-federated manner, which consists of three steps: 1) Model distribution, in which the access point (AP) splits the AI model and distributes the client-side models to the clients; 2) Model training, in which each client executes forward propagation and transmits the smashed data to the edge server, which then executes forward and backward propagation and returns the gradients to the clients for updating their local client-side models; and 3) Model aggregation, in which the edge server aggregates the server-side and client-side models. Simulation results show that GSFL outperforms vanilla split learning and federated learning schemes in terms of overall training latency while achieving satisfactory accuracy.
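To make the three-step workflow concrete, below is a minimal sketch of a split-then-federated training loop in PyTorch. It uses synthetic data, an arbitrary cut layer, a single shared server-side model, and plain FedAvg for the client-side aggregation; these choices, along with names such as make_client_model and NUM_CLIENTS, are illustrative assumptions rather than the authors' GSFL implementation (in particular, the paper's grouping of clients and the wireless transmission of smashed data and gradients are not modeled).

```python
# Minimal sketch of the split-then-federated workflow described in the abstract.
# Synthetic data, cut layer, and FedAvg aggregation are illustrative assumptions.
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
NUM_CLIENTS, DIM, CLASSES = 4, 32, 10

# 1) Model distribution: the AP splits the model and hands out client-side parts.
def make_client_model():
    return nn.Sequential(nn.Linear(DIM, 64), nn.ReLU())      # client-side layers

server_model = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, CLASSES))
client_models = [make_client_model() for _ in range(NUM_CLIENTS)]
server_opt = torch.optim.SGD(server_model.parameters(), lr=0.1)
client_opts = [torch.optim.SGD(m.parameters(), lr=0.1) for m in client_models]
loss_fn = nn.CrossEntropyLoss()

for rnd in range(3):                                          # a few training rounds
    for k in range(NUM_CLIENTS):
        x = torch.randn(16, DIM)                              # client k's local batch (synthetic)
        y = torch.randint(0, CLASSES, (16,))

        # 2) Model training: the client's forward pass produces the "smashed data",
        #    which is sent (here: passed) to the edge server.
        smashed = client_models[k](x)
        smashed_srv = smashed.detach().requires_grad_(True)   # server-side copy

        # The server runs forward and backward propagation, updates its
        # server-side model, and returns the gradient w.r.t. the smashed data.
        logits = server_model(smashed_srv)
        loss = loss_fn(logits, y)
        server_opt.zero_grad()
        loss.backward()
        server_opt.step()

        # The client finishes backpropagation with the returned gradient and
        # updates its local client-side model.
        client_opts[k].zero_grad()
        smashed.backward(smashed_srv.grad)
        client_opts[k].step()

    # 3) Model aggregation: FedAvg over the client-side models (the single
    #    shared server-side model needs no averaging in this sketch).
    avg_state = copy.deepcopy(client_models[0].state_dict())
    for key in avg_state:
        avg_state[key] = torch.stack(
            [m.state_dict()[key] for m in client_models]).mean(dim=0)
    for m in client_models:
        m.load_state_dict(avg_state)
    print(f"round {rnd}: last loss {loss.item():.3f}")
```

The point mirrored here is that only the smashed activations and their gradients cross the client-server boundary, while raw data stays on the clients and the client-side models are synchronized by averaging at the end of each round.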

Related research

Efficient Parallel Split Learning over Resource-constrained Wireless Edge Networks (03/26/2023)
The increasingly deeper neural networks hinder the democratization of pr...

SplitGP: Achieving Both Generalization and Personalization in Federated Learning (12/16/2022)
A fundamental challenge to providing edge-AI services is the need for a ...

Hierarchical Quantized Federated Learning: Convergence Analysis and System Design (03/26/2021)
Federated learning is a collaborative machine learning framework to trai...

Abnormal Client Behavior Detection in Federated Learning (10/22/2019)
In federated learning systems, clients are autonomous in that their beha...

Personalizing Federated Learning with Over-the-Air Computations (02/24/2023)
Federated edge learning is a promising technology to deploy intelligence...

ResSFL: A Resistance Transfer Framework for Defending Model Inversion Attack in Split Federated Learning (05/09/2022)
This work aims to tackle Model Inversion (MI) attack on Split Federated ...

PoFEL: Energy-efficient Consensus for Blockchain-based Hierarchical Federated Learning (08/15/2023)
Facilitated by mobile edge computing, client-edge-cloud hierarchical fed...