When MiniBatch SGD Meets SplitFed Learning: Convergence Analysis and Performance Evaluation

08/23/2023
by Chao Huang, et al.

Federated learning (FL) enables collaborative model training across distributed clients (e.g., edge devices) without sharing raw data. Yet, FL can be computationally expensive because each client must train the entire model multiple times. SplitFed learning (SFL) is a recent distributed approach that alleviates the computation workload on client devices by splitting the model at a cut layer into two parts, so that clients only need to train part of the model. However, SFL still suffers from the client-drift problem when clients' data are highly non-IID. To address this issue, we propose MiniBatch-SFL, which incorporates MiniBatch SGD into SFL: the clients train the client-side model in an FL fashion, while the server trains the server-side model in a manner similar to MiniBatch SGD. We analyze the convergence of MiniBatch-SFL and show that a bound on the expected loss can be obtained by analyzing the expected server-side and client-side model updates separately. The server-side updates do not depend on the degree of non-IIDness of the clients' datasets and can thus mitigate client drift. The client-side updates, however, do depend on the non-IID degree, and their effect can be reduced by properly choosing the cut layer. Perhaps counter-intuitively, our empirical results show that placing the cut layer later in the model leads to smaller average gradient divergence and better algorithm performance. Moreover, numerical results show that MiniBatch-SFL achieves higher accuracy than conventional SFL and FL; with highly non-IID data, the accuracy improvement can be up to 24.1% and 17.1%, respectively.
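
To make the round structure described above concrete, below is a minimal PyTorch-style sketch of one MiniBatch-SFL training round, based only on the abstract. All names (minibatch_sfl_round, client_models, server_model) and details such as the learning rate and the per-round FedAvg aggregation are illustrative assumptions, not the paper's reference implementation.

import torch

def minibatch_sfl_round(client_models, server_model, client_batches,
                        client_labels, loss_fn, lr=0.01):
    # 1) Each client runs its local minibatch through its client-side model
    #    and sends the cut-layer activations ("smashed data") to the server.
    activations = [m(x) for m, x in zip(client_models, client_batches)]
    smashed = torch.cat(activations).detach().requires_grad_(True)

    # 2) The server trains the server-side model on the concatenated batch,
    #    i.e., MiniBatch SGD over the union of all clients' samples; this
    #    update does not depend on how non-IID the individual datasets are.
    loss = loss_fn(server_model(smashed), torch.cat(client_labels))
    server_model.zero_grad()
    loss.backward()
    with torch.no_grad():
        for p in server_model.parameters():
            p -= lr * p.grad

    # 3) The server returns the gradient w.r.t. each client's activations;
    #    each client backpropagates locally through its client-side model.
    grads = smashed.grad.split([a.shape[0] for a in activations])
    for m, a, g in zip(client_models, activations, grads):
        m.zero_grad()
        a.backward(g)
        with torch.no_grad():
            for p in m.parameters():
                p -= lr * p.grad

    # 4) FedAvg-style aggregation of the client-side models (the "FL fashion"
    #    in the abstract); averaging every round is an assumption here.
    with torch.no_grad():
        for params in zip(*(m.parameters() for m in client_models)):
            avg = torch.stack([p.data for p in params]).mean(dim=0)
            for p in params:
                p.data.copy_(avg)
    return loss.item()

One design point worth noting: because the server's update in step 2 is taken on the union of all clients' samples, it behaves like centralized MiniBatch SGD, which is why the abstract argues the server-side updates are insensitive to the non-IID degree; only the client-side models in steps 3 and 4 are exposed to data heterogeneity.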

