FedSL: Federated Split Learning on Distributed Sequential Data in Recurrent Neural Networks

11/06/2020
by Ali Abedi, et al.

Federated Learning (FL) and Split Learning (SL) are privacy-preserving Machine Learning (ML) techniques that enable training ML models over data distributed among clients without requiring direct access to their raw data. Existing FL and SL approaches work on horizontally or vertically partitioned data and cannot handle sequentially partitioned data, where the segments of multi-segment sequential data are distributed across clients. In this paper, we propose a novel federated split learning framework, FedSL, to train models on distributed sequential data. The most common ML models for sequential data are Recurrent Neural Networks (RNNs). Because the proposed framework is privacy-preserving, segments of multi-segment sequential data cannot be shared between clients or between clients and the server. To overcome this constraint, we propose a novel SL approach tailored to RNNs. An RNN is split into sub-networks, and each sub-network is trained on a single client that holds only one segment of a multi-segment training sequence. During local training, the sub-networks on different clients communicate with each other to capture latent dependencies between consecutive segments held by different clients, but without sharing raw data or complete model parameters. After training the local sub-networks on their local segments, all clients send their sub-networks to a federated server, where the sub-networks are aggregated to generate a global model. Experimental results on simulated and real-world datasets demonstrate that the proposed method successfully trains models on distributed sequential data while preserving privacy, and outperforms previous FL and centralized learning approaches, achieving higher accuracy in fewer communication rounds.
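As a rough illustration of the data flow described above (not the authors' implementation), the following minimal NumPy sketch gives each client a vanilla RNN sub-network: a client processes only its local segment and forwards only the resulting hidden state to the client holding the next segment, and the server then aggregates the sub-network weights by element-wise averaging, FedAvg-style. All names (RNNSubNetwork, federated_average) and the toy dimensions are illustrative assumptions; local training and backpropagation through the exchanged hidden states are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

class RNNSubNetwork:
    """Vanilla RNN cell held by one client; it only ever sees that client's segment."""
    def __init__(self, input_size, hidden_size):
        self.W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
        self.W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
        self.b_h = np.zeros(hidden_size)

    def forward(self, segment, h):
        # segment: (timesteps, input_size) local raw data, never transmitted.
        # h: hidden state received from the client holding the previous segment.
        for x_t in segment:
            h = np.tanh(self.W_xh @ x_t + self.W_hh @ h + self.b_h)
        return h  # only this fixed-size hidden state crosses the client boundary

    def get_weights(self):
        return [self.W_xh, self.W_hh, self.b_h]

    def set_weights(self, weights):
        self.W_xh, self.W_hh, self.b_h = [w.copy() for w in weights]


def federated_average(subnetworks):
    """Server-side aggregation: element-wise mean of client sub-network weights."""
    stacked = zip(*(net.get_weights() for net in subnetworks))
    return [np.mean(ws, axis=0) for ws in stacked]


# Hypothetical setup: 3 clients, each holding one 5-step segment of a sequence.
input_size, hidden_size, num_clients = 4, 8, 3
clients = [RNNSubNetwork(input_size, hidden_size) for _ in range(num_clients)]
segments = [rng.normal(size=(5, input_size)) for _ in range(num_clients)]

# Forward pass chained across clients: only hidden states are exchanged.
h = np.zeros(hidden_size)
for net, segment in zip(clients, segments):
    h = net.forward(segment, h)

# After local training, the server aggregates sub-networks into a global model.
global_weights = federated_average(clients)
for net in clients:
    net.set_weights(global_weights)
```

Passing only the fixed-size hidden state between consecutive clients is what keeps both the raw segments and the complete model parameters local, matching the constraint stated in the abstract.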


Related research

04/25/2020  SplitFed: When Federated Learning Meets Split Learning
03/08/2022  LSTMSPLIT: Effective SPLIT Learning based LSTM on Sequential Time-Series Data
03/05/2021  FedDis: Disentangled Federated Learning for Unsupervised Brain Pathology Segmentation
04/26/2023  FedVS: Straggler-Resilient and Privacy-Preserving Vertical Federated Learning for Split Models
11/13/2020  Hybrid Federated and Centralized Learning
07/04/2022  Federated Split GANs
03/02/2020  Communication-Efficient Multimodal Split Learning for mmWave Received Power Prediction
