Visual Transformer Meets CutMix for Improved Accuracy, Communication Efficiency, and Data Privacy in Split Learning

by Sihun Baek, et al.

This article seeks a distributed learning solution for visual transformer (ViT) architectures. Compared with convolutional neural network (CNN) architectures, ViTs often have larger model sizes and are more computationally expensive, making federated learning (FL) ill-suited. Split learning (SL) sidesteps this problem by splitting the model and communicating only the hidden representations at the split layer, known as smashed data. However, the smashed data of a ViT are as large as, and as similar to, the input data, negating SL's communication efficiency while compromising data privacy. To resolve these issues, we propose a new form of CutSmashed data, created by randomly punching out and compressing the original smashed data. Leveraging this, we develop a novel SL framework for ViT, coined CutMixSL, which communicates CutSmashed data. CutMixSL not only reduces communication costs and privacy leakage, but also inherently incorporates CutMix data augmentation, improving accuracy and scalability. Simulations corroborate that CutMixSL outperforms baselines such as parallelized SL and SplitFed, which integrates FL with SL.
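To make the idea concrete, the following is a minimal sketch of the CutSmashed/CutMix mechanism described above: each client uploads only a complementary patch region of its smashed representation, and the server recombines the regions CutMix-style. All function names (`cut_smash`, `random_box`), tensor shapes, and the 50% keep fraction are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def random_box(h, w, frac=0.5, rng=None):
    """Sample a rectangular region covering roughly `frac` of the spatial area."""
    rng = rng or np.random.default_rng()
    bh, bw = int(h * frac ** 0.5), int(w * frac ** 0.5)
    y1 = int(rng.integers(0, h - bh + 1))
    x1 = int(rng.integers(0, w - bw + 1))
    return (x1, y1, x1 + bw, y1 + bh)

def cut_smash(smashed, keep_box):
    """Keep only the boxed region of the smashed data (rest zeroed),
    so each client uploads a punched-out, compressible fraction."""
    x1, y1, x2, y2 = keep_box
    out = np.zeros_like(smashed)
    out[:, y1:y2, x1:x2] = smashed[:, y1:y2, x1:x2]
    return out

# Two clients' smashed data at the split layer (channels x height x width).
rng = np.random.default_rng(0)
s_a = rng.standard_normal((8, 14, 14))
s_b = rng.standard_normal((8, 14, 14))

box = random_box(14, 14, frac=0.5, rng=rng)
x1, y1, x2, y2 = box

# Client A keeps the box; client B keeps the complement.
cut_a = cut_smash(s_a, box)
cut_b = s_b.copy()
cut_b[:, y1:y2, x1:x2] = 0.0

# Server-side CutMix: the complementary regions sum to one mixed input,
# and `lam` is the area ratio used to mix the two clients' labels.
mixed = cut_a + cut_b
lam = (y2 - y1) * (x2 - x1) / (14 * 14)
```

Because the kept regions are complementary, neither client's upload reveals the full representation, and each upload is sparser (hence more compressible) than the original smashed data.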




Reduce Communication Costs and Preserve Privacy: Prompt Tuning Method in Federated Learning

Federated learning (FL) has enabled global model training on decentraliz...

StatMix: Data augmentation method that relies on image statistics in federated learning

Availability of large amount of annotated data is one of the pillars of ...

Server-Side Local Gradient Averaging and Learning Rate Acceleration for Scalable Split Learning

In recent years, there have been great advances in the field of decentra...

Hybrid Architectures for Distributed Machine Learning in Heterogeneous Wireless Networks

The ever-growing data privacy concerns have transformed machine learning...

Multi-Task Distributed Learning using Vision Transformer with Random Patch Permutation

The widespread application of artificial intelligence in health research...

Distillation-Based Semi-Supervised Federated Learning for Communication-Efficient Collaborative Training with Non-IID Private Data

This study develops a federated learning (FL) framework overcoming large...

Communication-Efficient Multimodal Split Learning for mmWave Received Power Prediction

The goal of this study is to improve the accuracy of millimeter wave rec...