Multi-Task Distributed Learning using Vision Transformer with Random Patch Permutation

04/07/2022
by Sangjoon Park, et al.

The widespread application of artificial intelligence in health research is currently hampered by limitations in data availability. Distributed learning methods such as federated learning (FL) and split learning (SL) have been introduced to address this problem, as well as data management and ownership issues, each with its own strengths and weaknesses. The recently proposed federated split task-agnostic (FeSTA) learning attempts to reconcile the distinct merits of FL and SL by enabling multi-task collaboration between participants through a Vision Transformer (ViT) architecture, but it suffers from high communication overhead. To address this, we present p-FeSTA, a multi-task distributed learning method using ViT with random patch permutation. Instead of using a CNN-based head as in FeSTA, p-FeSTA adopts a simple patch embedder with random permutation, improving multi-task learning performance without sacrificing privacy. Experimental results confirm that the proposed method significantly enhances the benefit of multi-task collaboration, communication efficiency, and privacy preservation, shedding light on practical multi-task distributed learning in the field of medical imaging.
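The core idea of the randomly permuting patch embedder can be pictured with a minimal sketch. The PyTorch code below is illustrative only and is not the authors' implementation: the class name PermutingPatchEmbedder, the parameter defaults, and the choice to draw a fresh permutation per forward pass are assumptions made for the example, not details taken from the paper.

```python
import torch
import torch.nn as nn

class PermutingPatchEmbedder(nn.Module):
    """Hypothetical client-side patch embedder with random patch permutation.

    Projects image patches with a single strided convolution (a linear
    patch embedding, no CNN feature extractor) and then shuffles the
    patch order, so the token sequence sent to a shared Transformer
    body does not expose the original spatial layout.
    """

    def __init__(self, img_size=224, patch_size=16, in_chans=3, embed_dim=768):
        super().__init__()
        self.num_patches = (img_size // patch_size) ** 2
        # Linear patch embedding implemented as a non-overlapping convolution.
        self.proj = nn.Conv2d(in_chans, embed_dim,
                              kernel_size=patch_size, stride=patch_size)

    def forward(self, x):
        # (B, C, H, W) -> (B, num_patches, embed_dim)
        tokens = self.proj(x).flatten(2).transpose(1, 2)
        # Draw a random permutation of patch positions for this batch.
        perm = torch.randperm(self.num_patches, device=x.device)
        # Permuted tokens are what would be forwarded to the shared ViT body.
        return tokens[:, perm, :]
```

In a FeSTA-style setup, these permuted tokens would be the features a client transmits to the shared transformer body. How position information is handled relative to the permutation, and how the permutation interacts with the task-specific tails, depends on the actual method and is left out of this sketch.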


Related research

- Federated Multi-Task Learning (05/30/2017): Federated learning poses new statistical and systems challenges in train...
- CoFED: Cross-silo Heterogeneous Federated Multi-task Learning via Co-training (02/17/2022): Federated Learning (FL) is a machine learning technique that enables par...
- FeSViBS: Federated Split Learning of Vision Transformer with Block Sampling (06/26/2023): Data scarcity is a significant obstacle hindering the learning of powerf...
- User-Oriented Multi-Task Federated Deep Learning for Mobile Edge Computing (07/17/2020): Federated Learning (FL) is a recent approach for collaboratively trainin...
- Visual Transformer Meets CutMix for Improved Accuracy, Communication Efficiency, and Data Privacy in Split Learning (07/01/2022): This article seeks for a distributed learning solution for the visual tr...
- MS-DINO: Efficient Distributed Training of Vision Transformer Foundation Model in Medical Domain through Masked Sampling (01/05/2023): In spite of the recent success of deep learning in the medical domain, t...
- RHFedMTL: Resource-Aware Hierarchical Federated Multi-Task Learning (06/01/2023): The rapid development of artificial intelligence (AI) over massive appli...
