FewFedWeight: Few-shot Federated Learning Framework across Multiple NLP Tasks

12/16/2022
by Weilong Dong, et al.

Massively multi-task learning with large language models has recently made substantial progress on few-shot generalization. However, it is usually performed in a centralized learning fashion, ignoring the privacy sensitivity of the (annotated) data used across tasks. To mitigate this issue, we propose FewFedWeight, a few-shot federated learning framework across multiple tasks, to achieve the best of both worlds: privacy preservation and cross-task generalization. FewFedWeight trains client models on isolated devices without sharing data. It broadcasts the global model on the server to each client and produces pseudo data for clients, so that knowledge from the global model can be exploited to enhance few-shot learning of each client model. An energy-based algorithm is further proposed to weight pseudo samples in order to reduce the negative impact of noise in the generated pseudo data. Adaptive model weights of client models are also tuned according to their performance, and these weights are used to dynamically aggregate client models when updating the global model. Experiments on 118 NLP tasks show that FewFedWeight significantly improves the performance of client models on 61% of tasks, with an average performance improvement rate of 30.5%, and substantially outperforms FedAvg and other decentralized learning methods.
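The performance-weighted aggregation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the use of a plain normalized-score weighting, and the toy parameter vectors are all assumptions for demonstration purposes.

```python
# Hypothetical sketch of performance-weighted federated aggregation
# (FedAvg-style averaging, but with weights derived from each client's
# performance rather than its data size). Names are illustrative only.

def performance_weights(scores):
    """Normalize per-client performance scores into aggregation weights."""
    total = sum(scores)
    return [s / total for s in scores]

def aggregate(client_params, weights):
    """Weighted average of client parameter vectors to form the global model."""
    dim = len(client_params[0])
    return [sum(w * p[i] for w, p in zip(weights, client_params))
            for i in range(dim)]

# Toy example: three clients, each holding a 2-dimensional parameter vector.
params = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
scores = [0.9, 0.6, 0.5]  # e.g. few-shot dev-set accuracy per client
w = performance_weights(scores)
global_params = aggregate(params, w)
```

Clients that perform better contribute more to the global model; in FewFedWeight these weights are additionally adapted over rounds, which this sketch omits.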

Related research

06/30/2023 · FedBone: Towards Large-Scale Federated Multi-Task Learning
Heterogeneous federated multi-task learning (HFMTL) is a federated learn...

05/07/2021 · FedGL: Federated Graph Learning Framework with Global Self-Supervision
Graph data are ubiquitous in the real world. Graph learning (GL) tries t...

04/01/2021 · Federated Few-Shot Learning with Adversarial Learning
We are interested in developing a unified machine learning model over ma...

05/26/2022 · Federated Split BERT for Heterogeneous Text Classification
Pre-trained BERT models have achieved impressive performance in many nat...

12/12/2022 · Federated NLP in Few-shot Scenarios
Natural language processing (NLP) sees rich mobile applications. To supp...

08/18/2020 · Adaptive Distillation for Decentralized Learning from Heterogeneous Clients
This paper addresses the problem of decentralized learning to achieve a ...

05/20/2022 · AutoFedNLP: An efficient FedNLP framework
Transformer-based pre-trained models have revolutionized NLP for superio...
