Collaborating Heterogeneous Natural Language Processing Tasks via Federated Learning

12/12/2022
by   Chenhe Dong, et al.

Increasing privacy concerns over personal text data have promoted the development of federated learning (FL) in recent years. However, existing studies on applying FL to NLP are not suitable for coordinating participants with heterogeneous or private learning objectives. In this study, we further broaden the application scope of FL in NLP by proposing an Assign-Then-Contrast (denoted as ATC) framework, which enables clients with heterogeneous NLP tasks to construct an FL course and learn useful knowledge from each other. Specifically, in the Assign training stage, clients first perform local training with unified tasks assigned by the server rather than with their own learning objectives. After that, in the Contrast training stage, clients train with their different local learning objectives and exchange knowledge with other clients whose model updates are consistent and useful. We conduct extensive experiments on six widely-used datasets covering both Natural Language Understanding (NLU) and Natural Language Generation (NLG) tasks, and the proposed ATC framework achieves significant improvements compared with various baseline methods. The source code is available at <https://github.com/alibaba/FederatedScope/tree/master/federatedscope/nlp/hetero_tasks>.
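The two-stage idea in the abstract can be sketched in miniature: in the Assign stage all clients train on the same server-assigned task, so their updates are comparable and can simply be averaged; in the Contrast stage each client trains on its own objective and aggregates only updates from peers that point in a consistent direction. The sketch below is purely illustrative (function names, the cosine-similarity peer selection, and the FedAvg-style averaging are assumptions for exposition, not the authors' implementation; see the linked repository for the real code).

```python
# Illustrative sketch of an Assign-Then-Contrast-style aggregation loop.
# Models and updates are plain lists of floats; real clients would hold
# neural networks and gradients. All names here are hypothetical.
import math

def cosine(u, v):
    """Cosine similarity between two update vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return dot / (nu * nv) if nu and nv else 0.0

def assign_stage(global_model, client_updates):
    """Assign stage (sketch): every client trains on the same
    server-assigned task, so updates are comparable; the server
    averages them FedAvg-style into one shared model."""
    n = len(client_updates)
    avg = [sum(u[i] for u in client_updates) / n
           for i in range(len(global_model))]
    return [w + a for w, a in zip(global_model, avg)]

def contrast_stage(global_model, client_updates, threshold=0.0):
    """Contrast stage (sketch): each client keeps its own objective
    and aggregates only the updates of peers whose direction agrees
    with its own (cosine similarity above a threshold)."""
    personalized = []
    for u in client_updates:
        peers = [v for v in client_updates if cosine(u, v) > threshold]
        avg = [sum(p[i] for p in peers) / len(peers)
               for i in range(len(u))]
        personalized.append([w + a for w, a in zip(global_model, avg)])
    return personalized
```

For example, with updates `[1.0, 0.0]`, `[0.9, 0.1]`, and `[-1.0, 0.0]`, the first two clients would aggregate with each other in the Contrast stage, while the third (pointing the opposite way) would be left with only its own update.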

Related research

- 12/20/2022: When Federated Learning Meets Pre-trained Language Models' Parameter-Efficient Tuning Methods
- 01/28/2022: A Secure and Efficient Federated Learning Framework for NLP
- 05/26/2022: A Fair Federated Learning Framework With Reinforcement Learning
- 05/26/2022: Federated Split BERT for Heterogeneous Text Classification
- 06/06/2022: Pretrained Models for Multilingual Federated Learning
- 12/12/2022: Federated NLP in Few-shot Scenarios
- 05/16/2023: FedHGN: A Federated Framework for Heterogeneous Graph Neural Networks
