All Birds with One Stone: Multi-task Text Classification for Efficient Inference with One Forward Pass

05/22/2022
by Jiaxin Huang, et al.

Multi-Task Learning (MTL) models have shown their robustness, effectiveness, and efficiency in transferring learned knowledge across tasks. In real industrial applications such as web content classification, multiple classification tasks are predicted from the same input text, such as a web article. However, at serving time, existing multi-task transformer models, such as prompt- or adapter-based approaches, need to conduct N forward passes for N tasks, with O(N) computation cost. To tackle this problem, we propose a scalable method that achieves stronger performance at close to O(1) computation cost via only one forward pass. To illustrate real application usage, we release a multi-task dataset on news topic and style classification. Our experiments show that our proposed method outperforms strong baselines on both the GLUE benchmark and our news dataset. Our code and dataset are publicly available at https://bit.ly/mtop-code.
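
To make the serving-cost contrast concrete, the sketch below shows the general pattern the abstract alludes to: a single shared encoder forward pass whose output feeds one lightweight classification head per task, so adding a task adds only a linear projection rather than another full encoder pass. This is an illustrative assumption, not the paper's actual architecture; the class name MultiTaskClassifier and the choice of bert-base-uncased are hypothetical.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class MultiTaskClassifier(nn.Module):
    """Illustrative multi-head model: one shared encoder, N cheap task heads."""

    def __init__(self, encoder_name, num_labels_per_task):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # One small linear head per task; the expensive encoder is shared.
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, n) for n in num_labels_per_task]
        )

    def forward(self, **inputs):
        # Single forward pass through the shared encoder.
        cls = self.encoder(**inputs).last_hidden_state[:, 0]  # [CLS] token
        # Each task head is one matrix multiply over the same representation.
        return [head(cls) for head in self.heads]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = MultiTaskClassifier("bert-base-uncased", num_labels_per_task=[4, 2])
batch = tokenizer(["An article about the election."], return_tensors="pt")
with torch.no_grad():
    topic_logits, style_logits = model(**batch)  # one pass, two predictions

Under this pattern, serving N tasks costs one encoder pass plus N linear projections, which is why the per-example cost stays close to O(1) in the number of tasks.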

Related research

04/19/2017
Adversarial Multi-task Learning for Text Classification
Neural network models have shown their promising opportunities for multi...

05/17/2016
Recurrent Neural Network for Text Classification with Multi-Task Learning
Neural network based methods have obtained great progress on a variety o...

10/05/2020
Lifelong Language Knowledge Distillation
It is challenging to perform lifelong language learning (LLL) on a strea...

08/30/2021
N15News: A New Dataset for Multimodal News Classification
Current news datasets merely focus on text features on the news and rare...

03/04/2020
SeMemNN: A Semantic Matrix-Based Memory Neural Network for Text Classification
Text categorization is the task of assigning labels to documents written...
