Meta Multi-Task Learning for Sequence Modeling

02/25/2018
by Junkun Chen, et al.

Semantic composition functions have been playing a pivotal role in neural representation learning of text sequences. In spite of their success, most existing models suffer from an underfitting problem: they apply the same shared compositional function at every position in the sequence, and therefore lack the expressive power to capture the richness of compositionality. In addition, the composition functions of different tasks are independent and learned from scratch. In this paper, we propose a new scheme for sharing composition functions across multiple tasks. Specifically, we use a shared meta-network to capture the meta-knowledge of semantic composition and to generate the parameters of the task-specific semantic composition models. We conduct extensive experiments on two types of tasks, text classification and sequence tagging, which demonstrate the benefits of our approach. Furthermore, we show that the shared meta-knowledge learned by our proposed model can be regarded as off-the-shelf knowledge and easily transferred to new tasks.
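The central mechanism, a shared meta-network that emits the parameters of each task's composition function at every position, can be illustrated with a minimal sketch. The sketch below is not the authors' implementation: the class name MetaComposer, the per-task linear parameter generators, and the simple tanh composition cell are assumptions made for illustration; the actual model may use a different (e.g., LSTM-style) composition function.

```python
# Minimal illustrative sketch (PyTorch), assuming a shared meta-RNN that
# generates position-specific parameters for a task-specific composition cell.
import torch
import torch.nn as nn

class MetaComposer(nn.Module):
    def __init__(self, input_size, hidden_size, meta_size, num_tasks):
        super().__init__()
        # Shared meta-network: a single RNN reused by all tasks.
        self.meta_rnn = nn.LSTM(input_size, meta_size, batch_first=True)
        # Per-task generators map the meta state at step t to the weights of a
        # small composition function h_t = tanh(W_t [x_t; h_{t-1}] + b_t).
        self.param_gen = nn.ModuleList([
            nn.Linear(meta_size, hidden_size * (input_size + hidden_size) + hidden_size)
            for _ in range(num_tasks)
        ])
        self.input_size, self.hidden_size = input_size, hidden_size

    def forward(self, x, task_id):
        # x: (batch, seq_len, input_size)
        meta_states, _ = self.meta_rnn(x)                 # shared meta-knowledge
        batch, seq_len, _ = x.shape
        h = x.new_zeros(batch, self.hidden_size)
        outputs = []
        for t in range(seq_len):
            # Task-specific, position-specific parameters generated by the meta-network.
            params = self.param_gen[task_id](meta_states[:, t])
            W, b = params[:, :-self.hidden_size], params[:, -self.hidden_size:]
            W = W.view(batch, self.hidden_size, self.input_size + self.hidden_size)
            inp = torch.cat([x[:, t], h], dim=-1).unsqueeze(-1)
            h = torch.tanh(torch.bmm(W, inp).squeeze(-1) + b)
            outputs.append(h)
        return torch.stack(outputs, dim=1)                # (batch, seq_len, hidden_size)
```

Because the meta-network is shared while only the lightweight parameter generators are task-specific, the meta-knowledge of composition can, in principle, be reused when a new task is added, which mirrors the transfer setting described in the abstract.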
