Same Representation, Different Attentions: Shareable Sentence Representation Learning from Multiple Tasks

04/22/2018
by   Renjie Zheng, et al.

Distributed representations play an important role in deep-learning-based natural language processing. However, sentence representations are usually task-specific, learned from scratch, and limited by the amount of training data available for each task. In this paper, we claim that a good sentence representation should be invariant across tasks and benefit the various subsequent tasks. To this end, we propose a new information-sharing scheme for multi-task learning. More specifically, all tasks share the same sentence representation, and each task selects task-specific information from the shared representation with an attention mechanism. The query vector of each task's attention can be either a static parameter or generated dynamically. Extensive experiments on 16 different text classification tasks demonstrate the benefits of our architecture.
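The sharing scheme described in the abstract can be sketched as a single attention step: the shared encoder produces one sequence of hidden states, and each task reads it out with its own query vector. This is a minimal illustration, not the authors' implementation; the dot-product scoring, the softmax normalization, and all shapes here are assumptions.

```python
import math

def task_attention_readout(H, q):
    """Select a task-specific sentence vector from a shared representation.

    H: the shared sentence representation, a list of L hidden vectors
       (each a list of d floats) produced once by the shared encoder.
    q: a task-specific query vector of length d; per the paper it may be
       a static learned parameter or generated dynamically per input.
    Returns the attention-weighted sum of the shared hidden states.
    """
    # Dot-product score between the task query and each hidden state
    scores = [sum(hj * qj for hj, qj in zip(h, q)) for h in H]
    # Numerically stable softmax over positions
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # Task-specific sentence vector: weighted combination of shared states
    d = len(H[0])
    return [sum(w * h[j] for w, h in zip(weights, H)) for j in range(d)]
```

Because only the query differs between tasks, every task reuses the same encoder output while still extracting different information from it.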

Related research

08/23/2018
Exploring Shared Structures and Hierarchies for Multiple NLP Tasks
Designing shared neural architecture plays an important role in multi-ta...

05/25/2022
A Simple and Unified Tagging Model with Priming for Relational Structure Predictions
Relational structure extraction covers a wide range of tasks and plays a...

02/25/2018
Meta Multi-Task Learning for Sequence Modeling
Semantic composition functions have been playing a pivotal role in neura...

06/09/2021
DGA-Net Dynamic Gaussian Attention Network for Sentence Semantic Matching
Sentence semantic matching requires an agent to determine the semantic r...

12/27/2017
Combining Representation Learning with Logic for Language Processing
The current state-of-the-art in many natural language processing and aut...

04/29/2020
"The Boating Store Had Its Best Sail Ever": Pronunciation-attentive Contextualized Pun Recognition
Humor plays an important role in human languages and it is essential to ...

07/27/2021
Exceeding the Limits of Visual-Linguistic Multi-Task Learning
By leveraging large amounts of product data collected across hundreds of...
