Why and How to Pay Different Attention to Phrase Alignments of Different Intensities

04/23/2016
by Wenpeng Yin, et al.

This work comparatively studies two typical sentence pair classification tasks: textual entailment (TE) and answer selection (AS), observing that phrase alignments of different intensities contribute differently to these tasks. We address the problems of identifying phrase alignments of flexible granularity and of pooling alignments of different intensities for these tasks. Examples of flexible granularity are alignments between two single words, between a single word and a phrase, and between a short phrase and a long phrase. By intensity we roughly mean the degree of match; it ranges from identity through surface-form co-occurrence, rephrasing, and other semantic relatedness down to unrelated words, as in much parenthetical text. Prior work (i) has limitations in phrase generation and representation, (ii) conducts alignment at word and phrase levels with handcrafted features, or (iii) applies a single attention mechanism over alignment intensities without considering the characteristics of specific tasks, which limits a system's effectiveness across tasks. We propose an architecture based on the Gated Recurrent Unit (GRU) that supports (i) representation learning of phrases of arbitrary granularity and (ii) task-specific focusing on phrase alignments between two sentences via attention pooling. Experimental results on TE and AS match our observations and are state-of-the-art.
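The two ingredients named in the abstract, GRU representations of phrases of arbitrary granularity and attention pooling over alignment intensities, can be sketched roughly as follows. This is a minimal toy illustration, not the paper's actual model: the GRU weights are random rather than learned, cosine similarity stands in for the alignment intensity measure, and the softmax temperature used as the task-specific focusing knob is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # word/hidden dimension (toy size, an assumption)

# Toy GRU cell parameters (random here; a real system would learn them).
Wz, Uz = rng.normal(size=(d, d)) * 0.1, rng.normal(size=(d, d)) * 0.1
Wr, Ur = rng.normal(size=(d, d)) * 0.1, rng.normal(size=(d, d)) * 0.1
Wh, Uh = rng.normal(size=(d, d)) * 0.1, rng.normal(size=(d, d)) * 0.1

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_encode(words):
    """Run a GRU over a sequence of word vectors; the final hidden
    state serves as the phrase representation."""
    h = np.zeros(d)
    for x in words:
        z = sigmoid(Wz @ x + Uz @ h)            # update gate
        r = sigmoid(Wr @ x + Ur @ h)            # reset gate
        h_tilde = np.tanh(Wh @ x + Uh @ (r * h))
        h = (1 - z) * h + z * h_tilde
    return h

def phrase_reps(sent, max_len=3):
    """Encode every n-gram up to max_len words, i.e. phrases of
    flexible granularity (single words, short and longer phrases)."""
    reps = []
    for i in range(len(sent)):
        for j in range(i + 1, min(i + max_len, len(sent)) + 1):
            reps.append(gru_encode(sent[i:j]))
    return np.stack(reps)

def attentive_pool(A_reps, B_reps, temperature=1.0):
    """Score every phrase pair by cosine similarity (a stand-in for
    alignment intensity), then pool each phrase's strongest alignment
    with a softmax.  The temperature is a hypothetical task-specific
    knob: low values focus sharply on intense alignments."""
    An = A_reps / np.linalg.norm(A_reps, axis=1, keepdims=True)
    Bn = B_reps / np.linalg.norm(B_reps, axis=1, keepdims=True)
    S = An @ Bn.T                     # phrase-alignment intensity matrix
    best = S.max(axis=1)              # each A-phrase's strongest alignment
    w = np.exp(best / temperature)
    w /= w.sum()                      # attention weights over phrases
    return float(w @ best)            # pooled sentence-pair match score

# Toy sentences as random word vectors.
sent_a = [rng.normal(size=d) for _ in range(5)]
sent_b = [rng.normal(size=d) for _ in range(6)]
score = attentive_pool(phrase_reps(sent_a), phrase_reps(sent_b))
print(round(score, 4))
```

With this pooling, a sentence matched against itself scores 1.0 (every phrase aligns perfectly with itself), while unrelated sentences score lower, which is the qualitative behavior the paper's intensity-aware pooling is built around.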

