Enhancing Sentence Embedding with Generalized Pooling

06/26/2018
by Qian Chen, et al.

Pooling is an essential component of a wide variety of sentence representation and embedding models. This paper explores generalized pooling methods to enhance sentence embedding. We propose vector-based multi-head attention, which includes the widely used max pooling, mean pooling, and scalar self-attention as special cases. The model benefits from properly designed penalization terms that reduce redundancy in multi-head attention. We evaluate the proposed model on three different tasks: natural language inference (NLI), author profiling, and sentiment classification. The experiments show that the proposed model achieves significant improvement over strong sentence-encoding-based methods, resulting in state-of-the-art performance on four datasets. The proposed approach can also be readily applied to problems beyond those discussed in this paper.
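To make the idea concrete, below is a minimal PyTorch sketch of vector-based multi-head attention pooling with a simple redundancy penalty, reconstructed from the abstract alone. The module name, the layer sizes, and the exact form of the penalty are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class VectorMultiHeadPooling(nn.Module):
    """Pools token states into a fixed-size sentence vector using
    vector-based multi-head attention: each head assigns one weight per
    hidden dimension per token, rather than a single scalar per token."""

    def __init__(self, hidden_dim: int, num_heads: int = 4, attn_dim: int = 64):
        super().__init__()
        # One two-layer feed-forward scorer per head (sizes are assumptions).
        self.w1 = nn.ModuleList([nn.Linear(hidden_dim, attn_dim) for _ in range(num_heads)])
        self.w2 = nn.ModuleList([nn.Linear(attn_dim, hidden_dim) for _ in range(num_heads)])

    def forward(self, h: torch.Tensor):
        # h: (batch, seq_len, hidden_dim) token states from any sentence encoder.
        pooled, attn = [], []
        for w1, w2 in zip(self.w1, self.w2):
            scores = w2(torch.relu(w1(h)))           # (batch, seq_len, hidden_dim)
            weights = F.softmax(scores, dim=1)       # normalize over tokens, per dimension
            pooled.append((weights * h).sum(dim=1))  # dimension-wise weighted sum
            attn.append(weights)
        # Concatenate per-head sentence vectors: (batch, num_heads * hidden_dim)
        return torch.cat(pooled, dim=-1), torch.stack(attn, dim=1)


def redundancy_penalty(attn: torch.Tensor) -> torch.Tensor:
    """A simple Frobenius-norm penalty that discourages different heads
    from attending to the same tokens; the paper proposes carefully
    designed penalization terms, and this is only a rough stand-in."""
    # attn: (batch, num_heads, seq_len, hidden_dim)
    a = attn.mean(dim=-1)                        # (batch, heads, seq_len)
    gram = torch.bmm(a, a.transpose(1, 2))       # (batch, heads, heads)
    eye = torch.eye(a.size(1), device=attn.device).expand_as(gram)
    return ((gram - eye) ** 2).sum(dim=(1, 2)).mean()


if __name__ == "__main__":
    encoder_states = torch.randn(8, 20, 300)     # e.g. BiLSTM outputs for 8 sentences
    pooling = VectorMultiHeadPooling(hidden_dim=300, num_heads=4)
    sentence_vec, attn = pooling(encoder_states)
    print(sentence_vec.shape, redundancy_penalty(attn).item())  # torch.Size([8, 1200]) <float>
```

In this formulation, uniform attention weights recover mean pooling, a one-hot weight per dimension recovers max pooling, and collapsing each head's weight vector to a single scalar per token recovers scalar self-attention, which is how the method generalizes the standard pooling operators mentioned in the abstract.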


Related research

08/27/2018  Natural Language Inference with Hierarchical BiLSTM Max Pooling Architecture
11/26/2019  Low Rank Factorization for Compact Multi-Head Self-Attention
03/09/2017  A Structured Self-attentive Sentence Embedding
12/21/2017  Encoding CNN Activations for Writer Recognition
05/30/2016  Learning Natural Language Inference using Bidirectional LSTM model and Inner-Attention
05/01/2020  Why and when should you pool? Analyzing Pooling in Recurrent Architectures
08/25/2023  MMBAttn: Max-Mean and Bit-wise Attention for CTR Prediction
