An Introductory Survey on Attention Mechanisms in NLP Problems

11/12/2018
by Dichao Hu

First derived from human intuition and later adapted to machine translation for automatic token alignment, the attention mechanism is a simple method for encoding sequence data based on an importance score assigned to each element. It has been widely applied to, and has attained significant improvement in, various natural language processing tasks, including sentiment classification, text summarization, question answering, and dependency parsing. In this paper, we survey recent work and provide an introductory summary of the attention mechanism in different NLP problems, aiming to give readers basic knowledge of this widely used method, discuss its different variants for different tasks, explore its association with other techniques in machine learning, and examine methods for evaluating its performance.
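
As a quick illustration of the generic mechanism the abstract describes, below is a minimal sketch (in Python with NumPy, not taken from the paper) of basic dot-product attention: each sequence element receives an importance score against a query, the scores are normalized with a softmax, and the output is the score-weighted sum of the elements. All function and variable names here are illustrative.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

def attend(query, keys, values):
    """Dot-product attention over an encoded sequence.

    query:  (d,)     vector used to score each sequence element
    keys:   (n, d)   one key per sequence element
    values: (n, d_v) one value per sequence element
    Returns the attention-weighted summary of `values` and the weights.
    """
    scores = keys @ query        # importance score per element, shape (n,)
    weights = softmax(scores)    # normalized scores, sum to 1
    context = weights @ values   # weighted sum of values, shape (d_v,)
    return context, weights

# Toy usage: attend over a 4-token sequence of 8-dim encoder states.
rng = np.random.default_rng(0)
hidden = rng.normal(size=(4, 8))   # encoder hidden states
query = rng.normal(size=(8,))      # e.g. a decoder state
context, weights = attend(query, hidden, hidden)
print(weights)          # per-token importance scores
print(context.shape)    # (8,)
```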
