Clustering Text Using Attention

01/08/2022
by Lovedeep Singh, et al.

Clustering text has been an important problem in natural language processing. While there are techniques that cluster text by applying conventional clustering algorithms on top of contextual or non-contextual vector space representations, it remains an active area of research open to various improvements in the performance and implementation of these techniques. This paper discusses a novel technique for clustering text using attention mechanisms. Attention mechanisms have proven highly effective in various NLP tasks in recent times. This paper extends the idea of the attention mechanism into the clustering space and sheds some light on a whole new area of research.
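As context for the conventional pipeline the abstract contrasts against, here is a minimal sketch of clustering text by running a standard algorithm (K-means, via scikit-learn) over a non-contextual vector space representation (TF-IDF). The sample documents and the choice of k=2 are illustrative assumptions; this shows the baseline approach the abstract describes, not the attention-based method the paper itself proposes.

```python
# Minimal sketch of the conventional baseline: K-means clustering
# over non-contextual TF-IDF vectors (not the paper's attention method).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy corpus chosen for illustration only.
documents = [
    "the stock market rallied after the earnings report",
    "investors sold shares amid recession fears",
    "the team won the championship game last night",
    "the striker scored twice in the final match",
]

# Represent each document as a TF-IDF vector.
vectors = TfidfVectorizer().fit_transform(documents)

# Cluster the vectors with K-means; k=2 is an arbitrary choice here.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = kmeans.fit_predict(vectors)

for doc, label in zip(documents, labels):
    print(label, doc)
```

With this toy corpus, the two finance-related documents and the two sports-related documents typically land in separate clusters; an attention-based approach would instead learn which tokens to weight when forming the document representations.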


Related research

09/25/2020 · Attention Meets Perturbations: Robust and Interpretable Attention with Adversarial Training
In recent years, deep learning models have placed more emphasis on the i...

08/10/2018 · Hierarchical Attention: What Really Counts in Various NLP Tasks
Attention mechanisms in sequence to sequence models have shown great abi...

04/17/2023 · Improving Autoregressive NLP Tasks via Modular Linearized Attention
Various natural language processing (NLP) tasks necessitate models that ...

03/22/2019 · An end-to-end Neural Network Framework for Text Clustering
Unsupervised text clustering is one of the major tasks in natural la...

11/12/2018 · An Introductory Survey on Attention Mechanisms in NLP Problems
First derived from human intuition, later adapted to machine translation...

03/16/2017 · Improving Document Clustering by Eliminating Unnatural Language
Technical documents contain a fair amount of unnatural language, such as...

07/31/2017 · Skill2vec: Machine Learning Approaches for Determining the Relevant Skill from Job Description
Unsupervised word embeddings have seen tremendous success in num...
