Ultradense Word Embeddings by Orthogonal Transformation

02/24/2016
by Sascha Rothe et al.

Embeddings are generic representations that are useful for many NLP tasks. In this paper, we introduce DENSIFIER, a method that learns an orthogonal transformation of the embedding space which concentrates the information relevant for a task in an ultradense subspace whose dimensionality is smaller by a factor of 100 than that of the original space. We show that ultradense embeddings generated by DENSIFIER reach the state of the art on a lexicon creation task in which words are annotated with three types of lexical information: sentiment, concreteness, and frequency. On the SemEval-2015 Task 10B sentiment analysis task we show that no information is lost when the ultradense subspace is used, but training is an order of magnitude more efficient due to the compactness of the ultradense space.
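The core idea can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' implementation: it learns an orthogonal matrix Q that pushes apart embedding differences of word pairs from different groups (e.g. opposite sentiment) and pulls together pairs from the same group, measured only in the first k dimensions, and re-orthogonalizes Q by SVD after each gradient step. The function name `densify`, the learning rate, and the toy setup are all illustrative assumptions.

```python
import numpy as np

def densify(E, same_pairs, diff_pairs, k=1, lr=0.05, steps=300, seed=0):
    """Hedged sketch of the DENSIFIER idea (not the authors' code).

    E          : (n, d) array of word embeddings.
    same_pairs : index pairs of words from the same group (pull together).
    diff_pairs : index pairs of words from different groups (push apart).
    k          : size of the ultradense subspace (the first k dimensions
                 of the transformed space).
    Returns an orthogonal (d, d) matrix Q.
    """
    rng = np.random.default_rng(seed)
    d = E.shape[1]
    # Random orthogonal initialization.
    Q = np.linalg.svd(rng.normal(size=(d, d)))[0]
    P = np.eye(k, d)  # projector onto the first k dimensions

    for _ in range(steps):
        grad = np.zeros((d, d))
        for i, j in diff_pairs:          # maximize separation: negative grad
            u = E[i] - E[j]
            v = P.T @ P @ Q @ u          # component inside the subspace
            grad -= np.outer(v, u)
        for i, j in same_pairs:          # minimize separation: positive grad
            u = E[i] - E[j]
            v = P.T @ P @ Q @ u
            grad += np.outer(v, u)
        Q = Q - lr * grad
        # Project back onto the orthogonal group (nearest orthogonal matrix).
        U, _, Vt = np.linalg.svd(Q)
        Q = U @ Vt
    return Q
```

With k=1 this amounts to rotating the space so that a single coordinate carries the group-discriminating information (e.g. a one-dimensional sentiment axis), which is how the abstract's lexicon creation setting can be read.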


Related research:

10/20/2016 · Lexicon Integrated CNN Models with Attention for Sentiment Analysis
With the advent of word embeddings, lexicons are no longer fully utilize...

11/05/2020 · Learning Efficient Task-Specific Meta-Embeddings with Word Prisms
Word embeddings are trained to predict word cooccurrence statistics, whi...

10/08/2015 · Mapping Unseen Words to Task-Trained Embedding Spaces
We consider the supervised training setting in which we learn task-speci...

01/15/2013 · The Expressive Power of Word Embeddings
We seek to better understand the difference in quality of the several pu...

09/30/2018 · Zero-training Sentence Embedding via Orthogonal Basis
We propose a simple and robust training-free approach for building sente...

09/18/2019 · Decision-Directed Data Decomposition
We present an algorithm, Decision-Directed Data Decomposition, which dec...

07/29/2019 · Learning Invariant Representations for Sentiment Analysis: The Missing Material is Datasets
Learning representations which remain invariant to a nuisance factor has...
