Semantic Representations of Word Senses and Concepts

08/02/2016
by Jose Camacho-Collados, et al.

Representing the semantics of linguistic items in a machine-interpretable form has been a major goal of Natural Language Processing (NLP) since its earliest days. Among the range of linguistic items, words have attracted the most research attention. However, word representations have an important limitation: they conflate the different meanings of a word into a single vector. Representations of word senses have the potential to overcome this inherent limitation. Indeed, the representation of individual word senses and concepts has recently gained in popularity, with several experimental results showing that a considerable performance improvement can be achieved across different NLP applications by moving from the word level to the deeper sense and concept levels. Another interesting point regarding the representation of concepts and word senses is that these models can be seamlessly applied to other linguistic items, such as words, phrases and sentences.
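The meaning-conflation problem mentioned above can be illustrated with a minimal sketch. Here, all vectors and sense labels are hypothetical toy data: a word-level model collapses every occurrence of the ambiguous word "bank" into one centroid, while a sense-level model keeps a separate vector per sense.

```python
# Toy illustration of the meaning-conflation problem.
# All embeddings below are hand-constructed 2-d points, not real model output.

def centroid(vectors):
    """Element-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

# Hypothetical context embeddings for occurrences of the ambiguous word "bank".
financial_contexts = [[0.9, 0.1], [0.8, 0.2], [1.0, 0.0]]   # e.g. "loan", "deposit"
river_contexts     = [[0.1, 0.9], [0.0, 1.0], [0.2, 0.8]]   # e.g. "shore", "water"

# Word-level model: a single vector that conflates both senses,
# landing between the two context clusters.
word_vector = centroid(financial_contexts + river_contexts)

# Sense-level model: one vector per sense, each staying close
# to the contexts of that sense only.
sense_vectors = {
    "bank%finance": centroid(financial_contexts),
    "bank%river":   centroid(river_contexts),
}

print(word_vector)                    # midpoint of the two clusters
print(sense_vectors["bank%finance"])  # near the financial contexts
```

The word-level vector ends up equidistant from both clusters and so represents neither sense well, which is exactly the limitation that sense and concept representations aim to remove.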


