Cross-topic distributional semantic representations via unsupervised mappings

04/11/2019
by Eleftheria Briakou, et al.

In traditional Distributional Semantic Models (DSMs), the multiple senses of a polysemous word are conflated into a single vector space representation. In this work, we propose a DSM that learns multiple distributional representations of a word based on different topics. First, a separate DSM is trained for each topic, and then each of the topic-based DSMs is aligned to a common vector space. Our unsupervised mapping approach is motivated by the hypothesis that words preserving their relative distances across topic semantic sub-spaces constitute robust semantic anchors that define the mappings between them. The aligned cross-topic representations achieve state-of-the-art results on the task of contextual word similarity. Furthermore, evaluation on downstream NLP tasks shows that multiple topic-based embeddings outperform single-prototype models.
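The alignment step can be pictured with a short sketch. The snippet below is an illustrative assumption of how the anchor hypothesis might be operationalized, not the authors' exact pipeline: two synthetic matrices stand in for separately trained per-topic DSMs over a shared vocabulary, each word is scored by how well its similarity profile agrees across the two sub-spaces, and an orthogonal Procrustes map (a common choice for embedding-space alignment; the paper's specific mapping may differ) learned on the top-scoring anchors aligns one space to the other. All names and the anchor-selection heuristic are hypothetical.

import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(0)
vocab = [f"w{i}" for i in range(200)]  # shared vocabulary of the two topic DSMs
dim = 50

# Stand-ins for two topic-specific DSMs: topic_b is a rotated, noisy copy
# of topic_a, mimicking sub-spaces trained on different topic sub-corpora.
topic_a = rng.normal(size=(len(vocab), dim))
R_true = np.linalg.qr(rng.normal(size=(dim, dim)))[0]
topic_b = topic_a @ R_true + 0.01 * rng.normal(size=(len(vocab), dim))

def pairwise_sim(X):
    """Cosine similarity of every word with every other word."""
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    return X @ X.T

# Anchor selection (assumed heuristic): a word is a good anchor if its
# similarity profile to the rest of the vocabulary is preserved across
# the two sub-spaces, i.e. its relative distances barely change.
sim_a, sim_b = pairwise_sim(topic_a), pairwise_sim(topic_b)
agreement = np.array(
    [np.corrcoef(sim_a[i], sim_b[i])[0, 1] for i in range(len(vocab))]
)
anchors = np.argsort(agreement)[-50:]  # 50 most distance-preserving words

# Orthogonal Procrustes on the anchors maps topic_b into topic_a's space.
R, _ = orthogonal_procrustes(topic_b[anchors], topic_a[anchors])
topic_b_aligned = topic_b @ R

err = np.linalg.norm(topic_b_aligned - topic_a) / np.linalg.norm(topic_a)
print(f"relative alignment error: {err:.4f}")  # small: spaces are aligned

In a real setting the two matrices would come from per-topic DSMs (e.g. word2vec models trained on topic-specific sub-corpora) restricted to their shared vocabulary, and the number of anchors would be tuned on a held-out similarity task rather than fixed at 50.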


Related research

09/17/2019 · Multi Sense Embeddings from Topic Models
Distributed word embeddings have yielded state-of-the-art performance in...

06/29/2016 · A Distributional Semantics Approach to Implicit Language Learning
In the present paper we show that distributional information is particul...

12/14/2016 · Hypernyms under Siege: Linguistically-motivated Artillery for Hypernymy Detection
The fundamental role of hypernymy in NLP has motivated the development o...

08/24/2016 · Improving Sparse Word Representations with Distributional Inference for Semantic Composition
Distributional models are derived from co-occurrences in a corpus, where...

05/01/2017 · Learning Topic-Sensitive Word Representations
Distributed word representations are widely used for modeling words in N...

07/16/2015 · Exploratory topic modeling with distributional semantics
As we continue to collect and store textual data in a multitude of domai...

04/01/2016 · Nonparametric Spherical Topic Modeling with Word Embeddings
Traditional topic models do not account for semantic regularities in lan...
