Deep Generative Model for Joint Alignment and Word Representation

02/16/2018
by Miguel Rios, et al.

This work exploits translation data as a source of semantically relevant learning signal for models of word representation. In particular, we treat equivalence through translation as a form of distributed context and jointly learn how to embed and align with a deep generative model. Our EmbedAlign model embeds words in their complete observed context and learns by marginalisation of latent lexical alignments. Moreover, it embeds words as posterior probability densities rather than point estimates, which allows us to compare words in context using a measure of overlap between distributions (e.g. KL divergence). We investigate our model's performance on a range of lexical semantics tasks, achieving competitive results on several standard benchmarks including natural language inference, paraphrasing, and text similarity.
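Two of the mechanisms the abstract names lend themselves to a compact illustration: marginalising a latent lexical alignment out of the per-word likelihood, and comparing two contextual word posteriors with a KL divergence. The sketch below is not the paper's implementation; it assumes diagonal Gaussian posteriors and a uniform alignment prior, and the function names (marginal_log_likelihood, kl_diag_gaussians) and numeric values are illustrative only.

```python
import numpy as np
from scipy.special import logsumexp

def marginal_log_likelihood(log_p_y_given_z):
    """log p(y_j | x_1..x_m): marginalise the latent alignment a_j over the m
    source positions, assuming a uniform alignment prior (IBM-1-style).
    log_p_y_given_z[i] holds log p(y_j | z_i) for source position i."""
    m = log_p_y_given_z.shape[0]
    return logsumexp(log_p_y_given_z) - np.log(m)

def kl_diag_gaussians(mu0, var0, mu1, var1):
    """KL( N(mu0, diag(var0)) || N(mu1, diag(var1)) ) for diagonal Gaussians."""
    return 0.5 * np.sum(np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

# Illustrative per-position log-likelihoods of one target word under each
# latent source embedding z_i (values made up for the example).
log_p = np.log(np.array([0.02, 0.55, 0.10, 0.01]))
print(f"log p(y_j | x): {marginal_log_likelihood(log_p):.3f}")

# Illustrative posterior parameters for two occurrences of the same word type
# in different sentences; a lower symmetrised KL suggests closer senses.
mu_a, var_a = np.array([0.2, -1.1, 0.5]), np.array([0.3, 0.2, 0.4])
mu_b, var_b = np.array([1.4,  0.3, -0.7]), np.array([0.2, 0.5, 0.3])
sym_kl = 0.5 * (kl_diag_gaussians(mu_a, var_a, mu_b, var_b)
                + kl_diag_gaussians(mu_b, var_b, mu_a, var_a))
print(f"symmetrised KL between contextual posteriors: {sym_kl:.3f}")
```

In the full model the per-position likelihoods and the Gaussian parameters would come from neural networks trained end-to-end with variational inference; here they are fixed numbers purely to show the two computations.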



Related research

11/29/2017  Embedding Words as Distributions with a Bayesian Skip-gram Model
We introduce a method for embedding words as probability densities in a ...

05/28/2018  A Stochastic Decoder for Neural Machine Translation
The process of translation is ambiguous, in that there are typically man...

09/28/2020  Generative latent neural models for automatic word alignment
Word alignments identify translational correspondences between words in ...

06/07/2022  Always Keep your Target in Mind: Studying Semantics and Improving Performance of Neural Lexical Substitution
Lexical substitution, i.e. generation of plausible words that can replac...

11/12/2018  Unseen Word Representation by Aligning Heterogeneous Lexical Semantic Spaces
Word embedding techniques heavily rely on the abundance of training data...

05/23/2023  Deep Generative Model for Simultaneous Range Error Mitigation and Environment Identification
Received waveforms contain rich information for both range information a...

07/08/2018  A Deep Generative Model of Vowel Formant Typology
What makes some types of languages more probable than others? For instan...
