Encoding word order in complex embeddings

12/27/2019
by Benyou Wang, et al.

Sequential word order matters when processing text. Currently, neural networks (NNs) address this by modeling word position with position embeddings. The problem is that position embeddings capture the position of individual words, but not the ordered relationship (e.g., adjacency or precedence) between individual word positions. We present a novel and principled solution for modeling both the global absolute positions of words and their order relationships. Our solution generalizes word embeddings, previously defined as independent vectors, to continuous word functions over a variable (position). The benefit of continuous functions over variable positions is that word representations shift smoothly with increasing position; hence, word representations at different positions correlate with each other through a continuous function. The general solution of these functions is extended to the complex-valued domain, which yields richer representations. We extend CNN, RNN and Transformer NNs to complex-valued versions to incorporate our complex embedding (we make all code available). Experiments on text classification, machine translation and language modeling show gains over both classical word embeddings and position-enriched word embeddings. To our knowledge, this is the first work in NLP to link imaginary numbers in complex-valued representations to concrete meanings (i.e., word order).
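As a rough illustration of the idea (a minimal sketch, not the authors' released code), the snippet below builds such a position-dependent complex embedding in NumPy: each dimension of a word's representation is a wave of the form amplitude * exp(i * (frequency * position + phase)), so the same word at neighboring positions differs only by a smooth phase rotation. The names amplitude, frequency and initial_phase are illustrative assumptions, not identifiers from the paper's code.

    # Sketch of a complex-valued, position-dependent word embedding:
    # embedding(word, pos) = amplitude * exp(i * (frequency * pos + phase)),
    # computed independently per embedding dimension.
    import numpy as np

    rng = np.random.default_rng(0)
    vocab_size, dim = 1000, 8

    # Per-word, per-dimension parameters of the continuous word function.
    amplitude = rng.normal(size=(vocab_size, dim))           # plays the role of a classical word embedding
    frequency = rng.normal(size=(vocab_size, dim))           # how fast the phase rotates with position
    initial_phase = rng.uniform(0, 2 * np.pi, size=(vocab_size, dim))

    def complex_embedding(word_ids, positions):
        """Complex embeddings for words at given positions.

        word_ids:  (seq_len,) integer word indices
        positions: (seq_len,) positions in the sentence
        """
        amp = amplitude[word_ids]                             # (seq_len, dim)
        freq = frequency[word_ids]
        phase = initial_phase[word_ids]
        angle = freq * positions[:, None] + phase             # phase grows linearly with position
        return amp * np.exp(1j * angle)                       # (seq_len, dim), complex dtype

    # The same word at adjacent positions yields smoothly rotated vectors,
    # so relative order is encoded in phase differences.
    words = np.array([5, 5, 42])
    pos = np.array([0, 1, 2])
    emb = complex_embedding(words, pos)
    print(emb.shape, emb.dtype)                               # (3, 8) complex128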
