
Modelling Lexical Ambiguity with Density Matrices

by Francois Meyer, et al.

Words can have multiple senses. Compositional distributional models of meaning have been argued to deal well with finer shades of meaning variation known as polysemy, but are not so well equipped to handle word senses that are etymologically unrelated, or homonymy. Moving from vectors to density matrices allows us to encode a probability distribution over different senses of a word, and can also be accommodated within a compositional distributional model of meaning. In this paper we present three new neural models for learning density matrices from a corpus, and test their ability to discriminate between word senses on a range of compositional datasets. When paired with a particular composition method, our best model outperforms existing vector-based compositional models as well as strong sentence encoders.
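The core idea of moving from vectors to density matrices can be illustrated concretely: a homonymous word is represented as a probability-weighted mixture of rank-1 matrices, one per sense. The sketch below is illustrative only, not one of the paper's neural models; the sense vectors and mixture weights are made-up placeholders, assuming NumPy.

```python
import numpy as np

# Hypothetical sense vectors for the homonym "bank" (illustrative values):
# one for the financial sense, one for the riverside sense.
v_financial = np.array([1.0, 0.2, 0.0])
v_riverside = np.array([0.0, 0.3, 1.0])

def pure_state(v):
    """Return the rank-1 density matrix |v><v| for a normalised vector v."""
    v = v / np.linalg.norm(v)
    return np.outer(v, v)

# Mix the senses with assumed (placeholder) sense probabilities.
weights = {"financial": 0.7, "riverside": 0.3}
rho = (weights["financial"] * pure_state(v_financial)
       + weights["riverside"] * pure_state(v_riverside))

# A valid density matrix is symmetric, positive semi-definite, and has
# trace 1, so it encodes a probability distribution over senses.
assert np.allclose(rho, rho.T)
assert np.isclose(np.trace(rho), 1.0)
assert np.all(np.linalg.eigvalsh(rho) >= -1e-9)
```

An unambiguous word reduces to the rank-1 case (a single pure state), so the density-matrix representation strictly generalises the usual vector one.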




Concrete Sentence Spaces for Compositional Distributional Models of Meaning

Coecke, Sadrzadeh, and Clark (arXiv:1003.4394v1 [cs.CL]) developed a com...

Syntax-Aware Multi-Sense Word Embeddings for Deep Compositional Models of Meaning

Deep compositional models of meaning acting on distributional representa...

Quantum Inspired Word Representation and Computation

Word meaning has different aspects, while the existing word representati...

A Compositional Explanation of the Pet Fish Phenomenon

The `pet fish' phenomenon is often cited as a paradigm example of the `n...

Collaborative Training of Tensors for Compositional Distributional Semantics

Type-based compositional distributional semantic models present an inter...

Dual Density Operators and Natural Language Meaning

Density operators allow for representing ambiguity about a vector repres...

Compositional Distributional Semantics with Compact Closed Categories and Frobenius Algebras

This thesis contributes to ongoing research related to the categorical c...