Learning Sense-Specific Static Embeddings using Contextualised Word Embeddings as a Proxy

10/05/2021
by Yi Zhou, et al.

Contextualised word embeddings generated by Neural Language Models (NLMs), such as BERT, represent a word with a vector that reflects both the semantics of the target word and its context. Static word embeddings such as GloVe, on the other hand, represent words with relatively low-dimensional, memory- and compute-efficient vectors, but are not sensitive to the different senses of a word. We propose Context Derived Embeddings of Senses (CDES), a method that extracts sense-related information from contextualised embeddings and injects it into static embeddings to create sense-specific static embeddings. Experimental results on multiple benchmarks for word sense disambiguation and sense discrimination show that CDES accurately learns sense-specific static embeddings, reporting performance comparable to the current state-of-the-art sense embeddings.
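
The abstract does not spell out how the sense information is extracted and injected, so the following is only a minimal sketch of the general idea, not the authors' exact recipe: average contextualised (BERT) vectors of a word over sense-tagged example sentences to obtain a sense centroid, then fuse that centroid with the word's static vector. The model names, the toy sense inventory for "bank", the random stand-in for a GloVe vector, and fusion by concatenation are all illustrative assumptions.

import torch
import numpy as np
from transformers import AutoTokenizer, AutoModel

# Assumed contextualised encoder; the paper's setup may differ.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
bert.eval()

def contextual_vector(sentence: str, target: str) -> np.ndarray:
    """Mean of the last-layer BERT vectors of the target word's subtokens."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = bert(**enc).last_hidden_state[0]          # (seq_len, 768)
    target_ids = tokenizer(target, add_special_tokens=False)["input_ids"]
    tokens = enc["input_ids"][0].tolist()
    # Locate the first occurrence of the target's subtoken span.
    for i in range(len(tokens) - len(target_ids) + 1):
        if tokens[i:i + len(target_ids)] == target_ids:
            return hidden[i:i + len(target_ids)].mean(dim=0).numpy()
    raise ValueError(f"{target!r} not found in {sentence!r}")

# Toy sense-tagged contexts for "bank" (illustrative only).
sense_contexts = {
    "bank%finance": ["She deposited the cheque at the bank.",
                     "The bank approved a small business loan."],
    "bank%river":   ["They had a picnic on the bank of the river.",
                     "Erosion wore away the muddy bank."],
}

# Sense centroid = average contextualised vector over that sense's contexts.
sense_centroids = {
    sense: np.mean([contextual_vector(s, "bank") for s in sents], axis=0)
    for sense, sents in sense_contexts.items()
}

# Fuse with a static vector; here a random stand-in for a 300-d GloVe vector,
# and fusion by plain concatenation (an assumption, not the paper's method).
static_bank = np.random.randn(300).astype(np.float32)
sense_embeddings = {
    sense: np.concatenate([static_bank, centroid])
    for sense, centroid in sense_centroids.items()
}
print({sense: vec.shape for sense, vec in sense_embeddings.items()})

The resulting vectors are static (they can be looked up without running BERT at test time) yet distinguish the financial and riverside senses of "bank", which is the behaviour CDES aims for.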

Related research

08/10/2017  Making Sense of Word Embeddings
We present a simple yet effective approach for learning word sense embed...

02/08/2019  Humor in Word Embeddings: Cockamamie Gobbledegook for Nincompoops
We study humor in Word Embeddings, a popular AI tool that associates eac...

11/09/2020  Catch the "Tails" of BERT
Recently, contextualized word embeddings outperform static word embeddin...

03/14/2022  Sense Embeddings are also Biased: Evaluating Social Biases in Static and Contextualised Sense Embeddings
Sense embedding learning methods learn different embeddings for the diff...

10/14/2021  Large Scale Substitution-based Word Sense Induction
We present a word-sense induction method based on pre-trained masked lan...

09/23/2019  Does BERT Make Any Sense? Interpretable Word Sense Disambiguation with Contextualized Embeddings
Contextualized word embeddings (CWE) such as provided by ELMo (Peters et...

11/07/2019  Using Dynamic Embeddings to Improve Static Embeddings
How to build high-quality word embeddings is a fundamental research ques...
