Dynamic Contextualized Word Embeddings

10/23/2020
by Valentin Hofmann et al.

Static word embeddings that represent words by a single vector cannot capture the variability of word meaning in different linguistic and extralinguistic contexts. Building on prior work on contextualized and dynamic word embeddings, we introduce dynamic contextualized word embeddings that represent words as a function of both linguistic and extralinguistic context. Based on a pretrained language model (PLM), dynamic contextualized word embeddings model time and social space jointly, which makes them attractive for various tasks in the computational social sciences. We highlight potential applications by means of qualitative and quantitative analyses.
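To make the idea concrete, here is a minimal sketch of how such a model might be wired up in PyTorch with Hugging Face transformers. This is an illustrative assumption, not the authors' exact implementation: the names DynamicOffset and social_dim, the hidden sizes, and the choice of bert-base-uncased are all hypothetical. The sketch follows the abstract's description: a small network conditioned on extralinguistic context (a time index and a social vector) shifts the PLM's input embeddings before the PLM contextualizes them.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class DynamicOffset(nn.Module):
    """Hypothetical module: maps extralinguistic context (a scalar time
    plus a social vector) to an offset in the PLM's embedding space."""

    def __init__(self, social_dim: int, hidden_dim: int, embed_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(social_dim + 1, hidden_dim),  # +1 for the time scalar
            nn.Tanh(),
            nn.Linear(hidden_dim, embed_dim),
        )

    def forward(self, time: torch.Tensor, social: torch.Tensor) -> torch.Tensor:
        # time: (batch, 1), social: (batch, social_dim) -> (batch, embed_dim)
        return self.net(torch.cat([time, social], dim=-1))


class DynamicContextualizedEmbedder(nn.Module):
    """Shifts the PLM's input embeddings by the dynamic offset, then lets
    the PLM contextualize them, so the output token vectors depend on
    linguistic and extralinguistic context jointly."""

    def __init__(self, plm_name: str = "bert-base-uncased", social_dim: int = 16):
        super().__init__()
        self.plm = AutoModel.from_pretrained(plm_name)
        self.offset = DynamicOffset(social_dim, 64, self.plm.config.hidden_size)

    def forward(self, input_ids, attention_mask, time, social):
        inputs_embeds = self.plm.get_input_embeddings()(input_ids)
        shift = self.offset(time, social).unsqueeze(1)  # broadcast over tokens
        out = self.plm(inputs_embeds=inputs_embeds + shift,
                       attention_mask=attention_mask)
        return out.last_hidden_state  # (batch, seq_len, hidden_size)


tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = DynamicContextualizedEmbedder()
batch = tokenizer(["the cell divided"], return_tensors="pt")
time = torch.tensor([[0.5]])    # e.g. a normalized publication year
social = torch.randn(1, 16)     # e.g. a graph embedding of the author
vectors = model(batch["input_ids"], batch["attention_mask"], time, social)
```

Applying the shift at the input rather than the output lets the self-attention layers propagate the extralinguistic signal through the whole sentence, which is what makes the resulting token vectors dynamic and contextualized at the same time.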

Related research

07/22/2019: Learning dynamic word embeddings with drift regularisation
Word usage, meaning and connotation change throughout time. Diachronic w...

11/20/2016: Visualizing Linguistic Shift
Neural network based models are a very powerful tool for creating word e...

07/22/2021: Theoretical foundations and limits of word embeddings: what types of meaning can they capture?
Measuring meaning is a central problem in cultural sociology and word em...

04/06/2019: Simple dynamic word embeddings for mapping perceptions in the public sphere
Word embeddings trained on large-scale historical corpora can illuminate...

10/07/2020: Analogies minus analogy test: measuring regularities in word embeddings
Vector space models of words have long been claimed to capture linguisti...

07/31/2020: Evaluating Semantic Interaction on Word Embeddings via Simulation
Semantic interaction (SI) attempts to learn the user's cognitive intents...

12/19/2022: Independent Components of Word Embeddings Represent Semantic Features
Independent Component Analysis (ICA) is an algorithm originally develope...
