Dynamic Contextualized Word Embeddings

10/23/2020
by Valentin Hofmann et al.

Static word embeddings, which represent each word by a single vector, cannot capture the variability of word meaning across different linguistic and extralinguistic contexts. Building on prior work on contextualized and dynamic word embeddings, we introduce dynamic contextualized word embeddings that represent words as a function of both linguistic and extralinguistic context. Based on a pretrained language model (PLM), dynamic contextualized word embeddings model time and social space jointly, which makes them attractive for various tasks in the computational social sciences. We highlight potential applications by means of qualitative and quantitative analyses.
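The abstract describes the idea only at a high level. As a rough, non-authoritative sketch of what "words as a function of both linguistic and extralinguistic context" can look like in code, the PyTorch module below offsets a PLM's contextualized token embeddings by a vector computed from a timestamp and a social-context vector. Everything here is an illustrative assumption rather than the authors' architecture: the class name, the MLP used as the dynamic component, the tensor shapes, and the assumption that `plm` is a Hugging Face-style encoder exposing `last_hidden_state`.

```python
import torch
import torch.nn as nn


class DynamicContextualizedEmbedding(nn.Module):
    """Illustrative sketch, not the paper's exact model.

    Combines linguistic context (a PLM's contextualized embeddings)
    with extralinguistic context (time and social position) by adding
    a learned offset vector to every token embedding in a sequence.
    """

    def __init__(self, plm, hidden_dim=768, social_dim=50, time_dim=1):
        super().__init__()
        self.plm = plm  # assumed: a Hugging Face-style encoder, e.g. BertModel
        # Hypothetical dynamic component: maps (time, social vector) to an offset.
        self.dynamic = nn.Sequential(
            nn.Linear(time_dim + social_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, input_ids, attention_mask, time, social):
        # Linguistic context: contextualized token embeddings from the PLM.
        # Shape: (batch, seq_len, hidden_dim)
        hidden = self.plm(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        # Extralinguistic context: one offset per sequence,
        # broadcast over all tokens.
        extra = torch.cat([time, social], dim=-1)   # (batch, time_dim + social_dim)
        offset = self.dynamic(extra).unsqueeze(1)   # (batch, 1, hidden_dim)
        return hidden + offset


# Usage sketch (model names are examples, not prescribed by the paper):
# from transformers import BertModel
# plm = BertModel.from_pretrained("bert-base-uncased")
# model = DynamicContextualizedEmbedding(plm)
```

Where the offset enters the network is a design choice: this sketch applies it to the PLM's output for simplicity, but it could equally be injected at the input-embedding layer so that the PLM itself contextualizes the time- and socially-shifted representations.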
