Dynamic Bernoulli Embeddings for Language Evolution

03/23/2017
by Maja Rudolph, et al.

Word embeddings are a powerful approach for unsupervised analysis of language. Recently, Rudolph et al. (2016) developed exponential family embeddings, which cast word embeddings in a probabilistic framework. Here, we develop dynamic embeddings, building on exponential family embeddings to capture how the meanings of words change over time. We use dynamic embeddings to analyze three large collections of historical texts: U.S. Senate speeches from 1858 to 2009, ACM abstracts spanning the history of computer science from 1951 to 2014, and machine learning papers on the arXiv from 2007 to 2015. We find that dynamic embeddings provide better fits than classical embeddings and capture interesting patterns in how language changes.
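The abstract does not spell out the model, but the basic construction can be sketched. The following is a rough outline, assuming the standard dynamic Bernoulli embedding construction; the symbols ρ, α, σ₀, the context set c_i, and the time index t_i are notation chosen here for illustration, not taken from the text above. Each word v keeps one context vector α_v shared across all time slices, while its embedding vector ρ_v^(t) is allowed to drift between slices under a Gaussian random walk.

```latex
% Sketch of a dynamic Bernoulli embedding (assumed notation, not from the
% abstract above).
% x_{iv}: indicator that word v occurs at position i;
% c_i: context window around position i; t_i: time slice of position i;
% \sigma(\cdot): the logistic function.

% Conditional likelihood: a Bernoulli whose natural parameter is the inner
% product of the word's time-specific embedding with the summed context
% vectors of the surrounding words.
x_{iv} \mid \mathbf{x}_{c_i} \sim
  \mathrm{Bernoulli}\!\left(\sigma\!\Big(\rho_v^{(t_i)\top}
    \sum_{j \in c_i} \sum_{v'} \alpha_{v'}\, x_{j v'}\Big)\right)

% Gaussian random-walk prior tying each word's embedding across consecutive
% time slices; the variance \sigma_0^2 controls how quickly meaning may drift.
\rho_v^{(t)} \sim \mathcal{N}\!\left(\rho_v^{(t-1)},\; \sigma_0^2 I\right)
```

The random walk is what turns a static Bernoulli embedding into a dynamic one: it favors smoothly drifting vectors, so a word's fitted embeddings can trace its changing usage from one time slice (e.g., one Senate session) to the next.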

Related research

07/22/2019 · Learning dynamic word embeddings with drift regularisation
Word usage, meaning and connotation change throughout time. Diachronic w...

09/28/2017 · Structured Embedding Models for Grouped Data
Word embeddings are a powerful approach for analyzing language, and expo...

09/04/2019 · Empirical Study of Diachronic Word Embeddings for Scarce Data
Word meaning change can be inferred from drifts of time-varying word emb...

08/02/2016 · Exponential Family Embeddings
Word embeddings are a powerful approach for capturing semantic similarit...

03/24/2018 · Equation Embeddings
We present an unsupervised approach for discovering semantic representat...

06/13/2023 · Curatr: A Platform for Semantic Analysis and Curation of Historical Literary Texts
The increasing availability of digital collections of historical and con...

04/06/2019 · Simple dynamic word embeddings for mapping perceptions in the public sphere
Word embeddings trained on large-scale historical corpora can illuminate...
