Theoretical foundations and limits of word embeddings: what types of meaning can they capture?

07/22/2021
by Alina Arseniev-Koehler, et al.

Measuring meaning is a central problem in cultural sociology, and word embeddings may offer powerful new tools to do so. But like any tool, they build on and exert theoretical assumptions. In this paper I theorize the ways in which word embeddings model three core premises of a structural linguistic theory of meaning: that meaning is relational, coherent, and may be analyzed as a static system. In certain ways, word embedding methods are vulnerable to the same, enduring critiques of these premises. In other ways, they offer novel solutions to these critiques. More broadly, formalizing the study of meaning with word embeddings offers theoretical opportunities to clarify core concepts and debates in cultural sociology, such as the coherence of meaning. Just as network analysis specified the once-vague notion of social relations (Borgatti et al. 2009), formalizing meaning with embedding methods can push us to specify and reimagine meaning itself.
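The premise that meaning is relational can be made concrete with a toy sketch: in an embedding space, a word's meaning is captured by its position relative to other words, typically compared via cosine similarity. The three-dimensional vectors below are hand-made for illustration only; real embeddings (e.g., word2vec or GloVe) are learned from large corpora and have hundreds of dimensions.

```python
import math

# Hypothetical, hand-made "embeddings" -- purely illustrative, not learned.
vectors = {
    "doctor": [0.9, 0.1, 0.3],
    "nurse":  [0.8, 0.2, 0.4],
    "banana": [0.1, 0.9, 0.2],
}

def cosine(a, b):
    """Cosine similarity: meaning as relative position, not absolute content."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Semantically related words sit closer together than unrelated ones.
print(cosine(vectors["doctor"], vectors["nurse"]))   # higher
print(cosine(vectors["doctor"], vectors["banana"]))  # lower
```

Nothing about the vector for "doctor" is meaningful in isolation; only its distances to other vectors carry interpretable content, which is the relational premise the abstract refers to.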

