
Principal Components of the Meaning

by Neslihan Suzen, et al.

In this paper we argue that (lexical) meaning in science can be represented in a 13-dimensional Meaning Space. This space is constructed by applying principal component analysis (singular value decomposition) to the matrix of word-category relative information gains, where the categories are those used by the Web of Science and the words are taken from a reduced word set drawn from texts in the Web of Science. We show that this reduced word set plausibly represents all texts in the corpus, so that the principal component analysis has objective meaning with respect to the corpus. We argue that 13 dimensions are adequate to describe the meaning of scientific texts, and we hypothesise about the qualitative meaning of the principal components.
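The construction described above can be sketched in a few lines of NumPy: center a word-by-category information-gain matrix, take its singular value decomposition, and keep the leading components. The data here are random stand-ins, and the matrix shape and column semantics are assumptions for illustration, not the paper's actual corpus.

```python
import numpy as np

# Hypothetical toy data: rows are words, columns are Web of Science-style
# categories; entries stand in for the relative information gain of each
# word with respect to each category (values are illustrative only).
rng = np.random.default_rng(0)
gains = rng.random((50, 20))  # 50 words x 20 categories

# Center the columns, then take the SVD -- the standard route to
# principal components.
centered = gains - gains.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

# Project the words onto the first k principal components; the paper
# argues that k = 13 suffices for its (much larger) matrix.
k = 13
word_coords = U[:, :k] * s[:k]  # word coordinates in the "Meaning Space"
explained = (s[:k] ** 2).sum() / (s ** 2).sum()  # variance retained
```

Each row of `word_coords` places one word in the reduced Meaning Space, and `explained` gives the fraction of variance the retained dimensions capture, which is one way to judge whether 13 dimensions are adequate.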

