The physical structure of grammatical correlations: equivalences, formalizations and consequences

08/04/2017
by Angel J. Gallego, et al.

In this paper we consider some well-known facts in syntax from a physics perspective, which allows us to establish some remarkable equivalences. Specifically, we observe that the operation MERGE, put forward by N. Chomsky in 1995, can be interpreted as a physical information coarse-graining. Thus, MERGE in linguistics entails information renormalization in physics across different time scales. We make this point mathematically formal in terms of language models, i.e., probability distributions over word sequences, widely used in natural language processing as well as other domains. In this setting, MERGE corresponds to a 3-index probability tensor implementing a coarse-graining, akin to a probabilistic context-free grammar. The probability vectors of meaningful sentences are naturally given by tensor networks (TN) that are mostly loop-free, such as Tree Tensor Networks and Matrix Product States. These structures have short-ranged correlations in the syntactic distance by construction and, because of the peculiarities of human language, they are extremely efficient to manipulate computationally. We also propose how to obtain such language models from probability distributions of certain TN quantum states, which we show to be efficiently preparable on a quantum computer. Moreover, using tools from quantum information and entanglement theory, we use these quantum states to prove classical lower bounds on the perplexity of the probability distribution for a set of words in a sentence. Implications of these results are discussed for theoretical and computational linguistics, artificial intelligence, programming languages, RNA and protein sequencing, quantum many-body systems, and beyond.
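As a purely illustrative sketch of the abstract's central construction (not the paper's exact formalism), the toy NumPy snippet below treats MERGE as a 3-index probability tensor that coarse-grains the joint distribution of two word indices into a single parent index, much like a rule A → B C in a probabilistic context-free grammar, and then contracts a minimal tree to obtain a sentence probability. The vocabulary size, bond dimension, and randomly initialized tensors are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

V = 4  # toy vocabulary size (assumption for illustration)
D = 3  # dimension of the coarse-grained parent index (assumption)

# MERGE-like 3-index tensor M[a, w1, w2]: for each parent symbol a it holds
# a probability distribution over pairs of children (w1, w2), akin to the
# rule probabilities of a probabilistic context-free grammar.
M = rng.random((D, V, V))
M /= M.sum(axis=(1, 2), keepdims=True)  # normalize each parent's distribution

# One-hot probability vectors for a toy two-word "sentence".
w1 = np.zeros(V); w1[1] = 1.0
w2 = np.zeros(V); w2[3] = 1.0

# Contracting MERGE with the two word vectors coarse-grains the pair into
# a single vector over the parent index a (information coarse-graining).
parent = np.einsum('aij,i,j->a', M, w1, w2)

# A normalized root vector closes the (one-node) tree tensor network and
# yields a scalar sentence probability.
root = rng.random(D); root /= root.sum()
p_sentence = float(root @ parent)
```

Stacking several such tensors into a tree reproduces the loop-free (Tree Tensor Network / Matrix Product State) structures the abstract refers to, with the bond dimension D controlling how much information survives each coarse-graining step.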


