
Criticality in Formal Languages and Statistical Physics

by Henry W. Lin, et al.

We show that the mutual information between two symbols, as a function of the number of symbols between the two, decays exponentially in any probabilistic regular grammar, but can decay like a power law for a context-free grammar. This result about formal languages is closely related to a well-known result in classical statistical mechanics that there are no phase transitions in dimensions fewer than two. It is also related to the emergence of power-law correlations in turbulence and cosmological inflation through recursive generative processes. We elucidate these physics connections and comment on potential applications of our results to machine learning tasks like training artificial recurrent neural networks. Along the way, we introduce a useful quantity which we dub the rational mutual information and discuss generalizations of our claims involving more complicated Bayesian networks.
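The exponential-decay claim for regular grammars can be checked directly on a toy example, since a probabilistic regular grammar is generated by a Markov process over the states of its automaton. The sketch below is ours, not from the paper, and the two-state transition matrix is an arbitrary illustration: it computes the exact mutual information between two symbols separated by d steps and shows it shrinking exponentially in d (roughly like the second eigenvalue of the transition matrix raised to the power 2d).

```python
import numpy as np

def mutual_information(joint):
    """Mutual information (in nats) of a 2D joint probability table."""
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    indep = np.outer(px, py)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / indep[nz])))

# Transition matrix of a 2-state Markov chain, standing in for a
# probabilistic regular grammar (the values are an arbitrary example).
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])  # T[i, j] = P(next symbol = j | current = i)

# Stationary distribution: left eigenvector of T with eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# Exact MI between the symbol at position 0 and the symbol d steps later:
# P(X_0 = i, X_d = j) = pi_i * (T^d)_{ij}.
mis = []
for d in [1, 2, 4, 8]:
    joint = pi[:, None] * np.linalg.matrix_power(T, d)
    mis.append(mutual_information(joint))

# mis decreases geometrically with d (second eigenvalue of T is 0.7),
# i.e. the two-point mutual information decays exponentially.
```

A context-free grammar, by contrast, builds sequences by recursive expansion; correlations then propagate through a tree rather than a chain, which is what allows power-law rather than exponential decay.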



