Natural Language Syntax Complies with the Free-Energy Principle

10/27/2022
by Elliot Murphy et al.

Natural language syntax yields an unbounded array of hierarchically structured expressions. We claim that these are used in the service of active inference in accord with the free-energy principle (FEP). While conceptual advances alongside modelling and simulation work have attempted to connect speech segmentation and linguistic communication with the FEP, we extend this program to the underlying computations responsible for generating syntactic objects. We argue that recently proposed principles of economy in language design - such as "minimal search" criteria from theoretical syntax - adhere to the FEP. This affords a greater degree of explanatory power to the FEP - with respect to higher language functions - and offers linguistics a grounding in first principles with respect to computability. We show how both tree-geometric depth and a Kolmogorov complexity estimate (recruiting a Lempel-Ziv compression algorithm) can be used to accurately predict legal operations on syntactic workspaces, directly in line with formulations of variational free energy minimization. This is used to motivate a general principle of language design that we term Turing-Chomsky Compression (TCC). We use TCC to align concerns of linguists with the normative account of self-organization furnished by the FEP, by marshalling evidence from theoretical linguistics and psycholinguistics to ground core principles of efficient syntactic computation within active inference.
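The abstract's two complexity measures can be sketched concretely. The snippet below is an illustrative toy, not the authors' actual estimator: it encodes syntactic objects as bracketed strings (an assumption of mine), takes maximum bracket-nesting as tree-geometric depth, and uses the length of a DEFLATE (LZ77-based) compression as a rough Kolmogorov complexity proxy, in the spirit of the Lempel-Ziv estimate mentioned above. The function names and example trees are hypothetical.

```python
import zlib

def lz_complexity(s: str) -> int:
    # Length of the DEFLATE-compressed string (an LZ77-based scheme),
    # used here as a crude upper-bound proxy for Kolmogorov complexity.
    return len(zlib.compress(s.encode("utf-8"), 9))

def tree_depth(tree: str) -> int:
    # Maximum bracket-nesting depth of a bracketed syntactic object.
    depth = max_depth = 0
    for ch in tree:
        if ch == "[":
            depth += 1
            max_depth = max(max_depth, depth)
        elif ch == "]":
            depth -= 1
    return max_depth

# Two candidate structures over the same terminals: a balanced parse
# versus a more deeply nested one. Under an economy principle like
# minimal search, the shallower candidate would be preferred.
balanced = "[[the dog] [bit [the man]]]"
nested = "[the [dog [bit [the man]]]]"

for label, tree in [("balanced", balanced), ("nested", nested)]:
    print(label, tree_depth(tree), lz_complexity(tree))
```

On this encoding, `tree_depth` ranks the balanced candidate (depth 3) below the nested one (depth 4); real comparisons of workspace operations would of course need the paper's full formulation rather than raw string compression.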
