One Model for the Learning of Language

11/16/2017
by Yuan Yang, et al.

A major target of linguistics and cognitive science has been to understand what class of learning systems can acquire the key structures of natural language. Until recently, the computational requirements of language have been used to argue that learning is impossible without a highly constrained hypothesis space. Here, we describe a learning system that is maximally unconstrained, operating over the space of all computations, and is able to acquire several of the key structures present in natural language from positive evidence alone. The model successfully acquires regular (e.g. (ab)^n), context-free (e.g. a^n b^n, x x^R), and context-sensitive (e.g. a^n b^n c^n, a^n b^m c^n d^m, xx) formal languages. Our approach develops the concept of factorized programs in Bayesian program induction to help manage the complexity of representation. We show that, over the course of learning, the model predicts several phenomena empirically observed in human grammar acquisition experiments.
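As a rough illustration of how Bayesian scoring can favor a tight grammar over a permissive one from positive evidence alone, the sketch below scores toy hypotheses with a complexity prior and a size-principle likelihood. This is a minimal Python example for intuition only: the hypotheses, priors, bounded string universe, and scoring function are assumptions made here, not the paper's factorized-program model.

```python
# Illustrative sketch (not the paper's implementation): scoring candidate
# grammars from positive examples only, using a complexity prior and a
# size-principle likelihood. All hypotheses and numbers below are toy
# assumptions chosen for this example.

import math
from itertools import product

ALPHABET = "ab"
MAX_LEN = 8  # bound the universe of strings for this toy example

def strings_up_to(max_len):
    """Enumerate all strings over ALPHABET up to length max_len (including '')."""
    yield ""
    for n in range(1, max_len + 1):
        for tup in product(ALPHABET, repeat=n):
            yield "".join(tup)

# Candidate "grammars": each is a membership predicate plus a description
# length (in nats) standing in for a complexity prior.
HYPOTHESES = {
    "(ab)^n": (lambda s: len(s) % 2 == 0 and s == "ab" * (len(s) // 2), 3.0),
    "a^n b^n": (lambda s: len(s) % 2 == 0
                and s == "a" * (len(s) // 2) + "b" * (len(s) // 2), 4.0),
    "all strings": (lambda s: True, 1.0),
}

def log_posterior(name, data):
    """log P(h) + log P(data | h): each positive example is assumed drawn
    uniformly from the hypothesis's extension (restricted to MAX_LEN)."""
    accepts, description_length = HYPOTHESES[name]
    extension = [s for s in strings_up_to(MAX_LEN) if accepts(s)]
    if any(x not in extension for x in data):
        return float("-inf")  # hypothesis cannot generate the observed data
    log_prior = -description_length
    log_likelihood = -len(data) * math.log(len(extension))
    return log_prior + log_likelihood

if __name__ == "__main__":
    data = ["ab", "abab", "ababab"]  # positive evidence only
    for name in HYPOTHESES:
        print(f"{name:12s} log posterior = {log_posterior(name, data):.2f}")
```

Running this prints the highest score for (ab)^n: the permissive "all strings" hypothesis pays a steep likelihood cost because its extension is large, which is the standard size-principle intuition for learning from positive evidence without negative examples.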

research 03/24/2021
VLGrammar: Grounded Grammar Induction of Vision and Language
Cognitive grammar suggests that the acquisition of language grammar is g...

research 09/23/2014
A Concept Learning Approach to Multisensory Object Perception
This paper presents a computational model of concept learning using Baye...

research 06/10/2019
Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations
Deep learning sequence models have led to a marked increase in performan...

research 12/09/2020
Towards Coinductive Models for Natural Language Understanding. Bringing together Deep Learning and Deep Semantics
This article contains a proposal to add coinduction to the computational...

research 09/23/2022
A Neural Model for Regular Grammar Induction
Grammatical inference is a classical problem in computational learning t...

research 11/05/2018
Superregular grammars do not provide additional explanatory power but allow for a compact analysis of animal song
A pervasive belief with regard to the differences between human language...

research 03/24/2017
Interactive Natural Language Acquisition in a Multi-modal Recurrent Neural Architecture
The human brain is one of the most complex dynamic systems that enables ...
