Modelling Compositionality and Structure Dependence in Natural Language

11/22/2020
by Karthikeya Ramesh Kaushik, et al.

Human beings possess the most sophisticated computational machinery in the known universe. We can understand language of rich descriptive power and communicate with astonishing clarity in a shared environment. Two of the many properties that make natural language so interesting, Compositionality and Structure Dependence, are well documented and offer a vast space of modelling questions. The first step towards answering these questions is to ground verbal theory in formal terms. Drawing on linguistics and set theory, the first half of this thesis presents a formalisation of these ideas. We see that cognitive systems which process language must satisfy certain functional constraints, namely time-based, incremental operations over a structurally defined domain. The observations that result from analysing this formal setup are then examined in a modelling exercise. Building on advances in word-embedding techniques, a model of relational learning is simulated on a custom dataset to demonstrate how a time-based role-filler binding mechanism satisfies some of the constraints described in the first section. The model's ability to map structure, along with its symbolic-connectionist architecture, makes for a cognitively plausible implementation. Together, the formalisation and the simulation are an attempt to recognise the constraints imposed by linguistic theory and to explore how a cognitive model of relational learning can realise them.
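As a rough sketch of what a time-based role-filler binding mechanism can look like, the Python example below binds each incoming word to a positional role vector and superposes the bindings into a single memory, step by step. The tensor-product binding operation, the random toy embeddings, and the helper names (bind, encode, unbind) are illustrative assumptions for exposition, not a reconstruction of the thesis's actual model or dataset.

```python
import numpy as np

# A minimal sketch of time-based role-filler binding, assuming tensor-product
# (outer-product) binding and toy random "word embeddings". All names and
# dimensions here are illustrative, not taken from the thesis.

rng = np.random.default_rng(0)
DIM = 50  # toy embedding dimensionality

# Hypothetical filler embeddings (stand-ins for learned word vectors).
fillers = {w: rng.standard_normal(DIM) for w in ("dog", "chases", "cat")}

# Role vectors indexed by time step, so binding happens incrementally:
# each word is bound to the role of the position at which it arrives.
roles = {t: rng.standard_normal(DIM) for t in range(3)}

def bind(role, filler):
    """Bind one role to one filler via the outer (tensor) product."""
    return np.outer(role, filler)

def encode(sequence):
    """Superpose the role-filler bindings of a word sequence, one step at a time."""
    memory = np.zeros((DIM, DIM))
    for t, word in enumerate(sequence):
        memory += bind(roles[t], fillers[word])
    return memory

def unbind(memory, role):
    """Approximately recover the filler bound to a role (exact if roles are orthonormal)."""
    return memory.T @ role

memory = encode(["dog", "chases", "cat"])

# Query: which word filled the role at time step 2?
recovered = unbind(memory, roles[2])
best = max(fillers, key=lambda w: float(np.dot(recovered, fillers[w])))
print(best)  # with high probability "cat", since random roles are nearly orthogonal
```

Because roles are indexed by time step, the representation is built incrementally as words arrive, which is the kind of functional constraint the abstract points to; recovery of a filler from its role is only approximate when the random role vectors are not exactly orthogonal.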

