Ecological Semantics: Programming Environments for Situated Language Understanding

03/10/2020
by Ronen Tamari, et al.

Large-scale natural language understanding (NLU) systems have made impressive progress: they can be applied flexibly across a variety of tasks, and employ minimal structural assumptions. However, extensive empirical research has shown this to be a double-edged sword, coming at the cost of shallow understanding: inferior generalization, grounding, and explainability. Grounded language learning approaches offer the promise of deeper understanding by situating learning in richer, more structured training environments, but are limited in scale to relatively narrow, predefined domains. How might we enjoy the best of both worlds: grounded, general NLU? Following extensive work in contemporary cognitive science, we propose treating environments as “first-class citizens” in semantic representations, worthy of research and development in their own right. Importantly, models should also be partners in the creation and configuration of environments, rather than just actors within them, as in existing approaches. To do so, we argue that models must begin to understand and program in the language of affordances (which define possible actions in a given situation), both for online, situated discourse comprehension and for large-scale, offline common-sense knowledge mining. To this end, we propose an environment-oriented ecological semantics, outlining theoretical and practical approaches towards implementation. We further provide concrete demonstrations building on interactive fiction programming languages.
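The abstract's core proposal, programming environments in the language of affordances, can be made concrete with a small sketch. The Python snippet below is a toy illustration only, not the paper's interactive-fiction demonstrations; the Entity, Affordance, and build_jar names are assumptions introduced here. It shows how an object's affordances (possible actions) follow from its configurable state, and how acting, or comprehending an utterance that reconfigures that state, changes which actions are afforded.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Affordance:
    """A named action, available only when its precondition holds in the current situation."""
    name: str
    precondition: Callable[["Entity"], bool]
    effect: Callable[["Entity"], str]


@dataclass
class Entity:
    """A piece of the environment: mutable state plus the affordances it exposes."""
    name: str
    state: Dict[str, bool] = field(default_factory=dict)
    affordances: List[Affordance] = field(default_factory=list)

    def available_actions(self) -> List[str]:
        # Affordances define which actions the current situation supports.
        return [a.name for a in self.affordances if a.precondition(self)]

    def act(self, action: str) -> str:
        for a in self.affordances:
            if a.name == action and a.precondition(self):
                return a.effect(self)
        return f"'{action}' is not afforded by the {self.name} right now"


def open_jar(jar: Entity) -> str:
    # Acting changes the situation, and therefore the affordances it offers.
    jar.state["sealed"] = False
    return "the jar is now open"


def build_jar() -> Entity:
    # "Programming the environment": comprehending an utterance such as
    # "the jar is sealed" amounts to configuring this state.
    jar = Entity("jar", state={"sealed": True})
    jar.affordances = [
        Affordance("open", precondition=lambda e: e.state.get("sealed", False),
                   effect=open_jar),
        Affordance("pour", precondition=lambda e: not e.state.get("sealed", True),
                   effect=lambda e: "the contents pour out"),
    ]
    return jar


if __name__ == "__main__":
    jar = build_jar()
    print(jar.available_actions())  # ['open']  ('pour' is not yet afforded)
    print(jar.act("pour"))          # rejected: pouring is not afforded while sealed
    print(jar.act("open"))          # changes the situation
    print(jar.available_actions())  # ['pour']
```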

Related research

- Collecting Interactive Multi-modal Datasets for Grounded Language Understanding (11/12/2022): Human intelligence can remarkably adapt quickly to new tasks and environ...
- Interactive Grounded Language Understanding in a Collaborative Environment: IGLU 2021 (05/05/2022): Human intelligence has the remarkable ability to quickly adapt to new ta...
- Language (Re)modelling: Towards Embodied Language Understanding (05/01/2020): While natural language understanding (NLU) is advancing rapidly, today's...
- Towards Coinductive Models for Natural Language Understanding. Bringing together Deep Learning and Deep Semantics (12/09/2020): This article contains a proposal to add coinduction to the computational...
- Understanding Programs by Exploiting (Fuzzing) Test Cases (05/23/2023): Semantic understanding of programs has attracted great attention in the ...
- A recipe for annotating grounded clarifications (04/18/2021): In order to interpret the communicative intents of an utterance, it need...
- ContraGen: Effective Contrastive Learning For Causal Language Model (10/03/2022): Despite exciting progress in large-scale language generation, the expres...
