Developing Embodied Multisensory Dialogue Agents

11/29/2011
by Michał B. Paradowski, et al.

A few decades of work in the AI field have focused on developing a new generation of systems which can acquire knowledge via interaction with the world. Yet, until very recently, most such attempts were underpinned by research which predominantly regarded linguistic phenomena as separate from the brain and body. This could lead one to believe that, to emulate linguistic behaviour, it suffices to develop 'software' operating on abstract representations that will work on any computational machine. This picture is inaccurate for several reasons, which are elucidated in this paper and extend beyond sensorimotor and semantic resonance. Beginning with a review of research, I list several heterogeneous arguments against disembodied language, in an attempt to draw conclusions for developing embodied multisensory agents which communicate verbally and non-verbally with their environment. Without taking into account both the architecture of the human brain and embodiment, it is unrealistic to accurately replicate the processes which take place during language acquisition, comprehension, and production, or during non-linguistic actions. While robots are far from isomorphic with humans, they could benefit from strengthened associative connections in the optimization of their processes, in their reactivity and sensitivity to environmental stimuli, and in situated human-machine interaction. The concept of multisensory integration should be extended to cover linguistic input and the complementary information combined from temporally coincident sensory impressions.
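
To make the closing point concrete, below is a minimal, purely hypothetical sketch (not taken from the paper) of one way linguistic input could be treated as just another modality in multisensory integration: percepts, whether sensory or verbal, that arrive within the same temporal window are bound into a single associative event. The class names, the fixed 0.5-second window, and the string payloads are illustrative assumptions, not a proposed implementation.

    # Hypothetical illustration: binding temporally coincident percepts,
    # including linguistic input, into single associative events.
    from dataclasses import dataclass

    @dataclass
    class Percept:
        modality: str     # e.g. "vision", "audio", "touch", "language"
        timestamp: float  # seconds since stream start
        payload: str      # stand-in for a feature vector or token

    def fuse_coincident(percepts, window=0.5):
        """Group percepts whose timestamps fall within one temporal window,
        so linguistic input is bound to the impressions it co-occurs with."""
        events, current = [], []
        for p in sorted(percepts, key=lambda p: p.timestamp):
            if current and p.timestamp - current[0].timestamp > window:
                events.append(current)
                current = []
            current.append(p)
        if current:
            events.append(current)
        return events

    if __name__ == "__main__":
        stream = [
            Percept("vision",   0.10, "red cup on table"),
            Percept("language", 0.25, "'hand me the cup'"),
            Percept("touch",    0.30, "gripper contact"),
            Percept("audio",    1.40, "door closing"),
        ]
        for event in fuse_coincident(stream):
            print([(p.modality, p.payload) for p in event])

Running the sketch groups the utterance with the visual and tactile impressions it coincides with, while the later sound forms a separate event, which is the kind of cross-modal binding the abstract argues linguistic input should participate in.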

