Chat with the Environment: Interactive Multimodal Perception using Large Language Models

03/14/2023
by Xufeng Zhao, et al.

Programming robot behaviour in a complex world faces challenges on multiple levels, from dexterous low-level skills to high-level planning and reasoning. Recent pre-trained Large Language Models (LLMs) have shown remarkable reasoning ability in zero-shot robotic planning. However, it remains challenging to ground LLMs in multimodal sensory input and continuous action output, while enabling a robot to interact with its environment and acquire novel information as its policies unfold. We develop a robot interaction scenario with a partially observable state, which requires the robot to decide on a range of epistemic actions in order to sample sensory information across multiple modalities before it can execute the task correctly. We therefore propose an interactive perception framework with an LLM as its backbone, whose ability is exploited to instruct epistemic actions, to reason over the resulting multimodal sensations (vision, sound, haptics, proprioception), and to plan the entire task execution based on the interactively acquired information. Our study demonstrates that LLMs can provide high-level planning and reasoning skills and control interactive robot behaviour in a multimodal environment, while multimodal modules, given the context of the environmental state, help ground the LLMs and extend their processing ability.
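The abstract describes a closed-loop framework in which an LLM proposes epistemic actions and reasons over verbalized multimodal feedback before committing to a task plan. The sketch below is a minimal illustration of such a loop, not the authors' implementation; the helper names (llm, robot.execute, perception.describe) and the "plan:" convention are assumptions introduced for the example.

```python
# Minimal sketch of an LLM-driven interactive perception loop.
# Hypothetical helpers: llm(prompt) -> str, robot.execute(action) -> raw data,
# perception.describe(raw) -> text. Multimodal modules (vision, sound, haptics,
# proprioception) turn sensor data into text the LLM can reason over.

from dataclasses import dataclass, field

@dataclass
class Episode:
    goal: str                                     # e.g. "put the metal block in the red bin"
    history: list = field(default_factory=list)   # interleaved (action, observation) pairs

def interactive_perception(goal, llm, robot, perception, max_steps=10):
    """Close the loop between an LLM planner and a robot with multimodal sensors."""
    episode = Episode(goal=goal)
    for _ in range(max_steps):
        # Ask the LLM for the next step given the goal and what has been sensed so far.
        prompt = f"Goal: {episode.goal}\nHistory: {episode.history}\nNext action?"
        action = llm(prompt)                      # e.g. "shake(object_1)" or "plan: ..."
        if action.startswith("plan:"):            # enough information gathered
            return action                         # final task plan
        raw_obs = robot.execute(action)           # epistemic action (look, shake, touch, ...)
        text_obs = perception.describe(raw_obs)   # verbalize multimodal feedback
        episode.history.append((action, text_obs))
    return None  # no plan found within the step budget
```

The key design choice this sketch highlights is that perception results are fed back to the LLM as text, so the same model can both select the next information-gathering action and produce the final plan.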

