DialFRED: Dialogue-Enabled Agents for Embodied Instruction Following

02/27/2022
by   Xiaofeng Gao, et al.

Language-guided Embodied AI benchmarks that require an agent to navigate an environment and manipulate objects typically allow only one-way communication: the human user gives a natural language command, and the agent passively follows it. We present DialFRED, a dialogue-enabled embodied instruction following benchmark built on the ALFRED benchmark. DialFRED allows an agent to actively ask questions of the human user and use the additional information in the user's response to better complete its task. We release a human-annotated dataset with 53K task-relevant questions and answers, along with an oracle to answer questions. To solve DialFRED, we propose a questioner-performer framework in which the questioner is pre-trained on the human-annotated data and fine-tuned with reinforcement learning. We make DialFRED publicly available and encourage researchers to propose and evaluate their solutions to building dialogue-enabled embodied agents.
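The questioner-performer framework described above can be sketched as a simple episode loop: the questioner decides whether to ask a clarifying question, an oracle supplies the answer, and the performer conditions its actions on that answer. The sketch below is a minimal illustration under stated assumptions — all class names, the confidence threshold, and the toy oracle are hypothetical and do not reflect the actual DialFRED implementation, in which the questioner is a learned model pre-trained on human QA data and fine-tuned with RL.

```python
"""Minimal sketch of a questioner-performer loop for dialogue-enabled
instruction following. All names here are illustrative assumptions,
not the DialFRED implementation."""


def oracle_answer(question):
    # Toy stand-in for the benchmark's oracle: canned answers keyed
    # on the question text.
    answers = {
        "where is the mug?": "the mug is on the kitchen counter",
    }
    return answers.get(question)


class Questioner:
    """Decides whether to ask the user a clarifying question.

    In DialFRED this component is a learned model (pre-trained on
    human-annotated QA pairs, fine-tuned with reinforcement learning);
    here it is a fixed confidence threshold purely for illustration.
    """

    def __init__(self, threshold=0.5):
        self.threshold = threshold

    def maybe_ask(self, target, confidence):
        # Ask only when the agent is uncertain about the target object.
        if confidence < self.threshold:
            return f"where is the {target}?"
        return None  # confident enough to act without asking


class Performer:
    """Executes the instruction, optionally grounding it in the answer."""

    def execute(self, instruction, extra_info=None):
        plan = []
        if extra_info:
            plan.append(f"ground('{extra_info}')")
        plan.append(f"act('{instruction}')")
        return plan


def run_episode(instruction, target, confidence):
    questioner = Questioner()
    question = questioner.maybe_ask(target, confidence)
    answer = oracle_answer(question) if question else None
    return Performer().execute(instruction, answer)


# An uncertain agent asks and uses the answer; a confident one acts directly.
uncertain = run_episode("pick up the mug", "mug", confidence=0.2)
confident = run_episode("pick up the mug", "mug", confidence=0.9)
print(uncertain)
print(confident)
```

In the real benchmark, the RL fine-tuning rewards the questioner for asking questions only when the answers actually help the performer finish the task, rather than using a hand-set threshold as above.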


