Natural Language Interactions in Autonomous Vehicles: Intent Detection and Slot Filling from Passenger Utterances

04/23/2019
by Eda Okur, et al.

Understanding passenger intents and extracting relevant slots are important building blocks towards developing contextual dialogue systems for natural interactions in autonomous vehicles (AV). In this work, we explored AMIE (Automated-vehicle Multi-modal In-cabin Experience), the in-cabin agent responsible for handling certain passenger-vehicle interactions. When passengers give instructions to AMIE, the agent should parse such commands properly and trigger the appropriate functionality of the AV system. In our current explorations, we focused on AMIE scenarios covering use cases such as setting or changing the destination and route, updating driving behavior or speed, and finishing the trip, among other natural commands. We collected a multi-modal in-cabin dataset with multi-turn dialogues between passengers and AMIE using a Wizard-of-Oz scheme via a realistic scavenger-hunt game activity. After exploring various recent Recurrent Neural Network (RNN) based techniques, we introduced our own hierarchical joint models to recognize passenger intents along with the relevant slots associated with the action to be performed in AV scenarios. Our models outperformed certain competitive baselines, achieving overall F1 scores of 0.91 for utterance-level intent detection and 0.96 for slot filling. In addition, we conducted initial speech-to-text explorations by comparing intent/slot models trained and tested on human transcriptions versus noisy Automatic Speech Recognition (ASR) outputs. Finally, we compared results for single-passenger rides versus rides with multiple passengers.
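To make the two tasks concrete, the sketch below illustrates the output format of utterance-level intent detection plus word-level slot filling with BIO-style tags. The intent labels (`SetDestination`, `Stop`) and the `dest` slot type are hypothetical examples chosen for illustration, and the keyword matcher stands in for the paper's hierarchical joint RNN models, which it does not reproduce.

```python
# Toy illustration of the intent detection + slot filling output format.
# Intent labels and slot tags here are hypothetical examples, not the
# paper's actual annotation scheme; a real system would use a learned
# joint model rather than this keyword matcher.

INTENT_KEYWORDS = {
    "SetDestination": ["take", "go", "drive"],
    "Stop": ["stop", "pull"],
}

DESTINATION_MARKERS = {"to"}  # assumed trigger word preceding a destination

def parse_utterance(utterance):
    """Return (intent, [(token, tag), ...]) for a passenger command.

    Slot tags follow the BIO scheme: B-dest marks the first token of a
    destination phrase, I-dest its continuation, O everything else.
    """
    tokens = utterance.lower().split()

    # Utterance-level intent: first label whose keyword appears.
    intent = "Other"
    for label, keywords in INTENT_KEYWORDS.items():
        if any(k in tokens for k in keywords):
            intent = label
            break

    # Word-level slots: tag everything after a destination marker.
    tags = ["O"] * len(tokens)
    for i, tok in enumerate(tokens):
        if tok in DESTINATION_MARKERS and i + 1 < len(tokens):
            tags[i + 1] = "B-dest"
            for j in range(i + 2, len(tokens)):
                tags[j] = "I-dest"
            break
    return intent, list(zip(tokens, tags))
```

For example, `parse_utterance("Take me to the airport")` yields the intent `SetDestination` with `the` tagged `B-dest` and `airport` tagged `I-dest`, mirroring the utterance-level and token-level predictions a joint model produces simultaneously.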


