Adaptive User-Centered Multimodal Interaction towards Reliable and Trusted Automotive Interfaces

11/07/2022
by Amr Gomaa, et al.

With the recently increased capabilities of modern vehicles, novel interaction approaches have emerged that go beyond traditional touch-based and voice-command interfaces. In particular, hand gestures, head pose, eye gaze, and speech have been extensively investigated in automotive applications for object selection and referencing. Despite these significant advances, existing approaches mostly employ a one-model-fits-all strategy that is unsuitable for varying user behavior and individual differences. Moreover, current referencing approaches either consider these modalities separately or focus on a stationary setting, whereas the situation in a moving vehicle is highly dynamic and subject to safety-critical constraints. In this paper, I propose a research plan for a user-centered adaptive multimodal fusion approach for referencing external objects from a moving vehicle. The proposed plan aims to provide an open-source framework for user-centered adaptation and personalization using user observations and heuristics, multimodal fusion, clustering, transfer learning for model adaptation, and continuous learning, moving towards trusted human-centered artificial intelligence.
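To make the multimodal fusion component more concrete, the sketch below shows one plausible late-fusion scheme, not the paper's actual implementation: gaze and pointing rays are each scored against candidate outside objects by angular proximity, and the resulting distributions are combined with a per-user weight that could be adapted over time as part of personalization. All function names, parameters, and values here are illustrative assumptions.

```python
import numpy as np

def ray_object_scores(origin, direction, objects, sharpness=8.0):
    """Score candidate objects by how well a gaze/pointing ray aligns with them.

    origin, direction: 3D ray in the vehicle frame (direction need not be normalized).
    objects: (N, 3) array of candidate outside-object positions in the same frame.
    Returns a softmax distribution over the N candidates.
    """
    direction = direction / np.linalg.norm(direction)
    to_objects = objects - origin
    to_objects = to_objects / np.linalg.norm(to_objects, axis=1, keepdims=True)
    cosines = to_objects @ direction          # angular alignment of each object with the ray
    logits = sharpness * cosines              # larger sharpness -> more peaked distribution
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

def fuse(gaze_probs, pointing_probs, w_gaze=0.5):
    """Weighted late fusion; w_gaze is a per-user parameter that could be adapted online."""
    fused = w_gaze * gaze_probs + (1.0 - w_gaze) * pointing_probs
    return fused / fused.sum()

if __name__ == "__main__":
    # Hypothetical candidate objects outside the vehicle (x forward, y left, z up).
    objects = np.array([[10.0,  2.0, 1.5],
                        [12.0, -3.0, 1.0],
                        [20.0,  0.5, 2.0]])
    gaze = ray_object_scores(np.zeros(3), np.array([1.0, 0.2, 0.1]), objects)
    point = ray_object_scores(np.array([0.0, -0.3, 0.0]), np.array([1.0, 0.1, 0.15]), objects)
    fused = fuse(gaze, point, w_gaze=0.6)     # weight gaze slightly higher for this user
    print("fused distribution:", fused, "selected object index:", int(np.argmax(fused)))
```

In an adaptive setting, the fusion weight (and the sharpness of each modality) would be initialized from cluster-level defaults and then refined per user through continuous learning, rather than fixed as in this sketch.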

