Matching Representations of Explainable Artificial Intelligence and Eye Gaze for Human-Machine Interaction

01/30/2021
by Tiffany Hwu, et al.

Rapid non-verbal communication of task-based stimuli is a challenge in human-machine teaming, particularly in closed-loop interactions such as driving. To achieve this, we must understand the representations of information for both the human and machine, and determine a basis for bridging these representations. Techniques of explainable artificial intelligence (XAI) such as layer-wise relevance propagation (LRP) provide visual heatmap explanations for high-dimensional machine learning techniques such as deep neural networks. On the side of human cognition, visual attention is driven by the bottom-up and top-down processing of sensory input related to the current task. Since both XAI and human cognition should focus on task-related stimuli, there may be overlaps between their representations of visual attention, potentially providing a means of nonverbal communication between the human and machine. In this work, we examine the correlations between LRP heatmap explanations of a neural network trained to predict driving behavior and eye gaze heatmaps of human drivers. The analysis is used to determine the feasibility of using such a technique for enhancing driving performance. We find that LRP heatmaps show increasing levels of similarity with eye gaze according to the task specificity of the neural network. We then propose how these findings may assist humans by visually directing attention towards relevant areas. To our knowledge, this is the first analysis of LRP and eye gaze for driving tasks.
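To illustrate the kind of comparison the abstract describes, the sketch below computes a Pearson correlation between an LRP relevance heatmap and an eye gaze heatmap defined over the same image grid. This is a hypothetical helper for illustration only; the paper does not specify its exact similarity metric, and the function name and normalization choices here are assumptions.

```python
import numpy as np

def heatmap_similarity(lrp_heatmap, gaze_heatmap):
    """Pearson correlation between two attention heatmaps.

    Both inputs are 2-D arrays over the same image grid; the maps
    are flattened and compared pixel-wise. Returns a value in
    [-1, 1], where 1 means the machine's relevance map and the
    human gaze map highlight the same regions with the same
    relative intensity. (Hypothetical helper -- not the paper's
    published metric.)
    """
    a = np.asarray(lrp_heatmap, dtype=float).ravel()
    b = np.asarray(gaze_heatmap, dtype=float).ravel()
    # Standardize each map, then correlate; the epsilon guards
    # against division by zero for a constant (uninformative) map.
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))
```

In practice the gaze heatmap would be built by accumulating fixation points and smoothing (e.g., with a Gaussian kernel) before comparison, so that both maps live at a comparable spatial scale.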

