Is my Driver Observation Model Overconfident? Input-guided Calibration Networks for Reliable and Interpretable Confidence Estimates

04/10/2022
by   Alina Roitberg, et al.

Driver observation models are rarely deployed under perfect conditions. In practice, illumination, camera placement and camera type differ from those present during training, and unforeseen behaviours may occur at any time. While observing the human behind the steering wheel enables more intuitive human-vehicle interaction and safer driving, it requires recognition algorithms which not only predict the correct driver state, but also report their prediction quality through realistic and interpretable confidence measures. Reliable uncertainty estimates are crucial for building trust, and their absence is a serious obstacle to deploying activity recognition networks in real driving systems. In this work, we examine for the first time how well the confidence values of modern driver observation models match the probability of a correct outcome, and show that raw neural-network-based approaches tend to significantly overestimate their prediction quality. To correct this misalignment between the confidence values and the actual uncertainty, we consider two strategies. First, we enhance two activity recognition models often used for driver observation with temperature scaling, an off-the-shelf method for confidence calibration in image classification. Then, we introduce Calibrated Action Recognition with Input Guidance (CARING), a novel approach leveraging an additional neural network that learns to scale the confidences depending on the video representation. Extensive experiments on the Drive&Act dataset demonstrate that both strategies drastically improve the quality of model confidences, while our CARING model outperforms both the original architectures and their temperature-scaling enhancement, yielding the best uncertainty estimates.
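To make the two calibration strategies concrete, here is a minimal sketch of temperature scaling and of an input-dependent variant in the spirit of CARING. The `temp_net` mapping from features to a per-sample temperature is a hypothetical placeholder, not the paper's actual architecture; in practice the temperature is fit on a held-out validation set by minimizing negative log-likelihood.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def temperature_scale(logits, T):
    """Classic temperature scaling: divide logits by a single scalar T.
    T > 1 softens overconfident probabilities without changing the argmax."""
    return softmax(logits / T)

def input_guided_scale(logits, features, temp_net):
    """CARING-style sketch: a small learned network (temp_net, hypothetical
    here) maps the video feature vector to a per-sample temperature,
    so the amount of softening depends on the input."""
    T = temp_net(features)          # shape (batch, 1), constrained to T > 0
    return softmax(logits / T)

# Example: an overconfident prediction softened by T = 2.
logits = np.array([[4.0, 1.0, 0.0]])
raw_conf = softmax(logits)[0, 0]
calibrated_conf = temperature_scale(logits, 2.0)[0, 0]
```

Note that dividing by a positive temperature preserves the ranking of the classes, so accuracy is unchanged; only the reported confidence is adjusted.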


research
01/02/2021

Uncertainty-sensitive Activity Recognition: a Reliability Benchmark and the CARING Models

Beyond assigning the correct class, an activity recognition model should...
research
03/02/2022

TransDARC: Transformer-based Driver Activity Recognition with Latent Space Feature Calibration

Traditional video-based human activity recognition has experienced remar...
research
06/28/2020

A Confidence-Calibrated MOBA Game Winner Predictor

In this paper, we propose a confidence-calibration method for predicting...
research
12/10/2020

Uncertainty-Aware Deep Calibrated Salient Object Detection

Existing deep neural network based salient object detection (SOD) method...
research
11/27/2018

Uncertainty aware multimodal activity recognition with Bayesian inference

Deep neural networks (DNNs) provide state-of-the-art results for a multi...
research
01/16/2019

Uncertainty-Aware Driver Trajectory Prediction at Urban Intersections

Predicting the motion of a driver's vehicle is crucial for advanced driv...
