Uncertainty Propagation in Deep Neural Networks Using Extended Kalman Filtering

09/17/2018
by Jessica S. Titensky, et al.

Extended Kalman Filtering (EKF) can be used to propagate and quantify input uncertainty through a Deep Neural Network (DNN) under mild hypotheses on the input distribution. This methodology yields results comparable to existing methods of uncertainty propagation for DNNs while lowering the computational overhead considerably. Additionally, EKF allows model error to be incorporated naturally into the output uncertainty.
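The abstract itself contains no code; the sketch below is only an illustration of the EKF-style update it describes: a Gaussian input mean and covariance are pushed through a toy two-layer network by linearizing each layer at the propagated mean, with an optional additive covariance standing in for model error. The network sizes, weights, and the model-error term Q are placeholder assumptions, not taken from the paper.

```python
# Minimal sketch (not the authors' code) of EKF-style uncertainty
# propagation through a small fully connected network, assuming a
# Gaussian input x ~ N(mu0, Sigma0). Each layer updates
#   mu_{l+1}    = f_l(mu_l)
#   Sigma_{l+1} = J_l Sigma_l J_l^T (+ Q for model error)
# where J_l is the layer Jacobian evaluated at mu_l.
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer network: 4 -> 8 (tanh) -> 3 (linear); random placeholder weights.
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
W2, b2 = rng.normal(size=(3, 8)), rng.normal(size=3)

def ekf_propagate(mu, Sigma, Q=None):
    """Propagate an input mean and covariance through the toy network."""
    # Layer 1: affine map followed by tanh; Jacobian is diag(tanh'(z1)) @ W1.
    z1 = W1 @ mu + b1
    a1 = np.tanh(z1)
    J1 = np.diag(1.0 - np.tanh(z1) ** 2) @ W1
    Sigma = J1 @ Sigma @ J1.T

    # Layer 2: linear output layer; its Jacobian is simply W2.
    mu_out = W2 @ a1 + b2
    Sigma = W2 @ Sigma @ W2.T

    # Optionally fold a model-error covariance into the output uncertainty.
    if Q is not None:
        Sigma = Sigma + Q
    return mu_out, Sigma

mu0 = rng.normal(size=4)
Sigma0 = 0.05 * np.eye(4)   # input uncertainty
Q = 1e-3 * np.eye(3)        # hypothetical model-error covariance
mu_y, Sigma_y = ekf_propagate(mu0, Sigma0, Q)
print("output mean:", mu_y)
print("output variance (diagonal):", np.diag(Sigma_y))
```

Because only the first two moments are carried layer to layer, the cost is one Jacobian-covariance product per layer rather than many sampled forward passes, which is the computational saving the abstract refers to.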

Related research

03/10/2019 - Uncertainty Propagation in Deep Neural Network Using Active Subspace
The inputs of deep neural network (DNN) from real-world data usually com...

04/06/2022 - Neural Network-augmented Kalman Filtering for Robust Online Speech Dereverberation in Noisy Reverberant Environments
In this paper, a neural network-augmented algorithm for noise-robust onl...

03/20/2023 - Uncertainty-aware deep learning for digital twin-driven monitoring: Application to fault detection in power lines
Deep neural networks (DNNs) are often coupled with physics-based models ...

03/31/2022 - AKF-SR: Adaptive Kalman Filtering-based Successor Representation
Recent studies in neuroscience suggest that Successor Representation (SR...

07/28/2020 - Neural Kalman Filtering for Speech Enhancement
Statistical signal processing based speech enhancement methods adopt exp...

11/29/2018 - Uncertainty propagation in neural networks for sparse coding
A novel method to propagate uncertainty through the soft-thresholding no...

11/16/2021 - Assessing Deep Neural Networks as Probability Estimators
Deep Neural Networks (DNNs) have performed admirably in classification t...
