Uncertainty Propagation in Deep Neural Networks Using Extended Kalman Filtering

09/17/2018
by Jessica S. Titensky, et al.

Extended Kalman Filtering (EKF) can be used to propagate and quantify input uncertainty through a Deep Neural Network (DNN) under mild assumptions on the input distribution. This methodology yields results comparable to existing methods of uncertainty propagation for DNNs while considerably lowering the computational overhead. Additionally, EKF allows model error to be naturally incorporated into the output uncertainty.
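To illustrate the kind of propagation the abstract describes, the sketch below applies an EKF-style first-order linearization to a small feedforward network: the input mean is pushed through the network, and the input covariance is transformed by the network's Jacobian at that mean, with an optional additive model-error covariance. This is a minimal, hypothetical example; the two-layer tanh network, its sizes, and the `ekf_propagate` helper are illustrative assumptions, not the paper's implementation.

```python
import jax
import jax.numpy as jnp


def dnn(x, params):
    # Hypothetical two-layer tanh network used only for illustration.
    W1, b1, W2, b2 = params
    h = jnp.tanh(W1 @ x + b1)
    return W2 @ h + b2


def ekf_propagate(mean, cov, params, model_cov=None):
    """First-order (EKF-style) propagation of an input Gaussian through the DNN.

    Output mean ~ f(mean); output covariance ~ J cov J^T (+ model_cov),
    where J is the Jacobian of the network f evaluated at the input mean.
    """
    f = lambda x: dnn(x, params)
    out_mean = f(mean)
    J = jax.jacobian(f)(mean)        # linearization of the network at the input mean
    out_cov = J @ cov @ J.T          # propagate the input covariance through the linearization
    if model_cov is not None:        # optionally fold in an additive model-error covariance
        out_cov = out_cov + model_cov
    return out_mean, out_cov


# Example usage with random parameters and a small input covariance.
key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = (jax.random.normal(k1, (8, 4)), jnp.zeros(8),
          jax.random.normal(k2, (2, 8)), jnp.zeros(2))
mu = jnp.ones(4)
Sigma = 0.1 * jnp.eye(4)
m_out, P_out = ekf_propagate(mu, Sigma, params, model_cov=0.01 * jnp.eye(2))
```

Because only one Jacobian evaluation per input is needed, this kind of first-order propagation avoids the repeated forward passes that sampling-based uncertainty estimates require, which is consistent with the reduced computational overhead the abstract mentions.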
