
Uncertainty Disentanglement with Non-stationary Heteroscedastic Gaussian Processes for Active Learning

by   Zeel B Patel, et al.

Gaussian processes are Bayesian non-parametric models used across many domains. In this work, we propose a non-stationary heteroscedastic Gaussian process model that can be trained with gradient-based techniques. We demonstrate the interpretability of the proposed model by decomposing the overall predictive uncertainty into aleatoric (irreducible) and epistemic (model) components. We illustrate the usefulness of the derived epistemic uncertainty on active learning problems, and demonstrate the efficacy of our model with ablations on multiple datasets.
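The decomposition described above can be sketched with a plain-NumPy GP regression. This is an illustrative toy example, not the paper's model: here the heteroscedastic noise profile is assumed known rather than learned with gradient-based training, and the kernel is a fixed stationary RBF. The GP posterior variance plays the role of epistemic uncertainty, the input-dependent noise variance the role of aleatoric uncertainty, and their sum the total predictive uncertainty; the epistemic part is then the natural acquisition signal for active learning.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Stationary RBF kernel between two sets of inputs."""
    sqdist = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def noise_profile(X):
    """Assumed-known heteroscedastic noise variance (hypothetical profile)."""
    return 0.05 + 0.2 * (X[:, 0] > 0)

# Toy 1-D data with input-dependent observation noise
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + rng.normal(0.0, np.sqrt(noise_profile(X)))

Xs = np.linspace(-4, 4, 100)[:, None]          # test inputs

# Standard GP posterior with a diagonal heteroscedastic noise term
K = rbf(X, X) + np.diag(noise_profile(X))
Ks = rbf(X, Xs)
Kss = rbf(Xs, Xs)

L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mu = Ks.T @ alpha                               # posterior mean
v = np.linalg.solve(L, Ks)
epistemic = np.diag(Kss - v.T @ v)              # model (reducible) variance
aleatoric = noise_profile(Xs)                   # irreducible noise variance
total = epistemic + aleatoric                   # overall predictive variance

# Active learning: query the input where epistemic uncertainty is largest
next_x = Xs[np.argmax(epistemic)]
```

Note the design choice: acquiring points by `epistemic` rather than `total` avoids repeatedly sampling regions that are merely noisy, since aleatoric uncertainty cannot be reduced by more data.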

