Learning Active Subspaces for Effective and Scalable Uncertainty Quantification in Deep Neural Networks

09/06/2023
by Sanket Jantre, et al.

Bayesian inference for neural networks, or Bayesian deep learning, has the potential to provide well-calibrated predictions with quantified uncertainty and robustness. However, the main hurdle for Bayesian deep learning is its computational complexity, which stems from the high dimensionality of the parameter space. In this work, we propose a novel scheme that addresses this limitation by constructing a low-dimensional subspace of the neural network parameters, referred to as an active subspace, by identifying the parameter directions that have the most significant influence on the output of the neural network. We demonstrate that this significantly reduced active subspace enables effective and scalable Bayesian inference via Monte Carlo (MC) sampling methods, which would otherwise be computationally intractable, or via variational inference. Empirically, our approach provides reliable predictions with robust uncertainty estimates for various regression tasks.
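To make the construction concrete, below is a minimal sketch of one standard way to build an active subspace: eigendecompose a Monte Carlo estimate of the expected outer product of output gradients with respect to the parameters, C = E[∇θf ∇θf^T], and keep the top-k eigenvectors as the subspace basis. The network, data, and subspace dimension k here are illustrative assumptions, not the paper's actual experimental setup.

```python
import torch
import torch.nn as nn
from torch.nn.utils import parameters_to_vector

torch.manual_seed(0)

# Toy regression network and inputs (purely illustrative).
net = nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 1))
X = torch.randn(64, 2)

theta_map = parameters_to_vector(net.parameters()).detach()
D = theta_map.numel()

# Monte Carlo estimate of C = E[g g^T], where g is the gradient of the
# network output with respect to the full parameter vector.
C = torch.zeros(D, D)
for x in X:
    net.zero_grad()
    out = net(x.unsqueeze(0)).squeeze()
    out.backward()
    g = torch.cat([p.grad.flatten() for p in net.parameters()])
    C += torch.outer(g, g)
C /= len(X)

# The active subspace is spanned by the top-k eigenvectors of C: the
# parameter directions along which the output varies the most.
eigvals, eigvecs = torch.linalg.eigh(C)  # eigenvalues in ascending order
k = 5                                    # assumed subspace dimension
W = eigvecs[:, -k:]                      # D x k projection basis

# Bayesian inference (MC sampling or VI) now runs over the k-dimensional
# coordinates z; full weights are recovered as theta(z) = theta_MAP + W z.
z = torch.zeros(k)
theta_z = theta_map + W @ z
print(f"full parameter dim: {D}, active subspace dim: {k}")
```

In this sketch, a prior is placed on the k-dimensional coordinates z rather than on all D weights, so samplers or variational approximations only need to explore a 5-dimensional space instead of the full 65-dimensional one.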


Related research

07/17/2019 · Subspace Inference for Bayesian Deep Learning
Bayesian inference was once a gold standard for learning with neural net...

10/29/2019 · Active Subspace of Neural Networks: Structural Analysis and Universal Attacks
Active subspace is a model reduction method widely used in the uncertain...

03/24/2022 · Multilevel Bayesian Deep Neural Networks
In this article we consider Bayesian inference associated to deep neural...

07/08/2019 · Bayesian deep learning with hierarchical prior: Predictions from limited and noisy data
Datasets in engineering applications are often limited and contaminated,...

07/09/2021 · Gaussian Process Subspace Regression for Model Reduction
Subspace-valued functions arise in a wide range of problems, including p...

06/20/2023 · Traversing Between Modes in Function Space for Fast Ensembling
Deep ensemble is a simple yet powerful way to improve the performance of...

11/09/2020 · Improving Neural Network Training in Low Dimensional Random Bases
Stochastic Gradient Descent (SGD) has proven to be remarkably effective ...
