Gaussian Process Subspace Regression for Model Reduction

07/09/2021
by Ruda Zhang, et al.

Subspace-valued functions arise in a wide range of problems, including parametric reduced-order modeling (PROM). In PROM, each parameter point can be associated with a subspace, which is used for Petrov-Galerkin projections of large system matrices. Previous efforts to approximate such functions use interpolation on manifolds, which can be inaccurate and slow. To address this, we propose a novel Bayesian nonparametric model for subspace prediction: the Gaussian Process Subspace regression (GPS) model. The method is both extrinsic and intrinsic: using multivariate Gaussian distributions on a Euclidean space, it induces a joint probability model on the Grassmann manifold, the set of fixed-dimensional subspaces. The GPS adopts a simple yet general correlation structure and a principled approach to model selection. Its predictive distribution admits an analytical form, which allows efficient subspace prediction over the parameter space. For PROM, the GPS provides a probabilistic prediction at a new parameter point that retains the accuracy of local reduced models, at a computational complexity independent of the system dimension, and is thus suitable for online computation. We give four numerical examples comparing our method to subspace interpolation, as well as to two methods that interpolate local reduced models. Overall, GPS is the most data-efficient, is more computationally efficient than subspace interpolation, and gives smooth predictions with uncertainty quantification.
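As a rough illustration of the workflow the abstract describes, the sketch below shows one simple way to turn kernel-weighted regression on local basis matrices into a subspace prediction: GP predictive weights over the training parameters are applied to stored orthonormal bases, and the result is mapped back to the Grassmann manifold by taking leading left singular vectors. This is a minimal sketch of the general idea only, not the authors' GPS predictive distribution; the kernel choice, the function names (`rbf_kernel`, `predict_subspace`), and all numerical values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=0.5):
    """Squared-exponential kernel between parameter points (rows of X1, X2)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def predict_subspace(X_train, bases, x_new, lengthscale=0.5, jitter=1e-8):
    """Predict an orthonormal basis at x_new from local bases at X_train.

    X_train: (m, d) array of training parameter points.
    bases:   list of m orthonormal (n, k) matrices, one per training point.
    Returns an (n, k) orthonormal matrix spanning the predicted subspace.
    """
    K = rbf_kernel(X_train, X_train, lengthscale) + jitter * np.eye(len(bases))
    k_star = rbf_kernel(x_new[None, :], X_train, lengthscale).ravel()
    w = np.linalg.solve(K, k_star)                 # GP predictive weights
    M = sum(wi * Ui for wi, Ui in zip(w, bases))   # kernel-weighted combination
    # Map back to the Grassmannian: leading left singular vectors of M.
    U, _, _ = np.linalg.svd(M, full_matrices=False)
    return U[:, :bases[0].shape[1]]

# Illustrative usage with random data.
rng = np.random.default_rng(0)
X_train = rng.uniform(size=(5, 1))   # 5 parameter points in 1D
bases = [np.linalg.qr(rng.standard_normal((50, 3)))[0] for _ in range(5)]
U_new = predict_subspace(X_train, bases, np.array([0.4]))
print(U_new.shape, np.allclose(U_new.T @ U_new, np.eye(3)))
```

In a PROM setting, a predicted basis like `U_new` would then be used for a Petrov-Galerkin projection of the full-order system matrices at the new parameter point; note that the prediction step itself scales with the number of training points and the basis size, not with the full system dimension, which is the property the abstract highlights for online computation.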


