Data-driven surrogates for high dimensional models using Gaussian process regression on the Grassmann manifold

03/24/2020
by Dimitris G. Giovanis, et al.

This paper introduces a surrogate modeling scheme based on Grassmannian manifold learning for cost-efficient predictions of high-dimensional stochastic systems. The method exploits the low-dimensional subspace structure of each solution by projecting it onto a Grassmann manifold. A solution clustering approach identifies regions of the parameter space over which solutions are sufficiently similar that they can be interpolated on the Grassmannian. In this clustering, the reduced-order solutions are partitioned into disjoint clusters on the Grassmann manifold using the eigen-structure of properly defined Grassmannian kernels, and the Karcher mean of each cluster is estimated. The points in each cluster are then projected onto the tangent space with origin at the corresponding Karcher mean using the logarithmic mapping. For each cluster, a Gaussian process regression model is trained that maps the input parameters of the system to the tangent-space projections of the reduced solution points in that cluster. Using this Gaussian process model, the full-field solution can be efficiently predicted at any new point in the parameter space. In certain cases, a single solution cluster may span disjoint regions of the parameter space. In such cases, we apply a second, density-based spatial clustering to each solution cluster in order to group its corresponding input parameter points in the Euclidean space. The proposed method is applied to two numerical examples. The first is a nonlinear stochastic ordinary differential equation with uncertain initial conditions. The second involves modeling of plastic deformation in a model amorphous solid using the Shear Transformation Zone theory of plasticity.

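The pipeline described in the abstract can be summarized in the minimal sketch below, assuming NumPy and scikit-learn. This is not the authors' implementation: the toy solver, the helper names (grassmann_point, log_map, exp_map, karcher_mean), the choice of spectral clustering with the projection kernel, and the hard-coded cluster assignment at prediction time are illustrative assumptions. The sketch interpolates only the subspace basis; the reconstruction of the full-field solution and the secondary density-based clustering of parameter points are omitted.

```python
# Minimal sketch of a Grassmannian GP surrogate (illustrative, not the authors' code).
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF


def grassmann_point(snapshot, p):
    """Orthonormal basis of the leading p-dimensional column space (thin SVD)."""
    U, _, _ = np.linalg.svd(snapshot, full_matrices=False)
    return U[:, :p]


def log_map(Y, X):
    """Grassmann logarithmic map: tangent vector at base point Y pointing toward X."""
    M = (X - Y @ (Y.T @ X)) @ np.linalg.inv(Y.T @ X)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.arctan(s)) @ Vt


def exp_map(Y, Gamma):
    """Grassmann exponential map: point reached from Y along tangent vector Gamma."""
    U, s, Vt = np.linalg.svd(Gamma, full_matrices=False)
    return Y @ Vt.T @ np.diag(np.cos(s)) @ Vt + U @ np.diag(np.sin(s)) @ Vt


def karcher_mean(points, tol=1e-6, max_iter=100):
    """Fixed-point iteration for the Karcher (Frechet) mean on the Grassmannian."""
    mu = points[0]
    for _ in range(max_iter):
        T = np.mean([log_map(mu, X) for X in points], axis=0)
        if np.linalg.norm(T) < tol:
            break
        mu = exp_map(mu, T)
    return mu


def toy_solver(t, n_dof=60, n_cols=20):
    """Placeholder 'full model': a smooth, low-rank field whose subspace varies with t."""
    x = np.linspace(0.0, 1.0, n_dof)[:, None]
    s = np.linspace(0.0, 1.0, n_cols)[None, :]
    return (np.sin(np.pi * (1.5 + t[0]) * x) @ np.cos(np.pi * s)
            + np.cos(2.0 * np.pi * (1.5 + t[1]) * x) @ np.sin(2.0 * np.pi * s)
            + (x ** 2) @ np.exp(-s))


# Sample the parameter space and project each solution onto Gr(p, n_dof).
rng = np.random.default_rng(0)
n_samples, n_dof, p = 40, 60, 3
theta = rng.uniform(-1.0, 1.0, size=(n_samples, 2))
bases = [grassmann_point(toy_solver(t, n_dof), p) for t in theta]

# Cluster on the Grassmannian using the projection kernel k(X, Y) = ||X^T Y||_F^2.
K = np.array([[np.linalg.norm(X.T @ Y, "fro") ** 2 for Y in bases] for X in bases])
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(K)

# Per cluster: Karcher mean, tangent-space coordinates (log map), and a GP model.
surrogates = {}
for c in np.unique(labels):
    idx = np.where(labels == c)[0]
    mu = karcher_mean([bases[i] for i in idx])
    T = np.array([log_map(mu, bases[i]).ravel() for i in idx])
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(theta[idx], T)
    surrogates[c] = (mu, gp)

# Predict the reduced basis at a new parameter point (cluster assignment rule omitted).
theta_new = np.array([[0.3, -0.2]])
mu, gp = surrogates[0]
Gamma_new = gp.predict(theta_new).reshape(n_dof, p)
basis_new = exp_map(mu, Gamma_new)   # predicted subspace basis, mapped back via exp
```

In this sketch the GP regresses the flattened tangent vectors directly; any cross-validated kernel choice or separate GPs per tangent coordinate would work equally well within the same structure.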
