Nonlinear dimension reduction for surrogate modeling using gradient information

02/20/2021
by Daniele Bigoni, et al.

We introduce a method for the nonlinear dimension reduction of a high-dimensional function u: ℝ^d → ℝ, d ≫ 1. Our objective is to identify a nonlinear feature map g: ℝ^d → ℝ^m, with a prescribed intermediate dimension m ≪ d, so that u can be well approximated by f ∘ g for some profile function f: ℝ^m → ℝ. We propose to build the feature map by aligning the Jacobian ∇g with the gradient ∇u, and we theoretically analyze the properties of the resulting g. Once g is built, we construct f by solving a gradient-enhanced least squares problem. Our practical algorithm makes use of a sample {x^(i), u(x^(i)), ∇u(x^(i))}, i = 1, …, N, and builds both g and f on adaptive downward-closed polynomial spaces, using cross validation to avoid overfitting. We numerically evaluate the performance of our algorithm across different benchmarks, and explore the impact of the intermediate dimension m. We show that building a nonlinear feature map g can permit more accurate approximation of u than a linear g, for the same input data set.
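The abstract outlines a two-stage construction: choose a feature map g whose Jacobian is aligned with the sampled gradients, then fit a profile f by gradient-enhanced least squares on the reduced variables. The sketch below is not the authors' implementation: it specializes g to a linear map g(x) = Aᵀx (the classical active-subspace construction, obtained from the SVD of the gradient samples) and fits a total-degree polynomial f, matching both the sampled values and the projected gradients. All names (linear_feature_map, fit_profile, degree, ...) are illustrative assumptions, and the adaptive downward-closed basis selection and cross-validation described in the abstract are omitted.

# Minimal sketch of the two-stage idea, specialized to a linear feature map.
import numpy as np
from itertools import product

def linear_feature_map(grads, m):
    # grads: (N, d) array of sampled gradients ∇u(x^(i)).
    # The dominant left singular vectors of the gradient matrix span the
    # directions along which u varies most (active-subspace construction).
    U, _, _ = np.linalg.svd(grads.T, full_matrices=False)
    return U[:, :m]                       # A with orthonormal columns, shape (d, m)

def total_degree_basis(Z, degree):
    # Monomials z^a with |a| <= degree on reduced variables Z of shape (N, m),
    # together with their partial derivatives with respect to each z_k.
    N, m = Z.shape
    alphas = [a for a in product(range(degree + 1), repeat=m) if sum(a) <= degree]
    Phi = np.ones((N, len(alphas)))
    dPhi = np.zeros((N, m, len(alphas)))
    for j, a in enumerate(alphas):
        for k in range(m):
            Phi[:, j] *= Z[:, k] ** a[k]
        for k in range(m):
            if a[k] == 0:
                continue
            deriv = a[k] * Z[:, k] ** (a[k] - 1)
            for l in range(m):
                if l != k:
                    deriv = deriv * Z[:, l] ** a[l]
            dPhi[:, k, j] = deriv
    return Phi, dPhi

def fit_profile(X, u, grads, A, degree=3):
    # Gradient-enhanced least squares for f on z = Aᵀx: match the sampled
    # values u(x^(i)) and the projected gradients Aᵀ∇u(x^(i)) simultaneously.
    Z = X @ A                             # (N, m) reduced coordinates
    Phi, dPhi = total_degree_basis(Z, degree)
    lhs = np.vstack([Phi, dPhi.reshape(-1, dPhi.shape[-1])])
    rhs = np.concatenate([u, (grads @ A).ravel()])
    coef, *_ = np.linalg.lstsq(lhs, rhs, rcond=None)
    return coef, Z

# Example: u(x) = exp(w·x) is an exact ridge, so m = 1 already suffices.
rng = np.random.default_rng(0)
d, N = 20, 300
w = rng.standard_normal(d); w /= np.linalg.norm(w)
X = rng.uniform(-1.0, 1.0, size=(N, d))
u = np.exp(X @ w)
grads = u[:, None] * w                    # ∇u(x) = exp(w·x) w
A = linear_feature_map(grads, m=1)
coef, Z = fit_profile(X, u, grads, A, degree=5)

For a genuinely nonlinear g as in the abstract, the matrix A would be replaced by the coefficients of a polynomial feature map whose Jacobian ∇g is aligned with the sampled ∇u, with both expansions built adaptively and validated by cross validation; the gradient-enhanced least squares structure of the second stage stays the same.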
