
Nonlinear dimension reduction for surrogate modeling using gradient information

02/20/2021
by Daniele Bigoni, et al.

We introduce a method for the nonlinear dimension reduction of a high-dimensional function u: ℝ^d → ℝ, d ≫ 1. Our objective is to identify a nonlinear feature map g: ℝ^d → ℝ^m, with a prescribed intermediate dimension m ≪ d, so that u can be well approximated by f ∘ g for some profile function f: ℝ^m → ℝ. We propose to build the feature map by aligning the Jacobian ∇g with the gradient ∇u, and we theoretically analyze the properties of the resulting g. Once g is built, we construct f by solving a gradient-enhanced least squares problem. Our practical algorithm makes use of a sample {x^(i), u(x^(i)), ∇u(x^(i))}_{i=1}^N and builds both g and f on adaptive downward-closed polynomial spaces, using cross validation to avoid overfitting. We numerically evaluate the performance of our algorithm across different benchmarks, and explore the impact of the intermediate dimension m. We show that building a nonlinear feature map g can permit more accurate approximation of u than a linear g, for the same input data set.
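
As a concrete illustration of the workflow described above, the following is a minimal sketch in Python/NumPy. It is not the authors' algorithm: the feature map g is taken to be linear (an active-subspace-style projection built from the sampled gradients), standing in for the paper's nonlinear g, and the profile f is a fixed total-degree polynomial fitted by gradient-enhanced least squares rather than built on an adaptive, cross-validated polynomial space. All function names, parameters, and the test function are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from itertools import product


def build_linear_feature_map(grads, m):
    """Top-m eigenvectors of C = (1/N) sum_i grad u(x_i) grad u(x_i)^T.

    A linear stand-in for the paper's nonlinear g: here g(x) = W^T x.
    """
    C = grads.T @ grads / grads.shape[0]
    _, eigvecs = np.linalg.eigh(C)          # eigenvalues in ascending order
    return eigvecs[:, ::-1][:, :m]          # W, shape (d, m)


def total_degree_multi_indices(m, degree):
    """Downward-closed total-degree index set {alpha : |alpha| <= degree}."""
    return [a for a in product(range(degree + 1), repeat=m) if sum(a) <= degree]


def monomial(z, alpha):
    return np.prod(z ** np.asarray(alpha, dtype=float))


def monomial_grad(z, alpha):
    """Gradient of z^alpha with respect to the reduced variable z."""
    g = np.zeros(len(alpha))
    for j, a in enumerate(alpha):
        if a > 0:
            e = np.asarray(alpha, dtype=float)
            e[j] -= 1.0
            g[j] = a * np.prod(z ** e)
    return g


def fit_profile(X, u_vals, grads, W, degree):
    """Gradient-enhanced least squares for f on z = W^T x.

    Each sample contributes one value equation f(z_i) = u(x_i) and d gradient
    equations W grad_z f(z_i) = grad u(x_i) (chain rule for x -> W^T x).
    """
    Z = X @ W
    alphas = total_degree_multi_indices(W.shape[1], degree)
    rows, rhs = [], []
    for i in range(X.shape[0]):
        z = Z[i]
        rows.append([monomial(z, a) for a in alphas])
        rhs.append(u_vals[i])
        G = np.column_stack([W @ monomial_grad(z, a) for a in alphas])
        for k in range(X.shape[1]):
            rows.append(G[k])
            rhs.append(grads[i, k])
    coeffs, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return lambda z: sum(c * monomial(z, a) for c, a in zip(coeffs, alphas))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, m, N = 10, 1, 200
    w = rng.normal(size=d)
    w /= np.linalg.norm(w)
    u = lambda X: np.sin(X @ w)                    # ridge-type test function
    du = lambda X: np.cos(X @ w)[:, None] * w      # its gradient
    X = rng.normal(size=(N, d))
    W = build_linear_feature_map(du(X), m)         # g(x) = W^T x
    f = fit_profile(X, u(X), du(X), W, degree=5)   # profile on the reduced variable
    Xt = rng.normal(size=(50, d))
    err = np.sqrt(np.mean((np.array([f(x @ W) for x in Xt]) - u(Xt)) ** 2))
    print(f"test RMS error of f(g(x)): {err:.3e}")
```

On this ridge-type test function the gradient samples already expose a one-dimensional linear structure, so a linear map suffices; the abstract's claim is that a nonlinear g can give a more accurate approximation when u has no such linear low-dimensional structure, for the same input data set.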


Related research

10/28/2020 · Bridging linearity-based and kernel-based sufficient dimension reduction
There has been a lot of interest in sufficient dimension reduction (SDR)...

06/03/2019 · Transformed Central Quantile Subspace
We present a dimension reduction technique for the conditional quantiles...

05/26/2015 · Using Dimension Reduction to Improve the Classification of High-dimensional Data
In this work we show that the classification performance of high-dimensi...

10/09/2022 · Nonlinear Sufficient Dimension Reduction with a Stochastic Neural Network
Sufficient dimension reduction is a powerful tool to extract core inform...

06/10/2020 · Deep Dimension Reduction for Supervised Representation Learning
The success of deep supervised learning depends on its automatic data re...

04/26/2013 · Learning Densities Conditional on Many Interacting Features
Learning a distribution conditional on a set of discrete-valued features...