Learning functions varying along an active subspace

01/22/2020
by Hao Liu, et al.

Many functions of interest live in a high-dimensional space but exhibit low-dimensional structure. This paper studies regression of an s-Hölder function f in R^D that varies only along an active subspace of dimension d, where d ≪ D. A direct approximation of f in R^D with ε accuracy requires a number of samples n on the order of ε^(-(2s+D)/s). In this paper, we modify the Generalized Contour Regression (GCR) algorithm to estimate the active subspace and use piecewise polynomials for function approximation. GCR is among the best estimators of the active subspace, but its sample complexity is an open question. Our modified GCR improves the efficiency over the original GCR and, when n is sufficiently large, leads to a mean squared estimation error of O(n^-1) for the active subspace. The mean squared regression error of f is proved to be on the order of (n/log n)^(-2s/(2s+d)), where the exponent depends on the dimension d of the active subspace rather than the ambient dimension D. This result demonstrates that GCR is effective in learning low-dimensional active subspaces. The convergence rate is validated through several numerical experiments.
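To make the setup concrete, the sketch below generates data from a function f(x) = g(a^T x) that varies only along a one-dimensional active subspace of R^D, estimates that direction, and then regresses on the projected coordinate. The paper's modified GCR is not reproduced here; as a stand-in, the subspace is estimated from the top eigenvector of an empirical gradient outer-product matrix, and a single global polynomial replaces the piecewise polynomial estimator. The dimensions, the link function g, the noise level, and the finite-difference step are all assumptions made for illustration.

```python
# Minimal illustrative sketch (not the paper's modified GCR).
# Assumed setup: f(x) = g(a^T x) varies along a 1-D active subspace of R^D;
# the direction is estimated with a gradient outer-product heuristic, then a
# polynomial is fit on the projected coordinate.
import numpy as np

rng = np.random.default_rng(0)
D, d, n = 20, 1, 2000                       # ambient dim, active dim, samples

# Ground-truth active direction and link function (both assumptions).
a = rng.standard_normal(D)
a /= np.linalg.norm(a)
g = lambda t: np.sin(2.0 * t) + 0.5 * t ** 2
f = lambda X: g(X @ a)

X = rng.uniform(-1.0, 1.0, size=(n, D))
y = f(X) + 0.05 * rng.standard_normal(n)    # noisy regression data

# Gradient outer-product estimate of the active subspace.  Note: this uses
# finite-difference queries to f for brevity; GCR-type estimators work from
# the samples (X, y) alone.
h = 1e-3
G = np.zeros((n, D))
for j in range(D):
    e = np.zeros(D)
    e[j] = h
    G[:, j] = (f(X + e) - f(X - e)) / (2.0 * h)
C = G.T @ G / n                             # empirical gradient covariance
eigvecs = np.linalg.eigh(C)[1]
A_hat = eigvecs[:, -d:]                     # top-d eigenvectors span the estimate

print("alignment |a^T a_hat| =", abs(float(a @ A_hat[:, 0])))

# Low-dimensional regression on the projected coordinate (a single global
# polynomial here; the paper uses piecewise polynomials).
t = (X @ A_hat).ravel()
coef = np.polyfit(t, y, deg=5)
mse = float(np.mean((y - np.polyval(coef, t)) ** 2))
print("training MSE on projected coordinate =", mse)
```

Even in this simplified form, the recovered direction aligns closely with the true one, and the regression step operates on a single projected coordinate rather than all 20 ambient coordinates, which is the structural point the paper's rates quantify.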
