
ANOVA kernels and RKHS of zero mean functions for model-based sensitivity analysis

06/17/2011
by Nicolas Durrande et al.
École Nationale Supérieure des Mines de Saint-Étienne
Télécom Saint-Étienne

Given a reproducing kernel Hilbert space H of real-valued functions and a suitable measure mu over the source space D (a subset of R), we decompose H as the sum of a subspace of functions centered for mu and its orthogonal complement in H. This decomposition leads to a special case of ANOVA kernels, for which the functional ANOVA representation of the best predictor can be elegantly derived, either in an interpolation or a regularization framework. The proposed kernels are particularly convenient for analyzing the effect of each (group of) variable(s) and for computing sensitivity indices without recursion.
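As a rough illustration of this construction, the sketch below builds a zero-mean kernel k0 from a base kernel k by subtracting its mu-averages, k0(x, y) = k(x, y) - ∫k(x, s)dmu(s) ∫k(y, s)dmu(s) / ∬k(s, t)dmu(s)dmu(t), and then forms an ANOVA kernel as the product of (1 + k0) over the input coordinates. Everything in the snippet is an assumption made for the example (a Gaussian base kernel, the uniform measure on [0, 1], a crude quadrature grid, and the names rbf, zero_mean_kernel, anova_kernel); it is a minimal sketch, not the authors' code.

```python
import numpy as np

def rbf(x, y, lengthscale=0.2):
    """Gaussian (RBF) base kernel on the unit interval (illustrative choice)."""
    return np.exp(-0.5 * (x - y) ** 2 / lengthscale ** 2)

def zero_mean_kernel(k, n_quad=200):
    """Centered kernel k0(x, y) = k(x, y) - int k(x,s)dmu(s) int k(y,s)dmu(s) / int int k(s,t)dmu(s)dmu(t),
    with mu taken here as the uniform measure on [0, 1], approximated by a quadrature grid."""
    s = np.linspace(0.0, 1.0, n_quad)           # quadrature nodes
    w = np.full(n_quad, 1.0 / n_quad)           # uniform quadrature weights (mu = U[0, 1])
    denom = w @ k(s[:, None], s[None, :]) @ w   # double integral of k over D x D

    def k0(x, y):
        x = np.atleast_1d(x); y = np.atleast_1d(y)
        ix = (k(x[:, None], s[None, :]) * w).sum(axis=1)   # int k(x, s) dmu(s)
        iy = (k(y[:, None], s[None, :]) * w).sum(axis=1)   # int k(y, s) dmu(s)
        return k(x[:, None], y[None, :]) - np.outer(ix, iy) / denom

    return k0

def anova_kernel(X, Y, k0):
    """ANOVA kernel K(x, y) = prod_i (1 + k0(x_i, y_i)) between the rows of X and Y."""
    K = np.ones((X.shape[0], Y.shape[0]))
    for i in range(X.shape[1]):
        K *= 1.0 + k0(X[:, i], Y[:, i])
    return K

# Usage sketch: Gram matrix of the ANOVA kernel on 5 random points in [0, 1]^3.
k0 = zero_mean_kernel(rbf)
X = np.random.rand(5, 3)
print(anova_kernel(X, X, k0).shape)   # (5, 5)
```

Because every factor beyond the constant 1 involves only mu-centered functions, each term of the resulting functional ANOVA decomposition of the predictor is itself centered, which is what allows sensitivity indices to be read off the decomposition directly rather than computed recursively.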


Related research

06/27/2018 · Distribution regression model with a Reproducing Kernel Hilbert Space approach
In this paper, we introduce a new distribution regression model for prob...

01/30/2020 · Reproducing kernels based schemes for nonparametric regression
In this work, we develop and study an empirical projection operator sche...

10/06/2009 · Functional learning through kernels
This paper reviews the functional aspects of statistical learning theory...

07/12/2020 · On the generalization of Tanimoto-type kernels to real valued functions
The Tanimoto kernel (Jaccard index) is a well known tool to describe the...

05/16/2023 · Kernel-based sensitivity analysis for (excursion) sets
This work is motivated by goal-oriented sensitivity analysis of inputs/o...

02/14/2018 · D2KE: From Distance to Kernel and Embedding
For many machine learning problem settings, particularly with structured...

09/07/2009 · Kernels for Measures Defined on the Gram Matrix of their Support
We present in this work a new family of kernels to compare positive meas...