
ANOVA kernels and RKHS of zero mean functions for model-based sensitivity analysis

by Nicolas Durrande et al.
École Nationale Supérieure des Mines de Saint-Étienne
Télécom Saint-Étienne

Given a reproducing kernel Hilbert space H of real-valued functions and a suitable measure μ over the source space D (a subset of R), we decompose H as the sum of a subspace of functions centered for μ and its orthogonal complement in H. This decomposition leads to a special case of ANOVA kernels, for which the functional ANOVA representation of the best predictor can be elegantly derived, in either an interpolation or a regularization framework. The proposed kernels are particularly convenient for analyzing the effect of each (group of) variable(s) and for computing sensitivity indices without recursion.
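A minimal numerical sketch of the construction described above, under stated assumptions: a Gaussian base kernel (an arbitrary illustrative choice), μ taken as the uniform measure on [0, 1], and integrals against μ replaced by crude quadrature on a grid. The helper names (`k0`, `k_anova`) are hypothetical, not from the paper. The centered kernel k0 reproduces the subspace of zero-mean functions, and the ANOVA kernel is the product over input dimensions of (1 + k0), so each term of the functional ANOVA decomposition of the best predictor integrates to zero against μ.

```python
import numpy as np

def k_gauss(x, y, ell=0.2):
    """Base 1-D Gaussian kernel (illustrative choice)."""
    return np.exp(-0.5 * ((x - y) / ell) ** 2)

# Quadrature nodes and weights approximating integrals against mu = U[0, 1].
t = np.linspace(0.0, 1.0, 201)
w = np.full(t.shape, 1.0 / t.size)

def k0(x, y, k=k_gauss):
    """Centered kernel of the subspace of zero-mean functions for mu:
    k0(x,y) = k(x,y) - [int k(x,s)dmu(s) * int k(y,s)dmu(s)] / int int k dmu dmu.
    """
    kx = np.sum(w * k(x, t))                                  # int k(x, .) dmu
    ky = np.sum(w * k(y, t))                                  # int k(y, .) dmu
    kk = np.sum(np.outer(w, w) * k(t[:, None], t[None, :]))   # int int k dmu dmu
    return k(x, y) - kx * ky / kk

def k_anova(x, y):
    """ANOVA kernel on [0,1]^d: product over dimensions of (1 + k0)."""
    return np.prod([1.0 + k0(xi, yi) for xi, yi in zip(x, y)])

# By construction, k0(x, .) is centered for mu (zero up to quadrature error):
centered = np.sum(w * np.array([k0(0.3, ti) for ti in t]))
print(centered)  # ~ 0
```

Because each k0 is centered, the sub-models of the ANOVA expansion of the best predictor are mutually orthogonal in L²(μ), which is what allows sensitivity indices to be read off term by term rather than computed recursively.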



