Conditional mean embeddings and optimal feature selection via positive definite kernels

05/14/2023
by   Palle E. T. Jorgensen, et al.

Motivated by applications, we consider here new operator-theoretic approaches to conditional mean embeddings (CME). Our present results combine a spectral-analysis-based optimization scheme with the use of kernels, stochastic processes, and constructive learning algorithms. For initially given non-linear data, we consider optimization-based feature selections. This entails the use of convex sets of positive definite (p.d.) kernels in a construction of optimal feature selection via regression algorithms from learning models. Thus, with initial inputs of training data (for a suitable learning algorithm), each choice of p.d. kernel K in turn yields a variety of Hilbert spaces and realizations of features. A novel idea here is that we allow an optimization over selected sets of kernels K from a convex set C of p.d. kernels. Hence our “optimal” choices of feature representations will depend on a secondary optimization over p.d. kernels K within a specified convex set C.
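The two ingredients named in the abstract — an empirical conditional mean embedding built from a chosen p.d. kernel K, and a convex set C of candidate kernels (here realized as convex combinations of a finite family) — can be sketched as follows. This is an illustrative sketch, not the authors' implementation; the function names, the Gaussian-kernel choice, and the regularization scheme are all assumptions introduced for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) positive definite kernel matrix K[i, j] = exp(-gamma * |a_i - b_j|^2).
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def cme_weights(X, x_new, lam=1e-2, gamma=1.0):
    # Standard empirical conditional mean embedding (regularized):
    #   mu_{Y|X=x} ≈ sum_i beta_i(x) phi(y_i),  beta(x) = (K_X + n*lam*I)^{-1} k_X(x),
    # so E[g(Y) | X=x] is estimated by beta(x)^T (g(y_1), ..., g(y_n)).
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    k = rbf_kernel(X, x_new, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), k)

def convex_combination_kernel(gram_matrices, weights):
    # A convex combination of p.d. kernels is again p.d.; optimizing the
    # simplex weights is one concrete way to search over a convex set C
    # of kernels, as in the secondary optimization described above.
    w = np.asarray(weights, dtype=float)
    assert np.all(w >= 0.0) and abs(w.sum() - 1.0) < 1e-9
    return sum(wi * Ki for wi, Ki in zip(w, gram_matrices))
```

For instance, with training pairs (X, Y) and Y = g(X), the estimate of E[Y | X = x] is `cme_weights(X, x)[:, 0] @ Y[:, 0]`; a secondary optimization would then score such estimates over simplex weights on a fixed kernel family.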


