Regularised Least-Squares Regression with Infinite-Dimensional Output Space

10/21/2020
by Junhyung Park, et al.

We present some learning theory results on reproducing kernel Hilbert space (RKHS) regression, where the output space is an infinite-dimensional Hilbert space.



Related research

Consistency and Regression with Laplacian regularization in Reproducing Kernel Hilbert Space (09/09/2020)
This note explains a way to look at reproducing kernel Hilbert space for...

The theory and application of penalized methods or Reproducing Kernel Hilbert Spaces made easy (11/08/2011)
The popular cubic smoothing spline estimate of a regression function ari...

Reproducing Kernel Hilbert Spaces Cannot Contain all Continuous Functions on a Compact Metric Space (02/08/2020)
Given an uncountable, compact metric space, we show that there exists no...

The linear conditional expectation in Hilbert space (08/27/2020)
The linear conditional expectation (LCE) provides a best linear (or rath...

Autoencoding any Data through Kernel Autoencoders (05/28/2018)
This paper investigates a novel algorithmic approach to data representat...

Control Occupation Kernel Regression for Nonlinear Control-Affine Systems (05/31/2021)
This manuscript presents an algorithm for obtaining an approximation of ...

A kernel function for Signal Temporal Logic formulae (09/11/2020)
We discuss how to define a kernel for Signal Temporal Logic (STL) formul...