Learning from dependent observations

07/02/2007
by Ingo Steinwart et al.

In most papers establishing consistency for learning algorithms, it is assumed that the observations used for training are realizations of an i.i.d. process. In this paper we go far beyond this classical framework by showing that support vector machines (SVMs) essentially only require that the data-generating process satisfies a certain law of large numbers. We then consider the learnability of SVMs for α-mixing (not necessarily stationary) processes for both classification and regression, where for the latter we explicitly allow unbounded noise.
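For readers unfamiliar with the mixing condition named in the abstract, the α-mixing (strong mixing) coefficients of a process Z = (Z_i)_{i ≥ 1} are standardly defined, following Rosenblatt, by

\alpha(k) = \sup_{n \ge 1} \sup \bigl\{ \, |P(A \cap B) - P(A)\,P(B)| \; : \; A \in \sigma(Z_1, \dots, Z_n), \; B \in \sigma(Z_{n+k}, Z_{n+k+1}, \dots) \bigr\},

and the process is called α-mixing if \alpha(k) \to 0 as k \to \infty. This is the classical textbook definition, given here only as background; the paper itself should be consulted for the exact variant it works with (in particular, stationarity is not assumed).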


