A Nonconformity Approach to Model Selection for SVMs

09/12/2009
by David R. Hardoon, et al.

We investigate the issue of model selection and the use of the nonconformity (strangeness) measure in batch learning. Using the nonconformity measure, we propose a new training algorithm that helps avoid the need for cross-validation or leave-one-out model selection strategies. We provide a new generalisation error bound that uses the notion of nonconformity to upper bound the loss of each test example, and we show that our proposed approach is comparable to standard model selection methods but with theoretical guarantees of success and faster convergence. We demonstrate our novel model selection technique using the Support Vector Machine.

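The abstract does not spell out the training algorithm, but the core idea of scoring how "strange" each example looks to a trained classifier and using those scores in place of repeated cross-validation runs can be sketched briefly. The Python snippet below is a minimal illustration only, not the paper's method: it assumes a margin-based nonconformity score of the form -y * f(x), a single held-out calibration split, and a hypothetical grid of C values, and it ranks candidate SVMs by their mean calibration nonconformity.

# Illustrative sketch (not the paper's algorithm): rank SVM hyperparameters
# by a margin-based nonconformity (strangeness) score on held-out data,
# instead of running full cross-validation for each candidate.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def nonconformity_scores(clf, X, y):
    # Margin-based strangeness: larger when an example is misclassified or
    # close to the decision boundary. The form -y * f(x) is a common choice
    # in conformal prediction, used here purely for illustration.
    return -y * clf.decision_function(X)

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
y = 2 * y - 1  # map labels to {-1, +1} so the margin sign is meaningful
X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, test_size=0.3, random_state=0)

candidates = [0.01, 0.1, 1.0, 10.0]  # hypothetical grid of C values
results = {}
for C in candidates:
    clf = SVC(kernel="linear", C=C).fit(X_tr, y_tr)
    scores = nonconformity_scores(clf, X_cal, y_cal)
    # Simple selection criterion: mean nonconformity on the calibration set.
    results[C] = scores.mean()

best_C = min(results, key=results.get)
print({C: round(s, 3) for C, s in results.items()}, "-> selected C =", best_C)

In a scheme of this kind each candidate model is trained once rather than k times, which is where a nonconformity-based selection rule can gain speed over k-fold cross-validation; the paper's contribution is a generalisation bound that justifies such a rule, rather than the ad hoc mean score used above.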

research · 10/12/2018
Limitations of "Limitations of Bayesian leave-one-out cross-validation for model selection"
This article is an invited discussion of the article by Gronau and Wagen...

research · 09/16/2020
Better Model Selection with a new Definition of Feature Importance
Feature importance aims at measuring how crucial each input feature is f...

research · 02/16/2018
Train on Validation: Squeezing the Data Lemon
Model selection on validation data is an essential step in machine learn...

research · 05/14/2018
Spatio-temporal Bayesian On-line Changepoint Detection with Model Selection
Bayesian On-line Changepoint Detection is extended to on-line model sele...

research · 10/23/2014
Model Selection for Topic Models via Spectral Decomposition
Topic models have achieved significant successes in analyzing large-scal...

research · 02/07/2019
Model Selection for Simulator-based Statistical Models: A Kernel Approach
We propose a novel approach to model selection for simulator-based stati...

research · 10/28/2018
Consistency of ELBO maximization for model selection
The Evidence Lower Bound (ELBO) is a quantity that plays a key role in v...
