Optimal γ and C for ε-Support Vector Regression with RBF Kernels

06/12/2015
by Longfei Lu, et al.

The objective of this study is to investigate the efficient determination of C and γ for Support Vector Regression with an RBF or Mahalanobis kernel, based on numerical and statistical considerations. The analysis indicates the connection between C and the kernel, and demonstrates that the deviation of the geometric distances between neighbouring observations in the mapped feature space affects the prediction accuracy of ε-SVR. We determine the range of γ and C and propose a method for choosing their best values.
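
The paper's own selection procedure is not reproduced here. As a point of reference, the sketch below shows the standard cross-validated grid-search baseline for choosing C and γ in ε-SVR with an RBF kernel, k(x, x') = exp(-γ‖x − x'‖²); the use of scikit-learn and the synthetic data are assumptions of this illustration, not taken from the paper.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))          # toy 1-D inputs, illustration only
y = np.sin(X).ravel() + rng.normal(0.0, 0.1, 200)  # noisy sine targets

# Candidate grids for the two hyperparameters the paper studies.
param_grid = {
    "C": np.logspace(-2, 3, 6),      # regularization strength
    "gamma": np.logspace(-3, 2, 6),  # RBF width: k(x, x') = exp(-gamma * ||x - x'||^2)
}

# 5-fold cross-validated grid search over (C, gamma) for epsilon-SVR.
search = GridSearchCV(SVR(kernel="rbf", epsilon=0.1), param_grid, cv=5)
search.fit(X, y)
print("best C:", search.best_params_["C"])
print("best gamma:", search.best_params_["gamma"])
```

Grid search scales poorly with the number of candidates per parameter, which is precisely the motivation for methods, like the one proposed here, that narrow the search range of γ and C analytically first.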

