LS-SVR as a Bayesian RBF network

05/01/2019
by Diego P. P. Mesquita, et al.

We show the theoretical equivalence between the Least Squares Support Vector Regression (LS-SVR) model and maximum a posteriori (MAP) inference on Bayesian Radial Basis Function (RBF) networks with a specific Gaussian prior on the regression weights. Although previous works have pointed out similar expressions relating these learning approaches, we state the correspondence explicitly and formally. We demonstrate our result empirically through computational experiments on standard regression benchmarks. Our findings open a range of possibilities to improve LS-SVR by borrowing strength from well-established developments in Bayesian methodology.
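The claimed equivalence can be checked numerically. The sketch below (an illustration on synthetic data, not the paper's code; the bias term is omitted for brevity) compares the LS-SVR dual solution alpha = (K + I/gamma)^{-1} y with the MAP weights of an RBF network whose hidden units are centered at the training points, under a Gaussian prior with covariance proportional to K^{-1}:

```python
import numpy as np

def rbf_kernel(X, Y, width=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-width * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)

gamma = 10.0  # LS-SVR regularization parameter (assumed value)
K = rbf_kernel(X, X)

# LS-SVR dual solution (bias term omitted):
#   alpha = (K + I / gamma)^{-1} y
alpha = np.linalg.solve(K + np.eye(len(y)) / gamma, y)

# MAP weights of an RBF network with design matrix Phi = K,
# unit noise variance, and Gaussian prior w ~ N(0, gamma * K^{-1}):
#   w = (K^T K + K / gamma)^{-1} K^T y
w = np.linalg.solve(K @ K + K / gamma, K @ y)

print(np.allclose(alpha, w))  # prints True: the two solutions coincide
```

Algebraically, (K^T K + K/gamma)^{-1} K^T y = (K (K + I/gamma))^{-1} K y = (K + I/gamma)^{-1} y whenever K is invertible, which is why the two solves return the same vector.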


