Mean field variational Bayesian inference for support vector machine classification

05/13/2013
by Jan Luts, et al.

A mean field variational Bayes approach to support vector machines (SVMs) using the latent variable representation of Polson & Scott (2012) is presented. This representation circumvents many of the shortcomings associated with classical SVMs, providing automatic penalty parameter selection and the ability to handle dependent samples, missing data and variable selection. We demonstrate on simulated and real datasets that our approach is easily extendable to non-standard situations and outperforms the classical SVM approach whilst remaining computationally efficient.
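To make the latent variable idea concrete, the sketch below implements a stripped-down version of the mean field updates for a linear SVM. It uses the Polson & Scott normal scale-mixture representation of the hinge-loss pseudo-likelihood, exp(-2 max(0, 1 - y_i x_i'beta)), with a factorized approximation q(beta) q(lambda). This is an illustration, not the paper's full algorithm: it assumes a fixed Gaussian prior N(0, sigma2_beta I) in place of automatic penalty parameter selection, independent observations, and complete data; the function name mfvb_linear_svm and all constants are hypothetical.

```python
import numpy as np

def mfvb_linear_svm(X, y, sigma2_beta=100.0, n_iter=100, tol=1e-8):
    """Mean field variational Bayes for a linear SVM (sketch).

    Polson & Scott scale-mixture representation of the hinge loss:
    exp(-2 max(0, 1 - y_i x_i'beta)) is a normal mixture over lambda_i > 0.
    X is (n, p), y takes values in {-1, +1}.  Returns the mean and
    covariance of the Gaussian factor q(beta) = N(mu, Sigma).
    """
    n, p = X.shape
    yX = y[:, None] * X                       # rows y_i * x_i
    inv_lam = np.ones(n)                      # E_q[1 / lambda_i], initial guess
    mu = np.zeros(p)
    for _ in range(n_iter):
        # q(beta) update: ridge-like Gaussian with precision weights E_q[1/lambda_i]
        Sigma = np.linalg.inv(X.T @ (inv_lam[:, None] * X) + np.eye(p) / sigma2_beta)
        mu_new = Sigma @ (yX.T @ (1.0 + inv_lam))
        # q(lambda_i) update: E_q[1/lambda_i] = (E_q[(1 - y_i x_i'beta)^2])^{-1/2}
        resid = 1.0 - yX @ mu_new
        quad = np.einsum('ij,jk,ik->i', X, Sigma, X)   # x_i' Sigma x_i
        inv_lam = 1.0 / np.sqrt(resid**2 + quad)
        if np.max(np.abs(mu_new - mu)) < tol:
            mu = mu_new
            break
        mu = mu_new
    return mu, Sigma

# Toy usage on synthetic two-class data
rng = np.random.default_rng(0)
X = np.r_[rng.normal(1.5, 1.0, size=(50, 2)), rng.normal(-1.5, 1.0, size=(50, 2))]
y = np.r_[np.ones(50), -np.ones(50)]
mu, Sigma = mfvb_linear_svm(X, y)
print("training accuracy:", np.mean(np.sign(X @ mu) == y))
```

Each iteration alternates a closed-form Gaussian update for q(beta) with a closed-form update of E_q[1/lambda_i]; the extensions described in the abstract (automatic penalty selection, dependent samples, missing data, variable selection) would add further factors to the same coordinate-ascent scheme.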

Related research

10/02/2007 - Structured variable selection in support vector machines
When applying the support vector machine (SVM) to high-dimensional class...

03/24/2023 - Particle Mean Field Variational Bayes
The Mean Field Variational Bayes (MFVB) method is one of the most comput...

07/18/2017 - Bayesian Nonlinear Support Vector Machines for Big Data
We propose a fast inference method for Bayesian nonlinear support vector...

07/11/2022 - Sparse Dynamic Factor Models with Loading Selection by Variational Inference
In this paper we develop a novel approach for estimating large and spars...

10/14/2021 - Algorithms for Sparse Support Vector Machines
Many problems in classification involve huge numbers of irrelevant featu...

08/17/2018 - A bagging and importance sampling approach to Support Vector Machines
An importance sampling and bagging approach to solving the support vecto...

09/24/2014 - Variational Pseudolikelihood for Regularized Ising Inference
I propose a variational approach to maximum pseudolikelihood inference o...
