Sparse Multinomial Logistic Regression via Approximate Message Passing

09/15/2015
by Evan Byrne et al.

For the problem of multi-class linear classification and feature selection, we propose approximate message passing approaches to sparse multinomial logistic regression (MLR). First, we propose two algorithms based on the Hybrid Generalized Approximate Message Passing (HyGAMP) framework: one finds the maximum a posteriori (MAP) linear classifier and the other finds an approximation of the test-error-rate minimizing linear classifier. Then we design computationally simplified variants of these two algorithms. Next, we detail methods to tune the hyperparameters of their assumed statistical models using Stein's unbiased risk estimate (SURE) and expectation-maximization (EM), respectively. Finally, using both synthetic and real-world datasets, we demonstrate improved error-rate and runtime performance relative to existing state-of-the-art approaches to sparse MLR.
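To make the problem setup concrete, the sketch below solves sparse multinomial logistic regression with a conventional L1-penalized solver (scikit-learn's `saga`). This is a standard baseline, not the HyGAMP message-passing algorithms proposed in the paper; it only illustrates the task the abstract describes: multi-class linear classification where sparsity in the weight matrix performs feature selection. The dataset sizes and the regularization strength `C=0.5` are arbitrary choices for the demonstration.

```python
# Baseline sketch: sparse multinomial logistic regression (MLR) via an
# L1 penalty. NOT the paper's HyGAMP/AMP approach -- just the problem setup.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic data: 200 samples, 50 features, only 5 of them informative,
# 3 classes -- a small instance of the feature-selection regime.
X, y = make_classification(n_samples=200, n_features=50,
                           n_informative=5, n_redundant=0,
                           n_classes=3, n_clusters_per_class=1,
                           random_state=0)

# The L1 penalty drives irrelevant weights exactly to zero, so the
# nonzero pattern of clf.coef_ acts as the selected feature set.
clf = LogisticRegression(penalty="l1", solver="saga", C=0.5,
                         max_iter=5000, random_state=0)
clf.fit(X, y)

# A feature is "selected" if any class uses it with a nonzero weight.
selected = np.where(np.any(clf.coef_ != 0, axis=0))[0]
print("selected features:", selected)
print("zero weights: %d of %d" % ((clf.coef_ == 0).sum(), clf.coef_.size))
```

The AMP-based algorithms in the paper target the same estimate (MAP, or an approximation of the test-error-rate minimizing classifier) but scale differently, which is the source of the runtime gains reported in the abstract.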


Related research:

- 01/05/2014: Binary Linear Classification and Feature Selection via Generalized Approximate Message Passing
  For the problem of binary linear classification and feature selection, w...

- 06/26/2018: An Expectation-Maximization Approach to Tuning Generalized Vector Approximate Message Passing
  Generalized Vector Approximate Message Passing (GVAMP) is an efficient i...

- 05/23/2019: Replicated Vector Approximate Message Passing For Resampling Problem
  Resampling techniques are widely used in statistical inference and ensem...

- 09/20/2020: Expectation propagation for the diluted Bayesian classifier
  Efficient feature selection from high-dimensional datasets is a very imp...

- 02/20/2018: Estimator of Prediction Error Based on Approximate Message Passing for Penalized Linear Regression
  We propose an estimator of prediction error using an approximate message...

- 02/14/2012: Message-Passing Algorithms for Quadratic Programming Formulations of MAP Estimation
  Computing maximum a posteriori (MAP) estimation in graphical models is a...

- 06/11/2020: Asymptotic Errors for Teacher-Student Convex Generalized Linear Models (or: How to Prove Kabashima's Replica Formula)
  There has been a recent surge of interest in the study of asymptotic rec...
