
Expectation-maximization for logistic regression

by James G. Scott et al.
The University of Texas at Austin

We present a family of expectation-maximization (EM) algorithms for binary and negative-binomial logistic regression, drawing a sharp connection with the variational-Bayes algorithm of Jaakkola and Jordan (2000). Indeed, our results allow a version of this variational-Bayes approach to be re-interpreted as a true EM algorithm. We study several interesting features of the algorithm, and of this previously unrecognized connection with variational Bayes. We also generalize the approach to sparsity-promoting priors, and to an online method whose convergence properties are easily established. This latter method compares favorably with stochastic-gradient descent in situations with marked collinearity.
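To make the connection concrete, the EM scheme described in the abstract can be sketched as follows. Under the Polya-Gamma data-augmentation view, the E-step replaces each latent variable by its conditional mean, E[omega_i | beta] = tanh(psi_i/2)/(2 psi_i) with psi_i = x_i' beta, and the M-step solves a weighted least-squares system. The sketch below is a minimal illustration under these assumptions, not the authors' reference implementation; the function name `em_logistic` and all defaults are hypothetical.

```python
import numpy as np

def em_logistic(X, y, n_iter=200, tol=1e-8):
    """EM for binary logistic regression via Polya-Gamma augmentation.

    E-step: omega_i = tanh(psi_i / 2) / (2 * psi_i), where psi_i = x_i' beta.
    M-step: solve (X' diag(omega) X) beta = X' kappa, with kappa_i = y_i - 1/2.
    (Illustrative sketch; a ridge term would be added for a Gaussian prior.)
    """
    n, p = X.shape
    beta = np.zeros(p)
    kappa = y - 0.5
    for _ in range(n_iter):
        psi = X @ beta
        # Conditional mean of the Polya-Gamma latent; the limit as psi -> 0 is 1/4.
        omega = np.where(np.abs(psi) < 1e-10, 0.25,
                         np.tanh(psi / 2.0) / (2.0 * psi))
        beta_new = np.linalg.solve(X.T @ (omega[:, None] * X), X.T @ kappa)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```

Each iteration is a weighted least-squares solve, so the update is available in closed form; this is the sense in which the variational-Bayes recursion of Jaakkola and Jordan can be read as a true EM algorithm, with the fixed point of the iteration a posterior mode rather than a variational approximation.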


