
A note on quadratic approximations of logistic log-likelihoods

by Daniele Durante, et al.
Università Bocconi

Quadratic approximations of logistic log-likelihoods are fundamental to facilitating estimation and inference for binary variables. Although the classical expansions underlying Newton-Raphson and Fisher scoring methods have attracted most of the interest, there has also been a recent focus on quadratic bounds that uniformly minorize the logistic log-likelihood and are tangent to it at a specific point. Compared to the classical Taylor expansion of the score function, these approximations provide iterative estimation procedures which guarantee monotonicity of the log-likelihood sequence, and motivate variational methods for Bayesian inference. A relevant contribution, within this class of approximations, relies on a convex duality argument to derive a tractable family of tangent quadratic expansions indexed by a location parameter. Although this approximation is widely used in practice, fewer attempts have been made to understand its probabilistic justification and its associated properties. To address this gap, we formally relate this quadratic lower bound to a recent Pólya-gamma data augmentation, showing that the approximation error associated with the bound coincides with the Kullback-Leibler divergence between a generic Pólya-gamma variable and the one obtained by conditioning on the observed response data. This result facilitates the study of the optimality properties associated with the minorize-majorize and variational Bayes routines leveraging this quadratic bound, and motivates a novel mean-field variational Bayes approach for logistic regression.
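The tangent quadratic expansion referred to above is commonly attributed to Jaakkola and Jordan: for a location parameter ξ ≠ 0, the logistic log-likelihood term log σ(x) is minorized by log σ(ξ) + (x − ξ)/2 − λ(ξ)(x² − ξ²), with λ(ξ) = tanh(ξ/2)/(4ξ), and the bound is tight at x = ±ξ. A minimal numerical sketch of these two properties (the function names and the grid are illustrative, not from the paper):

```python
import numpy as np

def log_sigmoid(x):
    # Numerically stable log σ(x) = -log(1 + exp(-x)).
    return -np.logaddexp(0.0, -x)

def jj_lower_bound(x, xi):
    # Jaakkola-Jordan tangent quadratic lower bound on log σ(x),
    # indexed by the location parameter ξ (assumes ξ != 0).
    lam = np.tanh(xi / 2.0) / (4.0 * xi)  # λ(ξ)
    return log_sigmoid(xi) + (x - xi) / 2.0 - lam * (x**2 - xi**2)

xi = 1.5
xs = np.linspace(-6.0, 6.0, 2001)
gap = log_sigmoid(xs) - jj_lower_bound(xs, xi)

# The bound minorizes log σ everywhere (gap is nonnegative up to rounding)
# and is tangent at x = ±ξ (gap vanishes there).
print(gap.min())
print(log_sigmoid(xi) - jj_lower_bound(xi, xi))
print(log_sigmoid(-xi) - jj_lower_bound(-xi, xi))
```

The gap printed here is exactly the quantity the paper identifies with a Kullback-Leibler divergence between Pólya-gamma variables.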

