Low Complexity Approximate Bayesian Logistic Regression for Sparse Online Learning

01/28/2021
by Gil I. Shamir, et al.

Theoretical results show that Bayesian methods can achieve lower bounds on regret for online logistic regression. In practice, however, such techniques may not be feasible, especially for very large feature sets. Approximations must be used instead, and for huge sparse feature sets these approximations diminish the theoretical advantages. They often apply stochastic gradient methods whose hyper-parameters must be tuned on some surrogate loss, defeating the theoretical advantages of Bayesian methods, and the surrogate loss, defined to approximate the mixture, requires techniques such as Monte Carlo sampling that increase the computation per example. We propose low-complexity analytical approximations for sparse online logistic and probit regression. Unlike variational inference and other methods, our methods use analytical closed forms, substantially lowering computation. Unlike dense solutions, such as Gaussian mixtures, our methods handle sparse problems with huge feature sets without increasing complexity. With the analytical closed forms, there is also no need to apply stochastic gradient methods to surrogate losses, nor to tune and balance learning and regularization hyper-parameters. Empirical results exceed the performance of the more computationally involved methods, while, like those methods, ours still provide per-feature and per-example uncertainty measures.
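To make the idea of closed-form, per-feature posterior updates for sparse online probit regression concrete, here is a minimal sketch. It is not the paper's algorithm: it uses a diagonal Gaussian posterior over the weights and the standard closed-form truncated-Gaussian moment updates, touching only the features active in each example so that huge sparse feature sets stay cheap. The class name, prior variance, noise scale `beta`, and dictionary-based storage are all assumptions made for illustration.

```python
# Sketch of closed-form online Bayesian probit regression with a diagonal
# Gaussian posterior over weights (illustrative only, not the paper's method).
from collections import defaultdict
import math
from scipy.stats import norm

class SparseOnlineProbit:
    def __init__(self, prior_var=1.0, beta=1.0):
        self.beta = beta
        self.mean = defaultdict(float)              # posterior means m_i
        self.var = defaultdict(lambda: prior_var)   # posterior variances s_i^2

    def _moments(self, x):
        """Predictive mean and total variance for sparse features x = {id: value}."""
        mu = sum(self.mean[i] * v for i, v in x.items())
        sigma2 = self.beta ** 2 + sum(self.var[i] * v * v for i, v in x.items())
        return mu, sigma2

    def predict(self, x):
        """Returns P(y = +1 | x); the variance term is the per-example uncertainty."""
        mu, sigma2 = self._moments(x)
        return norm.cdf(mu / math.sqrt(sigma2))

    def update(self, x, y):
        """One analytical posterior update for label y in {-1, +1}."""
        mu, sigma2 = self._moments(x)
        sigma = math.sqrt(sigma2)
        t = y * mu / sigma
        v_t = norm.pdf(t) / norm.cdf(t)             # truncated-Gaussian moments
        w_t = v_t * (v_t + t)
        for i, val in x.items():                    # only active features change
            self.mean[i] += y * val * (self.var[i] / sigma) * v_t
            self.var[i] *= max(1.0 - (val * val * self.var[i] / sigma2) * w_t, 1e-12)

# Usage: a single pass over a stream of (sparse features, label) pairs.
model = SparseOnlineProbit()
stream = [({0: 1.0, 7: 1.0}, +1), ({0: 1.0, 3: 1.0}, -1)]
for features, label in stream:
    p = model.predict(features)   # probability and uncertainty before the update
    model.update(features, label)
```

Because every step above is a closed-form expression, there is no surrogate loss, no Monte Carlo sampling, and no learning-rate tuning; each example costs time proportional to its number of active features.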

