Affine-invariant ensemble transform methods for logistic regression

by Jakiw Pidstrigach et al.

We investigate the application of ensemble transform approaches to Bayesian inference for logistic regression problems. Our approach relies on appropriate extensions of the popular ensemble Kalman filter and the feedback particle filter to the cross-entropy loss function, and is based on a well-established homotopy approach to Bayesian inference. The resulting finite-particle evolution equations, as well as their mean-field limits, are affine-invariant. Furthermore, the proposed methods can be implemented in a gradient-free manner in the case of nonlinear logistic regression, and the data can be randomly subsampled, similarly to mini-batching in stochastic gradient descent. We also propose a closely related SDE-based sampling method which is again affine-invariant and can easily be made gradient-free. Numerical examples demonstrate the suitability of the proposed methodologies.
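To illustrate the gradient-free flavor of such methods, the following is a minimal sketch (not the paper's actual algorithm) of one homotopy step of an ensemble-Kalman-style update for Bayesian logistic regression: particle-to-prediction cross-covariances replace gradients of the cross-entropy loss. The function names, step size `h`, and regularization `gamma` are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def eki_step(W, X, y, h=0.1, gamma=1.0):
    """One gradient-free ensemble Kalman update (illustrative sketch).

    W : (J, d) ensemble of weight particles
    X : (n, d) design matrix
    y : (n,)   binary labels in {0, 1}
    """
    J, n = W.shape[0], len(y)
    G = sigmoid(X @ W.T).T                        # (J, n) predicted probabilities
    dW = W - W.mean(axis=0)                       # centered particles
    dG = G - G.mean(axis=0)                       # centered predictions
    C_wg = dW.T @ dG / J                          # (d, n) cross-covariance
    S = dG.T @ dG / J + (gamma / h) * np.eye(n)   # (n, n) regularized prediction cov.
    # Kalman-style correction: no gradient of the loss is ever evaluated
    return W + (y - G) @ np.linalg.solve(S, C_wg.T)
```

Iterating this step transports a prior ensemble toward the posterior; because the update is built from ensemble covariances only, it requires no derivatives of the forward map, and subsampling rows of `(X, y)` per step mimics mini-batching.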



