Affine-invariant ensemble transform methods for logistic regression

04/16/2021
by Jakiw Pidstrigach, et al.

We investigate the application of ensemble transform approaches to Bayesian inference in logistic regression problems. Our approach relies on appropriate extensions of the popular ensemble Kalman filter and the feedback particle filter to the cross-entropy loss function, and is based on a well-established homotopy approach to Bayesian inference. The resulting finite-particle evolution equations, as well as their mean-field limits, are affine-invariant. Furthermore, the proposed methods can be implemented in a gradient-free manner in the case of nonlinear logistic regression, and the data can be randomly subsampled in a manner similar to mini-batching in stochastic gradient descent. We also propose a closely related SDE-based sampling method which is again affine-invariant and can easily be made gradient-free. Numerical examples demonstrate the suitability of the proposed methodologies.
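As an illustration only, the sketch below implements a plain ensemble Kalman inversion homotopy for Bayesian logistic regression in Python/NumPy. It is not the paper's cross-entropy extension: the Bernoulli likelihood is replaced by a Gaussian surrogate with noise level gamma, and the ensemble size, step count, and mini-batch size are arbitrary choices made for this example. It does, however, exhibit the two ingredients highlighted in the abstract: the particle update uses only ensemble covariances (no gradients of the forward map), and the data are randomly subsampled at every homotopy step.

```python
# Minimal sketch (not the authors' exact scheme): gradient-free ensemble
# Kalman inversion with a homotopy from prior (s=0) to posterior (s=1),
# applied to Bayesian logistic regression. Ensemble size, step count,
# mini-batch size, and the Gaussian-surrogate noise level gamma are
# assumptions made for this example only.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic logistic-regression data.
n, d = 200, 3
X = rng.normal(size=(n, d))
theta_true = np.array([1.5, -2.0, 0.5])
y = (rng.uniform(size=n) < sigmoid(X @ theta_true)).astype(float)

# Ensemble of particles drawn from a Gaussian prior N(0, I).
J = 100                      # ensemble size (assumption)
theta = rng.normal(size=(J, d))

n_steps = 50                 # homotopy steps from prior to posterior
ds = 1.0 / n_steps
gamma = 1.0                  # surrogate observation-noise level (assumption)
batch = 50                   # mini-batch size, analogous to SGD subsampling

for _ in range(n_steps):
    idx = rng.choice(n, size=batch, replace=False)   # random data subsampling
    Xb, yb = X[idx], y[idx]

    # Forward map evaluated at each particle: predicted class probabilities.
    H = sigmoid(theta @ Xb.T)                        # shape (J, batch)

    # Ensemble (cross-)covariances; no gradients of the forward map needed.
    theta_c = theta - theta.mean(axis=0)
    H_c = H - H.mean(axis=0)
    C_th = theta_c.T @ H_c / J                       # (d, batch)
    C_hh = H_c.T @ H_c / J                           # (batch, batch)

    # Kalman-type gain for one homotopy step of length ds.
    K = C_th @ np.linalg.solve(C_hh + gamma / ds * np.eye(batch), np.eye(batch))

    # Transform each particle toward the data (gradient-free update).
    theta = theta + (yb - H) @ K.T

print("posterior mean estimate:", theta.mean(axis=0))
print("true parameters:       ", theta_true)
```

Adapting this sketch to the setting of the paper would amount to replacing the Gaussian-surrogate gain with the cross-entropy-based ensemble transform described there; the gradient-free structure and the mini-batched data access carry over unchanged.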
