Quasi-Bayes properties of a recursive procedure for mixtures

02/27/2019
by Sandra Fortini, et al.

Bayesian methods are attractive and often optimal, yet today's pressure for fast computation, especially with streaming data and online learning, brings renewed interest in faster, though possibly sub-optimal, solutions. To what extent such algorithms may approximate a Bayesian solution is a problem of interest, and one not always solved. Against this background, in this paper we revisit a sequential procedure proposed by Smith and Makov (1978) for unsupervised learning and classification in finite mixtures, and developed by Newton and Zhang (1999) for nonparametric mixtures. Newton's algorithm is simple, fast, and theoretically intriguing. Although originally proposed as an approximation of the Bayesian solution, its quasi-Bayes properties have remained unclear. We propose a novel methodological approach: we regard the algorithm as a probabilistic learning rule that implicitly defines an underlying probabilistic model, and we find that model. We can then prove that it is, asymptotically, a Bayesian, exchangeable mixture model. Moreover, while the algorithm only offers a point estimate, our approach allows us to obtain an asymptotic posterior distribution and asymptotic credible intervals for the mixing distribution. Our results also provide practical hints for tuning the algorithm to obtain desirable properties, as we illustrate in a simulation study. Beyond mixture models, our study suggests a theoretical framework that may be of interest for recursive quasi-Bayes methods in other settings.
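The abstract does not reproduce the algorithm itself. For context, the recursive rule in question (often called Newton's predictive recursion) updates a running estimate of the mixing distribution with each new observation. Below is a minimal sketch, assuming a Gaussian kernel, a fixed grid discretization of the mixing distribution, and the common weight choice w_i = 1/(i+1); the function name `newton_recursion` and all parameter choices are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def newton_recursion(y, grid, weights=None, sigma=1.0):
    """Sketch of Newton's recursive (predictive-recursion) estimate of a
    mixing density, discretized on a fixed grid, for the Gaussian kernel
    k(y | theta) = N(y; theta, sigma^2).

    Illustrative assumptions (not prescribed by the paper): uniform
    initial guess G_0, grid discretization, weights w_i = 1/(i+1).
    """
    n = len(y)
    if weights is None:
        weights = 1.0 / (np.arange(1, n + 1) + 1.0)      # w_i = 1/(i+1)
    dtheta = grid[1] - grid[0]
    g = np.full(grid.shape, 1.0 / (grid[-1] - grid[0]))  # uniform G_0
    for i, yi in enumerate(y):
        k = norm.pdf(yi, loc=grid, scale=sigma)          # kernel k(y_i | theta)
        m = np.sum(k * g) * dtheta                       # predictive density of y_i
        # Recursive update: convex combination of the current estimate
        # and its Bayes-style revision given y_i.
        g = (1.0 - weights[i]) * g + weights[i] * k * g / m
    return g

# Example: data from a two-component Gaussian mixture.
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(-2, 1, 500), rng.normal(2, 1, 500)])
rng.shuffle(y)  # the estimate depends on the arrival order of the data
grid = np.linspace(-6, 6, 400)
g_hat = newton_recursion(y, grid)
```

Note that the output depends on the order in which the observations arrive, so the implied learning rule is not exchangeable for finite samples; the paper's result is that exchangeability, and hence quasi-Bayes behavior, holds asymptotically. The weight sequence w_i is the natural tuning knob to which the paper's practical hints would apply.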


