Optimal Bayesian estimation of Gaussian mixtures with growing number of components

07/17/2020
by Ilsang Ohn, et al.

We study posterior concentration properties of Bayesian procedures for estimating finite Gaussian mixtures in which the number of components is unknown and allowed to grow with the sample size. Under this general setup, we derive a series of new theoretical results. Specifically, we first show that under mild conditions on the prior, the posterior distribution concentrates around the true mixing distribution at a near-optimal rate with respect to the Wasserstein distance. Under a separation condition on the true mixing distribution, we further show that a better, adaptive convergence rate can be achieved, and that the number of components can be consistently estimated. Furthermore, we derive optimal convergence rates for higher-order mixture models in which the number of components diverges arbitrarily fast. In addition, we consider the fractional posterior and investigate its contraction rates, which are also shown to be minimax optimal for estimating the mixing distribution under mild conditions. We also investigate Bayesian estimation of general mixtures under strong identifiability conditions and derive the optimal convergence rates when the number of components is fixed. Lastly, we study theoretical properties of the posterior under the popular Dirichlet process (DP) mixture prior, and show that such a model can provide a reasonable estimate of the number of components while guaranteeing only a slow convergence rate for estimating the mixing distribution.
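
For concreteness, here is a standard formulation of the two objects the abstract refers to, following common conventions in the mixture literature; the notation is illustrative and not taken from the paper itself:

```latex
% Mixing measure with k components: G = \sum_{j=1}^{k} p_j \delta_{\theta_j}.
% Wasserstein distance of order r between mixing measures G and G',
% minimizing over couplings q of the weight vectors p and p':
W_r(G, G') = \Big( \inf_{q \in \mathcal{Q}(p,\, p')}
    \sum_{i,j} q_{ij}\, \| \theta_i - \theta'_j \|^r \Big)^{1/r}

% Fractional posterior with power \alpha \in (0,1): the likelihood is
% tempered before being combined with the prior \Pi:
\Pi_\alpha(dG \mid X_1, \dots, X_n) \propto
    \prod_{i=1}^{n} p_G(X_i)^{\alpha} \, \Pi(dG)
```

Convergence rates "with respect to the Wasserstein distance" are thus statements about how fast the posterior mass concentrates on mixing measures close to the true one in this metric.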
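As a purely illustrative sketch of the abstract's last point, the snippet below uses scikit-learn's variational BayesianGaussianMixture with a (truncated) Dirichlet-process weight prior to produce a rough estimate of the number of components. This is not the paper's procedure, and the concentration parameter and weight threshold are arbitrary choices:

```python
# Minimal sketch: counting effective components under a truncated DP
# mixture fit. Not the paper's method; hyperparameters are assumptions.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Simulated data: a 3-component univariate Gaussian mixture.
X = np.concatenate([
    rng.normal(-4.0, 1.0, 300),
    rng.normal(0.0, 1.0, 300),
    rng.normal(4.0, 1.0, 300),
]).reshape(-1, 1)

# n_components is an upper bound on the truncation level, not the estimate.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=1.0,  # DP concentration (illustrative choice)
    max_iter=500,
    random_state=0,
).fit(X)

# Components with non-negligible fitted weight give a crude component
# count; the 0.01 threshold is arbitrary.
n_effective = int(np.sum(dpgmm.weights_ > 0.01))
print("estimated number of components:", n_effective)
```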
