
Consistent Estimation of Identifiable Nonparametric Mixture Models from Grouped Observations

06/12/2020
by Alexander Ritchie, et al.

Recent research has established sufficient conditions for finite mixture models to be identifiable from grouped observations. These conditions allow the mixture components to be nonparametric and have substantial (or even total) overlap. This work proposes an algorithm that consistently estimates any identifiable mixture model from grouped observations. Our analysis leverages an oracle inequality for weighted kernel density estimators of the distribution on groups, together with a general result showing that consistent estimation of the distribution on groups implies consistent estimation of mixture components. A practical implementation is provided for paired observations, and the approach is shown to outperform existing methods, especially when mixture components overlap significantly.
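The abstract describes estimating the distribution on groups with weighted kernel density estimators and then recovering the mixture components, with a practical implementation for paired observations. The Python sketch below is a minimal, hypothetical illustration of that general idea, not the authors' algorithm: it simulates heavily overlapping paired data, factorizes a Gaussian kernel Gram matrix between pair members with NMF to obtain soft component weights, and forms weighted KDEs of the components. The two-Gaussian setup, the bandwidth h, the rank-2 NMF, and all parameter choices are illustrative assumptions.

```python
# Hypothetical sketch only: recover overlapping mixture components from
# paired observations via a kernel Gram matrix and a nonnegative
# factorization. Not the paper's algorithm; all settings are assumptions.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Simulate paired data: both members of a pair share one latent component.
n_pairs = 500
weights = np.array([0.6, 0.4])
means = np.array([-1.0, 1.0])                    # unit variance, heavy overlap
z = rng.choice(2, size=n_pairs, p=weights)       # latent component per pair
pairs = rng.normal(means[z][:, None], 1.0, size=(n_pairs, 2))

# Gaussian kernel Gram matrix between first and second pair members,
# a crude surrogate for an estimated density on pairs.
h = 0.5                                          # bandwidth (assumed, untuned)
d2 = (pairs[:, 0][:, None] - pairs[None, :, 1]) ** 2
K = np.exp(-d2 / (2 * h**2))

# Rank-2 nonnegative factorization: rows of W act as soft component
# weights for the first pair members.
W = NMF(n_components=2, init="nndsvda", max_iter=500,
        random_state=0).fit_transform(K)
W /= W.sum(axis=1, keepdims=True) + 1e-12

# Weighted kernel density estimate of each component on a grid.
grid = np.linspace(-4.0, 4.0, 200)
kernels = np.exp(-(grid[None, :] - pairs[:, 0][:, None]) ** 2 / (2 * h**2))
components = W.T @ kernels                       # shape (2, 200)
components /= components.sum(axis=1, keepdims=True) * (grid[1] - grid[0])
print(components.shape)                          # two estimated component densities
```

In the paper's framework, consistency comes from an oracle inequality for the weighted kernel density estimators; the toy code above only shows the mechanics of weighting a KDE by group-derived responsibilities, under the assumptions stated in the comments.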

Related research
06/30/2016

An Operator Theoretic Approach to Nonparametric Mixture Models

When estimating finite mixture models, it is common to make assumptions ...
07/22/2022

Generalized Identifiability Bounds for Mixture Models with Grouped Samples

Recent work has shown that finite mixture models with m components are i...
06/23/2014

Exact fit of simple finite mixture models

How to forecast next year's portfolio-wide credit default rate based on ...
02/19/2020

A censored mixture model for modeling risk taking

Risk behavior can have substantial consequences for health, well-being, ...
02/09/2020

On the Bias of the Score Function of Finite Mixture Models

We characterize the unbiasedness of the score function, viewed as an inf...
07/30/2020

Adaptive nonparametric estimation of a component density in a two-class mixture model

A two-class mixture model, where the density of one of the components is...
05/03/2021

Consistent Density Estimation Under Discrete Mixture Models

This work considers a problem of estimating a mixing probability density...