
Consistent Estimation of Identifiable Nonparametric Mixture Models from Grouped Observations

by Alexander Ritchie et al.

Recent research has established sufficient conditions for finite mixture models to be identifiable from grouped observations. These conditions allow the mixture components to be nonparametric and have substantial (or even total) overlap. This work proposes an algorithm that consistently estimates any identifiable mixture model from grouped observations. Our analysis leverages an oracle inequality for weighted kernel density estimators of the distribution on groups, together with a general result showing that consistent estimation of the distribution on groups implies consistent estimation of mixture components. A practical implementation is provided for paired observations, and the approach is shown to outperform existing methods, especially when mixture components overlap significantly.
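As a rough illustration of one ingredient mentioned in the abstract, a weighted kernel density estimator over groups of paired observations can be sketched as follows. This is not the paper's estimator: the function name `weighted_kde`, the product Gaussian kernel, and the uniform weights in the usage example are all illustrative assumptions.

```python
import numpy as np

def weighted_kde(pairs, weights, query, bandwidth=0.3):
    """Weighted Gaussian kernel density estimate on paired observations.

    pairs:     (n, 2) array, each row one group of two observations
    weights:   (n,) nonnegative weights summing to 1
    query:     (m, 2) array of points at which to evaluate the density
    bandwidth: kernel bandwidth (illustrative fixed value, not data-driven)
    """
    # Pairwise differences between query points and observed pairs: (m, n, 2)
    diffs = query[:, None, :] - pairs[None, :, :]
    sq_dist = (diffs ** 2).sum(axis=-1)          # squared Euclidean distances
    kern = np.exp(-sq_dist / (2 * bandwidth ** 2))
    kern /= 2 * np.pi * bandwidth ** 2           # 2-D Gaussian normalization
    return kern @ weights                        # weighted average: (m,)

# Usage: paired samples where both members of a pair share a mixture component,
# mimicking the grouped-observation setting; uniform weights for simplicity.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=200)            # latent component per group
pairs = rng.normal(loc=labels[:, None], scale=1.0, size=(200, 2))
weights = np.full(200, 1 / 200)
density = weighted_kde(pairs, weights, np.array([[0.0, 0.0], [0.5, 0.5]]))
```

The estimate targets the joint density on groups; the paper's analysis shows that consistently estimating this distribution on groups is enough to consistently recover the mixture components themselves.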


An Operator Theoretic Approach to Nonparametric Mixture Models

When estimating finite mixture models, it is common to make assumptions ...

Generalized Identifiability Bounds for Mixture Models with Grouped Samples

Recent work has shown that finite mixture models with m components are i...

Exact fit of simple finite mixture models

How to forecast next year's portfolio-wide credit default rate based on ...

A censored mixture model for modeling risk taking

Risk behavior can have substantial consequences for health, well-being, ...

On the Bias of the Score Function of Finite Mixture Models

We characterize the unbiasedness of the score function, viewed as an inf...

Adaptive nonparametric estimation of a component density in a two-class mixture model

A two-class mixture model, where the density of one of the components is...

Consistent Density Estimation Under Discrete Mixture Models

This work considers a problem of estimating a mixing probability density...