
Families of Parsimonious Finite Mixtures of Regression Models

by Utkarsh J. Dang, et al.
University of Guelph

Finite mixtures of regression models offer a flexible framework for investigating heterogeneity in data with functional dependencies. These models can be conveniently used for unsupervised learning on data with clear regression relationships. We extend such models by imposing an eigen-decomposition on the multivariate error covariance matrix. By constraining parts of this decomposition, we obtain families of parsimonious mixtures of regressions and mixtures of regressions with concomitant variables. These families of models account for correlations between multiple responses. An expectation-maximization algorithm is presented for parameter estimation and performance is illustrated on simulated and real data.
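To make the abstract's approach concrete, below is a minimal, self-contained sketch of EM for a finite mixture of linear regressions in the simplest setting: a single response with component-specific Gaussian errors. This is an illustration of the general technique only, not the paper's parsimonious multivariate families (no eigen-decomposed error covariance, no concomitant variables); the function name and initialization scheme are my own.

```python
import numpy as np

def em_mixture_of_regressions(X, y, G=2, n_iter=100, seed=0):
    """EM for a G-component mixture of univariate linear regressions.

    Illustrative sketch: each component g has coefficients beta[g]
    and error variance sigma2[g]; mixing proportions pi sum to one.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    Xb = np.hstack([np.ones((n, 1)), X])          # design with intercept
    beta = rng.normal(size=(G, p + 1))            # random start
    sigma2 = np.ones(G)
    pi = np.full(G, 1.0 / G)
    for _ in range(n_iter):
        # E-step: posterior component memberships z[i, g]
        resid = y[:, None] - Xb @ beta.T          # (n, G) residuals
        logdens = -0.5 * np.log(2 * np.pi * sigma2) - 0.5 * resid**2 / sigma2
        logw = np.log(pi) + logdens
        logw -= logw.max(axis=1, keepdims=True)   # stabilize before exp
        z = np.exp(logw)
        z /= z.sum(axis=1, keepdims=True)
        # M-step: weighted least squares per component
        for g in range(G):
            w = z[:, g]
            Wx = Xb * w[:, None]
            beta[g] = np.linalg.solve(Xb.T @ Wx, Wx.T @ y)
            r = y - Xb @ beta[g]
            sigma2[g] = max((w * r**2).sum() / w.sum(), 1e-8)  # variance floor
        pi = z.mean(axis=0)
    return beta, sigma2, pi, z
```

In the multivariate setting of the paper, the scalar variances `sigma2[g]` become full error covariance matrices, and the parsimonious families arise by constraining the components of their eigen-decomposition in the M-step.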

