Families of Parsimonious Finite Mixtures of Regression Models

12/02/2013
by Utkarsh J. Dang, et al.
University of Guelph

Finite mixtures of regression models offer a flexible framework for investigating heterogeneity in data with functional dependencies. These models can be conveniently used for unsupervised learning on data with clear regression relationships. We extend such models by imposing an eigen-decomposition on the multivariate error covariance matrix. By constraining parts of this decomposition, we obtain families of parsimonious mixtures of regressions and mixtures of regressions with concomitant variables. These families of models account for correlations between multiple responses. An expectation-maximization algorithm is presented for parameter estimation and performance is illustrated on simulated and real data.
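The EM algorithm mentioned in the abstract can be illustrated with a much-simplified sketch: a univariate mixture of linear regressions, where each component has its own coefficient vector, scalar noise variance, and mixing weight. This is an assumption-laden toy version for intuition only; the paper's families instead impose constrained eigen-decompositions on a multivariate error covariance matrix, which this sketch does not implement.

```python
import numpy as np

def em_mixture_regression(X, y, K=2, n_iter=100, seed=0):
    """Fit a K-component mixture of linear regressions via EM.

    Simplified univariate sketch: component k has coefficients beta_k,
    scalar noise variance sigma2_k, and mixing weight pi_k. (The paper's
    parsimonious families constrain an eigen-decomposed multivariate
    error covariance; here the error is a single scalar variance.)
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Initialize with random responsibilities (uniform on the simplex)
    resp = rng.dirichlet(np.ones(K), size=n)
    beta = np.zeros((K, p))
    sigma2 = np.ones(K)
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # M-step: weighted least squares and weighted variance per component
        for k in range(K):
            w = resp[:, k]
            beta[k] = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
            res = y - X @ beta[k]
            sigma2[k] = (w * res ** 2).sum() / w.sum()
            pi[k] = w.mean()
        # E-step: responsibilities from Gaussian log-densities of residuals
        log_dens = np.empty((n, K))
        for k in range(K):
            res = y - X @ beta[k]
            log_dens[:, k] = (np.log(pi[k])
                              - 0.5 * np.log(2 * np.pi * sigma2[k])
                              - 0.5 * res ** 2 / sigma2[k])
        log_dens -= log_dens.max(axis=1, keepdims=True)  # numerical stability
        resp = np.exp(log_dens)
        resp /= resp.sum(axis=1, keepdims=True)
    return beta, sigma2, pi

# Simulated example: two regression lines, y = 1 + 2x and y = -1 - 2x
rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, 400)
z = rng.integers(0, 2, 400)
y = np.where(z == 0, 1 + 2 * x, -1 - 2 * x) + rng.normal(0, 0.3, 400)
X = np.column_stack([np.ones_like(x), x])
beta, sigma2, pi = em_mixture_regression(X, y, K=2)
```

With well-separated components, the recovered slopes should be close to +2 and -2 (up to label switching). Extending this to multivariate responses, where the per-component error covariance is eigen-decomposed and selectively constrained, is what yields the parsimonious families the paper studies.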

