Dimension Correction for Hierarchical Latent Class Models

12/12/2012
by Tomas Kocka, et al.

Model complexity is an important factor to consider when selecting among graphical models. When all variables are observed, the complexity of a model can be measured by its standard dimension, i.e. the number of independent parameters. When hidden variables are present, however, standard dimension might no longer be appropriate. One should instead use effective dimension (Geiger et al. 1996). This paper is concerned with the computation of effective dimension. First we present an upper bound on the effective dimension of a latent class (LC) model. This bound is tight and its computation is easy. We then consider a generalization of LC models called hierarchical latent class (HLC) models (Zhang 2002). We show that the effective dimension of an HLC model can be obtained from the effective dimensions of some related LC models. We also demonstrate empirically that using effective dimension in place of standard dimension improves the quality of models learned from data.
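To make the two notions concrete: the standard dimension of an LC model simply counts free parameters, while the effective dimension is the generic rank of the Jacobian of the map from parameters to the joint distribution over the observed variables (Geiger et al. 1996). The sketch below estimates that rank numerically by finite differences at random interior parameter points; it is an illustrative approximation under assumed function names and step sizes, not the tight bound or the HLC decomposition developed in this paper.

```python
import numpy as np

def standard_dim(r, cards):
    """Standard dimension of an LC model: one latent variable with r states,
    observed variables with cardinalities `cards`."""
    return (r - 1) + r * sum(c - 1 for c in cards)

def lc_joint(free, r, cards):
    """Map a vector of free parameters to the joint distribution
    over the observed variables (returned as a flat array)."""
    free = np.asarray(free, dtype=float)
    k = r - 1
    pi = np.append(free[:k], 1.0 - free[:k].sum())       # latent prior P(H)
    pos = k
    thetas = []
    for c in cards:
        block = free[pos:pos + r * (c - 1)].reshape(r, c - 1)
        pos += r * (c - 1)
        last = 1.0 - block.sum(axis=1, keepdims=True)
        thetas.append(np.hstack([block, last]))           # P(X_i | H), rows sum to 1
    # P(x) = sum_h P(h) * prod_i P(x_i | h)
    joint = np.zeros(cards)
    for h in range(r):
        outer = pi[h]
        for t in thetas:
            outer = np.multiply.outer(outer, t[h])
        joint += outer
    return joint.ravel()

def effective_dim(r, cards, trials=5, eps=1e-6, seed=0):
    """Estimate the effective dimension as the maximal Jacobian rank of
    lc_joint over a few random interior parameter points."""
    rng = np.random.default_rng(seed)
    d = standard_dim(r, cards)
    best = 0
    for _ in range(trials):
        # draw parameters safely inside the simplex boundaries
        x0 = rng.uniform(0.2, 0.8, size=d) / max(r, max(cards))
        f0 = lc_joint(x0, r, cards)
        J = np.empty((f0.size, d))
        for j in range(d):           # forward finite differences, column by column
            x = x0.copy()
            x[j] += eps
            J[:, j] = (lc_joint(x, r, cards) - f0) / eps
        best = max(best, np.linalg.matrix_rank(J, tol=1e-4))
    return best
```

For example, with a binary latent variable and two binary observed variables the standard dimension is 5, yet the Jacobian rank (and hence the effective dimension) is only 3, since the joint distribution lives in a 3-dimensional simplex; this is the kind of deficiency that motivates replacing standard dimension with effective dimension in model selection.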

