Marginalization in Composed Probabilistic Models

01/16/2013 ∙ by Radim Jirousek, et al.

Composition of low-dimensional distributions, whose foundations were laid in the paper published in the Proceedings of UAI'97 (Jirousek 1997), appeared to be an alternative apparatus for describing multidimensional probabilistic models. In contrast to Graphical Markov Models, which define multidimensional distributions in a declarative way, this approach is rather procedural. Ordering the low-dimensional distributions into a proper sequence fully defines the respective computational procedure; therefore, the study of different types of generating sequences is one of the central problems in this field. It appears that an important role is played by special sequences called perfect; their main characterization theorems are presented in this paper. However, the main result of this paper is a solution to the problem of marginalization for general sequences. The main theorem describes a way to obtain a generating sequence that defines the model corresponding to the marginal of the distribution defined by an arbitrary generating sequence. From this theorem the reader can see to what extent these computations are local; i.e., the sequence consists of marginal distributions whose computation must be made by summing over the values of the eliminated variable (the paper deals with finite models).
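As a rough illustration of the apparatus the abstract refers to, the sketch below implements the basic (right) composition operator for finite discrete distributions, (f ▷ g)(x) = f(x_K) · g(x_L) / g↓K∩L(x_K∩L), together with marginalization by summing over eliminated variables. This is a minimal sketch under assumed conventions (distributions as dicts from value tuples to probabilities, variable lists as keys); it is not the paper's own code, and the function names `compose` and `marginal` are illustrative.

```python
def marginal(dist, vars_, keep):
    """Marginalize dist (dict: value-tuple over vars_ -> prob) down to `keep`
    by summing over the values of the eliminated variables."""
    idx = [vars_.index(v) for v in keep]
    out = {}
    for assignment, p in dist.items():
        key = tuple(assignment[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def compose(f, f_vars, g, g_vars):
    """Right composition f |> g over the union of the two variable sets:
    (f |> g)(x) = f(x_K) * g(x_L) / g_marg(x_{K∩L}),
    defined where the overlap marginal of g is positive (assumed here)."""
    common = [v for v in f_vars if v in g_vars]
    extras = [v for v in g_vars if v not in f_vars]
    g_common = marginal(g, g_vars, common)
    out = {}
    for fa, fp in f.items():
        for ga, gp in g.items():
            # keep only pairs that agree on the shared variables
            if all(fa[f_vars.index(v)] == ga[g_vars.index(v)] for v in common):
                denom = g_common[tuple(fa[f_vars.index(v)] for v in common)]
                joint = fa + tuple(ga[g_vars.index(v)] for v in extras)
                out[joint] = fp * gp / denom
    return out, f_vars + extras
```

A generating sequence is then evaluated by folding `compose` left to right over its low-dimensional distributions. Note the defining property of the operator: marginalizing f ▷ g back onto the variables of f recovers f exactly.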





