Structural Learning of Probabilistic Sentential Decision Diagrams under Partial Closed-World Assumption

07/26/2021
by   Alessandro Antonucci, et al.

Probabilistic sentential decision diagrams are a class of structured-decomposable probabilistic circuits specifically designed to embed logical constraints. To adapt the classical LearnSPN scheme to learn the structure of these models, we propose a new scheme based on a partial closed-world assumption: the data implicitly provide the logical base of the circuit. Sum nodes are thus learned by recursively clustering batches of the initial data base, while the partitioning of the variables obeys a given input vtree. Preliminary experiments show that the proposed approach can properly fit training data and generalize well to test data, provided that the latter remain consistent with the underlying logical base, which is a relaxation of the training data base.
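To make the recursive scheme concrete, the sketch below shows a minimal LearnSPN-style recursion guided by a vtree, under the assumption that rows are clustered to form sum nodes and columns are split according to the vtree to form product nodes. All names (Vtree, learn_node, the tuple-based node encoding) and the use of k-means for clustering are illustrative assumptions, not the authors' implementation; the actual learning of the logical base under the partial closed-world assumption is not reproduced here.

```python
# Hypothetical sketch of a LearnSPN-style recursion guided by a vtree.
# Rows of the binary data matrix are clustered to create sum nodes;
# columns are partitioned following the vtree to create product nodes.
import numpy as np
from sklearn.cluster import KMeans


class Vtree:
    """Binary tree over variable indices; leaves hold a single variable."""
    def __init__(self, var=None, left=None, right=None):
        self.var, self.left, self.right = var, left, right

    @property
    def is_leaf(self):
        return self.var is not None

    def variables(self):
        if self.is_leaf:
            return [self.var]
        return self.left.variables() + self.right.variables()


def learn_node(data, vtree, min_rows=8, n_clusters=2):
    """Recursively build a (sum/product/leaf) structure from binary data."""
    if vtree.is_leaf:
        # Leaf: empirical Bernoulli parameter for the single variable,
        # estimated only from the values observed in the current batch.
        return ("leaf", vtree.var, data[:, vtree.var].mean())

    if len(data) <= min_rows:
        # Small batch: stop clustering and factorize along the vtree split.
        return ("product",
                learn_node(data, vtree.left, min_rows, n_clusters),
                learn_node(data, vtree.right, min_rows, n_clusters))

    # Sum node: cluster the rows of the batch (restricted to the variables
    # in scope); each non-empty cluster becomes a child weighted by size.
    scope = vtree.variables()
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(data[:, scope])
    children = []
    for k in range(n_clusters):
        batch = data[labels == k]
        if len(batch) == 0:
            continue
        weight = len(batch) / len(data)
        child = ("product",
                 learn_node(batch, vtree.left, min_rows, n_clusters),
                 learn_node(batch, vtree.right, min_rows, n_clusters))
        children.append((weight, child))
    return ("sum", children)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.integers(0, 2, size=(64, 4))          # toy binary data set
    vt = Vtree(left=Vtree(left=Vtree(var=0), right=Vtree(var=1)),
               right=Vtree(left=Vtree(var=2), right=Vtree(var=3)))
    print(learn_node(data, vt))
```

The nested-tuple output is only a stand-in for a proper circuit data structure; in a real PSDD learner the product children would be constrained to the primes and subs induced by the logical base rather than a plain factorization.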
