Learning Tractable Probabilistic Models in Open Worlds

01/17/2019
by   Amelie Levray, et al.

Large-scale probabilistic representations, including statistical knowledge bases and graphical models, are increasingly in demand. They are built by mining massive sources of structured and unstructured data, the latter often derived via natural language processing techniques. The very nature of this enterprise makes the extracted representations probabilistic: inducing relations and facts from noisy and incomplete sources via statistical machine learning means that the labels are either already probabilistic, or that probabilities serve as confidence scores. While progress is impressive, extracted representations essentially enforce the closed-world assumption (CWA): every fact in the database is accorded its corresponding probability, but all other facts have probability zero. The CWA is deeply problematic in most machine learning contexts, and a principled means of representing incomplete and indeterminate knowledge in such models is needed; imprecise probability models such as credal networks are one example. In this work, we are interested in the foundational problem of learning such open-world probabilistic models. However, since exact inference in probabilistic graphical models is intractable, the paradigm of tractable learning has emerged, which learns data structures (such as arithmetic circuits) that support efficient probabilistic querying. We show here how the computational machinery underlying tractable learning must be generalized to imprecise probabilities. Our empirical evaluations demonstrate that the resulting regime is effective.
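The contrast between the closed-world assumption and an open-world, credal-style reading can be made concrete with a toy probabilistic knowledge base. The sketch below is purely illustrative (the facts, function names, and the vacuous-interval convention are assumptions, not the paper's method): under the CWA, a fact absent from the base gets probability exactly zero, whereas an open-world treatment assigns it the vacuous interval [0, 1], expressing ignorance rather than impossibility.

```python
# Toy probabilistic knowledge base: facts mined with confidence scores.
# All names and numbers are hypothetical, for illustration only.
kb = {
    ("bornIn", "Turing", "London"): 0.9,
    ("bornIn", "Turing", "Cambridge"): 0.1,
}

def prob_closed_world(fact):
    """Closed-world assumption: any fact not in the base has probability 0."""
    return kb.get(fact, 0.0)

def prob_open_world(fact):
    """Open-world reading: a known fact keeps its point probability
    (as a degenerate interval); an unseen fact gets the vacuous
    interval [0, 1], a minimal credal-style representation of ignorance."""
    p = kb.get(fact)
    return (p, p) if p is not None else (0.0, 1.0)

print(prob_closed_world(("bornIn", "Turing", "Paris")))  # 0.0
print(prob_open_world(("bornIn", "Turing", "Paris")))    # (0.0, 1.0)
print(prob_open_world(("bornIn", "Turing", "London")))   # (0.9, 0.9)
```

The interval representation is the simplest instance of the imprecise-probability models (e.g. credal networks) that the abstract points to as a principled fix for the CWA.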
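To see why arithmetic circuits support efficient querying, consider a minimal circuit for a single Bernoulli variable. This is a generic illustration of the arithmetic-circuit idea, not the paper's implementation; the node classes and parameter names are assumptions. A marginal query is answered by one bottom-up pass over the circuit, i.e. in time linear in its size.

```python
# Minimal arithmetic circuit: sum ("+") and product ("*") nodes over
# leaves that are either parameters or variable indicators.
# Illustrative sketch; not the paper's data structure.

class Leaf:
    def __init__(self, name):
        self.name = name          # a parameter or indicator, looked up in env
    def eval(self, env):
        return env[self.name]

class Op:
    def __init__(self, op, children):
        self.op = op              # "+" for sum nodes, "*" for product nodes
        self.children = children
    def eval(self, env):
        vals = [c.eval(env) for c in self.children]
        out = vals[0]
        for v in vals[1:]:
            out = out + v if self.op == "+" else out * v
        return out

# Circuit for a Bernoulli X with P(X=1) = 0.3:
#   f = theta_x * [x] + theta_nx * [not x]
circuit = Op("+", [
    Op("*", [Leaf("theta_x"), Leaf("ind_x")]),
    Op("*", [Leaf("theta_nx"), Leaf("ind_nx")]),
])
params = {"theta_x": 0.3, "theta_nx": 0.7}

# Query P(X=1): set the matching indicator to 1, the other to 0.
p_x = circuit.eval({**params, "ind_x": 1, "ind_nx": 0})
# Partition function: set every indicator to 1; one linear-time pass.
z = circuit.eval({**params, "ind_x": 1, "ind_nx": 1})
```

Here `p_x` evaluates to 0.3 and `z` to 1.0; generalizing such circuits to imprecise probabilities — where parameters become intervals or credal sets rather than point values — is the computational challenge the abstract describes.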

