A Standard Approach for Optimizing Belief Network Inference using Query DAGs

02/06/2013
by   Adnan Darwiche, et al.
This paper proposes a novel, algorithm-independent approach to optimizing belief network inference. Rather than designing optimizations on an algorithm-by-algorithm basis, we argue that one should use an unoptimized algorithm to generate a Q-DAG, a compiled graphical representation of the belief network, and then optimize the Q-DAG and its evaluator instead. We present a set of Q-DAG optimizations that supplant optimizations designed for traditional inference algorithms, including zero compression, network pruning, and caching. We show that these Q-DAG optimizations require time linear in the Q-DAG size and significantly simplify the process of designing algorithms for optimizing belief network inference.
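The ideas in the abstract can be made concrete with a minimal sketch, assuming a simple arithmetic-circuit view of a Q-DAG (constant, addition, and multiplication nodes). The `Node`, `zero_compress`, and `evaluate` names below are hypothetical illustrations, not the paper's API: `zero_compress` shows a zero-compression pass, and `evaluate` shows cached bottom-up evaluation, each visiting every node once and so running in time linear in the Q-DAG size.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Node:
    """A Q-DAG node: op is 'const', 'add', or 'mul'."""
    op: str
    value: float = 0.0
    children: tuple = ()

def zero_compress(node, cache=None):
    """Zero compression: a multiplication with a constant-zero child
    collapses to zero, and zero terms drop out of additions.
    Shared subgraphs are compressed once via the cache."""
    if cache is None:
        cache = {}
    if node in cache:
        return cache[node]
    if node.op == "const":
        result = node
    else:
        kids = [zero_compress(c, cache) for c in node.children]
        if node.op == "mul" and any(k.op == "const" and k.value == 0.0 for k in kids):
            result = Node("const", 0.0)
        elif node.op == "add":
            kids = [k for k in kids if not (k.op == "const" and k.value == 0.0)]
            if not kids:
                result = Node("const", 0.0)
            elif len(kids) == 1:
                result = kids[0]
            else:
                result = Node("add", children=tuple(kids))
        else:
            result = Node(node.op, children=tuple(kids))
    cache[node] = result
    return result

def evaluate(node, cache=None):
    """Evaluate the Q-DAG bottom-up, caching each node's value so a
    shared subgraph is computed once (time linear in Q-DAG size)."""
    if cache is None:
        cache = {}
    if node in cache:
        return cache[node]
    if node.op == "const":
        val = node.value
    elif node.op == "add":
        val = sum(evaluate(c, cache) for c in node.children)
    else:  # "mul"
        val = 1.0
        for c in node.children:
            val *= evaluate(c, cache)
    cache[node] = val
    return val
```

For example, compressing `(0 * 0.3) + 0.5` collapses the whole subgraph to the constant `0.5`, so the evaluator never revisits the zeroed branch at query time.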


Related research

- 02/13/2013 — Efficient Search-Based Inference for Noisy-OR Belief Networks: TopEpsilon
  Inference algorithms for arbitrary belief networks are impractical for l...
- 08/07/2014 — Query DAGs: A Practical Paradigm for Implementing Belief Network Inference
  We describe a new paradigm for implementing inference in belief networks...
- 03/06/2013 — Causal Independence for Knowledge Acquisition and Inference
  I introduce a temporal belief-network representation of causal independe...
- 09/07/2018 — Optimizing CNN Model Inference on CPUs
  The popularity of Convolutional Neural Network (CNN) models and the ubiq...
- 03/06/2013 — Minimal Assumption Distribution Propagation in Belief Networks
  As belief networks are used to model increasingly complex situations, th...
- 03/02/2022 — Redefining The Query Optimization Process
  Traditionally, query optimizers have been designed for computer systems ...
- 01/29/2021 — Optimizing αμ
  αμ is a search algorithm which repairs two defaults of Perfect Informati...
