Approximation Algorithms for the Loop Cutset Problem

02/27/2013
by Ann Becker et al.

We show how to find a small loop cutset in a Bayesian network. Finding such a loop cutset is the first step in the method of conditioning for inference. Our algorithm for finding a loop cutset, called MGA, finds a loop cutset that is guaranteed in the worst case to contain fewer than twice the number of variables contained in a minimum loop cutset. We test MGA on randomly generated graphs and find that the average ratio between the number of instances associated with the algorithm's output and the number of instances associated with a minimum solution is 1.22.
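As a rough illustration of the kind of computation involved, the sketch below is a simple greedy heuristic that breaks the loops of a network's undirected skeleton, weighting each candidate variable by its number of values so that the resulting number of conditioning instances stays small. This is not the paper's MGA; the graph representation, the domain_size weighting, and the degree-based score are assumptions made for this example only.

```python
# Minimal sketch of a greedy loop-cutset-style heuristic (NOT the paper's MGA):
# treat the problem as breaking all cycles of the DAG's undirected skeleton,
# preferring variables with high degree and few values.

def greedy_loop_cutset(edges, domain_size):
    """edges: iterable of undirected (u, v) pairs from the DAG's skeleton.
    domain_size: dict mapping each variable to its number of values."""
    # Build an adjacency map for the undirected skeleton.
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)

    cutset = []
    while True:
        # Repeatedly prune vertices of degree <= 1; they lie on no loop.
        pruned = True
        while pruned:
            pruned = False
            for v in list(adj):
                if len(adj[v]) <= 1:
                    for u in adj[v]:
                        adj[u].discard(v)
                    del adj[v]
                    pruned = True
        if not adj:  # skeleton is acyclic: every loop is broken
            return cutset
        # Greedy choice: prefer high degree and a small domain, so the number
        # of conditioning instances (product of domain sizes) stays small.
        v = min(adj, key=lambda x: domain_size[x] / len(adj[x]))
        cutset.append(v)
        for u in adj[v]:
            adj[u].discard(v)
        del adj[v]


# Example (hypothetical network): a diamond loop A->B, A->C, B->D, C->D.
edges = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]
print(greedy_loop_cutset(edges, {"A": 2, "B": 2, "C": 3, "D": 2}))  # e.g. ['A']
```

Conditioning then instantiates each cutset variable to every combination of its values and runs singly-connected inference per combination, which is why the number of instances, not just the number of cutset variables, is the quantity the paper measures.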

