Improving the Efficiency of the PC Algorithm by Using Model-Based Conditional Independence Tests

11/12/2022
by Erica Cai, et al.

Learning causal structure is useful in many areas of artificial intelligence, including planning, robotics, and explanation. Constraint-based structure learning algorithms such as PC use conditional independence (CI) tests to infer causal structure. Traditionally, constraint-based algorithms perform CI tests with a preference for smaller conditioning sets, in part because the statistical power of conventional CI tests declines rapidly as the size of the conditioning set grows. However, many modern CI tests are model-based, and these tests use well-regularized models that maintain statistical power even with very large conditioning sets. This suggests an intriguing new strategy for constraint-based algorithms that may reduce the total number of CI tests performed: test variable pairs with large conditioning sets first, as a pre-processing step that finds some conditional independencies quickly, before moving on to the more conventional strategy that favors small conditioning sets. We propose such a pre-processing step for the PC algorithm that relies on performing CI tests on a few randomly selected large conditioning sets. We perform an empirical analysis on directed acyclic graphs (DAGs) that correspond to real-world systems, and both empirical and theoretical analyses for Erdős-Rényi DAGs. Our results show that Pre-Processing Plus PC (P3PC) performs far fewer CI tests than the original PC algorithm, in some cases as few as 0.5% of the CI tests that the PC algorithm alone performs. The efficiency gains are particularly significant for the DAGs corresponding to real-world systems.
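
The abstract only outlines the pre-processing idea; the exact P3PC procedure is described in the paper itself. As a rough, hypothetical illustration of the strategy sketched above, the Python code below tests each variable pair against a few randomly drawn large conditioning sets before the usual small-conditioning-set search would run. The function name, the num_sets parameter, and the ci_test(x, y, cond) callable (assumed to return True when a model-based test judges X and Y conditionally independent given cond) are illustrative assumptions, not the authors' implementation.

    import itertools
    import random

    def preprocess_large_conditioning_sets(variables, ci_test, num_sets=3, seed=0):
        """Illustrative pre-processing pass: for each variable pair (X, Y), run CI
        tests on a few randomly chosen *large* conditioning sets drawn from the
        remaining variables. Pairs found to be conditionally independent can be
        disconnected before the standard PC adjacency search (which favors small
        conditioning sets) begins."""
        rng = random.Random(seed)
        separated = {}
        for x, y in itertools.combinations(variables, 2):
            others = [v for v in variables if v not in (x, y)]
            if not others:
                continue
            for _ in range(num_sets):
                # Draw a large conditioning set: most or all of the other variables.
                size = rng.randint(max(1, len(others) - 2), len(others))
                cond = tuple(rng.sample(others, size))
                if ci_test(x, y, cond):  # True means X is independent of Y given cond
                    # Record a separating set and stop testing this pair.
                    separated[(x, y)] = cond
                    break
        return separated

In this sketch, a model-based CI test (for example, one built on well-regularized predictive models, as the abstract suggests) would be supplied as ci_test, and the pairs returned could be removed from the initial complete graph before running PC as usual.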


Related research

Characterization and Learning of Causal Graphs with Small Conditioning Sets (01/22/2023)
Constraint-based causal discovery algorithms learn part of the causal gr...

A Simple Unified Approach to Testing High-Dimensional Conditional Independences for Categorical and Ordinal Data (06/09/2022)
Conditional independence (CI) tests underlie many approaches to model te...

Signal Conditioning for Learning in the Wild (07/12/2019)
The mammalian olfactory system learns rapidly from very few examples, pr...

cuPC: CUDA-based Parallel PC Algorithm for Causal Structure Learning on GPU (12/20/2018)
The main goal in many fields in empirical sciences is to discover causal...

The IBMAP approach for Markov networks structure learning (01/16/2013)
In this work we consider the problem of learning the structure of Markov...

Who Learns Better Bayesian Network Structures: Constraint-Based, Score-based or Hybrid Algorithms? (05/30/2018)
The literature groups algorithms to learn the structure of Bayesian netw...

The Dual PC Algorithm for Structure Learning (12/16/2021)
While learning the graphical structure of Bayesian networks from observa...
