Automated Hyperparameter Selection for the PC Algorithm

11/03/2020
by Eric V. Strobl, et al.

The PC algorithm infers causal relations using conditional independence tests that require a pre-specified significance level α (the Type I error rate). PC, however, is unsupervised, so we cannot tune α with traditional cross-validation. We therefore propose AutoPC, a fast procedure that optimizes α directly for a user-chosen metric. In particular, AutoPC forces PC to double-check its output by executing a second run on the recovered graph, and it selects as the final output the run that maximizes stability between the two. AutoPC consistently outperforms the state of the art across multiple metrics.
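The abstract leaves the exact form of AutoPC's "second run" unspecified, but the overall loop can be illustrated with a minimal sketch. The sketch below implements only the PC skeleton phase (Fisher-z partial-correlation CI tests), and it assumes, as a stand-in for the paper's procedure, that the second run re-applies PC to data simulated from a linear Gaussian model fit on the recovered skeleton (edges oriented by variable index); the stability metric here is edge-set Jaccard similarity. All function names and these modeling choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from itertools import combinations
from scipy import stats

def ci_test(data, i, j, cond, alpha):
    """Fisher-z test of X_i ⟂ X_j | X_cond via partial correlation.
    Returns True when independence is NOT rejected at level alpha."""
    n = data.shape[0]
    idx = [i, j] + list(cond)
    corr = np.corrcoef(data[:, idx], rowvar=False)
    prec = np.linalg.pinv(corr)
    r = -prec[0, 1] / np.sqrt(prec[0, 0] * prec[1, 1])
    r = np.clip(r, -0.999999, 0.999999)
    z = 0.5 * np.log((1 + r) / (1 - r))
    stat = np.sqrt(n - len(cond) - 3) * abs(z)
    p = 2 * (1 - stats.norm.cdf(stat))
    return p > alpha

def pc_skeleton(data, alpha):
    """Skeleton phase of PC: start from the complete graph and delete an
    edge i-j whenever some conditioning set renders i and j independent."""
    d = data.shape[1]
    adj = {i: set(range(d)) - {i} for i in range(d)}
    level = 0
    while any(len(adj[i]) > level for i in adj):
        for i, j in combinations(range(d), 2):
            if j not in adj[i]:
                continue
            nbrs = sorted(adj[i] - {j})
            if len(nbrs) < level:
                continue
            for cond in combinations(nbrs, level):
                if ci_test(data, i, j, cond, alpha):
                    adj[i].discard(j)
                    adj[j].discard(i)
                    break
        level += 1
    return {frozenset((i, j)) for i in adj for j in adj[i]}

def simulate_from_skeleton(data, edges, rng):
    """Assumed 'second run' input: orient edges low->high index, fit each
    node on its parents by least squares, and resample Gaussian noise."""
    n, d = data.shape
    new = np.zeros((n, d))
    for j in range(d):
        pa = [i for i in range(j) if frozenset((i, j)) in edges]
        if pa:
            X = data[:, pa]
            beta, *_ = np.linalg.lstsq(X, data[:, j], rcond=None)
            resid = data[:, j] - X @ beta
            new[:, j] = new[:, pa] @ beta + rng.normal(0, resid.std(), n)
        else:
            new[:, j] = rng.normal(data[:, j].mean(), data[:, j].std(), n)
    return new

def jaccard(a, b):
    """Edge-set stability between two recovered skeletons."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def autopc_alpha(data, alphas, rng):
    """AutoPC-style selection: for each alpha, run PC, re-run it on data
    regenerated from the recovered graph, and keep the alpha whose two
    runs agree most."""
    best = None
    for alpha in alphas:
        g1 = pc_skeleton(data, alpha)
        g2 = pc_skeleton(simulate_from_skeleton(data, g1, rng), alpha)
        stab = jaccard(g1, g2)
        if best is None or stab > best[0]:
            best = (stab, alpha, g1)
    return best[1], best[2]
```

On a three-variable chain X0 → X1 → X2, `pc_skeleton` keeps the edges 0-1 and 1-2 and drops 0-2 once it conditions on X1; `autopc_alpha` then scores each candidate α by how reproducible that skeleton is under the second run.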

