A hybrid algorithm for Bayesian network structure learning with application to multi-label learning

06/18/2015
by Maxime Gasse, et al.

We present a novel hybrid algorithm for Bayesian network structure learning, called H2PC. It first reconstructs the skeleton of a Bayesian network and then performs a Bayesian-scoring greedy hill-climbing search to orient the edges. The algorithm is based on divide-and-conquer constraint-based subroutines to learn the local structure around a target variable. We conduct two series of experimental comparisons of H2PC against Max-Min Hill-Climbing (MMHC), which is currently the most powerful state-of-the-art algorithm for Bayesian network structure learning. First, we use eight well-known Bayesian network benchmarks with various data sizes to assess the quality of the learned structure returned by the algorithms. Our extensive experiments show that H2PC outperforms MMHC in terms of goodness of fit to new data and quality of the network structure with respect to the true dependence structure of the data. Second, we investigate H2PC's ability to solve the multi-label learning problem. We provide theoretical results to characterize and identify graphically the so-called minimal label powersets that appear as irreducible factors in the joint distribution under the faithfulness condition. The multi-label learning problem is then decomposed into a series of multi-class classification problems, where each multi-class variable encodes a label powerset. H2PC is shown to compare favorably to MMHC in terms of global classification accuracy over ten multi-label data sets covering different application domains. Overall, our experiments support the conclusion that local structural learning with H2PC in the form of local neighborhood induction is a theoretically well-motivated and empirically effective learning framework that is well suited to multi-label learning. The source code (in R) of H2PC as well as all data sets used for the empirical tests are publicly available.
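The restrict-then-maximize scheme described in the abstract (constraint-based skeleton discovery around each variable, followed by a score-based hill climb to orient edges) can be illustrated with off-the-shelf tooling. The sketch below uses R and the bnlearn package; it is an illustration of the general hybrid scheme under stated assumptions, not the authors' released code, and it substitutes SI-HITON-PC for the paper's HPC local-discovery subroutine (recent bnlearn releases also expose HPC/H2PC directly, depending on the version).

```r
# Minimal sketch (not the authors' code): hybrid restrict-then-maximize
# structure learning with bnlearn, in the spirit of H2PC and MMHC.
library(bnlearn)

# 'alarm' is a discrete benchmark data set shipped with bnlearn.
data(alarm)

# Restrict phase: a constraint-based local-discovery algorithm learns each
# node's candidate neighbourhood (the skeleton). Maximize phase: a BDeu-scored
# greedy hill climb orients the edges within that skeleton.
dag.hybrid <- rsmax2(alarm,
                     restrict = "si.hiton.pc",   # stand-in for the HPC subroutine
                     maximize = "hc",
                     maximize.args = list(score = "bde"))

# Reference algorithm from the paper's comparison: Max-Min Hill-Climbing.
dag.mmhc <- mmhc(alarm)

# Compare goodness of fit of the two learned structures on the same data.
score(dag.hybrid, alarm, type = "bde")
score(dag.mmhc, alarm, type = "bde")
```

The label powerset decomposition used in the multi-label experiments can likewise be sketched in a few lines of base R: every observed combination of binary labels is mapped to one class of a single multi-class variable. The label names Y1, Y2, Y3 below are purely illustrative, not from the paper's data sets.

```r
# Minimal sketch of a label powerset encoding: each distinct combination of
# binary labels becomes one class of a multi-class target variable.
labels <- data.frame(Y1 = c(1, 0, 1, 1, 0),
                     Y2 = c(0, 0, 1, 1, 1),
                     Y3 = c(1, 0, 0, 1, 1))

powerset <- interaction(labels$Y1, labels$Y2, labels$Y3, drop = TRUE, sep = "")
levels(powerset)   # one multi-class value per observed label combination
table(powerset)    # class frequencies of the resulting multi-class problem
```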
