pRSL: Interpretable Multi-label Stacking by Learning Probabilistic Rules

05/28/2021
by Michael Kirchhof, et al.

A key task in multi-label classification is modeling the structure between the involved classes. Modeling this structure by probabilistic and interpretable means enables application in a broad variety of tasks such as zero-shot learning or learning from incomplete data. In this paper, we present the probabilistic rule stacking learner (pRSL) which uses probabilistic propositional logic rules and belief propagation to combine the predictions of several underlying classifiers. We derive algorithms for exact and approximate inference and learning, and show that pRSL reaches state-of-the-art performance on various benchmark datasets. In the process, we introduce a novel multicategorical generalization of the noisy-or gate. Additionally, we report simulation results on the quality of loopy belief propagation algorithms for approximate inference in bipartite noisy-or networks.
