
Separating Rule Discovery and Global Solution Composition in a Learning Classifier System

by Michael Heider, et al.

The use of digital agents to support crucial decision making is increasing in many industrial scenarios. However, trust in the suggestions these agents make is hard to achieve, even though it is essential for profiting from their application; this creates a need for explanations of both the decision-making process and the model itself. For many systems, such as common deep-learning black-box models, achieving even partial explainability requires complex post-processing, while other systems benefit from being, to a reasonable extent, inherently interpretable. In this paper we propose SupRB2, an easily interpretable rule-based learning system specifically designed for, and thus especially suited to, these scenarios, and compare it on a set of regression problems against XCSF, a prominent rule-based learning system with a long research history. One key advantage of our system is that the rules' conditions and the selection of rules that compose a solution to the problem are evolved separately. We utilise independent rule fitnesses, which allows users to tailor their model structure to the given requirements for explainability. We find that SupRB2's results are comparable to XCSF's while allowing easier control of model structure and showing a substantially smaller sensitivity to random seeds and data splits. This increased control aids in subsequently providing explanations for both the training process and the final structure of the model.
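The separation the abstract describes can be illustrated with a minimal sketch: first evolve candidate rules whose fitness is judged independently (how well each rule predicts on the inputs it matches), then, in a separate step, compose a small subset of those rules into a global model. This is only a toy illustration of the two-phase idea, not the actual SupRB2 algorithm; the interval conditions, greedy composition, and mean-based fallback are all simplifying assumptions.

```python
import random

random.seed(0)

# Toy 1-D regression data: learn y = x^2 on [0, 1].
X = [i / 20 for i in range(21)]
y = [x * x for x in X]

def make_rule():
    """A rule matches an interval [lo, hi] and predicts a constant."""
    lo, hi = sorted(random.uniform(0, 1) for _ in range(2))
    matched = [(xi, yi) for xi, yi in zip(X, y) if lo <= xi <= hi]
    pred = sum(yi for _, yi in matched) / len(matched) if matched else 0.0
    return {"lo": lo, "hi": hi, "pred": pred, "matched": matched}

def rule_fitness(rule):
    """Independent fitness: each rule is scored only on the data it matches."""
    if not rule["matched"]:
        return float("inf")
    return sum((rule["pred"] - yi) ** 2 for _, yi in rule["matched"]) / len(rule["matched"])

# Phase 1 (rule discovery): generate candidate rules and keep the fittest,
# without any reference to how they will later be combined.
pool = sorted((make_rule() for _ in range(200)), key=rule_fitness)[:20]

def predict(rules, x):
    """Average the predictions of matching rules; fall back to the global mean."""
    preds = [r["pred"] for r in rules if r["lo"] <= x <= r["hi"]]
    return sum(preds) / len(preds) if preds else sum(y) / len(y)

def mse(rules):
    return sum((predict(rules, xi) - yi) ** 2 for xi, yi in zip(X, y)) / len(X)

# Phase 2 (solution composition): greedily pick a small, fixed-size subset of
# rules that minimises the *global* error of the composed model. The size cap
# is where a user could trade accuracy against interpretability.
solution = []
for _ in range(5):
    best = min(pool, key=lambda r: mse(solution + [r]))
    solution.append(best)

print(f"composed {len(solution)} rules, MSE = {mse(solution):.4f}")
```

Because each rule's fitness in phase 1 never depends on the other rules, the composition step in phase 2 can be re-run with a different size budget to directly control model structure, which is the kind of user control the paper highlights.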




Investigating the Impact of Independent Rule Fitnesses in a Learning Classifier System

Achieving at least some level of explainability requires complex analyse...

Learning Classifier Systems for Self-Explaining Socio-Technical-Systems

In socio-technical settings, operators are increasingly assisted by deci...

An Interpretable Algorithm for Uveal Melanoma Subtyping from Whole Slide Cytology Images

Algorithmic decision support is rapidly becoming a staple of personalize...

SupRB: A Supervised Rule-based Learning System for Continuous Problems

We propose the SupRB learning system, a new Pittsburgh-style learning cl...

User Driven Model Adjustment via Boolean Rule Explanations

AI solutions are heavily dependant on the quality and accuracy of the in...

Controlling Neural Networks with Rule Representations

We propose a novel training method to integrate rules into deep learning...

ESC-Rules: Explainable, Semantically Constrained Rule Sets

We describe a novel approach to explainable prediction of a continuous v...