Enhancing Decision Tree based Interpretation of Deep Neural Networks through L1-Orthogonal Regularization

04/10/2019
by   Nina Schaaf, et al.
One obstacle that has so far prevented the adoption of machine learning models in critical areas is their lack of explainability. This work presents a practicable approach to gaining explainability of deep artificial neural networks (NNs) using an interpretable surrogate model based on decision trees. Simply fitting a decision tree to a trained NN usually yields unsatisfactory results in terms of accuracy and fidelity. Applying L1-orthogonal regularization during training, however, preserves the accuracy of the NN while allowing it to be closely approximated by small decision trees. Tests on several data sets confirm that L1-orthogonal regularization yields models of lower complexity and, at the same time, higher fidelity than other regularizers.
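The abstract does not spell out the exact form of the regularizer; a common way to express an L1-orthogonality penalty is the element-wise L1 norm of W^T W − I per weight matrix, which pushes each layer's columns toward orthonormality. The sketch below assumes that form (the function name and the per-layer summation are illustrative, not taken from the paper):

```python
import numpy as np


def l1_orthogonal_penalty(weight_matrices):
    """Hypothetical L1-orthogonality penalty: sum over layers of
    ||W^T W - I||_1. Added to the task loss during NN training, it
    nudges each layer's columns toward orthonormality."""
    total = 0.0
    for W in weight_matrices:
        n = W.shape[1]
        gram = W.T @ W                      # n x n Gram matrix of columns
        total += np.abs(gram - np.eye(n)).sum()
    return total


rng = np.random.default_rng(0)
# A matrix with orthonormal columns incurs (near) zero penalty ...
Q, _ = np.linalg.qr(rng.normal(size=(8, 4)))
# ... while an arbitrary weight matrix is penalized.
W_rand = rng.normal(size=(8, 4))
print(l1_orthogonal_penalty([Q]) < 1e-9)     # True
print(l1_orthogonal_penalty([W_rand]) > 1.0)  # True
```

In a training loop the penalty would simply be weighted and added to the cross-entropy loss; the intuition is that near-orthogonal, more axis-aligned decision boundaries are easier for small, axis-parallel decision trees to approximate.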


