Real-world-robustness of tree-based classifiers

08/22/2022
by Christoph Schweimer, et al.

The concept of trustworthy AI has gained widespread attention lately. One aspect of trustworthy AI is the robustness of ML models. In this study, we show how to compute the recently introduced measure of real-world-robustness, a measure of robustness against naturally occurring distortions of input data, for tree-based classifiers. The original method for computing real-world-robustness works for arbitrary black-box classifiers but yields only an approximation. Here we show how real-world-robustness can be computed exactly for tree-based classifiers, under the assumption that the natural distortions are given by multivariate normal distributions.
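The exact computation is possible because every leaf of a decision tree corresponds to an axis-aligned box in input space, so the probability that a Gaussian perturbation of an input keeps its predicted class is the total Gaussian mass of the same-class leaf boxes. The sketch below illustrates this idea for the special case of a diagonal covariance (independent per-feature noise), where the mass of each box factorizes into 1-D normal CDFs; the paper treats general multivariate normals, and the helper names (`leaf_boxes`, `robustness`, `sigma`) are illustrative, not from the paper.

```python
import numpy as np
from scipy.stats import norm
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

def leaf_boxes(tree, n_features):
    """Enumerate (box, class) pairs for a fitted sklearn decision tree.
    Each box is a list of (low, high) intervals, one per feature."""
    t = tree.tree_
    boxes = []

    def recurse(node, box):
        if t.children_left[node] == -1:  # leaf node
            label = int(np.argmax(t.value[node]))
            boxes.append(([tuple(iv) for iv in box], label))
            return
        f, thr = t.feature[node], t.threshold[node]
        left = [list(iv) for iv in box]   # x[f] <= thr branch
        left[f][1] = min(left[f][1], thr)
        recurse(t.children_left[node], left)
        right = [list(iv) for iv in box]  # x[f] > thr branch
        right[f][0] = max(right[f][0], thr)
        recurse(t.children_right[node], right)

    recurse(0, [[-np.inf, np.inf] for _ in range(n_features)])
    return boxes

def robustness(tree, x, sigma):
    """Probability that N(x, diag(sigma^2)) stays in the predicted class:
    sum, over leaves with the same label, of the product of the 1-D
    normal masses of the leaf's per-feature intervals."""
    label = int(tree.predict(x.reshape(1, -1))[0])
    prob = 0.0
    for box, leaf_label in leaf_boxes(tree, x.shape[0]):
        if leaf_label != label:
            continue
        mass = 1.0
        for (lo, hi), xi, si in zip(box, x, sigma):
            mass *= norm.cdf(hi, xi, si) - norm.cdf(lo, xi, si)
        prob += mass
    return prob

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
p = robustness(clf, X[0], sigma=np.full(X.shape[1], 0.2))
print(round(p, 4))  # probability of keeping the predicted class
```

Because the leaf boxes partition the input space, the masses of all leaves sum to one, which gives a convenient sanity check; ensembles such as random forests require intersecting boxes across trees, which the paper addresses.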


