On the Perils of Cascading Robust Classifiers

06/01/2022
by   Ravi Mangal, et al.

Ensembling certifiably robust neural networks is a promising approach for improving the certified robust accuracy of neural models. Black-box ensembles, which assume only query access to the constituent models (and their robustness certifiers) during prediction, are particularly attractive due to their modular structure. Cascading ensembles are a popular instance of black-box ensembles that appear to improve certified robust accuracies in practice. However, we find that the robustness certifier used by a cascading ensemble is unsound. That is, when a cascading ensemble is certified as locally robust at an input x, there can, in fact, be inputs x' in the ϵ-ball centered at x such that the cascade's prediction at x' differs from its prediction at x. We present an alternate black-box ensembling mechanism based on weighted voting, which we prove to be sound for robustness certification. Via a thought experiment, we demonstrate that if the constituent classifiers are suitably diverse, voting ensembles can improve certified performance. Our code is available at <https://github.com/TristaChi/ensembleKW>.
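The unsoundness can be seen with a toy example. The sketch below uses a hypothetical `predict`/`certify` interface and two contrived one-dimensional classifiers (not the paper's models or code): the cascade's rule is that the first model whose certifier accepts the input makes the prediction. At x the first model certifies and the cascade claims robustness, yet a nearby x' inside the ϵ-ball falls through to the second model, which predicts a different label.

```python
# Toy demonstration (hypothetical models and interface, not the paper's code):
# a cascade's certificate at x does not bound the cascade's OWN behavior on
# the eps-ball, because nearby inputs may fall through to a later model.

EPS = 0.2

class Threshold:
    """Predicts 1 iff x > t; locally robust at x (radius eps) iff |x - t| > eps."""
    def __init__(self, t):
        self.t = t
    def predict(self, x):
        return int(x > self.t)
    def certify(self, x, eps):
        return abs(x - self.t) > eps

class Constant:
    """Always predicts c; trivially robust everywhere."""
    def __init__(self, c):
        self.c = c
    def predict(self, x):
        return self.c
    def certify(self, x, eps):
        return True

def cascade(models, x, eps):
    """Cascading rule: the first model that certifies x makes the prediction;
    the returned flag is the cascade's (unsound) robustness certificate."""
    for m in models:
        if m.certify(x, eps):
            return m.predict(x), True
    return models[-1].predict(x), False

models = [Threshold(0.0), Constant(0)]

x, x_prime = 0.3, 0.15              # |x - x_prime| = 0.15 < EPS
pred_x, certified = cascade(models, x, EPS)
pred_xp, _ = cascade(models, x_prime, EPS)

print(pred_x, certified)  # 1 True -- cascade claims local robustness at x
print(pred_xp)            # 0      -- yet the prediction flips inside the eps-ball
```

Each individual certificate here is correct about its own model; the flaw is that the cascade's *selection* of which model answers is itself input-dependent, and the certificate does not account for that. A voting ensemble avoids this because every member contributes at every input.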

