Certifying Ensembles: A General Certification Theory with S-Lipschitzness

04/25/2023
by Aleksandar Petrov, et al.

Improving and guaranteeing the robustness of deep learning models has been a topic of intense research. Ensembling, which combines several classifiers into a better model, has been shown to be beneficial for generalisation, uncertainty estimation, calibration, and mitigating the effects of concept drift. However, the impact of ensembling on certified robustness is less well understood. In this work, we generalise Lipschitz continuity by introducing S-Lipschitz classifiers, which we use to analyse the theoretical robustness of ensembles. Our results give precise conditions under which ensembles of robust classifiers are more robust than any constituent classifier, as well as conditions under which they are less robust.
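To make the certification setting concrete, the sketch below illustrates the standard Lipschitz-based certificate that this line of work builds on: if each logit of a classifier is L-Lipschitz in the l2 norm, the difference of any two logits is 2L-Lipschitz, so the prediction cannot change within radius margin / (2L). It also shows that a logit-averaging ensemble of L-Lipschitz classifiers is again L-Lipschitz, so the same bound applies to the ensemble; the logit values here are hypothetical, and this is an illustration of the classical bound, not of the paper's S-Lipschitz analysis.

```python
import numpy as np

def certified_radius(logits, lipschitz_const):
    """l2-certified radius for a classifier whose per-class logits are
    each lipschitz_const-Lipschitz. The difference of two such logits is
    2L-Lipschitz, so the top prediction holds within margin / (2L)."""
    top_two = np.sort(logits)[-2:]          # runner-up and top logit
    margin = top_two[1] - top_two[0]
    return margin / (2.0 * lipschitz_const)

# Hypothetical logits of three 1-Lipschitz classifiers at the same input.
logits = np.array([
    [2.0, 0.5, 0.1],
    [1.5, 1.0, 0.2],
    [2.5, 0.3, 0.4],
])
L = 1.0

# Each constituent classifier certifies its own radius.
individual = [certified_radius(z, L) for z in logits]

# The average of L-Lipschitz functions is again L-Lipschitz, so the
# same certificate applies to the ensemble's averaged logits.
ensemble = certified_radius(logits.mean(axis=0), L)
```

On these particular (made-up) logits the ensemble's radius lands strictly between the weakest and strongest constituent's, which is exactly the kind of behaviour whose precise conditions the paper characterises: averaging can help or hurt certified robustness depending on how margins interact.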

Related research

- 02/11/2020: Generalised Lipschitz Regularisation Equals Distributional Robustness. The problem of adversarial examples has highlighted the need for a theor...
- 06/01/2022: On the Perils of Cascading Robust Classifiers. Ensembling certifiably robust neural networks has been shown to be a pro...
- 01/26/2022: Improving Robustness and Calibration in Ensembles with Diversity Regularization. Calibration and uncertainty estimation are crucial topics in high-risk e...
- 06/21/2022: Ensembling over Classifiers: a Bias-Variance Perspective. Ensembles are a straightforward, remarkably effective method for improvi...
- 10/07/2021: Sparse MoEs meet Efficient Ensembles. Machine learning models based on the aggregated outputs of submodels, ei...
- 05/12/2020: Robustness Verification for Classifier Ensembles. We give a formal verification procedure that decides whether a classifie...
- 06/15/2022: Estimating Confidence of Predictions of Individual Classifiers and Their Ensembles for the Genre Classification Task. Genre identification is a subclass of non-topical text classification. T...
