
Center Smoothing for Certifiably Robust Vector-Valued Functions

by Aounon Kumar et al.

Randomized smoothing has been successfully applied in high-dimensional image classification tasks to obtain models that are provably robust against input perturbations of bounded size. We extend this technique to produce certifiable robustness for vector-valued functions, i.e., bound the change in output caused by a small change in input. These functions are used in many areas of machine learning, such as image reconstruction, dimensionality reduction, super-resolution, etc., but due to the enormous dimensionality of the output space in these problems, generating meaningful robustness guarantees is difficult. We design a smoothing procedure that can leverage the local, potentially low-dimensional, behaviour of the function around an input to obtain probabilistic robustness certificates. We demonstrate the effectiveness of our method on multiple learning tasks involving vector-valued functions with a wide range of input and output dimensionalities.
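The smoothing procedure described above can be sketched in a simplified Monte Carlo form. This is a hedged illustration, not the paper's exact algorithm: it perturbs the input with Gaussian noise, evaluates the vector-valued function on each sample, and returns the sampled output whose median distance to the other outputs is smallest, as a rough proxy for the center of a ball enclosing half the samples. The function name `center_smooth` and all parameters (`sigma`, `n_samples`) are illustrative assumptions.

```python
import numpy as np

def center_smooth(f, x, sigma=0.25, n_samples=100, rng=None):
    """Simplified sketch of a center-smoothing step (illustrative only).

    Draws Gaussian-perturbed copies of x, evaluates f on each, and
    returns the sampled output minimizing its median distance to the
    other outputs -- a crude proxy for the center of a minimum
    enclosing ball that covers at least half the samples.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    noise = rng.normal(0.0, sigma, size=(n_samples,) + x.shape)
    outputs = np.stack([np.asarray(f(x + eps)) for eps in noise])
    # Pairwise distances between sampled outputs.
    dists = np.linalg.norm(outputs[:, None, :] - outputs[None, :, :], axis=-1)
    # Pick the output whose median distance to the rest is smallest.
    best = int(np.argmin(np.median(dists, axis=1)))
    return outputs[best]
```

For a well-behaved function, small input perturbations move this smoothed output only slightly, which is the property the probabilistic certificates bound.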
