DeepBern-Nets: Taming the Complexity of Certifying Neural Networks using Bernstein Polynomial Activations and Precise Bound Propagation

05/22/2023
by   Haitham Khedr, et al.

Formal certification of Neural Networks (NNs) is crucial for ensuring their safety, fairness, and robustness. Unfortunately, on the one hand, sound and complete certification algorithms for ReLU-based NNs do not scale to large networks. On the other hand, incomplete certification algorithms are easier to compute, but they produce loose bounds that deteriorate with the depth of the NN, which diminishes their effectiveness. In this paper, we ask the following question: can we replace the ReLU activation function with one that opens the door to incomplete certification algorithms that are easy to compute yet yield tight bounds on the NN's outputs? We introduce DeepBern-Nets, a class of NNs whose activation functions are based on Bernstein polynomials instead of the commonly used ReLU activation. Bernstein polynomials are smooth, differentiable functions with desirable properties such as the so-called range enclosure and subdivision properties. We design a novel algorithm, called Bern-IBP, to efficiently compute tight bounds on the outputs of DeepBern-Nets. Our approach leverages the properties of Bernstein polynomials to improve the tractability of neural network certification while maintaining the accuracy of the trained networks. We conduct comprehensive experiments in adversarial robustness and reachability analysis settings to assess the effectiveness of the proposed Bernstein polynomial activation in enhancing the certification process. Our proposed framework achieves high certified accuracy for adversarially trained NNs, which is often a challenging task for certifiers of ReLU-based NNs. Moreover, using Bern-IBP bounds for certified training results in NNs with state-of-the-art certified accuracy compared to ReLU networks. This work establishes Bernstein polynomial activations as a promising alternative for improving NN certification tasks across various applications.
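The bound computation described above rests on the range enclosure property: on its input interval, a Bernstein polynomial is bounded below and above by the minimum and maximum of its Bernstein coefficients. The snippet below is a minimal sketch of that property in Python; the coefficient values, the interval, and the function names are illustrative assumptions, not the paper's Bern-IBP implementation.

```python
# Sketch only: evaluate a Bernstein-polynomial activation on an interval [l, u]
# and bound its range via the range enclosure property (min/max of coefficients).
# Coefficients, interval, and names are illustrative assumptions.
import math

import numpy as np


def bernstein_activation(x, coeffs, l, u):
    """Evaluate a degree-n Bernstein polynomial with coefficients `coeffs` on [l, u]."""
    n = len(coeffs) - 1
    t = (x - l) / (u - l)  # map x into [0, 1]
    basis = np.array(
        [math.comb(n, k) * t**k * (1 - t) ** (n - k) for k in range(n + 1)]
    )
    return float(np.dot(coeffs, basis))


def range_enclosure_bounds(coeffs):
    """On [l, u], the polynomial's values lie within [min(coeffs), max(coeffs)]."""
    return min(coeffs), max(coeffs)


if __name__ == "__main__":
    coeffs = [0.0, 0.2, 1.5, 0.8]   # hypothetical learned coefficients
    l, u = -1.0, 1.0                # input interval propagated from the previous layer
    lo, hi = range_enclosure_bounds(coeffs)
    ys = [bernstein_activation(x, coeffs, l, u) for x in np.linspace(l, u, 101)]
    # The enclosure is sound: the sampled range never escapes [lo, hi].
    assert lo - 1e-9 <= min(ys) and max(ys) <= hi + 1e-9
    print(f"enclosure: [{lo}, {hi}], sampled range: [{min(ys):.3f}, {max(ys):.3f}]")
```

Because the Bernstein basis functions are nonnegative and sum to one on [l, u], the polynomial is a convex combination of its coefficients, which is why interval-style bound propagation over such activations can stay tight layer by layer.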

