Robust and Provably Monotonic Networks

11/30/2021
by Ouail Kitouni et al.

The Lipschitz constant of the map between the input and output spaces represented by a neural network is a natural metric for assessing the robustness of the model. We present a new method to constrain the Lipschitz constant of dense deep learning models that can also be generalized to other architectures. The method relies on a simple weight normalization scheme during training that ensures the Lipschitz constant of every layer is below an upper limit specified by the analyst. A simple residual connection can then be used to make the model monotonic in any subset of its inputs, which is useful in scenarios where domain knowledge dictates such dependence. Examples can be found in algorithmic fairness requirements or, as presented here, in the classification of the decays of subatomic particles produced at the CERN Large Hadron Collider. Our normalization is minimally constraining and allows the underlying architecture to retain greater expressiveness than other techniques that aim either to control the Lipschitz constant of the model or to ensure its monotonicity. We show how the algorithm was used to train a powerful, robust, and interpretable discriminator for heavy-flavor decays in the LHCb real-time data-processing system.
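The two ingredients above, per-layer weight normalization to cap the Lipschitz constant and a residual connection to enforce monotonicity, lend themselves to a short sketch. The PyTorch snippet below is a minimal illustration under stated assumptions, not the authors' released implementation: the class names (LipschitzLinear, MonotonicNet), the choice of the l-infinity-induced operator norm (max absolute row sum), the uniform per-layer budget lam**(1/depth), and the ReLU activation are all illustrative choices; the paper favors gradient-norm-preserving activations such as GroupSort to retain expressiveness.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LipschitzLinear(nn.Module):
    """Linear layer rescaled on every forward pass so that its l-infinity
    operator norm (max absolute row sum) never exceeds `budget`."""

    def __init__(self, in_features: int, out_features: int, budget: float):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.budget = budget

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.linear.weight
        norm = w.abs().sum(dim=1).max()  # induced l-infinity matrix norm
        # Rescale only when the norm exceeds the budget; compliant
        # weights pass through untouched (the constraint is non-binding).
        w = w * (self.budget / torch.clamp(norm, min=self.budget))
        return F.linear(x, w, self.linear.bias)


class MonotonicNet(nn.Module):
    """Lipschitz-bounded network g (per-layer budgets multiply to lam)
    plus a residual lam * sum over the monotonic features. Every partial
    derivative of g is bounded by lam in magnitude, so the residual slope
    makes the output non-decreasing in each selected feature."""

    def __init__(self, n_features, monotonic_idx, lam=1.0, hidden=64, depth=3):
        super().__init__()
        per_layer = lam ** (1.0 / depth)  # per-layer budgets multiply to lam
        dims = [n_features] + [hidden] * (depth - 1) + [1]
        self.layers = nn.ModuleList(
            LipschitzLinear(dims[i], dims[i + 1], per_layer)
            for i in range(depth)
        )
        self.register_buffer("mono_idx", torch.as_tensor(monotonic_idx))
        self.lam = lam

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        g = x
        for i, layer in enumerate(self.layers):
            g = layer(g)
            if i < len(self.layers) - 1:
                g = torch.relu(g)  # any 1-Lipschitz activation keeps the bound
        return g + self.lam * x[:, self.mono_idx].sum(dim=1, keepdim=True)


# Example: a 5-feature classifier constrained to be monotonic in features 0 and 3.
model = MonotonicNet(n_features=5, monotonic_idx=[0, 3], lam=2.0)
scores = model(torch.randn(8, 5))  # shape (8, 1)
```

Because the rescaling happens inside forward, the cap is enforced throughout training while gradients still flow to the raw weights. With |dg/dx_i| <= lam guaranteed by the Lipschitz bound, the residual term gives df/dx_i = dg/dx_i + lam >= 0, i.e., the output is monotonically increasing in every selected feature.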

Related research:

- Expressive Monotonic Neural Networks (07/14/2023): The monotonic dependence of the outputs of a neural network on some of i...
- Training robust neural networks using Lipschitz bounds (05/06/2020): Due to their susceptibility to adversarial perturbations, neural network...
- Lipschitz regularized gradient flows and latent generative particles (10/31/2022): Lipschitz regularized f-divergences are constructed by imposing a bound ...
- Limitations of the Lipschitz constant as a defense against adversarial examples (07/25/2018): Several recent papers have discussed utilizing Lipschitz constants to li...
- JacNet: Learning Functions with Structured Jacobians (12/30/2021): Neural networks are trained to learn an approximate mapping from an inpu...
- Stable Rank Normalization for Improved Generalization in Neural Networks and GANs (06/11/2019): Exciting new work on the generalization bounds for neural networks (NN) ...
- Invertible DenseNets with Concatenated LipSwish (02/04/2021): We introduce Invertible Dense Networks (i-DenseNets), a more parameter e...
