Lipschitz Continuity Retained Binary Neural Network

07/13/2022
by Yuzhang Shang, et al.

Relying on the premise that the performance of a binary neural network (BNN) can be largely restored by eliminating the quantization error between full-precision weight vectors and their corresponding binary vectors, existing works on network binarization frequently adopt the idea of model robustness to reach this objective. However, robustness remains an ill-defined concept without solid theoretical support. In this work, we introduce Lipschitz continuity, a well-defined functional property, as the rigorous criterion for defining the model robustness of BNNs. We then propose to retain Lipschitz continuity as a regularization term to improve model robustness. In particular, while popular Lipschitz-involved regularization methods often collapse on BNNs due to their extreme sparsity, we design Retention Matrices to approximate the spectral norms of the targeted weight matrices, which serve as an approximation of the Lipschitz constant of a BNN without computing the exact Lipschitz constant (an NP-hard problem). Our experiments show that our BNN-specific regularization method effectively strengthens the robustness of BNNs (verified on ImageNet-C) and achieves state-of-the-art performance on CIFAR and ImageNet.
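The tractable handle here is the spectral norm: for a feed-forward network with 1-Lipschitz activations, the product of the per-layer spectral norms upper-bounds the network's Lipschitz constant, so penalizing each layer's largest singular value regularizes that bound without ever computing the exact (NP-hard) constant. The sketch below shows only this generic ingredient, a power-iteration estimate of a weight matrix's spectral norm used as a training penalty; it does not reproduce the paper's Retention Matrix construction, and names such as `spectral_norm_estimate` and the `reg_weight` coefficient are illustrative, not from the paper.

```python
import torch
import torch.nn as nn

def spectral_norm_estimate(W: torch.Tensor, n_iters: int = 5) -> torch.Tensor:
    """Estimate sigma_max(W), the spectral norm of a 2D matrix, by power
    iteration, avoiding a full SVD. The singular-vector iterates are computed
    without gradient so the penalty differentiates only through W."""
    u = torch.randn(W.shape[0], device=W.device)
    u = u / (u.norm() + 1e-12)
    with torch.no_grad():
        for _ in range(n_iters):
            v = W.t() @ u
            v = v / (v.norm() + 1e-12)
            u = W @ v
            u = u / (u.norm() + 1e-12)
    return u @ W @ v  # approximately the largest singular value of W

def lipschitz_penalty(model: nn.Module, reg_weight: float = 1e-4) -> torch.Tensor:
    """Sum spectral-norm estimates over all weight matrices. Since the product
    of per-layer spectral norms upper-bounds the network's Lipschitz constant,
    penalizing each term keeps that bound in check."""
    penalty = 0.0
    for p in model.parameters():
        if p.dim() >= 2:  # conv kernels are flattened to (out, in*kh*kw)
            penalty = penalty + spectral_norm_estimate(p.flatten(1))
    return reg_weight * penalty

# Usage inside a training step (task_loss computed as usual):
#   loss = task_loss + lipschitz_penalty(model)
#   loss.backward()
```

In the paper, the spectral norm is taken over Retention Matrices built from the binarized weights rather than over the raw full-precision matrices, which is what keeps the approximation stable under the extreme sparsity of BNNs; the sketch above omits that construction.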
