Sound and Complete Neural Network Repair with Minimality and Locality Guarantees

10/14/2021
by Feisi Fu, et al.

We present a novel methodology for repairing neural networks that use ReLU activation functions. Unlike existing methods that modify a network's weights, which can induce a global change in the function space, our approach applies only a localized change while still guaranteeing removal of the buggy behavior. By leveraging the piecewise linear nature of ReLU networks, our approach efficiently constructs a patch network tailored to the linear region where the buggy input resides; combined with the original network, the patch provably corrects the behavior on that input. Our method is both sound and complete: the repaired network is guaranteed to fix the buggy input, and a patch is guaranteed to exist for any buggy input. Moreover, our approach preserves the continuous piecewise linear nature of ReLU networks, automatically generalizes the repair to all points inside the repair region (including other, undetected buggy inputs), is minimal in terms of change in the function space, and guarantees that outputs on inputs away from the repair region are unaltered. On several benchmarks, we show that our approach significantly outperforms existing methods in locality and in limiting negative side effects.
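The core idea of region-local repair can be sketched with a toy example. Below, a two-layer ReLU network's linear region is identified by the sign pattern of its hidden pre-activations, and a correction is applied only to inputs falling in the buggy input's region. This is a simplified illustration, not the paper's construction: the constant, region-gated offset used here is discontinuous at region boundaries, whereas the paper's patch network additionally preserves continuity. All names and the random toy weights are illustrative assumptions.

```python
import numpy as np

# Toy two-layer ReLU network; the weights are random stand-ins.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

def forward(x):
    """Original network: x -> W2 relu(W1 x + b1) + b2."""
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def activation_pattern(x):
    """Sign pattern of the hidden pre-activations. It identifies the
    linear region (polytope) of the piecewise linear map containing x."""
    return tuple(bool(v) for v in (W1 @ x + b1 > 0))

def repair(x_buggy, target):
    """Return a repaired network mapping x_buggy to target while leaving
    outputs in every other linear region untouched."""
    buggy_region = activation_pattern(x_buggy)
    delta = target - forward(x_buggy)  # correction needed at x_buggy
    def repaired(x):
        y = forward(x)
        # The network is affine within buggy_region, so a constant
        # offset there corrects x_buggy (and nearby points) at once.
        if activation_pattern(x) == buggy_region:
            y = y + delta
        return y
    return repaired

x_buggy = np.array([0.5, -0.3])
target = np.array([1.0])
fixed = repair(x_buggy, target)
```

Because the correction is gated on the activation pattern, any input outside the buggy input's polytope is evaluated by the unmodified network, which mirrors the locality guarantee described in the abstract.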


