An approach to reachability analysis for feed-forward ReLU neural networks

06/22/2017
by Alessio Lomuscio, et al.

We study the reachability problem for systems implemented as feed-forward neural networks whose activations are ReLU functions. We draw a correspondence between establishing whether an arbitrary output can ever be produced by such a network and the feasibility of linear programs characterising the network of interest. We present a methodology for solving cases of practical interest by means of a state-of-the-art linear programming solver, and we evaluate the technique by discussing experimental results obtained on a number of benchmarks from the literature.
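The reduction rests on a standard observation: once each ReLU is fixed to be active or inactive, the network collapses to an affine map, so reachability over that activation region becomes a set of linear constraints that an off-the-shelf LP solver can handle. The following is a minimal, hypothetical Python sketch of this correspondence for a toy 2-2-1 network (weights and names are illustrative, not from the paper; a full implementation would hand the per-region constraints to an LP solver):

```python
# Sketch: within a fixed ReLU activation pattern, a feed-forward ReLU
# network net(x) = W2 . relu(W1 x + b1) + b2 is affine, net(x) = A x + c.
# All weights below are illustrative toy values.

def relu(v):
    return [max(0.0, x) for x in v]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def vadd(a, b):
    return [x + y for x, y in zip(a, b)]

# Toy 2-2-1 network.
W1 = [[1.0, -1.0], [0.5, 2.0]]
b1 = [0.0, -1.0]
W2 = [[1.0, 1.0]]
b2 = [0.5]

def net(x):
    return vadd(matvec(W2, relu(vadd(matvec(W1, x), b1))), b2)

def affine_for_pattern(pattern):
    """Affine map (A, c) with net(x) = A x + c for every x whose hidden
    ReLUs fire exactly according to `pattern` (1 = active, 0 = inactive)."""
    # Inactive units contribute nothing, so zero out their rows of W1/b1;
    # active units pass their pre-activation through unchanged.
    W1p = [row[:] if on else [0.0] * len(row) for on, row in zip(pattern, W1)]
    b1p = [b if on else 0.0 for on, b in zip(pattern, b1)]
    A = matmul(W2, W1p)
    c = vadd(matvec(W2, b1p), b2)
    return A, c

x = [2.0, 0.5]                      # both hidden units active here
A, c = affine_for_pattern([1, 1])
print(net(x), vadd(matvec(A, x), c))  # the two values agree
```

Reachability of a given output then amounts to asking, per activation pattern, whether the linear system "x is in the input region, the pattern's sign constraints on W1 x + b1 hold, and A x + c equals the target" is feasible.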


Related research:

- Reachability Analysis for Feed-Forward Neural Networks using Face Lattices (03/02/2020): Deep neural networks have been widely applied as an effective approach t...
- Hybrid Zonotopes Exactly Represent ReLU Neural Networks (04/05/2023): We show that hybrid zonotopes offer an equivalent representation of feed...
- Formal Verification of CNN-based Perception Systems (11/28/2018): We address the problem of verifying neural-based perception systems impl...
- Abstraction based Output Range Analysis for Neural Networks (07/18/2020): In this paper, we consider the problem of output range analysis for feed...
- Expectation propagation: a probabilistic view of Deep Feed Forward Networks (05/22/2018): We present a statistical mechanics model of deep feed forward neural net...
- Gradient representations in ReLU networks as similarity functions (10/26/2021): Feed-forward networks can be interpreted as mappings with linear decisio...
- A Framework for the construction of upper bounds on the number of affine linear regions of ReLU feed-forward neural networks (06/05/2018): In this work we present a new framework to derive upper bounds on the nu...
