Hybrid Zonotopes Exactly Represent ReLU Neural Networks

04/05/2023
by Joshua Ortiz, et al.
We show that hybrid zonotopes offer an equivalent representation of feed-forward fully connected neural networks with ReLU activation functions. Our approach shows that the number of binary variables is equal to the total number of neurons in the network and hence grows linearly in the size of the network. We demonstrate the utility of the hybrid zonotope formulation through three case studies: nonlinear function approximation, MPC closed-loop reachability and verification, and robustness of classification on the MNIST dataset.
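
For intuition, the sketch below uses the standard hybrid zonotope definition from the literature together with the standard mixed-integer (big-M) encoding of a ReLU; it is illustrative only and is not the paper's exact construction, and the symbols l, u, and delta (a bounded pre-activation range and one binary indicator) are assumptions introduced here. A hybrid zonotope with n_g continuous and n_b binary generators is the set

\[
\mathcal{Z}_h \;=\; \left\{\, G^c \xi^c + G^b \xi^b + c \;\middle|\; \xi^c \in [-1,1]^{n_g},\ \xi^b \in \{-1,1\}^{n_b},\ A^c \xi^c + A^b \xi^b = b \,\right\},
\]

and a single ReLU neuron $y = \max(0, x)$ with bounded pre-activation $x \in [l, u]$, $l < 0 < u$, is captured exactly by one binary variable $\delta \in \{0,1\}$ through the linear constraints

\[
y \ge 0, \qquad y \ge x, \qquad y \le u\,\delta, \qquad y \le x - l\,(1-\delta),
\]

where $\delta = 0$ forces the inactive branch ($y = 0$, $x \le 0$) and $\delta = 1$ forces the active branch ($y = x$, $x \ge 0$). One binary variable per neuron is what makes growth linear in the total number of neurons plausible; the paper's zonotope-level construction is given in the full text.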


