Bisimulations for Neural Network Reduction

by Pavithra Prabhakar, et al.
Kansas State University

We present a notion of bisimulation that induces a reduced network semantically equivalent to a given neural network, and we provide a minimization algorithm that constructs the smallest bisimulation-equivalent network. Because reductions that preserve exact semantic equivalence are limited in how much they can shrink a network, we also present an approximate notion of bisimulation that guarantees semantic closeness rather than semantic equivalence, and we quantify the semantic deviation between approximately bisimilar networks. The approximate notion thus provides a trade-off between the amount of reduction and the deviation in semantics.
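To illustrate the exact (non-approximate) case, the following sketch merges hidden neurons that have identical incoming weights and biases: such neurons compute the same activation on every input, so collapsing them and summing their outgoing weights preserves the network's semantics. This is only a minimal illustration of the bisimulation intuition, not the paper's minimization algorithm; the function name and layer representation are illustrative assumptions.

```python
import numpy as np

def merge_equivalent_neurons(W_in, b, W_out):
    """Merge hidden neurons with identical incoming weights and bias.

    Such neurons produce identical activations on every input, so they
    can be collapsed into one neuron whose outgoing weights are the sum
    of the merged neurons' outgoing weights (illustrative sketch only).
    Shapes: W_in is (hidden, inputs), b is (hidden,), W_out is (outputs, hidden).
    """
    # Group neuron indices by their (incoming weights, bias) signature.
    groups = {}
    for j in range(W_in.shape[0]):
        key = (tuple(W_in[j]), float(b[j]))
        groups.setdefault(key, []).append(j)

    # Keep one representative per group of semantically identical neurons.
    reps = [idxs[0] for idxs in groups.values()]
    W_in_r = W_in[reps]
    b_r = b[reps]
    # Sum the outgoing weights over each merged group.
    W_out_r = np.stack(
        [W_out[:, idxs].sum(axis=1) for idxs in groups.values()], axis=1
    )
    return W_in_r, b_r, W_out_r
```

For any activation function applied elementwise (e.g. ReLU), the reduced layer yields exactly the same network output, since duplicated neurons contribute identical activations that the summed outgoing weights account for.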




Related research

- On Equivalence and Cores for Incomplete Databases in Open and Closed Worlds
- Approximate Bisimulation Relations for Neural Networks and Application to Assured Neural Network Compression
- On Neural Network Equivalence Checking using SMT Solvers
- A Strong Bisimulation for Control Operators by Means of Multiplicative and Exponential Reduction
- Representation Theorem for Matrix Product States
- Syzygies among reduction operators
- Towards Rigorous Understanding of Neural Networks via Semantics-preserving Transformations
