Seeing Convolution Through the Eyes of Finite Transformation Semigroup Theory: An Abstract Algebraic Interpretation of Convolutional Neural Networks

05/26/2019
by   Andrew Hryniowski, et al.

Researchers are actively trying to gain better insights into the representational properties of convolutional neural networks, both to guide better network designs and to interpret a network's computational nature. Gaining such insights can be an arduous task due to the number of parameters in a network and the complexity of a network's architecture. Current approaches to neural network interpretation include Bayesian probabilistic interpretations and information-theoretic interpretations. In this study, we take a different approach to studying convolutional neural networks by proposing an abstract algebraic interpretation using finite transformation semigroup theory. Specifically, convolutional layers are broken up and mapped to a finite space. The state space of the proposed finite transformation semigroup is defined as a single element within the convolutional layer, with the acting elements defined by surrounding state elements combined with convolution kernel elements. Generators of the finite transformation semigroup are defined to complete the interpretation. We leverage this approach to analyze the basic properties of the resulting finite transformation semigroup and to gain insights into the representational properties of convolutional neural networks, including quantized network representations. Such a finite transformation semigroup interpretation also enables understanding beyond the confines of fixed lattice data structures, making it useful for handling data that lie on irregular lattices. Furthermore, the proposed abstract algebraic interpretation is shown to be viable for interpreting convolutional operations within a variety of convolutional neural network architectures.
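To make the construction concrete, the sketch below builds a toy finite transformation semigroup in the spirit of the abstract: quantized activation values form a finite state set, each (kernel weight, neighbor state) pair induces a transformation on that set, and the semigroup is obtained by closing a set of generators under composition. The quantization level Q, the saturating transform, and the generator choices here are illustrative assumptions for demonstration, not the paper's exact definitions.

```python
# Illustrative sketch only: a toy finite transformation semigroup built
# from quantized "convolution" steps. Q, transform(), and the generator
# choices are assumptions for demonstration, not the paper's definitions.
from itertools import product

Q = 8  # hypothetical number of quantization levels; state set X = {0, ..., Q-1}

def transform(weight, neighbor):
    """Transformation on X induced by one kernel weight acting on one
    surrounding (neighbor) state element, saturating back into X."""
    return tuple(min(Q - 1, max(0, x + weight * neighbor)) for x in range(Q))

def compose(s, t):
    """Apply s first, then t: (t . s)(x) = t(s(x))."""
    return tuple(t[s[x]] for x in range(Q))

def generate_semigroup(generators):
    """Close a set of transformations under composition (breadth-first closure)."""
    elements = set(generators)
    frontier = set(generators)
    while frontier:
        new = set()
        for s, t in product(frontier, elements):
            for c in (compose(s, t), compose(t, s)):
                if c not in elements:
                    new.add(c)
        elements |= new
        frontier = new
    return elements

# Two hypothetical generators: (kernel weight, neighbor state) pairs.
gens = [transform(1, 1), transform(-1, 2)]
S = generate_semigroup(gens)
print(f"semigroup of {len(S)} transformations acting on {Q} states")
```

Under such a toy encoding, basic properties of the resulting semigroup (e.g., its size as Q varies) offer one rough, hedged way to probe how much transformational structure survives quantization.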


