Transformationally Identical and Invariant Convolutional Neural Networks through Symmetric Element Operators

06/10/2018
by   ShihChung B. Lo, et al.

Mathematically speaking, a transformationally invariant operator, such as a transformationally identical (TI) matrix kernel (i.e., K = TK), commutes with the transformation T itself when both operate on the first operand matrix. We found that by consistently applying the same type of TI kernel throughout a convolutional neural network (CNN) system, the commutative property holds across all convolution layers, with or without an activation function and/or a 1D convolution across channels within a layer. We further found that any CNN possessing the same TI kernel property in all convolution layers, followed by a flatten layer with weight sharing among its transformation-corresponding elements, outputs the same result for all transformation versions of the original input vector. In short, CNN[ Vi ] = CNN[ TVi ] provided every K = TK in the CNN, where Vi denotes the input vector and CNN[.] represents the whole CNN process as a function that maps an input vector to an output vector. With such a transformationally identical CNN (TI-CNN) system, each input vector inherently stands in for all of its corresponding transformation versions during training, without relying on a predefined set of transformations for data augmentation. Hence, using the same TI property for every kernel in the CNN serves as an orientation- or translation-independent guide for training, in conjunction with error backpropagation. This TI kernel property is desirable for applications that require a highly consistent output across corresponding transformation versions of an input. Several C programming routines are provided to help interested parties use the TI-CNN technique, which is expected to produce better generalization performance than its ordinary CNN counterpart.

