Transformationally Identical and Invariant Convolutional Neural Networks through Symmetric Element Operators

by Shih-Chung B. Lo, et al.

Mathematically speaking, a transformationally invariant operator, such as a transformationally identical (TI) matrix kernel (i.e., K = TK), commutes with the transformation T itself when the two operate on the first operand matrix. We found that by consistently applying the same type of TI kernel throughout a convolutional neural network (CNN), the commutative property holds across all convolution layers, with or without an activation function and/or a 1D convolution across channels within a layer. We further found that any CNN possessing this TI kernel property in all convolution layers, followed by a flatten layer with weight sharing among transformation-corresponding elements, outputs the same result for every transformation version of the original input vector. In short, CNN[Vi] = CNN[TVi], provided every K = TK in the CNN, where Vi denotes the input vector and CNN[.] represents the whole CNN process as a function mapping an input vector to an output vector. With such a transformationally identical CNN (TI-CNN) system, each input vector used in training inherently represents all of its corresponding transformation versions, making data augmentation over the transformations covered by the predefined TI unnecessary. Hence, using the same TI property for every kernel in the CNN serves as an orientation- or translation-independent training guide in conjunction with error backpropagation. This TI kernel property is desirable for applications requiring a highly consistent output across corresponding transformation versions of an input. Several C programming routines are provided to help interested parties use the TI-CNN technique, which is expected to produce better generalization performance than its ordinary CNN counterpart.


