Transformationally Identical and Invariant Convolutional Neural Networks through Symmetric Element Operators

06/10/2018
by ShihChung B. Lo, et al.

Mathematically speaking, a transformationally invariant operator, such as a transformationally identical (TI) matrix kernel (i.e., K = TK), commutes with the transformation T itself when the two operate on the first operand matrix. We found that by consistently applying the same type of TI kernel throughout a convolutional neural network (CNN) system, the commutative property holds across all convolution layers, with or without an activation function and/or a 1-D convolution across channels within a layer. We further found that any CNN whose convolution layers all possess the same TI kernel property, followed by a flatten layer with weights shared among their transformation-corresponding elements, outputs the same result for every transformed version of the original input vector. In short, CNN[ Vi ] = CNN[ TVi ] provided every K = TK in the CNN, where Vi denotes the input vector and CNN[.] represents the whole CNN process as a function of the input vector that produces an output vector. With such a transformationally identical CNN (TI-CNN) system, each input vector not already covered by a predefined TI used in data augmentation would inherently stand in for all of its corresponding transformed versions during training. Hence, using the same TI property for every kernel in the CNN serves as an orientation- or translation-independent training guide in conjunction with error backpropagation. This TI kernel property is desirable for applications requiring a highly consistent output across transformed versions of an input. Several C programming routines are provided to help interested parties use the TI-CNN technique, which is expected to produce better generalization performance than its ordinary CNN counterpart.

