Transformationally Identical and Invariant Convolutional Neural Networks by Combining Symmetric Operations or Input Vectors
Transformationally invariant processors constructed from transformed input vectors or operators have been proposed and applied in many settings. In this study, we found that combining the results of all sub-processes with their corresponding transformations at the final processing step is equivalent to combining all transformation versions at the beginning step, through a special algebraic operation property. This technique can be applied to most convolutional neural network (CNN) systems. Specifically, a transformationally identical CNN can be constructed by running internally symmetric operations in parallel over the same transformation family, followed by a flatten layer with weight sharing among the corresponding transformation elements. Such a CNN produces the same output for any transformed version of the original input vector. Interestingly, we found that this type of transformationally identical CNN, which combines symmetric operations at the flatten layer, is mathematically equivalent to an ordinary CNN that combines all transformation versions of the input vector at the input layer. Since the former is computationally demanding, its equivalent with a greatly simplified implementation is suggested.
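The input-layer variant can be sketched in a few lines. The sketch below, using NumPy and taking the dihedral group D4 (90° rotations and mirror flips) as the assumed transformation family and summation as one simple choice of symmetric combination, illustrates why combining all transformation versions at the input layer yields a transformationally identical system: because the group is closed under composition, the combined input is the same for any transformed version of the original, so any downstream CNN necessarily produces the same output. The function names here are illustrative, not from the paper.

```python
import numpy as np

def dihedral_transforms(x):
    """Yield all 8 versions of a square array under the dihedral
    group D4: rotations by 0/90/180/270 degrees, each with and
    without a horizontal mirror flip."""
    for k in range(4):
        r = np.rot90(x, k)
        yield r
        yield np.fliplr(r)

def combine_inputs(x):
    """Combine all transformation versions of the input at the
    input layer by summation (one order-independent choice of
    symmetric combination)."""
    return sum(dihedral_transforms(x))

# Closure of D4 under composition means summing over the whole
# group gives an identical result for any transformed input:
x = np.random.rand(8, 8)
assert np.allclose(combine_inputs(x), combine_inputs(np.rot90(x, 1)))
assert np.allclose(combine_inputs(x), combine_inputs(np.fliplr(x)))
```

Feeding `combine_inputs(x)` to an ordinary CNN therefore gives one forward pass instead of running eight symmetric sub-processes in parallel, which is the simplified implementation the equivalence suggests.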