Cyclic orthogonal convolutions for long-range integration of features

12/11/2020
by Federica Freddi, et al.

In Convolutional Neural Networks (CNNs), information flows across a small neighbourhood of each pixel of an image, preventing long-range integration of features before reaching deep layers in the network. We propose a novel architecture that allows flexible information flow between features z and locations (x,y) across the entire image with a small number of layers. This architecture uses a cycle of three orthogonal convolutions, not only in (x,y) coordinates, but also in (x,z) and (y,z) coordinates. We stack a sequence of such cycles to obtain our deep network, named CycleNet. Since a cycle requires only permutations of the axes of a standard convolution, its performance can be compared directly to that of a CNN. Our model obtains competitive results on image classification on the CIFAR-10 and ImageNet datasets when compared to CNNs of similar size. We hypothesise that long-range integration favours recognition of objects by shape rather than texture, and we show that CycleNet transfers better than CNNs to stylised images. On the Pathfinder challenge, where integration of distant features is crucial, CycleNet outperforms CNNs by a large margin. We also show that, even with a small convolutional kernel, the receptive fields of CycleNet reach their maximum size after a single cycle, whereas conventional CNNs require a large number of layers.
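The core idea, alternating convolutions over the (x,y), (x,z), and (y,z) planes by permuting tensor axes around an ordinary 2D convolution, can be sketched in a few lines. The following is an illustrative sketch, not the authors' implementation: it uses a naive unoptimised convolution, a single shared kernel per step, and omits nonlinearities, normalisation, and learned channel mixing.

```python
import numpy as np

def conv2d_same(feat, kernel):
    """Naive 'same'-padded 2D convolution, applied independently to
    each slice along the first (channel) axis. Illustrative only."""
    c, h, w = feat.shape
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(feat, ((0, 0), (ph, ph), (pw, pw)))
    out = np.zeros_like(feat)
    for i in range(h):
        for j in range(w):
            # Weighted sum over the kernel window at (i, j).
            out[:, i, j] = np.sum(padded[:, i:i + kh, j:j + kw] * kernel,
                                  axis=(1, 2))
    return out

def cycle(feat, kernels):
    """One CycleNet-style cycle on a tensor with axis order (z, y, x),
    i.e. (features, rows, cols). Each step permutes the axes so a
    different pair of coordinates becomes the spatial plane."""
    # Convolve over (x, y): z is the channel axis.
    feat = conv2d_same(feat, kernels[0])
    # Convolve over (x, z): move y to the channel axis and back.
    feat = conv2d_same(feat.transpose(1, 0, 2), kernels[1]).transpose(1, 0, 2)
    # Convolve over (y, z): move x to the channel axis and back.
    feat = conv2d_same(feat.transpose(2, 1, 0), kernels[2]).transpose(2, 1, 0)
    return feat
```

Because the second and third convolutions mix information along the feature axis z, every output position can, after a single cycle, depend on features from the whole tensor even with small kernels, which is the mechanism behind the receptive-field claim above.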


