
RecNets: Channel-wise Recurrent Convolutional Neural Networks

by George Retsinas et al.
National Technical University of Athens

In this paper, we introduce Channel-wise recurrent convolutional neural networks (RecNets), a family of novel, compact neural network architectures for computer vision tasks inspired by recurrent neural networks (RNNs). RecNets build upon Channel-wise recurrent convolutional (CRC) layers, a novel type of convolutional layer that splits the input channels into disjoint segments and processes them in a recurrent fashion. In this way, we simulate wide, yet compact models, since the number of parameters is vastly reduced via the parameter sharing of the RNN formulation. Experimental results on the CIFAR-10 and CIFAR-100 image classification tasks demonstrate the superior size-accuracy trade-off of RecNets compared to other compact state-of-the-art architectures.
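The compactness claim rests on parameter sharing: one set of convolutional weights is reused across the channel segments, as in an RNN cell unrolled over time. The following back-of-the-envelope sketch illustrates the resulting parameter reduction under a deliberately simplified parameterization (it is not the paper's exact CRC formulation; the segment count and the assumption of a single shared kernel are illustrative):

```python
# Hypothetical parameter-count comparison: a standard convolutional layer
# versus a channel-wise recurrent layer that reuses one shared kernel
# across disjoint input-channel segments (simplified illustration only).

def conv_params(c_in, c_out, k):
    # Standard convolution: one k x k kernel per (input, output) channel pair.
    return c_in * c_out * k * k

def crc_params(c_in, c_out, k, segments):
    # Simplified CRC sketch: input channels are split into `segments`
    # disjoint groups, and a single shared kernel (the "RNN cell") is
    # applied to each group in turn, so only one group's worth of
    # weights is stored.
    seg_in = c_in // segments
    return seg_in * c_out * k * k

standard = conv_params(256, 256, 3)            # 589,824 parameters
shared = crc_params(256, 256, 3, segments=4)   # 147,456 parameters
print(standard, shared, standard / shared)     # 4x fewer parameters
```

Under these assumptions, sharing one kernel across 4 segments cuts the layer's parameter count by the segment count, which is the mechanism behind the "wide, yet compact" behavior described above.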



