Convolutional Neural Networks with Alternately Updated Clique

02/28/2018
by Yibo Yang, et al.

Improving information flow in deep networks helps to ease training difficulties and utilize parameters more efficiently. Here we propose a new convolutional neural network architecture with an alternately updated clique (CliqueNet). In contrast to prior networks, there are both forward and backward connections between any two layers in the same block. The layers are constructed as a loop and are updated alternately. The CliqueNet has some unique properties. Each layer is both the input and output of any other layer in the same block, so the information flow among layers is maximized. During propagation, the newly updated layers are concatenated to re-update previously updated layers, and parameters are reused multiple times. This recurrent feedback structure brings higher-level visual information back to refine low-level filters and achieves spatial attention. We analyze the features generated at different stages and observe that using refined features leads to a better result. We adopt a multi-scale feature strategy that effectively avoids the progressive growth of parameters. Experiments on image recognition datasets including CIFAR-10, CIFAR-100, SVHN and ImageNet show that our proposed models achieve state-of-the-art performance with fewer parameters.
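The alternate-update loop described in the abstract can be sketched in a few lines. The following is a minimal, hypothetical numpy sketch (not the paper's implementation): 1-D vectors stand in for feature maps, plain matrix products stand in for convolutions, and `W[j][i]` is the reused weight taking layer j's features into layer i's update. The point it illustrates is that each layer is re-computed from the most recent versions of all the other layers in the block, so freshly updated layers feed back into earlier ones.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def clique_update(layers, W, n_loops=1):
    """Alternately update each layer from all other layers in the block.

    layers: list of feature vectors X_1..X_n (toy stand-ins for feature maps)
    W: W[j][i] is the weight reused whenever layer j contributes to layer i
    n_loops: number of alternate-update passes over the block
    """
    n = len(layers)
    for _ in range(n_loops):
        for i in range(n):
            # Layer i is re-computed from the *current* state of every other
            # layer; layers updated earlier in this pass already carry the
            # new information, giving the recurrent feedback the text describes.
            pre = sum(W[j][i] @ layers[j] for j in range(n) if j != i)
            layers[i] = relu(pre)
    return layers

# Toy usage: 3 layers with 4-dimensional features, 2 update passes.
rng = np.random.default_rng(0)
d, n = 4, 3
W = [[rng.standard_normal((d, d)) * 0.1 for _ in range(n)] for _ in range(n)]
layers = [rng.standard_normal(d) for _ in range(n)]
out = clique_update(layers, W, n_loops=2)
```

Because the same `W[j][i]` is applied on every pass, extra passes refine the features without adding parameters, which is the parameter-reuse property the abstract highlights.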

Related research

- 11/25/2017: Gradually Updated Neural Networks for Large-Scale Image Recognition. We present a simple yet effective neural network architecture for image ...
- 04/30/2019: GaborNet: Gabor filters with learnable parameters in deep convolutional neural networks. The article describes a system for image recognition using deep convolut...
- 11/17/2016: DelugeNets: Deep Networks with Efficient and Flexible Cross-layer Information Inflows. Deluge Networks (DelugeNets) are deep neural networks which efficiently ...
- 06/14/2023: WavPool: A New Block for Deep Neural Networks. Modern deep neural networks comprise many operational layers, such as de...
- 11/27/2019: Decision Propagation Networks for Image Classification. High-level (e.g., semantic) features encoded in the latter layers of con...
- 07/11/2014: Deep Networks with Internal Selective Attention through Feedback Connections. Traditional convolutional neural networks (CNN) are stationary and feedf...
- 07/02/2018: Evenly Cascaded Convolutional Networks. In this paper we demonstrate that state-of-the-art convolutional neural ...
