Recurrent Iterative Gating Networks for Semantic Segmentation

11/20/2018
by Rezaul Karim, et al.

In this paper, we present an approach for Recurrent Iterative Gating called RIGNet. The core elements of RIGNet are recurrent connections that control the flow of information in neural networks in a top-down manner, and we consider several variants of this core structure. The iterative nature of the mechanism allows gating to spread in both spatial extent and feature space, which proves to be a powerful mechanism that is broadly compatible with common existing networks. Analysis shows how gating interacts with different network characteristics, and we also show that shallower networks with gating can be made to outperform much deeper networks that lack RIGNet modules.
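For intuition only, the sketch below shows the general idea in PyTorch; all module names, stage definitions, and channel sizes are hypothetical illustrations, not the authors' released RIGNet code. A sigmoid gate computed from deeper, more semantic features multiplicatively modulates shallower features, and the bottom-up pass is repeated so the gating can propagate over iterations.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class IterativeGate(nn.Module):
    """Top-down gate: deeper, more semantic features modulate shallower ones."""
    def __init__(self, deep_ch, shallow_ch):
        super().__init__()
        # 1x1 conv maps deep features to a per-pixel, per-channel gate
        self.gate_conv = nn.Conv2d(deep_ch, shallow_ch, kernel_size=1)

    def forward(self, shallow_feat, deep_feat):
        # Bring deep features up to the shallow feature resolution
        deep_up = F.interpolate(deep_feat, size=shallow_feat.shape[-2:],
                                mode="bilinear", align_corners=False)
        gate = torch.sigmoid(self.gate_conv(deep_up))  # values in (0, 1)
        return shallow_feat * gate                     # multiplicative gating

class TwoStageRIG(nn.Module):
    """Two encoder stages re-run under top-down gating for num_iters passes
    (a hypothetical toy encoder, not a network from the paper)."""
    def __init__(self, in_ch=3, mid_ch=64, deep_ch=128, num_iters=3):
        super().__init__()
        self.stage1 = nn.Sequential(
            nn.Conv2d(in_ch, mid_ch, 3, padding=1), nn.ReLU(inplace=True))
        self.stage2 = nn.Sequential(
            nn.Conv2d(mid_ch, deep_ch, 3, stride=2, padding=1), nn.ReLU(inplace=True))
        self.gate = IterativeGate(deep_ch, mid_ch)
        self.num_iters = num_iters

    def forward(self, x):
        f1 = self.stage1(x)          # first bottom-up pass, ungated
        f2 = self.stage2(f1)
        for _ in range(self.num_iters - 1):
            # Each extra iteration gates the shallow features with the previous
            # iteration's deep features, then recomputes the deep features.
            f1 = self.gate(self.stage1(x), f2)
            f2 = self.stage2(f1)
        return f2

# Usage: one forward pass over a dummy batch
feats = TwoStageRIG()(torch.randn(1, 3, 64, 64))
```

In the paper the gating spans the blocks of standard segmentation backbones rather than a toy two-stage encoder; this sketch only illustrates the recurrence by which top-down modulation spreads across iterations.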

Related research

Distributed Iterative Gating Networks for Semantic Segmentation (09/28/2019)
In this paper, we present a canonical structure for controlling informat...

Learning Hierarchical Information Flow with Recurrent Neural Modules (06/18/2017)
We propose ThalNet, a deep learning model inspired by neocortical commun...

On non-iterative training of a neural classifier (12/14/2015)
Recently, an algorithm was discovered which separates points in n-dimen...

Interpretation of Feature Space using Multi-Channel Attentional Sub-Networks (04/30/2019)
Convolutional Neural Networks have achieved impressive results in variou...

Deep Layer Aggregation (07/20/2017)
Convolutional networks have had great success in image classification an...

Multigrid Neural Architectures (11/23/2016)
We propose a multigrid extension of convolutional neural networks (CNNs)...

Feature Binding with Category-Dependant MixUp for Semantic Segmentation and Adversarial Robustness (08/13/2020)
In this paper, we present a strategy for training convolutional neural n...
