DIANet: Dense-and-Implicit Attention Network

05/25/2019
by Zhongzhan Huang, et al.

Attention-based deep neural networks (DNNs), which emphasize informative features within a local receptive field of an input image, have successfully boosted the performance of deep learning on various challenging problems. In this paper, we propose a Dense-and-Implicit-Attention (DIA) unit that can be applied universally to different network architectures and enhances their generalization capacity by repeatedly fusing information across network layers. Communication between layers is carried out by a modified Long Short-Term Memory (LSTM) module within the DIA unit, which runs in parallel with the DNN. The shared DIA unit links multi-scale features from different depth levels of the network implicitly and densely. Experiments on benchmark datasets show that the DIA unit is capable of emphasizing channel-wise feature interrelations and leads to significant improvements in image classification accuracy. We further show empirically that the DIA unit is a nonlocal normalization tool that enhances Batch Normalization. The code is released at https://github.com/gbup-group/DIANet.
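The core mechanism described above, one LSTM module shared across all layers, consuming each layer's pooled channel descriptor and emitting channel-wise attention weights, can be sketched in pure Python. This is a minimal illustration, not the authors' implementation: the weight initialization, channel count, and gating-by-sigmoid-of-hidden-state are assumptions made for the sketch; see the released code at the repository above for the actual model.

```python
import math
import random

random.seed(0)

def matvec(W, x):
    """Multiply an n-by-n weight matrix (list of rows) by a vector."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class SharedLSTMAttention:
    """Sketch of the DIA idea: a single LSTM cell is shared by every
    layer of the backbone. Each layer's globally pooled channel
    descriptor updates the same hidden state, so the attention at one
    depth implicitly depends on features from all earlier depths."""

    def __init__(self, channels):
        n = channels
        # Four LSTM gates (input, forget, candidate, output), each with
        # input and recurrent weights; tiny random init, illustrative only.
        def rand_mat():
            return [[random.uniform(-0.1, 0.1) for _ in range(n)]
                    for _ in range(n)]
        self.Wi, self.Ui = rand_mat(), rand_mat()
        self.Wf, self.Uf = rand_mat(), rand_mat()
        self.Wg, self.Ug = rand_mat(), rand_mat()
        self.Wo, self.Uo = rand_mat(), rand_mat()
        self.h = [0.0] * n     # hidden state carried across layers
        self.cell = [0.0] * n  # cell state carried across layers

    def __call__(self, pooled):
        """pooled: channel-wise global-average-pooled features of one layer.
        Returns one attention weight in (0, 1) per channel."""
        i = [sigmoid(a + b) for a, b in zip(matvec(self.Wi, pooled),
                                            matvec(self.Ui, self.h))]
        f = [sigmoid(a + b) for a, b in zip(matvec(self.Wf, pooled),
                                            matvec(self.Uf, self.h))]
        g = [math.tanh(a + b) for a, b in zip(matvec(self.Wg, pooled),
                                              matvec(self.Ug, self.h))]
        o = [sigmoid(a + b) for a, b in zip(matvec(self.Wo, pooled),
                                            matvec(self.Uo, self.h))]
        self.cell = [ff * cc + ii * gg
                     for ff, cc, ii, gg in zip(f, self.cell, i, g)]
        self.h = [oo * math.tanh(cc) for oo, cc in zip(o, self.cell)]
        # Squash the hidden state into channel-wise gates.
        return [sigmoid(hh) for hh in self.h]

# Usage: one shared unit attends over pooled features from three "layers".
dia = SharedLSTMAttention(channels=4)
layer_features = [[0.5, -1.0, 0.2, 0.8],
                  [1.2, 0.3, -0.4, 0.0],
                  [0.1, 0.9, -0.7, 0.5]]
for pooled in layer_features:
    gates = dia(pooled)
    recalibrated = [p * g for p, g in zip(pooled, gates)]
```

Because the hidden and cell states persist between calls, the gates produced at the third "layer" are conditioned on the descriptors of the first two, which is what makes the attention dense and implicit rather than purely layer-local.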

Related research

10/27/2022: Layer-wise Shared Attention Network on Dynamical System Perspective
Attention networks have successfully boosted accuracy in various vision ...

09/16/2015: Guiding Long-Short Term Memory for Image Caption Generation
In this work we focus on the problem of image caption generation. We pro...

08/12/2019: Instance Enhancement Batch Normalization: an Adaptive Regulator of Batch Noise
Batch Normalization (BN) (Ioffe and Szegedy 2015) normalizes the feature...

09/23/2021: DeepRare: Generic Unsupervised Visual Attention Models
Human visual system is modeled in engineering field providing feature-en...

09/21/2022: SDA-xNet: Selective Depth Attention Networks for Adaptive Multi-scale Feature Representation
Existing multi-scale solutions lead to a risk of just increasing the rec...

08/04/2019: Attentive Normalization
Batch Normalization (BN) is a vital pillar in the development of deep le...

08/07/2017: MemNet: A Persistent Memory Network for Image Restoration
Recently, very deep convolutional neural networks (CNNs) have been attra...
