Attention in Attention Network for Image Super-Resolution

04/19/2021
by Haoyu Chen, et al.

Convolutional neural networks have enabled remarkable advances in single image super-resolution (SISR) over the last decade. Among recent SISR advances, attention mechanisms have been crucial to high-performance SR models, yet few works examine why and how attention actually works. In this work, we quantify and visualize static attention mechanisms and show that not all attention modules are equally beneficial. We then propose the attention in attention network (A^2N) for highly accurate image SR. Specifically, A^2N consists of a non-attention branch and a coupling attention branch. A proposed attention dropout module generates dynamic attention weights for these two branches based on the input features, which can suppress unwanted attention adjustments. This allows attention modules to specialize to beneficial examples without penalizing the others, greatly improving the capacity of the attention network with little parameter overhead. Experiments demonstrate that our model achieves a superior performance trade-off compared with state-of-the-art lightweight networks. Experiments on local attribution maps further show that the attention in attention (A^2) structure can extract features from a wider range of the input.
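The two-branch idea in the abstract can be sketched in PyTorch: a plain convolutional branch and an attention branch are mixed by dynamic weights that an "attention dropout" module predicts from the input features. The layer sizes, the pixel-attention form, and all names here are assumptions for illustration, not the paper's exact architecture.

```python
import torch
import torch.nn as nn


class A2Block(nn.Module):
    """Minimal sketch of an attention-in-attention block (assumed design).

    A non-attention branch and an attention branch are combined with
    per-sample dynamic weights, so the network can down-weight the
    attention branch on inputs where attention would hurt.
    """

    def __init__(self, channels: int):
        super().__init__()
        # Non-attention branch: a plain convolution.
        self.plain = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Attention branch: conv features modulated by a sigmoid pixel mask.
        self.attn_feat = nn.Conv2d(channels, channels, 3, padding=1)
        self.attn_mask = nn.Sequential(
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid(),
        )
        # "Attention dropout": predict two branch weights per sample from
        # globally pooled input features; softmax makes them compete, so a
        # low weight effectively suppresses the attention branch.
        self.dropout = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(channels, 2),
            nn.Softmax(dim=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.dropout(x)                        # (N, 2) dynamic weights
        plain = self.plain(x)
        attn = self.attn_feat(x) * self.attn_mask(x)
        w0 = w[:, 0].view(-1, 1, 1, 1)
        w1 = w[:, 1].view(-1, 1, 1, 1)
        return x + w0 * plain + w1 * attn          # residual mix of branches
```

Because the mixing weights are produced by a small pooled MLP, the parameter overhead over a fixed two-branch block is only `channels * 2 + 2` weights per block.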

Related Research

11/29/2018

RAM: Residual Attention Module for Single Image Super-Resolution

Attention mechanisms are a design trend of deep neural networks that sta...
11/22/2020

Interpreting Super-Resolution Networks with Local Attribution Maps

Image super-resolution (SR) techniques have been developing rapidly, ben...
08/20/2020

Single Image Super-Resolution via a Holistic Attention Network

Informative features play a crucial role in the single image super-resol...
10/23/2021

Dense Dual-Attention Network for Light Field Image Super-Resolution

Light field (LF) images can be used to improve the performance of image ...
02/26/2022

Multi-image Super-resolution via Quality Map Associated Temporal Attention Network

With the rising interest in deep learning-based methods in remote sensin...
06/13/2021

Pyramidal Dense Attention Networks for Lightweight Image Super-Resolution

Recently, deep convolutional neural network methods have achieved an exc...
08/02/2021

Finding Discriminative Filters for Specific Degradations in Blind Super-Resolution

Recent blind super-resolution (SR) methods typically consist of two bran...

Code Repositories

A2N

PyTorch code for our paper "Attention in Attention Network for Image Super-Resolution"

