Deep Reinforced Attention Learning for Quality-Aware Visual Recognition

07/13/2020
by Duo Li, et al.

In this paper, we build upon the weakly-supervised generation mechanism of intermediate attention maps in any convolutional neural network and expose the effectiveness of attention modules more directly in order to fully exploit their potential. Given an existing neural network equipped with arbitrary attention modules, we introduce a meta critic network to evaluate the quality of the attention maps produced in the main network. Due to the discreteness of our designed reward, the proposed learning method is arranged in a reinforcement learning setting, where the attention actors and recurrent critics are alternately optimized to provide instant critique and revision of the temporary attention representation; we therefore coin it Deep REinforced Attention Learning (DREAL). DREAL can be applied universally to network architectures with different types of attention modules, and it promotes their expressive ability by maximizing the relative gain in final recognition performance contributed by each individual attention module, as demonstrated by extensive experiments on both category and instance recognition benchmarks.
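
To make the alternating actor-critic optimization described above more concrete, here is a minimal PyTorch-style sketch. It assumes an SE-style channel-attention module as the attention actor, an LSTM critic that scores attention maps, and a binary reward equal to 1 when the attended prediction beats the attention-free one; all names (AttentionActor, RecurrentCritic, TinyAttnNet), shapes, and the exact reward and loss definitions are illustrative assumptions, not the authors' released implementation.

```python
# Illustrative sketch only: module names, shapes, reward, and losses are
# assumptions, not the authors' released DREAL implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionActor(nn.Module):
    """SE-style channel attention playing the role of an 'attention actor'."""

    def __init__(self, channels, reduction=8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))      # global pool -> per-channel gates
        return x * w[:, :, None, None], w    # reweighted features, attention map


class RecurrentCritic(nn.Module):
    """LSTM critic scoring the quality of a sequence of attention maps."""

    def __init__(self, channels, hidden=64):
        super().__init__()
        self.rnn = nn.LSTM(channels, hidden, batch_first=True)
        self.value = nn.Linear(hidden, 1)

    def forward(self, attn_seq):             # (B, T, C), one step per module
        h, _ = self.rnn(attn_seq)
        return self.value(h[:, -1])          # predicted quality, shape (B, 1)


class TinyAttnNet(nn.Module):
    """Toy backbone with a single attention actor, standing in for a CNN."""

    def __init__(self, channels=32, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.actor = AttentionActor(channels)
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x, use_attention=True):
        f = F.relu(self.stem(x))
        w = None
        if use_attention:
            f, w = self.actor(f)
        return self.head(f.mean(dim=(2, 3))), w


def reward(logits_attn, logits_plain, target):
    """Binary reward: 1 if attention lowers the per-sample loss, else 0."""
    gain = (F.cross_entropy(logits_plain, target, reduction="none")
            - F.cross_entropy(logits_attn, target, reduction="none"))
    return (gain > 0).float().unsqueeze(1)


net, critic = TinyAttnNet(), RecurrentCritic(channels=32)
opt_net = torch.optim.SGD(net.parameters(), lr=0.1, momentum=0.9)
opt_critic = torch.optim.Adam(critic.parameters(), lr=1e-3)

x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
logits_attn, w = net(x)
with torch.no_grad():                        # attention-free counterpart
    logits_plain, _ = net(x, use_attention=False)
r = reward(logits_attn, logits_plain, y)

# Critic step: regress the predicted quality onto the observed reward.
critic_loss = F.mse_loss(critic(w.detach().unsqueeze(1)), r)
opt_critic.zero_grad()
critic_loss.backward()
opt_critic.step()

# Actor step: classification loss plus the critic's (negated) critique.
actor_loss = F.cross_entropy(logits_attn, y) - critic(w.unsqueeze(1)).mean()
opt_net.zero_grad()
actor_loss.backward()
opt_net.step()
```

In this reading, the critic is fit to the observed reward while the actor and backbone are trained to maximize the critic's predicted quality on top of the usual classification loss; the paper's actual reward shaping and critic recurrence across multiple attention stages may differ.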

Related research

03/09/2020  Dual-attention Guided Dropblock Module for Weakly Supervised Object Localization
In this paper, we propose a dual-attention guided dropblock module, and ...

11/28/2020  Efficient Attention Network: Accelerate Attention by Searching Where to Plug
Recently, many plug-and-play self-attention modules are proposed to enha...

04/05/2019  Paying More Attention to Motion: Attention Distillation for Learning Video Representations
We address the challenging problem of learning motion representations us...

03/28/2021  BA^2M: A Batch Aware Attention Module for Image Classification
The attention mechanisms have been employed in Convolutional Neural Netw...

01/12/2017  Modularized Morphing of Neural Networks
In this work we study the problem of network morphism, an effective lear...

08/01/2016  Top-down Neural Attention by Excitation Backprop
We aim to model the top-down attention of a Convolutional Neural Network...

01/07/2023  ExcelFormer: A Neural Network Surpassing GBDTs on Tabular Data
Though neural networks have achieved enormous breakthroughs on various f...
