Towards Better Guided Attention and Human Knowledge Insertion in Deep Convolutional Neural Networks

10/20/2022
by   Ankit Gupta, et al.

Attention Branch Networks (ABNs) have been shown to simultaneously provide visual explanations and improve the performance of deep convolutional neural networks (CNNs). In this work, we introduce Multi-Scale Attention Branch Networks (MSABN), which increase the resolution of the generated attention maps and improve performance. We evaluate MSABN on benchmark image recognition and fine-grained recognition datasets, where it outperforms both ABN and the baseline models. We also introduce a new data augmentation strategy that uses the attention maps to incorporate human knowledge in the form of bounding box annotations of the objects of interest. We show that even with a limited number of edited samples, this strategy yields a significant performance gain.
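To illustrate the attention-branch idea the abstract builds on, the sketch below shows a minimal PyTorch-style attention branch that derives a spatial attention map from intermediate CNN features and reweights those features with it. This is a hedged illustration of the general mechanism, not the authors' MSABN architecture: the module name, layer choices, and the residual reweighting scheme are assumptions made for clarity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionBranch(nn.Module):
    """Minimal attention-branch block (illustrative, single scale):
    produces a spatial attention map from intermediate features and
    reweights those features with it."""

    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        # Per-class response maps, in the spirit of class-activation-map branches.
        self.class_conv = nn.Conv2d(in_channels, num_classes, kernel_size=1)
        # Collapse the class responses into a single-channel attention map.
        self.attn_conv = nn.Conv2d(num_classes, 1, kernel_size=1)

    def forward(self, features: torch.Tensor):
        class_maps = self.class_conv(features)             # (B, K, H, W)
        attn = torch.sigmoid(self.attn_conv(class_maps))   # (B, 1, H, W)
        # Residual attention: keep the original features and add the attended ones.
        attended = features * attn + features
        # Auxiliary logits from the attention branch via global average pooling.
        branch_logits = F.adaptive_avg_pool2d(class_maps, 1).flatten(1)
        return attended, attn, branch_logits
```

In the multi-scale variant the abstract describes, such branches would presumably operate on feature maps at several resolutions to yield higher-resolution attention maps; the sketch above covers only a single scale.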


Related research:

12/25/2018
Attention Branch Network: Learning of Attention Mechanism for Visual Explanation
Visual explanation enables human to understand the decision making of De...

10/23/2021
Attend and Guide (AG-Net): A Keypoints-driven Attention-based Deep Network for Image Recognition
This paper presents a novel keypoints-based attention mechanism for visu...

08/01/2023
Fine-Grained Sports, Yoga, and Dance Postures Recognition: A Benchmark Analysis
Human body-pose estimation is a complex problem in computer vision. Rece...

09/05/2022
SR-GNN: Spatial Relation-aware Graph Neural Network for Fine-Grained Image Categorization
Over the past few years, a significant progress has been made in deep co...

01/17/2021
Context-aware Attentional Pooling (CAP) for Fine-grained Visual Classification
Deep convolutional neural networks (CNNs) have shown a strong ability in...

02/18/2021
Deep Miner: A Deep and Multi-branch Network which Mines Rich and Diverse Features for Person Re-identification
Most recent person re-identification approaches are based on the use of ...

01/27/2022
LAP: An Attention-Based Module for Faithful Interpretation and Knowledge Injection in Convolutional Neural Networks
Despite the state-of-the-art performance of deep convolutional neural ne...
