The Costs and Benefits of Goal-Directed Attention in Deep Convolutional Neural Networks

02/06/2020
by Xiaoliang Luo, et al.

Attention in machine learning is largely bottom-up, whereas people also deploy top-down, goal-directed attention. Motivated by neuroscience research, we evaluated a plug-and-play, top-down attention layer that is easily added to existing deep convolutional neural networks (DCNNs). In object recognition tasks, increasing top-down attention has benefits (increasing hit rates) and costs (increasing false alarm rates). At a moderate level, attention improves sensitivity (i.e., increases d′) at only a moderate increase in bias for tasks involving standard images, blended images, and natural adversarial images. These theoretical results suggest that top-down attention can effectively reconfigure general-purpose DCNNs to better suit the current task goal. We hope our results continue the fruitful dialog between neuroscience and machine learning.
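The abstract frames attention's benefit/cost trade-off in signal-detection terms: sensitivity d′ = z(hit rate) − z(false-alarm rate), with bias c = −(z(hit) + z(fa))/2. A minimal sketch of that standard calculation (the hit/false-alarm numbers below are illustrative, not taken from the paper):

```python
from statistics import NormalDist

def dprime_and_bias(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Signal-detection sensitivity (d') and criterion (c)
    from hit and false-alarm rates."""
    z = NormalDist().inv_cdf          # inverse standard-normal CDF
    z_hit, z_fa = z(hit_rate), z(fa_rate)
    d_prime = z_hit - z_fa            # sensitivity
    c = -0.5 * (z_hit + z_fa)         # response bias (negative = liberal)
    return d_prime, c

# Raising top-down attention raises both hits and false alarms;
# d' still improves when hits grow faster (in z-units) than false alarms.
baseline = dprime_and_bias(0.70, 0.30)   # hypothetical: no attention
attended = dprime_and_bias(0.90, 0.40)   # hypothetical: moderate attention
```

With these illustrative rates, d′ rises (≈1.05 → ≈1.53) while the criterion shifts liberal, mirroring the paper's point that moderate attention buys sensitivity at a modest cost in bias.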


research
08/07/2023

Optimal Approximation and Learning Rates for Deep Convolutional Neural Networks

This paper focuses on approximation and learning performance analysis fo...
research
04/29/2022

Equine radiograph classification using deep convolutional neural networks

Purpose: To assess the capability of deep convolutional neural networks ...
research
11/22/2017

Context Augmentation for Convolutional Neural Networks

Recent enhancements of deep convolutional neural networks (ConvNets) emp...
research
06/22/2017

A Useful Motif for Flexible Task Learning in an Embodied Two-Dimensional Visual Environment

Animals (especially humans) have an amazing ability to learn new tasks q...
research
07/31/2017

Capacity limitations of visual search in deep convolutional neural network

Deep convolutional neural networks follow roughly the architecture of bi...
research
10/23/2015

Confusing Deep Convolution Networks by Relabelling

Deep convolutional neural networks have become the gold standard for ima...
research
04/07/2023

Attention: Marginal Probability is All You Need?

Attention mechanisms are a central property of cognitive systems allowin...
