Augmenting Convolutional networks with attention-based aggregation

12/27/2021
by Hugo Touvron et al.

We show how to augment any convolutional network with an attention-based global map to achieve non-local reasoning. We replace the final average pooling with an attention-based aggregation layer, akin to a single transformer block, that weights how the patches are involved in the classification decision. We pair this learned aggregation layer with a simple patch-based convolutional network parametrized by just two parameters (width and depth). In contrast with a pyramidal design, this architecture family maintains the input patch resolution across all layers. It yields surprisingly competitive trade-offs between accuracy and complexity, in particular in terms of memory consumption, as shown by our experiments on various computer vision tasks: object classification, image segmentation and detection.
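To make the idea concrete, here is a minimal PyTorch sketch of such an attention-based aggregation layer: a single learned query attends over the patch tokens produced by a convolutional trunk, replacing global average pooling. The module name `AttentionPooling`, the single-head choice, and the patchify stem in the usage snippet are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class AttentionPooling(nn.Module):
    """Attention-based aggregation: one learned query weights the patches."""

    def __init__(self, dim: int, num_classes: int):
        super().__init__()
        # Learned query ("class token"). Initialized at zero, so the softmax
        # attention starts uniform and the layer begins close to average pooling.
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.attn = nn.MultiheadAttention(dim, num_heads=1, batch_first=True)
        self.norm = nn.LayerNorm(dim)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, C, H, W) feature map from the convolutional trunk.
        B, C, H, W = feats.shape
        patches = feats.flatten(2).transpose(1, 2)   # (B, H*W, C) patch tokens
        query = self.cls_token.expand(B, -1, -1)     # one query per image
        pooled, _ = self.attn(query, patches, patches)  # weighted sum of patches
        return self.head(self.norm(pooled.squeeze(1)))  # (B, num_classes)

# Hypothetical usage with a simple patchify stem (16x16 patches):
trunk = nn.Sequential(nn.Conv2d(3, 256, kernel_size=16, stride=16), nn.GELU())
pool = AttentionPooling(dim=256, num_classes=1000)
logits = pool(trunk(torch.randn(2, 3, 224, 224)))    # shape: (2, 1000)
```

In this sketch the attention weights over patches are a byproduct of the forward pass, which is what allows the aggregation layer to expose how much each patch contributes to the classification decision.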


Related research

09/16/2022 - Self-Attentive Pooling for Efficient Deep Learning
Efficient custom pooling techniques that can aggressively trim the dimen...

02/07/2022 - Patch-Based Stochastic Attention for Image Editing
Attention mechanisms have become of crucial importance in deep learning ...

02/08/2019 - FocusNet: An attention-based Fully Convolutional Network for Medical Image Segmentation
We propose a novel technique to incorporate attention within convolution...

02/28/2020 - Temporal Convolutional Attention-based Network For Sequence Modeling
With the development of feed-forward models, the default model for seque...

12/17/2020 - Attention-based Image Upsampling
Convolutional layers are an integral part of many deep neural network so...

02/07/2013 - A Fast Learning Algorithm for Image Segmentation with Max-Pooling Convolutional Networks
We present a fast algorithm for training MaxPooling Convolutional Networ...

02/07/2018 - Learning One Convolutional Layer with Overlapping Patches
We give the first provably efficient algorithm for learning a one hidden...
