Deepening Neural Networks Implicitly and Locally via Recurrent Attention Strategy

10/27/2022
by Shanshan Zhong et al.

Growing empirical and theoretical evidence shows that deepening neural networks can effectively improve their performance under suitable training settings. However, deepening the backbone of a neural network inevitably and significantly increases computation and parameter count. To mitigate these costs, we propose a simple yet effective Recurrent Attention Strategy (RAS), which implicitly increases the depth of a neural network with lightweight attention modules through local parameter sharing. Extensive experiments on three widely used benchmark datasets demonstrate that RAS improves the performance of neural networks with only a slight increase in parameter count and computation, performing favorably against other well-known attention modules.
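
The abstract does not spell out the module design, but the core idea it describes (reusing one lightweight attention module for several steps, so effective depth grows without adding new weights) can be sketched in a few lines. The following is a minimal, hypothetical PyTorch illustration, not the paper's actual implementation: the class name RecurrentAttention, the SE-style squeeze-excitation block, and the num_steps parameter are all assumptions made for the sake of the example.

```python
# A minimal sketch of "implicit deepening via local parameter sharing":
# one lightweight attention module whose single set of weights is reused
# for num_steps recurrent applications. The SE-style block here is an
# assumed stand-in, not the paper's exact attention module.
import torch
import torch.nn as nn

class RecurrentAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16, num_steps: int = 2):
        super().__init__()
        self.num_steps = num_steps
        # One shared squeeze-excitation block; reusing it across steps is
        # the "local parameter sharing" the abstract refers to.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Applying the same module num_steps times deepens the network
        # implicitly: extra nonlinear transformations, no new parameters.
        for _ in range(self.num_steps):
            x = x * self.attn(x)
        return x

if __name__ == "__main__":
    block = RecurrentAttention(channels=64, num_steps=2)
    y = block(torch.randn(1, 64, 32, 32))
    print(y.shape)  # torch.Size([1, 64, 32, 32])
```

Note that doubling num_steps in this sketch doubles the attention computation but leaves the parameter count unchanged, which matches the trade-off the abstract claims.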

Related research

05/09/2023 · LSAS: Lightweight Sub-attention Strategy for Alleviating Attention Bias Problem
In computer vision, the performance of deep neural networks (DNNs) is hi...

04/13/2023 · ASR: Attention-alike Structural Re-parameterization
The structural re-parameterization (SRP) technique is a novel deep learn...

11/28/2020 · Efficient Attention Network: Accelerate Attention by Searching Where to Plug
Recently, many plug-and-play self-attention modules are proposed to enha...

10/31/2020 · Asymptotic Theory of Expectile Neural Networks
Neural networks are becoming an increasingly important tool in applicati...

05/25/2020 · Attention-based Neural Bag-of-Features Learning for Sequence Data
In this paper, we propose 2D-Attention (2DA), a generic attention formul...

09/26/2016 · Dropout with Expectation-linear Regularization
Dropout, a simple and effective way to train deep neural networks, has l...

02/26/2019 · Learning Implicitly Recurrent CNNs Through Parameter Sharing
We introduce a parameter sharing scheme, in which different layers of a ...
