
Efficient Visual Tracking via Hierarchical Cross-Attention Transformer

03/25/2022
by   Xin Chen, et al.
Dalian University of Technology
National University of Defense Technology

In recent years, target tracking has made great progress in accuracy, driven mainly by powerful networks (such as transformers) and additional modules (such as online update and refinement modules). Tracking speed, however, has received less attention: most state-of-the-art trackers achieve real-time speed only on powerful GPUs, while practical applications impose stricter speed requirements, especially on edge platforms with limited resources. In this work, we present an efficient tracking method based on a hierarchical cross-attention transformer, named HCAT. Our model runs at about 195 fps on a GPU, 45 fps on a CPU, and 55 fps on the NVIDIA Jetson AGX Xavier edge AI platform. Experiments show that HCAT achieves promising results on LaSOT, GOT-10k, TrackingNet, NFS, OTB100, UAV123, and VOT2020. Code and models are available at https://github.com/chenxin-dlut/HCAT.
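The abstract names cross-attention as the core operation of HCAT. As a rough illustration only (not the authors' implementation, whose hierarchical structure and exact tensor shapes are defined in the paper and repository), scaled dot-product cross-attention between search-region features and template features can be sketched as:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query, key, value):
    """Scaled dot-product cross-attention: queries from one feature
    map (e.g. the search region) attend to keys/values from another
    (e.g. the template)."""
    d_k = query.shape[-1]
    scores = query @ key.T / np.sqrt(d_k)   # (n_query, n_key) similarity
    weights = softmax(scores, axis=-1)      # each row sums to 1
    return weights @ value                  # (n_query, d_value)

# Toy example: 6 search-region tokens attend to 4 template tokens,
# all with feature dimension 16 (illustrative sizes, not the paper's).
rng = np.random.default_rng(0)
search = rng.standard_normal((6, 16))
template = rng.standard_normal((4, 16))
out = cross_attention(search, template, template)
print(out.shape)  # (6, 16)
```

Each output row is a template-conditioned summary of one search-region token, which is the basic feature-fusion step that correlation-free transformer trackers build on.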


12/17/2021

Efficient Visual Tracking with Exemplar Transformers

The design of more complex and powerful neural network models has signif...
03/29/2021

Transformer Tracking

Correlation acts as a critical role in the tracking field, especially in...
10/17/2021

Siamese Transformer Pyramid Networks for Real-Time UAV Tracking

Recent object tracking methods depend upon deep networks or convoluted a...
08/09/2022

CoViT: Real-time phylogenetics for the SARS-CoV-2 pandemic using Vision Transformers

Real-time viral genome detection, taxonomic classification and phylogene...
12/02/2021

SwinTrack: A Simple and Strong Baseline for Transformer Tracking

Transformer has recently demonstrated clear potential in improving visua...
07/31/2021

HiFT: Hierarchical Feature Transformer for Aerial Tracking

Most existing Siamese-based tracking methods execute the classification ...
07/16/2020

Hopfield Networks is All You Need

We show that the transformer attention mechanism is the update rule of a...