
Siamese Transformer Pyramid Networks for Real-Time UAV Tracking

10/17/2021
by   Daitao Xing, et al.
New York University

Recent object tracking methods depend upon deep networks or convoluted architectures. Most of those trackers can hardly meet real-time processing requirements on mobile platforms with limited computing resources. In this work, we introduce the Siamese Transformer Pyramid Network (SiamTPN), which inherits the advantages of both CNN and Transformer architectures. Specifically, we exploit the inherent feature pyramid of a lightweight network (ShuffleNetV2) and reinforce it with a Transformer to construct a robust target-specific appearance model. A centralized architecture with lateral cross attention is developed for building augmented high-level feature maps. To avoid the computation and memory intensity of fusing pyramid representations with the Transformer, we further introduce a pooling attention module, which significantly reduces memory and time complexity while improving robustness. Comprehensive experiments on both aerial and prevalent tracking benchmarks show that SiamTPN achieves competitive results while operating at high speed. Moreover, our fastest variant runs at over 30 Hz on a single CPU core and obtains an AUC score of 58.1. Code: https://github.com/RISCNYUAD/SiamTPNTracker
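The pooling attention module is what keeps the Transformer fusion affordable: rather than attending over every spatial location of a pyramid feature map, the keys and values are first pooled to a small fixed grid, so the attention cost grows with the query size but not with the square of the map size. The sketch below illustrates this idea in PyTorch. It is a minimal reconstruction from the abstract, not the released SiamTPNTracker code; the class name PoolingAttention, the pool_size parameter, and the residual lateral fusion in the usage example are assumptions made for illustration.

```python
# Hedged sketch of a pooling-attention block in the spirit of SiamTPN's
# pooling attention module. Assumes PyTorch; names are illustrative only.
import torch
import torch.nn as nn


class PoolingAttention(nn.Module):
    """Cross-attention where keys/values are spatially pooled to cut cost.

    Full attention over an H*W map costs O((H*W)^2); average-pooling the
    key/value map to a fixed pool_size x pool_size grid reduces this to
    O(H*W * pool_size^2), which is what makes Transformer fusion of
    pyramid features practical on a CPU.
    """

    def __init__(self, dim, num_heads=4, pool_size=7):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.pool = nn.AdaptiveAvgPool2d(pool_size)
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, query_map, context_map):
        # query_map, context_map: (B, C, H, W) pyramid feature maps
        B, C, H, W = query_map.shape
        q = query_map.flatten(2).transpose(1, 2)                 # (B, H*W, C)
        kv = self.pool(context_map).flatten(2).transpose(1, 2)   # (B, p*p, C)

        def split_heads(x):
            return x.view(B, -1, self.num_heads, self.head_dim).transpose(1, 2)

        q = split_heads(self.q_proj(q))    # (B, h, H*W, d)
        k = split_heads(self.k_proj(kv))   # (B, h, p*p, d)
        v = split_heads(self.v_proj(kv))

        attn = (q @ k.transpose(-2, -1)) * self.scale
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, H * W, C)
        out = self.out_proj(out)
        # Reshape back to a feature map so it can be fused laterally.
        return out.transpose(1, 2).reshape(B, C, H, W)


if __name__ == "__main__":
    # Toy usage: augment a mid-level map with a pooled higher-level map,
    # roughly in the spirit of lateral cross attention with a residual add.
    mid = torch.randn(1, 128, 32, 32)
    high = torch.randn(1, 128, 16, 16)
    fused = mid + PoolingAttention(dim=128)(mid, high)
    print(fused.shape)  # torch.Size([1, 128, 32, 32])
```

Under this sketch, attention for the 32x32 query map is computed against only 49 pooled key/value locations instead of 256, which is the kind of saving the abstract attributes to the pooling attention module.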


Related research:

12/17/2021
Efficient Visual Tracking with Exemplar Transformers
The design of more complex and powerful neural network models has signif...

10/15/2021
Pyramid Correlation based Deep Hough Voting for Visual Object Tracking
Most of the existing Siamese-based trackers treat tracking problem as a ...

11/28/2016
ECO: Efficient Convolution Operators for Tracking
In recent years, Discriminative Correlation Filter (DCF) based methods h...

03/24/2022
Keypoints Tracking via Transformer Networks
In this thesis, we propose a pioneering work on sparse keypoints trackin...

03/25/2022
Efficient Visual Tracking via Hierarchical Cross-Attention Transformer
In recent years, target tracking has made great progress in accuracy. Th...

07/03/2022
Divert More Attention to Vision-Language Tracking
Relying on Transformer for complex visual feature learning, object track...

05/25/2023
MixFormerV2: Efficient Fully Transformer Tracking
Transformer-based trackers have achieved strong accuracy on the standard...