Learning Second-Order Attentive Context for Efficient Correspondence Pruning

03/28/2023
by   Xinyi Ye, et al.

Correspondence pruning aims to identify consistent correspondences (inliers) within a set of putative correspondences. The task is challenging because of the disorganized spatial distribution of numerous outliers, especially when putative correspondences are largely dominated by outliers, and it is even more challenging to remain effective while staying efficient. In this paper, we propose an effective and efficient method for correspondence pruning. Inspired by the success of attentive context in correspondence problems, we first extend attentive context to first-order attentive context and then introduce the idea of attention in attention (ANA) to model second-order attentive context for correspondence pruning. Whereas first-order attention focuses on feature-consistent context, second-order attention attends to the attention weights themselves, providing an additional source for encoding consistent context from the attention map. For efficiency, we derive two approximate formulations of the naive second-order attention that reduce its cubic complexity to linear complexity, so that second-order attention can be applied with negligible computational overhead. We implement these formulations in a second-order context layer and incorporate the layer into an ANA block. Extensive experiments demonstrate that our method prunes outliers effectively and efficiently, especially in high-outlier-ratio cases. Compared with the state-of-the-art correspondence pruning approach LMCNet, our method runs 14 times faster while maintaining competitive accuracy.
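The abstract contrasts first-order attention, which aggregates correspondence features, with second-order attention, which attends over the attention map itself and naively costs O(N^3) in the number of correspondences N. The NumPy sketch below illustrates that contrast in its simplest form; the function names, the dot-product scoring, and the way the attention map is reused as a descriptor are illustrative assumptions, not the paper's ANA block or its linear-complexity approximations.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def first_order_attention(feats):
    """First-order attentive context: attend over the N x d
    correspondence features themselves (O(N^2 * d))."""
    n, d = feats.shape
    scores = feats @ feats.T / np.sqrt(d)        # (N, N) similarity
    attn = softmax(scores, axis=-1)              # rows sum to 1
    return attn @ feats, attn                    # feature-consistent context

def naive_second_order_context(attn, feats):
    """Naive second-order attentive context: treat each row of the
    N x N attention map as an N-dimensional descriptor and attend
    over those rows. The attn @ attn.T product alone costs O(N^3),
    which is the cubic bottleneck the paper's approximations remove."""
    n = attn.shape[0]
    scores2 = attn @ attn.T / np.sqrt(n)         # (N, N), cubic cost
    attn2 = softmax(scores2, axis=-1)
    return attn2 @ feats                         # context mined from attention
```

Both functions return an N x d context tensor, so a pruning network could consume the second-order context alongside the first-order one as an extra cue; the paper's contribution is making the second step affordable at linear cost, which this naive sketch deliberately does not do.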


Related research

08/06/2018 · Attentive Semantic Alignment with Offset-Aware Correlation Kernels
Semantic correspondence is the problem of establishing correspondences a...

07/04/2019 · Attentive Context Normalization for Robust Permutation-Equivariant Learning
Many problems in computer vision require dealing with sparse, unstructur...

11/01/2022 · GMF: General Multimodal Fusion Framework for Correspondence Outlier Rejection
Rejecting correspondence outliers enables to boost the correspondence qu...

11/01/2021 · Learning Pruned Structure and Weights Simultaneously from Scratch: an Attention based Approach
As a deep learning model typically contains millions of trainable weight...

01/03/2021 · Consensus-Guided Correspondence Denoising
Correspondence selection between two groups of feature points aims to co...

11/30/2020 · Learnable Motion Coherence for Correspondence Pruning
Motion coherence is an important clue for distinguishing true correspond...

08/06/2023 · Local Consensus Enhanced Siamese Network with Reciprocal Loss for Two-view Correspondence Learning
Recent studies of two-view correspondence learning usually establish an ...
