Tensor Low-Rank Reconstruction for Semantic Segmentation

08/02/2020
by Wanli Chen, et al.

Context information plays an indispensable role in the success of semantic segmentation. Recently, non-local self-attention based methods have proved effective for collecting context information. Since the desired context consists of both spatial-wise and channel-wise attention, a 3D representation is an appropriate formulation. However, these non-local methods describe 3D context information with a 2D similarity matrix, where the spatial compression may cause the loss of channel-wise attention. An alternative is to model the contextual information directly without compression, but this effort confronts a fundamental difficulty, namely the high-rank property of context information. In this paper, we propose a new approach to modeling 3D context representations, which not only avoids the spatial compression but also tackles the high-rank difficulty. Inspired by tensor canonical-polyadic (CP) decomposition theory (i.e., a high-rank tensor can be expressed as a combination of rank-1 tensors), we design a low-rank-to-high-rank context reconstruction framework, RecoNet. Specifically, we first introduce the tensor generation module (TGM), which generates a number of rank-1 tensors to capture fragments of the context feature. We then use these rank-1 tensors to recover the high-rank context feature through our proposed tensor reconstruction module (TRM). Extensive experiments show that our method achieves state-of-the-art results on various public datasets. In addition, our method has more than 100 times lower computational cost than conventional non-local-based methods.
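For readers who want a concrete picture of the low-rank-to-high-rank idea, the sketch below illustrates CP-style context reconstruction in PyTorch. It is a minimal illustration, not the authors' released implementation: the module names (TGM, TRM) follow the paper, but the global pooling, 1x1 convolutions, sigmoid gating, the rank value, and the final multiplication of the reconstructed attention with the input feature are assumptions made for this example.

```python
# Minimal sketch of CP-style low-rank context reconstruction (assumed details,
# not the authors' exact implementation).
import torch
import torch.nn as nn


class TGM(nn.Module):
    """Tensor Generation Module: produce `rank` rank-1 factors per axis (C, H, W)."""

    def __init__(self, channels, rank=4):
        super().__init__()
        self.rank = rank
        # One 1x1 conv per rank and per axis to generate an attention vector.
        self.conv_c = nn.ModuleList([nn.Conv2d(channels, channels, 1) for _ in range(rank)])
        self.conv_h = nn.ModuleList([nn.Conv2d(channels, 1, 1) for _ in range(rank)])
        self.conv_w = nn.ModuleList([nn.Conv2d(channels, 1, 1) for _ in range(rank)])

    def forward(self, x):
        factors = []
        for i in range(self.rank):
            # Pool over the complementary axes, then conv + sigmoid gives one
            # attention vector per axis: [N,C,1,1], [N,1,H,1], [N,1,1,W].
            vc = torch.sigmoid(self.conv_c[i](x.mean(dim=(2, 3), keepdim=True)))
            vh = torch.sigmoid(self.conv_h[i](x.mean(dim=3, keepdim=True)))
            vw = torch.sigmoid(self.conv_w[i](x.mean(dim=2, keepdim=True)))
            factors.append((vc, vh, vw))
        return factors


class TRM(nn.Module):
    """Tensor Reconstruction Module: sum of rank-1 outer products recovers 3D context."""

    def forward(self, x, factors):
        context = 0
        for vc, vh, vw in factors:
            # Broadcasting the three vectors yields a rank-1 tensor of shape [N,C,H,W].
            context = context + vc * vh * vw
        return x * context  # apply the reconstructed attention to the input feature


if __name__ == "__main__":
    feat = torch.randn(2, 64, 32, 32)
    tgm, trm = TGM(64, rank=4), TRM()
    out = trm(feat, tgm(feat))
    print(out.shape)  # torch.Size([2, 64, 32, 32])
```

Each rank-1 factor requires only on the order of C + H + W values, rather than the (HW)^2 entries of a full non-local similarity matrix, which is consistent with the large computational saving reported in the abstract.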

Related research

03/19/2018  Nonlocal Low-Rank Tensor Factor Analysis for Image Restoration
Low-rank signal modeling has been widely leveraged to capture non-local ...

04/02/2016  Embedding Lexical Features via Low-Rank Tensors
Modern NLP models rely heavily on engineered features, which often combi...

04/19/2020  Tensor completion using enhanced multiple modes low-rank prior and total variation
In this paper, we propose a novel model to recover a low-rank tensor by ...

07/01/2019  Permutohedral Attention Module for Efficient Non-Local Neural Networks
Medical image processing tasks such as segmentation often require captur...

03/27/2017  Reweighted Infrared Patch-Tensor Model With Both Non-Local and Local Priors for Single-Frame Small Target Detection
Many state-of-the-art methods have been proposed for infrared small targ...

09/09/2021  Is Attention Better Than Matrix Decomposition?
As an essential ingredient of modern deep learning, attention mechanism,...

02/07/2020  iqiyi Submission to ActivityNet Challenge 2019 Kinetics-700 challenge: Hierarchical Group-wise Attention
In this report, the method for the iqiyi submission to the task of Activ...
