Domain Invariant Siamese Attention Mask for Small Object Change Detection via Everyday Indoor Robot Navigation

03/29/2022
by   Koji Takeda, et al.

The problem of image change detection via everyday indoor robot navigation is explored from the novel perspective of self-attention. Detecting semantically non-distinctive, visually small changes remains a key challenge in the robotics community. Intuitively, such small, non-distinctive changes should be well suited to the recent paradigm of attention mechanisms, which is the basic idea of this work. However, existing self-attention models require a significant retraining cost per domain, so they are not directly applicable to robotics applications. We propose a new self-attention technique with the ability to perform unsupervised, on-the-fly domain adaptation: it introduces an attention mask into an intermediate layer of an image change detection model, without modifying the model's input or output layers. Experiments in which an indoor robot aims to detect visually small changes during everyday navigation demonstrate that our attention technique significantly boosts a state-of-the-art image change detection model.
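As a rough illustration of the idea described above — reweighting intermediate features with an attention mask while leaving the surrounding layers untouched — the following sketch is purely hypothetical: the function name, the use of the Siamese feature difference as attention input, and the residual reweighting are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def siamese_attention_mask(feat_ref, feat_query):
    """Hypothetical sketch: build a self-attention mask from the
    difference of a Siamese feature pair and use it to reweight that
    difference, without touching the layers before or after.

    feat_ref, feat_query: (N, d) arrays of N spatial locations with
    d channels each (e.g. a flattened intermediate feature map).
    """
    # Per-location change evidence from the Siamese pair.
    diff = np.abs(feat_ref - feat_query)
    d = diff.shape[-1]
    # Location-to-location affinities, scaled as in dot-product attention.
    scores = diff @ diff.T / np.sqrt(d)
    attn = softmax(scores, axis=-1)
    # Attended change features act as the mask.
    mask = attn @ diff
    # Residual reweighting: locations with change evidence are amplified.
    return diff * (1.0 + mask)

# Toy usage: a small change at one location of an otherwise identical pair.
rng = np.random.default_rng(0)
f_ref = rng.normal(size=(16, 8))
f_qry = f_ref.copy()
f_qry[3] += 2.0  # simulate a visually small change at location 3
out = siamese_attention_mask(f_ref, f_qry)
```

Because the attention mask only transforms features inside the model, the input and output layers stay as they are — which is the property the abstract highlights for avoiding per-domain retraining of the full model.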

