Improving Robot Localisation by Ignoring Visual Distraction

07/25/2021
by Oscar Mendez, et al.

Attention is an important component of modern deep learning. However, less emphasis has been placed on its inverse: ignoring distraction. Our daily lives require us to explicitly avoid giving attention to salient visual features that confound the task we are trying to accomplish. This visual prioritisation allows us to concentrate on important tasks while ignoring visual distractors. In this work, we introduce Neural Blindness, which gives an agent the ability to completely ignore objects or classes that are deemed distractors. More explicitly, we aim to render a neural network completely incapable of representing specific chosen classes in its latent space. In a very real sense, this makes the network "blind" to certain classes, allowing an agent to focus on what is important for a given task, and we demonstrate how this can be used to improve localisation.
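
The abstract does not spell out how the latent space is made incapable of representing a chosen class. The sketch below is only one plausible illustration of that idea, not the authors' method: an adversarial auxiliary head trained through a gradient-reversal layer, which rewards the head for detecting the distractor class while penalising the encoder for carrying that information in its features. All module names, dimensions and the PyTorch framing (encoder, task_head, distractor_head) are hypothetical assumptions made for illustration.

```python
# Illustrative sketch only: a gradient-reversal adversarial head that pushes an
# encoder to erase distractor-class information from its latent features.
# This is an assumption about one way to achieve "blindness", not the paper's method.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity on the forward pass, negated (scaled) gradient on the backward pass."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


# Hypothetical toy architecture: a feature encoder, a localisation-style task head,
# and an adversarial head that tries to predict distractor presence from the latent.
encoder = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 32))
task_head = nn.Linear(32, 6)         # e.g. a 6-DoF pose regression target
distractor_head = nn.Linear(32, 2)   # predicts whether the distractor class is present

opt = torch.optim.Adam(
    list(encoder.parameters()) + list(task_head.parameters())
    + list(distractor_head.parameters()), lr=1e-3)

# Dummy batch: input features, task targets, distractor-presence labels.
x = torch.randn(16, 128)
pose_target = torch.randn(16, 6)
distractor_label = torch.randint(0, 2, (16,))

for step in range(100):
    z = encoder(x)
    task_loss = F.mse_loss(task_head(z), pose_target)
    # The reversed gradient trains the head to detect the distractor class while
    # simultaneously pushing the encoder to remove that information from z.
    adv_logits = distractor_head(GradReverse.apply(z, 1.0))
    adv_loss = F.cross_entropy(adv_logits, distractor_label)
    opt.zero_grad()
    (task_loss + adv_loss).backward()
    opt.step()
```

Under this kind of setup, the encoder's features become progressively less predictive of the suppressed class, while the main task loss keeps them useful for localisation; the actual objective used in the paper may differ.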

research · 05/19/2021
VSGM – Enhance robot task understanding ability through visual semantic graph
In recent years, developing AI for robotics has raised much attention. T...

research · 11/23/2018
Learning to attend in a brain-inspired deep neural network
Recent machine learning models have shown that including attention as a ...

research · 03/24/2019
Using RGB Image as Visual Input for Mapless Robot Navigation
Robot navigation in mapless environment is one of the essential problems...

research · 07/25/2023
Audio-aware Query-enhanced Transformer for Audio-Visual Segmentation
The goal of the audio-visual segmentation (AVS) task is to segment the s...

research · 03/20/2023
Sketch2Saliency: Learning to Detect Salient Objects from Human Drawings
Human sketch has already proved its worth in various visual understandin...

research · 05/22/2019
AttentionRNN: A Structured Spatial Attention Mechanism
Visual attention mechanisms have proven to be integrally important const...

research · 03/23/2023
Top-Down Visual Attention from Analysis by Synthesis
Current attention algorithms (e.g., self-attention) are stimulus-driven ...
