Using Visual Anomaly Detection for Task Execution Monitoring

07/29/2021
by   Santosh Thoduka, et al.

Execution monitoring is essential for robots to detect and respond to failures. Since it is impossible to enumerate all failures for a given task, we learn from successful executions of the task to detect visual anomalies at runtime. Our method learns to predict the motions that occur during the nominal execution of a task, including camera and robot body motion. A probabilistic U-Net architecture learns to predict optical flow, while the robot's kinematics and 3D model are used to model camera and body motion. The errors between the observed and predicted motion are used to calculate an anomaly score. We evaluate our method on a dataset of a robot placing a book on a shelf, which includes anomalies such as falling books, camera occlusions, and robot disturbances. We find that modeling camera and body motion, in addition to the learning-based optical flow prediction, improves the area under the receiver operating characteristic curve from 0.752 to 0.804, and the area under the precision-recall curve from 0.467 to 0.549.
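The evaluation pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `anomaly_score` function, the mean endpoint-error formulation, and the toy data are all assumptions; the abstract only states that errors between observed and predicted motion yield an anomaly score, which is then evaluated with AUROC and AUPRC.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score


def anomaly_score(observed_flow, predicted_flow):
    """Hypothetical per-frame anomaly score: mean endpoint error (EPE)
    between observed and predicted optical flow fields (H x W x 2)."""
    epe = np.linalg.norm(observed_flow - predicted_flow, axis=-1)
    return float(epe.mean())


# Toy evaluation: synthetic scores for nominal (label 0) and
# anomalous (label 1) frames, stand-ins for real flow errors.
rng = np.random.default_rng(0)
labels = np.array([0] * 50 + [1] * 50)
scores = np.concatenate([
    rng.normal(0.5, 0.2, 50),   # nominal frames: small prediction error
    rng.normal(1.0, 0.3, 50),   # anomalous frames: larger prediction error
])

auroc = roc_auc_score(labels, scores)
auprc = average_precision_score(labels, scores)
```

Thresholding the score then turns it into a binary nominal/anomalous decision; the AUROC and AUPRC summarize performance over all thresholds, which is how the 0.804 and 0.549 figures in the abstract are reported.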


Related research

- Video Anomaly Detection By The Duality Of Normality-Granted Optical Flow (05/10/2021)
- Multi-Task Learning based Video Anomaly Detection with Attention (10/14/2022)
- Improved Optical Flow for Gesture-based Human-robot Interaction (05/21/2019)
- Towards Visual Ego-motion Learning in Robots (05/29/2017)
- CLUE-AI: A Convolutional Three-stream Anomaly Identification Framework for Robot Manipulation (03/16/2022)
- FlowControl: Optical Flow Based Visual Servoing (07/01/2020)
- Deep Active Visual Attention for Real-time Robot Motion Generation: Emergence of Tool-body Assimilation and Adaptive Tool-use (06/29/2022)
