MURPHY: Relations Matter in Surgical Workflow Analysis

12/24/2022
by Shang Zhao, et al.

Autonomous robotic surgery has advanced significantly based on the analysis of visual and temporal cues in surgical workflows, but relational cues drawn from domain knowledge remain under-investigated. Complex relations in surgical annotations can be divided into intra- and inter-relations, both of which are valuable for autonomous systems seeking to comprehend surgical workflows. Intra-relations describe the relevance of different categories within a particular annotation type, while inter-relations describe the relevance between different annotation types. This paper systematically investigates the importance of relational cues in surgery.

First, we contribute the RLLS12M dataset, a large-scale collection of robotic left lateral sectionectomy (RLLS) procedures, curated from 50 videos of 50 patients operated on by 5 surgeons and annotated with a hierarchical workflow comprising 3 inter- and 6 intra-relations, 6 steps, 15 tasks, and 38 activities represented as triplets of 11 instruments, 8 actions, and 16 objects, for a total of 2,113,510 video frames and 12,681,060 annotation entities.

Correspondingly, we propose a multi-relation purification hybrid network (MURPHY), which incorporates novel relation modules to augment the feature representation by purifying relational features using the intra- and inter-relations embodied in the annotations. The intra-relation module leverages an R-GCN to implant visual features into different graph relations, which are then aggregated by a targeted relation purification whose affinity information measures label consistency and feature similarity. The inter-relation module, motivated by attention mechanisms, regularizes the influence of relational features according to the hierarchy of annotation types given by domain knowledge. Extensive experiments on the curated RLLS dataset confirm the effectiveness of our approach, demonstrating that relations matter in surgical workflow analysis.
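The abstract describes the two relation modules only at a high level. The sketch below is one illustrative reading of that description in plain PyTorch, not the authors' implementation: a per-relation graph convolution whose outputs are reweighted by a feature-similarity affinity (standing in for the paper's label-consistency plus feature-similarity purification), and an attention block that lets features from different annotation types (step, task, activity) regularize one another. All class names, shapes, and hyperparameters here are assumptions.

```python
# Minimal sketch of MURPHY-style relation modules. Names, shapes, and the exact
# affinity formulation are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class IntraRelationModule(nn.Module):
    """Propagates node features over several intra-relation graphs (one per
    relation type) and purifies the per-relation outputs with affinity weights
    derived from feature similarity."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        # One linear transform per relation, as in a basic R-GCN layer.
        self.rel_weights = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_relations))
        self.self_loop = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adjs: torch.Tensor) -> torch.Tensor:
        # x:    (num_nodes, dim)  category-node features fused with visual cues
        # adjs: (num_relations, num_nodes, num_nodes) row-normalized adjacency per relation
        per_rel = [adjs[r] @ lin(x) for r, lin in enumerate(self.rel_weights)]
        stacked = torch.stack(per_rel, dim=0)                              # (R, N, dim)

        # Purification: weight each relation's output by its cosine affinity to the input.
        affinity = F.cosine_similarity(stacked, x.unsqueeze(0), dim=-1)    # (R, N)
        weights = torch.softmax(affinity, dim=0).unsqueeze(-1)             # (R, N, 1)
        aggregated = (weights * stacked).sum(dim=0)                        # (N, dim)
        return F.relu(self.self_loop(x) + aggregated)


class InterRelationModule(nn.Module):
    """Attention over features from different annotation types (step, task,
    activity), so higher-level context can influence lower-level features."""

    def __init__(self, dim: int, num_types: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.type_embed = nn.Parameter(torch.randn(num_types, dim) * 0.02)

    def forward(self, type_feats: torch.Tensor) -> torch.Tensor:
        # type_feats: (batch, num_types, dim) one pooled feature per annotation type
        q = type_feats + self.type_embed             # inject annotation-type identity
        fused, _ = self.attn(q, type_feats, type_feats)
        return fused + type_feats                    # residual connection


if __name__ == "__main__":
    intra = IntraRelationModule(dim=64, num_relations=6)   # 6 intra-relations in RLLS12M
    inter = InterRelationModule(dim=64, num_types=3)       # step / task / activity levels
    x = torch.randn(38, 64)                                # e.g., 38 activity categories
    adjs = torch.softmax(torch.randn(6, 38, 38), dim=-1)   # placeholder relation graphs
    print(intra(x, adjs).shape)                            # torch.Size([38, 64])
    print(inter(torch.randn(2, 3, 64)).shape)              # torch.Size([2, 3, 64])
```

Swapping the cosine-affinity weights for a learned gate that also checks label consistency, or replacing the manual message passing with a library R-GCN layer, would preserve the same overall structure of purify-then-aggregate followed by cross-type attention.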



Related research

Towards Graph Representation Learning Based Surgical Workflow Anticipation (08/07/2022)
Surgical workflow anticipation can give predictions on what steps to con...

Exploring Intra- and Inter-Video Relation for Surgical Semantic Scene Segmentation (03/29/2022)
Automatic surgical scene segmentation is fundamental for facilitating co...

Temporal Memory Relation Network for Workflow Recognition from Surgical Video (03/30/2021)
Automatic surgical workflow recognition is a key component for developin...

LRTD: Long-Range Temporal Dependency based Active Learning for Surgical Workflow Recognition (04/21/2020)
Automatic surgical workflow recognition in video is an essentially funda...

A Universal Model for Cross Modality Mapping by Relational Reasoning (02/26/2021)
With the aim of matching a pair of instances from two different modaliti...

Context-Aware Transformer for 3D Point Cloud Automatic Annotation (03/27/2023)
3D automatic annotation has received increased attention since manually ...
