COMPASS: A Formal Framework and Aggregate Dataset for Generalized Surgical Procedure Modeling

09/14/2022
by Kay Hutchinson, et al.

Objective: We propose a formal framework for modeling surgical tasks using a unified set of motion primitives (MPs) as the basic surgical actions, enabling more objective labeling, aggregation of different datasets, and training of generalized models for surgical action recognition.

Methods: We use our framework to create the COntext and Motion Primitive Aggregate Surgical Set (COMPASS), which includes six dry-lab surgical tasks from three publicly available datasets (JIGSAWS, DESK, and ROSMA) with kinematic and video data as well as context and MP labels. We present methods for labeling surgical context and for automatic translation of context to MPs, and we propose the Leave-One-Task-Out (LOTO) cross-validation method to evaluate a model's ability to generalize to an unseen task.

Results: Our context labeling method achieves near-perfect agreement between consensus labels from crowd-sourcing and from expert surgeons. Segmenting tasks into MPs enables the generation of separate left and right transcripts and significantly improves LOTO performance. We find that MP segmentation models perform best when trained on tasks with the same context and/or tasks from the same dataset.

Conclusion: The proposed framework enables high-quality labeling of surgical data based on context and fine-grained MPs. Modeling surgical tasks with MPs enables the aggregation of different datasets for training action recognition models that generalize better to unseen tasks than models trained at the gesture level.

Significance: Our formal framework and aggregate dataset can support the development of models and algorithms for surgical process analysis, skill assessment, error detection, and autonomy.
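The Leave-One-Task-Out evaluation described above can be sketched as a simple splitting routine: each surgical task is held out once as the test set while the model trains on all remaining tasks. This is a minimal illustrative sketch, not the authors' implementation; the task names and trial IDs below are placeholders.

```python
from typing import Dict, List, Tuple


def loto_splits(data_by_task: Dict[str, list]) -> List[Tuple[list, list, str]]:
    """Return (train, test, held_out_task) splits for Leave-One-Task-Out CV.

    Each task's trials are held out exactly once as the test set;
    trials from all other tasks form the training set.
    """
    splits = []
    for held_out in data_by_task:
        train = [x for task, xs in data_by_task.items()
                 if task != held_out for x in xs]
        test = list(data_by_task[held_out])
        splits.append((train, test, held_out))
    return splits


# Example with three of the six dry-lab tasks (trial IDs are hypothetical)
data = {
    "Suturing": ["s1", "s2"],
    "Knot_Tying": ["k1"],
    "Needle_Passing": ["n1", "n2"],
}
for train, test, task in loto_splits(data):
    print(f"held out {task}: {len(train)} train trials, {len(test)} test trials")
```

A model trained on each split's training trials and evaluated on the held-out task's trials measures generalization to tasks never seen during training, which is the comparison the abstract draws between MP-level and gesture-level models.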


Related research

06/28/2023: Evaluating the Task Generalization of Temporal Convolutional Networks for Surgical Gesture and Motion Recognition using Kinematic Data
Fine-grained activity recognition enables explainable analysis of proced...

02/28/2023: Towards Surgical Context Inference and Translation to Gestures
Manual labeling of gestures in robot-assisted surgery is labor intensive...

12/03/2022: Recognition and Prediction of Surgical Gestures and Trajectories Using Transformer Models in Robot-Assisted Surgery
Surgical activity recognition and prediction can help provide important ...

03/14/2023: Kinematic Data-Based Action Segmentation for Surgical Applications
Action segmentation is a challenging task in high-level process analysis...

07/25/2019: Weakly Supervised Recognition of Surgical Gestures
Kinematic trajectories recorded from surgical robots contain information...

03/22/2023: Self-distillation for surgical action recognition
Surgical scene understanding is a key prerequisite for context-aware deci...

09/24/2019: Offline identification of surgical deviations in laparoscopic rectopexy
Objective: A median of 14.4 during surgery and a third of them are preve...
