Multimodal-GuideNet: Gaze-Probe Bidirectional Guidance in Obstetric Ultrasound Scanning

07/26/2022
by Qianhui Men, et al.

Eye trackers can provide visual guidance to sonographers during ultrasound (US) scanning. Such guidance is potentially valuable for less experienced operators, helping them improve their skill in manipulating the probe to achieve the desired plane. In this paper, a multimodal guidance approach (Multimodal-GuideNet) is proposed to capture the stepwise dependency between a real-world US video signal, synchronized gaze, and probe motion within a unified framework. To understand the causal relationship between gaze movement and probe motion, our model exploits multitask learning to jointly learn two related tasks: predicting the gaze movements and probe signals that an experienced sonographer would produce in routine obstetric scanning. The two tasks are associated through a modality-aware spatial graph that detects co-occurrence among the multimodal inputs and shares useful cross-modal information. Instead of a deterministic scanning path, Multimodal-GuideNet allows for scanning diversity by estimating the probability distribution of real scans. Experiments on three typical obstetric scanning examinations show that the new approach outperforms single-task learning for both probe motion guidance and gaze movement prediction. Multimodal-GuideNet also provides a visual guidance signal with an error of fewer than 10 pixels on a 224×288 US image.
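The abstract does not include implementation details, but the described design (per-modality inputs fused through a modality-aware graph, a shared stepwise temporal model, and two probabilistic multitask heads for gaze and probe motion) can be illustrated with a minimal PyTorch sketch. This is not the authors' code: the module names, feature dimensions, 3-node modality graph, GRU backbone, and Gaussian output parameterization below are all assumptions chosen to make the idea concrete.

```python
# Minimal sketch (not the authors' code) in the spirit of Multimodal-GuideNet:
# a shared encoder fuses US-image, gaze, and probe features through a learned
# modality adjacency ("modality-aware spatial graph"), and two heads output
# Gaussian distributions over the next gaze shift and probe motion.
import torch
import torch.nn as nn


class ModalityGraphLayer(nn.Module):
    """Mixes per-modality features through a learnable 3x3 modality graph."""

    def __init__(self, dim: int, n_modalities: int = 3):
        super().__init__()
        self.adj = nn.Parameter(torch.eye(n_modalities))  # learned co-occurrence weights
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):                              # x: (batch, n_modalities, dim)
        adj = torch.softmax(self.adj, dim=-1)          # row-normalised adjacency
        mixed = torch.einsum('mn,bnd->bmd', adj, x)    # cross-modal message passing
        return torch.relu(self.proj(mixed)) + x        # residual sharing


class MultimodalGuideSketch(nn.Module):
    def __init__(self, dim: int = 128):
        super().__init__()
        # Per-modality encoders (US frame features assumed pre-extracted, e.g. by a CNN).
        self.img_enc = nn.Linear(512, dim)    # US frame embedding -> shared dim
        self.gaze_enc = nn.Linear(2, dim)     # gaze point (x, y)
        self.probe_enc = nn.Linear(4, dim)    # probe rotation as a quaternion
        self.graph = ModalityGraphLayer(dim)
        self.rnn = nn.GRU(3 * dim, dim, batch_first=True)  # stepwise temporal model
        # Each head predicts mean and log-variance of the next movement
        # (probabilistic, rather than a single deterministic scanning path).
        self.gaze_head = nn.Linear(dim, 2 * 2)
        self.probe_head = nn.Linear(dim, 2 * 4)

    def forward(self, img_feat, gaze, probe):
        # img_feat: (B, T, 512), gaze: (B, T, 2), probe: (B, T, 4)
        B, T, _ = gaze.shape
        nodes = torch.stack([self.img_enc(img_feat),
                             self.gaze_enc(gaze),
                             self.probe_enc(probe)], dim=2)        # (B, T, 3, dim)
        nodes = self.graph(nodes.flatten(0, 1)).view(B, T, -1)     # (B, T, 3*dim)
        h, _ = self.rnn(nodes)                                      # (B, T, dim)
        gaze_mu, gaze_logvar = self.gaze_head(h).chunk(2, dim=-1)
        probe_mu, probe_logvar = self.probe_head(h).chunk(2, dim=-1)
        return (gaze_mu, gaze_logvar), (probe_mu, probe_logvar)


def gaussian_nll(mu, logvar, target):
    """Negative log-likelihood for one task; the two task losses are summed."""
    return 0.5 * (logvar + (target - mu) ** 2 / logvar.exp()).mean()
```

Under these assumptions, a training step would sum the two Gaussian NLL terms over the gaze and probe targets, giving a joint multitask objective, and sampling from the predicted Gaussians at inference time yields diverse candidate scanning trajectories rather than a single deterministic path.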

Related research

07/08/2020 - Automatic Probe Movement Guidance for Freehand Obstetric Ultrasound
We present the first system that provides real-time probe movement guida...

11/02/2021 - Learning Robotic Ultrasound Scanning Skills via Human Demonstrations and Guided Explorations
Medical ultrasound has become a routine examination approach nowadays an...

05/06/2023 - Towards a Simple Framework of Skill Transfer Learning for Robotic Ultrasound-guidance Procedures
In this paper, we present a simple framework of skill transfer learning ...

07/25/2023 - Learning Autonomous Ultrasound via Latent Task Representation and Robotic Skills Adaptation
As medical ultrasound is becoming a prevailing examination approach nowa...

08/02/2022 - Can Gaze Beat Touch? A Fitts' Law Evaluation of Gaze, Touch, and Mouse Inputs
Gaze input has been a promising substitute for mouse input for point and...

11/14/2016 - An Evaluation of Information Sharing Parking Guidance Policies Using a Bayesian Approach
Real-time parking occupancy information is critical for a parking manage...

03/21/2019 - Scanning Probe State Recognition With Multi-Class Neural Network Ensembles
One of the largest obstacles facing scanning probe microscopy is the con...
