Assessing task understanding in remote ultrasound diagnosis via gesture analysis

Edgar Rojas-Muñoz, Juan P. Wachs

Research output: Contribution to journal › Article › peer-review


Abstract

This work presents a gesture-based approach to estimate task understanding and performance during remote ultrasound tasks. Our approach comprises two main components. The first component uses the Multi-Agent Gestural Instruction Comparer (MAGIC) framework to represent and compare the gestures performed by collaborators. Through MAGIC, gestures can be compared based on their morphology, semantics, and pragmatics. The second component computes the Physical Instructions Assimilation (PIA) metric, a score representing how well gestures are being used to communicate and execute physical instructions. To evaluate our hypothesis, 20 participants performed a remote ultrasound task consisting of three subtasks: vessel detection, blood extraction, and foreign body detection. MAGIC’s gesture comparison approaches were compared against two other approaches based on how well they replicated human-annotated gesture matchings. Our approach outperformed the others, agreeing with the human baseline over 76% of the time. Subsequently, a correlation analysis was performed to compare PIA’s task understanding insights with those of three other metrics: error rate, idle time rate, and task completion percentage. Significant correlations (p ≤ 0.04) were found between PIA and all the other metrics, positioning PIA as an effective metric for task understanding estimation. Finally, post-experiment questionnaires were used to subjectively evaluate the participants’ perceived understanding. The PIA score was found to be significantly correlated with the participants’ overall task understanding (p ≤ 0.05), hinting at a relation between the assimilation of physical instructions and self-perceived understanding. These results demonstrate that gestures can be used to estimate task understanding in remote ultrasound tasks, which can improve how these tasks are performed and assessed.

Original language: English
Pages (from-to): 1489-1500
Number of pages: 12
Journal: Pattern Analysis and Applications
Volume: 24
Issue number: 4
DOIs
State: Published - 1 Nov 2021
Externally published: Yes

Keywords

  • Gestures
  • Human collaboration
  • Task understanding
  • Ultrasound training

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
