Context-based hand gesture recognition for the operating room

Mithun George Jacob, Juan Pablo Wachs

Research output: Contribution to journal › Article › peer-review

66 Scopus citations

Abstract

A sterile, intuitive, context-integrated system for navigating MRIs through freehand gestures during a neurobiopsy procedure is presented. Contextual cues are used to determine the intent of the user, improving continuous gesture recognition and the discovery and exploration of MRIs. One of the challenges of gesture interaction in the operating room is discriminating between intentional and non-intentional gestures, a problem also referred to as spotting. In this paper, a novel method for training gesture spotting networks is presented. The continuous gesture recognition system was shown to detect gestures successfully 92.26% of the time with a reliability of 89.97%. Experimental results show that context integration yielded significant improvements in task completion time.
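The abstract does not detail the spotting method itself, but the core idea it names (separating intentional gestures from incidental hand movement) can be illustrated with a minimal rejection-threshold sketch. The function, labels, and scores below are hypothetical placeholders, not the paper's trained spotting networks:

```python
# Toy sketch of gesture "spotting": score sliding windows of motion
# features and reject low-confidence windows as non-intentional movement.
# The paper trains spotting networks with contextual cues; this example
# only illustrates the reject-threshold idea. All names and numbers are
# hypothetical.

def spot_gestures(windows, score_fn, threshold=0.8):
    """Return (window_index, label) for windows whose best class score
    clears the rejection threshold; the rest are treated as
    non-intentional movement (rejected)."""
    detections = []
    for i, w in enumerate(windows):
        scores = score_fn(w)                 # e.g. class posteriors from a model
        label = max(scores, key=scores.get)  # best-scoring gesture class
        if scores[label] >= threshold:       # confident enough -> spotted
            detections.append((i, label))
    return detections

# Pretend per-window posteriors over two hypothetical commands.
toy_scores = [
    {"rotate": 0.95, "zoom": 0.05},  # confident -> spotted
    {"rotate": 0.55, "zoom": 0.45},  # ambiguous -> rejected
    {"rotate": 0.10, "zoom": 0.90},  # confident -> spotted
]
result = spot_gestures(range(3), lambda i: toy_scores[i])
```

A real spotter would also fuse the contextual cues the abstract mentions (e.g. surgical task state) into the score before thresholding.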

Original language: English
Pages (from-to): 196-203
Number of pages: 8
Journal: Pattern Recognition Letters
Volume: 36
Issue number: 1
DOIs
State: Published - 15 Jan 2014
Externally published: Yes

Keywords

  • Continuous gesture recognition
  • Human computer interaction
  • Operating room

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
