Abstract
A sterile, intuitive, context-integrated system for navigating MRIs through freehand gestures during a neurobiopsy procedure is presented. Contextual cues are used to infer the user's intent, improving both continuous gesture recognition and the discovery and exploration of MRIs. One of the challenges of gesture interaction in the operating room is discriminating between intentional and unintentional gestures, a problem also referred to as spotting. In this paper, a novel method for training gesture spotting networks is presented. The continuous gesture recognition system detected gestures 92.26% of the time with a reliability of 89.97%. Experimental results show that context integration yields significant improvements in task completion time.
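As a rough illustration of the spotting problem the abstract describes, the sketch below segments a stream of per-frame gesture-class probabilities, accepting a gesture only when one class stays confident for a minimum number of consecutive frames and rejecting everything else as unintentional movement. This is a generic threshold-based heuristic for exposition only; the function name, parameters, and logic are assumptions, not the trained spotting-network method the paper proposes.

```python
# Hypothetical threshold-based gesture spotting sketch -- NOT the
# paper's method, which trains dedicated spotting networks.
def spot_gestures(frame_probs, threshold=0.8, min_frames=3):
    """Segment a stream of per-frame gesture-class probabilities.

    frame_probs: list of dicts mapping gesture label -> probability,
    one dict per video frame.
    Returns a list of (start, end, label) segments in which a single
    class stays at or above `threshold` for at least `min_frames`
    consecutive frames; all other frames are treated as non-gesture
    (unintentional) movement.
    """
    segments = []
    start, current = None, None
    for i, probs in enumerate(frame_probs):
        label, p = max(probs.items(), key=lambda kv: kv[1])
        if p >= threshold:
            if current == label:
                continue  # extend the current confident run
            # a different confident label starts: close the old run first
            if current is not None and i - start >= min_frames:
                segments.append((start, i, current))
            start, current = i, label
        else:
            # confidence dropped: close the run if it was long enough
            if current is not None and i - start >= min_frames:
                segments.append((start, i, current))
            start, current = None, None
    # close a run that extends to the end of the stream
    if current is not None and len(frame_probs) - start >= min_frames:
        segments.append((start, len(frame_probs), current))
    return segments
```

A stream with four confident "swipe" frames and only two confident "zoom" frames would yield a single swipe segment, the short zoom run being rejected as noise. A real system would replace the hard threshold with a learned rejection model, as the paper's spotting networks do.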
| Original language | English |
|---|---|
| Pages (from-to) | 196-203 |
| Number of pages | 8 |
| Journal | Pattern Recognition Letters |
| Volume | 36 |
| Issue number | 1 |
| DOIs | |
| State | Published - 15 Jan 2014 |
| Externally published | Yes |
Keywords
- Continuous gesture recognition
- Human computer interaction
- Operating room
ASJC Scopus subject areas
- Software
- Signal Processing
- Computer Vision and Pattern Recognition
- Artificial Intelligence