Abstract
In this paper, we consider a vision-based system that interprets a user's gestures in real time to manipulate objects within a medical data visualization environment. Dynamic navigation gestures are translated into commands based on their relative positions on the screen. Static gesture poses are identified to execute non-directional commands. This is accomplished by using Haar-like features to represent the shape of the hand; these features are then input to a fuzzy c-means (FCM) clustering algorithm for pose classification. A probabilistic neighborhood search algorithm is employed to automatically select a small number of Haar features and to tune the FCM classifier. The gesture recognition system was implemented in a sterile medical data-browser environment. Test results on four interface tasks showed that the use of a few Haar features with the supervised FCM yielded success rates of 95 to 100%. In addition, a small exploratory test of the AdaBoost Haar system was conducted to detect a single hand gesture and to assess its suitability for hand gesture recognition.
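To make the classification step concrete, the following is a minimal sketch of the standard fuzzy c-means algorithm applied to feature vectors (standing in for the Haar-like hand-shape features of the paper). This is an illustrative NumPy implementation of generic FCM, not the authors' tuned system; the fuzziness exponent `m`, tolerance, and toy data are assumptions.

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=100, tol=1e-5, seed=0):
    """Fuzzy c-means clustering.

    X: (n_samples, n_features) feature vectors.
    Returns (centers, U) where U is the (c, n_samples) membership matrix;
    each column of U sums to 1.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                        # normalize memberships per sample
    p = 2.0 / (m - 1.0)                       # exponent in the membership update
    for _ in range(iters):
        Um = U ** m
        # Cluster centers: fuzzy-weighted means of the samples.
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # Distances from each center to each sample, shape (c, n).
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
        d = np.fmax(d, 1e-12)                 # avoid division by zero
        # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)).
        U_new = 1.0 / (d ** p * (d ** -p).sum(axis=0))
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Toy example: two well-separated groups of 2-D "feature vectors".
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1],
              [9, 9], [9, 10], [10, 9], [10, 10]], dtype=float)
centers, U = fcm(X, c=2)
labels = U.argmax(axis=0)   # classify each sample by its strongest membership
```

In a supervised setting like the paper's, the cluster centers would be fitted per gesture class from labeled training poses, and a new hand image would be assigned to the class whose center gives it the highest membership.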
| Original language | English |
|---|---|
| State | Published - 1 Dec 2005 |
Keywords
- Computerized medical equipment
- Fuzzy c-means
- Haar-like features
- Hand gesture recognition
- Neighborhood search
ASJC Scopus subject areas
- Mechanical Engineering
- Mechanics of Materials
- General Materials Science
- Industrial and Manufacturing Engineering