TY - GEN
T1 - Toward realistic hands gesture interface
T2 - 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017
AU - Krupka, Eyal
AU - Freedman, Daniel
AU - Leichter, Ido
AU - Karmon, Kfir
AU - Gurvich, Ilya
AU - Smolin, Yoni
AU - Bloom, Noam
AU - Hurvitz, Aviv
AU - Tzairi, Yuval
AU - Vinnikov, Alon
AU - Bar Hillel, Aharon
N1 - Publisher Copyright:
© 2017 ACM.
PY - 2017/05/02
Y1 - 2017/05/02
N2 - Development of a rich hand-gesture-based interface is currently a tedious process, requiring expertise in computer vision and/or machine learning. We address this problem by introducing a simple language for pose and gesture description, a set of development tools for using it, and an algorithmic pipeline that recognizes it with high accuracy. The language is based on a small set of basic propositions, obtained by applying four predicate types to the fingers and to palm center: direction, relative location, finger touching and finger folding state. This enables easy development of a gesture-based interface, using coding constructs, gesture definition files or an editing GUI. The language is recognized from 3D camera input with an algorithmic pipeline composed of multiple classification/regression stages, trained on a large annotated dataset. Our experimental results indicate that the pipeline enables successful gesture recognition with a very low computational load, thus enabling a gesture-based interface on low-end processors.
KW - Hand gesture NUI development
KW - Hand gesture recognition
UR - http://www.scopus.com/inward/record.url?scp=85044848224&partnerID=8YFLogxK
U2 - 10.1145/3025453.3025508
DO - 10.1145/3025453.3025508
M3 - Conference contribution
AN - SCOPUS:85044848224
T3 - Conference on Human Factors in Computing Systems - Proceedings
SP - 1887
EP - 1898
BT - CHI 2017 - Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems
PB - Association for Computing Machinery
Y2 - 6 May 2017 through 11 May 2017
ER -