Free-hand interaction for handheld augmented reality using an RGB-depth camera

Huidong Bai, Lei Gao, Jihad El-Sana, Mark Billinghurst

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

In this paper, we present a novel gesture-based interaction method for handheld Augmented Reality (AR) implemented on a tablet with an RGB-Depth camera attached. Compared with conventional device-centric interaction methods such as keypad, stylus, or touchscreen input, natural gesture-based interfaces offer a more intuitive experience for AR applications. Combined with depth information, gesture interfaces can extend handheld AR interaction into full 3D space. In our system we retrieve the 3D hand skeleton from color and depth frames and map the results to corresponding manipulations of virtual objects in the AR scene. Our method allows users to control virtual objects in 3D space using their bare hands and perform operations such as translation, rotation, and zooming.
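The mapping described in the abstract — per-frame hand skeleton data driving translation, rotation, and zooming of a virtual object — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `VirtualObject` class, the palm-centre and pinch-distance inputs, and the specific gesture-to-operation rules are all assumptions made for the example.

```python
import numpy as np

class VirtualObject:
    """Minimal AR object state: position, yaw rotation, and uniform scale.
    (Hypothetical class for illustration; not from the paper.)"""
    def __init__(self):
        self.position = np.zeros(3)   # metres, in the AR scene frame
        self.yaw = 0.0                # rotation about the vertical axis, radians
        self.scale = 1.0

def apply_hand_pose(obj, prev_palm, curr_palm, prev_pinch, curr_pinch):
    """Map per-frame changes in a tracked hand pose to object manipulation.

    prev_palm / curr_palm: 3D palm centre (metres) from the skeleton tracker.
    prev_pinch / curr_pinch: thumb-index fingertip distance (metres).
    """
    # Translation: move the object by the palm's displacement.
    obj.position += curr_palm - prev_palm

    # Rotation: turn the object by the change in the palm's
    # horizontal bearing around the camera axis.
    obj.yaw += (np.arctan2(curr_palm[2], curr_palm[0])
                - np.arctan2(prev_palm[2], prev_palm[0]))

    # Zoom: scale by the ratio of pinch apertures (guard against
    # a degenerate zero-width pinch from a tracking glitch).
    if prev_pinch > 1e-6:
        obj.scale *= curr_pinch / prev_pinch
    return obj
```

In a real handheld AR pipeline, `prev_palm`/`curr_palm` and the pinch distances would come from the per-frame hand skeletonization of the RGB-D stream, and the resulting object state would feed the renderer.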

Original language: English
Title of host publication: SIGGRAPH Asia 2013 Symposium on Mobile Graphics and Interactive Applications, SA 2013
Publisher: Association for Computing Machinery
ISBN (Print): 9781450326339
DOIs
State: Published - 1 Jan 2013
Event: SIGGRAPH Asia 2013 Symposium on Mobile Graphics and Interactive Applications, SA 2013 - Hong Kong, Hong Kong
Duration: 19 Nov 2013 – 22 Nov 2013

Publication series

Name: SIGGRAPH Asia 2013 Symposium on Mobile Graphics and Interactive Applications, SA 2013

Conference

Conference: SIGGRAPH Asia 2013 Symposium on Mobile Graphics and Interactive Applications, SA 2013
Country/Territory: Hong Kong
City: Hong Kong
Period: 19/11/13 – 22/11/13

Keywords

  • 3D user interfaces
  • gesture interaction
  • hand skeletonization
  • handheld augmented reality
