Handy AR: Markerless Inspection of Augmented Reality Objects Using Fingertip Tracking

Taehee Lee, Tobias Höllerer

Overview

Handy AR is a vision-based user interface that tracks a user's outstretched hand and uses it as the reference pattern for augmented reality (AR) inspection, providing 6-DOF camera pose estimation from the tracked fingertip configuration. A hand pose model is constructed in a one-time calibration step by measuring the fingertip positions relative to each other in the presence of ground-truth scale information. By reconstructing the camera pose relative to the hand frame by frame, we can stabilize 3D graphics annotations on top of the hand, allowing the user to inspect such virtual objects conveniently from different viewing angles in AR.

Fingertip Detection

Fingertips are detected using a curvature-based algorithm on the contour of the user's hand. Contour points with high curvature values are selected as candidate fingertip points, and an ellipse is then fitted to each candidate to locate the fingertip accurately. Five fingertips are detected and ordered based on the position of the thumb, so that they can serve as point correspondences for a camera pose estimation algorithm.
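The candidate-selection step can be illustrated with a k-curvature measure on the contour: at each contour point, compare the incoming and outgoing directions over a span of k points, and flag points where the direction turns sharply. This is a minimal pure-Python sketch, not the authors' implementation; the function name, the parameters, and the synthetic "finger" contour are illustrative, and the ellipse-fitting refinement described above is omitted.

```python
import math

def fingertip_candidates(contour, k=3, min_turn_deg=60.0):
    """Return indices of contour points whose k-curvature turn angle
    exceeds min_turn_deg; sharp turns are fingertip candidates."""
    candidates = []
    for i in range(k, len(contour) - k):
        ax, ay = contour[i - k]
        bx, by = contour[i]
        cx, cy = contour[i + k]
        v1 = (bx - ax, by - ay)            # incoming direction
        v2 = (cx - bx, cy - by)            # outgoing direction
        n1, n2 = math.hypot(*v1), math.hypot(*v2)
        if n1 == 0 or n2 == 0:
            continue
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))
        turn = math.degrees(math.acos(cos_a))   # 0 deg = straight line
        if turn >= min_turn_deg:
            candidates.append(i)
    return candidates

# Synthetic contour: a straight segment rising to a sharp apex at index 9,
# then a straight segment falling away -- a crude "fingertip" profile.
contour = [(i, i) for i in range(10)] + [(10 + i, 8 - i) for i in range(10)]
print(fingertip_candidates(contour, k=3, min_turn_deg=80.0))  # -> [9]
```

In a real system, neighboring high-curvature indices would be merged into one candidate before the ellipse fit refines the fingertip location to sub-pixel accuracy.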

Camera Pose Estimation

As long as the five fingertips of the fixed hand posture are tracked successfully, the pose estimation method has enough point correspondences to estimate the extrinsic camera parameters, yielding a 6-DOF camera pose relative to the hand. To inspect an AR object on top of the hand from different viewing angles, the user may rotate or move the hand arbitrarily.
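One common way to solve this kind of pose problem, sketched below with NumPy, is to treat the calibrated fingertip coordinates as approximately coplanar (Z = 0 in the hand frame), estimate a homography from the five 2D–3D correspondences by DLT, and decompose it using the known camera intrinsics K. This is an illustrative sketch under those assumptions, not the paper's implementation; in practice a full PnP solver (e.g., OpenCV's solvePnP) with noise handling would be used, and the sample fingertip coordinates here are made up.

```python
import numpy as np

def pose_from_planar_points(obj_xy, img_uv, K):
    """Estimate [R|t] from >=4 coplanar model points (Z=0, hand frame)
    and their image projections: DLT homography + decomposition."""
    A = []
    for (x, y), (u, v) in zip(obj_xy, img_uv):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)                 # homography, up to scale
    B = np.linalg.inv(K) @ H                 # ~ [r1 r2 t] up to scale
    if B[2, 2] < 0:                          # keep the hand in front of the camera
        B = -B
    lam = (np.linalg.norm(B[:, 0]) + np.linalg.norm(B[:, 1])) / 2
    r1, r2, t = B[:, 0] / lam, B[:, 1] / lam, B[:, 2] / lam
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt2 = np.linalg.svd(R)             # snap to the nearest rotation
    return U @ Vt2, t

# Synthetic check: project 5 hypothetical fingertip points, recover the pose.
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
c, s = np.cos(0.1), np.sin(0.1)
Rx = np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
c2, s2 = np.cos(-0.2), np.sin(-0.2)
Ry = np.array([[c2, 0, s2], [0, 1, 0], [-s2, 0, c2]])
R_true, t_true = Rx @ Ry, np.array([0.05, -0.02, 0.4])
obj = [(0, 0), (0.03, 0.08), (0.06, 0.09), (0.09, 0.07), (0.11, 0.02)]
uv = []
for x, y in obj:
    p = K @ (R_true @ np.array([x, y, 0.0]) + t_true)
    uv.append(p[:2] / p[2])

R_est, t_est = pose_from_planar_points(obj, uv, K)
print(np.allclose(R_est, R_true, atol=1e-6),
      np.allclose(t_est, t_true, atol=1e-6))  # True True
```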

Interaction

Handy AR can be used to interact with AR objects, including objects stabilized in the world by a marker-based AR library such as ARTag. Selecting an AR object and then inspecting it can be performed effectively with the user's hand. While the world coordinate system is defined by the markers' transformation matrices, the local coordinate system given by the fingertip-based camera pose is used for inspecting the virtual object on top of the hand.
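The coordinate bookkeeping implied by the last sentence can be sketched with 4x4 homogeneous transforms: when a world-stabilized object is picked up, it is re-parented from the marker-defined world frame into the hand frame so that it follows the hand afterwards. The transform names below (T_cam_world from the marker tracker, T_cam_hand from the fingertip pose) and the sample poses are hypothetical.

```python
import numpy as np

def transform(R, t):
    """Build a 4x4 homogeneous transform from a rotation and translation."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def object_pose_in_hand(T_cam_world, T_cam_hand, T_world_obj):
    """Re-parent a world-stabilized object into the hand coordinate system:
    T_hand_obj = inv(T_cam_hand) @ T_cam_world @ T_world_obj."""
    return np.linalg.inv(T_cam_hand) @ T_cam_world @ T_world_obj

# Hypothetical tracker outputs at the moment the object is selected.
c, s = np.cos(0.3), np.sin(0.3)
Rz = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
T_cam_world = transform(Rz, [0.0, 0.1, 1.0])        # marker tracker
T_cam_hand = transform(np.eye(3), [0.1, 0.0, 0.5])  # fingertip pose
T_world_obj = transform(np.eye(3), [0.2, 0.2, 0.0]) # object in world frame

T_hand_obj = object_pose_in_hand(T_cam_world, T_cam_hand, T_world_obj)
# Rendering through either chain places the object identically in camera space:
print(np.allclose(T_cam_hand @ T_hand_obj, T_cam_world @ T_world_obj))  # True
```

From the selection frame onward, rendering the object with T_cam_hand @ T_hand_obj keeps it attached to the moving hand, which is what enables inspection from arbitrary viewing angles.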

Source Code

A preliminary version, HandyAR Ver0.2, is available. Please e-mail the author with any comments.

Publications

T. Lee and T. Höllerer. Multithreaded Hybrid Feature Tracking for Markerless Augmented Reality. IEEE Transactions on Visualization and Computer Graphics 15(3):355-368, May 2009.

T. Lee and T. Höllerer. Hybrid Feature Tracking and User Interaction for Markerless Augmented Reality. In Proc. IEEE VR 2008 (10th Int'l Conference on Virtual Reality), Reno, NV, March 8-12, 2008, pp. 145-152. Best Paper Nominee.

T. Lee and T. Höllerer. Handy AR: Markerless Inspection of Augmented Reality Objects Using Fingertip Tracking. In Proc. IEEE International Symposium on Wearable Computers (ISWC), Boston, MA, Oct. 2007.

T. Lee and T. Höllerer. Initializing Markerless Tracking Using a Simple Hand Gesture. In Proc. IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR), Nara, Japan, Nov. 2007.