Interaction and Annotation at a Distance in Outdoor Augmented Reality

Jason Wither, Tobias Höllerer

Tree Labeling Example Application


Overview

We have developed and evaluated components of a system for adding new annotations to an existing outdoor model from a stationary viewpoint. The system consists of three main components, all designed to help the user align a 3D cursor with physical objects at the correct distance to those objects.
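For concreteness, here is a minimal sketch (in Python) of the core primitive: placing the 3D cursor along the user's view ray at a chosen depth. The function name, coordinate conventions, and parameters are illustrative assumptions for this page, not the system's actual code; choosing the depth well is the hard part that the rest of our work addresses.

import numpy as np

def cursor_position(eye, yaw, pitch, depth):
    # Place the cursor `depth` meters along the view ray given by the
    # head tracker's yaw/pitch (radians). Assumed convention: x = east,
    # y = north, z = up; yaw measured clockwise from north.
    direction = np.array([
        np.cos(pitch) * np.sin(yaw),   # east
        np.cos(pitch) * np.cos(yaw),   # north
        np.sin(pitch),                 # up
    ])
    return np.asarray(eye, dtype=float) + depth * direction

# Example: cursor 25 m ahead of a user at the origin, looking due north.
print(cursor_position([0.0, 0.0, 1.8], yaw=0.0, pitch=0.0, depth=25.0))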

The first component comprises four interaction techniques for moving a cursor through 3-space. The techniques were designed for a mobile AR setting, keeping the user's hands as free as possible, and we conducted a user study to determine which technique was most successful.

The second component is a hybrid vision/inertial tracking system for improved orientation tracking. Our tracking module removes the lag and drift of an inertial orientation tracker by visually tracking features at annotated landmarks; the observed image positions of these landmarks are then combined with the inertial measurements for robust tracking. This tracking system was developed for use in the evaluation studies of the third part of our work.

For the third component, we developed AR cues to help users judge distance, both to their cursor and to the surrounding real and virtual objects. We designed three different depth cues and conducted a user study testing them in two scenarios: one where the user was given no other information, and one where other objects in the user's field of view already carried virtual annotations.
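To illustrate the second component, here is a minimal sketch of one way such vision/inertial fusion can work: a complementary-filter-style correction from a single tracked landmark, assuming a pinhole camera. The constants, names, and signatures are our assumptions for illustration, not the actual module.

import numpy as np

FOCAL_PX = 800.0          # assumed focal length in pixels
CX, CY = 320.0, 240.0     # assumed principal point (640x480 image)
ALPHA = 0.1               # vision correction gain per frame

def vision_orientation(landmark_world_dir, landmark_px):
    # Recover the camera's yaw/pitch from one tracked landmark whose
    # world direction (yaw, pitch, radians) is known from the model
    # and which the feature tracker found at pixel (u, v).
    u, v = landmark_px
    # Angular offset of the landmark from the camera's optical axis.
    off_yaw = np.arctan2(u - CX, FOCAL_PX)
    off_pitch = np.arctan2(CY - v, FOCAL_PX)   # image v grows downward
    yaw_w, pitch_w = landmark_world_dir
    # Orientation that would place the landmark at that pixel.
    return yaw_w - off_yaw, pitch_w - off_pitch

def fuse(inertial_yaw, inertial_pitch, landmark_world_dir, landmark_px):
    # Blend the drifting inertial estimate toward the vision estimate
    # (ignoring angle wraparound for brevity).
    vis_yaw, vis_pitch = vision_orientation(landmark_world_dir, landmark_px)
    yaw = (1 - ALPHA) * inertial_yaw + ALPHA * vis_yaw
    pitch = (1 - ALPHA) * inertial_pitch + ALPHA * vis_pitch
    return yaw, pitch

Each frame in which the landmark is visible pulls the inertial estimate a small step toward the vision-derived orientation, which removes accumulated drift without the lag of a vision-only tracker.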

Current / Future Work

We are currently building a more complete wearable system with position tracking in addition to orientation tracking. We hope to use the resulting 6-DOF tracking to improve the placement of annotations in several ways. First, it gives the user the ability to move around, making additional depth cues available when judging the distance to objects. More importantly, if the tracking is robust enough, it will allow us to model parts of the scene automatically by combining the vision tracking with GPS/inertial tracking. If the user looks at a feature from one location, then moves to a different location and looks at the same feature again, we can triangulate that feature's position from the user's two locations and viewing directions, as sketched below. Ultimately this could enable automatic scene reconstruction: as the user walks through a scene, features are selected automatically and their 3D locations are computed.
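The triangulation step could look like the following sketch, which returns the midpoint of the shortest segment between the two viewing rays (a standard two-view construction; the names and error handling are illustrative assumptions):

import numpy as np

def triangulate(p1, d1, p2, d2):
    # p1, p2: user positions (3-vectors) from GPS/inertial tracking.
    # d1, d2: view directions toward the same feature at each position.
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    b = np.dot(d1, d2)
    denom = 1.0 - b * b
    if denom < 1e-8:
        raise ValueError("rays are (nearly) parallel; move farther first")
    r = p2 - p1
    # Ray parameters of the two closest points, from minimizing
    # |(p1 + t1*d1) - (p2 + t2*d2)|^2.
    t1 = (np.dot(r, d1) - b * np.dot(r, d2)) / denom
    t2 = (b * np.dot(r, d1) - np.dot(r, d2)) / denom
    c1 = p1 + t1 * d1
    c2 = p2 + t2 * d2
    return 0.5 * (c1 + c2)

# Example: the same feature sighted from two points 10 m apart
# triangulates to (0, 10, 0).
print(triangulate(np.array([0.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]),
                  np.array([10.0, 0.0, 0.0]), np.array([-1.0, 1.0, 0.0])))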

Publications

Jason Wither and Tobias Höllerer
Pictorial Depth Cues for Outdoor Augmented Reality (Best Paper Nominee)
In Proc. International Symposium on Wearable Computers, October 2005
(PDF)

Jason Wither
Interaction and Annotation at a Distance in Outdoor Augmented Reality
Master's Thesis, September 2005
(PDF)

Jason Wither and Tobias Höllerer
Evaluating Techniques for Interaction at a Distance
In Proc. International Symposium on Wearable Computers, November 2004
(PDF)

Videos

Evaluating Techniques for Interaction at a Distance (DivX format)

Pictorial Depth Cues for Outdoor Augmented Reality (DivX format)