Hidden Infrastructure Visualization and Multimodal Interaction Techniques for Outdoor AR

Ryan Bane, Tobias Höllerer, Mathias Kölsch


Overview

This project explores visualization techniques and interaction "tools" for displaying multiple occluded layers of complex objects and data sources. The tools are operated through a set of non-traditional user interfaces, tailored to the mobile user, that enable simultaneous input of multi-dimensional control parameters.

Details

Occlusion is one of the basic truths of vision: if one opaque object is in front of another, the second object is occluded and cannot be seen. There are applications, however, in which a user might want to work around this limitation. A construction worker might want to view the internal structure of a building from outside its walls, or a paramedic might need to see the location of a specific room inside a building.

For Superman, this task was easy: the script says he can see through just what he needs to. We don't have this luxury. Instead, this research aims to design a system that uses augmented reality technology to display the infrastructure of complicated objects over the real-world view of those objects. Given a 3D model of a building and a tracking system for the user, the model can be rendered at the same angle and in the same apparent position as the building itself.
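
As a rough illustration of that registration step, the sketch below (Python with NumPy) builds a view matrix from a tracked camera pose and transforms one model vertex into camera space, ready for projection over the live view. The look_at helper and all coordinate values are hypothetical stand-ins, not taken from the actual system.

    import numpy as np

    def look_at(eye, target, up):
        # Build a right-handed view matrix from the tracked camera pose.
        f = target - eye
        f = f / np.linalg.norm(f)              # forward (gaze) axis
        s = np.cross(f, up)
        s = s / np.linalg.norm(s)              # right axis
        u = np.cross(s, f)                     # corrected up axis
        view = np.eye(4)
        view[0, :3], view[1, :3], view[2, :3] = s, u, -f
        view[:3, 3] = view[:3, :3] @ -eye      # translate world into camera space
        return view

    # Hypothetical tracker output and one vertex of the building model.
    eye = np.array([10.0, 1.8, 25.0])          # user's position (meters, world frame)
    target = np.array([0.0, 5.0, 0.0])         # point on the building the user faces
    vertex = np.array([2.0, 3.0, 1.0, 1.0])    # model vertex, homogeneous coordinates

    view = look_at(eye, target, np.array([0.0, 1.0, 0.0]))
    print(view @ vertex)                       # vertex in camera space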

So far, the work on this project has produced a set of interactive tools designed to give users virtual X-ray vision. These tools address a common problem in depicting occluded infrastructure: either too much information is displayed, confusing users, or too little is displayed, depriving users of important depth cues. Four tools are presented. The Tunnel Tool and the Room Selector Tool directly augment the user's view of the environment, allowing the scene to be explored in a direct, first-person view (see the sketch below). The Room in Miniature Tool lets the user select and interact with a room from a third-person perspective, providing points of view that would normally be difficult or impossible to achieve. The Room Slicer Tool aids users in exploring volumetric data displayed within the Room in Miniature Tool. Used together, these tools achieve the virtual X-ray vision effect.
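
To make the Tunnel Tool concrete, one plausible way to realize its cutaway effect is to fade out occluding geometry that falls inside a cylinder cast along the user's view axis, so the layers behind it show through. The sketch below is an assumption about how such a test could look, not the paper's implementation; tunnel_opacity and its radius and depth parameters are invented for illustration.

    import numpy as np

    def tunnel_opacity(point, eye, view_dir, radius, depth):
        # Occluders inside a cylinder cast along the view axis are drawn
        # nearly transparent so the infrastructure behind them shows through.
        v = point - eye
        along = np.dot(v, view_dir)             # distance along the view axis
        if along < 0.0 or along > depth:
            return 1.0                          # behind user or past the tunnel: opaque
        radial = np.linalg.norm(v - along * view_dir)
        return 1.0 if radial > radius else 0.15 # inside the tunnel: mostly see-through

    eye = np.zeros(3)
    view_dir = np.array([0.0, 0.0, -1.0])       # unit gaze direction
    wall_point = np.array([0.2, 0.0, -5.0])     # a point on an occluding wall
    print(tunnel_opacity(wall_point, eye, view_dir, radius=1.0, depth=20.0))  # -> 0.15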

We test our prototype system in a far-field mobile augmented reality setup, visualizing the interiors of a small set of buildings on the UCSB campus. To operate the tools in mobile, outdoor settings, we implemented a multimodal input interface that avoids most pitfalls of traditional input devices, such as bulky keyboards. Speech recognition controls discrete parameters, such as switching visualizations on and off. Hand gesture input controls two-dimensional parameters, such as the size and location of visualization tools with respect to the camera/view axis. A handheld trackball provides one-dimensional input of unbounded range, used, for example, to modify the distance of the Tunnel Tool from the user.
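
The division of labor among the modalities might be organized as in the following sketch. The class, event handlers, and command strings are hypothetical stand-ins for illustration, not the system's actual interface.

    class MultimodalController:
        # Routes each modality to the parameter dimensionality it handles best.
        def __init__(self):
            self.tunnel_enabled = False
            self.tunnel_center = (0.5, 0.5)     # 2D: normalized screen position
            self.tunnel_depth = 10.0            # 1D: meters, unbounded range

        def on_speech(self, command):           # speech: discrete on/off switches
            if command == "tunnel on":
                self.tunnel_enabled = True
            elif command == "tunnel off":
                self.tunnel_enabled = False

        def on_gesture(self, x, y):             # hand gesture: 2D placement in view
            self.tunnel_center = (x, y)

        def on_trackball(self, delta):          # trackball: 1D relative increments
            self.tunnel_depth = max(0.0, self.tunnel_depth + delta)

    ctrl = MultimodalController()
    ctrl.on_speech("tunnel on")
    ctrl.on_gesture(0.4, 0.6)
    ctrl.on_trackball(2.5)
    print(ctrl.tunnel_enabled, ctrl.tunnel_center, ctrl.tunnel_depth)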

Techniques for viewing the internals of the building build on previous work presented in the paper "Resolving Multiple Occluded Layers in Augmented Reality" by Mark A. Livingston, J. Edward Swan II, Joseph L. Gabbard, Tobias Höllerer, Deborah Hix, Simon J. Julier, Yohan Baillot, and Dennis Brown.

Publications

Ryan Bane and Tobias Höllerer.
Interactive Tools for Virtual X-Ray Vision in Mobile Augmented Reality.
In Proc. ISMAR '04 (Intl. Symp. on Mixed and Augmented Reality), Nov. 2–5, 2004, Arlington, VA.

Mathias Kölsch, Ryan Bane, Tobias Höllerer, and Matthew Turk.
Touching the Visualized Invisible: Wearable AR with a Multimodal Interface.
In submission.

Videos

Interactive Tools for Virtual X-Ray Vision

Related Projects

Hand Gesture Input
Resolving Multiple Occluded Layers in Augmented Reality