Tobias Höllerer (PI), Stephen DiVerdi, Sehwan Kim, Taehee Lee, Jonathan Ventura,
Jason Wither, Chris Coffin, Steffen Gauglitz, Chris Sweeney
We are conducting many different projects with the goal of Anywhere Augmentation. We introduced this term for the concept of building an AR system that works in arbitrary environments with no prior preparation. The main goal of this work is to lower the barrier to broad acceptance of augmented reality by expanding beyond research prototypes that only work in prepared, controlled environments. To this end, we are building a framework that unites the resources commonly available to any Anywhere Augmentation user – rough global position tracking, local sensors, globally available GIS data, and user input – to address the two main problem areas hindering widespread applicability of this technology: generality and robustness. We are working in several areas toward our goal of Anywhere Augmentation, including tracking in unprepared environments, interface and interaction design, and application development.
Motivation and Details
Our approach to Anywhere Augmentation is based on combining wearable computing and augmented reality (AR). Instead of embedding computing and display equipment in the environment, as in ubiquitous computing, graphical annotations are overlaid on top of the environment by means of the user's own equipment. AR can be presented via optical see-through glasses or via video overlay, which works with near-eye displays or hand-held devices such as smartphones.
Mobile and wearable computing technologies have found their way into mainstream industrial and governmental service applications over the past decade. They are now commonplace in the shipping and hospitality industries, as well as in mobile law enforcement, to highlight a few successful examples. However, current mobile computing solutions outside of research laboratories do not sense and adapt to the user's environment and they do not link the services they provide with the physical world.
The key difference between successful demonstrations of AR in research prototypes and general use is that the prototypes operate in at least partially controlled environments. To obtain reliable and accurate registration between the physical world and the augmentations, one either needs a model of the environment, or the environment needs to be instrumented, at least passively with registration markers. Both of these preconditions severely constrain the applicability of AR. Instrumentation of environments on a global scale is exceedingly unlikely to take place, and detailed 3D city and landscape models are very cumbersome to create. In addition, even if detailed 3D models of target environments existed on a broad scale, keeping them up to date would be a major challenge, and they would still not account for dynamic changes.
The problem with these high initial costs is that they form a barrier to entry for AR applications. Potential users are frequently turned away when it becomes apparent that using an AR system will require a few days of building, modeling, instrumenting, calibrating, and measuring, assuming all the required hardware is on hand. To create an active community of potential developers, it is important to foster experimentation with AR technologies by making use as simple as possible. Before widespread adoption can be expected, augmented reality must become a casual technology that people who are not experts in the field can try out themselves.
The research we are undertaking does not rely on any equipment in the environment (other than the satellites in the GPS constellation). Its focus is on the incremental generation of the necessary world models on the fly, in real time, while the system is in operation. This means that we need a simple initialization step, for which we take advantage of efficient computer-assisted interaction techniques to help establish the link and registration with the physical world.
Our concept of Anywhere Augmentation precludes us from relying on data that is not likely to become readily available worldwide, or at least nationwide, in the near future. We do propose to utilize several sources of GIS data for which there are already repositories with nationwide coverage (e.g., aerial photography, elevation, land use, street maps, and the NGA names database). Our concept of Anywhere Augmentation does not depend on the existence of any of these data sources, but we will consider them when available, and their existence will improve the user experience by providing more information and stronger constraints for user interaction.
Ideally, an AR system developed with the goal of Anywhere Augmentation in mind could be used "out of the box" in a new environment with no setup necessary. However, it is reasonable to expect some small amount of initial effort, so long as it does not detract from the overall experience. Thus, we focus on systems that take preparation time on the order of seconds, or at most a few minutes - enough time for quick calibration or semi-automatic acquisition of environment data, but far from enough for the careful measurement and setup work required by today's high-accuracy AR systems.
We propose the development of efficient, intelligently constrained semi-automatic interaction tools that allow a mobile AR user to easily establish and correct registration between the physical world and virtual augmentations. Furthermore, we propose techniques to automatically model the user's environment on the fly, geometrically and radiometrically, and we will leverage these models for improved AR interactions: matching the appearance of virtual geometry with the natural lighting conditions and controlling the layout and placement of 2D and 3D annotations.
Live Tracking and Mapping from Both General and Rotation-Only Camera Motion
Steffen Gauglitz, Chris Sweeney, Jonathan Ventura, Matthew Turk, Tobias Höllerer
Outdoor Modeling and Localization for Mobile Augmented Reality
Jonathan Ventura and Tobias Höllerer
Wide-area Mobile Localization from Panoramic Imagery
Jonathan Ventura and Tobias Höllerer
Virtual Keyframes for Environment Map Capturing
Sehwan Kim, Christopher Coffin and Tobias Höllerer
Online Environment Model Estimation for Augmented Reality
Jonathan Ventura and Tobias Höllerer
Fast Annotation and Modeling with a Single-Point Laser Range Finder
Jason Wither, Chris Coffin, Jonathan Ventura, and Tobias Höllerer
Annotation with Aerial Photographs
Jason Wither, Stephen DiVerdi, Tobias Höllerer
We present a mobile augmented reality system for outdoor annotation of the real world. To reduce user burden, we use aerial photographs in addition to the wearable system's usual data sources (position, orientation, camera, and user input). This allows the user to accurately annotate 3D features with only a few simple interactions from a single position, by aligning features in both their first-person viewpoint and the aerial view. We examine three types of aerial photograph features - corners, edges, and regions - that are suitable for a wide variety of useful mobile augmented reality applications and are easily visible on aerial photographs. By using aerial photographs in combination with wearable augmented reality, we are able to achieve much more accurate 3D annotation positions than were previously possible from a single user location.
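The core geometric idea - combining a first-person viewing ray with a point marked on the aerial photo - can be sketched as a ray/vertical-line intersection. This is an illustrative reconstruction, not the system's actual code: the function name and conventions are hypothetical, and positions are assumed to lie in a local, meters-based frame with z up.

```python
import numpy as np

def annotate_from_aerial(user_pos, view_dir, aerial_xy):
    """Intersect the user's first-person viewing ray with the vertical
    line through the point marked on the aerial photo (toy sketch)."""
    # horizontal offset from the user to the aerial-photo point
    dx = aerial_xy[0] - user_pos[0]
    dy = aerial_xy[1] - user_pos[1]
    horiz_dist = np.hypot(dx, dy)
    # horizontal length of the (unit) view direction
    horiz_dir = np.hypot(view_dir[0], view_dir[1])
    t = horiz_dist / horiz_dir          # ray parameter at that distance
    z = user_pos[2] + t * view_dir[2]   # height of the annotated feature
    return np.array([aerial_xy[0], aerial_xy[1], z])
```

For example, a user at eye height 1.8 m looking 45 degrees upward toward a point marked 10 m away on the aerial photo places the annotation about 11.8 m above the local ground plane.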
Evaluating Display Types for AR Selection and Annotation
Jason Wither, Stephen DiVerdi, Tobias Höllerer
This project presents a user study comparing performance on two tasks, selection and visual search, across three representative augmented reality display configurations: a head-worn display, and a hand-held display in two different user poses.
GroundCam: A Tracking Modality for Mobile Mixed Reality
Stephen DiVerdi, Tobias Höllerer
This project contributes to the goal of Anywhere Augmentation by introducing the GroundCam, a cheap tracking modality with no significant setup necessary. By itself, the GroundCam provides high frequency, high resolution relative position information similar to an inertial navigation system, but with significantly less drift. When coupled with a wide area tracking modality via a complementary Kalman filter, the hybrid tracker becomes a powerful base for indoor and outdoor mobile mixed reality work.
Handy AR: Markerless Inspection of Augmented Reality Objects Using Fingertip Tracking
Taehee Lee, Tobias Höllerer
We present markerless camera tracking and user interface methodology for readily inspecting augmented reality (AR) objects using a user's outstretched hand.
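One ingredient of such a pipeline is locating fingertip candidates on a segmented hand contour. The toy sketch below is hypothetical, not the published method (which detects fingertips via curvature on the contour); it only illustrates the simpler heuristic that fingertips tend to be local distance peaks from the hand's centroid.

```python
import numpy as np

def fingertip_candidates(contour, k=5):
    """Return indices of up to k fingertip candidates on a closed hand
    contour, taken as local maxima of distance from the centroid."""
    centroid = contour.mean(axis=0)
    d = np.linalg.norm(contour - centroid, axis=1)
    # local maxima of distance along the (closed) contour
    prev, nxt = np.roll(d, 1), np.roll(d, -1)
    peaks = np.where((d > prev) & (d > nxt))[0]
    # keep the k strongest peaks as fingertip candidates
    return peaks[np.argsort(d[peaks])[::-1][:k]]
```

On a synthetic five-lobed "hand" contour, the five lobe tips come back as the candidates; a real system would run this on a skin-color-segmented camera frame.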
- Wide-Area Scene Mapping for Mobile Visual Tracking.
- J. Ventura, T. Höllerer. International Symposium on Mixed and Augmented Reality (ISMAR) 2012.
- Live Tracking and Mapping from Both General and Rotation-Only Camera Motion.
- S. Gauglitz, C. Sweeney, J. Ventura, M. Turk, T. Höllerer. International Symposium on Mixed and Augmented Reality (ISMAR) 2012. Best Paper Award.
- Outdoor Mobile Localization from Panoramic Imagery.
- J. Ventura, T. Höllerer. IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 2011.
- Evaluating the Impact of Recovery Density on Augmented Reality Tracking.
- C. Coffin, C. Lee, T. Höllerer. IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 2011.
- Robust Relocalization and Its Evaluation For Online Environment Map Construction.
- S. Kim, C. Coffin, T. Höllerer. IEEE Transactions on Visualization and Computer Graphics, vol. 17(7), pp. 875–887.
- Relocalization Using Virtual Keyframes For Online Environment Map Construction.
- S. Kim, C. Coffin, T. Höllerer. Proc. ACM Virtual Reality Software and Technology (VRST) 2009.
- Annotation in Outdoor Augmented Reality.
- J. Wither, S. DiVerdi, T. Höllerer. Computers & Graphics, Volume 33, Issue 6.
- All Around the Map: Online Spherical Panorama Construction.
- S. DiVerdi, J. Wither, T. Höllerer. Computers & Graphics.
- Multi-Threaded Hybrid Feature Tracking for Markerless Augmented Reality.
- T. Lee, T. Höllerer. IEEE Transactions on Visualization and Computer Graphics, 15(3).
- Fast Annotation and Modeling with a Single-Point Laser Range Finder.
- J. Wither, C. Coffin, J. Ventura, T. Höllerer. International Symposium on Mixed and Augmented Reality (ISMAR) 2008.
- Heads Up and Camera Down: A Vision-based Tracking Modality for Mobile Mixed Reality.
- S. DiVerdi, T. Höllerer. IEEE Transactions on Visualization and Computer Graphics, Vol. 14, No. 3, May/June 2008.
- Envisor: Online Environment Map Construction for Mixed Reality.
- S. DiVerdi, J. Wither, T. Höllerer. Proc. IEEE VR 2008 (10th Int’l Conference on Virtual Reality). Best Paper Honorable Mention.
- Hybrid Feature Tracking and User Interaction for Markerless Augmented Reality.
- T. Lee, T. Höllerer. Proc. IEEE VR 2008 (10th Int’l Conference on Virtual Reality). Best Paper Nominee.
- Towards Anywhere Augmentation.
- S. DiVerdi. Doctoral Thesis, University of California, Santa Barbara, 2007.
- Handy AR: Markerless Inspection of Augmented Reality Objects Using Fingertip Tracking.
- T. Lee, T. Höllerer. Int'l Symposium on Wearable Computers (IEEE ISWC) 2007.
- Initializing Markerless Tracking Using a Simple Hand Gesture.
- T. Lee, T. Höllerer. Proc. ACM/IEEE International Symposium on Mixed and Augmented Reality (ISMAR).
- Evaluating Display Types for AR Selection and Annotation.
- J. Wither, S. DiVerdi, T. Höllerer. In: Proc. ACM/IEEE International Symposium on Mixed and Augmented Reality (ISMAR).
- Using Aerial Photographs for Improved Mobile AR Annotation.
- J. Wither, S. DiVerdi, T. Höllerer. International Symposium on Mixed and Augmented Reality, Santa Barbara, CA, Oct. 22-25, 2006.
- Pictorial Depth Cues for Outdoor Augmented Reality.
- J. Wither, T. Höllerer. Proc. IEEE International Symposium on Wearable Computers (ISWC), Osaka, Japan, Oct. 2005.
- Interaction and Annotation at a Distance in Outdoor Augmented Reality.
- J. Wither. Master's Thesis, September 2005.
- Evaluating Techniques for Interaction at a Distance.
- J. Wither, T. Höllerer. In Proc. ISWC '04 (Eighth IEEE Intl. Symp. on Wearable Computers), Arlington, VA, Oct. 31 - Nov. 3 2004.