The Interactive FogScreen

Ismo Rakkolainen, Stephen DiVerdi, Alex Olwal, Nicola Candussi, Tobias Höllerer


FogScreen is a novel immaterial projection display: the image floats in thin air, and touching it feels like nothing. The FogScreen can also be made interactive, which is where the real fun starts! At UCSB, we have started work on new interaction technologies for the FogScreen, and use it for novel applications, from engineering to the arts, in a multidisciplinary manner.

See a video from SIGGRAPH 2005 Emerging Technologies! (More videos further down.)

And learn about the FogScreen's appearance on a futuristic Discovery Channel TV show!

The interactive FogScreen is an intriguing experience.


We've tested out three different input devices for use with the FogScreen.

Ultrasound: The eBeam Interactive is an ultrasound tracking solution designed to be used with whiteboard markers, to allow the sharing of information written on whiteboards. It provides 2D mouse-style input as well as buttons. The user must hold a small device about the size of a whiteboard marker to interact.

Unfortunately, the eBeam does not work well for our purposes: the ultrasound signal has proven too noisy and its range too limited. A more sophisticated ultrasound tracker might improve results, but the interaction of the ultrasound waves with the fog presents a fundamental limitation.

The eBeam tracker system

Range Finding: The SICK LMS200 is a time-of-flight laser range finder, providing a 1D array of distances from the source of the laser beam, which is swept along an arc. We use this range information to simulate a 2D mouse-style input by finding the closest object in range and converting the distance to an x,y coordinate. The main advantage of this technology is that no device is needed by the user to interact with the system - a hand can act as the input device.
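The closest-object conversion can be sketched as follows. This is an illustrative sketch, not our actual code: the function and parameter names are hypothetical, and the angular parameters would come from the scanner's configuration.

```python
import math

def scan_to_cursor(distances, start_angle, angle_step, max_range):
    """Convert a 1D laser range scan into a 2D cursor position.

    Picks the closest in-range return (assumed to be the user's hand)
    and converts its polar coordinates to Cartesian x, y in the
    scanner's frame. Angles are in radians, distances in meters.
    """
    best_i, best_d = None, max_range
    for i, d in enumerate(distances):
        if 0 < d < best_d:
            best_i, best_d = i, d
    if best_i is None:
        return None  # nothing within range: no cursor this frame
    angle = start_angle + best_i * angle_step
    return (best_d * math.cos(angle), best_d * math.sin(angle))
```

The resulting x, y can then be remapped into screen coordinates and fed to the application as mouse motion.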

We had some initial difficulties using the LMS200 due to interaction between the fog and the laser beam, but on an interaction plane about 5 inches or more in front of the fog screen, unencumbered hand (or object) tracking works nicely and reliably. From an interaction point of view, the system is very intuitive for a single user, but additional users easily confuse it, as they also intercept the beam and generate unexpected results (the disadvantage of a deviceless tracker).

The SICK LMS200 laser range finder

Vision-based Tracking: The WorldViz Precision Position Tracker (PPT) system is a vision-based tracker that uses inexpensive infrared cameras and LEDs to triangulate the 3D positions of tracked objects. Users hold small devices with single LEDs to allow them to interact. Multiple LEDs can be tracked simultaneously, currently up to 8. One advantage of infrared tracking is that the fog is invisible in the infrared spectrum, so it does not interfere; visible-spectrum tracking would be hampered by the FogScreen.

The primary limitations of PPT, and of vision-based tracking in general, are occlusion and lighting. Currently, we use 4 cameras placed around the FogScreen a few meters above the ground, which provides good coverage of the workspace. However, when large numbers of people crowd around the screen, each LED is no longer visible from the minimum number of cameras (3) needed to properly determine its position. The infrared nature of the PPT system requires that the environment have as little ambient infrared light as possible, which generally means no incandescent or natural illumination. Bright fluorescent light is fine, but as the FogScreen looks best in dark environments, we keep our lab dark in general.
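The underlying triangulation can be sketched as a least-squares intersection of camera rays. This is a generic formulation, not PPT's actual implementation: each camera that sees the LED contributes a ray, and the recovered point minimizes the summed squared perpendicular distance to all rays.

```python
import math

def triangulate(origins, directions):
    """Least-squares intersection of rays from calibrated cameras.

    Camera i contributes a ray: origin o_i and direction d_i toward
    the LED it sees. The returned point p solves the normal equations
    (sum_i (I - d_i d_i^T)) p = sum_i (I - d_i d_i^T) o_i.
    """
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for o, d in zip(origins, directions):
        norm = math.sqrt(sum(c * c for c in d))
        d = [c / norm for c in d]          # unit direction
        for i in range(3):
            for j in range(3):
                m = (1.0 if i == j else 0.0) - d[i] * d[j]
                A[i][j] += m               # accumulate projector
                b[i] += m * o[j]
    return _solve3(A, b)

def _solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][j] * x[j] for j in range(i + 1, 3))) / M[i][i]
    return x
```

Two non-parallel rays already determine a point; additional cameras add redundancy against noise and occlusion, which is why coverage from several viewpoints matters.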

To integrate button-style input, we used ELECTRO // VIRTUAL wireless joystick handles with attached infrared LEDs. The button events are transmitted over Bluetooth and appear as generic mouse buttons. Unfortunately, PPT does not provide any ID information with its tracking results, so we must use careful filtering to maintain the association between positions and button inputs.
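One way to maintain that association (a sketch, not necessarily what our filter does) is greedy nearest-neighbor matching between each frame's anonymous positions and the last known position of each device; all names here are hypothetical.

```python
import math

def associate(labeled_prev, unlabeled_now, max_jump):
    """Greedy nearest-neighbor data association.

    labeled_prev maps device id -> last known (x, y, z); unlabeled_now
    is the anonymous list of positions from the current frame.
    Globally closest pairs are matched first; matches farther than
    max_jump are rejected, so a lost LED cannot steal another
    device's id. Returns a dict id -> matched position.
    """
    pairs = sorted(
        (math.dist(p, q), i, j)
        for i, p in labeled_prev.items()
        for j, q in enumerate(unlabeled_now))
    used_ids, used_pts, out = set(), set(), {}
    for d, i, j in pairs:
        if d > max_jump:
            break  # remaining pairs are even farther apart
        if i in used_ids or j in used_pts:
            continue
        used_ids.add(i)
        used_pts.add(j)
        out[i] = unlabeled_now[j]
    return out
```

With the association maintained per frame, a Bluetooth button event can be attributed to whichever device id is currently bound to that joystick.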

A small infrared LED for use with PPT; an infrared-tracked joystick


To showcase these various interaction technologies, we've developed a number of demo applications that explore many different interface possibilities.

Painting with fireworks; painting with your hands

Painting: Using the input device as a 2D mouse makes the FogScreen an intuitive surface for traditional programs. With a black canvas, a user can paint in thin air with the usual tools of 2D paint programs, creating virtual graffiti. A slightly more advanced extension allows users to paint with dynamic effects like fire and sparks.
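When driving a paint program from a sampled tracker, one common trick (sketched here with hypothetical names, not necessarily what our demo does) is to interpolate brush stamps between consecutive cursor samples so that fast strokes stay continuous instead of leaving dotted gaps:

```python
import math

def stroke_points(a, b, spacing=2.0):
    """Interpolate brush-stamp positions between two cursor samples.

    a, b: consecutive (x, y) samples from the tracker; spacing: the
    maximum pixel gap between stamps. Returns the list of points at
    which to stamp the brush, including both endpoints.
    """
    d = math.hypot(b[0] - a[0], b[1] - a[1])
    steps = max(1, int(d / spacing))
    return [(a[0] + (b[0] - a[0]) * t / steps,
             a[1] + (b[1] - a[1]) * t / steps)
            for t in range(steps + 1)]
```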


The Consigalo interactive game

Consigalo: Our second demo is an engaging multiplayer game in which users grab animals and sort them into their respective goals for points. The interaction is in the spirit of tablet PC input - the user can move in front of the screen to move their cursor, and then touch the screen to initiate actions such as grabbing an animal. To accomplish this, we use the 3D position from PPT.
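The tablet-style mapping can be sketched as follows: the tracked x, y drive the cursor, and a "touch" fires when the device's depth comes within a small threshold of the screen plane. The names and the threshold value are illustrative assumptions, not our exact code.

```python
def cursor_and_touch(pos, screen_z, touch_threshold=0.05):
    """Tablet-style input from a 3D tracked position.

    pos: tracked (x, y, z) in meters; screen_z: depth of the fog
    plane. The x, y components drive the cursor; a touch fires when
    the device is within touch_threshold of the screen plane.
    Returns ((x, y), touching).
    """
    x, y, z = pos
    return (x, y), abs(z - screen_z) < touch_threshold
```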


The Rigid Body Simulator

Rigid Bodies: Most people's first reaction when they see the FogScreen is to reach out and touch the image. To provide a physically intuitive experience, we also produced a demo that simulates rigid body dynamics for 3D objects that, when projected onto the FogScreen, appear to float in front of the audience. Users control green paddle objects using a 2D input technique, which can then influence the motion of the other rigid bodies in the scene by colliding with them. Collisions are accentuated by a shower of sparks and an appropriate sound. The final result is an experience that very much feels like playing with actual objects floating in the air.
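The paddle-to-body collision response can be sketched with a standard restitution-based impulse along the contact normal. This is a generic textbook formulation, not our simulator's exact code.

```python
def collide(v1, v2, n, e=0.9, m1=1.0, m2=1.0):
    """Impulse-based collision response between two rigid bodies.

    v1, v2: 2D velocities; n: unit contact normal from body 1 toward
    body 2; e: coefficient of restitution (1 = perfectly bouncy);
    m1, m2: masses. Returns the post-collision velocities; bodies
    that are already separating are left untouched.
    """
    rel_n = (v2[0] - v1[0]) * n[0] + (v2[1] - v1[1]) * n[1]
    if rel_n >= 0:
        return v1, v2  # moving apart along the normal: no impulse
    j = -(1 + e) * rel_n / (1 / m1 + 1 / m2)
    return ((v1[0] - j * n[0] / m1, v1[1] - j * n[1] / m1),
            (v2[0] + j * n[0] / m2, v2[1] + j * n[1] / m2))
```

Giving the paddle a very large mass makes it behave like an immovable object the user swings through the scene, while the free bodies bounce away.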


The Virtual Forest Navigator

Virtual Forest: Our Virtual Forest project was modified to be used with the FogScreen to show how a first person style interface would feel, and to show off some advanced real-time rendering techniques on the novel display. A user can navigate the forest using a tracked wireless joystick to control their velocity and direction. Different buttons also allow the user to look around and change the direction of the sunlight.


The Face Deformer

Elastic Face Deformation: This demo allows the user to interactively stretch and sculpt the shape of a 3D head model. The interface uses a tracked wireless joystick to control a 3D cursor around the head, while buttons on the joystick trigger stretching or sculpting actions.


The live images to the left illustrate the two-sidedness of our FogScreen (different but coordinated images on the front and back).


I. Rakkolainen, K. Palovuori
Interactive FogScreen. Poster at ACM UIST 2004, Santa Fe, NM, USA, October 24-27, 2004.

Related Pages

FogScreen Inc.