- Friday, June 6, 2014, 5 - 7:45pm
- 2024 Elings Hall, UC Santa Barbara
- Information and Directions at http://ilab.cs.ucsb.edu


At the "Four Eyes" Lab, directed by Matthew Turk and Tobias Höllerer, we pursue research in the four I's of Imaging, Interaction, and Innovative Interfaces. During the open house, we will be describing and demonstrating several ongoing research projects. Feel free to drop by any time between 5 and 7:45pm to have a look at any projects that interest you, talk to the lab's faculty, students, and visitors, and partake of some refreshments. See also our handout.


List of Presented Projects and Presenters:

Visualizing Targeted Audiences

Saiph Savage, Angus Forbes, Carlos Toxtli, Grant Mckenzie, Shloka Desai, Luis Rodriguez, Tobias Höllerer

People want to collaborate with large audiences for meaningful collective efforts. However, finding the right people to collaborate with is hard. We introduce a novel online tool that uses interactive data visualizations to help people dynamically identify audiences for their collective efforts.

Mixed Reality Simulation Using Large Displays and Mobile Devices

Mathieu Rodrigue, Drew Waranis, Tim Wood, Tobias Höllerer

With a mixed reality simulator, one can perform usability studies and evaluate mixed reality systems while minimizing confounding variables. Our implementation consists of a large display and mobile device, which are considered the "real world" backdrop and augmenting device, respectively.

TweetProbe: A Real-Time Microblog Stream Visualization

Byungkyu (Jay) Kang, George Legrady and Tobias Höllerer

We present a novel data visualization approach for real-time social data stream analytics using Twitter streaming data. The visual and architectural design of the system has been implemented as a real-time visualization framework that shows the trendiest tweets and hashtags along with the sentiment of individual messages.

iVisDesigner: Expressive Interactive Design of Information Visualizations

Donghao Ren, Tobias Höllerer, Xiaoru Yuan

iVisDesigner is a web-based system that enables users to design information visualizations for complex datasets interactively, without the need for textual programming. Our system achieves high interactive expressiveness through conceptual modularity, covering a broad information visualization design space. iVisDesigner supports the interactive design of interactive visualizations, for example by providing responsive graph layouts and various types of brushing-and-linking interactions.

Interactive Remote Collaboration via Augmented Reality

Steffen Gauglitz, Benjamin Nuernberger, Matthew Turk, Tobias Höllerer

Using state-of-the-art computer vision and augmented reality technology, we push the boundaries of remote collaboration. Rather than passively watching video feeds as with conventional video conferencing, we allow users to virtually navigate the live remote scene and communicate spatial information via world-anchored annotations.

Image Synthesis via Bayesian Filtering

Victor Fragoso, Pradeep Sen, and Matthew Turk

Several approaches to image retargeting, image inpainting, HDR imaging, and other applications use a common patch-based procedure to synthesize images. Unfortunately, this procedure can distort the structure of objects in the scene or introduce artifacts. To alleviate this issue, we pose patch-based image synthesis as a sequential estimation problem in which color samples are taken over time. From this perspective, we use a recursive Bayesian filter to estimate the color of every pixel in the output image. The filter suppresses large color variations that would otherwise break the coherent structure of the scene or add artifacts when estimating pixel colors.
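The recursive estimation idea can be illustrated with a minimal Kalman-style update for a single pixel: each new color sample from a matched patch is blended into the running estimate according to the current uncertainty. This is only an illustrative sketch of recursive Bayesian filtering; the function and the numeric values are hypothetical, not the authors' implementation.

```python
def bayes_pixel_update(mean, var, sample, sample_var):
    """One recursive Bayesian (Kalman-style) update of a pixel's
    color estimate given a new patch-based color sample.
    All names and values here are illustrative."""
    k = var / (var + sample_var)           # gain: how much to trust the sample
    new_mean = mean + k * (sample - mean)  # blend the sample into the estimate
    new_var = (1.0 - k) * var              # uncertainty shrinks with each sample
    return new_mean, new_var

# Toy example: fuse noisy color votes for one pixel over time.
mean, var = 0.5, 1.0  # prior: mid-gray intensity, high uncertainty
for sample in [0.8, 0.75, 0.82, 0.78]:
    mean, var = bayes_pixel_update(mean, var, sample, sample_var=0.05)
```

Because the gain shrinks as the estimate stabilizes, a late outlier sample moves the pixel's color only slightly, which is the mechanism that damps the large color variations mentioned above.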

Non-Visual Navigation with Music and Vibrotactile Cues

Emily Fujimoto and Matthew Turk

One common use for smartphones is pedestrian navigation. However, most navigation applications rely on visual information, distracting the user from obstacles and hazards that may be in their path. This project aims to communicate navigation information using only non-visual information, specifically vibrotactile and spatial music feedback.

MoodPlay: A Mood-based Recommender System

Ivana Andjelkovic, John O'Donovan

This demo introduces a novel user interface for music discovery and personalized recommendation, based on a visual model of music-relevant moods. While state-of-the-art recommender systems suggest music in the form of ordered lists without offering effective means for divergence, the goal of this research is to position a user within the music space based on personal taste and to provide visual structure that encourages further exploration.

Diner's Dilemma

James Schaffer, John O'Donovan, Tobias Höllerer

To help us understand how information content affects trust, we have developed an interactive version of the classic three-person "Diner's Dilemma" game, which has been well studied by researchers in cognitive science and psychology. We are particularly interested in how the user interface can influence trust decisions in the abstract game and, further, in how those effects might carry over to real-world decision-making scenarios. The game itself is simple: you will repeatedly dine with two others over the course of a year, always splitting the bill at the end of each meal, and each time you must choose between a low-quality, low-cost item (hotdog) and a high-quality, high-cost item (lobster). Analyzing dining behavior over time lets us better understand how participants react to the user interface and to their opponents.
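The payoff structure behind the dilemma can be sketched in a few lines: each diner's utility is the value of their own meal minus one third of the total bill. The specific values and costs below are made up for illustration and are not the parameters used in the study; they are chosen so that lobster is individually tempting but collectively ruinous, which is what makes the game a dilemma.

```python
def payoff(my_choice, others, value, cost):
    """Utility for one diner when the bill is split three ways.
    `value` and `cost` map each menu item to its enjoyment and price;
    the numbers used below are illustrative, not from the study."""
    total_cost = cost[my_choice] + sum(cost[c] for c in others)
    return value[my_choice] - total_cost / 3

value = {"hotdog": 4, "lobster": 10}
cost = {"hotdog": 3, "lobster": 12}

# If both others order hotdogs, defecting to lobster pays more for you...
u_coop = payoff("hotdog", ["hotdog", "hotdog"], value, cost)
u_defect = payoff("lobster", ["hotdog", "hotdog"], value, cost)
# ...but if everyone reasons that way, everyone ends up worse off.
u_all_lobster = payoff("lobster", ["lobster", "lobster"], value, cost)
```

Because the game is repeated over many "meals," players can condition their choices on what their co-diners did before, which is exactly the behavior the interface experiments observe.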

A Content Scoring Approach to Microblog Data Exploration

James Schaffer, Tobias Höllerer, John O'Donovan

Fluo is a tool developed at the Four Eyes Lab that uses a visual algorithm to recommend microblog content by synthesizing user-defined domain knowledge with assessments of the data from artificial intelligence. Users can interactively weigh the inputs from multiple AIs and specify the relative importance of terms in the dataset by moving sliders and applying traditional filters. Iterated interaction with the interface helps users collect evidence and accumulate insight more quickly over time, and it provides feedback on how an automated algorithm's parameters might be better tuned for the current task. Multiple scenarios will be available for visitors, including movie recommendations and a fictional emergency-relief scenario.
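The blending idea described above can be sketched as a weighted combination of AI scores plus user-specified term boosts. This is a hypothetical illustration of the general approach, not Fluo's actual scoring algorithm; the function, score names, and weights are all invented for the example.

```python
def content_score(post, ai_scores, ai_weights, term_weights):
    """Blend several AI assessments of a microblog post using user-set
    slider weights, then boost by user-marked term importance.
    A hypothetical sketch of the idea, not Fluo's algorithm."""
    base = sum(ai_weights[name] * s for name, s in ai_scores.items())
    boost = sum(w for term, w in term_weights.items() if term in post.lower())
    return base + boost

ai_scores = {"relevance": 0.6, "credibility": 0.9}   # per-post AI assessments
ai_weights = {"relevance": 0.7, "credibility": 0.3}  # slider positions
term_weights = {"flood": 0.5}                        # user-marked important term

s = content_score("Flood reported downtown", ai_scores, ai_weights, term_weights)
```

Moving a slider simply changes the corresponding weight, so re-scoring and re-ranking the stream after each interaction is cheap.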

User Perspective Magic Lens

Domagoj Baričević

Current augmented reality magic lens implementations render the augmented scene from the point of view of the camera on the hand-held device. That camera's perspective is very different from the user's, so what the user sees does not align with the real world. A true magic lens would show the scene from the point of view of the user, not the device. We demonstrate a prototype of a hand-held AR magic lens with user-perspective rendering.

Data OS

Christopher Hall

An architecture for metadata-friendly personal computing environments that acts simultaneously as a text editor, command-line interface, programming IDE, interactive interpreter, and runtime GUI for software with or without any UI code. Data OS aims to bring serialized data out of the 1960s with a universal, binary-syntax-based meta-format offering the recursion, binary encoding, and human-readability benefits that XML promised but never delivered. By fostering an explicitly structure-aware ecosystem, 'files' go from being passive, opaque monoliths of bytes to active, transparent data structures, with automatic management of the inlining and factoring of dependencies as pieces migrate across systems or contexts.