Abstracts for James A. Landay

The EECS Research Summary for 2003


Liquid: Context-Aware Distributed Queries

Alan Newberger, Christopher Beckmann, Jeffrey Heer, and Jason I. Hong
(Professors Anind Dey, James A. Landay, and Jennifer Mankoff)
(NSF) IIS-0205644

Context-aware computing is an effort to use sensed attributes of an environment to provide enriched support for activities. For example, an application might provide relevant services based upon your location or the identity of your companions. As low-level architectural support for context-aware computing matures [1,2], we are ready to explore more general and powerful means of access to context data. Information required by a context-aware application may be spread across a number of different repositories, partitioned by any number of physical, organizational, or privacy boundaries. What is needed is a mechanism for context-aware applications to issue context-based queries without having to explicitly manage the complex storage layout and access policies of the underlying data.

To address this need, we are developing Liquid, a distributed query processing system intended to both simplify and enhance the next generation of context-aware applications. Liquid will allow applications to issue long-standing queries in a simple declarative language and to monitor continuously changing query results. Our system is targeted at supporting two primary features: (1) continuous (persistent) queries sensitive to the dynamic nature of context (e.g., issuer changes location), and (2) queries with approximate results, where result substitutions can be made by exploiting relationships between repositories (e.g., a floor's temperature data is substituted for missing room temperature data). It is our hope that the Liquid system will provide a solid base for building advanced context-aware applications.
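
To make this concrete, the following Python sketch shows how a continuous query with approximate results might behave. It is only a toy model under our own assumptions; the names (ContextRepository, query, watch) are invented for illustration and are not Liquid's actual API or query language.

    from typing import Callable, Optional


    class ContextRepository:
        """In-memory stand-in for one repository of context data."""

        def __init__(self, name: str, parent: Optional["ContextRepository"] = None):
            self.name = name
            self.parent = parent      # e.g., a room's parent is its floor
            self.data = {}
            self.watchers = []        # (key, callback) pairs for continuous queries

        def update(self, key: str, value: float) -> None:
            """Store a sensed value and re-notify any continuous queries."""
            self.data[key] = value
            for watched_key, callback in self.watchers:
                if watched_key == key:
                    callback(key, value, self.name)

        def query(self, key: str):
            """Exact result if present; otherwise an approximate result
            substituted from an enclosing repository (room -> floor)."""
            if key in self.data:
                return self.data[key], self.name
            if self.parent is not None:
                return self.parent.query(key)
            return None

        def watch(self, key: str, callback: Callable) -> None:
            """Register a continuous (persistent) query."""
            self.watchers.append((key, callback))


    floor = ContextRepository("floor-4")
    room = ContextRepository("room-405", parent=floor)

    floor.update("temperature", 21.0)
    print(room.query("temperature"))   # (21.0, 'floor-4'): floor data substituted

    room.watch("temperature", lambda k, v, src: print(f"update: {k}={v} from {src}"))
    room.update("temperature", 23.5)   # exact data arrives; the watcher is notified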

[1]
A. K. Dey, D. Salber, and G. D. Abowd, "A Conceptual Framework and a Toolkit for Supporting the Rapid Prototyping of Context-Aware Applications," Human-Computer Interaction Journal, Vol. 16, No. 2-4, 2001.
[2]
J. I. Hong and J. A. Landay, "An Infrastructure Approach to Context-Aware Computing," Human-Computer Interaction, Vol. 16, No. 2-4, 2001.

More information (http://guir.berkeley.edu/projects/cfabric/) or send mail to the author (jheer@cs.berkeley.edu)

Insight: Tool Support for User-Centered Ubicomp Prototyping and Evaluation

Alan Liu1 and Peter Khooshabeh2
(Professors Anind Dey and James A. Landay)

The field of ubiquitous computing (Ubicomp) is still in its infancy, and there is no standard set of methodologies for evaluating Ubicomp systems. System designers benefit from performing multiple design iterations, and from having working functionality in a system before the final implementation, so they can get feedback on design issues early. This motivates prototyping techniques such as Wizard of Oz. Insight is a set of tools supporting the Wizard of Oz prototyping of Ubicomp systems and the evaluation of those systems using data collected in user studies. It is composed of the context event logger, a tool that allows a wizard in a Wizard of Oz scenario to simulate a sensor network capturing events in its environment, and the context event analyzer, a tool that shows higher-level aggregates of lower-level context data.
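
As a rough illustration of the logger/analyzer split, the Python sketch below shows a wizard-driven event log and one possible higher-level aggregation. The event fields and tool interfaces are our own assumptions; the abstract does not define Insight's actual formats.

    import time
    from collections import Counter
    from dataclasses import dataclass, field


    @dataclass
    class ContextEvent:
        sensor: str        # simulated sensor id, e.g. "rfid-shelf-3"
        subject: str       # entity observed, e.g. "milk-carton"
        location: str      # where the event occurred
        timestamp: float = field(default_factory=time.time)


    class EventLogger:
        """Stand-in for the context event logger driven by a wizard."""

        def __init__(self):
            self.log = []

        def record(self, sensor, subject, location):
            self.log.append(ContextEvent(sensor, subject, location))


    def events_by_location(log):
        """Stand-in for one analyzer view: aggregate low-level events
        into counts per location."""
        return Counter(event.location for event in log)


    logger = EventLogger()
    logger.record("rfid-shelf-3", "milk-carton", "walk-in fridge")
    logger.record("rfid-shelf-3", "butter", "walk-in fridge")
    logger.record("rfid-counter", "milk-carton", "prep counter")
    print(events_by_location(logger.log))
    # Counter({'walk-in fridge': 2, 'prep counter': 1})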

Our plan is to use the logger and analyzer to evaluate two iterations of an application for industrial-sized kitchens that tracks food items and helps users locate them. In the first iteration, we applied paper-prototyping techniques along with Wizard of Oz simulation of sensor input using the logger. Our second iteration will involve an interactive prototype that still depends on simulated sensors, but that will have networked computer screens and a simple database, so it can respond to users without a person (wizard) simulating application behavior. After user studies of each iteration's prototype, we will use the analyzer to examine the logged events, looking for events that identify design flaws. Our goal is to collect evidence demonstrating the strengths and flaws of both methods in Ubicomp design and evaluation.

1Undergraduate (EECS)
2Undergraduate (non-EECS)

Send mail to the author (lliu@eecs.berkeley.edu)

From Data to Display: The Design and Evaluation of a Peripheral Sound Display for the Deaf

Wai-ling Ho-Ching
(Professors James A. Landay and Jennifer Mankoff)

In 1997 there were 227,000 deaf people in the US who could not use hearing to gain awareness of sound. Instead, they rely on alternative techniques, such as sensing vibrations or watching flashing lights, to substitute for hearing in the workplace. However, there remains a gap between the experience of a hearing individual and that of a deaf person. Our work describes the design and evaluation of a peripheral display that provides deaf users with awareness of sound in an office environment, helping to close that gap. Conceptual drawings of sound by hearing participants, exploration with paper prototypes, interviews, and surveys formed the basis for our current design.

We implemented the two prototypes shown in Figures 1 and 2. One is based on a spectrograph, a tool commonly used by speech therapists that represents the pitch and intensity of sound over time. The other depicts position and amplitude over time. We evaluated them in a dual-task experiment with eight deaf participants and found that they were able to peripherally identify notification sounds, such as a door knock or a telephone ring, with both systems while performing a visual primary task. Participants had significantly higher identification rates with the visualization that represented position. Neither visualization significantly distracted participants from the primary task. This work [1] has been received with much enthusiasm by members of the deaf community and may ultimately lead to a system that better supports sound awareness for the deaf in situations of fixed visual focus.


Figure 1: A cellular phone ring as represented by our spectrograph visualization. In this visualization, height is mapped to frequency and color to intensity (blue = quiet; red = loud). The temporal aspect is depicted by animating the visualization from right to left. A cellular phone ring is recognizable by its regular frequency-amplitude pattern, which is typical of mechanical sounds.

Figure 2: A cellular phone ring as represented by our ripples visualization. A top-view map of the room appears in white. The rings denote the position of a sound source in the room. The size and color of the rings indicate the amplitude of the sound. Frequency does not appear in this visualization. A user can infer a sound source from its location: in this case, the participant was told the phone was on the desk, so a sound coming from the desk would probably be the phone.
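
To make the Figure 1 mapping concrete, here is a small Python sketch that turns one audio frame into a spectrograph column. The frame size, sample rate, and blue-to-red color ramp are our assumptions for illustration, not parameters of the actual system.

    import numpy as np


    def spectrograph_column(frame: np.ndarray, sample_rate: int = 8000):
        """Map one audio frame to a column of (frequency_hz, rgb) cells:
        vertical position encodes frequency, color encodes intensity
        (blue = quiet, red = loud)."""
        magnitudes = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
        # Normalize intensity into [0, 1] for the blue-to-red color ramp.
        levels = magnitudes / (magnitudes.max() + 1e-9)
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
        # rgb = (red, green, blue): red grows and blue fades as sound gets louder.
        return [(f, (level, 0.0, 1.0 - level)) for f, level in zip(freqs, levels)]


    # Example: a 440 Hz tone should produce a red (loud) cell near 440 Hz.
    t = np.arange(256) / 8000.0
    column = spectrograph_column(np.sin(2 * np.pi * 440 * t))
    loudest = max(column, key=lambda cell: cell[1][0])
    print(f"loudest bin at {loudest[0]:.0f} Hz")   # approximately 440 Hz

Successive columns, rendered right to left, would produce the animation described in the caption.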

[1]
F. W. Ho-Ching, J. Mankoff, and J. A. Landay, "From Data to Display: The Design and Evaluation of a Peripheral Sound Display for the Deaf," CHI (submitted). Also, UC Berkeley Computer Science Division Report No. UCB/CSD 02/1204, October 2002. Available online: http://www.cs.berkeley.edu/~wai-ling/pubs/chi2003long-submitted.pdf.

More information (http://guir.berkeley.edu/projects/ic2hear) or send mail to the author (wai-ling@cs.berkeley.edu)

Multimodal, Multi-Device Prototyping Using Programming by Illustration

Anoop Sinha
(Professor James A. Landay)
NSF Graduate Fellowship and (NSF) 9985111

User interface designers increasingly face the challenge of targeting multi-device, multimodal applications, but lack tools to support this task. This work proposes an informal prototyping tool named CrossWeaver, which implements the programming by illustration (PBI) technique, enabling non-programmer designers to build multimodal, multi-device user interface prototypes, test those prototypes with end users, and collect valuable feedback to inform iterative design.

PBI is a technique for user interface prototyping that involves building executable prototypes from example sketches. PBI has its origin in the informal interface approach [1], supporting natural human input, such as sketching, while minimizing recognition and transformation of the input. PBI also uses programming by demonstration techniques [2], enabling a working application to be built by an end-user based on concrete examples, in this case design sketches. CrossWeaver extends informal user interface and programming by demonstration research to multimodal, multi-device applications, enabling a designer to create and test a multi-device, multimodal prototype from a set of example-sketched storyboards.
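
One way to picture a storyboard-based prototype is the Python sketch below: sketched scenes as nodes, multimodal input events as transition labels, and each scene tagged with the devices it targets. The model and all names are hypothetical; they are not CrossWeaver's internal representation.

    from dataclasses import dataclass, field


    @dataclass
    class Scene:
        name: str                 # identifies one sketched storyboard frame
        devices: set              # e.g. {"pda", "desktop"}
        transitions: dict = field(default_factory=dict)  # (modality, value) -> scene


    class StoryboardPrototype:
        """A minimal interpreter that 'runs' the storyboard in a user test."""

        def __init__(self, scenes: dict, start: str):
            self.scenes = scenes
            self.current = start

        def handle(self, modality: str, value: str) -> str:
            """Advance to the next sketch when a (modality, value) event fires."""
            scene = self.scenes[self.current]
            self.current = scene.transitions.get((modality, value), self.current)
            return self.current


    home = Scene("home", {"pda", "desktop"},
                 {("speech", "show map"): "map", ("pen", "tap-map-icon"): "map"})
    map_view = Scene("map", {"pda"}, {("speech", "go back"): "home"})

    proto = StoryboardPrototype({"home": home, "map": map_view}, start="home")
    print(proto.handle("speech", "show map"))  # -> "map"
    print(proto.handle("speech", "go back"))   # -> "home"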


Figure 1: Screenshot of the initial CrossWeaver prototype

[1]
J. A. Landay and B. A. Myers, "Sketching Interfaces: Toward More Human Interface Design," IEEE Computer, Vol. 34, No. 3, 2001.
[2]
A. Cypher, D. C. Halbert, et al., eds., "Watch What I Do: Programming by Demonstration," MIT Press, Cambridge, MA, 1993.

More information (http://guir.berkeley.edu/projects/crossweaver/) or send mail to the author (aks@eecs.berkeley.edu)

Damask: Supporting Early-Stage Multi-Device UI Design Using Patterns

James Lin
(Professor James A. Landay)
(NSF) 9985111

People often use a variety of computing devices, such as PCs, PDAs, and cell phones, to access the same information. The user interface to this information needs to differ for each device, due to differing input and output constraints. Currently, designers of such multi-device user interfaces must either design a UI separately for each device, which is time-consuming, or use a program that generates the interfaces automatically, which often produces awkward results.

We are creating a system called Damask [1] to better support multi-device UI design. With Damask, the designer will design a UI for one device by sketching the design and by specifying which design patterns the interface uses. The patterns will help Damask generate user interfaces optimized for the other target devices. The generated interfaces will be of high enough quality that using Damask will be more convenient than designing each interface separately, and this ease of creation will encourage designers to engage in iterative design.
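
The Python sketch below illustrates the retargeting idea: a pattern carries one realization per device class, so a design annotated with patterns can be regenerated for another device. The pattern entries and realizations are invented for illustration; the abstract does not specify how Damask represents patterns.

    PATTERNS = {
        # pattern name -> realization per device class (entries are examples only)
        "clear entry points": {
            "pc":    "home page with a labeled link per major task",
            "phone": "numbered text menu, one digit per task",
        },
        "search action module": {
            "pc":    "search box with button in the page header",
            "phone": "dedicated 'Search' soft key opening an input screen",
        },
    }


    def retarget(design_patterns: list, device: str) -> list:
        """Generate a UI outline for another device from the patterns
        the designer tagged in the original sketch."""
        return [PATTERNS[name][device] for name in design_patterns]


    pc_design = ["clear entry points", "search action module"]  # tags on a PC sketch
    for element in retarget(pc_design, "phone"):
        print("-", element)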


Figure 1: Damask's proposed user interface

[1]
J. Lin and J. A. Landay, "Damask: A Tool for Early-Stage Design and Prototyping of Multi-Device User Interfaces," Int. Conf. Distributed Multimedia Systems Workshop on Visual Computing, San Francisco, CA, September 2002.

More information (http://guir.berkeley.edu/projects/damask/) or send mail to the author (jimlin@eecs.berkeley.edu)

DENIM: Finding a Tighter Fit between Tools and Practice for Web Site Design

James Lin, Mark Newman, Yang Li1, and Marc Ringuette2
(Professor James A. Landay)

We conducted an ethnographic study [1] in which we observed and interviewed several professional web designers. This study showed that the process of designing a web site involves an iterative progression from less detailed to more detailed representations of the site. For example, designers often create site maps early in the process, which are high-level representations of a site in which each page or set of pages is depicted as a label. They then proceed to create storyboards of interaction sequences, which employ minimal page-level detail and focus instead on the navigational elements required to get from one page to another. Later still, designers create schematics and mock-ups, which are different representations of individual pages.

These were the primary observations that led to the design and implementation of DENIM [2], a system to assist web designers in the early stages of information, navigation, and interaction design. DENIM is an informal pen-based system that allows designers to quickly sketch web pages, create links among them, and interact with them in a run mode. The different ways of viewing a web site, from site map to storyboard to individual pages, are integrated through the use of zooming.
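
The zooming integration can be pictured with the short Python sketch below: one site model, rendered at different levels of detail depending on zoom. The thresholds and rendering are ours, not DENIM's implementation.

    from dataclasses import dataclass, field


    @dataclass
    class Page:
        title: str
        links: list = field(default_factory=list)  # titles of linked pages
        sketch: str = ""                           # stand-in for ink strokes


    def render(page: Page, zoom: float) -> str:
        """Far out: a label (site map). Mid-range: label plus outgoing
        links (storyboard). Close in: the full page sketch."""
        if zoom < 0.3:
            return page.title
        if zoom < 0.7:
            return f"{page.title} -> {', '.join(page.links)}"
        return f"{page.title}\n{page.sketch}"


    home = Page("Home", links=["Products", "About"], sketch="[header][nav][news list]")
    for zoom in (0.1, 0.5, 1.0):
        print(f"zoom={zoom}:", render(home, zoom))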

More information is available through the Group for User Interface Research web site at http://guir.berkeley.edu.


Figure 1: The DENIM system

[1]
M. W. Newman and J. A. Landay, "Sitemaps, Storyboards, and Specifications: A Sketch of Web Site Design Practice," Designing Interactive Systems, New York, NY, August 2000.
[2]
J. Lin, M. W. Newman, J. I. Hong, and J. A. Landay, "DENIM: Finding a Tighter Fit between Tools and Practice for Web Site Design," CHI Letters: Human Factors in Computing Systems, The Hague, The Netherlands, April 2000.
1Postdoctoral Researcher
2Staff

More information (http://guir.berkeley.edu/projects/denim/) or send mail to the author (jimlin@eecs.berkeley.edu)

Privacy-Sensitive Infrastructure Support for Context-Awareness

Jason I. Hong, Chris Beckmann, Jeff Heer, Xiaodong Jiang, and Alan Newberger
(Professor James A. Landay)
(NSF) IIS-0205644

Context-aware applications are computer systems that make use of implicitly gathered information, such as a person's identity, location, and activity. This is in contrast to traditional computer systems that require explicit user interaction for all input.

This work is addressing two different but related problems. The first is organizing and managing the sensors, data, and services in a meaningful way. The second is doing all of this in a privacy-sensitive manner that provides end-users with greater control and feedback over what information is being collected about them and how that information is being used.

The main abstraction we are developing is the InfoSpace. InfoSpaces are repositories of context information designed to be analogous to web sites: in the same way that many people create and manage personal web sites, they would create and manage personal InfoSpaces. While a person would have only one logical InfoSpace, it may physically reside on multiple devices, providing high availability even when the person is mobile.

However, systems that collect highly personal information like this are often strongly criticized because of potential privacy threats. To address these legitimate concerns, we are integrating several privacy mechanisms, including basic access control to limit queries, the option to return intentionally ambiguous results, privacy tags for specifying privacy preferences on data that flows from one InfoSpace to another, and user interfaces that help end-users understand who has been accessing their data.
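
The Python sketch below illustrates how these mechanisms might fit together, with invented names and policies; the abstract does not define InfoSpace's actual interfaces.

    import time
    from dataclasses import dataclass
    from typing import Optional


    @dataclass
    class PrivacyTag:
        """Preference attached to data as it flows between InfoSpaces."""
        max_precision: str      # coarsest detail the recipient may pass on
        expires_at: float       # time after which the data should be dropped


    LOCATION = {"exact": "Soda Hall, room 523, desk 2",
                "room": "Soda Hall, room 523",
                "building": "Soda Hall"}

    ACCESS = {"alice": "exact", "bob": "room"}   # per-requester access control
    ACCESS_LOG = []                              # feedback: who asked, what they got


    def query_location(requester: str) -> Optional[tuple]:
        """Return an (intentionally ambiguous) result plus a privacy tag,
        or None if the requester has no access at all."""
        precision = ACCESS.get(requester)
        if precision is None:
            ACCESS_LOG.append((requester, "denied"))
            return None
        ACCESS_LOG.append((requester, precision))
        return LOCATION[precision], PrivacyTag(precision, time.time() + 3600)


    print(query_location("bob"))    # room-level answer only: ambiguity by policy
    print(query_location("eve"))    # None: no access
    print(ACCESS_LOG)               # raw material for the feedback interfaces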


More information (http://guir.berkeley.edu/projects/cfabric/) or send mail to the author (jasonh@eecs.berkeley.edu)

The Designers’ Outpost: A Tangible Interface for Collaborative Web Site Design

Katherine Everitt, Scott Klemmer, and Robert Lee1
(Professor James A. Landay)
(NSF) IIS-0084367

In our previous studies of web design [1], we found that pens, paper, walls, and tables were often used for explaining, developing, and communicating ideas during the early phases of design. These wall-scale, paper-based design practices inspired The Designers’ Outpost [2], a tangible user interface that combines the affordances of paper and large physical workspaces with the advantages of electronic media to support information design. With Outpost, users collaboratively author web site information architectures on an electronic whiteboard using physical media (Post-it notes and images), structuring and annotating that information with electronic pens. This interaction is enabled by a touch-sensitive SMART Board augmented with a rear-mounted video camera for capturing movement and a front-mounted high-resolution camera for capturing ink.

The electronic representation gives us three main advantages: fluid transitions to other tools, such as DENIM [3]; support for design history [4]; and remote collaboration [5].

We have recently developed a remote collaboration system [5] based on The Designers’ Outpost. The system provides a distributed shared workspace that employs physical Post-it notes as interaction primitives. We implemented and evaluated two awareness mechanisms: transient ink input for gestures, and a blue shadow of the remote collaborator for presence.
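
Our illustrative Python sketch below (not Outpost's code) shows the shared-workspace idea: each site tracks its own physical notes, and a note placed locally is mirrored as an electronic note on the remote board.

    from dataclasses import dataclass


    @dataclass
    class Note:
        note_id: str
        x: float
        y: float
        physical: bool   # True on the board where the paper note actually is


    class Board:
        def __init__(self, name: str):
            self.name = name
            self.notes = {}
            self.peer = None   # the remote Board, once connected

        def place_note(self, note_id: str, x: float, y: float) -> None:
            """The camera detects a physical note; mirror it electronically."""
            self.notes[note_id] = Note(note_id, x, y, physical=True)
            if self.peer is not None:
                self.peer.notes[note_id] = Note(note_id, x, y, physical=False)


    berkeley = Board("berkeley")
    remote = Board("remote")
    berkeley.peer, remote.peer = remote, berkeley

    berkeley.place_note("n1", 0.2, 0.6)
    print(remote.notes["n1"].physical)   # False: electronic mirror of a paper note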


Figure 1: Users collaborate remotely using physical artifacts. Notes that are physical on this board correspond to the electronic notes in Figure 2.

Figure 2: Notes on this board are electronic versions of the physical notes in Figure 1.

[1]
M. W. Newman and J. A. Landay, "Sitemaps, Storyboards, and Specifications: A Sketch of Web Site Design Practice," Proc. Designing Interactive Systems, New York, NY, August 2000.
[2]
S. R. Klemmer, M. W. Newman, R. Farrell, M. Bilezikjian, and J. A. Landay, "The Designers’ Outpost: A Tangible Interface for Collaborative Web Site Design," ACM Symp. User Interface Software and Technology, CHI Letters, Vol. 3, No. 2, 2001.
[3]
J. Lin, M. W. Newman, J. I. Hong, and J. A. Landay, "DENIM: Finding a Tighter Fit between Tools and Practice for Web Site Design," CHI Human Factors in Computing Systems, CHI Letters, Vol. 2, No. 1, 2000.
[4]
S. R. Klemmer, M. Thomsen, E. Phelps-Goodman, and J. A. Landay, "Where Do Web Sites Come From? Capturing and Interacting with Design History," UC Berkeley Computer Science Division, Report No. UCB/CSD 01/1157, October 2001.
[5]
K. M. Everitt, S. R. Klemmer, R. Lee, and J. A. Landay, "Two Worlds Apart: Bridging the Gap Between Physical and Virtual Media for Distributed Design Collaboration," CHI, 2003 (submitted). Also, UC Berkeley Computer Science Division Report No. UCB/CSD 02/1201, 2002.
1Undergraduate (EECS)

More information (http://guir.berkeley.edu/outpost/) or send mail to the author (everitt@eecs.berkeley.edu)

Books with Voices: Paper Transcripts as a Tangible Interface to Oral Histories

Scott R. Klemmer, Jamey Graham1, and Gregory J. Wolff2
(Professor James A. Landay)

Our contextual inquiry into the practices of oral historians unearthed a curious incongruity: while oral historians consider interview recordings to be a central historical artifact, these recordings sit unused after a written transcript is produced. We hypothesized that this is largely because books are more usable than recordings, so we created Books with Voices [1]: barcode-augmented paper transcripts that enable fast, random access to digital video interviews on a PDA. We present quantitative results from an evaluation of this tangible interface with 13 participants, who found this lightweight, structured access to the original recordings useful, offering substantial benefits with minimal overhead. Oral historians found a level of emotion in the video that is not available in the printed transcript. The video also helped readers clarify the text and observe nonverbal cues.
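
At its core, the system resolves a scanned barcode to a timecode in the interview video. The Python sketch below illustrates that lookup; the barcode payloads, file names, and offsets are invented for illustration.

    from dataclasses import dataclass


    @dataclass
    class VideoSegment:
        video_file: str
        start_seconds: float


    # Each barcode printed beside a transcript passage maps to a point in the
    # digital video of the interview (all entries below are made up).
    BARCODE_INDEX = {
        "BWV-017-0042": VideoSegment("sequin_interview_1.mpg", 1264.0),
        "BWV-017-0043": VideoSegment("sequin_interview_1.mpg", 1331.5),
    }


    def on_barcode_scanned(payload: str) -> str:
        """Resolve a scanned transcript barcode to a playback command."""
        segment = BARCODE_INDEX.get(payload)
        if segment is None:
            return "unknown barcode"
        return f"play {segment.video_file} at {segment.start_seconds:.0f}s"


    print(on_barcode_scanned("BWV-017-0042"))
    # play sequin_interview_1.mpg at 1264s

Scanning thus gives random access into hours of video at the granularity of a transcript passage, which is what makes the paper book an effective index.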


Figure 1: Accessing digital video by scanning transcripts

Figure 2: PDA video display of oral histories

Figure 3: Augmented paper transcripts produced by Books with Voices; from an oral history with Professor Carlo Séquin

[1]
S. R. Klemmer, J. Graham, G. J. Wolff, and J. A. Landay, "Books with Voices: Paper Transcripts as a Tangible Interface to Oral Histories," UC Berkeley Computer Science Division, Report No. UCB/CSD 02/1199, September 2002.
[2]
J. M. Graham and J. J. Hull, "Video Paper: A Paper-based Interface for Skimming and Watching Video," Int. Conf. Consumer Electronics, Los Angeles, CA, July 2002.
1Ricoh Innovations, Inc.
2Ricoh Innovations, Inc.

More information (http://guir.berkeley.edu/oral-history) or send mail to the author (srk@cs.berkeley.edu)