Emerging frameworks for tangible user interfaces


by B. Ullmer and H. Ishii

We present steps toward a conceptual framework for tangible user interfaces. We introduce the MCRpd interaction model for tangible interfaces, which relates the roles of physical and digital representations, physical control, and underlying digital models. This model serves as a foundation for identifying and discussing several key characteristics of tangible user interfaces. We identify a number of systems exhibiting these characteristics, and situate these within 12 application domains. Finally, we discuss tangible interfaces in the context of related research themes, both within and outside of the human-computer interaction domain.

© Copyright 2000 by International Business Machines Corporation. Copying in printed form for private use is permitted without payment of royalty provided that (1) each reproduction is done without alteration and (2) the Journal reference and IBM copyright notice are included on the first page. The title and abstract, but no other portions, of this paper may be copied or distributed royalty free without further permission by computer-based and other information-service systems. Permission to republish any other portion of this paper must be obtained from the Editor.

The last decade has seen a large and growing body of research in computational systems embracing physical-world modalities of interaction. This work has led to the identification of several major research themes, including ubiquitous computing, augmented reality, mixed reality, and wearable computing.

At the same time, a number of research systems relating to the use of physical artifacts as representations and controls for digital information have not been well characterized in terms of these earlier frameworks. Fitzmaurice, Ishii, and Buxton took a major step in this direction with their description of "graspable user interfaces" [1, 2].

Building upon this foundation, we extended these ideas and introduced the term "tangible user interfaces" in our discussion of "Tangible Bits" [3]. Among other historical inspirations, we suggested the abacus as a compelling prototypical example. In particular, a key point to note is that the abacus is not an input device. The abacus makes no distinction between "input" and "output." Instead, the beads, rods, and frame of the abacus serve as manipulable physical representations of abstract numerical values and operations. Simultaneously, these component artifacts also serve as physical controls for directly manipulating their underlying associations.

This seamless integration of representation and control differs markedly from the mainstream graphical user interface (GUI) approaches of modern human-computer interaction (HCI). Graphical interfaces make a fundamental distinction between "input devices," such as the keyboard and mouse, as controls, and graphical "output devices," such as monitors and head-mounted displays, as portals for representations facilitating human interaction with computational systems. Tangible interfaces, in the tradition of the abacus, explore the conceptual space opened by the elimination of this distinction.

In this paper, we take steps toward a conceptual framework for tangible user interfaces. In the process, we hope to characterize not only systems explicitly conceived as "tangible interfaces," but more broadly, numerous past and contemporary systems that may be productively considered in terms of tangible interface characteristics.
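To make the abacus point concrete, here is a minimal sketch (ours, in Python; the class and method names are illustrative, not terminology from the paper) of a device in which the same physical state serves both as the representation of a value and as the control for changing it, with no separate input or output channel.

```python
# Illustrative sketch only: an "abacus rod" whose bead positions ARE the digit.
# There is no separate display to refresh and no separate input device to sample.

class AbacusRod:
    """One rod of a simple abacus: the positions of its beads are the stored digit."""

    def __init__(self, beads: int = 9):
        self.beads = beads      # beads available on this rod
        self.raised = 0         # beads currently pushed up against the frame

    def push_up(self, n: int = 1) -> None:
        """Physical control: moving beads directly changes the stored digit."""
        self.raised = min(self.beads, self.raised + n)

    def push_down(self, n: int = 1) -> None:
        self.raised = max(0, self.raised - n)

    def digit(self) -> int:
        """Physical representation: 'reading' the rod is just looking at the beads."""
        return self.raised


# Reading and writing go through the same physical state.
units, tens = AbacusRod(), AbacusRod()
tens.push_up(2)                              # manipulate the artifact...
units.push_up(7)
print(tens.digit() * 10 + units.digit())     # ...and the artifact is the answer: 27
```

The same coupling of representation and control, generalized beyond arithmetic, is what the MCRpd model introduced in the abstract is meant to capture.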
A first example

To better ground our discussions, we begin by introducing an example interface called "Urp," depicted in Figure 1. Urp is a tangible interface for urban planning, based on a workbench for simulating the interactions among buildings in an urban environment [4, 5]. The interface combines a series of physical building models and interactive tools with an integrated projector/camera/computer node called the "I/O bulb."

Figure 1: "Urp" urban planning simulation, with buildings, wind tool, and wind probe (photo courtesy of John Underkoffler)

Under the mediating illumination of the I/O bulb, the building models of Urp cast graphical shadows onto the workbench surface, corresponding to solar shadows at a particular time of day. The position of the sun can be controlled by turning the physical hands of a clock tool. As the corresponding shadows are transformed, the building models can be moved and rotated to minimize intershadowing problems (shadows cast on adjacent buildings).

A physical "material wand" can be used to bind alternate material properties to individual buildings. For instance, when bound with a "glass" material property, buildings cast not only solar shadows, but also solar reflections. These reflections exhibit more complex (and less intuitive) behavior than shadows. Moreover, these reflections pose special problems for urban drivers (roadways are also physically instantiated and simulated by Urp).

Finally, a computational fluid-flow simulation is bound to a physical "wind" tool. By adding this object to the workbench, a wind-flow simulation is activated, with field lines graphically flowing around the buildings (which remain interactively manipulable). Changing the physical orientation of the wind tool correspondingly alters the orientation of the computationally simulated wind.

Tangible user interfaces

As illustrated by the previous example, tangible interfaces give physical form to digital information, employing physical artifacts both as representations and controls for computational media. Tangible user interfaces (TUIs) couple physical representations (e.g., spatially manipulable physical objects) with digital representations (e.g., graphics and audio), yielding user interfaces that are computationally mediated but generally not identifiable as "computers" per se.

Clearly, traditional user interface elements such as keyboards, mice, and screens are also "physical" in form. Here, the role of physical representation provides an important distinction. For example, in the Urp tangible interface, physical models of buildings are used as physical representations of actual buildings. The physical forms of the Urp models (representing specific buildings), as well as their position and orientation on the system's workbench, serve central roles in representing and controlling the state of the underlying digital simulation.
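The clock-tool behavior described in "A first example" lends itself to a small worked example. The sketch below is ours: every name and the simple angle-to-sun mapping are assumptions for illustration, not details of the actual Urp implementation. It computes where a building's shadow would fall on the workbench for a given setting of the clock tool's hands: turning the hands changes the simulated sun position, and a lower sun stretches the shadow.

```python
import math
from dataclasses import dataclass

# Illustrative geometry only: how a clock-tool angle might map to a sun position,
# and how that sun position determines the extent of a building's shadow.

@dataclass
class Building:
    x: float          # position on the workbench
    y: float
    height: float     # building height in the simulated city


def sun_from_clock(hand_angle_deg: float) -> tuple[float, float]:
    """Toy mapping from the clock tool's hand angle to sun azimuth and elevation (degrees)."""
    azimuth = hand_angle_deg % 360.0
    # Sun rises and sets over one turn of the hands; keep elevation strictly positive.
    elevation = max(5.0, 75.0 * math.sin(math.radians(azimuth / 2.0)))
    return azimuth, elevation


def shadow_tip(b: Building, hand_angle_deg: float) -> tuple[float, float]:
    """Point on the workbench where the building's shadow ends for this clock setting."""
    azimuth, elevation = sun_from_clock(hand_angle_deg)
    length = b.height / math.tan(math.radians(elevation))   # lower sun -> longer shadow
    dx = -length * math.sin(math.radians(azimuth))           # shadow points away from the sun
    dy = -length * math.cos(math.radians(azimuth))
    return b.x + dx, b.y + dy


if __name__ == "__main__":
    tower = Building(x=0.0, y=0.0, height=30.0)
    for angle in (60.0, 120.0, 180.0):        # turning the clock tool's hands
        print(angle, shadow_tip(tower, angle))
```

In the actual system, of course, the building and tool poses are sensed by the camera of the I/O bulb rather than supplied in code, and the resulting shadow graphics are projected back onto the bench surface around the physical models.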

