BYU CS 656 - TouchLight: An Imaging Touch Screen and Display for Gesture-Based Interaction


TouchLight: An Imaging Touch Screen and Display for Gesture-Based Interaction
Andrew D. Wilson
Microsoft Research
One Microsoft Way, Redmond, WA
[email protected]

ABSTRACT
A novel touch screen technology is presented. TouchLight uses simple image processing techniques to combine the output of two video cameras placed behind a semi-transparent plane in front of the user. The resulting image shows objects that are on the plane. This technique is well suited for use with a commercially available projection screen material (DNP HoloScreen), which permits projection onto a transparent sheet of acrylic plastic in normal indoor lighting conditions. The resulting touch screen display system transforms an otherwise ordinary sheet of acrylic plastic into a high-bandwidth input/output surface suitable for gesture-based interaction. Image processing techniques are detailed, and several novel capabilities of the system are outlined.

Categories and Subject Descriptors
H.5.2 [Information Interfaces and Presentation]: User Interfaces—Input devices and strategies; I.4.9 [Image Processing and Computer Vision]: Applications

General Terms
Algorithms, Design, Human Factors

Keywords
Computer vision, gesture recognition, computer human interaction, displays, videoconferencing

1. INTRODUCTION
Common touch screen technologies are limited in capability. For example, most cannot track more than a small number of objects on the screen at a time, and they typically report only the 2D position of each object, with no shape information. This is partly due to superficial limitations of the particular hardware implementation, which in turn are driven by the emphasis on emulating pointer input for common GUI interactions. Typically, today's applications are only able to handle one 2D pointer input.
A number of systems have recently introduced the concept of imaging touch screens: instead of producing a small list of discrete points, these systems compute a full touch image, in which each 'pixel' of the output indicates the presence of an object on the touch screen's surface. The utility of such a touch image has been demonstrated in gesture-based interactions on wall and table form factors. For example, the DiamondTouch [3] system uses horizontal and vertical rows of electrodes to sense the capacitively coupled touch of the users' hands at electrode intersections. MetaDesk [13], HoloWall [9] and Designer's Outpost [8] each use video cameras and computer vision techniques to compute a touch image. These systems permit simultaneous video projection and surface sensing by using a diffusing screen material which, from the camera's view, only resolves objects that are on or very near the surface.

The touch image produced by these camera-based systems reveals the appearance of the object as viewed from behind the surface. Application events may be triggered as the result of image processing techniques applied to the touch image. For example, the appearance or shape of an object may uniquely identify it to the system and trigger certain application events.

In this paper we introduce the TouchLight system, which uses simple computer vision techniques to compute a touch image on a plane situated between a pair of cameras and the user (see Figures 1 and 2). We demonstrate these techniques in combination with a projection display material which permits the projection of an image onto a transparent sheet of acrylic plastic and the simultaneous operation of the computer vision processes. TouchLight goes beyond the previous camera-based systems: by not using a diffusing projection surface, it permits a high-resolution touch image.
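The core fusion idea can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes the two camera views have already been rectified (warped) to the display plane, so that an object touching the plane lands on the same pixel in both views, while an object away from the plane is displaced by parallax and misaligns between the views.

```python
import numpy as np

def touch_image(view_a, view_b, threshold=0.5):
    """Fuse two plane-rectified camera views into a binary touch image.

    Objects on the display plane project to the same pixel in both
    rectified views, so a per-pixel minimum keeps them, while objects
    away from the plane fall on different pixels and are suppressed.
    """
    fused = np.minimum(view_a, view_b)
    return (fused > threshold).astype(np.uint8)

# Toy example: one on-plane blob (same spot in both views) and one
# off-plane blob (shifted between views by parallax).
a = np.zeros((8, 8))
b = np.zeros((8, 8))
a[2, 2] = b[2, 2] = 1.0        # fingertip touching the surface
a[5, 1] = 1.0
b[5, 6] = 1.0                  # hand hovering away from the surface
touch = touch_image(a, b)       # only the (2, 2) touch survives
```

The per-pixel minimum is just one plausible fusion rule; a product or AND of thresholded views behaves similarly for this purpose.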
For example, a high-resolution image of a paper document may be captured using a high-resolution still camera, or one of the newer high-resolution CMOS video cameras. The absence of a diffuser also permits the cameras to see beyond the display surface, just as they would if placed behind a sheet of glass. This allows a variety of interesting capabilities, such as using face recognition techniques to identify the current user, eye-to-eye video conferencing, and other processes which are typically the domain of vision-based perceptual user interfaces. We describe the overall configuration of TouchLight, and detail the image processing techniques used to compute TouchLight's touch image. Finally, we discuss how TouchLight enables novel gesture-based interaction.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. ICMI'04, October 13–15, 2004, State College, Pennsylvania, USA. Copyright 2004 ACM 1-58113-954-3/04/0010…$5.00.

2. TOUCHLIGHT CONFIGURATION
The physical configuration of TouchLight is illustrated in Figure 1 and Figure 2. A pair of commonly available Firewire web cameras is mounted behind the display surface such that each camera can see all four corners of the display. The importance of the distance between the cameras is discussed later. The DNP HoloScreen material is applied to the rear surface of the acrylic display surface. The HoloScreen is a special refractive holographic film which scatters light from a rear projector when the incident light arrives at a particular angle.
The material is transparent to all other light, and so is suitable for applications where traditional projection display surfaces would be overwhelmed by ambient light. Typical applications include retail storefronts, where ambient light streaming through windows precludes traditional rear-projection screens. Additionally, the screen is transparent in the near-infrared range.

Per the manufacturer's instructions, the projector is mounted such that the projected light strikes the display at an angle of about 35 degrees. In a typical vertical, eye-level installation, this configuration keeps the user from looking directly into the "hot spot" of the projector. We note that many projectors are unable to correct for the keystone distortion that results when the projector is mounted at this extreme angle. In our implementation, we use the NVKeystone digital keystone
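Digital keystone correction of the kind described above amounts to pre-warping the framebuffer with a homography (projective transform). The sketch below is a hedged illustration, not the NVKeystone implementation; the corner coordinates are hypothetical, chosen only to show how the trapezoid produced by an angled projector can be mapped back to the desired rectangle.

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 matrix H with H @ [x, y, 1] ~ [u, v, 1],
    given four point correspondences (standard DLT formulation)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null space of this 8x9 system, found via SVD.
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def apply_h(H, pt):
    """Map a 2D point through H, dividing out the projective scale."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical measured footprint: off-axis projection lands a square
# image as a trapezoid; we want it to fill the rectangle instead.
trapezoid = [(0, 0), (100, 0), (90, 80), (10, 80)]
rectangle = [(0, 0), (100, 0), (100, 80), (0, 80)]
H = homography(trapezoid, rectangle)
```

In practice the inverse of `H` would be used to resample the source image into the pre-distorted frame sent to the projector, so that the optics undo the warp.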

