Examining User Interactions with Video Retrieval Systems*

Michael G. Christel, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, USA 15213; [email protected], phone 1 724 935-4076

ABSTRACT

The Informedia group at Carnegie Mellon University has since 1994 been developing and evaluating surrogates, summary interfaces, and visualizations for accessing digital video collections containing thousands of documents, millions of shots, and terabytes of data. This paper reports on TRECVID 2005 and 2006 interactive search tasks conducted with the Informedia system by users who had no knowledge of Informedia or other video retrieval interfaces but were experts in analyst activities. Think-aloud protocols, questionnaires, and interviews were also conducted with this user group to assess the contributions of various video summarization and browsing techniques with respect to broadcast news test corpora. Lessons learned from these user interactions are reported, with recommendations both on interface improvements for video retrieval systems and on enhancing the ecological validity of video retrieval interface evaluations.

Keywords: user studies, TRECVID, information visualization, digital video library, video surrogate, user interface evaluation, video retrieval, Informedia, human-computer interaction

1. INTRODUCTION

A number of Informedia user studies have taken place through the years, most often with Carnegie Mellon students and staff as the participants. These studies were surveyed in a 2006 paper reporting on how they can provide a user pull complementing the technology push as automated video processing advances1. The merits of discount usability techniques for iterative improvement and evaluation were presented in that same survey paper, as well as the structure of formal empirical investigations with end users that have ecological validity while addressing the human-computer interaction metrics of efficiency, effectiveness, and satisfaction.
Conclusions were reported with respect to video summarization and browsing, ranging from the simplest portrayal of a single thumbnail representing a video story, to collections of thumbnails in storyboards, to playable video skims, to video collages with multiple synchronized information perspectives. This paper complements that 2006 survey by presenting a series of user studies conducted with representatives of a user community outside the college/university population: professional situation analysts whose jobs focus on the management, analysis, processing, and dissemination of strategic and tactical intelligence from varied, typically voluminous data sources.

The merits of discount usability techniques for iterative improvement and evaluation are discussed, as well as the structure of formal empirical investigations that address the human-computer interaction metrics of efficiency (can I finish the task in reasonable time?), effectiveness (can I produce a quality solution?), and satisfaction (would I be willing or eager to repeat the experience?). The three metrics may be correlated: an interface that is very satisfying may motivate its user to greater performance and hence higher effectiveness, while conversely an unsatisfying interface may produce extremely slow activity leading to poor efficiency. These three usability aspects are discussed elsewhere in greater detail as they relate to HCI research in general, with the conclusion that all three are necessary for an accurate assessment of an interface's usability2. Before surveying the Informedia user studies conducted with the analysts, a discussion of ecological validity is warranted, because it affects the impact of the user study results.
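The three metrics lend themselves to straightforward per-session aggregation. The following minimal sketch is illustrative only and is not part of the Informedia system; the record fields, scales (seconds, a 0.0-1.0 quality grade, a 1-7 Likert rating), and values are all hypothetical.

```python
# Hypothetical per-task records from one participant's study session.
# Field names and scales are illustrative, not from the Informedia system.
from dataclasses import dataclass
from statistics import mean


@dataclass
class TaskRecord:
    seconds_to_complete: float  # efficiency: lower is better
    solution_quality: float     # effectiveness: graded 0.0-1.0
    satisfaction: int           # satisfaction: 1-7 Likert rating


def summarize(records):
    """Aggregate the three usability metrics across a session's tasks."""
    return {
        "mean_time_s": mean(r.seconds_to_complete for r in records),
        "mean_quality": mean(r.solution_quality for r in records),
        "mean_satisfaction": mean(r.satisfaction for r in records),
    }


session = [
    TaskRecord(412.0, 0.80, 6),
    TaskRecord(655.0, 0.55, 4),
    TaskRecord(508.0, 0.70, 5),
]
print(summarize(session))
```

Keeping the three measures separate, rather than collapsing them into a single score, matches the point above that all three are needed for an accurate usability assessment.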
Foraker Design defines ecological validity as follows3:

Ecological validity – the extent to which the context of a user study matches the context of actual use of a system, such that it is reasonable to suppose that the results of the study are representative of actual usage and that the differences in context are unlikely to impact the conclusions drawn. All factors of how the study is constructed must be considered: how representative are the tasks, the users, the context, and the computer systems?

* Copyright 2007 Society of Photo-Optical Instrumentation Engineers. This paper is published in Proceedings of SPIE Volume 6506, Multimedia Content Access: Algorithms and Systems 2007, A. Hanjalic, R. Schettini and N. Sebe, eds., and is made available as an electronic reprint with permission of SPIE. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited.

Ecological validity is often difficult for multimedia information retrieval researchers to achieve, for a number of reasons. The data in hand may not be representative, e.g., the Corel professional image database will not represent amateur collections like the average individual's digital photograph collection. The tasks employed may be artificial, e.g., finding a factual date from a news video corpus may be a task that in practice is always achieved through a newspaper text archive rather than a broadcast news archive. The users may not represent actual users, with university research often substituting college students as the user study subjects because of their availability.
Finally, the context is likely to differ between the user study and an actual work environment, with the latter imposing time and accuracy pressures that are difficult to simulate in a short-term study. A discussion of ecological validity is threaded throughout this paper. In fact, the concern that the "users may not represent actual users" led to the interest in employing actual situation analysts in the interface assessments reported here, rather than conducting trials with college students. The conclusions regarding TRECVID interactive search tasks are interesting in that members of the TRECVID video retrieval research community, who often pose as users for these tasks against their own developed systems, respond quite differently than do the "real users" represented here by the situation analysts.

2. NIST TRECVID VIDEO RETRIEVAL EVALUATION

The NIST Text REtrieval Conference (TREC) was