
Note: Permission has been granted by the student to post this work on the website. No part of this work may be copied or used. This applies to this and all sample assignments posted on this site. –Dr. Dringus

Usability Evaluation: American Time Use Survey Data Collection Instrument

Diane E. Herz
MMIS 680: Human Computer Interaction
February 10, 2002

Abstract

In this paper, the author evaluates a prototype of the diary portion of the American Time Use Survey's proposed data collection instrument. Three factors are evaluated: consistency, compatibility of data entry fields with the data entry required, and efficiency. The text provides summary evaluations; details are included in the appendices. Recommendations for usability improvements are included.

Table of Contents

Introduction
Evaluation Methods and Factors
Evaluation of ATUS Diary Section
Recommendations
References

Tables and Figures

Table 1. Selected Evaluation Features
Table 2. Property Checklist Summary Ratings for Consistency
Table 3. Scenario Tests and Summary Ratings for Compatibility
Figure 1. Diary Screen Shot—WHO field
Figure 2. Error Message

Introduction

In December 2000, Congress approved funding for the Bureau of Labor Statistics (BLS) to conduct a new survey to measure how people spend their time. The American Time Use Survey, or ATUS, will begin in January 2003, will run continuously, and will be conducted by telephone. Census Bureau interviewers will simultaneously interview respondents and enter data about the respondent's activities during the prior day into a Windows-based graphical user interface, the ATUS data collection instrument.[1]

This usability evaluation focuses on the portion of the ATUS data collection instrument that is designed to collect information about respondents' activities during the prior 24 hours (the "diary" portion). This evaluation can be considered a prototype evaluation, as the software is still being modified prior to its introduction in 2003 (Jordan, 1998).

The data collection instrument (and the related activity coding application) is built in Blaise, a Windows-based graphical user interface (GUI) developed by Statistics Netherlands and distributed in the U.S. by Westat, Inc. The Census Bureau has recently chosen Blaise as the standard for software development in the agency. The organization is beginning to work with survey sponsors to build new instruments and to convert current DOS-based instruments, used in both Computer-Assisted Telephone Interviews (CATI) and Computer-Assisted Personal Interviews (CAPI), to Blaise. The ATUS is the second survey that will use Blaise, and the first CATI survey to do so. Hence, the Census Bureau is still early in the process of developing and refining Blaise user interface guidelines. Because both Blaise and the ATUS are new to the CATI interviewers, all system users will be novice users when the survey goes to full production in 2003.

[1] After the interview is completed, data will be exported from that instrument into the ATUS activity coding application. Using the coding application, coders will assign an activity code to each reported activity.

Evaluation Methods and Factors

The evaluation presented here is non-empirical, involving only the author as participant (Shneiderman, 1998; Jordan, 1998).
The evaluation focuses on the core part of the ATUS data collection instrument: the "diary" portion. It was conducted in a cognitive walkthrough style. The author knows many of the Census Bureau interviewers, the large majority of whom are high school graduates without any college. Most are familiar with the World Wide Web but have worked only with DOS-based survey collection instruments. With their education and experience in mind, the author reviewed software properties against property checklists (based on Census guidelines) and conducted scenario tests and task analyses (Jordan, 1998).

Three features were chosen for evaluation: consistency, compatibility with data collection, and efficiency (see Table 1). Flexibility was also considered for evaluation but was left out due to the length requirements of the study. Learnability and ease of use were deemed out of scope for a non-empirical research project, as they are better examined with empirical methods, such as focus groups or user observations (Jordan, 1998).

All ATUS data collection instrument users will begin as novices. They will not have used the application, and none will have used instruments with GUIs to collect survey data; that is, all other surveys these interviewers conduct have DOS-based interfaces. In addition, none will have conducted the ATUS interview before. Eventually, many users will become experts, as the application will be in continuous use during the year-round ATUS. Still, interviewer turnover is high as a rule; thus designing for novice users is important.

According to Shneiderman, applications for novice users should be designed so that the interface is simple and logically organized. Simplicity requires consistency. Consistency is also the first of Shneiderman's "Golden Rules" of interface design, and it is mentioned by many other authors (Jordan, 1998; Jeffries et al., 1991; Holzschlag, 2000; Grudin, 1989). For these reasons, consistency was chosen for this evaluation. The other features were chosen because of their importance in data collection instruments such as the ATUS and because they can be evaluated using non-empirical methods. Shneiderman identifies the most important features for evaluation of data collection instruments as consistency of data entry transactions, minimal input actions by users, minimal memory load on users, compatibility of data entry with data display, and flexibility for user control of data entry. Compatibility with data collection and efficiency were chosen for evaluation. Flexibility was considered, and is defined in Table 1, but was not evaluated.
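To make the checklist-based method described above more concrete, the short sketch below shows one way that summary ratings of the kind reported in Tables 2 and 3 could be tallied from individual yes/partial/no judgments. It is an illustration only, not part of the ATUS evaluation itself: the scoring scale, the sample checklist items, and the summarize_checklist function are assumptions introduced here for the example.

    # Minimal sketch (hypothetical): condensing property-checklist judgments
    # into a single summary rating. The items, scores, and output format are
    # illustrative assumptions, not values taken from the ATUS evaluation.

    SCORES = {"yes": 1.0, "partial": 0.5, "no": 0.0}

    def summarize_checklist(judgments):
        """Return the share of checklist items satisfied (0.0 to 1.0)."""
        if not judgments:
            raise ValueError("checklist is empty")
        total = sum(SCORES[answer] for answer in judgments.values())
        return total / len(judgments)

    # Example: a hypothetical consistency checklist for one diary screen.
    consistency_checklist = {
        "Field labels use consistent wording across screens": "yes",
        "Function keys behave the same on every screen": "partial",
        "Error messages follow one format": "no",
    }

    rating = summarize_checklist(consistency_checklist)
    print(f"Consistency summary rating: {rating:.2f}")  # prints 0.50 here

A simple proportional score is used only because it is easy to interpret; in the evaluation itself, such judgments are recorded and summarized qualitatively rather than numerically.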

