CS 350: Computer/Human Interaction - Lecture 21 Overview
November 10, 2009

Lecture 21 Overview
● Usability evaluation
● Types of evaluation
● Methods of evaluation
● Usability specifications
● Reminder: Mid-project Progress Report due on Thursday with peer review forms. No class on Thursday.
● Assignment out: Homework 5, due Tuesday at the beginning of class

Usability Evaluation
● Any analysis or empirical study of the usability of a prototype or a system
● The goal is to provide feedback during software development, answering questions like:
– Is the system sufficiently useful?
– Is it too difficult to use or learn?
– Is it satisfying to use?
– Does it meet the stated goals?
● Understand problems and their causes; plan changes to correct them

Types of Evaluation
● Formative evaluation – done during system development; drives redesign. Often done by asking users to verbalize their thoughts while testing prototypes.
● Summative evaluation – done at the end of a project or at another checkpoint; answers "how well did we do?" Generally done by measuring performance times and error rates.

Evaluation Methods
● Analytical methods – study the design/system characteristics and compare them with theory, models, or expert guidelines
● Empirical methods – study how users actually use the system via observation, surveys, and controlled experiments
● Need both analytical and empirical evaluation:
– Is it a bad implementation of a good design, or a good implementation of a bad design?
● How to choose which methods to use?
● Which is more expensive?
● Which carries more weight with developers?

Analytical Methods
● SBD claims analysis showing positive and negative impacts
● Usability inspection
– Expert walk-through based on guidelines or a checklist (more than one expert if possible)
– Walk-through at different levels or in different categories
– List the problems found at each level/category, ordered by severity
● Cognitive walk-through for walk-up-and-use systems (e.g., ATMs)
– Look for affordances, metaphors
– Careful task selection; answer questions at each step: what comes next, what is assumed, are there competing goals?
● All of these methods use checklist forms and are very popular in industry – they generate lots of data at low cost

Heuristic Evaluation
● Nielsen (1994) gives 10 general guidelines for usability inspection:
– Use simple and natural dialog
– Speak the user's language
– Minimize memory load
– Be consistent
– Provide feedback
– Provide clearly marked exits
– Provide shortcuts
– Provide good error messages
– Prevent errors
– Include good help and documentation
● For any usability inspection:
– Want multiple experts
– Want the point of view (POV) of different classes of stakeholders

Model-Based Analysis
● The user is modeled as a breakdown of goal identification, steps to achieve the goal, implementation of those steps, and selection rules
● A predictive model is developed using scientific knowledge of human memory and behavior
– Use model elements for mental activities
– Like HTA, also estimate task times for the alternatives

GOMS Example
● Goals, Operators, Methods, Selection rules
● Example: how to close a Firefox tab

GOAL: CLOSE-ACTIVE-TAB
  [select GOAL: USE-MENU-METHOD
            MOVE-MOUSE-TO-MENU-BAR
            DRAG-DOWN-FILEMENU
            RELEASE-ON-CLOSE-TAB-OPTION
          GOAL: USE-HANDLE-METHOD
            MOVE-MOUSE-TO-INTERNAL-CORNER
            CLICK-ON-CLOSE-BOX
          GOAL: USE-CONTROL-KEY
            PRESS-CTRL-W]
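As a rough illustration of how a model-based analysis turns a goal hierarchy like this into predicted task times, the Python sketch below applies Keystroke-Level Model (KLM) style operator estimates to the three methods. The operator encodings and time constants are commonly cited textbook approximations, not values given in the lecture, so treat the numbers as placeholders.

```python
# Rough KLM-style time estimates for the three CLOSE-ACTIVE-TAB methods.
# Operator times (seconds) are typical textbook approximations, used here
# only as placeholder assumptions.
OPERATOR_TIMES = {
    "M": 1.35,  # mental preparation
    "P": 1.10,  # point with the mouse
    "B": 0.10,  # press or release a mouse button
    "K": 0.20,  # press a key
}

# Each method is expressed as a sequence of KLM operators (an assumed
# encoding of the GOMS steps above).
METHODS = {
    "USE-MENU-METHOD":   ["M", "P", "B", "P", "B"],  # point to menu, press, drag to Close Tab, release
    "USE-HANDLE-METHOD": ["M", "P", "B", "B"],       # point to the tab's close box, click
    "USE-CONTROL-KEY":   ["M", "K", "K"],            # Ctrl + W
}

def predicted_time(operators):
    """Sum the operator times to get a predicted completion time."""
    return sum(OPERATOR_TIMES[op] for op in operators)

for name, ops in METHODS.items():
    print(f"{name}: {predicted_time(ops):.2f} s")
```

A selection rule would then choose among the methods, for example preferring the fastest one the user actually knows.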
Tradeoffs
● Usability inspections are fast and cheap, BUT they:
– Miss details that only show up in actual use
– Don't identify the causes of problems
– Emphasize problems that are infrequent or atypical in actual use
– Contribute little to overall HCI theory
● Model-based analysis has a scientific foundation and is powerful and credible, BUT it:
– Is limited to the scope of the theory
– Is time-consuming to develop
– Ignores higher-level structures of behavior

Empirical Evaluation
● Real data from real use. The main concern is validity:
– Are the users representative?
– Is the test population large/diverse enough?
– Is the test system realistic enough (vs. early prototypes)?
– Does the data reveal real-life impact?
● Generally: does the investigation genuinely reflect real-world happenings?
● Field studies – observations of real life
  + by definition, the tasks are valid and the data is relevant
  - difficult to categorize and summarize the data
  - time-consuming to set up and conduct
● Interviews – ask about critical incidents
  + a collaborative effort between designers and stakeholders
  - memory is biased; people tend to reconstruct rather than recall

Controlled Experiments
● Carefully select representative tasks from the task analysis and SBD claims
● Control for uninteresting aspects, e.g., location, task order, instructions
● Collect multiple measures of performance (time/errors), output quality, and satisfaction ratings
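One concrete way to control for task order is to counterbalance it across participant groups; counterbalancing is not spelled out in the slides, so the sketch below is only an assumed illustration. It builds a simple cyclic Latin square so that each task appears in every serial position exactly once; the task names are hypothetical.

```python
# Counterbalance task order with a cyclic Latin square: each task appears
# exactly once in each serial position across the n participant groups.
def latin_square(tasks):
    n = len(tasks)
    return [[tasks[(row + col) % n] for col in range(n)] for row in range(n)]

# Placeholder task names drawn from a hypothetical task analysis.
tasks = ["search", "compose", "file", "print"]

for group, order in enumerate(latin_square(tasks), start=1):
    print(f"Group {group}: {' -> '.join(order)}")
```

A balanced Latin square goes further and also evens out which task immediately precedes which, controlling for carry-over effects between adjacent tasks.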
Controlled Experiments (continued)
● Define the hypothesis in advance, i.e., the expected outcome
– Independent variable – that which is manipulated; each manipulation is called a test condition or a level (e.g., three different input devices)
– Dependent variable – the measured experiment outcome (e.g., time to complete a task)
● Several of each may be included in an experiment
● Two kinds of experiment design:
– Within-subjects
– Between-subjects
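To make the independent/dependent variable distinction concrete, the sketch below summarizes completion times for the three-input-device example from the slide: the device is the independent variable (three levels) and time to complete the task is the dependent variable. The numbers are made up purely for illustration.

```python
# Illustrative (made-up) completion times in seconds for three input-device
# conditions: the device is the independent variable, and time to complete
# the task is the dependent variable measured for each participant.
from statistics import mean, stdev

times = {
    "mouse":    [12.1, 11.4, 13.0, 12.6, 11.9],
    "trackpad": [14.3, 13.8, 15.1, 14.0, 14.7],
    "stylus":   [10.8, 11.2, 10.5, 11.0, 11.6],
}

for device, samples in times.items():
    print(f"{device:8s}  mean={mean(samples):5.2f} s  sd={stdev(samples):4.2f} s")
```

Whether the same participants contribute a time under every device (within-subjects) or each participant uses only one device (between-subjects) determines how such samples should be compared statistically.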

