Usability Engineering
Dr. Dania Bilal
IS 588
Spring 2007
Drs. Bilal & Normore

Purposes
• Measures multiple components of the user interface
• Addresses relationships between a system and its users
• Bridges the gap between humans and machines

Purposes
• Measures the quality of system design in relation to its intended users
• Involves several methods, each applied at the appropriate time in the design and development process

Usability Attributes
As described by Nielsen:
• Learnability
• Efficiency
• Memorability
• Errors & their severity
• Subjective satisfaction

Learnability
• System must be easy to learn, especially for novice users
• Hard-to-learn systems are usually designed for expert users
• Learning curve differs for novice and expert users

Efficiency
• System should be efficient to use, so that once the user has learned it, the user can achieve a high level of productivity
• Efficiency increases with learning

Memorability
• System should be easy to remember, especially by casual users
• No need to learn the system all over again after a period of not using it

Errors
• System should have a low error rate
• System should provide the user with a recovery mechanism
• Minor errors
• Major errors

Minor Errors
• Errors that do not greatly slow down the user's interaction with the system
• User is able to recover from them
  • through system feedback
  • through awareness of the error made

Major Errors
• Difficult to recover from
• Lead to faulty work if high in frequency
• May not be discovered by the user
• Errors can be catastrophic

Subjective Satisfaction
• System should be likeable to users (affective)
• Satisfaction varies with the purpose of the system and user goals

Assumptions
• The designer's best guess is not good enough
• The user is always right
• The user is not always right
• Users are not designers
• Designers are not users
• More features are not always better
• Minor interface details matter
• Online help does not really help
Source: Nielsen, J. (1993). Usability Engineering. San Diego: Morgan Kaufmann.

Cognitive Walkthrough Method
• Involves experts acting on behalf of actual users
• Characteristics of typical users are identified & documented
• Tasks focusing on the aspects of the design to be evaluated are developed

Cognitive Walkthrough Method
• An observer ("experimenter") is present
  • Prepares tasks
  • Takes notes, provides help, etc.
  • Coordinates and oversees the final report

Cognitive Walkthrough Method
• Experts walk through the interface on each task
• Each expert records problems that a user may experience
• Assumptions about what would cause problems, and why, are noted
• Benchmarks may be used for each task

Sample Questions for Walkthrough
• Will the user know what to do to complete part of, or the whole of, the task successfully?
• Can the user see the button or icon to use for the next action?
• Can the user find a specific subject category in the hierarchy?

Cognitive Walkthrough
• Each expert documents the walkthrough experience for each task
• Critical problems are documented
• Problems, and what causes them, are explained
• Draft report/notes are compiled and shared with the other experts and the experimenter

Debriefing Session
• Experts and the experimenter meet & discuss findings
• Experimenter shares his/her observational notes with the experts
• Findings include success stories & failure stories, as applicable
• A consolidated report is generated

Walkthrough Report
• Include the questions asked of the experts for each task and the consolidated answers (see Text, p. 420, for examples)
• Use benchmarks and map out the findings for each task
• See Assignment 5 for further info.

Heuristic Evaluation
• Evaluators interact with an interface several times and map the interface to specific heuristics or guidelines
  • Example: Nielsen's ten heuristics
• Each evaluator generates a report
• Reports are aggregated and a final report is generated
• An observer may be present

Stages of Heuristic Evaluation
Stage 1: Briefing session
• Experts are told what to do
• Written instructions are provided to each expert
• Heuristics are provided to each expert as part of the written instructions
• Verbal instructions may be included

Stages of Heuristic Evaluation
Stage 2: Evaluation sessions
• Each expert tests the system against the heuristics
• Experts may also use specific tasks
• Two passes are taken through the interface
  • First pass: overview and familiarity
  • Second pass: focus on specific features & identify usability problems

Stages of Heuristic Evaluation
Stage 3: Debriefing session
• Experts meet to discuss the outcome and compare findings
• Experts consolidate their findings
• Experts prioritize the usability problems found & suggest solutions

Nielsen's Heuristics
• Ten heuristics, found at http://www.useit.com/papers/heuristic/heuristic_list.html & Text, pp. 408-409.
• Additional rules: see Text, p. 409; pp. 412-417.
• Some heuristics can be combined under categories and given a general description.

Usability Heuristics
• http://www.usabilityfirst.com/methods
• http://www.useit.com/papers/heuristic/heuristic_evaluation.html (how to conduct a heuristic evaluation)
• http://www.uie.com/articles (collection of articles)
• http://www.uie.com/articles/usability_tests_learn/ (Learning about usability tests, Jared Spool)
• http://www.useit.com/papers/heuristic/severityrating.html (severity ratings)
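One common way to operationalize the debriefing stage of a heuristic evaluation is to have each expert rate every problem found on Nielsen's 0–4 severity scale (0 = not a problem, 4 = usability catastrophe; see the severity-rating page above), then average the ratings across evaluators to produce a prioritized problem list. A minimal Python sketch under that assumption; the problem names and ratings below are hypothetical, not from the lecture:

```python
from statistics import mean

# Each evaluator's report: problem description -> severity rating (0-4).
# These three sample reports are illustrative data only.
ratings_by_evaluator = [
    {"no undo on delete": 4, "jargon in menu labels": 2, "slow search feedback": 3},
    {"no undo on delete": 3, "jargon in menu labels": 2, "slow search feedback": 2},
    {"no undo on delete": 4, "jargon in menu labels": 1, "slow search feedback": 3},
]

def prioritize(reports):
    """Pool every evaluator's ratings per problem, average them,
    and return (problem, mean severity) pairs sorted worst-first."""
    pooled = {}
    for report in reports:
        for problem, rating in report.items():
            pooled.setdefault(problem, []).append(rating)
    averaged = {problem: mean(ratings) for problem, ratings in pooled.items()}
    return sorted(averaged.items(), key=lambda item: -item[1])

for problem, severity in prioritize(ratings_by_evaluator):
    print(f"{severity:.2f}  {problem}")
```

The averaging step matters because single-evaluator severity estimates are noisy; the consolidated ranking is what the experts would then discuss and attach suggested solutions to.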