UMD CMSC 434 - Qualitative Evaluation Techniques

Qualitative Evaluation Techniques
Evan Golub / Ben Bederson / Saul Greenberg

• How to quickly evaluate prototypes by observing people's use of them
• How specific methods can help you discover what a person is thinking about as they are using your system

Qualitative methods for usability evaluation
Qualitative:
• produces a description, usually in non-numeric terms
• may be subjective
Methods:
• introspection
• direct observation
  - simple observation
  - think-aloud
  - constructive interaction
• query via interviews and questionnaires

The Introspection Method
The designer tries the system (or prototype) out, in a walkthrough of the system's screens and features.
• does the system "feel right"?
• most common evaluation method
Problems:
- not reliable, since it is completely subjective
- not valid, since the introspector is not a typical user
Also: intuitions and introspection are often wrong!

Conceptual Model Extraction
Show users low-fidelity prototypes or screenshots of medium-fidelity prototypes (a user-centered walkthrough). Ask the user to explain what each screen element does or represents, as well as how they would attempt to perform individual tasks.
This gives insight into the user's initial perception of the interface and the mental model they might be constructing as they begin to use the system.
NOTE: Since we guide them through specific parts, we will not really see how a user might explore the system on their own, or their learning process.

Direct Observation
The evaluator observes and records users interacting with the design/system.
• in the lab:
  - the user is asked to complete a set of pre-determined tasks
  - a specially built and fully instrumented usability lab may be available
• in the field:
  - the user goes through their normal duties
Excellent at identifying gross design/interface problems.
Validity/reliability depends on how controlled/contrived the situation is.
Three general approaches:
• simple observation
• think-aloud
• constructive interaction

Simple Observation Method
The user is given the task, and the evaluator just watches the user.
Problem:
• does not give insight into the user's decision process or attitude

The Think-Aloud Method
Subjects are asked to say what they are thinking and doing:
- what they believe is happening
- what they are trying to do
- why they took an action
• gives insight into what the user is thinking
Problems:
- awkward/uncomfortable for the subject (thinking aloud is not normal!)
- "thinking" about it may alter the way people perform their task
- hard to talk while concentrating on the problem
Most widely used evaluation method in industry.
Example: "Hmm, what does this do? I'll try it… Oops, now what happened?"

The Constructive Interaction Method
Two people work together on a task.
• the normal conversation between the two users is monitored
  - removes the awkwardness of think-aloud
• variant: co-discovery learning
  - pair a semi-knowledgeable "coach" with a naive subject
  - have the naive subject use the interface
• results in:
  - the naive subject asking questions
  - the semi-knowledgeable coach responding
  - insights into the thinking processes of both beginner and intermediate users
Example: "Now, why did it do that?" "Oh, I think you clicked on the wrong icon."

Recording Observations (make sure you ask permission)
How do we record user actions during observation for later analysis? If no record is kept, the evaluator may forget, miss, or misinterpret events.
• paper and pencil
  - primitive but cheap
  - evaluators record events, interpretations, and extraneous observations
  - hard to get detail (writing is slow)
  - coding schemes help…
• audio recording
  - good for recording talk produced by think-aloud/constructive interaction
  - hard to tie to user actions (i.e., what they are doing on the screen)
  - hard to search through later
• video recording
  - can see and hear what a user is doing
  - one camera for the screen, another for the subject (picture in picture)
  - can be intrusive during the initial period of use
  - generates too much data

Coding Scheme Example: tracking a person's activity in the office
[Figure: a timeline from 9:00 to 9:13 with coded activity categories grouped as desktop activities (working on computer, working on desk), interruptions (person enters room, answers telephone, initiates telephone), and absences (away from desk but in room, away from room); each activity is marked with "s" at its start and "e" at its end.] A minimal sketch of such a coding log in code appears below.
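The slides present this coding scheme as a paper chart; the sketch below (not part of the original slides) shows one way the same start/end marks could be logged and summed into per-category durations in Python. The category names come from the figure, but the specific timestamps and pairings in the sample log are illustrative, not the slide's data.

```python
from datetime import datetime

# Activity categories taken from the slide's coding scheme (listed for reference only).
CATEGORIES = [
    "working on computer", "working on desk",         # desktop activities
    "person enters room", "answers telephone",
    "initiates telephone",                            # interruptions
    "away from desk but in room", "away from room",   # absences
]

def parse(clock):
    """Turn a clock string such as '9:02' into a datetime (the date part is unused)."""
    return datetime.strptime(clock, "%H:%M")

def durations(events):
    """Pair each 's' (start) mark with the next 'e' (end) mark for the same
    category and return the total observed minutes per category."""
    open_starts = {}  # category -> time its current activity started
    totals = {}       # category -> accumulated minutes
    for clock, category, mark in events:
        t = parse(clock)
        if mark == "s":
            open_starts[category] = t
        elif mark == "e" and category in open_starts:
            minutes = (t - open_starts.pop(category)).total_seconds() / 60
            totals[category] = totals.get(category, 0.0) + minutes
    return totals

# Illustrative event log; the order and pairings are made up, not the slide's data.
log = [
    ("9:00", "working on computer", "s"),
    ("9:02", "person enters room", "s"),
    ("9:02", "person enters room", "e"),
    ("9:05", "answers telephone", "s"),
    ("9:10", "answers telephone", "e"),
    ("9:13", "working on computer", "e"),
]

print(durations(log))
# -> {'working on computer': 13.0, 'person enters room': 0.0, 'answers telephone': 5.0}
```

Keeping the log as plain (time, category, mark) tuples mirrors what an observer would tick off on paper, and makes the record easy to export to a spreadsheet for later analysis.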
Querying Users via Interviews
Excellent for pursuing specific issues.
• vary questions to suit the context
• probe more deeply on interesting issues as they arise
• good for exploratory studies via open-ended questioning
• often leads to specific constructive suggestions
Problems:
• accounts are subjective
• time consuming
• the evaluator can easily bias the interview
• prone to rationalization of events/thoughts by the user
  - the user's reconstruction may be wrong

Structured Interviews
Plan a set of central questions.
• could be based on the results of user observations
• gets things started
• focuses the interview
• ensures a base of consistency
Try not to ask leading questions!
  leading: "Now that was easy, wasn't it?"
  neutral: "How hard would you say this task was?"
Start with individual discussions to discover different perspectives, and continue with group discussions.
• the larger the group, the more the universality of comments can be ascertained
• also encourages discussion between users

Retrospective Testing
A post-observation interview to clarify events that occurred during system use.
• perform an observational test
• create a video record of it
• have users view the video and comment on what they did
  - excellent for grounding a post-test interview
  - avoids erroneous reconstruction
  - users often offer concrete suggestions
Example: "Do you know why you never tried that option?" "I didn't see it. Why don't you make it look like a button?"

Evaluation through query, continued: Methods
Questionnaires / Surveys
• preparation is "expensive," but administration is cheap
  - can reach …

