This interactive talk focuses on the UX tool of heuristic evaluation (also called expert review) and best practices for conducting this review and reporting its results. Audience members will be prompted to share their experiences in conducting reviews and reporting them. A straw poll will indicate how many follow a standard set of heuristics and how many do something else. Discussion of the whys and why nots will set the stage for focusing on how to report the results. A brief walk through the evolution of reporting, from checklist to narrative, will include examples from reports to prompt audience stories of their process and its effectiveness. New UX practitioners and students, as well as seasoned veterans, will have the chance to defend their approach or perhaps be persuaded to change.
1. Storytelling the
results of heuristic
evaluation
Carol Barnum
Director of Graduate Studies in Information Design and The Usability Center
@ Southern Polytechnic
2. Heuristic evaluation is a popular pick
UPA survey results for HE/expert review
% of respondents Survey year
77% 2007
74% 2009
75% 2011
Boston UPA 2012 Slide 2
3. Why so popular? Fact or myth?
• Fast
• Cheap
• Easy
• Effective
• Convenient
5. HE output
• A list of usability problems
• Tied to a heuristic or rule of practice
• A ranking of findings by severity
• Recommendations for fixing problems
• Oh, and the positive findings, too
6. Nielsen’s 10 heuristics
1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
J. Nielsen and R. Mack, eds. Usability Inspection Methods, 1994
7. What do you do?
• Do you do it (or teach it)?
• How do you do it?
• Why do you do it?
• Do you do it alone or with others?
• How do you present findings?
• Is it cheaper than usability testing?
8. What do I do? A brief history
• Phase 1: Nielsen is my bible
9. CUE 4 Hotel Pennsylvania
• Comparative evaluation of reservation process
• 17 teams
– 8 teams did expert review/HE
– Only 1 of those teams used formal heuristic evaluation
• Rolf Molich's conclusions
– Findings "overly sensitive": too many to manage
– Need to improve classification schemes
– Need more precise and usable recommendations
CHI 2003
Results available at Rolf Molich’s DialogDesign website, http://www.dialogdesign.dk/CUE-4.htm
11. What do I do? A brief history
• Phase 1: Nielsen is my bible
• Phase 2: findings loosely based on Nielsen
– tables
– severity ratings
– screen captures for
• observations
• recommendations
12. Hyperspace, Shock, and Cardiac Arrest all require more clearly defined goals and objectives.
H = Hyperspace; C = Cardiac Arrest; S = Shock
• Objectives/goals for the modules (severity rating: 3)
– Description: reason content is being presented; conciseness of presentation; definitions required to work with the module/content
– Recommendation: develop a consistent structure that defines what's noted in the bulleted points above; avoid generic statements that don't focus users on what they will be accomplishing
• Evaluation criteria and methods
– Description: direct tie between content and assessment measure
– Recommendation: advise that there is an assessment used for evaluation and indicate if it's at the end or interspersed in the module; connect ideas in the goals and objectives with outcomes in the assessment
• Sequence of presentation follows logically from introduction
– Recommendation: follow the order of presentation defined at the beginning
• Quizzes challenge users
– Recommendation: develop interesting and challenging questions; re-frame goals/objectives at the end of the module
15. What do I do? A brief history
• Phase 1: Nielsen is my bible
• Phase 2: findings loosely based on Nielsen;
tables, screen captures, recommendations
• Phase 3: screen captures, UX terminology
17. A unique password between 6 and 16 characters was
required. “Unique” is not defined. This is a problem
with terminology.
Usually, passwords must be a combination of letters and numbers for higher security. An all-letter password ("Heuristics") was accepted. A dictionary term is not a secure password and contradicts accepted conventions. The ability to input a dictionary word may undermine users' trust.
The username and security question answer were
rejected on submit.
This result is confusing as the name was
confirmed on the previous screen. This relates
to establishing conventions for the form of
names/passwords on the input screen. Input
formats need to be defined on the relevant
page.
Differences in spelling “username” vs. “user
name” are subtle but are consistency issues.
The red banner is confusing as the user chose the
gold (Free Edition). This is a consistency issue.
18. What do I do? A brief history
• Phase 1: Nielsen is my bible
• Phase 2: findings loosely based on Nielsen;
tables, screen captures, recommendations
• Phase 3: screen captures, UX terminology
– Phase 3.1: user experience emerges
19. State Tax
Reviewer comment: I wanna click on the map, not the pulldown. WAH!
Also, I’ve got no idea what the text on this page means.
20. What do I do? A brief history
• Phase 1: Nielsen is my bible
• Phase 2: findings loosely based on Nielsen;
tables, screen captures, recommendations
• Phase 3: screen captures, UX terminology
– Phase 3.1: user experience emerges
• Phase 4: tell the story of the user experience
21. Persona-based scenario review
• Ginny Redish and Dana Chisnell
• AARP report—58 pages, 50 websites
– Two personas—Edith and Matthew
– Evaluators "channel" the user via persona and
tasks/goals
– Their story emerges
Available from Redish & Associates http://www.redish.net/images/stories/PDF/AARP-50Sites.pdf
22. While the clickable area is very large in the navigation blocks, Edith expected to click on the labels, so she was surprised when the menu appeared.
When trying to click an item in the menu above, Edith had trouble selecting because her mouse hovered close enough to the choices below to open that menu, obscuring the item she wanted to click.
Chisnell and Redish, Designing Web Sites for Older Adults: Expert Review of Usability for Older Adults at 50 Web Sites (for AARP)
23. Steve Krug’s approach
• All sites have usability problems
• All organizations have limited resources
• You’ll always find more problems than you have
resources to fix
• It’s easy to get distracted by less serious problems
that are easier to solve...
• Which means that the worst ones often persist
• Therefore, you have to be intensely focused on
fixing the most serious problems first
Rocket Surgery Made Easy, New Riders, 2010
24. Krug’s maxims
• Focus ruthlessly on a small number of the
most important problems.
• When fixing problems, always do the least you
can do.
25. Conversation, Storytelling
• Ginny Redish
– Letting Go of the Words, Morgan Kaufmann, 2007
– Engage in conversation with your reader
• Whitney Quesenbery and Kevin Brooks
– Storytelling for User Experience Design, Rosenfeld,
2010
– Stories can be a part of all stages of work from
user research to evaluation
26. Report deliverable – What do you do?
No deliverable
Quick findings
Presentation
Detailed report
Jim Ross, “Communicating User Research Findings,” UX Matters, Feb. 6, 2012