Storytelling results of heuristic evaluation


Slide 1. Storytelling the Results of Heuristic Evaluation. Carol Barnum, Director of The Usability Center @ Southern Polytechnic.
Slide 2. Heuristic evaluation is a popular pick. UPA survey results are consistent:
  2007 survey: 77% of respondents
  2009 survey: 74% of respondents
  2011 survey: 75% of respondents
Slide 3. Why so popular? Fast. Cheap. Easy. Effective. Convenient.
Slide 4. [image only]
Slide 5. "It can often be more expensive and difficult to find 3-5 usability professionals as it is to test 3-5 users." Jeff Sauro, "What's the difference between a Heuristic Evaluation and a Cognitive Walkthrough?" Measuring Usability, Aug. 2, 2011. Care to comment?
Slide 6. Nielsen's 10 heuristics (J. Nielsen and R. Mack, eds., Usability Inspection Methods, 1994):
  1. Visibility of system status
  2. Match between system and real world
  3. User control and freedom
  4. Consistency and standards
  5. Error prevention
  6. Recognition rather than recall
  7. Flexibility and efficiency of use
  8. Aesthetic and minimalist design
  9. Help users recognize, diagnose, and recover from errors
  10. Help and documentation
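The deck stops at the list itself; purely as an illustration, the heuristics work well as a fixed tagging vocabulary for findings. A minimal Python sketch (the enum and its member names are assumptions for this example, not anything Nielsen or the deck prescribes):

```python
from enum import Enum

class Heuristic(Enum):
    """Nielsen's 10 heuristics as a controlled vocabulary for tagging findings."""
    VISIBILITY_OF_SYSTEM_STATUS = 1
    MATCH_SYSTEM_AND_REAL_WORLD = 2
    USER_CONTROL_AND_FREEDOM = 3
    CONSISTENCY_AND_STANDARDS = 4
    ERROR_PREVENTION = 5
    RECOGNITION_RATHER_THAN_RECALL = 6
    FLEXIBILITY_AND_EFFICIENCY_OF_USE = 7
    AESTHETIC_AND_MINIMALIST_DESIGN = 8
    ERROR_RECOGNITION_AND_RECOVERY = 9
    HELP_AND_DOCUMENTATION = 10
```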
Slide 7. The Nielsen Method:
  • Small set of evaluators
    – 3 to 5 is the optimal cost-benefit point
    – A single evaluator finds about 35% of problems
  • Each evaluator inspects alone
    – 1 to 2 hours
    – Several passes through the interface
    – Inspection based on the heuristics
    – If evaluators are not SMEs, hints can be given
    – Each evaluator writes notes or a report
Slide 8. The Nielsen Method, continued. After individual evaluations are done, evaluators:
  – Talk to each other, often with a facilitator
  – Share reports/notes
  – Collate findings
  – Rank issues by severity
  – Write a compiled report
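The deck doesn't prescribe any tooling for this collation step; as a sketch only, merging per-evaluator notes and ranking the result by severity could look like the following Python (the Finding record, the 0-4 severity scale, and the dedupe-by-description rule are all illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class Finding:
    description: str   # what the evaluator observed
    heuristic: str     # heuristic violated, e.g. "Consistency and standards"
    severity: int      # assumed 0-4 scale: 0 = not a problem, 4 = catastrophe
    evaluator: str     # who reported it

def compile_report(per_evaluator: list[list[Finding]]) -> list[Finding]:
    """Merge every evaluator's notes, drop duplicates, rank worst-first."""
    merged: dict[str, Finding] = {}
    for notes in per_evaluator:
        for finding in notes:
            key = finding.description.casefold().strip()
            # When evaluators report the same issue, keep the highest severity.
            if key not in merged or finding.severity > merged[key].severity:
                merged[key] = finding
    return sorted(merged.values(), key=lambda f: f.severity, reverse=True)
```

In practice the collation meeting is a discussion, not string matching; the point of the sketch is only that the compiled report orders issues by agreed severity.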
Slide 9. Nielsen's variations on the method:
  • Supply a typical usage scenario, listing the steps a user would take to perform tasks
  • Hold a design debrief with designers
  • Use brainstorming to focus on possible solutions
  • Include positive findings
Slide 10. And the method is called… "Discount Usability Engineering."
Slide 11. So, what do you get?
  • A list of potential problems
  • Also (sometimes) the positive findings
  • Tied to a heuristic or rule of practice
  • A ranking of findings by severity
  • (Sometimes) recommendations for fixing problems
  • A report of findings
Slide 12. What do you do? Well, I always . . .
Slide 13. What do I do? Well, I used to follow Nielsen.
Slide 14. Phase 1: Nielsen is my bible.
Slide 15. CUE-4, Hotel Pennsylvania, 2003 (CHI 2003):
  • Comparative evaluation of the reservation process
  • 17 teams
    – 8 did expert review/heuristic evaluation
    – Only 1 team used Nielsen's heuristics
  • Rolf's conclusions
    – Findings "overly sensitive": too many to manage
    – Need to improve classification schemes
    – Need more precise and usable recommendations
  Results available at Rolf Molich's DialogDesign: http://www.dialogdesign.dk/CUE-4.htm
Slide 16. [image only]
Slide 17. After that, what did I do? I got a little older and wiser.
Slide 18. Phase 2: Loose interpretation of Nielsen:
  • Dropped his heuristics
  • Kept severity ratings
  • Added screen captures
  • Added callouts
  • Added recommendations
Slide 19. Example row from a Phase 2 findings table. The evaluation covered three modules: H = Hyperspace, C = Cardiac Arrest, S = Shock.

Finding: Hyperspace, Shock, and Cardiac Arrest all require more clearly defined goals and objectives. (Applies to: H, C, S. Severity rating: 3.)

Description of problem:
  • Objectives/goals for the modules not clear
  • Unclear reason content is being presented
  • Lack of conciseness of presentation
  • Definitions are required to work with the module/content
  • Evaluation criteria and methods unclear
  • Direct tie between content and assessment measure unclear
  • Sequence of presentation does not follow logically from introduction
  • Quizzes do not challenge users

Recommendation:
  • Develop a consistent structure that defines what's noted in the bulleted points above
  • Avoid generic statements that don't focus users on what they will be accomplishing
  • Advise that there is an assessment used for evaluation and indicate if it's at the end or interspersed in the module
  • Connect ideas in the goals and objectives with outcomes in the assessment
  • Follow the order of presentation defined at the beginning
  • Develop interesting and challenging quiz questions
  • Re-frame goals/objectives at the end of the module
Slide 20. [image only]
Slide 21. [image only]
Slide 22. Then, what did I do? I broke free!
Slide 23. Phase 3: I did it my way. Findings stated in our terminology, with screen captures.
Slide 24. [image only]
Slide 25. Example findings:
  • A unique password between 6 and 16 characters was required, but "unique" is not defined. This is a problem with terminology.
  • Usually, passwords must be a combination of letters and numbers for higher security. An all-letter password, "Heuristics," was accepted. A dictionary term is not a secure password and contradicts accepted conventions. The ability to input a dictionary word may be a component of trust for users.
  • The username and security question answer were rejected on submit. This result is confusing, as the name was confirmed on the previous screen. This relates to establishing conventions for the form of names/passwords on the input screen. Input formats need to be defined on the relevant page.
  • Differences in spelling ("username" vs. "user name") are subtle but are consistency issues.
  • The red banner is confusing, as the user chose the gold (Free Edition). This is a consistency issue.
Slide 26. User experience emerges in expert reviewer comments . . .
Slide 27. Expert reviewer comment: "I wanna click on the map, not the pulldown. WAH! Also, I've got no idea what the text on this page means."
Slide 28. Why not tell the user's story?!
Slide 29. Strategy: persona-based scenario review
  • Ginny Redish and Dana Chisnell
  • AARP report: 58 pages, 50 websites
    – Two personas: Edith and Matthew
    – Evaluators "channel" the user via the persona and tasks/goals
    – The users' stories emerge
  Available from Redish & Associates: http://www.redish.net/images/stories/PDF/AARP-50Sites.pdf
Slide 30. From Chisnell and Redish, Designing Web Sites for Older Adults: Expert Review of Usability for Older Adults at 50 Web Sites (for AARP):
  • While the clickable area is very large in the navigation blocks, Edith expected to click on the labels, so she was surprised when the menu appeared.
  • When trying to click an item in the menu above, Edith had trouble selecting because her mouse hovered close enough to the choices below to open that menu, obscuring the item she wanted to click.
Slide 31. Engage in conversation with your reader. "Every use of every website is a conversation started by the site visitor." Ginny Redish, Letting Go of the Words, 2nd ed., Morgan Kaufmann, Sept. 2012.
Slide 32. Tell the story of your user's experience. "Stories organize facts in memorable ways." Whitney Quesenbery and Kevin Brooks, Storytelling for User Experience, Rosenfeld Media, 2010.
Slide 33. Options for report deliverables: no deliverable, quick findings, presentation, or detailed report.
Slide 34. The "big honkin' report."
Slide 35. Steve Krug's approach (Rocket Surgery Made Easy, New Riders, 2010):
  • All sites have usability problems
  • All organizations have limited resources
  • You'll always find more problems than you have resources to fix
  • It's easy to get distracted by less serious problems that are easier to solve . . .
  • Which means that the worst ones often persist
  • Therefore, you have to be intensely focused on fixing the most serious problems first
Slide 36. "Focus ruthlessly on a small number of the most important problems." Steve Krug.
Slide 38. Lighten the load: start with Quesenbery's 5 E's (Whitney Quesenbery, wqusability.com):
  • Effective
  • Efficient
  • Engaging
  • Error-tolerant
  • Easy to learn
Slide 39. Customize your heuristics.
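Neither Quesenbery nor the deck ties the 5 E's to any particular tooling; continuing the earlier illustrative sketches, customizing your heuristics could be as simple as swapping the tagging vocabulary that findings draw from (the glosses in the comments are common shorthand readings, not quotes from the deck):

```python
# Illustrative only: Quesenbery's 5 E's as a replacement vocabulary
# for Finding.heuristic in the earlier sketch.
FIVE_ES = [
    "Effective",       # does the product let users meet their goals?
    "Efficient",       # with how much time and effort?
    "Engaging",        # is the experience pleasant and appropriate?
    "Error-tolerant",  # are errors prevented, and easy to recover from?
    "Easy to learn",   # can users get started and build on what they know?
]
```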
Slide 40. Walk in your user's shoes.
Slide 41. It's up to you.
Slide 42. The End. More in the book . . . or email cbarnum@spsu.edu.
Image credits:
  • Slide 4: smartevos.com
  • Slide 10: thecampuscompanion.com
  • Slides 12, 13: lkgh.today.msnbc.com
  • Slide 15: estdevonmethodists.org.uk
  • Slide 19: groundnotes.wordpress.com
  • Slide 24: musicalstewdaily.wordpress.com
  • Slide 27: spinsucks.com
  • Slide 29: uncycloped.wikea.com
  • Slide 35: momentsofserenity.wordpress.com
  • Slide 40: thebolditalic.com
  • Slide 41: en.wikipedia.com
  • Slide 42: apta.com
