2. Introductions
• Who we are:
– Debbie Cook
– Michelle Erickson
• Who you are:
– Name
– Company
– Role
– Goals for this course
3. Objectives
In this workshop, you will learn to:
• Plan, design, and prepare for a diagnostic
usability test
– Decide which user(s) to recruit
– Create effective tasks and scenarios
• Run and effectively moderate a usability test
• Employ a low-cost, low-effort approach for
capturing and analyzing results
6. Agenda
• Definition of usability testing
• Planning, designing, and preparing for a test
• BREAK
• Moderating and running a test
• Analyzing and summarizing results
7. What is Usability Testing?
“A systematic way of observing actual and potential users of a product as they work with it under controlled conditions.”
– Joe Dumas & Beth Loring, Moderating Usability Tests
[Diagram: Moderator, Participant, and Observer]
8. Two Types of Usability Tests
• Formative test
– Goal: Uncover and diagnose issues; understand
user’s workflow and/or mental model
– Focus is on qualitative assessment, identifying and
fixing issues with the design
• Summative test
– Goal: Measure usability of a product or provide a
baseline for comparison between products or
versions
– Focus is on quantitative assessment
This tutorial will focus on Formative Testing.
9. Formative Testing Can Help You…
• Learn about users’ requirements
• Diagnose problems early and reduce waste
• Ensure users’ success and improve their
satisfaction with the product
• Advocate for a change to the design
• Educate stakeholders about how users interact
with the product
10. Planning Moderating Analyzing
PLANNING, DESIGNING AND
PREPARING FOR YOUR TEST
11. Planning Roadmap
Create a plan: “the what”
• Objectives
• Scenario
• Tasks
• Protocol
• Prototype
• Pilot test
Include the right people: “the who”
• Participants
• Observers
12. Test Plan Template
13. Case Study: DisneyParks.com
14. Deciding on Test Objectives
• What are your goals for the test?
• What do you hope to learn?
– Are there aspects of the design you’re concerned
about?
– Are there particular user groups that you’re
concerned about?
15. Thinking About Your Users
Identify target user profiles of interest
• Think about the target users of the product
– Primary users
– Secondary users
• Determine which user characteristics are most
likely to impact results
• Review with team to ensure buy-in
16. DisneyParks.com Activity: Brainstorm Objectives and Users
As a group, let’s brainstorm together
• What goals or objectives might you have for a
usability test of this Web site?
• Who are target users of this Web site?
17. DisneyParks.com Example: Test Objectives
18. DisneyParks.com Example: Users
19. Creating a Scenario
• What is a test scenario?
– A short description of the participant’s “current
state” used to set the context for the test or task.
• A well-written scenario…
– Supports the goal(s) of your test
– Is realistic
– Speaks the user’s language, not the system’s
20. Developing Your Tasks
• What is a test task?
– A brief description of what you want the participant
to do during the test (not how to do it)
• A well-written task…
– Supports the goals of your test
– Is realistic; it reflects a real task the user will perform
– Provides the relevant details needed
– Speaks the user’s language, not the system’s
– Probes potential usability issues
– Is typically independent of other test tasks
21. Tips for Developing Tasks
Think about tasks that are:
• New to the user or require the user to change the
way s/he performs the task
• Performed frequently
• Critical to success
• Likely to reveal a potential usability issue or
concern
• Unique to a particular environment or interaction
22. DisneyParks.com Activity: Brainstorm Scenario and Tasks
As a group,
• Brainstorm (15 minutes):
– Test scenario - how would you “set the stage” for
the user at the beginning of the test?
– Tasks - what task(s) might you have the users
perform to support your test objectives?
• Report out (5 minutes):
23. DisneyParks.com Example: Scenario and Tasks
24. Refining Your Tasks
Refine and scope tasks based on:
• Priority
• Relevance
• Prototype capabilities
• Time
– Estimated time to complete
– Stakeholder availability
• Possibility of completion
NEVER give users an impossible task!
25. Establishing a Test Protocol
• Always start the user off with an easy task
• Identify possible curve balls
– Create a plan for how you will deal with these if
the user encounters them
– Ensure the team and observers are aware of, and
agree to, the protocol
• Decide how you will provide help if asked
• Identify key information for the moderator
(you) and the observers to be aware of
26. Identifying Key Information
For each task in your plan, capture:
• The goal(s) for the task
• The “correct” answer
• Estimated time on task
• Any special protocol for the task
• Areas of interest you want your observers to
take note of
This information is only for you and your observers.
27. Example Test Protocol
28. DisneyParks.com Example: Key Task Information
Show me how you would get started planning your trip.
Max time on task: 10 minutes
Task goals:
• Learn how the user navigates to the site
• Learn about how the user begins his/her search for information
Protocol Notes:
• Start the participant out with a clear desktop, i.e., no windows open, not even a browser.
• This is an exploratory task to see how they get started; once the user starts down a particular path, the facilitator will guide him/her to an appropriate follow-up task.
Observers should take note of:
• How does the user get to the Disney site? (e.g., Google search, typing in direct URL, etc.)
• How does the user navigate to the DisneyParks.com site?
• What does the user click on once s/he arrives at the DisneyParks.com site?
• What information does s/he pay attention to first? What information does s/he start looking
for?
• Does the user browse for information or search?
• Does the user notice the “Vacation Planning” option on the DisneyParks.com home page?
29. Choosing Participants
• Identify user profiles of greatest interest
• Determine which user characteristics are most
likely to impact results
• Review criteria with team to ensure buy-in
30. Deciding on Number of Participants
• It depends on…
– Time
– Money
– Availability of participants that fit the profile
– Attitude and availability of key stakeholders
• Generally accepted practice = 6 – 12
participants
– Three to five participants per profile
– 80% of problems are identified with 4–5 participants
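The "80% with 4–5 participants" rule of thumb traces back to a problem-discovery model popularized by Nielsen and Landauer: the share of problems found by n participants is 1 − (1 − λ)^n, where λ is the average probability that a single participant uncovers a given problem (commonly estimated around 0.31). A minimal sketch of that model, assuming that λ value (the function name here is ours, not from the workshop):

```python
def proportion_found(n, discovery_rate=0.31):
    """Expected share of usability problems uncovered by n participants,
    assuming each participant independently reveals a given problem with
    probability `discovery_rate` (the model's lambda)."""
    return 1 - (1 - discovery_rate) ** n

# With lambda = 0.31, 4 participants find roughly 77% of problems
# and 5 participants roughly 84% -- the slide's "80% with 4-5" range.
for n in range(1, 9):
    print(f"{n} participant(s): {proportion_found(n):.0%} of problems found")
```

Note that λ varies widely across products and problem severities, so treat the curve as a planning heuristic, not a guarantee.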
31. Inviting Observers
Invite people to observe and take notes
• Team members
– Developers
– Designers
– Writers
– Quality Engineers
– Product managers
• Key stakeholders
– Clients
– Champions
Invite anyone who might benefit from attending.
32. DisneyParks.com Activity: Brainstorm Participants and Observers
In a group,
• Brainstorm (10 minutes):
– Participants
• Which user group(s) will the test focus on?
• Who will you target as possible participants?
– Observers – which stakeholders might you invite
to observe the tests?
• Report out (5 minutes):
33. DisneyParks.com Example: Participants and Observers
The project team
34. Planning Roadmap
Create a plan: “the what”
• Objectives
• Scenario
• Tasks
• Protocol
• Prototype
• Pilot test
Include the right people: “the who”
• Participants
• Observers
36. MODERATING AND RUNNING A TEST
37. Moderator Responsibilities
• Setting up the environment
• Preparing observers
• Caring for and briefing participants
• Conducting the test
38. Setting Up the Environment
• Prototype
• Paper prototype
• Computer
• Cameras
• Audio and video recorders
Myth: You need a state of the art usability lab to
run tests.
Reality: You can test a paper prototype anywhere!
39. Moderator Responsibilities
• Setting up the environment
• Preparing observers
• Caring for and briefing participants
• Conducting the test
40. Briefing Observers
• Set ground rules for etiquette
• Teach observers how to take notes
• Establish a protocol for observers’ questions during or after the study
– Have observers write notes on post-its and pass them to the moderator, or send instant messages during the test
– Allow observers to ask questions directly
42. Collecting Observational Data
Train observers to take objective notes:
• Distinguish direct quotes from observations
• Flag an observer’s own thoughts and ideas as such
• Record the task and context
• Indicate time on task, when relevant
• One observation per post-it
43. What’s a Good Observation?
Anything you find interesting, remarkable, or noteworthy.
• What you see the participant doing
• What you hear the participant saying
• What you notice the system doing in response
to the participant’s actions
44. Example Observation Notes
45. Moderator Responsibilities
• Setting up the environment
• Preparing observers
• Briefing participants
• Conducting the test
47. Overview
• Caring for the participant
• Informed consent and non-disclosure
agreements
• Using a test script
• Introducing the test
48. Caring for Participants
• Reduce anxiety and create a comfortable environment
• Offer beverages and restroom breaks, and ensure participants’ physical comfort
• Answer questions when possible
49. Informed Consent and Non-disclosure
Inform participants about:
• Informed consent
– Participant rights
– Concept of minimal risk
• Nondisclosure and confidentiality agreements
A sample is available in the STC Usability Toolkit:
http://www.stcsig.org/usability/resources/toolkit/toolkit.html
51. Benefits of Using a Test Script
• Each participant hears all critical briefing
information
• All participants get the same information
52. Introducing the Test
• Importance of their participation
• Expected length of test
• Test environment
• Procedure (tasks, questions for participant,
wrap up)
• Breaks, stopping
• Questions for the team
53. Think-Aloud Protocol
• Participants verbalize thoughts and
expectations
• Moderator reminds participants and prompts
for their thoughts
54. We’re Testing the Product!
• Assure the participant that you’re testing the
product, not their performance.
• Most participants blame themselves when
they encounter problems.
56. Moderator Responsibilities
• Setting up the environment
• Preparing observers
• Briefing participants
• Conducting the test
57. Conducting the Test
58. Overview
• Task handling
• Interacting with participants
• Providing help
• Concluding the test
• Troubleshooting
59. Task Handling: Pace
• Give tasks one at a time
• Let participants set pace
• Ask clarifying questions after each task (if
needed)
60. Task Handling: Assists and Abandons
Based on test goals, consider:
• How will you provide assistance when the
participant is stuck?
• When will you advise or allow the participant to
abandon a task or test?
In general, let the participant struggle.
• Don’t assist if it interferes with an important objective.
61. Task Handling: Task Completion
• The participant should indicate when they are done with each task.
– Ask them whether they are finished, if necessary.
– Build confirmation into the tasks.
• If they do not know, the moderator needs to determine whether the task is complete.
– If it is not, ask them to reread and/or redo the task.
62. Interacting with Participants: Handling Questions
• Asking unbiased questions:
– Neutral words
– Avoid adjectives and adverbs
– Use both extremes (e.g., “easy” and “difficult”)
• Answering participant questions:
– Don’t answer directly
– Turn questions around and ask the participant:
What do you think will happen? What do you expect
should happen?
63. Interacting with Participants: Interrupting
In general, avoid interrupting the participant.
But if you must:
• Remind them to think aloud
• Prompt them to clarify or reread tasks
• Probe for information about their actions or
expectations (“Say more about that…”)
64. Interacting with Participants: Using Active Listening Skills
• Use nonverbal communication and minimal
encouragers
• Allow for silence, struggle, and uncertainty
• Check expectations
• Validate your understanding
65. Providing Product Help
Provide minimal help, and only when the participant requests it.
Unless, of course, you are testing the Help itself…
66. Concluding the Test
• Clarifying questions from you or the observers
• Post-test interviews
• Wrap-up and closure
68. During a Real Test…
• Observers take notes on the content of the
test, not the moderator’s performance.
• Moderator’s primary responsibility is to
facilitate the test and ensure the comfort of
the participant – not take notes.
69. DisneyParks.com Activity: Moderation Demo and Debrief
• Task 1: Observe moderator’s techniques
– Use the checklist to take notes.
• Task 2: Capture observational data on post-its
– What you see the participant doing
– What you hear the participant saying
– What you notice the system doing in response to
the participant’s actions
71. Moderation Summary
• You don’t need a fancy usability lab to test!
• Brief observers on etiquette and note taking.
• Make participants feel comfortable, inform
them of their rights, and assure them you are
testing the product, not them!
• Let participants set the pace.
• Interrupt only minimally and be careful not to
bias the test!
72. ANALYZING AND SUMMARIZING RESULTS
73. Goals of Analysis and Summary
• Identify the problems that were uncovered in
the study
• Develop a common understanding
• Influence and improve the design
• Convince team to make changes
74. Analyzing Test Data
Analysis techniques vary depending on many
factors:
• Culture/organization
• Test objectives and requirements
• Your test goals
75. Our Recommendation: Affinity Diagramming
• Share and consolidate observation notes.
• Categorize notes into meaningful groups.
• Prioritize by voting on top issues.
• Record and communicate results to team and
stakeholders.
76. DisneyParks.com Activity: Affinitize the Results
• Post all notes together, uncategorized. Toss duplicates.
• Categorize notes into meaningful groups.
– When you see two that are related, take both notes and post them together.
– As soon as you put two notes together, title the category on a 3″ x 3″ post-it note.
• Prioritize the feedback (by category or individual note).
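The consolidate–categorize–prioritize steps above amount to simple bookkeeping, which a team can also do in a spreadsheet or script once the post-its are transcribed. A small illustrative sketch of that bookkeeping; the note texts, category names, and vote counts below are invented for the example and are not part of the workshop materials:

```python
from collections import defaultdict

# Hypothetical observation notes, each tagged with the category the
# team placed it under while affinitizing.
notes = [
    ("Couldn't find the Vacation Planning link", "Navigation"),
    ("Searched Google instead of typing the URL", "Navigation"),
    ("Unsure which ticket option to pick", "Ticketing"),
]

# Hypothetical dot-vote totals per category from the prioritization step.
votes = {"Navigation": 5, "Ticketing": 2}

# Group notes by category...
groups = defaultdict(list)
for text, category in notes:
    groups[category].append(text)

# ...then rank categories by vote count to surface the top issues.
ranked = sorted(groups, key=lambda c: votes.get(c, 0), reverse=True)
for category in ranked:
    print(f"{category}: {votes[category]} votes, {len(groups[category])} notes")
```

The physical post-it version of this exercise is usually faster and more collaborative; transcribing into code or a spreadsheet mainly helps when recording and communicating results to stakeholders afterward.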
77. Reporting Results
Reporting techniques vary depending on time,
culture, and resources.
Some examples include:
• Video clips
• Formal detailed report
• Excel spreadsheet
• Executive summary
• PowerPoint presentation
78. Analysis and Reporting Summary
• Goals of analysis:
– Develop a common understanding
– Identify the problems
– Convince team to make changes
– Improve a design
• Scale reports to fit.
• Try Affinity diagramming!
80. Just Test!
• Show your product to users and get feedback.
• Use these materials to refine your testing
methods.
• You don’t need to be an expert.
• You don’t need a state-of-the-art lab.
• You don’t need elaborate protocols and test
scripts.
• The more you test, the better you’ll get.
81. Primary Resources
• A Practical Guide to Usability Testing, by Joseph S. Dumas & Janice C. Redish
• Moderating Usability Tests, by Joseph S. Dumas & Beth Loring
82. Thank you!
Contact us with any questions…
• Debbie.cook@mathworks.com
• Michelle.erickson@mathworks.com
Editor’s notes
For MathWorks Training: Explain purpose. UPA workshop on Monday. Tried to condense a semester-long class, two books, and years of experience into a 3-hour tutorial. You’re helping us practice while, at the same time, learning the basics of running your own usability test. Goal is that this also becomes MathWorks’ official training for usability testing. INTRODUCE OURSELVES: my brief bio; Michelle – brief bio. Logistics: Feel free to ask questions as we go through the materials. Please set phones and other devices to silent. Print-outs – decide what to do… The content in the presentation is similar, but the order of some of the printed material may be slightly different. We won’t cover everything in the printed materials. We feel that the best way to learn is to DO, so we want to focus on practicing the most essential concepts.
Ask them what they hope to get out of the class. Set expectations. In this workshop, you will learn: how to plan for a diagnostic usability test, including which users to focus the test on and the fundamentals of writing tasks; how to moderate a usability test and interact with the participant; how to capture and synthesize the data using a low-cost, low-effort approach we leverage at MathWorks. Our goal is that you’ll know enough when you leave here today to get started testing.
Why is usability testing important? One reason is to avoid disasters like this one… In 1989, a plane bound for Belfast crashed along the side of the M1. 47 people were killed and 74 people were injured. What happened? One of the engines failed, and the pilot mistakenly turned off the wrong engine. Investigation of the incident revealed that one contributing factor to this horrible accident had to do with the design and usability of the gauges. BBC: http://news.bbc.co.uk/2/hi/uk_news/northern_ireland/8363595.stm and http://www.aaib.gov.uk/cms_resources.cfm?file=/4-1990%20G-OBME%20Append.pdf
Gauge design had an impact on the pilot’s ability to determine engine performance at a glance. Do you think this is something that could have been discovered during usability testing? It’s our goal that, at the end of this class, you’d be able to plan a test that might uncover this type of design flaw.
We’re going to start off the tutorial by explaining what we mean by the phrase “usability testing” and the two common types of usability tests. Then we’re going to go into detail about how you would plan and prepare for a test, moderate and conduct a test, and capture and analyze the results. The bulk of our time will be spent on the middle two sections. Again, because we believe you’ll learn a lot more by doing some of these things than by listening to us talk about them, we’re going to try to give you the opportunity to try your hand at many of them. We’ve tried to structure the class so that it’s as hands-on and interactive as we can be, given the timeframe and the amount of information we’re trying to cover. Let’s jump in.
Why only covering formative? Harder; more common; you learn more from the tests.
Encourage audience participation.
Better graphics. Now that we’re all on the same page about what a usability test is and what the benefits are, let’s talk about how you’d plan for one. When I say “plan”, I mean plan, design, and prepare for. The 5 P’s – Prior Planning Prevents Poor Performance.
Get rid of this roadmap and just use an overall roadmap? To run an effective test you’ll need all of these things, and you’ll need to ensure that the right people are involved. Today, for the sake of time, we’re going to focus on: objectives, scenario, tasks, protocol, participants, and observers.
Michelle is going to hand out this template, which we’ll refer to throughout this tutorial. In fact, we’re going to use it to plan a test of our own.
Pass out the Workshop Aid template. Because we think working through an example is a better way to learn than listening to us talk at you for 3 hours, we’ve selected a Web site to use as a case study throughout the workshop. Our goal is to plan a test using a case study: DisneyParks.com. Normally, you’d probably be testing a product you’re familiar with; at the very least, you’d likely have had a chance to look at it briefly and acquaint yourself with it. For the purposes of today’s workshop, we’re going to rely primarily on instinct and assumptions, because we also want some of you to act as participants. CASE STUDY SETUP: We’re going to pretend that we work for the Walt Disney Company. The company has noticed an increase in the number of telephone calls they’ve received from people trying to plan vacations. They want to reduce expenses associated with this service by enabling people to successfully plan a vacation on the Web site, without having to call the support line for additional assistance. You’re working on the Web site design team. We’ll work together to start filling out a usability test plan for the DisneyParks.com site.
One of the first things you want to think about are test goals. Why are you running the test? What are you trying to learn from having users work with the product? Do you have questions about how users will react to certain aspects of the design? Have previous tests or heuristic reviews revealed potential flaws in the design that you want to explore? Do you simply want to understand how users are really using the product? The purpose and focus of the test will likely be influenced by where you are in the development cycle, and whether it’s a new product or an existing product. Think about user groups initially – what types of people use this site? Are there groups of users you think may experience difficulty, or whose workflow you want to better understand?
Who: How do you decide which user(s) to test? You won’t be able to test all user profiles. Think about the different users of your product and the characteristics that differentiate them or impact how they interact with the product. Determine which user characteristics are most likely to impact test performance, and focus your plan and your recruiting efforts on people with those characteristics. Example: smart phones – do you think teenagers use smart phones differently than their parents or grandparents do? What types of things might differ? Do you think people who travel a lot might use smart phones differently than people who don’t travel as much? People with young children? Why is it important to review with the team? You don’t want them to write off data or downplay feedback because the participant was “atypical”, “not representative”, or the “wrong profile”. What characteristics might you consider for products you’re interested in testing? Demographics? Age, income level, computer experience (uses the computer for internet or e-mail vs. uses the computer every day for work), mobile usage?
The company has noticed an increase in the number of telephone support calls they’ve received from people trying to plan vacations. They want to reduce expenses associated with this service by enabling people to successfully plan a vacation on the Web site, without having to call the support line for additional assistance. You’re working on the Web site design team. Target time: 10 minutes. Large group activity: write ideas on a whiteboard or large post-it easel paper. Show the site while people brainstorm. Sometimes the goals of a test are impacted by whether or not you think users are successful with their goals while using the product.
SIMPLIFY? LARGER FONT? Here are some example test objectives we came up with. Are there similarities? Differences?
Consider adding pictures…
I actually gave you a scenario to set the stage for our tasks today: “We work for the Walt Disney Company. The company has noticed an increase in the number of telephone calls they’ve received from people trying to plan vacations. They want to reduce expenses associated with this service by enabling people to successfully plan a vacation on the Web site, without having to call the support line for additional assistance.” It gave you all a frame of reference for the activities that we’re doing / going to do. You’ll want to do the same for the people you’ll test. EXAMPLE OF USER’S LANGUAGE VS. SYSTEM LANGUAGE? Get the class involved here – do a pop quiz!
Scenarios and tasks are the two parts of your test that users will actually see. A task describes what the user needs to do, not how they should do it (not step-by-step instructions). Simple example: for a cell phone, some sample tasks would be: make a call to…; answer a call; add your best friend’s contact information to your list of frequently called numbers. Include the relevant details participants will need to complete the tasks; for example, for the “make a call” task, you’ll want to provide details about the person or phone number to call. Tasks should support the goals of your test and be realistic. Probe potential usability issues – major and minor (e.g., touch screens for portable devices – environmental considerations like glare or gloves). Speak the user’s language, not the system’s. Independent tasks help preserve the participant’s self-esteem – you don’t want the participant to feel like they’ve failed. Example of dependent tasks: configuring the phone, then making a call. There may be exceptions to the independent-tasks rule.
ANY OR ALL OF THESE! Brainstorm tasks, then refine, scope, and finalize. Tips for brainstorming – think about: new tasks (things the user has to do now that they didn’t have to do before); common user tasks (frequently performed); critical user tasks; tasks where users have or might run into issues with the design; environmental considerations, types of devices, and interaction behaviors. In the Disney case study… who might be a great source of information about where people have problems? The people providing phone support.
Have small groups break out to write three tasks. Target 15 minutes to brainstorm; report out 5–10 minutes. We give feedback while they are working on tasks. Catch language – the user’s, not “Disney-speak”. TELL CLASS TO USE “OUR” USERS – new slide or handout with those users.
When showing them the example tasks, explain that the first task is an example of a more exploratory task. What types of things were we careful of…
CONSIDER “IMPOSSIBLE” – controversial? After you’ve brainstormed the tasks, refine, scope, and finalize them. You may combine or eliminate tasks. Typically you do this based on: priority – how important is the task relative to the others and the test goals; relevance – how well does the task help achieve the test goals, i.e., help you learn what you’re trying to learn; prototype capabilities – can users be successful at this task given the current prototype (we haven’t really talked about prototypes… a prototype can be anything); and time. NEVER GIVE USERS A TASK THAT THEY COULDN’T POSSIBLY SUCCEED AT – it is very important that you don’t intentionally set the user up for failure.
All I mean by protocol is instructions for how you’re going to handle certain situations. Observers are the members of your development team that you’re going to invite to the test (more on that later). Always start them off with an easy task: the first task is about setting them at ease and letting them get comfortable with the concept of a usability test. Sometimes you can start with an exploratory task, which is easy because there are no wrong answers. Examples of a curve ball or land mine and strategies for dealing with them: part of the prototype isn’t there; they ask for documentation / help – what will you do?; they go down the wrong path – how long will you let them explore? Example from a GUIDE test where we wanted to get feedback on a particular new feature: the protocol stated that if users hadn’t discovered the feature by the third task, we’d lead them to it, since we’d already learned it wasn’t discoverable but wanted to learn whether they thought the feature added value.
Your goal – what you hope to learn from the task. Correct answer – so you know the path to success and when the user has completed the task. Are there any things your observers should be on the lookout for? For example: Where did users look for the feature? Did they go to the right menu or toolbar button?
MAY NOT INCLUDE. Have them brainstorm first? CONSIDER ANIMATING THIS. CUT OFF SOME OF THE TEXT – HARD TO SEE. You don’t always have to do this, but the more prepared you are, the better the test will go.
This is really about who you will recruit. For our exercises, we focused on vacation planners. Who: How do you decide which user(s) to test? You won’t be able to test all user profiles. Determine which user characteristics are most likely to impact test performance, and focus your plan and your recruiting efforts on people with those characteristics. What characteristics might you consider for products you’re interested in testing? Demographics? Age, income level, computer experience (uses the computer for internet or e-mail vs. uses the computer every day for work), mobile usage? Why is it important to review with the team? You don’t want them to write off data or downplay feedback because the participant was “atypical”, “not representative”, or the “wrong profile”.
KEY TAKEAWAY – ACKNOWLEDGE SAMPLE OF CONVENIENCE. Obviously testing takes time; the more people you test, the longer it can take. Often, teams have a specific window for testing. Money: participants are usually paid for their time and effort (think about this as part of your testing budget). How will they get paid, and how much? Cash, gift cards, product. I’m not a tax expert… there are tax implications for participants, so you want to think about that. More participants = more money; special expertise = more money (doctors, finance experts). Development teams: if stakeholders are participating in the tests, time they spend testing is time away from other aspects of their jobs. Availability of participants: sometimes it’s difficult to find people to participate due to desired user characteristics, especially if specific expertise is needed, or due to a paper prototype (which limits you to the local area). Acknowledge the convenience sample – we often take what we can get. Attitude and availability of key stakeholders: How open is the team to feedback? How likely are they to make changes? (If not, why waste your time testing?) If observing, did they “plan” for this in their estimations? “Some data is better than no data.”
DON’T COVER. Participant background / recruiting screener based on the profile / characteristics of interest. Leverage leads: places where people do similar tasks now; people you’ve talked to in the past. Benefits – you need to be prepared to explain the value participants get from testing: the opportunity to provide early feedback that could influence design / direction; advance exposure to a new feature or product; the chance to interact with developers of a product they might be using (be careful here…). Recruiting participants: Internal – customer databases, leads from customer-facing employees, a usability database. External – service providers like Alpha Buzz, Usability Works, and the ones in the Appendix; advertising on your own website; social networking sites (e.g., craigslist, professional organization sites, or newsgroups).
Tie in the importance of taking notes for the analysis method we’ll discuss later. Mention that the number of observers may be influenced by test location (user site – fewer people; lab – more). If you’re running a Wizard of Oz test where someone has to act as the computer, they’re obviously a required attendee! If you’re new, you may want to get your feet wet before inviting tons of people. You may be constrained by space limitations; if you’re running a test in the field, only bring essential personnel. If you have your own lab, we highly recommend you invite anyone who might benefit from watching, or whom you might want to influence.
Essentially, think about the questions you’d want to ask on your screener to decide which users are appropriate for your test.
• Have small groups break out to write 3 tasks.
• Target 15 minutes to brainstorm; report out 5–10 minutes.
• We give feedback while they are working on tasks.
• Catch language – users’, not “Disney-speak”.
Label the picture “The Team”
Hand out the completed version of the DisneyParks.com test plan.
Wrap up and summarize.
• Touch on the idea that you don’t need a fully functioning prototype and a tricked-out usability lab to test. Just test.
• Pilot test – a practice run to work out kinks.
• We didn’t cover the additional two sections; filling them out can be helpful. We just don’t have time.
May be cut
Prototype: can the prototype support the tasks? Do you need to modify tasks?
Deciding what to test – fidelity of the prototype; which features or aspects of the application or product to test.
What level of fidelity prototype do I need for an effective usability test?
• Paper prototypes
• Software prototypes, or hybrid paper and software – Balsamiq mocks and click-throughs
• Interactive – Balsamiq, Axure, PowerPoint, Excel
• Throwaway prototype – a semi-functional product built just to facilitate testing (depends on time and money)
• Fully functioning product – you can test with fully functional, released software. Great way to learn about enhancement opportunities.
Be creative. Examples:
• We work on a command-line interface – we have used an IM client to “simulate” the experience of typing a command at the prompt (Wizard of Oz).
• We’ve paired interactive prototypes with paper.
DON’T COVER
DON’T COVER
Methods and how they differ, when to do each, pros and cons.
Michelle 20-
Hello, thanks Debbie!!
Tell whale joke?
VIDEO HERE – show bad video. What did you see? What could have been better?
• Setting up lab equipment for viewing and recording – you may delegate this to a lab tech or AV person
• Briefing and setting ground rules for observers; instructing them on how to take notes and ask questions
• Greeting and taking care of the participant – making the participant feel comfortable, welcome, and appreciated
• Participant briefing – setting expectations and explaining ground rules and legal issues
• All aspects of running the test
Whether you are testing in a lab or at a person’s site, you will be responsible for making sure everything is working and set up for the test.
Based on the practices at your company, you may be recording the test; you will likely be responsible for setting up all recording and viewing equipment.
The type of prototype and the location will affect the equipment needed.
• Setting up lab equipment for viewing and recording – you may delegate this to a lab tech or AV person
• Briefing and setting ground rules for observers; instructing them on how to take notes and ask questions
• Greeting and taking care of the participant – making the participant feel comfortable, welcome, and appreciated
• Participant briefing – setting expectations and explaining ground rules and legal issues
• All aspects of running the test
Reinforce rules for observers, such as no sidebars or participant interruptions; also keep the volume down, even in observation rooms. Suggest that observers sit in a separate room or at a different table if possible.
Some people allow observers to pass questions to you to clarify any confusion about the test.
• Agree on a method for flagging questions.
• Safest: have observers write notes on post-its and pass them to the moderator.
• But if you trust your observers not to bias the test and to interact with the participant appropriately, then allow observers to ask questions directly.
Add pic of our observers’ ground-rules form from the UX lab!
Indicate if it’s a solution or design idea for a problem they observed – how to include this in the slide??
• We at MathWorks find it extremely valuable to have observers take notes. This ties into the low-cost method for analyzing test data I’ll talk about later.
• Showing an observer idea rather than an observation – lightbulb, i?
• Note taking – how many notes should the moderator take? There are a variety of methods (Morae, …); here’s where you can learn more. Many different techniques, but our focus is using post-its.
• Especially important for the low-cost analysis method we will discuss later.
• Observations should include things you see the participant doing.
What to write down – not EVERYTHING you see them doing… anything surprising, relevant to the design, problems, expectations, decisions.
Show pics of post-its with notes on them here.
• Setting up lab equipment for viewing and recording – you may delegate this to a lab tech or AV person
• Briefing and setting ground rules for observers; instructing them on how to take notes and ask questions
• Greeting and taking care of the participant – making the participant feel comfortable, welcome, and appreciated
• Participant briefing – setting expectations and explaining ground rules and legal issues
• All aspects of running the test
Usability studies can be stressful – reduce anxiety and put participants at ease.
• Anticipate that participants may show up early – have a place for them to wait and get comfortable.
• Offer restrooms and beverages, engage in neutral conversation, practice think-aloud.
• Answer any questions they may have.
Be sure to consider these issues and discuss them with your company.
RESEARCH: Nondisclosure vs. confidentiality
• Must protect the legal rights of the organization and the participant.
• Must inform your participants of the following items and have them sign the proper legal docs in your presence.
• Use boilerplate forms, or create a form and have your legal dept. review it – statement of rights.
• Ensure participants have read, understood, and signed the forms in your presence.
• Follow proper procedures to care for participants.
Participants’ rights – withdraw at any time; be informed what the test is about and what the data will be used for.
Minimal risk – the probability and magnitude of harm or discomfort anticipated in the test are not greater than those ordinarily encountered in daily life or during routine physical or psychological exams or tests.
Informed consent – key to protecting participant rights:
• Information – procedures, purpose, risks, opportunity to ask questions, take a break, or quit; the importance of the test and their feedback.
• Comprehension – they completely and clearly understand the test procedures and purpose.
• Voluntariness – not coerced or unduly influenced.
Forms to have signed:
• Nondisclosure and confidentiality – the participant cannot share with anyone the details of the test or the product evaluated.
• Waivers – permission to use recordings, feedback, and comments gained during the test, but you will not record or use their name.
MAYBE NOT SHOW
Use a script and checklist to ensure equal prep for all participants – this could impact results!!
• Refer to your participant briefing and training plan.
• Give the same info to all participants to minimize affecting different outcomes.
• Ensure that you cover all important aspects (legal issues, procedures, etc.) before each test.
• If someone else needs to facilitate for you – unlikely event – you can be sure they received consistent info.
A note on being organized:
• Easiest to make it a checklist.
• Label all test materials – keep separate folders for each participant, labeled with participant number and test name.
• Have separate observer notes.
• Staying organized will help maintain test quality.
Make a handout – maybe show a screenshot.
Show the general script here.
Point to the Briefing Script part and explain it’s often on a checklist with other moderator setup tasks.
Introduce the environment:
• Where you will each sit.
• Introduce anyone else who will be in the room with you and explain their role.
• Test environment (test room, computer, recording equipment, observation room).
• Explain why we record – how it helps us analyze product and usability issues, and who will see it.
• Questions – they can ask as many questions as they like, but we may not answer them at all, or maybe only some at the end of the test. Still encourage them to ask, because it helps us understand their thought process and areas of possible confusion.
ROADMAP OUTLINE
VISUAL – PERSON WITH THOUGHT BUBBLE AND THEN TALK BUBBLE SAYING THE SAME THING!
DEMO the protocol (Michelle and Debbie live demo, if time permits) or give an example.
Think-aloud protocol:
• Ask users to verbalize their thoughts and questions as much as possible. What are they thinking? What do they expect will happen? What did they think when the system did something? What are they looking for? What are they planning to do? Where are they clicking or reading on screen?
• Doesn’t come naturally to most people; we will provide prompts as the test goes along.
• Advantages – lots of insight into users’ workflow, mental model, expectations, and goals.
• Disadvantages – slower performance and speed through tasks; unnatural.
• Demonstrate using a stapler or some other object (bring the object if you plan to do a demo).
• Ask if they have any questions about think-aloud.
• There are differing viewpoints on whether think-aloud affects performance during the test.
An alternative to think-aloud is self-reporting after the tasks or the entire test:
• Pros: more natural task work, less interruption of the user’s workflow through tasks, less talking.
• Cons: more time consuming.
Also cover in the briefing: fidelity and robustness of the prototype, interaction guidelines, no wrong answers, and questions – they can ask as many as they like, but we may not answer them at all, or maybe only some at the end. Still encourage them to ask, because it helps us understand their thought process and areas of possible confusion.
ROADMAP/OUTLINE
Very important to state this up front and remind them of it!
Problems you may encounter are reflective of the design and the prototype, not your abilities or performance.
10–15 minutes
• We can pull up a volunteer to act as participant and have Michelle brief them.
• Fix animation to blow out the briefing piece.
• REVISE CHECKLIST!! Add a “Before the Test” header on the checklist.
• Pass out the Disney Usability Test Checklist. Take a few moments to read through it and start to memorize the key points.
• Can we have a volunteer from the audience to role-play the study participant? I will demonstrate a participant briefing.
• Your briefing does not have to follow the script verbatim, just cover the important parts. Also try not to read from the script directly – make sure you maintain rapport and eye contact with your participant.
• After the role play, let’s debrief. What did you notice? Did they forget any parts? Which parts do you feel are most important?
• Handling lab equipment for viewing and recording – you may delegate this to a lab tech or AV person
• Briefing and setting ground rules for observers
• Greeting and taking care of the participant – making the participant feel comfortable, welcome, and appreciated
• Participant briefing – setting expectations and explaining ground rules and legal issues
• All aspects of running the test
CONSIDER ROLLING THIS INTO TASK PACE
• Give tasks one at a time – have each task on a single page (one task per page). Start with the scenario.
• The user does not get overwhelmed seeing a stack of 12 tasks – and you may not get to all of them.
• Some people don’t number tasks so the participant won’t know if a task was skipped; this also makes it easier to remove, exclude, or reorder tasks (watch out for dependent tasks).
• Let the participant announce when they have finished each task. Ask if they would like to continue, and give another task.
• If providing observer notes on the task sheets, make sure you have a copy without them for participants.
DEFINE ASSIST AND ABANDON
• Struggle: if you think participants will struggle in a particular area, discuss assists ahead of time with the team – how long to let them struggle, important objectives, etc.
• Plan in advance. Based on goals you set ahead of time, decide when to assist or abandon.
• An assist means the participant failed the task.
CONSIDER using “When a participant gets stuck.”
MAY NOT PRESENT
Who decides when/whether a participant has completed a task?
• Build confirmation into the task – “What tickets would you buy? Write down your answer here…”
• The moderator ultimately decides when a task is complete, but the participant should verbalize when they are done with a task. This gives us critical info about their understanding.
• When they do not know, or say so, you need to determine whether to move on or have them go back and redo the task. Or you can ask a question to help them determine whether they are done, or ask them to keep trying.
• Be consistent across all participants and tests.
Ask the audience why it’s important to avoid bias??
Add animation. Show a good example and a bad example on the slide:
• “What did you expect?”
• “How awesome was that?”
NEW SLIDE?
How should you act around participants? Funny, formal, taciturn, serious? Positive? Kind, respectful, and attentive – and BE YOURSELF.
Exercise caution when answering and asking questions, or when giving assists or hints.
• Asking questions: use neutral words; avoid adjectives and adverbs. If you do use them, use both extremes – easy and difficult, fast and slow.
• Responding to questions: in general, don’t answer questions directly; that would be giving them an assist. Turn the question around and ask them: “What do you think it should do?” “What do you think it will return?”
WHAT IF THE PARTICIPANT GOES OFF TASK??
Let them do the tasks at their own pace and in their own way. But you may need to:
• Remind them to think aloud.
• Prompt them to reread the task, or to read it more carefully.
• Probe – “Would you have used online help? What would you have searched for?”
• Ask – “What did you just do? Why did you do that? What were you thinking? What were you expecting?”
DON’T PRESENT
Active listening skills
• Nonverbal communication, minimal encouragers, validation, …
• Allowing for silence, struggle, and uncertainty
• Checking expectations
DON’T PRESENT
Include in demo?
• Participants should initiate requests for help – they indicate they’d search or bring up help.
• Give the minimal amount of help – no extra info, especially if it relates to a later task.
• Define ahead of time and agree with your team: does help = task failure?
• If you are testing help, then that’s another story.
Post-test questions – invite observers to ask questions, or to pass you questions for the participant, if you have time.
Post-test interview: anything else to say about what they’ve seen? Ask for overall impressions; Likert scales are often used here.
Wrap-up and closure: clear up the user’s confusion and answer questions. Make sure the participant leaves feeling OK – not totally confused.
Thank them and give them your email or business card in case they want to give you additional feedback as they use your product.
Do NOT COVER
In case of task failure:
• Explain that they did not fail; the system/product failed.
• Guidelines for determining when a task has failed – did they time out? Dumas has some guidelines: if the estimated time is under 10 minutes, give 3x the allotted time before calling it; if the estimated time is over 10 minutes, give 2x the allotted time.
• Error state or wrong results: if you have to reset something on the screen or prototype, have the participant look away first…
Show video HERE
HALF THE ROOM OBSERVES MODERATION AND HALF OBSERVE THE TASKS
20 minutes
• Pass out the moderation checklist, markers, and post-its for those observing tasks.
• Also, Debbie begins to set up a laptop that she plugs in and then readies to switch the display.
• This is where they observe moderating and take notes related to the techniques of moderating.
• Also ask them to take notes on post-its. Write down anything you notice about the user doing the tasks: Are they struggling? Where? How? What are they saying? Did they accomplish the tasks?
• Show an example of some post-it notes next.
It varies across moderators how many notes they take during a test. The most important thing is that you maintain a good rapport with the participant. Taking notes can be very distracting and disruptive to the participant, so until and unless you learn to do it smoothly, it’s best left to the observers.
When I first started out, I took extensive notes because I was afraid we’d miss details. But as I tested more, I started to trust that my observers would capture the important details. If observers and note takers are in the same room as you, you can look around to see if observers are catching a particular note. You can also come to learn how your developers take notes, and know what types of info they’ll likely capture and what they never seem to notice.
• One observation per post-it.
• Indicate direct quotes, and indicate if it is an observer thought or idea.
• Indicate questions to ask the moderator/participant.
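The note-taking conventions above (one observation per post-it; flag quotes, observer ideas, and questions) map naturally onto a small record type if you later digitize the notes for analysis. A minimal sketch; the class and field names are hypothetical, not part of the workshop materials:

```python
from dataclasses import dataclass

@dataclass
class PostItNote:
    text: str                  # one observation per note
    participant: int           # participant number (notes stay traceable)
    is_quote: bool = False     # direct quote from the participant
    is_idea: bool = False      # observer's own thought or design idea
    is_question: bool = False  # question to ask the moderator/participant

# A plain observation and a flagged direct quote from the same session
note = PostItNote("Clicked Back twice looking for ticket prices", participant=3)
quote = PostItNote('"Where do I even start?"', participant=3, is_quote=True)
```

Keeping each record to a single observation mirrors the one-per-post-it rule and makes the notes easy to sort into groups later.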
Task 1 – 8 minutes
What did you notice me doing? What did I do well?
For the second part of the video, take notes on both my moderation technique and the test itself.
Best practices to point out & discuss:
• Assists
• Redirecting the participant back to the task
• Answering a question with a question
• Asking about the participant’s expectations
• Prompting think-aloud
• Helping the participant know when they are done with a task
• Debbie will ask some questions and Michelle will reflect them back as questions.
• Michelle to provide an assist (if we cover that slide).
• Michelle to ask about expectations Debbie has.
• Debbie to stop thinking aloud during a part of the task; Michelle has to prompt.
• Michelle to use active listening skills – minimal encouragers and checking a perception.
• Michelle to ask Debbie a question – using both extremes.
What we just covered – key takeaways.
See resources later.
Add more detail for each section – i.e., conducting the test.
Describes how to deal with the data we collect during usability testing, and what kind of analysis is needed to make recommendations, prioritize changes, and deliver effective reports.
• Work with your team or clients to determine how best to decide and communicate recommendations and priorities.
• Agree on criteria – number of issues reported on a specific feature, severity of issues found (agree on a severity scale).
• Analyzing results collaboratively is what we recommend to help influence the design. Otherwise you analyze by yourself, prepare a report, then pitch it to the team.
• Getting team members to observe and take notes during usability sessions is incomparably valuable.
• Taking notes on post-it notes is a simple and effective tool for data collection.
• Affinity diagramming is a flexible method to process data quickly.
• Quick and simple data analysis may be all you ever need to do.
We have a whole training on this method, but for today we’d like to give you a few minutes to try it out together with the notes you took earlier.
• Powerful when done as a group – observers can share their perceptions.
• Small-group or large-group affinity session. Depending on time, break into small groups and have them affinitize some of the notes they took during the second live demo (15–20 minutes).
• Categorize notes into meaningful groups.
• Find two related notes and go over to another wall.
• You may not criticize anyone else’s placement of a note or category name; you MAY quietly move a note to the “correct” location or rename a category.
• Break up large categories (> 7 items).
There is a whole workshop on this method alone, but I encourage you to try it out.
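The mechanics above (group related notes, then break up any category with more than 7 items) can be sketched as a toy script. Real affinity diagramming is done by people at a wall, so the keyword-based grouping here is purely illustrative, and all names are hypothetical:

```python
from collections import defaultdict

MAX_CATEGORY_SIZE = 7  # break up large categories (> 7 items)

def affinitize(notes, keywords):
    """Toy affinity grouping: bucket each note under the first matching keyword.

    notes: list of observation strings (one per post-it).
    keywords: dict mapping category name -> list of trigger words.
    Notes matching no keyword land in "uncategorized".
    """
    groups = defaultdict(list)
    for note in notes:
        lowered = note.lower()
        for category, words in keywords.items():
            if any(w in lowered for w in words):
                groups[category].append(note)
                break
        else:
            groups["uncategorized"].append(note)
    return dict(groups)

def needs_split(groups):
    """Return category names that exceed the size limit and should be broken up."""
    return [c for c, items in groups.items() if len(items) > MAX_CATEGORY_SIZE]

# Example: three post-its, one keyword theme
groups = affinitize(
    ["Couldn't find the search box",
     "Search results were confusing",
     "Liked the colors"],
    {"search": ["search"]},
)
```

The `needs_split` check encodes the "> 7 items" rule as a reminder to subdivide, which on a real wall you would do by pulling a large cluster apart into finer themes.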
Work with your teams to figure out the best techniques for reporting results.
• What you do will depend on the needs and resources of your organization and project – the type and size of the project, your company, and the time available.
• Sometimes just a small summary report is sufficient.
• Sometimes videos and an extensive analysis of every observation, with times on task, are done.
may not need the section break.
Workshop conclusion message here.
Remember – just do it. The more you do it, the better you get.
There is a lot of info you can use to hone your skills, but just start testing…
There is so much we didn’t cover – here are more resources for you to explore.