How to ask better questions and how to assess UX using surveys.
This workshop at UXLX 2014 in Lisbon was a deep dive into two important topics in survey design for user research.
We used the four-step model of how people answer questions to work on better questions, then focused on two special uses of questionnaires in user research: post-test assessment of satisfaction, and gathering information from users for a redesign.
Thanks to all the attendees for making this workshop a lot of fun.
Caroline Jarrett @cjforms
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
1. A deep dive into questions
Workshop at UxLx 2014 led by Caroline Jarrett
How to ask better questions, and how to assess user experience using surveys
7. • I’ll hand out an invitation I received recently by email
• Work in pairs
• Decide whether it is a survey or something else
8. Agenda
Introductions
What is a survey?
How to ask better questions
The steps to answer a question
Improve step 1: read and understand
Improve step 2: find the answer
Improve step 3: judge the answer
Improve step 4: place the answer
Understand why people answer
Break
How to assess user experience using surveys
Wrap up
9. There are four steps to answer a question
1. Read and understand
2. Find an answer
3. Judge the answer
4. Place the answer
Adapted from Tourangeau, R., Rips, L. J. and Rasinski, K. A. (2000) “The psychology of survey response”
10. There are four steps to answer a question
Step | A good question …
1. Read and understand | is legible and makes sense
2. Find an answer | asks for answers that we know
3. Judge the answer | asks for answers we’re happy to reveal
4. Place the answer | offers appropriate spaces for the answers
Adapted from Tourangeau, R., Rips, L. J. and Rasinski, K. A. (2000) “The psychology of survey response”
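The four steps and their quality criteria can be jotted down as a quick review checklist for any draft question; a minimal sketch in Python (the `STEPS` structure and the wording of the prompts are illustrative, not from the workshop materials):

```python
# The four answering steps (Tourangeau, Rips and Rasinski) paired with the
# "a good question ..." criteria from the slide, as a review checklist
# to run against a draft survey question.
STEPS = [
    ("Read and understand", "is it legible, and does it make sense?"),
    ("Find an answer", "does it ask for answers that respondents know?"),
    ("Judge the answer", "does it ask for answers respondents are happy to reveal?"),
    ("Place the answer", "does it offer appropriate spaces for the answers?"),
]

def review_prompts():
    """Return one reviewer prompt per answering step."""
    return [f"{n}. {step}: {check}" for n, (step, check) in enumerate(STEPS, start=1)]

for prompt in review_prompts():
    print(prompt)
```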
12. Let’s review a question
• There is a question coming up on the next slide
• I will ask you to think about ONE of these four steps
1. Read and understand
2. Find the answer
3. Judge the answer
4. Place the answer
• Please think about any problems in that particular step
14. Agenda
16. Agenda
17. In your last five days at work, what percentage of your work time do you estimate that you spend using publicly-available online services (not including email, instant messaging and search) to do your work using a work computer or other device?
19. Agenda
22. Agenda
23. Test with users to make sure you offer the right answer options
26. Offer the right widget to collect the answer
Knowledge of what users want to tell you | How many answers? | Offer
We know all the answers that users are likely to give us | They only have one answer | Radio buttons
We know all the answers that users are likely to give us | They may have more than one | Check boxes
We’re not sure | | Text boxes
Allen Miller, S. J. and Jarrett, C. (2001) “Should I use a drop-down?” http://www.formsthatwork.com/files/Articles/dropdown.pdf
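The decision table above can be condensed into a tiny helper function; a minimal sketch, with hypothetical names (`choose_widget` and its parameters are mine, not from the slides):

```python
def choose_widget(know_all_answers, may_have_several):
    """Apply the slide's decision table for picking an answer widget.

    know_all_answers: we know all the answers users are likely to give us.
    may_have_several: a user may have more than one answer.
    """
    if not know_all_answers:
        return "text box"      # we can't enumerate the options, so let users type
    if may_have_several:
        return "check boxes"   # known options, possibly several apply
    return "radio buttons"     # known options, exactly one applies

print(choose_widget(know_all_answers=True, may_have_several=False))  # radio buttons
```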
29. Grids are a major cause of survey drop-out
Total incompletes across the ‘main’ section of the questionnaire (after the introduction stage):
Subject Matter 35%
Media Downloads 20%
Survey Length 20%
Large Grids 15%
Open Questions 5%
Other 5%
Source: Database of 3 million+ web surveys conducted by Lightspeed Research/Kantar
Quoted in Coombe, R., Jarrett, C. and Johnson, A. (2010) “Usability testing of market research surveys” ESRA Lausanne
30. But it’s the topic that matters most
31. Agenda
32. Response relies on effort, reward, and trust
[Diagram: trust, perceived effort, perceived reward]
Diagram from Jarrett, C. and Gaffney, G. (2008) “Forms that work: Designing web forms for usability”, inspired by Dillman, D.A. (2000) “Internet, Mail and Mixed Mode Surveys: The Tailored Design Method”
33. An interesting subject helps in all three areas
Shared interests inspire trust
Interesting topics take less effort
An interesting subject is intrinsically rewarding
35. Your answers to this survey are important for our work
“But what’s in it for me? And I’m really ready for a coffee.”
36. Agenda
37. Let’s start with a standard option: SUS
• The System Usability Scale (SUS) was created in 1986
• It has been shown to be valid and reliable
• You get a score between 0 and 100
• You can compare your SUS score with other systems
Brooke, J. (1996). SUS: a “quick and dirty” usability scale. In P. W. Jordan, B. Thomas, B. A. Weerdmeester and I. L. McClelland (eds), Usability Evaluation in Industry. London: Taylor and Francis.
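SUS scoring follows a fixed recipe: odd-numbered items are positively worded and contribute (rating - 1), even-numbered items are negatively worded and contribute (5 - rating), and the sum is multiplied by 2.5 to give 0-100. A minimal sketch in Python, assuming the standard ten items each rated on a 1-5 scale (the function name is mine):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 ratings."""
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 responses")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("each response must be on a 1-5 scale")
    total = 0
    for item, rating in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (rating - 1) if item % 2 == 1 else (5 - rating)
    return total * 2.5

print(sus_score([3] * 10))  # a neutral respondent (all 3s) scores 50.0
```

Note the result is a score, not a percentage: a 50 is not “half the users were satisfied”, it is simply the midpoint of the scale.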
38. Jeff Sauro (@measuringux) has done a lot of work with SUS
• Jeff provides tools for scoring SUS
• He has adapted it to websites
• “SUS scores are not percentages”
http://www.measuringusability.com/sus.php
39. There are other commercial products with wider concepts of UX
• SUPR-Q
– includes credibility and loyalty
– licensed product
– http://www.suprq.com
• WAMMI
– online service
– includes access to standard databases (extra for SUPR-Q)
– http://www.wammi.com
40. • Your task: find an explanation of the difference between the European Commission and the European Union
• Use this site: http://ec.europa.eu
• Decide whether you had a good or bad experience
41. • I will ask you to fill in SUS (original) or SUS (Sauro)
42. My journey into user experience started a long time ago, with usability
The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use
(ISO 9241-11:1998)
43. Working mostly in government, we were interested in effectiveness and efficiency
44. But what about user experience? What about satisfaction?
Picture credit: Flickr jek in the box
46. Satisfaction is a complex matter
Compared experience to what? | Resulting thoughts
(nothing) | Indifference
Expectations | Better / worse / different
Needs | Met / not met / mixture
Excellence (the ideal product) | Good / poor quality (or ‘good enough’)
Fairness | Treated equitably / inequitably
Events that might have been | Vindication / regret
Adapted from Oliver, R. L. (1996) and (2009) “Satisfaction: A Behavioral Perspective on the Consumer”
47. Example: bronze medal winners tend to be happier than silver medal winners
Nathan Twaddle, Olympic bronze medal winner in Beijing
Photo credit: peter.cipollone, Flickr
Matsumoto, D. and Willingham, B. (2006) “The thrill of victory and the agony of defeat: spontaneous expressions of medal winners of the 2004 Athens Olympic Games”
48. • The first question was about rating satisfaction
• What were they asking us to rate?
– Just a guess from what you recall
50. The challenge of UX and surveys: which bit to measure?
The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use
(ISO 9241-11:1998)
51. Some ideas about what we could measure
In the definition | GoDaddy customer support | GoDaddy as a provider of domain names
Product | This contact with help desk | Overall experience of moving a domain to GoDaddy
Users | What proportion of customers contact support | Demographics (example: type of job)
Goals | Reason for contacting help | Reason for looking at GoDaddy
Effectiveness | Whether support fixed the problem | Whether GoDaddy offers the right products
Efficiency | Whether it took a reasonable time | Whether the product is priced correctly
Satisfaction | Helpfulness of support person | Likely to purchase again / recommend
Context of use | Home/office; alone/helped | Business / personal
52. In the definition | Information to help the European Commission design a better website
Product |
Users |
Goals |
Effectiveness |
Efficiency |
Satisfaction |
Context of use |
53. • Write questions for each topic
• Then get another team to try your questionnaire