2. David Hamill
UX specialist. Can’t draw.
linkedin.com/in/davidhamill/
twitter.com/dav_hamill
medium.com/@david.hamill
3. 14 years in UX
We called it User-Centred Design
back then.
When I started out, the internet
looked like this.
I’m currently working for
Skyscanner.
4. My thoughts are my own
I try to make them my employer’s
thoughts too.
It never works as often as I’d like.
5. I’m going to talk about
1. The limitations of prototype testing
2. Why test your live product?
3. Observing your problem space
4. Jobs to be done interviews
5. Common interpretation errors
6. I’m going to assume…
you know what usability testing is.
8. It’s not very real
Prototype testing is very useful, but it isn’t
very realistic.
Some people test prototypes as their
only form of user research.
They shouldn’t.
9. What’s it good for?
• Spotting big problems with new
ideas early
• Checking orientation, interactions
and labels all make sense
• Iterating
10. Limitations
• It’s not a good validator of user value
• Observed behaviour is heavily constrained
• Findings have short-lived use
• It tests solutions rather than building
an understanding of problems
• Iteration often doesn’t happen
11. Do more than just test prototypes
Otherwise your understanding of users will be
limited. The project team will lack intuition.
13. Regular tests of live product
This is how you find out what’s wrong with it.
Observe target users while they perform your
product’s key tasks.
14. Benefits of testing live product
• Plenty of rope
• Findings have longevity
• Understanding based on realistic use
• Build a database of findings
https://airtable.com/universe/expShuhNMi0Oc0xpb/polaris-ux-nuggets
15. Limitations of testing live product
• Unrealistic in some ways
• It won’t always tell you why people use it
• Everyone prefers watching the new shiny
17. What’s this?
Watch how people behave when you give
them the scenario without any guidance on
what to use.
We now do this weekly at Skyscanner.
Everyone in the company is invited to watch.
18. Benefits
• Very realistic experience to observe
• Observe how people decide what to use
• Observe how multiple products combine
• Realistic session length on your product
• Better understanding of user habits
https://articles.uie.com/user_exposure_hours/
19. Limitations
• Sometimes they don’t use your product
• Colleagues can be less interested
• Fewer actionable issues
Best set up as one session a week/fortnight.
Share the session recording.
21. What’s JTBD?
Other than the latest product fad? Cognitive
interviews discussing the story behind people
hiring and firing your product. Try to discover
when and why they use alternatives to it.
https://www.youtube.com/watch?v=kGuSM3yUxik
22. Benefits
• Understand what job your product was
hired for
• Understand why people leave
• Understand when they won’t leave
• Understand your real competition
• Understand why people use alternatives
• Prioritising work should become easier
23. Limitations
• Framework often ill-applied to atomic levels
• Framework can be confusing/rigid
• Not a great spectator sport
• People use the terminology without doing
the research
My advice is to use it as much as it’s useful
and drop the bits you find too rigid.
24. 5. Common interpretation errors
The goal of research is not to observe users, but
to get better outcomes.
25. Lack of scepticism
Not being hungry to find issues with the
direction you’re heading in.
Warning signs of failure get overlooked
because nobody is eager to spot them.
26. Confirmation bias
The opposite problem.
Any comment or behaviour that supports
your assumptions is given disproportionate
weight in evaluation.
27. Ignoring observed habit
Observing the need and ignoring the
behaviour.
Dreaming up a new product or feature the user
will never seek out because it doesn’t fit their
workflow.
28. Distracted by the interesting
Boring problems with easy solutions often
ignored. Interesting problems with elaborate
risky solutions are given more attention.
Prioritising your oeuvre over your users.
29. Actions rather than goals
The user responds to your design because of
what they think is on offer. You watch this
without considering their overall goal.
You propose a faster horse when they need a
Hyperloop.
30. Filling in the blanks
The user mentions something they would do
later without specifying how they would go
about it. You invent the specifics and call it
user research.
31. Lost in Design Studio
User data goes into Design Studio/Sprint but
doesn’t emerge at the other end.
For lots of the reasons above. And also
because democracy is a flawed design
approach.