Don’t Make Me Think!
Dr. Andres Baravalle
Overview
• Introduction to usability evaluation
  methods
• Usability testing
• Don’t make me think!
• Lab activities




Usability
• "The usability of an interface is a measure
  of the effectiveness, efficiency and
  satisfaction with which specified users
  can achieve specified goals in a
  particular environment with that
  interface." (ISO 13407).




Effectiveness, efficiency and
satisfaction
• Effectiveness: measures whether the expected
  goals have been achieved, i.e. the accuracy and
  completeness with which users reach the
  specified goals
• Efficiency: measures the effort needed to
  achieve the user’s goal and whether it is
  proportionate to the result obtained
• Satisfaction: measures how pleasant a particular
  interface is to use and whether it is perceived as
  suitable for the desired goal.
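To make the three measures above concrete, here is a minimal worked sketch in Python. It is illustrative only and not taken from the slides: the function names, the sample figures and the 1-5 rating scale are assumptions, with effectiveness taken as the task completion rate, efficiency as goals achieved per unit of time, and satisfaction as a mean questionnaire score.

```python
# Illustrative sketch only: quantifying the three ISO measures for one task.
# All figures below are made up for the example.

def effectiveness(successes: int, attempts: int) -> float:
    """Share of users who achieved the specified goal (completion rate)."""
    return successes / attempts

def efficiency(successes: int, total_minutes: float) -> float:
    """Goals achieved per unit of effort, here approximated by time spent."""
    return successes / total_minutes

def satisfaction(ratings: list[float]) -> float:
    """Mean post-task questionnaire rating, e.g. on a 1-5 scale."""
    return sum(ratings) / len(ratings)

if __name__ == "__main__":
    # Hypothetical task: 8 of 10 users succeeded, spending 42 minutes in total.
    print(f"Effectiveness: {effectiveness(8, 10):.0%}")
    print(f"Efficiency: {efficiency(8, 42.0):.2f} goals/minute")
    print(f"Satisfaction: {satisfaction([4, 5, 3, 4, 4, 5, 2, 4, 4, 3]):.1f} out of 5")
```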
Web Usability: did you know
that…
• People succeed 66% of the time when
  working on “single site” activities and 60% of
  the time when they have to browse across
  the web for information (Nielsen and
  Loranger, 2006)




Web Usability: did you know
that… (2)
• Experienced users spend about 25
  seconds on a homepage and 45 on an
  interior page (35 and 60 for inexperienced
  users)
• Only 23% of users scroll on their first visit
  to a homepage
  – The number decreases on subsequent visits
  – The average scroll on a first visit is 0.8 of a
    screen
Web Usability: did you know
that… (3)
• 88% of users go to search engines to find
  information
• Font face and size: use different font faces
  for print and for screen
  – Choose the font size depending on the target
    audience




Types of usability evaluation
• Three main categories of evaluation methods
  (Sharp, Rogers and Preece, 2006):
  – Controlled settings involving users, e.g. usability
    testing & experiments in laboratories and living labs
  – Natural settings involving users, e.g. field studies
    (usability inquiry) to see how the product is used in
    the real world
  – Any settings not involving users, e.g. consultants’
    critiques (usability inspections) and analytical
    evaluations


Types of evaluation (2)
• In practice, your evaluation protocol will
  include a set of methods to be used in a
  complementary way




Evaluation methods
 Method           Controlled   Natural    Without
                  settings     settings   users
 Observing            x            x
 Asking users         x            x
 Asking experts                    x          x
 Testing              x
 Modeling                                     x
Controlled settings
• Controlled-settings methods evaluate an
  artefact by observing users interacting with
  it in a controlled environment (e.g. a lab)
  – The focus is on experiments




Natural settings
• Natural-settings methods focus, to varying
  degrees, on analysing an artefact as it is
  used in its natural environment
  – The focus is on observation




Without users
• This category includes all other methods, i.e.
  those not requiring direct user involvement
• Analytical evaluation methods are based
  on “dissecting” the interaction with an
  artefact
  – They don’t require involving users
  – E.g. usability inspections and predictive
    models


Usability testing




Usability testing
• Involves recording performance of typical users
  doing typical tasks
  –   Controlled settings
  –   Users are observed and timed
  –   Data is recorded on video & key presses are logged
  –   The data is used to calculate performance times, and
      to identify & explain errors
• User satisfaction is evaluated using
  questionnaires & interviews


Testing conditions
• Usability lab or other controlled space
• Emphasis on:
  – Selecting representative users
  – Developing representative tasks
• Small sample (5-10 users) typically selected
• Tasks usually last no longer than 30 minutes
• The test conditions should be the same for every
  participant


Some types of data
• Time to complete a task
• Time to complete a task after a specified time
  away from the product
• Number and type of errors per task
• Number of errors per unit of time
• Number of navigations to online help or manuals
• Number of users making a particular error
• Number of users completing task successfully


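As an illustration of how such measures might be pulled out of a test log, here is a minimal Python sketch; the record layout and all figures are assumptions made up for the example rather than a format used in the module.

```python
# Illustrative sketch only: summarising logged usability-test data.
# The record structure and numbers are hypothetical.
from collections import Counter
from statistics import mean

# One record per participant per task.
sessions = [
    {"user": "P1", "task": "checkout", "seconds": 95,
     "errors": ["wrong_tab"], "help_visits": 0, "completed": True},
    {"user": "P2", "task": "checkout", "seconds": 210,
     "errors": ["wrong_tab", "form_reset"], "help_visits": 2, "completed": False},
    {"user": "P3", "task": "checkout", "seconds": 120,
     "errors": [], "help_visits": 1, "completed": True},
]

completion_rate = mean(s["completed"] for s in sessions)
mean_time_successful = mean(s["seconds"] for s in sessions if s["completed"])
errors_per_minute = (sum(len(s["errors"]) for s in sessions)
                     / (sum(s["seconds"] for s in sessions) / 60))
help_navigations = sum(s["help_visits"] for s in sessions)
users_per_error = Counter(e for s in sessions for e in set(s["errors"]))

print(f"Users completing the task successfully: {completion_rate:.0%}")
print(f"Mean time to complete (successful users): {mean_time_successful:.0f} s")
print(f"Errors per minute of testing: {errors_per_minute:.2f}")
print(f"Navigations to online help: {help_navigations}")
print(f"Users making each error: {dict(users_per_error)}")
```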
How many participants is
enough for user testing?
• The number is a practical issue
• Depends on:
  – Schedule for testing
  – Availability of participants
  – Cost of running tests
• Typically 5-10 participants
  – Some experts argue that testing should
    continue with additional users until no new
    insights are gained
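One model often used to justify such small samples, added here as background rather than taken from the slides, is Nielsen and Landauer's problem-discovery curve: if a single participant uncovers a fraction λ of the usability problems, n participants are expected to uncover 1 - (1 - λ)^n of them. A minimal Python sketch with the commonly quoted λ ≈ 0.31:

```python
# Illustrative sketch of the Nielsen-Landauer problem-discovery model
# (an assumption added here, not part of the original slides).

def problems_found(n_participants: int, lam: float = 0.31) -> float:
    """Expected share of usability problems found by n participants."""
    return 1 - (1 - lam) ** n_participants

if __name__ == "__main__":
    for n in (1, 3, 5, 10, 15):
        print(f"{n:2d} participants -> ~{problems_found(n):.0%} of problems")
```

The sharply diminishing returns after the first handful of participants are what lie behind the 5-10 participant rule of thumb and the "stop when no new insights appear" criterion.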
Usability testing methods
• The next slides cover some of the
  methods that are used in usability testing
• Some of those methods can be adapted
  and used for other types of usability
  evaluations
  – E.g. Thinking aloud could be adapted to be
    used in a usability inquiry too



Thinking aloud
• Thinking aloud consists of an interaction
  (scenario) during which the participants
  are asked to perform several tasks while
  freely expressing their thoughts, feelings
  and opinions
  – Co-discovery is a variation of the thinking
    aloud method with two users interacting co-
    operatively
  – It aims to reflect real-life situations in which
    users can ask other people for help
Question asking
• Another variation on the Thinking aloud
  method, in which the evaluator asks the
  user questions while s/he is performing
  tasks with the artefact under analysis




Remote testing
• Remote testing is used to evaluate an
  artefact remotely, by gathering
  quantitative (and in some cases
  qualitative) data about the user’s
  behaviour while performing tasks in a
  scenario
  – It is typically used for software interfaces



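As a purely illustrative sketch of the data-gathering side of remote testing (the slides do not name a specific tool), the interface under test could POST timestamped interaction events to a small collection endpoint; the endpoint below is built with Python's standard library, and the port, file name and event fields are all assumptions.

```python
# Illustrative sketch only: a minimal collector for remote-testing events.
# The tested interface POSTs JSON events (clicks, page views, task results),
# which are appended to a log file for later quantitative analysis.
import json
from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer

LOG_FILE = "remote_test_events.jsonl"  # hypothetical output file

class EventCollector(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length))
        event["received_at"] = datetime.now(timezone.utc).isoformat()
        with open(LOG_FILE, "a", encoding="utf-8") as log:
            log.write(json.dumps(event) + "\n")
        self.send_response(204)  # acknowledge without a response body
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), EventCollector).serve_forever()
```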
Don’t make me think!
Don’t Make Me Think
• The first law of Usability Engineering
  (according to Steve Krug) is...
• Don’t Make Me Think!




#1: Users don’t read web pages
• Users don’t read web pages – they just
  scan




#2: Users don’t make optimal
choices
• Looking for the optimal choice is in most
  cases a waste of resources
• It is typically not worth committing the
  resources needed for an optimal
  interface rather than a good one
  – People don’t look for perfect plans – they look
    for good enough plans
  – Are you really going to look for a second price
    when you find a book on Amazon at £3?
#3: Users have no
understanding of how things
work
• Nor should they need to, in many cases
  – Knowing the TCP/IP stack is not going to help
    you send an email
• Don’t design interfaces that require
  learning from users – most probably users
  are NOT going to learn how to use your
  interface

The trunk test
• Imagine you are blindfolded in the trunk of
  a car
• Driven around
• Dumped somewhere
  – Once you are out, you need to assess your
    situation




The trunk test (2)
• A usable web site will allow you to “survive” a
  trunk test
• On a usable web page you’ll always be able to
  answer these questions:
  –   What site is this?
  –   What page am I on?
  –   What are the main sections?
  –   What are my options?
  –   Where am I?
  –   How can I search?

The trunk test (3)
• You can use this approach by printing a
  set of pages and asking users to circle
  some or all of those areas
• You can compare users’ performance on
  different web pages to get an indicator of
  their usability




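One possible way to turn the circled printouts into a comparable indicator, sketched here under assumptions rather than prescribed by the slides, is to score how many of the six trunk-test questions each participant answered correctly for each page:

```python
# Illustrative sketch only: scoring circled trunk-test printouts per page.
# The pages, participants and answers below are made up for the example.
from statistics import mean

QUESTIONS = {"site name", "page name", "main sections",
             "options", "where am I", "search"}

# results[page][participant] = questions that participant answered correctly
results = {
    "home page": {
        "P1": {"site name", "page name", "search"},
        "P2": {"site name", "main sections", "options", "search"},
    },
    "product page": {
        "P1": {"site name"},
        "P2": {"site name", "page name"},
    },
}

for page, answers in results.items():
    score = mean(len(correct & QUESTIONS) / len(QUESTIONS)
                 for correct in answers.values())
    print(f"{page}: average trunk-test score {score:.0%}")
```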
Designing home pages
• A typical home page will include:
   –   Site identity and mission
   –   Site hierarchy
   –   Site search
   –   Teases (e.g. Featured content)
   –   Timely content
   –   Deals (including ads)
   –   Shortcuts to content
   –   Registration
• A home page should always pass the “trunk test”!



Always, always, always TEST
• Testing one user is better than testing
  none!




Test soon, test often
• Testing one user early is better than
  testing 50 at the end




Testing is iterative
• No point in testing if you don’t correct the
  errors that you find...




Why did you add this button to
the user interface?




References
• Nielsen, J. and Loranger, H. (2006)
  Prioritizing Web Usability. New Riders.
• Krug, S. (2009) Don’t Make Me Think.
  New Riders.
• Sharp, H., Rogers, Y. and Preece, J.
  (2007) Interaction Design: Beyond
  Human-Computer Interaction, 2nd edition,
  John Wiley & Sons.
Lab activities




Activity 1: throw-away prototype
evaluation
• Your team has to develop a mobile web
  site for UEL students
• Develop a paper prototype and test it
  using the co-discovery method with
  users from another team
  – Refine your prototype and re-test it with
    another team



Activity 2: Android music player
• Your team has to evaluate the usability of
  a prototype music player for Android:
  identify tasks and configuration for a
  usability test




Activity 3: Test the Sony web
store
• Working as a team, plan a protocol for
  evaluating Sony’s on-line store:
  – Identify core tasks that users would typically
    do on the web site and how to evaluate them
  – Recommend the configuration (settings –
    including resolution, browser etc.) for the test
• Run the test!
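For the configuration step, one hedged illustration of pinning the settings down so that every participant sees exactly the same thing is a short browser-automation script; Selenium, ChromeDriver, the 1366x768 resolution and the placeholder URL are assumptions of this sketch, not requirements of the activity.

```python
# Illustrative sketch only: fixing browser, locale and window resolution so the
# test conditions are identical for every participant.
# Assumes Selenium with a local ChromeDriver; the URL is a placeholder.
from selenium import webdriver

STORE_URL = "https://example.com/sony-store"  # replace with the real store URL

options = webdriver.ChromeOptions()
options.add_argument("--lang=en-GB")      # same language settings for everyone
driver = webdriver.Chrome(options=options)
driver.set_window_size(1366, 768)         # fixed, documented screen resolution
driver.get(STORE_URL)
# ... participant performs the core tasks while the facilitator observes ...
driver.quit()
```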
