Understanding online
    audiences
    Planning and implementing
  research into online audiences

  UX day Oxford 18 March 2013


         Martin Bazley
 Online experience consultant
  Martin Bazley & Associates
Martin Bazley
Previously
• Teaching (7 yrs)
• Science Museum, London,
  Internet Projects (7 yrs)
• E-Learning Officer, MLA South East (3yrs)
• Founder: Digital Learning Network DLNET
Martin Bazley
• Current
• Developing online resources, websites,
  user testing, evaluation, training,
  consultancy…
  Martin Bazley & Associates
  www.martinbazley.com



 Slides and notes available afterwards
Note to self:
                check stats
                tomorrow to see
                if anyone looked
                up the website




www.martinbazley.com
How can we get a sense of who our online
visitors are and what they do with our online
content?

How do we gather data to help us improve
what we do?

How do we measure success from the
user's point of view, and / or against our own
objectives and constraints?

For example, how do we justify investment (or
lack of it) in social networks etc.?
Reasons for doing audience research:
               Evaluation
• Did your project/product/service do what
  you wanted it to do?
• Provide information for stakeholders
• Gauge audience satisfaction
Reasons for doing audience research:
               Promotion
• Improve your offer for your target
  audiences
• Increase usage
• Widen access
Reasons for doing audience research:
                 Planning
• Inform development of a new
  product/service
• Inform business planning
• Prove interest in a related activity
Data gathering tools

• Qualitative: focus groups, “free text”
  questions in surveys, interviews
• Quantitative: web statistics, “multiple
  choice” questions in surveys, visitor
  tracking
• Observational: user testing, ethnographic
Define audience research goal
 → Plan methodology
 → Collect data
 → Analyse data
 → Use results to guide changes
 → (and round to the start again)
Strengths and weaknesses of different
       data gathering techniques
Data gathering techniques
User testing
 - early in development and again near end
Online questionnaires
 – emailed to people or linked from website
Focus groups
 - best near beginning of project, or at
 redevelopment stage
Visitor surveys
 - link online and real visits
Web stats
 - useful for long term trends /events etc
Need to distinguish between:

Diagnostics
  – making a project or service better

Reporting
– to funders, or for advocacy
Online questionnaires
(+) once set up they gather numerical and
  qualitative data with no further effort –
   given time can build up large datasets
(+) the datasets can be easily exported and
  manipulated, can be sampled at various times,
  and structured queries can yield useful results
(–) respondents are self-selected and this will skew
  results – best to compare with similar data from
  other sources, like visitor surveys
(–) the number and nature of responses may
  depend on how the online questionnaire is
  displayed and promoted on the website
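
A minimal sketch (not from the slides) of what "exported and manipulated" can look like in practice, assuming the questionnaire tool exports a CSV; the file name and columns (submitted_at, audience_type, satisfaction_1_to_5, further_comments) are invented for illustration:

```python
import pandas as pd

# Hypothetical export from an online questionnaire tool
responses = pd.read_csv("survey_export.csv", parse_dates=["submitted_at"])

# Sample a particular period, e.g. responses gathered during one school term
term = responses[(responses["submitted_at"] >= "2013-01-07") &
                 (responses["submitted_at"] < "2013-03-29")]

# A simple structured query: average satisfaction by self-reported audience type
by_audience = term.groupby("audience_type")["satisfaction_1_to_5"].agg(["count", "mean"])
print(by_audience.sort_values("count", ascending=False))

# Free-text answers can be pulled out separately for qualitative coding
comments = term["further_comments"].dropna()
print(comments.head(10))
```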
Focus groups
(+) can explore specific issues in more
  depth, yielding rich feedback
(+) possible to control participant
  composition to ensure a representative
  sample
(–) comparatively time-consuming
  (expensive) to organise and analyse
(–) yield qualitative data only - small
  numbers mean numerical comparisons
  are unreliable
Visitor surveys
(+) possible to control participant
  composition to ensure a representative
  sample
(–) comparatively time-consuming
  (expensive) to organise and analyse
(–) responses can be affected by various
  factors including interviewer, weather on
  the day, day of the week, etc, reducing
  validity of numerical comparisons between
  museums
Web stats
(+) Easy to gather data – can decide what to
  do with it later
(+) Person-independent data generated - it
  is the interpretation, rather than the data
  themselves, which is subjective. This
  means others can review the same data
  and verify or amend initial conclusions
  reached
Web stats
(–) Different systems generate different data
  for the same web activity – for example, the
  number of unique visits measured via Google
  Analytics is generally lower than the figure
  derived from server log files
(–) Metrics are complicated and require
  specialist knowledge to appreciate them
  fully
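
As an illustration of why the two figures diverge (an aside, not from the slides), here is a minimal sketch that counts distinct IP addresses in an Apache/Nginx-style access log; the log format and bot names are assumptions, and real analytics packages do far more filtering:

```python
import re

# Very rough "unique visits" estimate from a combined-format access log
log_line = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+)')
known_bots = ("Googlebot", "bingbot", "Slurp")

visitors = set()
with open("access.log") as log:
    for line in log:
        if any(bot in line for bot in known_bots):
            continue  # script-based tools like GA exclude most bots; raw logs do not
        match = log_line.match(line)
        if match:
            visitors.add(match.group(1))  # client IP address

print(f"Distinct non-bot IP addresses: {len(visitors)}")
# Even after bot filtering this rarely matches Google Analytics:
# shared IPs, caching and JavaScript blockers push the two figures apart.
```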
Web stats
(–) As the amount of off-website web activity
  increases (e.g. Web 2.0 style interactions)
  the validity of website stats decreases,
  especially for reporting purposes, but also
  for diagnostics
 (–) Agreeing a common format for
  presentation of data and analysis requires
  collaborative working to be meaningful
Online surveys
    SurveyMonkey
www.surveymonkey.com
Web stats
Google Analytics
Learn GA: short intro videos etc
     https://www.google.com/analytics/iq.html
The best way to learn GA
       is to use it:
     www.google.com/analytics/
Web stats: focus on trends
  rather than absolute values
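
One way to put this into practice (a sketch, assuming you have exported weekly visit counts to a CSV with columns week and visits) is to smooth the series and compare against the same period a year earlier, rather than quoting single absolute numbers:

```python
import pandas as pd

weekly = pd.read_csv("weekly_visits.csv", parse_dates=["week"], index_col="week")

# 4-week rolling average shows the trend through week-to-week noise
weekly["trend"] = weekly["visits"].rolling(window=4, min_periods=1).mean()

# Year-on-year change is usually more meaningful than the raw weekly figure
weekly["change_vs_last_year"] = weekly["visits"].pct_change(periods=52)

print(weekly.tail(12))
```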
The ‘long tail’




An example of a power law graph showing popularity
ranking. To the right is the long tail; to the left are the
few that dominate. Notice that the areas of both
regions match. [Wikipedia: Long Tail]
The ‘long tail’




The tail becomes bigger and longer in new markets (depicted in
red). In other words, whereas traditional retailers have focused
on the area to the left of the chart, online bookstores derive
more sales from the area to the right.[Wikipedia: Long Tail]
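
A quick way to see whether your own site has a long tail (a sketch, assuming a pageviews-per-page export; file and column names are invented):

```python
import pandas as pd

pages = pd.read_csv("pageviews_by_page.csv").sort_values("pageviews", ascending=False)

total = pages["pageviews"].sum()
top_10 = pages.head(10)["pageviews"].sum()

print(f"Top 10 pages:    {top_10 / total:.0%} of pageviews")
print(f"Everything else: {1 - top_10 / total:.0%} of pageviews")
# If "everything else" is a large share, the long tail of minor pages
# collectively matters as much as the handful of popular ones.
```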
SCA guidance
http://sca.jiscinvolve.org/wp/audience-publications/

                      Good overview
                   Step by step approach



                 Culture 24 Let’s Get Real
            http://weareculture24.org.uk/projects/action-research/
More information / advice /
          ideas
    Happy to help - phone number on site:



     Martin Bazley
     0780 3580 737
  www.martinbazley.com
Extra slides
   not used in session
Some of these may be useful -
  please feel free to call for
  clarification or more info
When to evaluate or test and why

• Before funding approval – project planning

• Post-funding - project development

• Post-project – summative evaluation
Testing is an iterative process

Testing isn’t something you do once

Make something
 => test it
    => refine it
       => test it again
Before funding – project planning
• *Evaluation of other websites
  – Who for? What for? How use it? etc
  – awareness raising: issues, opportunities
  – contributes to market research
  – possible elements, graphic feel etc
                                               Research

• *Concept testing
  – check idea makes sense with audience
  – reshape project based on user feedback
                                               Focus group
Post-funding - project development
• *Concept testing
  – refine project outcomes based on
    feedback from intended users
                                              Focus group

• Refine website structure
  – does it work for users?
                               One-to-one tasks

• *Evaluate initial look and feel
  – graphics, navigation etc

                                              Focus group
Card sorting - get various people to try out the
website structure before you build it
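
One simple way to analyse the results of a card sort (a sketch with invented participant data, not a tool used in the project) is to count how often participants group the same pair of pages together:

```python
from collections import Counter
from itertools import combinations

# Each participant's sort: group name -> cards placed in that group (invented data)
sorts = [
    {"Visit us": ["Opening times", "Tickets", "Getting here"],
     "Learn": ["Schools", "Activity sheets"]},
    {"Plan a visit": ["Opening times", "Getting here"],
     "For teachers": ["Schools", "Activity sheets", "Tickets"]},
]

pair_counts = Counter()
for participant in sorts:
    for group in participant.values():
        for a, b in combinations(sorted(group), 2):
            pair_counts[(a, b)] += 1

for (a, b), count in pair_counts.most_common():
    print(f"{count}/{len(sorts)} participants grouped '{a}' with '{b}'")
# Pairs grouped by most participants are strong candidates to sit together
# in the site structure; pairs that never co-occur probably should not.
```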
Post-funding - project development 2
• *Full evaluation of a draft working
  version
    – usability AND content: do activities work, how
      engaging is it, what else could be offered, etc




Observation of actual use of website
by intended users,
using it for intended purpose,
in intended context – workplace, classroom, library, home, etc
Post-funding - project development 3

• Acceptance testing of ‘finished’
  website
  – last minute check, minor corrections only
  – often offered by web developers



• Summative evaluation
  – report for funders, etc
  – learn lessons at project level for next time
Website evaluation and testing
Need to think ahead a bit:
  – what are you trying to find out?

  – how do you intend to test it?

  – why? what will you do as a result?



  The Why? should drive this process
Evaluating online learning
resources in the classroom
Martin Bazley
Online experience consultant
Key point:


 for a site designed for schools,


 the most effective user testing observations


 will be made in a real classroom situation
National Archives Moving Here project


 For teachers of 8 – 14 yr olds

 History, Geography and Citizenship



 Features: Interactives, activity sheets, audio and video clips
Moving Here Schools:
For 8 – 14 yr olds studying:
History, Geography and Citizenship
Features:
Interactives, activity sheets, audio and video
clips
Evaluation: 2-phase approach

1. preliminary testing sessions –
conventional user-testing
 with teachers (at TNA)

2. in-class testing –
teachers used the Moving Here Schools site
 with pupils in their own classrooms

 This meant sitting at the back of the classroom
   observing and taking notes…
Site ready in parts – but not too ready:
The environment and social dynamics




The environment had a significant
 impact on how the site was used.

The class dynamic within the
 different groups contributed to
 how much the students learned.
In-class testing picked up issues that
  did not surface in conventional user
  testing.

Teachers in the preliminary user testing
  did not spot some problems until
  actually in the classroom. For
  example…
interactive activities:
looked big enough when
 viewed on a screen
 nearby…
… but text/images too
small for some children to
see from the back of the
class…
…so interactives needed
to be viewable full-screen
Only spotted during in-class testing:



…so interactives needed
to be viewable full-screen
content:
when students tried to read
 text out loud, teachers
 realised some text was too
 difficult or complex
activity sheets:
some sheets did not have
 spaces for students to put
 their names - caused
 confusion when printing 30
 at same time…
Manchester Art Gallery art interactive


  For teachers of 8 – 11 yr olds, and for pupils

  History, Art and Citizenship



  Features: interactive with built in video, quiz, etc,

  plus activity sheets and background info
'This classroom user testing is all very well, but...'



How can you see everything in a class of 30
  children – don't you miss things?
    You see things in a classroom that
      don't arise in one-to-one testing
    They are the real issues
'This classroom user testing is all very well, but...'




Doesn't using a specific class with particular
  needs skew the results?
    »   For example, low ability, poor English, equipment not working,
        behaviour issues, etc - are results as reliable as those in a
        'neutral' environment?
    »   ‘neutral environment’ ? – no such thing - any test will be
        subjective, and in any case:
    »   Testing is there to make the website work well in the classroom –
        you need to see the effects of factors like these.
'This classroom user testing is all very well, but...'




 Can't my Web developer do the testing for us?
        » best not to use external developer to do user
          testing - conflict of interest
        » also likely to focus more on the technical
          aspects of the site than on effect on the
          teacher and pupils.
        » observe classes yourself but use an
          independent evaluator for key decision
          points
'This classroom user testing is all very well, but...'




I don't have the time or budget to do this!
    »   It need cost no more than conventional user testing: one person
        could attend a one-hour class session in a school, giving the
        teacher the same small token payment
    »   This programme had evaluation built into the project: 6.7% of
        total Schools site budget
    »   Allow 5–10% of total project budget for user testing




                                            => videos
Video clips
• Moving Here – key ideas, not lesson plans
  http://www.vimeo.com/18888798
• Lesson starter http://www.vimeo.com/18892401
• Time saver http://www.vimeo.com/18867252
User test early

Testing one user early on in the project…

…is better than testing 50 near the end
Two usability testing techniques

“Get it” testing
- do they understand the purpose, how it
  works, etc

Key task testing
- ask the user to do something, watch how
  well they do

Ideally, do a bit of each, in that order
User testing – who should do it?
• The worst person to conduct (or interpret)
  user testing of your own site is…
  – you!
• Beware of hearing what you want to
  hear…
• Useful to have an external viewpoint
• The first 5 minutes in a genuine setting tells
  you 80% of what’s wrong with the site
Crit room
‘simulated user testing’
Crit room protocol
Simulating user testing – usually one-to-one
  in a quiet room
No one (especially site stakeholders) other
  than the tester says anything for the first
  part of the session
In this simulation we will focus on
- Look and feel of site
- Usability
- Content
More information / advice /
          ideas
    Happy to help - phone number on site:



     Martin Bazley
     0780 3580 737
  www.martinbazley.com

Contenu connexe

Similaire à Understanding online audiences ux day oxford 18 mar 13

Understanding online audiences creating capacity 19 june 2012
Understanding online audiences creating capacity 19 june 2012Understanding online audiences creating capacity 19 june 2012
Understanding online audiences creating capacity 19 june 2012Martin Bazley
 
Data Analytics: Better Decision, Better Business
Data Analytics: Better Decision, Better BusinessData Analytics: Better Decision, Better Business
Data Analytics: Better Decision, Better BusinessMcKonly & Asbury, LLP
 
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09Martin Bazley
 
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan Berlin
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan BerlinExperience Research Best Practices - UX Meet Up Boston 2013 - Dan Berlin
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan BerlinMad*Pow
 
Experience Research Best Practices
Experience Research Best PracticesExperience Research Best Practices
Experience Research Best PracticesDan Berlin
 
MethodsofDataCollection.pdf
MethodsofDataCollection.pdfMethodsofDataCollection.pdf
MethodsofDataCollection.pdfMohdTaufiqIshak
 
Business Analytics Techniques.docx
Business Analytics Techniques.docxBusiness Analytics Techniques.docx
Business Analytics Techniques.docxAbhinavSharma309481
 
20190527_Dietmar Lampert _ New indicators for Open Sciene
20190527_Dietmar Lampert _ New indicators for Open Sciene20190527_Dietmar Lampert _ New indicators for Open Sciene
20190527_Dietmar Lampert _ New indicators for Open ScieneOpenAIRE
 
Bmgt 205 chapter_10
Bmgt 205 chapter_10Bmgt 205 chapter_10
Bmgt 205 chapter_10Chris Lovett
 
MethodsofDataCollection.pdf
MethodsofDataCollection.pdfMethodsofDataCollection.pdf
MethodsofDataCollection.pdfssuser9878d0
 
Performance Management to Program Evaluation: Creating a Complementary Connec...
Performance Management to Program Evaluation: Creating a Complementary Connec...Performance Management to Program Evaluation: Creating a Complementary Connec...
Performance Management to Program Evaluation: Creating a Complementary Connec...nicholes21
 
Xdelia evaluation framework
Xdelia evaluation frameworkXdelia evaluation framework
Xdelia evaluation frameworkgrainne
 
CAQDAS 2014 From graph paper to digital research our Framework journey
CAQDAS 2014 From graph paper to digital research our Framework journeyCAQDAS 2014 From graph paper to digital research our Framework journey
CAQDAS 2014 From graph paper to digital research our Framework journeyKandy Woodfield
 
Bazley Developing And Evaluating Online Resources
Bazley Developing And Evaluating Online ResourcesBazley Developing And Evaluating Online Resources
Bazley Developing And Evaluating Online ResourcesMartin Bazley
 
090511 Appleby Magna Overview Presentation
090511 Appleby Magna Overview Presentation090511 Appleby Magna Overview Presentation
090511 Appleby Magna Overview PresentationMartin Bazley
 
The Audience Focus - Shaping the Arts Council England commissioned intellige...
The Audience Focus - Shaping the Arts Council England commissioned intellige...The Audience Focus - Shaping the Arts Council England commissioned intellige...
The Audience Focus - Shaping the Arts Council England commissioned intellige...The Audience Agency
 

Similaire à Understanding online audiences ux day oxford 18 mar 13 (20)

Understanding online audiences creating capacity 19 june 2012
Understanding online audiences creating capacity 19 june 2012Understanding online audiences creating capacity 19 june 2012
Understanding online audiences creating capacity 19 june 2012
 
Data Analytics: Better Decision, Better Business
Data Analytics: Better Decision, Better BusinessData Analytics: Better Decision, Better Business
Data Analytics: Better Decision, Better Business
 
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09
 
Part III. Project evaluation
Part III. Project evaluationPart III. Project evaluation
Part III. Project evaluation
 
Main
MainMain
Main
 
Project evaluation
Project evaluationProject evaluation
Project evaluation
 
Towards a M&E toolkit for Egypt's agricultural development projects
Towards a M&E toolkit for Egypt's agricultural development projectsTowards a M&E toolkit for Egypt's agricultural development projects
Towards a M&E toolkit for Egypt's agricultural development projects
 
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan Berlin
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan BerlinExperience Research Best Practices - UX Meet Up Boston 2013 - Dan Berlin
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan Berlin
 
Experience Research Best Practices
Experience Research Best PracticesExperience Research Best Practices
Experience Research Best Practices
 
MethodsofDataCollection.pdf
MethodsofDataCollection.pdfMethodsofDataCollection.pdf
MethodsofDataCollection.pdf
 
Business Analytics Techniques.docx
Business Analytics Techniques.docxBusiness Analytics Techniques.docx
Business Analytics Techniques.docx
 
20190527_Dietmar Lampert _ New indicators for Open Sciene
20190527_Dietmar Lampert _ New indicators for Open Sciene20190527_Dietmar Lampert _ New indicators for Open Sciene
20190527_Dietmar Lampert _ New indicators for Open Sciene
 
Bmgt 205 chapter_10
Bmgt 205 chapter_10Bmgt 205 chapter_10
Bmgt 205 chapter_10
 
MethodsofDataCollection.pdf
MethodsofDataCollection.pdfMethodsofDataCollection.pdf
MethodsofDataCollection.pdf
 
Performance Management to Program Evaluation: Creating a Complementary Connec...
Performance Management to Program Evaluation: Creating a Complementary Connec...Performance Management to Program Evaluation: Creating a Complementary Connec...
Performance Management to Program Evaluation: Creating a Complementary Connec...
 
Xdelia evaluation framework
Xdelia evaluation frameworkXdelia evaluation framework
Xdelia evaluation framework
 
CAQDAS 2014 From graph paper to digital research our Framework journey
CAQDAS 2014 From graph paper to digital research our Framework journeyCAQDAS 2014 From graph paper to digital research our Framework journey
CAQDAS 2014 From graph paper to digital research our Framework journey
 
Bazley Developing And Evaluating Online Resources
Bazley Developing And Evaluating Online ResourcesBazley Developing And Evaluating Online Resources
Bazley Developing And Evaluating Online Resources
 
090511 Appleby Magna Overview Presentation
090511 Appleby Magna Overview Presentation090511 Appleby Magna Overview Presentation
090511 Appleby Magna Overview Presentation
 
The Audience Focus - Shaping the Arts Council England commissioned intellige...
The Audience Focus - Shaping the Arts Council England commissioned intellige...The Audience Focus - Shaping the Arts Council England commissioned intellige...
The Audience Focus - Shaping the Arts Council England commissioned intellige...
 

Plus de Martin Bazley

Digital learning resources
Digital learning resourcesDigital learning resources
Digital learning resourcesMartin Bazley
 
Digital learning: an overview
Digital learning: an overviewDigital learning: an overview
Digital learning: an overviewMartin Bazley
 
Martin bazley evaluating digital learning resources leicester reduced for upl...
Martin bazley evaluating digital learning resources leicester reduced for upl...Martin bazley evaluating digital learning resources leicester reduced for upl...
Martin bazley evaluating digital learning resources leicester reduced for upl...Martin Bazley
 
MA conf cardiff 9 Oct 2014 museum websites online experience martin bazley ...
MA conf cardiff 9 Oct 2014 museum websites   online experience martin bazley ...MA conf cardiff 9 Oct 2014 museum websites   online experience martin bazley ...
MA conf cardiff 9 Oct 2014 museum websites online experience martin bazley ...Martin Bazley
 
Digital technology in museums - case studies
Digital technology in museums - case studiesDigital technology in museums - case studies
Digital technology in museums - case studiesMartin Bazley
 
Digital technology for museum learning oxford 2 mar 12 reduced for uploading
Digital technology for museum learning oxford 2 mar 12 reduced for uploadingDigital technology for museum learning oxford 2 mar 12 reduced for uploading
Digital technology for museum learning oxford 2 mar 12 reduced for uploadingMartin Bazley
 
Developing online learning resources for schools on a budget
Developing online learning resources for schools on a budgetDeveloping online learning resources for schools on a budget
Developing online learning resources for schools on a budgetMartin Bazley
 
Creating online learning resources for schools for uploading
Creating online learning resources for schools   for uploadingCreating online learning resources for schools   for uploading
Creating online learning resources for schools for uploadingMartin Bazley
 
Martin Bazley - using simple technologies with different audiences (reduced f...
Martin Bazley - using simple technologies with different audiences (reduced f...Martin Bazley - using simple technologies with different audiences (reduced f...
Martin Bazley - using simple technologies with different audiences (reduced f...Martin Bazley
 
Martin bazley-making digital projects sustainable bits2 blogs mar 2011 (reduced)
Martin bazley-making digital projects sustainable bits2 blogs mar 2011 (reduced)Martin bazley-making digital projects sustainable bits2 blogs mar 2011 (reduced)
Martin bazley-making digital projects sustainable bits2 blogs mar 2011 (reduced)Martin Bazley
 
Martin bazley Creating effective content 15 Mar 11
Martin bazley Creating effective content 15 Mar 11Martin bazley Creating effective content 15 Mar 11
Martin bazley Creating effective content 15 Mar 11Martin Bazley
 
Creating online learning resources royal collection 18 jan 2011 reduced images
Creating online learning resources royal collection 18 jan 2011 reduced imagesCreating online learning resources royal collection 18 jan 2011 reduced images
Creating online learning resources royal collection 18 jan 2011 reduced imagesMartin Bazley
 
10 11 25 univ of brighton usability and evaluation module shelley boden
10 11 25 univ of brighton usability and evaluation module shelley boden10 11 25 univ of brighton usability and evaluation module shelley boden
10 11 25 univ of brighton usability and evaluation module shelley bodenMartin Bazley
 
MyLearning and funding ukmw10
MyLearning and funding ukmw10MyLearning and funding ukmw10
MyLearning and funding ukmw10Martin Bazley
 
Developing online resources fleet air arm museum 18 oct 2010
Developing online resources fleet air arm museum 18 oct 2010Developing online resources fleet air arm museum 18 oct 2010
Developing online resources fleet air arm museum 18 oct 2010Martin Bazley
 
Online exhibitions southampton 22 may 2010
Online exhibitions southampton 22 may 2010Online exhibitions southampton 22 may 2010
Online exhibitions southampton 22 may 2010Martin Bazley
 
Writing for the web highpoint leicester may 2010
Writing for the web highpoint leicester may 2010Writing for the web highpoint leicester may 2010
Writing for the web highpoint leicester may 2010Martin Bazley
 
Grace Kimble NHM Intro
Grace Kimble NHM IntroGrace Kimble NHM Intro
Grace Kimble NHM IntroMartin Bazley
 

Plus de Martin Bazley (20)

Digital learning resources
Digital learning resourcesDigital learning resources
Digital learning resources
 
Digital learning: an overview
Digital learning: an overviewDigital learning: an overview
Digital learning: an overview
 
Martin bazley evaluating digital learning resources leicester reduced for upl...
Martin bazley evaluating digital learning resources leicester reduced for upl...Martin bazley evaluating digital learning resources leicester reduced for upl...
Martin bazley evaluating digital learning resources leicester reduced for upl...
 
MA conf cardiff 9 Oct 2014 museum websites online experience martin bazley ...
MA conf cardiff 9 Oct 2014 museum websites   online experience martin bazley ...MA conf cardiff 9 Oct 2014 museum websites   online experience martin bazley ...
MA conf cardiff 9 Oct 2014 museum websites online experience martin bazley ...
 
Digital technology in museums - case studies
Digital technology in museums - case studiesDigital technology in museums - case studies
Digital technology in museums - case studies
 
Digital technology for museum learning oxford 2 mar 12 reduced for uploading
Digital technology for museum learning oxford 2 mar 12 reduced for uploadingDigital technology for museum learning oxford 2 mar 12 reduced for uploading
Digital technology for museum learning oxford 2 mar 12 reduced for uploading
 
Developing online learning resources for schools on a budget
Developing online learning resources for schools on a budgetDeveloping online learning resources for schools on a budget
Developing online learning resources for schools on a budget
 
Creating online learning resources for schools for uploading
Creating online learning resources for schools   for uploadingCreating online learning resources for schools   for uploading
Creating online learning resources for schools for uploading
 
Martin Bazley - using simple technologies with different audiences (reduced f...
Martin Bazley - using simple technologies with different audiences (reduced f...Martin Bazley - using simple technologies with different audiences (reduced f...
Martin Bazley - using simple technologies with different audiences (reduced f...
 
Martin bazley-making digital projects sustainable bits2 blogs mar 2011 (reduced)
Martin bazley-making digital projects sustainable bits2 blogs mar 2011 (reduced)Martin bazley-making digital projects sustainable bits2 blogs mar 2011 (reduced)
Martin bazley-making digital projects sustainable bits2 blogs mar 2011 (reduced)
 
Martin bazley Creating effective content 15 Mar 11
Martin bazley Creating effective content 15 Mar 11Martin bazley Creating effective content 15 Mar 11
Martin bazley Creating effective content 15 Mar 11
 
Creating online learning resources royal collection 18 jan 2011 reduced images
Creating online learning resources royal collection 18 jan 2011 reduced imagesCreating online learning resources royal collection 18 jan 2011 reduced images
Creating online learning resources royal collection 18 jan 2011 reduced images
 
10 11 25 univ of brighton usability and evaluation module shelley boden
10 11 25 univ of brighton usability and evaluation module shelley boden10 11 25 univ of brighton usability and evaluation module shelley boden
10 11 25 univ of brighton usability and evaluation module shelley boden
 
MyLearning and funding ukmw10
MyLearning and funding ukmw10MyLearning and funding ukmw10
MyLearning and funding ukmw10
 
Developing online resources fleet air arm museum 18 oct 2010
Developing online resources fleet air arm museum 18 oct 2010Developing online resources fleet air arm museum 18 oct 2010
Developing online resources fleet air arm museum 18 oct 2010
 
Online exhibitions southampton 22 may 2010
Online exhibitions southampton 22 may 2010Online exhibitions southampton 22 may 2010
Online exhibitions southampton 22 may 2010
 
Writing for the web highpoint leicester may 2010
Writing for the web highpoint leicester may 2010Writing for the web highpoint leicester may 2010
Writing for the web highpoint leicester may 2010
 
Peter Pavement
Peter PavementPeter Pavement
Peter Pavement
 
Grace Kimble NHM Intro
Grace Kimble NHM IntroGrace Kimble NHM Intro
Grace Kimble NHM Intro
 
Jane Devine Mejia
Jane Devine MejiaJane Devine Mejia
Jane Devine Mejia
 

Dernier

MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxAnupkumar Sharma
 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfJemuel Francisco
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4MiaBumagat1
 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Celine George
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptxSherlyMaeNeri
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptxmary850239
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)lakshayb543
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPCeline George
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfTechSoup
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfMr Bounab Samir
 
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfphamnguyenenglishnb
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...JhezDiaz1
 
Karra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxKarra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxAshokKarra1
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxCarlos105
 
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfVirtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfErwinPantujan2
 

Dernier (20)

MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4
 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptx
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 
Raw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptxRaw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptx
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
 
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
 
Karra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxKarra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptx
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
 
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfVirtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
 
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptxYOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
 

Understanding online audiences ux day oxford 18 mar 13

  • 1. Understanding online audiences Planning and implementing research into online audiences UX day Oxford 18 March 2013 Martin Bazley Online experience consultant Martin Bazley & Associates
  • 2. Martin Bazley Previously • Teaching (7 yrs) • Science Museum, London, Internet Projects (7yrs) • E-Learning Officer, MLA South East (3yrs) • Founder: Digital Learning Network DLNET
  • 3. Martin Bazley • Current • Developing online resources, websites, user testing, evaluation, training, consultancy… Martin Bazley & Associates www.martinbazley.com Slides and notes available afterwards
  • 4. Note to self: check stats tomorrow to see if anyone looked up the website www.martinbazley.com
  • 5. How can we get a sense of who our online visitors are and what they do with our online content? How do we gather data to help us improve what we do? How do we measure success from the user's point of view, and / or against our own objectives and constraints? For example, how justify investment (or lack of it) in social networks etc?
  • 6. Reasons for doing audience research: Evaluation • Did your project/product/service do what you wanted it to do? • Provide information for stakeholders • Gauge audience satisfaction
  • 7. Reasons for doing audience research: Promotion • Improve your offer for your target audiences • Increase usage • Widen access
  • 8. Reasons for doing audience research: Planning • Inform development of a new product/service • Inform business planning • Prove interest in a related activity
  • 9. Data gathering tools • Qualitative: focus groups, “free text” questions in surveys, interviews • Quantitative: web statistics, “multiple choice” questions in surveys, visitor tracking • Observational: user testing, ethnographic
  • 10. Define audience Plan methodology research goal Use results to guide Collect data changes Analyse data
  • 11. Define audience research Plan methodology goal Use results to guide Collect data changes Analyse data
  • 12. Define audience research Plan methodology goal Use results to guide Collect data changes Analyse data
  • 13. Define audience research Plan methodology goal Use results to guide Collect data changes Analyse data
  • 14. Define audience research Plan methodology goal Use results to guide Collect data changes Analyse data
  • 15. Define audience Plan methodology research goal Use results to guide Collect data changes Analyse data
  • 16. Strengths and weaknesses of different data gathering techniques
  • 17. Data gathering techniques User testing - early in development and again near end Online questionnaires – emailed to people or linked from website Focus groups - best near beginning of project, or at redevelopment stage Visitor surveys - link online and real visits Web stats - useful for long term trends /events etc
  • 18. Need to distinguish between: Diagnostics – making a project or service better Reporting – to funders, or for advocacy
  • 19. Online questionnaires (+) once set up they gather numerical and qualitative data with no further effort – given time can build up large datasets (+) the datasets can be easily exported and manipulated, can be sampled at various times, and structured queries can yield useful results (–) respondents are self-selected and this will skew results – best to compare with similar data from other sources, like visitor surveys (–) the number and nature of responses may depend on how the online questionnaire is displayed and promoted on the website
  • 20. Focus groups (+) can explore specific issues in more depth, yielding rich feedback (+) possible to control participant composition to ensure representative (–) comparatively time-consuming (expensive) to organise and analyse (–) yield qualitative data only - small numbers mean numerical comparisons are unreliable
  • 21. Visitor surveys (+) possible to control participant composition to ensure representative (–) comparatively time-consuming (expensive) to organise and analyse (–) responses can be affected by various factors including interviewer, weather on the day, day of the week, etc, reducing validity of numerical comparisons between museums
  • 22. Web stats (+) Easy to gather data – can decide what to do with it later (+) Person-independent data generated - it is the interpretation, rather than the data themselves, which is subjective. This means others can review the same data and verify or amend initial conclusions reached
  • 23. Web stats (–) Different systems generate different data for the same web activity – for example no of unique visits measured via Google Analytics is generally lower than that derived via server log files (–) Metrics are complicated and require specialist knowledge to appreciate them fully
  • 24. Web stats (–) As the amount of off-website web activity increases (e.g. Web 2.0 style interactions) the validity of website stats decreases, especially for reporting purposes, but also for diagnostics (–) Agreeing a common format for presentation of data and analysis requires collaborative working to be meaningful
  • 25. Online surveys SurveyMonkey www.surveymonkey.com
  • 27. Learn GA: short intro videos etc https://www.google.com/analytics/iq.html
  • 28. The best way to learn GA is to use it: www.google.com/analytics/
  • 29. Web stats: Focus on trends rather than absolute values
  • 30. The ‘long tail’ An example of a power law graph showing popularity ranking. To the right is the long tail; to the left are the few that dominate. Notice that the areas of both regions match. [Wikipedia: Long Tail]
  • 31. The ‘long tail’ The tail becomes bigger and longer in new markets (depicted in red). In other words, whereas traditional retailers have focused on the area to the left of the chart, online bookstores derive more sales from the area to the right.[Wikipedia: Long Tail]
  • 32. SCA guidance http://sca.jiscinvolve.org/wp/audience-publications/ Good overview Step by step approach Culture 24 Let’s Get Real http://weareculture24.org.uk/projects/action-research/
  • 33. More information / advice / ideas Happy to help - phone number on site: Martin Bazley 0780 3580 737 www.martinbazley.com
  • 34. Extra slides not used in session Some of these may be useful - please feel free to call for clarification or more info
  • 35. When to evaluate or test and why • Before funding approval – project planning • Post-funding - project development • Post-project – summative evaluation
  • 36. Testing is an iterative process Testing isn’t something you do once Make something => test it => refine it => test it again
  • 37. Before funding – project planning • *Evaluation of other websites – Who for? What for? How use it? etc – awareness raising: issues, opportunities – contributes to market research Research – possible elements, graphic feel etc • *Concept testing – check idea makes sense with audience – reshape project based on user feedback Focus group
  • 38.
  • 39. Post-funding - project development • *Concept testing – refine project outcomes based on feedback from intended users Focus group • Refine website structure – does it work for users? One-to-one tasks • *Evaluate initial look and feel – graphics,navigation etc Focus group
  • 40.
  • 41.
  • 42. Card sorting - get various people to try out the website structure before you build it
  • 43.
  • 44. Post-funding - project development 2 • *Full evaluation of a draft working version – usability AND content: do activities work, how engaging is it, what else could be offered, etc Observation of actual use of website by intended users, using it for intended purpose, in intended context – workplace, classroom, library, home, etc
  • 45.
  • 46.
  • 47.
  • 48.
  • 49.
  • 50.
  • 51.
  • 52.
  • 53.
  • 54.
  • 55.
  • 56. Post-funding - project development 3 • Acceptance testing of ‘finished’ website – last minute check, minor corrections only – often offered by web developers • Summative evaluation – report for funders, etc – learn lessons at project level for next time
  • 57. Website evaluation and testing Need to think ahead a bit: – what are you trying to find out? – how do you intend to test it? – why? what will do you do as a result? The Why? should drive this process
  • 58. Evaluating online learning resources in the classroom Martin Bazley Online experience consultant
  • 59. Key point: for a site designed for schools, the most effective user testing observations will be made in a real classroom situation
  • 60. National Archives Moving Here project For teachers of 8 – 14 yr olds History Geography and Citizenship Features: Interactives, activity sheets, audio and video clips
  • 61. Moving Here Schools: For 8 – 14 yr olds studying: History Geography and Citizenship Features: Interactives, activity sheets, audio and video clips
  • 62. 1. preliminary testing sessions – conventional user-testing with teachers (at TNA)
  • 63. 2. in-class testing – teachers used the Moving Here Schools site with pupils in their own classrooms This meant sitting at the back of the classroom observing and taking notes…
  • 65. Site ready in parts – but not too ready:
  • 66. The environment had a significant impact on how the site was used. The class dynamic within the different groups contributed to how much the students learned.
  • 67. The environment and social dynamics The environment had a significant impact on how the site was used. The class dynamic within the different groups contributed to how much the students learned.
  • 68. in-class testing picked up elements not there in conventional user testing. teachers in preliminary user testing did not spot some problems until actually in the classroom. For example…
  • 69. interactive activities: looked big enough when viewed on a screen nearby…
  • 70.
  • 71. … but text/images too small for some children to see from the back of the class…
  • 72.
  • 73. …so interactives needed to be viewable full-screen
  • 74. Only spotted during in-class testing: …so interactives needed to be viewable full-screen
  • 75. content: when students tried to read text out loud, teachers realised some text was too difficult or complex
  • 76. activity sheets: some sheets did not have spaces for students to put their names - caused confusion when printing 30 at same time…
  • 77. Manchester Art Gallery art interactive For teachers of 8 – 11 yr olds, and for pupils History Art and Citizenship Features: interactive with built in video, quiz, etc, plus activity sheets and background info
  • 78.
  • 83. 'This classroom user testing is all very well, but...' How can you see everything in a class of 30 children – don't you miss things? You see things in a classroom that don't arise in one-to-one testing They are the real issues
  • 84. 'This classroom user testing is all very well, but...' How can you see everything in a class of 30 children – don't you miss things? You see things in a classroom that don't arise in one-to-one testing They are the real issues
  • 85. 'This classroom user testing is all very well, but...' but...' 'This classroom user testing is all very well, Doesn't using a specific class with particular needs skew the results? » For example, low ability, poor English, equipment not working, behaviour issues, etc - are results as reliable as those in a 'neutral' environment? » ‘neutral environment’ ? – no such thing - any test will be subjective, and in any case: » Testing is to make website work well in classroom, - need to see effects of factors like those.
  • 86. 'This classroomclassroom user testing is very well,but...' 'This user testing is all all very well, but...' Doesn't using a specific class with particular needs skew the results? » For example, low ability, poor English, equipment not working, behaviour issues, etc - are results as reliable as those in a 'neutral' environment? » ‘neutral environment’ ? – no such thing - any test will be subjective, and in any case: » Testing is to make website work well in classroom, - need to see effects of factors like those.
• 87. 'This classroom user testing is all very well, but...' Can't my Web developer do the testing for us? » Best not to use an external developer to do user testing – conflict of interest » Also likely to focus more on the technical aspects of the site than on the effect on the teacher and pupils » Observe classes yourself, but use an independent evaluator for key decision points
• 88. 'This classroom user testing is all very well, but...' Can't my Web developer do the testing for us? » Best not to use an external developer to do user testing – conflict of interest » Also likely to focus more on the technical aspects of the site than on the effect on the teacher and pupils » Visit a classroom yourself, but use an independent evaluator for key decision points
• 89. 'This classroom user testing is all very well, but...' I don't have the time or budget to do this! » It need cost no more than conventional user testing: one person could attend a one-hour class session in a school, giving the teacher the same small token payment » This programme had evaluation built into the project: 6.7% of the total Schools site budget » Allow 5–10% of total project budget for user testing => videos
• 90. Video clips • Moving Here key ideas, not lesson plans etc: http://www.vimeo.com/18888798 • Lesson starter: http://www.vimeo.com/18892401 • Time saver: http://www.vimeo.com/18867252
  • 91. User test early Testing one user early on in the project… …is better than testing 50 near the end
  • 92. Two usability testing techniques “Get it” testing - do they understand the purpose, how it works, etc Key task testing - ask the user to do something, watch how well they do Ideally, do a bit of each, in that order
  • 93.
• 94. User testing – who should do it? • The worst person to conduct (or interpret) user testing of your own site is… – you! • Beware of hearing what you want to hear… • Useful to have an external viewpoint • First 5 mins in a genuine setting tells you 80% of what’s wrong with the site
  • 95. Strengths and weaknesses of different data gathering techniques
  • 96. Data gathering techniques User testing - early in development and again near end Online questionnaires – emailed to people or linked from website Focus groups - best near beginning of project, or at redevelopment stage Visitor surveys - link online and real visits Web stats - useful for long term trends /events etc
  • 97. Need to distinguish between: Diagnostics – making a project or service better Reporting – to funders, or for advocacy
  • 98. Online questionnaires (+) once set up they gather numerical and qualitative data with no further effort – given time can build up large datasets (+) the datasets can be easily exported and manipulated, can be sampled at various times, and structured queries can yield useful results (–) respondents are self-selected and this will skew results – best to compare with similar data from other sources, like visitor surveys (–) the number and nature of responses may depend on how the online questionnaire is displayed and promoted on the website
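To make the point about exporting and querying questionnaire datasets concrete, the sketch below shows how an exported file might be sampled and summarised with a short script. It is illustrative only: the file name and the column names ('submitted', 'age_group', 'satisfaction', 'comments') are hypothetical and would need to match whatever your survey tool actually exports.

import pandas as pd

# Load the exported responses; parse the submission timestamp as a date.
responses = pd.read_csv("questionnaire_export.csv", parse_dates=["submitted"])

# Sample the dataset at different periods, e.g. before and after a site change.
before = responses[responses["submitted"] < "2013-01-01"]
after = responses[responses["submitted"] >= "2013-01-01"]

# A simple structured query: average satisfaction score by age group, per period.
print(before.groupby("age_group")["satisfaction"].mean())
print(after.groupby("age_group")["satisfaction"].mean())

# Pull the free-text answers out separately for qualitative review.
after["comments"].dropna().to_csv("comments_for_review.csv", index=False)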
  • 99. Focus groups (+) can explore specific issues in more depth, yielding rich feedback (+) possible to control participant composition to ensure representative (–) comparatively time-consuming (expensive) to organise and analyse (–) yield qualitative data only - small numbers mean numerical comparisons are unreliable
  • 100. Visitor surveys (+) possible to control participant composition to ensure representative (–) comparatively time-consuming (expensive) to organise and analyse (–) responses can be affected by various factors including interviewer, weather on the day, day of the week, etc, reducing validity of numerical comparisons between museums
  • 101. Web stats (+) Easy to gather data – can decide what to do with it later (+) Person-independent data generated - it is the interpretation, rather than the data themselves, which is subjective. This means others can review the same data and verify or amend initial conclusions reached
  • 102. Web stats (–) Different systems generate different data for the same web activity – for example no of unique visits measured via Google Analytics is generally lower than that derived via server log files (–) Metrics are complicated and require specialist knowledge to appreciate them fully
  • 103. Web stats (–) As the amount of off-website web activity increases (e.g. Web 2.0 style interactions) the validity of website stats decreases, especially for reporting purposes, but also for diagnostics (–) Agreeing a common format for presentation of data and analysis requires collaborative working to be meaningful
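To illustrate why different systems report different figures for the same activity, consider what a raw server log actually contains. The sketch below is a deliberately crude way of counting 'unique visitors' per day from an Apache/Nginx combined-format access log (the file name is hypothetical): it counts every request source unless the user agent looks like a bot, whereas Google Analytics only counts browsers that run its JavaScript tracker – one reason log-derived figures are usually higher.

import re
from collections import defaultdict

# Matches the start of a combined-format log line, e.g.
# 127.0.0.1 - - [10/Mar/2013:06:25:24 +0000] "GET /page HTTP/1.1" 200 1234 "referer" "user agent"
LINE = re.compile(
    r'^(\S+) \S+ \S+ \[(\d{2}/\w{3}/\d{4}):[^\]]*\] "[^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

visitors_per_day = defaultdict(set)

with open("access.log") as log:
    for line in log:
        match = LINE.match(line)
        if not match:
            continue
        ip, day, user_agent = match.groups()
        if "bot" in user_agent.lower() or "spider" in user_agent.lower():
            continue  # very rough bot filter - real log analysers do far more than this
        visitors_per_day[day].add((ip, user_agent))  # crude proxy for a 'unique visitor'

for day, visitors in visitors_per_day.items():
    print(day, len(visitors))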
  • 104. More information / advice / ideas Martin Bazley 0780 3580 737 www.martinbazley.com
  • 105. SCA guidance http://sca.jiscinvolve.org/wp/audience-publications/ Good overview Step by step approach Culture 24 Let’s Get Real http://weareculture24.org.uk/projects/action-research/
• 107. Crit room protocol Simulating user testing – usually one-to-one in a quiet room No one (especially site stakeholders) other than the tester says anything for the first part of the session In this simulation we will focus on: - Look and feel of the site - Usability - Content
  • 108. More information / advice / ideas Happy to help - phone number on site: Martin Bazley 0780 3580 737 www.martinbazley.com

Editor's notes

  1. The Moving Here site, launched in 2003, is the product of collaboration between 30 local, regional and national archives, museums and libraries across the UK, headed by the National Archives. The site explores, records and illustrates why people came to England over the last 200 years, and what their experiences were and continue to be. It holds a database of on-line versions of 200,000 original documents and images recording migration history, all free to access for personal and educational use. The documents include photographs, personal papers, government documents, maps, and images of art objects, as well as a collection of sound recordings and video clips, all accessible through a search facility. The site also includes a Migration Histories section focusing on four communities – Caribbean, Irish, Jewish and South Asian – as well as a gallery of selected images from the collection, a section about tracing family roots, and a Stories section allowing users to submit stories and photographs about their own experiences of migration to England. The site was funded by the BLF (Big Lottery Fund). The Moving Here Schools site is a subsection of the greater Moving Here site, and was designed during a second phase (2005-07) of the Moving Here project. One aim of this phase is to ensure that stories of migration history are passed down to younger generations through schools. The Schools section therefore focuses on History, Citizenship and Geography for Key Stages 2 and 3 of the National Curriculum (ages 8 to 14), and includes four modules: The Victorians, Britain Since 1948, The Holocaust, and People and Places. Designed for use with an interactive whiteboard, the resources on Moving Here Schools include images and documents, audio and video clips, downloadable activity sheets, on-line interactive activities, a gallery of images, and links to stories of immigration experiences that have been collected by the Community Partnerships strand of the project. Funding for the Schools section is provided by HLF (Heritage Lottery Fund). The Schools site launches in March 2007 as a new section of the Moving Here site. Most on-line resource testing involves a potential user being observed in an environment chosen and closely controlled by an evaluator who guides the user through pre-set questions and assigned actions. Although this method of assessment can be useful in addressing top level issues and can give insight into some of the design, navigation and content changes that may need to be made, it does not replicate the conditions under which the Web site would normally be used. In this presentation we contend that the primary aim of testing Web sites for use in schools should be to capture feedback, not only on the usability, overall design, content and other aspects of the Web site itself, but also on the ways in which the Web site supports, or hinders, enjoyment and learning on the part of teachers and students in a real classroom environment. A range of issues, including group dynamics in the classroom, teachers' prior experience of using an electronic whiteboard, their background knowledge of the subject, and other issues, can all have an impact on the overall experience. The feedback generated through 'real life testing' – or 'habitat testing' – is rich and highly informative for refining the site to improve the end user experience – and it need not be expensive or overly time consuming. 
In this presentation we will discuss the expectations, challenges and opportunities that arose during 25 in-class testing sessions undertaken as part of our evaluation programme. We also make suggestions about how this type of testing can enhance Web site evaluation programmes, not only with regard to gaining feedback on specific resources, but also as an awareness-raising experience for museum practitioners.
  2. A comprehensive evaluation for Moving Here Schools was built into the project from the beginning, and was allocated £20,000 of the entire project's budget, which came to approximately 6.7 % of the £300,000 budget that was allotted to the Schools portion of the project (or 1.7 % of the total project budget of £1.2 million). The evaluation programme included two distinct phases: a period of preliminary testing sessions, during which teachers participated in conventional user-testing, and a period of in-class testing, during which teachers used the Moving Here Schools site directly with their pupils in their own classrooms. In planning the evaluation process, the team felt that a combination of methodologies might produce more fruitful results than just one methodology alone. The same concept is suggested in Haley Goldman and Bendoly's study of heuristic evaluation used in tandem with other types of evaluation to test museum Web sites (2003). Six teachers from the London area formed the evaluation team, selected and partially supported by the LGFL (London Grid for Learning). Four of them participated in both the preliminary testing sessions and the in-class sessions. Two preliminary testing sessions were held at the National Archives in February and June 2006 to review the first and second drafts of the Schools site. The Education Resources Manager led the sessions and two other members of the Moving Here team participated, mainly recording information. The user environment (the ICT suite at the National Archives), as well as the methodology, was based on conventional user testing, bringing the testers into a closed environment and observing them as they interacted with the draft site. For each of the two observation sessions, the teachers spent a full day looking at the draft versions of the module they had been assigned and commenting on navigation, design, subject coverage, style, tone and other elements of the site, observed by three Moving Here team members. They also participated in a group discussion at the end of the day. Their feedback was written into reports and used to make improvements to the site. These preliminary testing sessions proved invaluable to improving the quality and usability of the site. Between the February session and the June session, major changes were made as a direct result of teacher feedback, substantially improving the site. The main changes made were to reorganize the materials into shorter, blockier lessons rather than longer, linear lessons; to merge two related lessons into one; to shorten most lessons and lesson pages; and to redevelop some of the interactive activity specifications. In the June session teachers noted that their input had been followed up and commented favourably upon the fact that their feedback had been used to improve the site (one teacher said that even though she had been asked for her opinion about Web sites before, she had never seen her suggestions put into practice before this particular project). After the June session, changes were again made to the site, including design modifications, changes to interactive activities that had already been programmed, and a few more changes (including additions) to content. Although this amount of testing could be considered enough – especially since it yielded such fruitful results – Moving Here also included in-class testing as part of its programme. 
This approach incorporates the advantages of 'ordinary' user testing, but builds on it by taking account of the social dynamics and practical problems that influence the use of the site, so as to ensure the site is usable in the classroom – as opposed to ensuring that it is usable in a controlled user testing environment. This unique addition to the evaluation programme proved even more useful than the conventional user testing. We were hired as the evaluation team to carry out the in-class testing programme. In classroom testing sessions, teachers were observed, at their schools, using the Moving Here site with their students. The schools included two primary schools in the borough of Newham, London, a secondary school in Bethnal Green, London, and a secondary school in Peckham, London – all neighbourhoods with culturally diverse communities, a high proportion of immigrants and a large number of people whose first language is not English. Four of the original six teachers were involved, and the evaluation team went into their classrooms 25 times between October and December 2006 – 5 times per teacher, with one teacher doing a double load and testing two modules instead of a single one, for a total of 10 sessions. The teachers were paid £300 each for five sessions of in-class testing, working out at £60 per session. This closely follows the standard amount of between $50 and $100 US for one session, as suggested by Steve Krug in his seminal work on user testing, 'Don't Make Me Think!'(2000). The teacher who tested two modules received double payment. The total spent on teachers was £1500. The evaluation team was paid approximately £18,500 for the 5 sets of observations and session reports, plus a final report. The evaluation team met with each of the participating teachers in advance to agree which lessons to use for the in-classroom evaluation, to brief the teachers on what was required during the session and to agree on the procedure for arriving during the school day. In consultation with the Education Resources Manager, the evaluators produced an evaluation plan with a set of questions to ask each teacher to make sure they had covered all areas and issues, plus an in-class observation checklist. The questions covered learning outcomes, tone, length of lesson, order of pages, images, design, navigation, accessibility, activity sheet issues, issues with interactives, children's engagement, and other improvements that teachers thought might be useful. The teachers submitted lesson plans with intended learning outcomes for each session according to the National Curriculum. Immediately after each session a written session report was sent to the Education Resources Manager, who used the findings in each report to implement changes to the site while the series of observations were still going on. In some cases, changes requested by a teacher were in place in time for the next observation.
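The budget proportions quoted above can be checked with a few lines of arithmetic. This is a minimal sketch that simply restates the figures given in these notes:

# Figures as quoted in these notes.
evaluation_budget = 20000        # £ allocated to evaluation
schools_budget = 300000          # £ allocated to the Schools site
total_project_budget = 1200000   # £ for the whole Moving Here project

print("Share of Schools budget: {:.1%}".format(evaluation_budget / schools_budget))      # ~6.7%
print("Share of total budget: {:.1%}".format(evaluation_budget / total_project_budget))  # ~1.7%

# Teacher payments: £300 per teacher for five sessions; one of the four teachers did a double load.
per_teacher = 300
print("Per session:", per_teacher / 5)                               # £60, in line with Krug's $50-100 guide
print("Total paid to teachers:", 3 * per_teacher + 2 * per_teacher)  # 3 single loads + 1 double = £1500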
4. Besides the elements that originated from the site and its contents, the environment had a significant impact on how the site was used. For example, various disturbances in the classroom (excess noise, students coming in late, interruptions at the door, etc), as well as logistical issues (time taken to turn on laptops and log in, time taken to log in at the ICT suite, time taken to find the correct Web site and the page within the Web site, difficulties with saving documents to students' folders and printing them out) all affected how well students worked with the Web site. As an example, a class of year 7 students became overly excited when visiting the ICT suite for the first time with the teacher, could not concentrate on the activity because there were other classes in the ICT suite, and was unable to access a video because the school's firewall blocked it. This resulted in a relatively uneven learning experience, but also was instrumental in indicating to the evaluators which of the activities that had been attempted during that class were the most engaging, and would therefore be the most likely to hold students' attention during periods of high disturbance. Another important element that the evaluators were able to capture only in a classroom setting was how students worked with each other. The class dynamic within the different groups contributed to how much the students learned, and while this issue will affect not only the group's learning from a specific Web site but also how well they will learn in all of their classroom endeavours, it is important to note how it affected the testing. For example, some groups worked extremely well together on an activity sheet, but this may have been due not only to the intrinsic interest they took in the activity but also to external issues (threats of detention if they talked too much, possibility of bad marks if they did not complete the activity, incentive of a free lunch to the group that handed in the best activity sheet). Interestingly, the evaluators' presence did not seem to distract students. Initially, the evaluators thought that their presence, even sitting discreetly at the back of the room, might cause students to react differently than they might normally have done. However, during the in-class testing sessions, evaluators found that their presence was either ignored or considered normal by the children. Reasons for this might be that students are accustomed to having more than one adult in the classroom at a time – teaching assistants might be a constant presence, other teachers might interrupt the class, and OFSTED inspectors (the Office for Standards in Education, the UK's official school inspection system) might visit classes to conduct inspections.
6. The in-class testing was useful in picking up elements that were not, and would not have been, flagged in the conventional user testing scenario. Even though the testers were the same teachers who had participated in the two preliminary user testing sessions, they did not pick up on some of the elements that needed to be fixed or changed until they were actually using the site in the classroom. It was only when practical implications became apparent that they noticed these items needed to be changed. All of the following issues had been present during the conventional user testing sessions but had not been singled out as needing modification until the in-class testing sessions:
- content: when they read the text out loud or asked students to read the text on the pages, teachers realized that the tone of some of the text was too difficult or complex, even though it had seemed fine when it was read on the screen
- images: teachers realized that some of the images they had seen on lesson pages were not actually useful or pertinent to their teaching, and so should be removed or moved to different pages
- activity sheets: activity sheets did not have spaces for students to put their names, which caused confusion when they were printing out their work – something teachers hadn't noticed when looking at the content of the sheets
- interactive activities: although they took up a fairly large amount of screen space when they were being viewed by a single user on a single screen, interactive activities were too small for some children to see from the back of the class and needed to be expandable to full-screen size
- navigation: the breadcrumb trail needed to go down one more level in order for teachers and students to immediately recognize where they were within the site
- navigation: the previous/next buttons and the page numbers only appeared at the bottom of the screen, sometimes after the 'fold line', which made it difficult for users to know how to get from one page to the others
In fact, sometimes the issues that came up during classroom testing directly opposed what teachers had told us in initial user testing sessions. For example, during user testing teachers had said that the breadcrumb trails were easy to use and helped with navigation, but when they began using the site in the classroom, they found that this was not always the case. Teachers needed to try out the site with their pupils to see what really worked.
  7. In other words, if you are observing in a specific class with, say, low ability or poor levels of English, equipment not working, behaviour issues, and so on, how can you be sure your results are as reliable as those obtained in a 'neutral' environment? First, there is no such thing as a neutral environment, and any test will be subjective, not least because of the particular interests and abilities of the subjects themselves. Secondly, and more importantly, the overall aim of this testing is to ensure the Web site works well in classrooms, and this means seeing the effect that factors like those mentioned above have on the way the Web site is used. Although ideally one would test in more than one classroom (as in this project), just one session in one classroom, however unique the setting might be, reveals more about the required changes than one session in a neutral environment, because the social dynamics and educational imperatives are simply not there to be observed in neutral surroundings.
9. If you are using an external Web developer, it is probably best to avoid getting them to do the user testing, as there is an inherent conflict of interest leading to a likelihood of minimising the changes required, and also because they are likely to focus more on the technical aspects of the site than on its effect on the teacher and pupils themselves. For the same reason, it would be preferable to get an external or unrelated evaluator for the project rather than visit a classroom yourself if you are the producer or writer of the content. It is always difficult to take criticism, and a neutral party will not have any issues surrounding ownership of the material.
  11. It's true that it can cost money to conduct user testing in a classroom – but then again, it need cost no more than conventional user testing. In conventional user testing, an acceptably budget-conscious way of conducting the testing is to have (preferably neutral) staff administer it, hand-writing the notes or using in-house recording equipment to record the user's experience, and to give the tester a small token of appreciation such as a gift voucher. In an in-class testing scenario, one person could attend a one-hour class session in a school, giving the teacher the same small token payment and taking notes or using recording equipment (taking care not to breach rules about the recording of children) to make notes of the issues uncovered. The Moving Here Schools evaluation programme was built into the project plan, but still only used 6.7% of the total spending on the Schools site. The team would recommend spending between 5 and 10% of your total project budget on user testing – especially a combination of conventional and 'real habitat' testing – and planning it into your project from the start. When taking into account the cost of not conducting effective user testing, then the cost of user testing is usually worth every penny. If you can only afford one test, do one. Krug makes the point best when he says 'Testing one user is 100 percent better than testing none. Testing always works. Even the wrong test with the wrong user will show you things you can do to improve your site' (2000).