Apples and Oranges: Lessons From a Usability Study of Two Library FAQ Web Sites
1. Apples and Oranges: Lessons From a Usability Study of Two Library FAQ Web Sites
Susan [Gardner] Archambault
Kenneth Simon
2. Loyola Marymount University
• Private Catholic university in Los Angeles, California
• 5,900+ undergraduates and 1,900+ graduate students
• William H. Hannon Library Information Desk open 24/5
3. Research Question
• What is the most effective way to provide access to our Library FAQs?
• A comparison of two products, How Do I? and LibAnswers: Which features do students prefer, and which features lead to better performance?
8. Methodology
• Conducted usability testing on 20 undergraduate students at LMU
• Population equally represented each class (freshmen through seniors), with a 60:40 ratio of females to males
9. Methodology
• Used a combination of the Performance Test methodology and the Think-Aloud methodology
10. Methodology
• Students were given 10 performance tasks to complete at a computer twice: once using LibAnswers as the starting point, and once using How Do I?
• After each performance task, students completed a questionnaire measuring satisfaction with the site
11. Performance Task Questions
• How to print in the library from a laptop
• How long can a graduate student check out a book
• Where are the library copy machines
• How to request a book from basement storage
• Can a Loyola law school student reserve a group study room in advance
• How to request a research consultation
• How to search for a book by the author’s name
• How to tell what books are on reserve for a class
• Where to access CRSPSift software in the library
• How much does it cost for an undergrad to request a magazine article from another library
14. Additional Questions
• How likely would you be to use each page again?
• What was your favorite aspect of each site?
• What was your least favorite aspect?
• Overall, do you prefer LibAnswers or How Do I?
15. Performance Scoring: Speed
• Start the clock when the person begins searching for the answer to a new question on the home page of the site they are testing (a timing sketch follows below)
• Stop the clock when they copy the URL with the answer
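A minimal sketch of how this stopwatch measure could be captured in Python. The helper and task label below are our own illustration; the slides imply the observer timed participants manually, not with a script.

```python
import time

def time_task(task_label: str) -> float:
    """Time one trial: from the first search action on the site's home
    page until the participant copies the URL holding the answer.
    Illustrative only -- the study's score sheet implies manual timing."""
    input(f"Press Enter when the participant starts '{task_label}'... ")
    start = time.monotonic()
    input("Press Enter when they copy the answer URL... ")
    return round(time.monotonic() - start, 1)

# Example: seconds = time_task("How to print in the library from a laptop")
```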
16. Performance Scoring: Accuracy
Was the answer… (check off the one that applies; see the tally sketch below):
• Completely accurate: found the answer
• On the correct path to the information, but did not go far enough or took a wrong subsequent path
• On the correct page, but did not see the answer (supersedes everything else they tried on other attempts to answer)
• Pointed to a related question under the correct category, but incorrect page
• Incorrect and off topic
• Gave up: never found an answer
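Because the six outcomes are mutually exclusive, each trial gets exactly one code, and the results table on slide 20 is just a tally of those codes. A minimal sketch, with labels following the slide; the data structure is our own:

```python
from collections import Counter

# The six mutually exclusive accuracy outcomes from the score sheet.
ACCURACY_CODES = [
    "completely accurate",
    "correct path, did not go far enough / wrong subsequent path",
    "correct page, did not see the answer",
    "related question, correct category, incorrect page",
    "incorrect and off topic",
    "gave up",
]

def tally(scores: list) -> dict:
    """scores: one accuracy code per trial (20 students x 10 tasks = 200).
    Returns {code: (count, share of trials)} for reporting."""
    counts = Counter(scores)
    return {code: (counts[code], counts[code] / len(scores))
            for code in ACCURACY_CODES}
```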
17. Performance Scoring: Efficiency
• Count the number of times the person made a new attempt, or started down a new path, by returning to the home page *after* a previous attempt away from or on the home page failed (see the counting sketch below)
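In other words, efficiency counts fresh attempts from the home page after a failure, not raw clicks. A sketch, assuming a simplified log of page visits per trial; the event format is invented, and failed attempts that never leave the home page (which the rubric also counts) are not captured by this log:

```python
def count_new_attempts(events: list) -> int:
    """events: ordered page visits for one trial, e.g.
    ["home", "printing", "home", "laptops", "answer"].
    Each return to "home" after leaving it counts as one new attempt;
    the initial search from the home page does not."""
    attempts = 0
    left_home = False
    for page in events:
        if page == "home" and left_home:
            attempts += 1          # came back to start over
            left_home = False
        elif page != "home":
            left_home = True
    return attempts

# Matches the sample scoring video: one wrong path, then the answer.
assert count_new_attempts(["home", "printing", "home", "laptops", "answer"]) == 1
```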
18. Sample Scoring Video
bit.ly/usabilityvideo
Site         Speed        Accuracy              Efficiency
How Do I?    46 seconds   Completely accurate   +1 (clicked 1 wrong path)
LibAnswers   36 seconds   Completely accurate   +1 (clicked 1 wrong path)
19. Performance Results
Speed, average (seconds):        LibAnswers 40.55    How Do I? 33.90
Efficiency, total wrong paths:   LibAnswers 30       How Do I? 40
20. Performance Results
Accuracy                                                    LibAnswers   How Do I?
Completely accurate                                         182 (91%)    175 (87.5%)
Correct path, but did not go far enough or took a
  wrong subsequent path                                     5 (2.5%)     15 (7.5%)
Correct page, but did not see the answer                    3 (1.5%)     3 (1.5%)
Pointed to a related question under the correct
  category, but incorrect page                              6 (3%)       3 (1.5%)
Incorrect and off-topic                                     0            3 (1.5%)
Gave up: never found an answer                              4 (2%)       1 (0.5%)
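The percentages in the results tables follow from the trial count: 20 students each attempted 10 tasks per site, so each column is out of 200 trials (which is also why "gave up" on How Do I? is 1/200 = 0.5%). A quick check:

```python
TRIALS = 20 * 10  # 20 students x 10 tasks = 200 trials per site

for site, accurate in {"LibAnswers": 182, "How Do I?": 175}.items():
    print(f"{site}: {accurate}/{TRIALS} = {accurate / TRIALS:.1%}")
# LibAnswers: 182/200 = 91.0%
# How Do I?: 175/200 = 87.5%
```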
21. LibAnswers Features Used
Feature                   Number Who Used (of 20)   Percent
Search Box                16                        80%
Auto-Suggest              12                        60%
Popular Answers           9                         45%
Tag Cloud                 8                         40%
Related Questions         4                         20%
Change Topic Drop-down    2                         10%
Recent Answers            2                         10%
22. Satisfaction
Likely to use again   Very unlikely   Unlikely   Undecided   Likely    Very likely
LibAnswers            0               3 (15%)    5 (25%)     5 (25%)   7 (35%)
How Do I?             0               3 (15%)    3 (15%)     5 (25%)   9 (45%)
24. Patterns
• Overall, 9 of 20 students performed worse with the site they said they preferred.
• 4 of 5 freshmen performed worse with the site they said they preferred; upperclassmen were more consistent.
• Females tended to perform better with their preferred site; males did not.
• 75% of the males preferred How Do I? over LibAnswers, while females were evenly divided.
25. LibAnswers
Likes:
• Keyword search “like a search engine”
• Autosuggest in the search bar
• Popular topics list
• Friendly / pleasant to use
• Don’t have to read through categories
Dislikes:
• Overwhelming / cluttered interface
• Long list of specific questions, but hard to find the info you want
• Less efficient than the “How Do I” page
• Once you do a search, you lose your original question
• Autosuggestions are ambiguous or too broad, and sometimes don’t function properly
26. How Do I?
Likes:
• Fast / efficient to use
• Everything is right there in front of you: “I don’t have to type, just click”
• Simple, clearly laid out categories
• Organized and clean looking
Dislikes:
• Less efficient than the LibAnswers page: have to read a lot
• Too restricted: needs a search box
• Have to guess a category to decide where to look
• Limited number of too-broad questions
• Boring / basic appearance
27. Sharing results with Springshare
• Retain the question asked on the search results screen.
• Add stopwords to search, so typing “How do I” doesn’t trigger a long drop-down of irrelevant suggestions, and “Where is” and “where are” return the same results (see the sketch below).
• Remove “related LibGuides” content to reduce clutter.
• Control the list of “related questions” below an answer: they seem to be based only on the first topic assigned to a given question.
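A rough sketch of the stopword behavior being requested; this is our own illustration of the idea, not Springshare's implementation. Leading question phrases carry no topical signal, so stripping them before matching should keep “How do I” from flooding the drop-down and make “Where is” / “where are” queries equivalent:

```python
# Hypothetical leading phrases to ignore when matching suggestions.
STOP_PHRASES = ("how do i", "how to", "where is", "where are", "can i")

def normalize_query(query: str) -> str:
    """Strip one leading stop-phrase so suggestions match on the
    topical words rather than on the question scaffolding."""
    q = query.lower().strip()
    for phrase in STOP_PHRASES:
        if q.startswith(phrase + " "):
            return q[len(phrase):].strip()
    return q

assert normalize_query("How do I print from a laptop?") == "print from a laptop?"
assert normalize_query("Where are the copy machines") == normalize_query(
    "Where is the copy machines")
```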
33. Conclusions
• Ended up with a balance between two extremes rather than one or the other.
• Think-aloud method: gave up control, so no preconceived ideas could influence the outcome.
• Sitting in silence watching the participants made them nervous; next time, maybe leave the room and run a self-guided test.
• Efficiency is difficult to measure: moved away from counting clicks.
35. Bibliography
• Ericsson, K. A., & Simon, H. A. (1980). Verbal reports as data. Psychological Review, 87(3), 215-251.
• Norlin, E. (2002). Usability Testing for Library Web Sites: A Hands-On Guide. Chicago: American Library Association.
• Porter, J. (2003). Testing the three-click rule. Retrieved from http://www.uie.com/articles/three_click_rule/
• Smith, A., Magner, B., & Phelan, P. (2008, Nov. 20). Think aloud protocol part 2 [Video]. Retrieved May 3, 2012, from http://www.youtube.com/watch?v=dyQ_rtylJ3c&feature=related
• Willis, G. B. (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage Publications.
36. Additional Information
Presentation Slides: bit.ly/gardnersimon
Contact Us:
Ken Simon, Reference & Instruction Technologies Librarian, Loyola Marymount University
Twitter: @ksimon | Email: ksimon@me.com
Susan [Gardner] Archambault, Head of Reference & Instruction, Loyola Marymount University
Twitter: @susanLMU | Email: susan.gardner@lmu.edu