This panel discussion focused on tools and techniques for gathering feedback on database trials from librarians and library users. The panelists from Golden Gate University, University of Nebraska-Kearney, and Clemson University discussed criteria for selecting database trials, scheduling trials, soliciting and recording trial feedback, and closing the loop with participants and vendors after a trial. Key points included using web surveys to gather feedback, timing trials to maximize participation, and maintaining records of past trials and decisions.
Trials by Juries: Suggested Practices for Database Trials
1. Annis Lee Adams, Golden Gate University
Jon Ritterbush, University of Nebraska - Kearney
Christine E. Ryan, Clemson University
Tuesday, April 3, 2012
2. This panel discussion topic began with an
innocent question on the ERIL-L listserv in
September 2011:
What tools or techniques have worked for you
in gathering feedback on database trials,
whether from librarians or library users?
4. Does the content directly support our
programs?
Does the platform allow for IP and proxy server
authentication?
Our FY budget must be completed by March, so
we try to schedule trials in the fall and winter,
but we will run trials all year long.
5. Collection Development Librarians
Faculty, as appropriate. Note: we only involve
faculty if we know we can afford the product.
Email announcement to CD librarians, who can
forward to their faculty. Discussion at CD
meetings (which are held twice monthly).
7. Internal Blog on WordPress
Information about how to access trial
Cost information
Description of product(s)
Evaluators comment on why the product should or should not be acquired
After the trial ends, change the blog entry to “draft” to
keep it for historical records
A spreadsheet tracks all product investigations, the
decisions made, and why (a sketch of one possible layout follows).
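A minimal sketch of what such a tracking spreadsheet could look like, assuming hypothetical column names (the panel does not prescribe a layout), using Python's csv module to append each investigation to a shared file:

```python
import csv
from datetime import date

# Hypothetical columns; the panel only notes that products, decisions,
# and the reasoning behind them should be recorded.
FIELDNAMES = ["product", "vendor", "trial_start", "trial_end",
              "decision", "rationale"]

def log_trial(path, **row):
    """Append one product investigation to the tracking spreadsheet (CSV)."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
        if f.tell() == 0:  # brand-new file: write the header row first
            writer.writeheader()
        writer.writerow(row)

# Hypothetical example entry
log_trial("trials.csv",
          product="Example Database",
          vendor="Example Vendor",
          trial_start=date(2012, 2, 1).isoformat(),
          trial_end=date(2012, 2, 29).isoformat(),
          decision="declined",
          rationale="content overlaps existing subscriptions")
```

Any shared spreadsheet works equally well; the point is one running log of what was investigated, what was decided, and why.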
12. A. What criteria apply to selecting product trials?
B. How are trials scheduled?
C. How is trial feedback solicited and recorded?
13. A. What criteria apply to selecting product trials?
1. Would this product meet the standards of our E-Resource Collection Development Policy?
IP access?
Support the curriculum?
2. Can we realistically afford this product in the near
future?
Yes – continue with consideration of a trial
No – we may postpone an official trial, or proceed
while being fully honest with the vendor
14. A. What criteria apply to selecting product trials?
3. Does this trial have a librarian requestor/sponsor?
Trials requested by students or faculty must be
sponsored by the liaison librarian for that department or
subject area.
If requestor/sponsor is not the ER Librarian, the other
librarian should also evaluate the trial product and solicit
feedback from other faculty in their liaison area.
Requiring a librarian sponsor may help deflect vendor-initiated trials.
15. B. How are trials scheduled?
1. Scheduled between Sept.–Nov. and Feb.–Apr. to
capture maximum participation by faculty.
2. No more than three trials scheduled
simultaneously to avoid “trial fatigue.”
3. Only extended trials (>3 months) are advertised on
the library’s blog and website. Most trials (<1
month) are kept internal or shared with faculty by
email.
16. C. How is trial feedback solicited and recorded?
1. Email was a mediocre solution –
Responses from non-librarians were often less
descriptive about how the trial database might be used.
Time-consuming to compile into reports.
2. Short web forms have worked better
Use a mix of Likert-scale and free-text questions
UNK uses Qualtrics; LibGuides survey boxes,
SurveyMonkey, PollDaddy, and Google Forms were other
web survey tools suggested on ERIL-L
17. C. How is trial feedback solicited and recorded?
3. Advertising the trial
Post trial information to an internal library blog, with a
sample “press release” that could be copied/pasted into
emails to specific faculty.
Arrange for a vendor’s webinar or live demonstration of
the product for librarians and faculty.
For extended trials, information could also be posted to faculty
and/or student listservs, the library's website, and the public blog.
18. C. How is trial feedback solicited and recorded?
4. Sharing responses publicly?
PRO: Allows participants to see what others think
before/after their trial experience (and perhaps respond).
CON: May discourage participants from sharing “frank”
comments.
At UNK, we’re erring on the side of privacy and keeping
survey responses internal, with no identifiers required.
20. Trial Survey Questions
How relevant is this database's content to your research and/or
instruction at UNK?* (Highly Relevant / Somewhat Relevant /
Limited Relevance / Not Relevant)
In which courses -- or areas of research or teaching -- would
this database help you?* (Free-text)
How easy was it to navigate this database and access results on
your search topic?* (Very Easy /
Somewhat Easy / Somewhat Difficult / Very Difficult)
Should the UNK Library acquire this database, in your
opinion?* (Yes / No / Not sure)
Additional comments? (Free-text)
21. Trial Survey Questions
OPTIONAL: Would you like to be notified regarding the
library's decision on this database trial? If yes, please enter
your email address below:
Only if the respondent enters an email address will the survey
also ask (this branching is sketched after this slide)…
May the library contact you with follow-up questions regarding
your responses to this database trial?
(Yes / No)
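The conditional follow-up above is ordinary survey branching; tools such as Qualtrics provide it as built-in display/skip logic rather than code. As a minimal sketch in Python, assuming responses arrive as a plain dictionary (only the question wording comes from the slide):

```python
# Sketch of the branching described above: the contact-permission question
# is shown only to respondents who supplied an email address.
def follow_up_questions(response: dict) -> list:
    questions = []
    if response.get("email"):  # an address was entered, so ask for permission
        questions.append(
            "May the library contact you with follow-up questions "
            "regarding your responses to this database trial? (Yes / No)"
        )
    return questions
```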
22. C. How is trial feedback solicited and recorded?
5. Record-keeping of trial feedback
CSV and PDF files of survey results are saved and shared
with other librarians (a tallying sketch follows this slide).
Maintain a basic spreadsheet about past/current trials
and decisions.
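A minimal sketch of condensing such a CSV export into a shareable tally, assuming hypothetical column headers (actual export formats vary by tool, whether Qualtrics, Google Forms, or another):

```python
import csv
from collections import Counter

# Hypothetical headers for the survey's Likert-scale questions; a real
# export's headers depend on how the survey tool names them.
LIKERT_COLUMNS = ["relevance", "ease_of_navigation", "acquire"]

def summarize(path):
    """Tally Likert-style answers from a trial-feedback CSV export."""
    counts = {col: Counter() for col in LIKERT_COLUMNS}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for col in LIKERT_COLUMNS:
                answer = (row.get(col) or "").strip()
                if answer:  # skip unanswered questions
                    counts[col][answer] += 1
    return counts

for question, tally in summarize("trial_responses.csv").items():
    print(question)
    for answer, n in tally.most_common():
        print(f"  {answer}: {n}")
```

Free-text comments still need to be read individually; a tally like this covers only the scaled questions.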
24. WHY TRIAL?
• We need new content
• New content packages or technology
• Platform changes
• Consortium changes
• User requests
REQUESTS ORIGINATE EVERYWHERE, BUT…
25. WHO DECIDES TO PROCEED WITH A TRIAL?
• Committee
• Subject liaison
• Other?
SO YOU WANT TO TRIAL, NOW WHAT?…
33. Fischer, Christine. 2007. "Group Therapy—Database Trials."
Against the Grain 19, no. 6: 65-66.
Street, Chrissy. 2010. "Getting the Most from a Database Trial."
Legal Information Management 10, no. 2: 147-148.
34. Annis Lee Adams,
Golden Gate University
ladams@ggu.edu
Jon Ritterbush,
University of Nebraska – Kearney
ritterbushjr@unk.edu Twitter: @loperlibrarian
Christine E. Ryan,
Clemson University
ryanchr@clemson.edu