Trials by Juries: Suggested Practices for Database Trials
Annis Lee Adams, Golden Gate University
Jon Ritterbush, University of Nebraska - Kearney
Christine E. Ryan, Clemson University
Tuesday, April 3, 2012
This panel discussion topic began with an innocent question on the ERIL-L listserv in September 2011:

"What tools or techniques have worked for you in gathering feedback on database trials, whether from librarians or library users?"
Annis Lee Adams, Golden Gate University
Does the content directly support our programs?

Does the platform allow for IP and proxy server authentication?

Our FY budget must be completed by March, so we try to trial things in the fall and winter, but will trial all year long.
Collection Development Librarians

Faculty, as appropriate. Note: we only involve faculty if we know we can afford the product.

Email announcement to CD librarians, who can forward to their faculty. Discussion at CD meetings (which are held twice monthly).
Content

Ease of use/functionality

Cost

Did a faculty member request it?
Internal Blog on WordPress
   Information about how to access the trial
   Cost information
   Description of product(s)
   Evaluators comment on why or why not to acquire
After the trial ends, change the blog entry to “draft” to keep it for historical records.
Spreadsheet to track all product investigations, decisions made, and why.
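As an illustration of such a tracking spreadsheet, here is a minimal sketch in Python that appends one row per investigated product to a shared CSV file. The file name, column names, and sample values are hypothetical, not the actual GGU spreadsheet.

import csv
import os
from datetime import date

# Hypothetical file name and columns -- adapt to local practice.
TRACKING_FILE = "product_investigations.csv"
COLUMNS = ["date", "product", "vendor", "requested_by",
           "trial_dates", "decision", "rationale"]

def log_decision(row):
    """Append one product investigation to the shared tracking spreadsheet."""
    write_header = not os.path.exists(TRACKING_FILE)
    with open(TRACKING_FILE, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        if write_header:
            writer.writeheader()
        writer.writerow(row)

log_decision({
    "date": date.today().isoformat(),
    "product": "Example Business Database",      # hypothetical entry
    "vendor": "Example Vendor",
    "requested_by": "CD librarian (Business)",
    "trial_dates": "2011-10-01 to 2011-10-31",
    "decision": "declined",
    "rationale": "content overlaps existing holdings; exceeds FY budget",
})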
Jon Ritterbush, University of Nebraska-Kearney
A. What criteria apply to selecting product trials?
B. How are trials scheduled?
C. How is trial feedback solicited and recorded?
A. What criteria apply to selecting product trials?
   1. Would this product meet the standards of our E-Resource Collection Development Policy?
         IP access?
         Support the curriculum?
   2. Can we realistically afford this product in the near future?
         Yes – continue with consideration of a trial
         No – may postpone an official trial, or proceed while being fully honest with the vendor
A. What criteria apply to selecting product trials?
   3. Does this trial have a librarian requestor/sponsor?
        Trials requested by students or faculty must be sponsored by the liaison librarian for that department or subject area.
        If the requestor/sponsor is not the ER Librarian, the sponsoring librarian should also evaluate the trial product and solicit feedback from other faculty in their liaison area.
        Requiring a librarian sponsor may help deflect vendor-initiated trials.
B. How are trials scheduled?
   1. Scheduled between Sept.–Nov. and Feb.–Apr. to capture maximum participation by faculty.
   2. No more than three trials scheduled simultaneously, to avoid “trial fatigue.”
   3. Only extended trials (>3 months) are advertised on the library’s blog and website. Most trials (<1 month) are kept internal or shared with faculty by email.
C. How is trial feedback solicited and recorded?
   1. Email was a mediocre solution –
        Responses from non-librarians were often less descriptive as to how the trial database might be used.
        Time-consuming to compile into reports.
   2. Short web forms have worked better –
        Use a mix of Likert-scale and free-text questions.
        UNK uses Qualtrics. LibGuides survey boxes, SurveyMonkey, PollDaddy, and Google Forms were other web survey tools suggested on ERIL-L.
C. How is trial feedback solicited and recorded?
   3. Advertising the trial
        Post trial information to an internal library blog, with a sample “press release” that can be copied and pasted into emails to specific faculty.
        Arrange for a vendor webinar or live demonstration of the product for librarians and faculty.
        For extended trials, post information to faculty and/or student listservs, the library’s website, and the public blog.
C. How is trial feedback solicited and recorded?
   4. Sharing responses publicly?
        PRO: Allows participants to see what others think before/after their trial experience (and perhaps respond).
        CON: May discourage participants from sharing “frank” comments.
        At UNK, we’re erring on the side of privacy and keeping survey responses internal, with no identifiers required.
Sample survey at bit.ly/y33F7D
Trial Survey Questions
   How relevant is this database's content to your research and/or instruction at UNK?* (Highly Relevant / Somewhat Relevant / Limited Relevance / Not Relevant)
   In which courses -- or areas of research or teaching -- would this database help you?* (Free-text)
   How easy was it to navigate this database and access results on your search topic?* (Very Easy / Somewhat Easy / Somewhat Difficult / Very Difficult)
   Should the UNK Library acquire this database, in your opinion?* (Yes / No / Not sure)
   Additional comments? (Free-text)
Trial Survey Questions
   OPTIONAL: Would you like to be notified regarding the library's decision on this database trial? If yes, please enter your email address below:

Only if the respondent enters an email address does the survey also ask:
   May the library contact you with follow-up questions regarding your responses to this database trial? (Yes / No)
C. How is trial feedback solicited and recorded?
   5. Record keeping of trial feedback
        CSV and PDF files of survey results are saved and shared with other librarians.
        Maintain a basic spreadsheet about past/current trials and decisions.
Christine E. Ryan, Clemson University
WHY TRIAL?
  •   We need new content
  •   New content packages or technology
  •   Platform changes
  •   Consortium changes
  •   User requests



      REQUESTS ORIGINATE EVERYWHERE, BUT……
WHO DECIDES TO PROCEED WITH A TRIAL?
  • Committee
  • Subject liaison
  • Other?




    SO YOU WANT TO TRIAL, NOW WHAT? ………
WHO COORDINATES THE TRIAL?
• Beginning: what, why
• End: yes/no
• Everything in between
  • When to trial
  • Who to trial with
  • How to…………
HOW TO ACCESS
  • Standing trial web page
    • Highly visible
    • Moderately visible
    • Hidden
  • Subject guide or LibGuide
ADVERTISING/PROMOTING THE TRIAL
  • General campus email (faculty/staff)
  • General student email
    • Student organizations
  • Subject liaison contacts
    •   Emails to departments
    •   Attend department meetings
    •   Embedded librarians
    •   Course-specific announcements (Blackboard, library instruction)
  • Web page, printed announcements
GATHERING/ANALYZING FEEDBACK
  • Gathering Feedback
    • Surveys
      • How to access?
    • Solicitation during conversation, email, focus group training
    • Usage stats
  • Analyzing
    • Analyzing Likert Scale responses
    • Compiling free text responses
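For the analysis step, here is a minimal sketch in Python of tallying Likert-scale answers and compiling free-text comments from a survey export. The file name and column headers are hypothetical (loosely modeled on the UNK sample questions above), not any specific survey tool's export format.

import csv
from collections import Counter

# Hypothetical column headers for the exported survey responses.
LIKERT_COLUMNS = ["relevance", "ease_of_navigation", "should_acquire"]
FREE_TEXT_COLUMNS = ["courses_or_research_areas", "additional_comments"]

with open("trial_survey_export.csv", newline="") as f:   # hypothetical export file
    responses = list(csv.DictReader(f))

# Tally the Likert-scale questions.
for column in LIKERT_COLUMNS:
    counts = Counter(r[column] for r in responses if r.get(column))
    print(f"\n{column}")
    for answer, n in counts.most_common():
        print(f"  {answer}: {n}")

# Compile free-text answers into one list per question for the report.
for column in FREE_TEXT_COLUMNS:
    print(f"\n{column}")
    for r in responses:
        text = (r.get(column) or "").strip()
        if text:
            print(f"  - {text}")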
WHO DECIDES?
• Committee
• Subject liaison
CLOSING THE LOOP
• Participants
• Vendors
Fischer, Christine. 2007. "Group Therapy—Database Trials." Against the Grain 19, no. 6: 65-66.

Street, Chrissy. 2010. "Getting the Most from a Database Trial." Legal Information Management 10, no. 2: 147-48.
Annis Lee Adams,
Golden Gate University
ladams@ggu.edu

Jon Ritterbush,
University of Nebraska – Kearney
ritterbushjr@unk.edu Twitter: @loperlibrarian

Christine E. Ryan,
Clemson University
ryanchr@clemson.edu
