In Search of Patterns at the Desk

"In Search of Patterns at the Desk: An Analysis of Reference Question Tracking Logs" is being presented at the 4th QQML 2012 International Conference in Limerick, Ireland.

  1. In Search of Patterns at the Desk: An Analysis of Reference Question Tracking Logs
  2. Loyola Marymount University: a private Catholic university in Los Angeles, CA, with 5,600+ undergraduates and 1,900+ graduate students. The William H. Hannon Library opened in 2009.
  3. Reference Service at LMU: a 24/5 Information Desk staffed by students, library staff, and outsourced staff. Desk encounters are recorded with the Gimlet question-tracking system. 14,210 volumes in the print Reference Collection; over 200 electronic databases.
  4. Gimlet Interface
  5. Purpose of Study: content of questions (subject, difficulty level); content of answers (characteristics of sources used, accuracy); patterns (by patron type, service provider, subject, or time); and development of a reference question tagging scheme.
  6. Methodology: content analysis of LMU reference questions from the Fall 2010/Spring 2011 academic year. From an Excel data dump, all non-reference questions and questions not asked at the Info Desk were deleted.
  7. Methodology: free-text Q&A fields were recoded into "Reference Tag," "School/College," "Subject," "Exact Source," and "Quality." The new fields were finalized after several rounds of 50-question sample calibrations and "norming sessions" by three coders.
  8. Old Reference Tags (beginning): 1. Citation Style; 2. External Web Page; 3. Known Item; 4. Reference Book; 5. Referral; 6. Reserves; 7. Retrieval; 8. Search Construction; 9. Topic Source
  9. Final Revised Reference Tags: 1. Catalog Use & Lookup; 2. Database Help & Use; 3. External Web Page; 4. Internal Web Page; 5. Reference Book (print); 6. Referral; 7. Reserves; 8. Retrieval; 9. Other
  10. School/College: 1. Business; 2. Communication & Fine Arts; 3. Education; 4. Film & Television; 5. Law; 6. Liberal Arts; 7. Science; 8. General Interest
  11. Quality: 1. Inappropriate sources recommended; 2. Incomplete; 3. Acceptable
  12. Methodology: sampled from 3,422 total questions. A random 20% sample was drawn from all questions at difficulty levels 1-3 on the READ (Reference Effort Assessment Data) Scale; all questions at levels 4-6 were included. Total sample size = 931 questions.
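The stratified design on slide 12 (a 20% random draw from READ levels 1-3, plus every question at levels 4-6) can be sketched in pandas. The column names and toy data below are hypothetical illustrations, not the study's actual data dump.

```python
import pandas as pd

def stratified_sample(df: pd.DataFrame, seed: int = 42) -> pd.DataFrame:
    """Draw the study-style sample: 20% of READ levels 1-3, all of levels 4-6."""
    easy = df[df["read_level"] <= 3].sample(frac=0.20, random_state=seed)
    hard = df[df["read_level"] >= 4]
    return pd.concat([easy, hard])

# Toy example: 10 lower-difficulty questions and 3 higher-difficulty ones.
questions = pd.DataFrame({
    "question_id": range(13),
    "read_level": [1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 4, 5, 6],
})
sample = stratified_sample(questions)
print(len(sample))  # 2 easy (20% of 10) + all 3 hard = 5
```

With the study's real counts, this procedure yields 931 questions out of 3,422.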
  13. Methodology: the sample was analyzed in SPSS to examine frequencies and relationships; standardized residuals were examined for significance.
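The standardized-residual check on slide 13 (done in SPSS by the authors) compares each cell of a contingency table to its expected count; cells with an absolute residual above roughly 2 are the ones driving a significant chi-square. A sketch with SciPy, using made-up counts rather than the study's data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = provider (librarian, student),
# columns = question difficulty (READ <= 3, READ > 3).
observed = np.array([[120, 252],
                     [410,  33]])

chi2, p, dof, expected = chi2_contingency(observed)

# Standardized residual per cell: (observed - expected) / sqrt(expected).
std_resid = (observed - expected) / np.sqrt(expected)

# Cells with |standardized residual| > 2 contribute significantly.
print(np.round(std_resid, 2))
```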
  14. Selected Frequencies
  15. Reference Tag: Totals
  16. Catalog Use & Lookup: Exact Sources
  17. Top Vendors: Database Help & Use (More Than 5x)
  18. Exact Source: Database Help & Use (used more than 5 times). Database and times used: JSTOR 47; Academic Search Complete 45; Proquest 45; PsycINFO 33; Business Source Complete 28; LexisNexis 23; Ebsco 21; MLA Intl Bibliography 18; ERIC 17; ATLA 14; Bibloi 14; ABI Inform 13; Mergent 13; CQ Researcher 12; OneSearch 12; WorldCAT 12; Business & Co. R.C. 10; GVRL 9; Lit. Resource Cntr 9; Euromonitor 8; Lit. Criticism Online 8; Soc. Abstracts 8; Science Direct 7; Biography in Context 6; CMMC 6; Opposing Viewpts 6; Proquest Dissert. 6; Sage Jnls Online 6
  19. Exact Source: External Web Page
  20. Exact Source: Internal Web Page
  21. Exact Source: Ref Book (Print)
  22. College
  23. Liberal Arts: Subject Areas
  24. Science: Subject Areas
  25. CFA: Subject Areas
  26. Accuracy: Student Worker Versus Librarian. For Database Use & Lookup, students recommend more general sources versus subject-specific ones.
  27. Who Answered, Above Level "3" Difficulty: LAC 11; Librarians 252; Staff 33; Students 15
  28. Monthly Patterns
  29. Shorter & Less Difficult
  30. Longer Questions (16+ min.)
  31. More Difficult Questions (Above "3")
  32. Patterns By Hour
  33. Longer Questions (16+ min.)
  34. More Difficult Questions (Above "3")
  35. Day of Week Patterns: Difficulty Level (Above "3")
  36. Databases/Higher Difficulty Level (Above "3"). Database and times used: Academic Search Complete 30; JSTOR 30; Proquest 28; PsycINFO 20; Business Source Complete 19; MLA Intl Bib. 14; Ebsco 13; ATLA 12; ERIC 12; Lexis Nexis 11; Business & Co. Resource Cntr. 10; CQ Researcher 10; ABI Inform 10; Mergent 10; Google Scholar 8; OneSearch 8; Euromonitor 7; Gale Virtual Ref. Library 7; Communication & Mass Media Compl. 6; Lit. Resource Cntr. 6
  37. Patterns By College & Subject
  38. Colleges with Longer Questions (16+ min.): Business 48; Communication & Fine Arts 21; Education 6; Law 5; Liberal Arts 87; Science 10
  39. Colleges with Higher Difficulty Level (Above "3"): Business 63; Communication & Fine Arts 27; Education 14; Film & Television 6; Liberal Arts 148; Science 14
  40. College of Liberal Arts: Subjects with Higher Difficulty (Above "3"): English 29; History 17; Philosophy 5; Psychology 15; Sociology 5; Theology 23
  41. Patterns: Fall Versus Spring. English: Fall 29, Spring 11; Psychology: Fall 18, Spring 7. Total CFA questions by semester: Fall 32, Spring 17.
  42. More Business Questions on Monday: Sunday 7; Monday 23; Tuesday 18; Wednesday 15; Thursday 8; Friday 10; Saturday 6
  43. More Theology Questions on Tuesday: Sunday 5; Monday 11; Tuesday 23; Wednesday 13; Thursday 10; Friday 10
  44. Limitations of Study: interdisciplinary questions could not easily be categorized by subject. Despite "norming" sessions, the coders coded independently, so no interrater reliability was computed. The sample size was small (20%) for the first three difficulty levels. The study depended on desk staff accurately recording all statistics.
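The interrater-reliability gap noted above is usually quantified with Cohen's kappa when coders overlap on a shared sample: agreement beyond what chance would predict from each coder's label frequencies. A minimal sketch with toy labels (not the study's actual codes):

```python
from collections import Counter

def cohen_kappa(labels_a, labels_b):
    """Cohen's kappa for two coders' labels on the same items."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of items both coders tagged identically.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement expected from each coder's marginal label frequencies.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    categories = set(labels_a) | set(labels_b)
    expected = sum(counts_a[c] * counts_b[c] for c in categories) / n**2
    return (observed - expected) / (1 - expected)

# Toy example: two coders apply reference tags to the same 6 questions.
a = ["Referral", "Retrieval", "Retrieval", "Other", "Referral", "Other"]
b = ["Referral", "Retrieval", "Other",     "Other", "Referral", "Other"]
print(round(cohen_kappa(a, b), 3))  # → 0.75
```

Kappa of 1.0 means perfect agreement; values near 0 mean agreement no better than chance.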
  45. Key Findings: Collections. The print reference collection was used in only 5.9% of all questions. A small group of sources answered the majority of reference questions: 29 unique reference titles, about 0.2% of all possible titles.
  46. Key Findings: Collections. 95 unique databases were used (48% of all databases available).
  47. Key Findings: Collections. 24% of all reference questions required an internal web page (LibGuide, etc.) as a source; 50% required the library catalog; 41% required a database.
  48. Key Findings: Staffing. Questions were more difficult and longer in Oct.-Nov. and Mar.-Apr., and less difficult and shorter in Sept. Mon.-Wed. between 2-6pm, the desk should be double-staffed and include librarian expertise; Saturday is lighter.
  49. Key Findings: Staffing. Librarians answered 81% of all the difficult questions (above a "3").
  50. Key Findings: Databases. Good candidates for database workshops, based on frequency and difficulty: JSTOR; Business Source Complete; Lexis Nexis; the Proquest vendor; and the Ebsco vendor (show Academic Search Complete and PsycINFO).
  51. Key Findings: Subjects. Subject areas we serve the most at the Desk (based on difficulty/volume): Business, English, Psychology, Theology, History, Education.
  52. Key Findings: Methodology. For the reference tagging scheme, a source-based approach worked better than a strategy-based one.
  53. Thank You to the Other Coders. Alexander Justice, Reference Librarian/Reference Collection Development Coordinator, Loyola Marymount University, Los Angeles. Email: ajustice@lmu.edu. Andrew Toot, Overnight Information Desk Supervisor, LAC/Loyola Marymount University, Los Angeles. Email: andrewtoot@gmail.com
  54. Additional Acknowledgements. Thank you to the William H. Hannon Library Research Incentive Travel Grant, and to the LMU Office of Assessment/Laura Massa.
  55. Additional Information. READ Scale: readscale.org. Gimlet: gimlet.us. PPT: bit.ly/deskpattern. Contact: Susan [Gardner] Archambault, Email: susan.gardner@lmu.edu, Twitter: @susanLMU
