Learning by Doing: Increasing librarian and institutional capacity for research through a collaborative international systematic review action learning set
Increasingly, health librarians are occupying roles in multi-disciplinary teams supporting the production of high-quality systematic reviews (SRs). Continuing professional development (CPD) support for SR activities is poorly served by existing library and information science (LIS) training provision. Being equipped to conduct SRs requires experiential learning and tailored, review-specific advice, and may be facilitated by working in a topic area familiar to the learner. One approach to meeting this training need involves the use of collaborative e-learning technologies and document sharing, supported by group mentoring.
1. Learning By Doing
Increasing Librarian and Institutional Capacity for Research through a Collaborative International Systematic Review Action Learning Set
Martin Morris (presenter), Andrew Booth, Catherine Boden, et al.
IFLA • Cape Town, South Africa
Biomedical Libraries Section • 20 Aug 2015
2. What I’ll cover
• A current problem in health librarianship
• Our project: a case study
• How our working environment developed
• How we have built SR competencies
• Issues we have tackled
• The Team Leader role
• Lessons learned
4. The problem
• Health librarians are increasingly involved in producing high-quality SRs
• This means that more librarians want to understand the SR process in practice…
5. …more problems
• …BUT conducting an SR is complicated and requires practical experience
• …and there are few opportunities to “get your hands dirty” and practise SR skills
6. …more problems
• There are various written guides, BUT…
• …actual expertise in the full SR process is scattered, with many institutions having little
• …and while CE courses are excellent, they are limited in number and expensive to attend
• PLUS: existing LIS curricula contain little information on this work, and no opportunity for practical experience
7. We propose…
We believe a good answer to these problems is the creation of international SR collaborations on topics in library and information science
9. Background
Conduct a systematic review to address one of the 15 questions identified in the MLA Research Agenda: Appraising the Best Available Evidence
10. Towards a research agenda
• 2009: Work starts on defining the MLA Research Agenda
• Summer 2012: List of final research questions made public
• Late 2012: Call for interested librarians to join research teams
• Early 2013: SRs formally start
11. How the MLA built their teams
• For their 15 projects, a call went out for interested librarians
• Coordinators at the MLA then selected teams based on experience, in order to achieve a mix in each team
• Our team leader was not involved in putting our team together
13. Our team
• Currently 10 librarians from 3 countries (US, Canada, UK)
• Team leader, methodologist, subject experts
• Wide range of experience levels – from highly experienced to those coming to an SR for the first time
14. Research question
“What skills and knowledge must librarians possess in order to be able to design tools to help researchers visualize, mine, and otherwise manage large and complex data gathered during both quantitative and qualitative research?”
15. Preliminary results
• Records after duplicates removed (n=3910)
• Records screened (n=3910); records excluded at title/abstract screen (n=3745)
• Full-text articles assessed for eligibility (n=165); full-text articles excluded* (n=70)
• Articles included in qualitative analysis* (n=26): data visualization (n=7), data mining (n=9), data management (n=24)
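The title/abstract stage of the flow above can be cross-checked with simple arithmetic (a sketch; the variable names are ours, not from the slides):

```python
# Cross-check of the title/abstract screening stage reported on the slide:
# records screened minus records excluded should equal the number of
# full-text articles taken forward for eligibility assessment.
records_screened = 3910
excluded_title_abstract = 3745
full_text_assessed = records_screened - excluded_title_abstract
print(full_text_assessed)  # 165, matching the flow diagram
```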
19. Tools for collaboration
• Started with Basecamp for document sharing etc.…
• …but naturally moved to Google Docs and Dropbox
• We use simple Google Sheets formulae to calculate consensus and highlight disagreements
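The kind of consensus check the team ran in Google Sheets can be sketched in Python (a minimal sketch; the column names and sample decisions are hypothetical, and the team’s actual spreadsheet formulae are not reproduced here):

```python
# Two screeners record include/exclude decisions per record; any
# disagreement is flagged for a consensus discussion, mirroring the
# spreadsheet approach described on the slide. Data are illustrative.

records = [
    {"id": 1, "screener_a": "include", "screener_b": "include"},
    {"id": 2, "screener_a": "include", "screener_b": "exclude"},
    {"id": 3, "screener_a": "exclude", "screener_b": "exclude"},
]

def flag_disagreements(rows):
    """Return the ids of records where the two screeners disagree."""
    return [r["id"] for r in rows if r["screener_a"] != r["screener_b"]]

disputed = flag_disagreements(records)
print(disputed)  # [2] – records to resolve by consensus discussion
```

In a spreadsheet the same check is typically a per-row formula comparing the two decision columns, with conditional formatting highlighting mismatches.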
22. Building Skills
TEACHING NEW CONCEPTS
• Explanations from experts
• Selected readings to learn new concepts before the next meeting
• Group discussions and debates
23. Building Skills
GETTING THE PRACTICAL SKILLS
• Buddy activities with subsequent consensus discussions – share difficult cases with the wider group
• Small group work caters for quieter personalities
25. The Team Leader role
• (If involved in putting the team together) Building a good team
  – ensuring a good mix of skills and levels of experience
• Keeping the team together
  – ensuring that all feel they can contribute
• Ensuring learning opportunities
26. The Team Leader role
• Distributed SR collaboration places a great deal of responsibility on the coordinator
  – ensuring that quieter people contribute, and finding opportunities for that (such as through small group work)
  – promoting harmonious team dynamics
27. The Team Leader role
• Has to balance knowing when to provide instruction and when to allow members to become more autonomous
• During consensus, for example, the lead had to work to encourage members to say when they hadn’t fully understood a part of the work
28. The Team Leader role
• Weekly meetings were vital to keeping momentum and promoting retention of team members
30. The Benefits
• Librarians understand the entire process of an SR, not just the literature search part
• Gain confidence in evaluating the quality of other SRs
• Practical experience of the subtleties of the process
• Opportunities for professional reflection
31. The “Hive Mind”
• The model could effectively reduce professional isolation beyond SR work
• Beyond the team leader and methodologist, the whole team have worked collectively to solve problems and develop their learning
• We have sometimes called this the “hive mind” approach
32. Things to watch out for
• Higher risk of attrition, particularly given the length of the project
• The work is largely “extracurricular” – so it’s hard to find the time to fully participate
• The logistics can be difficult: some people have to meet at very inconvenient times (6-7pm)
33. More things to watch out for
• Need to have at least one SR methodologist on the team
• Need to have subject expertise in the area being researched
• Very important to have a narrowly focussed question
34. “What I learned…”
“…Networking and new professional contacts/relationships forged”
“I found the experience of conducting a systematic review to be the most effective way to work out the inherent subtleties and complexities…”
“I feel I’m in a position to help researchers manage the process…”
“Participating in this project has helped me better understand the processes that the researchers I help go through”
35. Lessons Learned
• The Team Leader role is absolutely vital
• CE and LIS curricula should include hands-on components for SRs
36. Lessons Learned
• We encourage librarians to create opportunities for distributed (possibly international) collaboration in well-balanced teams
• Are there possible benefits to the team coming from different countries?
37. References
Eldredge, J.D., Harris, M.R. & Ascher, M.T. Defining the Medical Library Association research agenda: methodology and final results from a consensus process. J Med Libr Assoc. Jul 2009; 97(3): 178–185.
Eldredge, J.D., Ascher, M.T., Holmes, H.N. & Harris, M.R. The new Medical Library Association research agenda: final results from a three-phase Delphi study. J Med Libr Assoc. Jul 2012; 100(3): 214–218.
Harris, M.R., Holmes, H.N., Ascher, M.T. & Eldredge, J.D. Inventory of Research Questions Identified by the 2011 MLA Research Agenda Delphi Study. Hypothesis: The Journal of the Research Section of MLA. Winter 2013; 24(2): 5–17.
Eldredge, J.D., Ascher, M.T., Holmes, H.N. & Harris, M.R. The Research Mentor: Top-Ranked Research Questions and Systematic Reviews. Hypothesis: The Journal of the Research Section of MLA. Winter 2013; 24(2): 19–20.
Various icons from the Noun Project (www.nounproject.com)
Systematic reviews conducted by teams of ~10 librarians
- The Medical Library Association’s Research Section identified the top 15 questions facing medical libraries today.
Work in progress…3 years and counting!
Our team began with 11 people (including one coordinator)
Currently 10 active members
From the States, Canada and the UK
Catherine Boden is the ‘team leader’; she is a health sciences librarian at the University of Saskatchewan
Different levels of experience with SRs on the team
Lucky to have Andrew Booth on our team; he has extensive expertise in systematic review methodology, and helped develop the ‘best fit’ framework methodology
Both title/abstract screening and full-text screening were done in pairs.
Articles were screened for relevance and completeness (i.e., is there sufficient information for data extraction?).
Disagreements were resolved by consensus.
Reasons for exclusion:
• 5 – not in English
• 13 – not about research data management, mining or visualization
• 32 – not about designing tools
• 12 – did not address librarian competencies
• 3 – librarian competencies were described, but not in relation to research data
• 3 – insufficiently complete for data extraction
As of May 14, 2015, 29 studies included.
Note: articles can be coded in more than one category (research data management, mining or visualization).
Types of candidate articles identified: journal articles, white papers, editorials, news items, magazine articles, book reviews, conference papers, conference abstracts (posters, keynotes, presentations), etc.