1. Building a resource for practical assessment: adding value to value and impact
Stephen Town
University of York, UK
Library Assessment Conference, Seattle, Wednesday 6th August 2008
2. Summary
• The SCONUL Value & Impact Measurement Programme (“VAMP”) recap
• The Performance Portal
• ‘Value’ options
– UK drivers (TRAC)
– Institutional Case: The Open University’s Best Value project
– Benchmarking & national statistics
4. The University Context (from the 2006 Library Assessment Conference, after Lombardi)
Universities have two “bottom lines”
1. Financial (as in business)
2. Academic, largely through reputation in
• Research (the priority in “leading” Universities)
• Teaching (& maybe Learning)
5. Library Pressures for Accountability
The need is therefore to demonstrate the Library contribution in these two dimensions:
1. Financial, through “value for money” or related measures
2. Impact on research, teaching and learning
This also implies that “competitive” data will be highly valued
6. The Aim & Role of Universities & their Libraries: cautions for measurement
• Research, Teaching & Reductionism
– ‘Mode 1’ Research & impact ‘transcendental’
– ‘Mode 2’ Research & impact ‘instrumental’
– Value, Price & ‘Mandarinisation’ of research and its support
– Interdisciplinary research
– Collaborative research across institutions
– Learning as a set of discrete assessed modules
• All of this may damage the idea of Libraries as ‘transcendent’, collective and connective services
7. SCONUL Member Survey Findings
• 70% have undertaken value or impact measurement
• Main rationales are advocacy, service improvement and comparison
• Half used in-house methodologies; half used standard techniques
• Main barrier is a lack of tools,
– creating issues of time and buy-in
8. Member Survey Conclusions
• There is a need to demonstrate value and that libraries make a difference
• Measurement needs to show ‘real’ value
• Need to link to University mission
• Libraries are, and intend to be, ahead of the game
• Impact may be difficult or impossible to measure
• All respondents welcomed the programme, and the prospect of an available toolkit with robust and simple tools
9. VAMP Objectives
• New measurement instruments & frameworks to fill current gaps
• A full coherent framework for performance, improvement and innovation
• Persuasive data for University senior managers, to prove value, impact, comparability, and worth
10. Missing methods
• An impact tool or tools, for both teaching & learning and research (from the LIRG/SCONUL initiative?)
• A robust Value for Money/Economic Impact tool
• Staff measures
• Process & operational costing tools
11. Programme Benefits
1. Attainment & retention of Library institutional income
2. Proof of value and impact on education and research
3. Evidence of comparability with peer institutions
4. Justification of a continuing role for libraries and their staff
5. Meeting national costing requirements for separating spend on teaching and research
12. Communities of Practice
“groups of people who share a passion for something that they know how to do, and who interact regularly to learn how to do it better”
“coherence through mutual engagement”
Etienne Wenger, 1998 & 2002
13. VAMP Project Structure
• Analysis March-June 2006
• Tools I (Impact) - June 2007
• Site Development - June 2007
• Tools II (Value) - ?
• CoP development
• Maintenance
16. The ‘Performance Portal’
• A Wiki of library performance measurement containing a number of ‘approaches’, each (hopefully) with:
– A definition
– A method or methods
– Some experience of their use in libraries (or links to this)
– The opportunity to discuss use
27. The Ontology of Performance
• ‘Frameworks’
• ‘Impact’
• ‘Quality’
• ‘Statistics’
• ‘Value’
• A visual Mind map?
29. Frameworks
Mounted:
• European Foundation for Quality Management (EFQM)
Desired:
• Key Performance Indicators
• The Balanced Scorecard
• Critical Success Factors
• The Effective Academic Library
30. Impact
Mounted:
• Impact tools
Desired:
• Detailed UK experience from LIRG/SCONUL initiatives
• Outcome-based evaluation
• Information Literacy measurement
• More on research impact
32. Quality
Mounted:
• Charter Mark
• Customer surveys
– LibQUAL+
– SCONUL Survey
– Priority Research
• Investors in People
Desired:
• Benchmarking
• Quality Assurance
• ISO 9000 series
• ‘Investors in People’ experience
• Opinion meters
• Quality Maturity Model
34. Statistics
Mounted:
• SCONUL Statistics & interactive service
• HELMS statistics
Desired:
• Institutional experience of using SCONUL statistics for local advocacy
• COUNTER
• E-resource tools
35. Value
Mounted:
• (none yet)
Desired:
• Contingent valuation
• ‘Transparency’ costing
• Staff & process costing, value & contribution
• E-resource value tools
37. What is value?
• Cost efficiency
• Cost effectiveness
• Cost comparison (Case 3)
• Financial management process standards & audit
• Financial allocation (Case 1)
• Valuation
• Value added
• Return on investment
• Best value (Case 2)
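As a rough illustration of how two of these notions differ in practice, here is a minimal Python sketch computing cost per use (a cost-efficiency measure) and a simple return-on-investment ratio. All figures are hypothetical, invented for illustration; this is not a VAMP tool or SCONUL data.

```python
# Hypothetical illustration of two "value" measures from the list above.
# All figures are invented; they are not OU, SCONUL or VAMP data.

annual_service_cost = 250_000.0   # total cost of an e-resource collection (GBP)
annual_uses = 500_000             # downloads recorded for the year
estimated_value_per_use = 1.20    # e.g. from a contingent-valuation survey (GBP)

# Cost efficiency: what does each use cost?
cost_per_use = annual_service_cost / annual_uses

# Return on investment: estimated benefit relative to spend.
roi = (estimated_value_per_use * annual_uses - annual_service_cost) / annual_service_cost

print(f"Cost per use: GBP {cost_per_use:.2f}")
print(f"ROI: {roi:.0%}")
```

Note that the same inputs tell different stories: the cost-efficiency figure says nothing about benefit, while the ROI ratio depends entirely on the contested valuation of a ‘use’.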
38. Case 1: TRAC
UK Higher Education Transparency initiative, 2000-09
39. Transparent Approach to Costing (TRAC)
• The standard method for costing in UK HEIs
• Government requirement
• Ending of cross-subsidy between teaching and research
• Research funding based on full economic costing (fEC)
• Positive effects on funding
• Positive effect on pricing
40. Implications
• All activity to be identified as ‘research’, ‘teaching’ or ‘other’
• Library as ‘other’? or
• All library activities either research or teaching, or a simplistic apportioning to each
• Libraries omitted as a component of research costs, and therefore as a share recipient
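To make the apportionment problem concrete: under TRAC every pound of library spend must end up in a ‘teaching’, ‘research’ or ‘other’ bucket. The sketch below shows the kind of simplistic driver-based split the slide warns about; the weights are invented for illustration, and TRAC does not prescribe this method.

```python
# Hypothetical driver-based apportionment of library spend under TRAC.
# The driver weights are invented for illustration; TRAC does not mandate them.

library_spend = 4_000_000.0  # total annual library budget (GBP, invented)

# A usage-based cost driver, e.g. share of loans/downloads by user group.
drivers = {"teaching": 0.55, "research": 0.35, "other": 0.10}

apportioned = {activity: library_spend * share
               for activity, share in drivers.items()}

for activity, cost in apportioned.items():
    print(f"{activity:>9}: GBP {cost:,.0f}")
```

The choice of driver (loans, downloads, headcount, floor space) changes the split entirely, which is exactly why a simplistic apportionment risks misrepresenting the library’s research contribution.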
41. Case 2: the UK Open University Library’s Best Value Programme
42. OU Best Value Programme Objectives
• To increase the business skills of library managers & staff
• To develop skills to support customer-focused, cost-efficient management decision making
• To develop benchmarking evaluation skills that balance quality, value and cost efficiency
43. Strands
• Business reporting
• Process costing and continuous improvement
• Service planning
• Benchmarking
‘to generate real accountability’
44. Business reporting elements
• Library business areas
• Five PIs per area, including cost, quality & customer impact
• Forecast, variance & remedial action
Has improved use of management information, efficiency, prioritisation and expenditure control
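The forecast/variance element might look something like the following sketch; the indicator names, figures and the 10% remedial-action threshold are all invented, not the OU’s actual reporting rules.

```python
# Hypothetical forecast-vs-actual variance report for one business area.
# PI names, figures and the review threshold are invented for illustration.

pis = [
    # (indicator, forecast, actual)
    ("Cost per enquiry (GBP)", 4.50, 5.10),
    ("Satisfaction score (/5)", 4.2, 4.3),
    ("Items processed", 12_000, 10_800),
]

for name, forecast, actual in pis:
    variance_pct = (actual - forecast) / forecast * 100
    flag = "REVIEW" if abs(variance_pct) > 10 else "ok"  # crude remedial-action trigger
    print(f"{name:<26} forecast={forecast:>9} actual={actual:>9} "
          f"variance={variance_pct:+.1f}% {flag}")
```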
45. Process Costing
• Complete process and stage costing
• Average times and skill levels
• Included enquiries, cataloguing, e-resources, IT support, document delivery, counter services
Has delivered justification for staffing levels against activity, staffing formulae, redeployment to priority areas, and process improvements
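A stage-costing calculation of the kind described might look like this: cost a process by summing average stage time multiplied by the hourly rate of the skill level performing it. The stages, times, grades and rates below are hypothetical, not the OU’s figures.

```python
# Hypothetical stage costing for one library process (an enquiry).
# Stage times, staff grades and hourly rates are invented for illustration.

hourly_rate = {"assistant": 14.0, "librarian": 22.0}  # GBP/hour by skill level

stages = [
    # (stage, average minutes, skill level)
    ("triage",    3,  "assistant"),
    ("research",  12, "librarian"),
    ("response",  5,  "librarian"),
]

# Unit cost = sum over stages of (time in hours * rate for that skill level).
unit_cost = sum(minutes / 60 * hourly_rate[level] for _, minutes, level in stages)
annual_volume = 18_000  # enquiries per year (invented)

print(f"Cost per enquiry: GBP {unit_cost:.2f}")
print(f"Annual process cost: GBP {unit_cost * annual_volume:,.0f}")
```

Multiplying unit cost by activity volume is what allows staffing levels to be justified against demand, as the slide describes.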
46. Service plans
• Costed service plans to achieve medium-term improvement and development through a rolling programme
Included document delivery, enquiries, information literacy, and e-resources
47. Programme benefits and outcomes
• Staff development
– cost-conscious decision-making
– business skills
• Management information improvement
• Clarity about customers and use
• Improved quality
• Ability to ‘sell benefits’
49. International Benchmarking initiatives
• OU able to engage and lead an exercise against distance education Universities worldwide
• In one 2008 international benchmarking study
– Only one institution (out of eight) had a comprehensive costing model
51. Conclusion & Questions
• What do we mean by value?
• Why do we not yet have a collective view on costing approaches?
– Skills deficiency?
– Lack of real need or real financial performance accountability?
– We would rather not know?
– Are we more intent on increasing budgets than seeking efficiency improvement?
52. Acknowledgments
• The VAMP Subgroup of SCONUL WGPI: Maxine Melling, Philip Payne, Rupert Wood
• The Cranfield VAMP Team: Darien Rossiter, Michael Davis, Selena Lock, Heather Regan
• The Open University: Ann Davies
• ‘Value’ Consultants: Sue Boorman, Larraine Cooper
• Attendees at the York Statistics meeting