2. The JISC Digitisation Programme has funded over 50 projects since 2004, at c. £24 million
3. Importance of measuring Impact and capturing Value
- Compelling arguments
- Building a body of evidence
- Responsibility to the public
- Stimulate further innovation
4. Impact is intertwined with a number of other concepts...
- Sustainability
- Usage
- Benefits
- Access
- Discoverability
5. Need to ensure institutions are better equipped to undertake this work...
- Impact Toolkit
- Benefits and Impact of Digitised resources
- Impact and embedding
6. The Impact Toolkit is an example of JISC’s efforts to support projects undertaking this kind of analysis.
7. Impact analysis and evaluation should be evenly distributed (image: http://www.flickr.com/photos/cnmark/2132465668)
10. Research and Engagement (image: Route of the Dorothea, towards the North Pole, in 1818)
Editor's Notes
My involvement has been a little last minute, so please excuse the scattering of ideas throughout this presentation! The funder is uniquely placed – in between the projects undertaking the work and the strategies and policies of governments and organisations. This is really all about how we develop robust, evidence-based tools and methodologies. I have been asked to speak today about a funder’s perspective on the impact and value of digital collections, so I am going to go through what I feel are the key issues, challenges and motivations for measuring and capturing impact and value for funders.
JISC’s pioneering digitisation programme had two main phases, comprising 22 projects and c. £22m of funding: Phase 1 ran 2004–07 and Phase 2 ran 2007–09. Importantly, these projects aren’t just about digitisation and the digital objects themselves, but also about embedding digitisation within educational and public institutions, the importance of sustainability, and discovering new business models.
Why then is this so important for JISC? We need compelling arguments for the work we’re doing – and the work of the projects we fund. Put simplistically: we need to convince the unconvinced. We need to talk to different people – not those who are already converted. One firework by itself is not impressive, but a fireworks display is a dazzling spectacle – that’s the effect we need to have. Dazzling!

Impact could also be envisaged as ‘understanding your audience’. Impact gets a bad press, but understanding your audience is critical to a successful project. It also means you can often follow new paths and uncover new and innovative things while still addressing the needs of your community (or communities).

We need a body of evidence not just for government, but also for the projects, the researchers and those undertaking the work – why is what they’re doing so important and valuable? It also demonstrates, to those within the community who may be sceptical, the value of new areas of research or new methodologies (e.g. public engagement).

We are responsible for demonstrating that the work we, and the projects, undertake has wider value – value for the public. This is essential in times of crisis. In some ways a crisis sharpens the mind: it makes us realise the importance of the wider social impact of our work – we cannot take this for granted any longer.

Impact and value can help us see new methodologies that work, or identify new opportunities for innovation and innovative programmes. We also need to make sure we don’t focus solely on research when we talk about impact: the impact of digital resources on the student learning experience can be huge, as can the impact on tutors and their experience of teaching.

So why is impact such a difficult thing to do and get right?
It is worth considering why impact and value cause us such issues – especially for us funders. The role of the funder is to make what initially appears complex easy: to do some of the untangling, or at least give projects and institutions the power to untangle some of it. We need to unpick the thread of impact. There are also issues of sustainability – is the project funding model the most conducive to both sustaining the digital content and ensuring that impact is effectively measured and achieved?
JISC has tried to make sure projects have the tools and methods to undertake this work effectively. How can we ensure this kind of work is embedded within the projects and the institutions? It should be integral to the work, not an add-on. These are a few of the things we have funded that focus on the impact and value of digitised collections and resources. The final piece of work provides funds for projects (not only JISC-funded ones) to examine impact after a time lag; when using a project-based funding model, it is important that such funds are made available to projects. (This was a recommendation from the Impact study and toolkit report.)
How do we measure and define success, and what tools and methodologies can be used and implemented at a strategic level? The TIDSR toolkit, for example. On JISC’s work in this area: Simon will talk to you a little later about further work JISC has funded him to do, looking at the impact of the digitisation programme across the entire programme.

The toolkit is based on a mixture of qualitative and quantitative measures. The quantitative side is relatively easy – do we need to do more on the qualitative side? The toolkit allows uploading of case studies, and supports a community of practice and solutions in this area. There are many diverse pathways to impact, so we need to make sure we have a broad toolkit of examples and don’t narrow the focus too much. It offers a developed and critiqued methodology for measuring impact.

As funders we also need to ensure projects and bidders are asking the correct questions. There is no single set of questions – they may differ between projects, institutions and collections – but they have to be right for that project. Hence the toolkit: it’s a selection, not a single choice. JISC commissioned the Oxford Internet Institute to create the Toolkit for the Impact of Digitised Scholarly Resources (TIDSR): http://microsites.oii.ox.ac.uk/tidsr/
The toolkit is often seen as a resource for once a project completes. Our role should also be to ensure that evaluation isn’t entirely back-loaded – i.e. only happening after the project ‘completes’. Rather, we need to be much more careful in making sure that projects are able and motivated to rigorously examine issues of impact, value, usage etc. before the project begins – indeed, before they submit a proposal. While we have tried to encourage projects to think about this before submitting proposals, we haven’t done enough. Measurability should be built into the websites and interfaces that are funded – this shouldn’t be an afterthought! That means being much clearer in the funding calls, and ensuring that markers are aware of the centrality of impact and value to the marking criteria. See also the work done by the SCA on audience analysis, SEO etc. Evidence from the business-model work shows that pre- and post-project impact and value analysis is essential.
Are we facing a paradigm shift? What implications do new technologies, such as the social web, and new methodologies, such as public engagement, pose for the measurement of impact, value and even use? The audience is diverse, fragmented and distant. There is bleeding between the academic and the public – no clear distinction. Use and impact will be radically different – community use, transforming lifelong learning – how do we measure this? Viral spread is difficult to trace or attribute to an origin. Do qualitative measures suffer?
How do we capture this value? How do we go about measuring it? Can we map its impact back to teaching and research? Can we measure usage on third party sites?
The impact of this work outside the academy has led to a project that will enlist members of the public to help ‘crowdsource’ the interpretation of historic naval logbooks for researchers at the Met Office and Oxford University. There is also a generational shift: early-career researchers are, and will be, developing different habits and ways of researching. What impact will this have on existing resources and digital collections?