Keynote for the Digikult 2017 conference. The success of crowdsourcing projects that have transcribed, categorised, linked and researched millions of cultural heritage and scientific records has inspired others to try it in their own organisations. We can look to 'star' projects for ideas, but what is it really like to run a crowdsourcing project?
Wish upon a star: making crowdsourcing in cultural heritage a reality
1. Wish upon a star: making crowdsourcing
in cultural heritage a reality
Digikult 2017
Dr Mia Ridge @mia_out
Digital Curator, British Library
https://www.flickr.com/photos/nasacommons/15472779199
17. Crowdsourcing in the real world
https://www.flickr.com/photos/statelibraryofnsw/3072281873
18. Motivations as design guidelines
https://www.flickr.com/photos/statensarkiver/8975684669
Altruistic motivations
• helping to provide an accurate record of
local history
Intrinsic motivations
• reading 18thC handwriting is an enjoyable
puzzle
Extrinsic motivations
• an academic collecting a quote from a
primary source
Understand why people participate
19. Use motivations as design guidelines
https://www.flickr.com/photos/statensarkiver/8975684669
People want:
• satisfying work to do
• the experience of being good at something
• time spent with people we like
• the chance to be a part of something bigger
(Jane McGonigal, 2009)
20. Put work into invitations
https://www.flickr.com/photos/internetarchivebookimages/14592517038
21. Be prepared for window shoppers
https://www.flickr.com/photos/twm_news/6841092248
35. A question for you
If enhanced data was available
(e.g. from crowdsourcing or
computational methods), how
could you overcome barriers to
integrate it into your core
cataloguing systems?
36. Who am I?
• Digital Curator, BL
• 'Making Digital History:
the impact of digitality
on public participation
and scholarly practices
in historical research'
• MSc in human-computer
interaction:
crowdsourcing games to
improve object metadata
to enhance museum
collections
Editor's notes
Presentations often highlight best practices and various aspects of using new technology for outreach, preservation, etc. The focus is practical - "making it happen" - though we always aim to include important critical approaches (for example from digital humanities) and organisational / infrastructural issues.
Think of it as volunteering - scaled up. Like any volunteering, participation should be inherently rewarding because the task is enjoyable or the goal is meaningful (or both).
19thC examples of distributed networks of observations and research reported by post, telegraph... Technology has made it easier to reach interested people and to collate and verify work, but the basic model pre-dates computers. The first example is the Oxford English Dictionary. In 1879 the editor asked the public to help find sources and the earliest examples of use for words where information was lacking. Contributions from the public would be used in the dictionary alongside those from recognised and invited experts.
This shows two things. Even then, a call for public assistance could be embedded in scholarly practice, where the same intrinsic motivations - leisure, social networks and community, learning, an interest in the subject and a chance to practise skills - were important. You can still participate in OED appeals today, so it must work.
It also shows that crowdsourcing as we know it has been transformed by technology, but not created by it. The ability of digital technology to provide almost instant data gathering and feedback, automatic validation and the ability to reach both broad and niche groups through loose networks have all been particularly important. For collecting institutions, technology has also helped manage the sheer physical issue of providing access to collections without space or conservation limitations.
See also: http://blog.oxforddictionaries.com/2013/02/james-murray/
http://public.oed.com/history-of-the-oed/archived-documents/april-1879-appeal/april-1879-appeal/
Digitisation backlog: collections are big, resources are small. Manually enhancing collection records is expensive and time-consuming. Very few organisations have the resources even for straight digitisation.
Some headline figures re successful projects...
You can define success by the number of tasks completed. To achieve this, you must also have succeeded in letting people know about your project and in providing an interface that lets enough of them get started on their first tasks.
225 million lines of text corrected in Australian newspapers
Every time I talk about this site I have to go check their stats because they go up all the time. The task design is satisfying enough that some people go to the site just to transcribe. A few people spent a lot of time doing the task - a pattern we'll see in nearly every project.
You can also look at the number or type of people engaged in the tasks. This is often important for organisations whose mission is to reach the public, whether that's to give them an experience of contemporary science or access to their history through specific collections. By 2014 Zooniverse projects had reached well over a million volunteers worldwide; FamilySearch Indexing 1.2 million volunteers in 10 years. Scientific projects might also look at the impact of publications on social media and in journals that result from their projects. Museums might look at the number of researchers who find their digitised collections.
Finally, you can look at the number of people who are deeply engaged - people whose feelings or knowledge about the material or the underlying disciplines change to the extent that they change some aspect of their behaviour. In this example, people came for the herbarium specimens and got caught up in the biographical interest of the original specimen collectors.
http://herbariaunited.org/wiki/Harry_Corbyn_Levinge or http://herbariaunited.org/wiki/Augustin_Ley
Accuracy and validation requirements are different for each type
Possibly a great example for students - transcribing Ancient Greek letters from papyri. When trying this out I found myself slowly getting better at spotting different Greek letters - a transferable skill, even if it's not one I'd use very often!
Find out more: http://ancientlives.org/story
Tags - describe what you see; Comment - share what you know
Validation doesn't have to be hard - just design it into the task workflow
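One common way to build validation into the workflow is consensus: show the same item to several volunteers independently and accept a value once enough of them agree. A minimal sketch of that idea - the function name, vote counts and agreement threshold are illustrative assumptions, not taken from any specific project:

```python
from collections import Counter


def validate_by_consensus(transcriptions, min_votes=3, threshold=0.66):
    """Accept a crowdsourced value once enough independent contributions agree.

    min_votes and threshold are illustrative defaults; real projects tune
    these per task type, since accuracy requirements differ.
    """
    if len(transcriptions) < min_votes:
        return None  # not enough contributions yet; keep the task open
    # Normalise lightly so trivial whitespace/case differences still agree
    normalised = [" ".join(t.split()).lower() for t in transcriptions]
    value, count = Counter(normalised).most_common(1)[0]
    if count / len(normalised) >= threshold:
        return value  # consensus reached; the record can be marked done
    return None  # disagreement: route to an expert or to more volunteers


# Example: three volunteers transcribe the same line of text
result = validate_by_consensus(["The Mercury, 1879",
                                "the  mercury, 1879",
                                "The Mercury 1879"])
```

In this example two of the three normalised transcriptions match, which clears the two-thirds threshold, so the agreed value is accepted; a split three ways would return `None` and send the item back into the queue.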
Invent new tasks
Balance project structure and organisational support; marketing and outreach; user experience design
My division of motivations; others may be different.
Another way of looking at it... Source: http://www.aam-us.org/resources/publications/museum-magazine/museums-as-happiness-engineers and http://www.youtube.com/watch?v=zJ9j7kIZuoQ&feature=plcp
Design is important because there are lots of projects around; people will compare yours to others
Really focussed design. Altruistic and subject-specialist motivation; a clear sense of what to do next... Also topical content - new menus, menus relevant to events (the Superbowl, in this case), tantalising snippets of content...
Low friction design - friction is anything that makes it harder or delays participation. While minimising friction, also look for points that might cause anxiety. Empathy for your participants is a huge design asset.
Balancing what's useful with what's enjoyable. Working on this at the moment with Playbills - aiming to improve discoverability in the catalogue, so data should fit into MARC records; finding the line between what's useful for scholars and what they need to do for themselves
Unless you can draw on a really strong motivation (including a great community), you need to work hard to reduce task complexity and tedium
Tiny tasks that collectively contribute to getting all the needed data
Release early and often (if you can). Projects change once a community finds them. Allow time to update after launch, as things will need to be tweaked and participants often have good ideas
Reserve time to pay attention!
Motivations for participation change over time... One way to keep people motivated is to provide more responsibilities, or more complex tasks as their skills and knowledge grow. In some cases, new research questions or projects emerge from the community – this helps keep people engaged and is a great way to demonstrate impact.
If you're going to add to or change their roles, how can you support them in this?
Years! Unlike Trove where changes show up straight away.
With any luck, one day you will run out of content to crowdsource - what then? Where does your community go?
Why am I here talking about this? I worked in museums for a long time and got interested in the opportunity at the intersection of public engagement and the need to enhance collections. Made games c2010, then did a PhD in digital history, including crowdsourcing as a stepping stone to engagement with the practices and skills of history. Edited a book.