Citizen science at informal science education institutions workshop slides
1. Why Citizen Science in Science Centers?
Jennifer Shirk and Rick Bonney
Program Development & Evaluation
2. What is Citizen Science / Public Participation in Research?
Organized research in which members of the public—who may or may not be trained in science—are involved in one or more steps of the research process.
3. Early Citizen Science “projects”
• Lighthouse Surveys, 1880
• Astronomical Society of the Pacific, 1889
• NWS Cooperative Observer Program, 1890
• NAS Christmas Bird Count, 1900
5. Models of PPSR: Contributory, Collaborative, Co-Created
Define a question/issue
Gather information
Develop explanations
Design data collection methods
Collect samples
Analyze samples
Analyze data
Interpret data/conclude
Disseminate conclusions
Discuss results/inquire further
Bonney et al. 2009. CAISE Inquiry Group Report.
6. Citizen science learning outcomes
• engage critical thinking (Trumbull et al. 2000)
• science learning, bonding (Kountoupes and Oberhauser 2008)
• environmental action; social networks (Overdevest et al. 2004)
• social capital (Ballard 2008)
• improved policy (Wing et al. 2008)
7. Citizen science research outcomes
• documenting range shifts (Bonter et al. unpublished data)
• identifying potential mismatches (Batalden et al. 2007)
• identifying vulnerable species (Crimmins et al. 2008, 2009)
• health planning (Levetin and Van de Water 2008)
• anticipating effects on water sources (e.g., CoCoRaHS)
22. Getting Started with Citizen Science
Jennifer Shirk and Rick Bonney
23. CitizenScience.org
Citizen science, volunteer monitoring, participatory action research... this site supports organizers of initiatives where public participants are involved in scientific research.
28. Determine audience
• Urban, suburban, rural?
• Youth, adults, retirees, families?
• Museums, summer camps, nature centers, afterschool programs, libraries, retirement homes, church groups, community centers?
29. Choose scientific question
• What is the distribution of birds at North American feeders?
• How do clutch sizes of Eastern Bluebirds vary with latitude?
• What is the effect of forest fragmentation on North American tanagers?
• Do fake cats keep birds away from feeders?
31. Develop and test protocols
• Must ensure collection of useful data (scientist)
• Must be easy for participants to understand and follow (educator)
32. Recruit participants
• Advertisements
• Press Releases
• Direct mail
• Group leaders
• Partnerships
• Email lists
• Listservs
33. Train participants
• Research Kits
• Online support
• Group leaders
• Teachers
34. Accept data/ensure data quality
• Paper forms
• Scannable forms
• Electronic forms
• “Smart” forms
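The "smart" forms mentioned above can run plausibility checks at submission time instead of waiting for manual review. Here is a minimal sketch of one such check in Python; the field names and per-species limits are assumptions for illustration, not taken from any real project.

```python
# Hypothetical sketch of a "smart" form plausibility check.
# Species names and limits below are invented example values.
def validate_count(species: str, count: int, plausible_max: dict) -> list:
    """Return a list of warnings; an empty list means the record passes."""
    warnings = []
    if count < 0:
        warnings.append("count cannot be negative")
    elif species in plausible_max and count > plausible_max[species]:
        warnings.append(
            f"{count} {species} exceeds the plausible maximum "
            f"({plausible_max[species]}); please confirm"
        )
    return warnings

# Example per-species limits (invented values)
LIMITS = {"Eastern Bluebird": 25, "American Crow": 500}

print(validate_count("Eastern Bluebird", 3, LIMITS))   # passes: []
print(validate_count("Eastern Bluebird", 60, LIMITS))  # flagged for review
```

Flagged records can be routed back to the participant for confirmation rather than silently discarded, which protects data quality without discouraging volunteers.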
36. Analyze and interpret data
[Figure: pattern of Wood Thrush decline shown alongside pattern of acid ion deposition]
37. Publish results
• Scientific journals
• Publications for participants
• Popular articles
• Reports to government agencies
• Management guidelines
38. Measure impacts
• Scientific knowledge
• Scientific literacy
• Conservation action
40. Contributory, Collaborative, Co-Created
Outcomes for science: high data precision and accuracy (contributory end) to intermediate expectations of data precision and accuracy (co-created end)
Outcomes for social-ecological systems: decision-making slow to result, to high potential for prompt decision-making
Outcomes for individuals: low potential for enhancing stakeholder capacities, to high potential for enhancing stakeholder capacities
Bonney et al. 2009; Danielsen et al. 2009
41. Starting with the End in Mind
Jennifer Shirk and Rick Bonney
42. Science centers: “I am able to work with this kind of programming because it...”
[Chart of survey response counts (0–45); overlaid takeaway: identify the outcomes that you need to achieve]
• Other
• Is supported by partner institutions
• Is supported by external funding
• Addresses regional or national topics of interest
• Addresses the needs and interests of the local community
• Fits well with our institutional priorities
• Fits well with our departmental priorities
• Fits well with the needs of a target audience
• Has a direct connection to my institution's programming
• Has a direct connection to my institution's exhibits
43. Inputs → Activities → Outputs → Outcomes → Impacts
Inputs: project team (scientists, support staff, educators, evaluators, technologists); participant time, knowledge, skills, interests, and motivation; infrastructure for research and recruitment, training, and support
Activities: choose a question; design a study; data collection and reporting; analyze and interpret data; share and act on results; ask new questions
Outputs: experiences with science content and research process; data; new skills and activities; experiences with scientific content and the natural world; environmental action
Outcomes: increased scientific and process knowledge; increased engagement; appreciation for science and science context; enhanced environmental knowledge
Impacts: enhanced scientific and civic literacy; improved science-society relationships
Phillips et al. in press
44. Potential Measures of Impact?
(Alan J. Friedman, editor, 12 March 2008)
• Awareness, knowledge, or understanding
• Engagement or interest
• Skills
• Attitudes
• Behaviors
• Other
45. KNOWLEDGE
• Content? (bird migration)
• Process? (data are collected and analyzed)
• Careers? (you can be an ornithologist and become rich and famous)
• Community? (people are watching and enjoying birds all around you)
46. Increase engagement
ENGAGEMENT
• Content? (develop bird migration maps)
• Process? (collect data)
• Careers? (become an intern)
• Community? (attend a community event)
47. SKILLS
• Ask testable questions and
design studies?
• Collect accurate data?
• Analyze and interpret data?
48. ATTITUDES
• Toward science? (science can be fun)
• About species? (learn to love pigeons)
• About careers? (I’d like to be an ornithologist when I grow up)
• About people? (It’s fun to collect data with others)
49. Change behaviors
BEHAVIOR
• Toward science? (participate in project over time)
• About the outdoors? (spend more time outdoors watching birds)
• About community? (Attend community meetings)
• Regarding responsibility? (recycling)
50. Matrix for describing impacts of PPSR (NSF impact category → PPSR subcategories)
• Knowledge, awareness, or understanding of: science content (concepts); science process; nature of science; scientific careers
• Engagement or interest in: content (concepts); science process; scientific community; scientific careers; project/activity; nature/environment
• Skills: asking questions; study design; data collection, submission; analysis, interpretation; evaluating results; using technology; writing
• Attitudes toward: science enterprise; science content/theories; project activities; scientific community; scientific careers; nature/environment
• Behaviors: lifestyle changes; community involvement; citizen action; environmentally responsible behavior; new engagement/participation
• Other: social capital; community capacity; economic impacts; artistic expression
51. DEVISE
• Improve quality and practice of evaluations across the field of citizen science
Aligning goals & objectives
Developing logic models
Database of tested scales
Case study evaluations
Tutorials & resources
Support & consultation
Research on learning
52. Extensions and New Directions
Jennifer Shirk and Rick Bonney
55. Dialog
Hosting challenging conversations
(e.g., about climate change or in other contexts of risk)
(Leiserowitz 2006)
Rational vs. Emotional
Analytical vs. Value-based
Cognitive vs. Affective
The report led us to write a proposal to the NSF for DEVISE: Developing, Validating, and Implementing Situated Evaluation Instruments to Assess the Impacts of Citizen Science. We have a dream team of expert evaluators, informal science educators, and social science researchers.