3. Program evaluation:
Definition
Purposes
Describe several approaches
Discuss the process
Describe the differences between experimental, quasi-experimental, and non-experimental evaluation designs.
Discuss the appropriate use of qualitative methods in program evaluation.
Identify the uses of evaluation results.
Discuss the importance of disseminating the results of program development and evaluation.
Identify the ethical considerations in designing and conducting evaluation research.
8. Collect data and information
Use it to inform future action
“Evaluators rely on stakeholders to help develop the essential questions that will be answered in the evaluation” (Scaffa and Reitz, 2014: 97)
9. Evaluation is vital to helping design and shape your practice
Evaluation of your program should be one of the first things you consider during the development stage
Evaluation should be continuous throughout the life of your program
https://www.opendemocracy.net/openglobalrights/evaluation-and-human-rights
10. Formative vs. Summative Evaluation:
Formative or Process Evaluations
Determine whether program activities have been implemented as intended and resulted in certain outputs. The results are used for program improvement.
Theoretical framework
Design
Activities
Operation
Summative Evaluations
Results are used to determine program effects and to decide whether to continue or discontinue the program.
Outcomes
Impact
Effectiveness/Efficiency
https://education.uky.edu/evaluationcenter/
12. Objectives Approach
Collect data by previously agreed-upon methods to determine whether goals and objectives are being met.
Relatively easy to quantify, but limited to evaluating program objectives only.
Rarely used in isolation.
Managerial Approach
The goal is to provide useful information to managers to improve their decision making.
Evaluations look at how the program operates as well as its goals.
The value of a program is determined by managers and is reflected in the decisions they make based on evaluation results.
13. Participatory Approach
A formative/process approach
Qualitative
Considers the needs of all stakeholder groups rather than just managers – a 360° approach.
Determines the critical information needed by each stakeholder group to develop program improvement plans.
Utilization-Focused Approach
A process for including “users” in the design of the evaluation process.
Shifts attention from the program to be evaluated and focuses on those who will use the program evaluation results.
Highly situation- and context-specific.
Mixed qualitative and quantitative methods.
14. Appreciative Inquiry Approach
4-D Model (Cooperrider’s 4-D Cycle)
Discovery – appreciate what is good
Dream – envision what might be
Design – consider what should be
Destiny – implement change
5 Principles
Constructivist – multiple realities exist
Simultaneity – inquiry is intervention
Poetic – programs narrate their own stories and can change direction at any time
Anticipatory – the program’s image of its future guides its current actions
Positive – focusing on positive experiences increases motivation, inspiration, and engagement
http://dancumberworth.co.uk/practice/
16. Determine OT-Related Data Needs
Occupational performance
Adaptation
Health and wellness
Participation
Prevention
Quality of life
Role competence
Self-advocacy
Occupational justice
https://www.amazon.com/Occupational-Therapy-Practice-Framework-Process/dp/1569003610
17. Choosing Evaluation Methods
Quantitative Designs
Non-experimental
Cross-sectional, cohort
Quasi-experimental
Compare two groups, one receiving the intervention (see the analysis sketch at the end of this slide)
Experimental
Randomized controlled trial
Qualitative Designs
Provide an in-depth perspective on phenomena that are not easily quantifiable.
Thick description, naturalistic, experience-oriented
Interviews, focus groups, observation, document review
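To make the quasi-experimental design concrete, here is a minimal Python sketch comparing outcome change scores between an intervention group and a non-randomized comparison group. The scores, group sizes, and use of SciPy's independent-samples t-test are illustrative assumptions, not part of the original slides.

```python
# A minimal, hypothetical sketch of a quasi-experimental comparison:
# change scores for a group that received the program versus a
# non-randomized comparison group. All numbers are invented.
from scipy import stats

# Change in an outcome measure (post minus pre) per participant
intervention = [4.0, 5.5, 3.0, 6.0, 4.5, 5.0]  # received the program
comparison = [1.0, 2.5, 0.5, 2.0, 1.5, 3.0]    # did not receive it

# Independent-samples t-test on the two groups' change scores
t_stat, p_value = stats.ttest_ind(intervention, comparison)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Because participants were not randomized, a difference may reflect
# selection effects as well as the program itself.
```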
18. Instrumental use – to inform future action
Conceptual use – impact on decision makers’ thought processes
Process use – cognitive and behavioral changes that result from participating in the evaluation process
Symbolic use – for political gain
COMMUNICATION!
19. Beneficence
Select an appropriate evaluation approach and outcome measures
IRB approval
Use current assessments and follow copyright laws
Nonmaleficence
Train evaluators and staff to ensure competency
Autonomy/Confidentiality
Secure evaluation materials and data
Comply with the evaluation protocol and maintain confidentiality as needed
Procedural Justice
IRB approval
Veracity
Report accurate results to participants and stakeholders in a timely manner
Fidelity
Avoid conflicts of interest
Editor’s Notes
Therapeutic Surfing for Special Needs: Resource List
California
The Jimmy Miller Foundation provides ocean therapy for at-risk children with physical and emotional disabilities.
Marin County Spectrum Surf camp helps children with autism and cerebral palsy learn to surf.
THERASurf (Marin County) and A Walk on Water
Rhode Island
University of Rhode Island surf program effectiveness research
Hawaii
Surfers Healing (ABILITY Magazine interview)
AccesSurf Hawaii
Costa Rica
Ocean Healing Group
12 Minute TEDx UCLA talk
Evaluations provide accountability to stakeholders and the community: effects, efficiency.
They also support the development and improvement of a program, or further stakeholders’ understanding of the program.
Summative example: Transportation
A program to develop a system of high-speed trains is initially viewed as a failure because it exceeds its planned budget. However, within a decade ridership is far greater than the business plans had anticipated. The overall impact on the economy, quality of life, and the environment can be demonstrated to be exceedingly positive.
Formative example: Process Evaluation
A process evaluation determines whether program activities have been implemented as intended and resulted in certain outputs. You may conduct a process evaluation periodically throughout the life of your program, starting by reviewing the activities and output components of the logic model (i.e., the left side).
Results of a process evaluation will strengthen your ability to report on your program and to use that information to improve future activities. It allows you to track program information related to Who, What, When, and Where questions (a minimal tracking sketch follows this list):
• To whom did you direct program efforts?
• What has your program done?
• When did your program activities take place?
• Where did your program activities take place?
• What are the barriers/facilitators to implementation of program activities?
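As a rough illustration of how a program might track the Who/What/When/Where information above, here is a minimal Python sketch of an activity log. The field names and records are invented, borrowing the adaptive surfing example from these notes.

```python
# A minimal, hypothetical activity log supporting the Who/What/When/
# Where questions of a process evaluation. All records are invented.
from collections import Counter

activity_log = [
    {"who": "children with autism", "what": "surf lesson",
     "when": "2024-06-01", "where": "beach site A",
     "barriers": ["high surf"]},
    {"who": "children with autism", "what": "water-safety briefing",
     "when": "2024-06-01", "where": "beach site A",
     "barriers": []},
    {"who": "caregivers", "what": "caregiver workshop",
     "when": "2024-06-08", "where": "community center",
     "barriers": ["low attendance"]},
]

# Summaries a process-evaluation report could draw on
print(Counter(record["what"] for record in activity_log))   # What was done
print(Counter(record["where"] for record in activity_log))  # Where it took place
print([b for record in activity_log for b in record["barriers"]])  # Barriers noted
```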
Imagine you want to evaluate your adaptive surfing program
Or use a student group example
Pages 97–99
Needs assessment – incidence and prevalence of a problem. Were we accurate? Was this thorough or focused enough? Incidence, Prevalence, Target Populations, Secondary Populations
Program Theory Evaluation – Making a logic model that explains the relationships between the parts. Evaluate the accuracy and appropriateness of the logic model.
Program Implementation Evaluation – Did it follow the design and logic models? Was it implemented in a way consistent with the vision and plan?
Program Outcome Evaluation – Long-term and establishes cause/effect relationship between program and desired changes.
Program Impact Evaluation – Measures achievement of program objectives in short term – immediate effects on target population.
Program Efficiency Evaluation – Cost-benefit assessment to justify funding, secure grants, and provide accountability to stakeholders (a minimal benefit-cost sketch follows).
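For the efficiency evaluation, a benefit-cost ratio is the core arithmetic. The sketch below uses invented placeholder figures and assumes the program's benefits have already been monetized.

```python
# A minimal, hypothetical benefit-cost calculation for a program
# efficiency evaluation. All dollar figures are invented placeholders,
# and benefits are assumed to have already been monetized.
program_costs = 50_000.00        # e.g., staffing, equipment, facility use
monetized_benefits = 120_000.00  # e.g., estimated value of improved outcomes

benefit_cost_ratio = monetized_benefits / program_costs
net_benefit = monetized_benefits - program_costs

print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")  # > 1 favors the program
print(f"Net benefit: ${net_benefit:,.2f}")
```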
Use a “Contingency Perspective” – meaning choose the appropriate approach depending on situational needs. There is no need to do all of them, but it is important to justify why the one or two you choose are the most relevant and appropriate.
Surf program
Fall prevention program
Gardening program…
Use a participatory approach to plan your program evaluation at your site. Who? What? How? When appropriate? When not?
Use a U-FE approach to plan an evaluation at your site. Who? What? How? When appropriate? When not?
Appreciative Inquiry (AI) is most useful when fear of evaluation exists, change needs to be accelerated, dialogue is critical, relationships are poor or there is a sense of hopelessness, and there is a desire to build a community of practice.
Handout: Use it to practice on each other – about their CBPs
Image: http://dancumberworth.co.uk/practice/
Surfing program example:
Your CBP example:
Stakeholders – funders, advisory boards, program managers, those who benefit from the program, those disadvantaged by the program in some way, and members of the general public with an interest in the outcomes of the evaluation.