3. Program evaluation:
Describe several approaches to program evaluation.
Discuss the process of program evaluation.
Describe the differences between experimental, quasi-experimental, and non-experimental evaluation designs.
Discuss the appropriate use of qualitative methods in program evaluation.
Identify the uses of evaluation results.
Discuss the importance of disseminating the results of program development and evaluation.
Identify the ethical considerations in designing and conducting evaluation research.
8. Collect data and use it to inform future program decisions.
“Evaluators rely on stakeholders to help develop the essential questions that will be answered in the evaluation” (Scaffa and Reitz, 2014, p. 97)
9. Evaluation is vital to helping design and shape your practice.
Evaluation of your program should be one of the first things you consider during the development stage.
Evaluation should be continuous throughout the life of your program.
Formative or Process Evaluations
Determine whether program activities have been implemented as intended and resulted in certain outputs. The results are used for program improvement.
Summative or Outcome Evaluations
Conducted when the results are used to determine program effects or whether to continue/discontinue the program.
12. Objectives Approach
Collect data by previously agreed-upon methods to determine whether goals and objectives are being met.
Relatively easy to quantify, but limited to evaluating program objectives.
Rarely used in isolation.
Management Approach
Goal is to provide useful information to managers to improve their decision making.
Evaluations look at how the program operates as well as its goals.
Value of a program is determined by managers and is reflected in the decisions they make based on evaluation results.
13. Participatory Approach
A formative/process approach.
Considers the needs of all stakeholder groups rather than just managers.
Determines the critical information needed by each stakeholder group to develop program improvement plans.
A process for including “users” in the design of the evaluation process.
Shifts attention from the program being evaluated and focuses on those who will use the evaluation results.
Highly situation- and context-specific.
Mixed qualitative and quantitative methods.
14. Appreciative Inquiry Approach
Discovery – appreciate what is good
Dream – envision what might be
Design – consider what should be
Destiny – implement change
Constructivist – multiple realities exist
Simultaneity – inquiry is intervention
Poetic – programs narrate their own stories and can change direction at any time
Anticipatory – the imagined future of a program guides its current actions
Positive – focus on positive experiences increases motivation and inspiration
16. Determine OT-Related Data Needs
Occupational performance
Health and wellness
Quality of life
17. Choosing Evaluation Methods
Quantitative: compare two groups, one receiving the intervention and one not (e.g., a randomized controlled trial).
Qualitative: provides an in-depth perspective on phenomena that are not easily quantified.
Thick description, naturalistic inquiry
Interviews, focus groups, and observation
18. Instrumental use – to inform future action
Conceptual use – impact on decision makers’ thinking
Process use – cognitive and behavioral changes that result from participating in the evaluation process
Symbolic use – for political gain
Select an appropriate evaluation approach and outcome measures
Use current assessments, following standardized administration procedures
Train evaluators and staff to ensure competence and consistency
Secure evaluation materials and data
Comply with the evaluation protocol and maintain confidentiality as needed
Report accurate results to participants and stakeholders in a timely manner
Avoid conflicts of interest
Editor's notes
Therapeutic Surfing for Special Needs: Resource List
The Jimmy Miller Foundation provides ocean therapy for at-risk children with physical and emotional disabilities.
Marin County Spectrum Surf camp helps children with autism and cerebral palsy learn to surf.
THERASurf (Marin County) and A Walk on Water
University of Rhode Island surf program effectiveness research
Surfers Healing (ABILITY Magazine interview)
Ocean Healing Group
12-minute TEDx UCLA talk
Evaluations provide accountability to stakeholders and the community: effects, efficiency.
Also, support the development and improvement of a program, or further stakeholders’ understanding of the program.
Summative example: Transportation
A program to develop a system of high-speed trains is initially viewed as a failure because it exceeds its planned budget. However, within a decade ridership is far greater than the business plans had anticipated, and the overall impact on the economy, quality of life, and the environment can be demonstrated to be exceedingly positive.
Formative example: Process Evaluation
A process evaluation determines whether program activities have been implemented as intended and resulted in certain outputs. You may conduct process evaluation periodically throughout the life of your program, starting with a review of the activities and output components of the logic model (i.e., the left side).
Results of a process evaluation will strengthen your ability to report on your program and to use the information to improve future activities. It allows you to track program information related to Who, What, When, and Where questions:
• To whom did you direct program efforts?
• What has your program done?
• When did your program activities take place?
• Where did your program activities take place?
• What are the barriers/facilitators to implementation of program activities?
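The Who/What/When/Where questions above amount to a simple activity log. As a minimal sketch, the record below shows one way such tracking data could be structured; the field names and the example session are illustrative assumptions, not a standard instrument.

```python
# Minimal sketch: logging process-evaluation data for Who/What/When/Where
# questions. Field names and sample values are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class ActivityRecord:
    who: str                  # to whom the program effort was directed
    what: str                 # what the program did
    when: str                 # when the activity took place (ISO date)
    where: str                # where the activity took place
    barriers: list[str] = field(default_factory=list)
    facilitators: list[str] = field(default_factory=list)

log: list[ActivityRecord] = []
log.append(ActivityRecord(
    who="children with autism (hypothetical cohort)",
    what="adaptive surfing lesson",
    when="2024-06-01",
    where="local beach site",
    barriers=["high surf conditions"],
    facilitators=["trained volunteers"],
))

# Simple process check: how many logged sessions reported barriers?
sessions_with_barriers = sum(1 for r in log if r.barriers)
print(sessions_with_barriers)  # 1
```

Aggregating such records over the life of the program supports the reporting and improvement uses described above.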
Imagine you want to evaluate your adaptive surfing program
Or use a student group example
Needs assessment – incidence and prevalence of a problem. Were we accurate? Was this thorough or focused enough? Incidence, Prevalence, Target Populations, Secondary Populations
Program Theory Evaluation – Making a logic model that explains the relationships between the parts. Evaluate the accuracy and appropriateness of the logic model.
Program Implementation Evaluation – Did it follow the design and logic models? Was it implemented in a way consistent with the vision and plan?
Program Outcome Evaluation – Long-term and establishes cause/effect relationship between program and desired changes.
Program Impact Evaluation – Measures achievement of program objectives in short term – immediate effects on target population.
Program Efficiency Evaluation – Cost benefit assessment to justify funding, secure grants, accountability to stakeholders.
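The efficiency evaluation above rests on a cost-benefit comparison. A minimal sketch of that arithmetic, using entirely hypothetical figures for an adaptive surfing program:

```python
# Minimal sketch of a program efficiency (cost-benefit) calculation.
# All figures are hypothetical, for illustration only.

def benefit_cost_ratio(total_benefits: float, total_costs: float) -> float:
    """Return the benefit-cost ratio; a value above 1.0 suggests benefits exceed costs."""
    if total_costs <= 0:
        raise ValueError("total_costs must be positive")
    return total_benefits / total_costs

# Hypothetical annual figures for an adaptive surfing program
costs = 40_000.0      # staff, equipment, insurance (assumed)
benefits = 55_000.0   # monetized health and participation gains (assumed)

ratio = benefit_cost_ratio(benefits, costs)
print(f"Benefit-cost ratio: {ratio:.2f}")  # 1.38 -> benefits exceed costs
```

In practice the hard part is monetizing benefits credibly, not the division; the ratio is only as defensible as the estimates behind it.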
Use a “contingency perspective” – that is, choose the appropriate approach depending on situational needs. There is no need to do all of them, but it is important to justify why the one or two you choose are the most relevant and appropriate.
Fall prevention program
Use a participatory approach to plan your program evaluation at your site. Who? What? How? When appropriate? When not?
Use a U-FE (utilization-focused evaluation) approach to plan an evaluation at your site. Who? What? How? When appropriate? When not?
Appreciative Inquiry is most useful when fear of evaluation exists, change needs to be accelerated, dialogue is critical, relationships are poor or a sense of hopelessness prevails, and there is a desire to build a community of practice.
Handout: use it to practice on each other, focusing on their CBPs.
Surfing program example:
Your CBP example:
Stakeholders – funders, advisory boards, program managers, those who benefit from the program, those disadvantaged by the program in some way, and members of the general public with an interest in the outcomes of the evaluation.