1. Process Evaluation: Documenting the "How" and Understanding the "Why" of Implementation
Sources:
• Pascale Wortley, Immunization Services Division, NCIRD, 2008
• Jane Bertrand, JHU, lecture
• Barri Burrus et al., Process Evaluation, RTI, 2008
• Ruth Saunders et al., Health Promotion Practice, 2005
2. Definitions
Process evaluation:
Examines whether program activities have been implemented as intended
Answers questions about what has happened as a result of an intervention's implementation
• What has been delivered to or received by participants, and to whom, when, where, and how much?
• How have participants reacted to the intervention?
3. Uses of Process Evaluations
Barri Burrus et al., Process Evaluation, RTI, 2008
Formative
• To identify what is working well and what needs improvement
• To explore whether the intervention may be harming participants
Summative
• To use process data as mediators and moderators in the analysis of impact
– Exploration of dosage-treatment effects
• To help ensure the intervention is not falsely rejected
• To document what was done, in case the outcome warrants future replication of the intervention
7. What are we trying to learn through process evaluation?
– Do we have the right mix of activities?
– Are we reaching the intended targets?
– Are the right people involved as partners, participants, and providers?
– Do the staff/volunteers have the necessary skills?
– How well do our activities align with our priorities?
8. Involving stakeholders
• Gain a broader perspective, avoid blind spots, and help ensure the results are used
• Key stakeholders (e.g., the Ministry of Health):
– Those served or affected by the activity
– Those involved in program operations
– Those in a position to make decisions about the activity (decision factors: cost, acceptability, etc.)
• For a manageable process, the list of stakeholders must be narrowed to the primary intended users
9. Steps in Conducting Process Evaluations (Burrus et al.)
• Identify all key components of the program
• Create or revise the pathway/logic model, including process variables
• Determine the objectives of the process evaluation with the stakeholders who will use the evidence!
• Ensure that program implementation, dosage, and fidelity are measured (see the sketch after this list)
• Create measures: determine data sources and instruments
• Determine measurement procedures and schedule
• Collect data
• Determine how process measures will fit into the analysis
• Include process data in the impact evaluation
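Where these steps call for measuring implementation, dosage, and fidelity, the measures usually reduce to simple counts and ratios per session or site. Below is a minimal Python sketch of one way such measures might be recorded and summarized; the record schema, field names, and numbers are hypothetical illustrations, not something prescribed by Burrus et al.:

```python
from dataclasses import dataclass

@dataclass
class SessionRecord:
    """One delivered intervention session (hypothetical schema)."""
    site: str
    attendees: int             # participants who actually received the session
    planned_attendees: int     # participants scheduled to receive it
    protocol_steps_done: int   # checklist items delivered as written
    protocol_steps_total: int  # checklist items in the protocol

def dosage(records):
    """Share of planned participant contacts actually delivered."""
    planned = sum(r.planned_attendees for r in records)
    delivered = sum(r.attendees for r in records)
    return delivered / planned if planned else 0.0

def fidelity(records):
    """Mean proportion of protocol steps completed per session."""
    ratios = [r.protocol_steps_done / r.protocol_steps_total
              for r in records if r.protocol_steps_total]
    return sum(ratios) / len(ratios) if ratios else 0.0

sessions = [
    SessionRecord("clinic_a", attendees=18, planned_attendees=25,
                  protocol_steps_done=9, protocol_steps_total=10),
    SessionRecord("clinic_b", attendees=22, planned_attendees=25,
                  protocol_steps_done=7, protocol_steps_total=10),
]
print(f"dosage: {dosage(sessions):.0%}  fidelity: {fidelity(sessions):.0%}")
# -> dosage: 80%  fidelity: 80%
```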
11. Some useful concepts
• Reach
• Quality of implementation
• Appropriateness
• Satisfaction
• Barriers
12. Reach
• Degree to which the intended audience participates in the intervention (see the example below)
– Percent of the target population that heard the messages
– Percent of persons attending an influenza vaccination clinic who do not usually get vaccinated
– Percent of mothers contacted by peer-to-peer counselors
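Reach reduces to simple arithmetic once the numerator (who participated) and denominator (the intended audience) are defined. A trivial Python illustration, with invented numbers:

```python
# All numbers are invented for illustration.
target_population = 1200  # e.g., mothers in the catchment area
participants = 420        # e.g., mothers actually contacted by peer counselors

reach = participants / target_population
print(f"Reach: {reach:.1%}")  # -> Reach: 35.0%
```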
13. Quality of implementation
• Was the activity implemented properly, according to standards or protocol?
– Was the feedback session conducted per guidelines?
– Are staff interacting with the right people?
– Did peer counselors' interactions with mothers follow their training or protocol?
– Is training of staff for a given activity standardized?
– Was the training curriculum delivered in its entirety?
14. Appropriateness/Acceptability
• Interventions or messages may only be effective if the target population judges them appropriate, or if they are designed in a manner that achieves the objective
– Did the messages "speak" to the target audience?
Note: short-term outcomes are also related to appropriateness, e.g.:
Did knowledge or skills increase as a result of training?
Was the information provided in training subsequently used?
15. Satisfaction
• The extent to which participants are satisfied with a training or interaction may influence their subsequent behavior
– Provider satisfaction with the session
– Peer counselor satisfaction with the training
16. Barriers
• Examining barriers attempts to explain why something didn't happen, and may identify key environmental variables
– Reasons mothers didn't bring their children to the clinic (transportation, clinic schedule, other?)
– Reasons providers didn't implement recommendations (resources?)
– Reasons seniors didn't come to an influenza vaccination clinic (unaware of the clinic, concern about the vaccine, no perceived need?)
17. Prioritizing evaluation questions
Ask of each candidate question (a screening sketch follows this list):
• Is it important to your program staff and stakeholders?
• Does it reflect key goals and objectives of your program?
• Does it reflect key elements of your pathway/logic model?
• Will it provide information you can act upon to make program improvements?
• Can it be answered using available program resources?
• Are there any available data sources?
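One way to apply this checklist is as a rough yes/no screening rubric that ranks candidate questions by how many criteria they satisfy. A minimal Python sketch; the questions, criteria labels, and ratings below are purely illustrative:

```python
# Hypothetical screening rubric: count how many of the prioritization
# criteria each candidate evaluation question satisfies.
CRITERIA = ["important to stakeholders", "reflects goals/objectives",
            "reflects logic model", "actionable", "feasible", "data available"]

candidates = {
    "Are we reaching the intended targets?":
        [True, True, True, True, True, True],
    "Are peer counselors satisfied with their training?":
        [True, False, True, True, True, True],
}

# Rank candidate questions by the number of criteria they satisfy.
for question, ratings in sorted(candidates.items(),
                                key=lambda kv: sum(kv[1]), reverse=True):
    print(f"{sum(ratings)}/{len(CRITERIA)}  {question}")
```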
19. Qualitative methods
• Case studies
• Structured or semi-structured interviews
• Focus groups
• Direct observation
• Reviews of program meeting minutes and progress reports