This presentation was made to health workers who have more than two decades of experience managing/implementing public health programs in Nepal, especially at the district level and below.
8. Project in log-frame language
Resources → Input
Results → Output
Purpose/Objectives → Outcome
Goal → Impact
9. What is Monitoring?
A continuous process of collecting and analyzing qualitative and quantitative data to track the progress of programs.
Assesses the extent to which inputs are delivered and whether work schedules and other required actions are proceeding according to plan.
A process aimed at ensuring that activities are on the right track so that, in case of deviation, appropriate corrective actions can be instituted.
Managers depend on the resulting parameters to determine which areas require greater effort and, thereby, may contribute to an improved response.
11. What is Evaluation?
A periodic assessment of ongoing or completed programs
Links a particular output or outcome directly to a particular intervention
Aims to determine the relevance, efficiency, effectiveness, impact and sustainability of the interventions
Explores more deeply the cause and effect, and other wider issues, around the interventions
Helps to deal with problems that monitoring is not able to address; monitoring data is often necessary to conduct a successful evaluation
Helps program managers determine the value of a particular program
12. Difference between Monitoring and Evaluation?
Monitoring answers “What are we doing?”
Evaluation answers “What have we achieved?”
13. Monitoring and Evaluation is …
A continuum of observation, information gathering, analysis, documentation, supervision and assessment
15. Current M&E practice in the health system
• Raw data are generated monthly, each trimester and annually at SDPs and at the district, region and national levels
• Indicator-based analytical reports are generated each trimester, applying rates, ratios and percentages
• Comparative analysis reports of EDP-supported vs. non-supported districts are generated periodically
• A single integrated DoHS/Regional/District-level Annual Report
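The trimester reports above apply rates, ratios and percentages to the raw counts. As a minimal sketch of that step, a coverage percentage can be computed from monthly service counts; the indicator (ANC first visits) and all figures below are invented for the example:

```python
# Illustrative sketch only: computing a trimester coverage indicator
# (a percentage) from hypothetical raw monthly service counts.
# The indicator name and all figures are invented for the example.

def coverage_percent(numerator: int, denominator: int) -> float:
    """Coverage = (service contacts / target population) * 100."""
    if denominator == 0:
        raise ValueError("denominator must be non-zero")
    return round(100.0 * numerator / denominator, 1)

# Hypothetical monthly ANC first-visit counts for one trimester
monthly_anc1 = [310, 298, 322]
# Hypothetical expected pregnancies in the catchment for the trimester
expected_pregnancies = 1200

trimester_total = sum(monthly_anc1)          # 930
print(coverage_percent(trimester_total, expected_pregnancies))  # 77.5
```

The same pattern extends to ratios (e.g., per 1,000 population) by changing the multiplier; in practice these figures come from the routine reporting system, not hand-entered lists.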
16. Bottom up Performance Review
Process
National Review Workshop (Aswin 15-30)
Regional Review Workshop (Aswin 1-15)
District Review Workshop(Bhadra 15-30)
Review at PHC/HP(Shrawan 15-30)
VDC Level Review(Shrawan 15-30)
17. Uses of M&E
Strategic decisions
Project design
Resource allocation
Project process verification
Project re-alignment
Project evaluation
18. General Framework of M&E
Indicators at each level of the results chain:
Input: People, Money, Equipment, Policies, etc.
Process: Training, Logistics, Management, IEC/BCC, etc.
Output: Services, Service use, Knowledge, Quality
Outcome: Behaviour, Safer practices (population level)
Impact: Health Status, Mortality
19. What is the Most Important Indicator?
Input? Process? Output? Outcome? Impact?
What do these levels of results usually refer to?
(Let’s discuss, with some examples from a project)
20. M&E pipeline
Levels of Monitoring and Evaluation effort across Input, Process, Output, Outcome and Impact:
All → Input monitoring
Most → Process monitoring/evaluation
Some → Outcome monitoring/evaluation
Few → Impact monitoring
Monitoring: Process Evaluation
Evaluation: Effectiveness Evaluation
21. Result monitoring
Policy monitoring
• HIV prevalence among FSWs
• Prevalence of IPV among young pregnant women
Programme monitoring
• Programme coverage among FSWs
• Coverage of GBV reach
Project monitoring
• Number of condoms distributed/consumed
• Number of GBV sessions conducted
22. Essentials of programme logic
Assumptions/Context
Problem Statement
Implementation: Inputs → Activities → Outputs
Outcomes
Impacts
24. Thus, we came to know that M&E is for
Programme Improvement
Accountability
25. What are the Principles of M&E?
National M&E policies
Fundamental principles of M&E
Learning approach
M&E as an integral part of overall programme planning
cycle, including costs
Partnership and stakeholders’ engagement
Quantitative and qualitative approaches
Time-bound approach
Principle of One M&E
Principle of Accountability
Functional M&E system principle
26. Where does M&E stand?
• Depends on what policies are adopted for monitoring and evaluation
27. What is critical is “Accountability”
Being CAR by All, ….. as per the roles defined and
agreed
Capable
Accountable
Responsive
29. Why is M&E in (of) health complex?
Health is a basic human right
Health is the ultimate measure of development
(highest level of impact)
Health is multi-factorial
Health is complex
Health is knowledge
Good health is fairness
30. However, Access to Health is
determined by
Individual and population
Social and systems
Endogenous and exogenous
Proximal and distal
33. So the interventions?
… for good health …
Are the interventions/programs sensitive enough to address the health determinants?
What is the performance of health interventions/programs?
How can we know whether they are, or are not?
36. Supervision (of HRH)
• A way of ensuring staff competence, effectiveness and efficiency through observation, discussion, support and guidance
• Management by overseeing the performance or operation of a person or group
• Supervision is concerned with encouraging the members of a work unit to contribute positively toward accomplishing the organization’s goals and objectives
39. Supervision Methods
Indirect (Quantitative)
by analyzing records & reports and providing feedback
Direct (Qualitative)
by observing the performance of health workers on the job doing clinical and public health assessment, counselling, etc.
by observing/verifying IEC materials displayed and drug position
by discussing with service providers & beneficiaries
40. Supervisory tools
Job descriptions
Checklists
Supervision schedule
Policy manuals
Registers and records
Charts and graphs
Reports
Work plans and work schedules
Guidelines for supervision
41. Supervision - activities
• Preparing checklists
• Preparing field visits
• Data collection and analysis
• Specifying training needs assessment
• Decision making for problem solving
42. Skills Requirement
Technical : Clinical, counseling
Human relation : Behavior, team spirit,
motivation, conflict resolution
Administrative : Planning, organizing, controlling
Decision taking : Problem solving, re-planning
43. Routine supervision strategy in Nepal
(From → To: times per year per institution)
Center → Region: 2
Region → District: Terai 3, Hill 2, Mountain 1
District → PHCC/HP: 6
PHCC/HP → SHP: 6
SHP → Ward: 6
44. Challenges of M&E in the health sector in Nepal
• Data generation – quality, coverage, and use (?)
• Using evidence to inform policies and improve programmes – Efficacy & Efficiency (E2)
• Using data for Advocacy and Action – Analysis & Advocacy (A2)
• Reviewing systems, tools and functions
• Updating in time with state-of-the-art knowledge
• Rolling out again
45. What about Quality?
• In M&E, Quality matters the Most
• Quality is neither automatic nor free
• Quality is the composite result of quality systems (policy and governance), tools (design) and their execution (practice)
46. Quality matters …
Data is Essential … Quality is a Concern
Quality is often Subjective too … Difficult to Define
… but You Know what Quality is
Data quality = Fitness for Use
~ Tayi and Ballou
47. Why quality data?
The whole concept is …
Better data → Better decisions → Better health
48. Using data for decision making
Problem → Problem solving → Solution
Use the Data
49. 6 Criteria for Data Quality
Data
quality
• Validity
• Reliability
• Integrity
• Precision
• Timeliness
• Confidentiality
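Two of these criteria, validity and timeliness, lend themselves to simple routine checks on incoming reports. The sketch below is only an illustration of that idea; the field names, thresholds and dates are invented, not taken from any actual reporting system:

```python
# Illustrative sketch only: applying two of the six data-quality
# criteria (validity and timeliness) to a hypothetical monthly
# facility report. Field names and figures are invented.
from datetime import date

def check_validity(record: dict) -> list[str]:
    """Flag values outside plausible ranges (validity)."""
    issues = []
    if record["anc_visits"] < 0:
        issues.append("anc_visits is negative")
    if record["anc_visits"] > record["catchment_pregnancies"]:
        issues.append("anc_visits exceeds catchment pregnancies")
    return issues

def check_timeliness(reported_on: date, due_on: date) -> bool:
    """A report is timely if submitted on or before the due date."""
    return reported_on <= due_on

record = {"anc_visits": 180, "catchment_pregnancies": 150}
print(check_validity(record))   # flags the implausible count
print(check_timeliness(date(2023, 8, 4), date(2023, 8, 7)))  # True
```

Reliability, integrity, precision and confidentiality are harder to test mechanically; they depend on the systems, tools and practices discussed in the quality slides above.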
50. What makes sense in M&E
• Data use is the key (central point of interest)
• Whatever efforts are placed on strengthening M&E make no sense if the data are not used
52. Investing in M&E
More relevant in the broader development agenda
For health, there is no alternative
It is a governance and accountability issue
If so, it deserves adequate resources allocated for M&E
It is a reflection of the attitude and behavior of individuals and the organizations they belong to
53. M&E is for Ensuring Universal
Health Coverage
• Quality M&E is to ensure effective coverage
• Universal Health Coverage is …
• quality in access to health services - those who need the
services should get them, not only those who can pay for
them;
• that the quality of health services is good enough to
improve the health of those receiving services; and
• financial-risk protection - ensuring that the cost of using
care does not put people at risk of financial hardship.
55. Remember !
Data is for Decision Making
It is for Quality service to People
Quality is Behaviour
We have a Role to Play !
It is an Ethical Issue
M&E is the surest path to ensure Universal Health Coverage.
Thank you for your Commitment to M&E
56. Evaluation of Health Services
Deepak K Karki
National Centre for AIDS and STD Control
57. Outline
What is Evaluation?
Evaluation of health services – Why? and How?
58. What is Evaluation?
• Evaluation is the systematic investigation of the merit, worth or significance of a service
• Effective program evaluation is a systematic way to improve and account for public health actions
• It is a learning process
59. Evaluation is for …
• Assigning ‘value’ addressing three inter-related
domains:
• Merit (or quality)
• Worth (or value, i.e., cost-effectiveness)
• Significance (or importance)
• Evaluation involves procedures that are useful, feasible,
ethical, and accurate.
• Evaluation is within a learning framework – for better future program design and execution arrangements.
60. What does evaluation explore?
• What will be evaluated? (i.e., what is "the program" and in what context
does it exist?)
• What aspects of the program will be considered when judging program
performance?
• What standards (i.e., type or level of performance) must be reached for
the program to be considered successful?
• What evidence will be used to indicate how the program has performed?
• What conclusions regarding program performance are justified by
comparing the available evidence to the selected standards?
• How will the lessons learned from the inquiry be used to improve public
health effectiveness?
61. Why health service evaluation?
Effectiveness
Accountability (policy/stakeholders)
Improvement
Impact
62. Where does evaluation hit the most?
• Implementation: Were your program’s activities put into place as
originally intended?
• Effectiveness: Is your program achieving the goals and objectives it
was intended to accomplish?
• Efficiency: Are your program’s activities being produced with
appropriate use of resources such as budget and staff time?
• Cost-Effectiveness: Does the value or benefit of achieving your
program’s goals and objectives exceed the cost of producing them?
• Attribution: Can progress on goals and objectives be shown to be
related to your program, as opposed to other things that are going
on at the same time?
65. Types of Evaluation design
• Experimental
• Quasi Experimental
• Observational
Evaluation question is the key
66. Key for a good evaluation design
Standards and their questions:
Utility
• What is the purpose of the evaluation?
• Who will use the evaluation results and how will they use them?
• What special needs of any other stakeholders must be addressed?
Feasibility
• What is the program’s stage of development?
• How intense is the program?
• How measurable are the components in the proposed focus?
Propriety
• Will the focus and design adequately detect any unintended consequences?
• Will the focus and design include examination of the experience of those who are affected by the program?
Accuracy
• Is the focus broad enough to detect the success or failure of the program?
• Is the design the right one to respond to questions, such as attribution, that are being asked by stakeholders?
67. Remember !
Evaluation is a periodic assessment
Its aim is to determine the relevance, efficiency, effectiveness, impact and sustainability of the interventions
It explores more deeply the cause and effect, and other wider issues, around the interventions
It helps program managers determine the value of a particular program