Monitoring and Evaluation Expert at Management Systems International
Results based planning and management
This presentation trains government and non-government planners in results-based planning and management for social sector programmes and projects.
2. 2
OVERALL OBJECTIVES
To create conceptual clarity among participants on results-based Monitoring and Evaluation
To improve participants' skills in planning, monitoring and reporting of programmes
3. 3
SPECIFIC OUTPUTS
• Increased knowledge of the concepts of Results-based Management
• Improved understanding of the logframe and Results-based LFA approaches
• Enhanced skills to determine programme results and develop appropriate indicators for monitoring, evaluation and reporting purposes
• Increased understanding of results-based reporting and skills to report on results as per predefined formats
4. 4
AGENDA – DAY I
1. Introduction and Objectives: 1145 – 1200
2. Session 1.1: Basic Concepts (Results-based Management; Project Cycle Management): 1200 – 1315
3. Lunch: 1315 – 1400
4. Session 1.2: Monitoring and Evaluation; Use of the Logframe and its Types: 1400 – 1530
5. Tea: 1530 – 1545
6. Session 1.3: Results and Results Chain Logic: 1545 – 1700
8. Project Cycle
• The project cycle visually reminds us that there is a logical sequence to be followed in project planning.
• There are numerous diagrammatic representations of the project planning cycle. A summary would present a sequence of identification, implementation and information management (or project control).
9. Project Cycle
[Diagram: Situational Analysis → Project Design and Planning → Continuous Performance Monitoring and Reporting → Annual Performance Appraisal → Project Adjustment, repeating as a cycle]
11. 11
Exercise:
Reorganize the planning stages of the community
development project shown below into a more logical
sequence
12. Result-based Management
• RBM is an approach to project/programme
management based on clearly defined results, and
the methodologies and tools to measure and achieve
them.
• RBM supports better performance and greater
accountability by applying a clear, logical framework
to plan, manage and measure an intervention with a
focus on the results you want to achieve.
15. 15
Monitoring
1. Monitoring is the continuous or periodic assessment of the physical implementation of a project/programme.
2. It assesses whether inputs have been delivered and activities carried out according to plan, and whether the project has achieved its outputs.
16. 16
Evaluation
• Evaluation provides a “thumbs up” or “thumbs down” judgment of the project/programme.
• Evaluation is a systematic and objective assessment of the design, implementation and results of an ongoing or completed project or programme.
• The aim of evaluation is to determine the efficiency, effectiveness, impact, sustainability and relevance of the project.
17. 17
Participatory M&E
Participatory Monitoring and Evaluation is a
democratic process for examining the values,
progress, constraints and achievements of
projects and programmes by all relevant
stakeholders.
18. 18
Difference
Participatory Monitoring & Evaluation:
- Participant focus and ownership
- Broad range of stakeholder participation
- Focus on learning
- Flexible design
- Rapid appraisal methods
- Outsiders are facilitators
Traditional Monitoring & Evaluation:
- Funder focus and ownership
- Stakeholders often don’t participate
- Focus on accountability
- Formal methods
- Outsiders are evaluators
19. 19
Logical Framework Approach (LFA)
• The Logical Framework Approach (LFA) is a
management tool mainly used in the design,
monitoring and evaluation of international
development projects. It is also widely known as
Goal Oriented Project Planning (GOPP) or
Objectives Oriented Project Planning (OOPP).
• It is useful to distinguish between the two terms:
the Logical Framework Approach (LFA) and
Logical Framework (LF or Logframe). They are
sometimes confused. The Logical Framework
Approach is a project design methodology, the
LogFrame is a document.
20. 20
Results-based LFA
The Results-based LFA is a document used for planning and review.
It contains information about what an organization
hopes to achieve in the short, medium, and long
term, how that achievement will be measured, and
some key assumptions that the organization is
making about the conditions necessary for its
success.
21. 21
European Logframe
Column headings: Hierarchy of Objectives | Objectively Verifiable Indicators | Means of Verification | Risks/Assumptions
Rows:
- Goal: indicators
- Project Purpose: indicators
- Outputs: indicators
- Activities: resources/inputs and budget
22. 22
Results-Based LFA Format
Column headings: Narrative Summary | Expected Results | Performance Measurement and Indicators | Assumptions/Risks
Rows:
- Goal (link to Country Program Goal): Impact | Performance Indicators | Assumptions; Risk Indicators (High, Medium, Low)
- Purpose: Outcomes | Performance Indicators | Assumptions; Risk Indicators (High, Medium, Low)
- Inputs/Resources: Outputs | Performance Indicators | Assumptions; Risk Indicators (High, Medium, Low)
(A minimal data-structure sketch of this format follows below.)
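As a minimal sketch only (the field names and example statements are illustrative, not part of any donor standard), the rows of this format can be captured in a small data structure so that indicators, assumptions and a risk rating always travel with their result statement:

```python
from dataclasses import dataclass, field

@dataclass
class LogframeRow:
    """One level of a results-based logframe (goal, purpose, or inputs/resources)."""
    narrative_summary: str                      # e.g. "Goal (link to Country Program Goal)"
    expected_result: str                        # the impact, outcome, or output statement
    performance_indicators: list[str] = field(default_factory=list)
    assumptions: list[str] = field(default_factory=list)
    risk_rating: str = "Low"                    # High / Medium / Low

# Illustrative rows; statements are adapted from the results chain example later in the deck.
logframe = [
    LogframeRow("Goal", "Impact: more legal decisions reflect gender equality",
                ["number of new gender-equitable rulings per year"],
                ["judicial appointments remain stable"], "Medium"),
    LogframeRow("Purpose", "Outcome: more considered interpretation of equality issues",
                ["% of sampled judgments citing equality standards"]),
]
for row in logframe:
    print(row.narrative_summary, "->", row.expected_result)
```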
23. 23
What is a Result?
• A result is a describable or measurable development change resulting from a cause-and-effect relationship.
Development results involve changes in:
• power relations,
• how resources are distributed,
• the well-being of a local population or organization,
• the attitudes and behaviours of people,
among other things.
24. 24
What is a Good Result?
• Results should be G-SMART
– Gender-inclusive
– Specific
– Measurable
– Achievable
– Relevant
– Time-bound
• A result is NOT a completed activity, but …
completed activities should lead to results
25. 25
Key Results Words
• Improved
• Increased
• Strengthened
• Reduced
• Enhanced
• HINT: any change that can be stated in terms
of either quality or quantity
26. 26
Results Chain
• These results are linked together into what is commonly referred to as a results chain.
• It is very difficult to contribute to the impact without first achieving some intermediate steps: the outputs (short-term results) and outcomes (medium-term results).
27. 27
Results Chain (example)
ACTIVITY(IES): Design and delivery of a curriculum on human rights and potential gender biases in hearing cases and interpreting evidence; arranging discussions among judges and lawyers on issues of human rights and gender equality in hearing cases.
OUTPUT(S): Judges and lawyers more knowledgeable about human rights and gender equality standards and how to apply them.
OUTCOME(S): More considered interpretation of gender equality issues.
IMPACT: Increase in new legal decisions that reflect greater gender equality.
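A short sketch of the same cause-and-effect logic in code (the level names and statements are taken from the example above; the structure itself is only illustrative):

```python
# The results chain as an ordered mapping: each level is expected to
# contribute to the one that follows it.
results_chain = {
    "activity": "design and deliver curriculum; arrange discussions",
    "output": "judges and lawyers more knowledgeable about equality standards",
    "outcome": "more considered interpretation of gender equality issues",
    "impact": "increase in legal decisions reflecting greater gender equality",
}

# Walk the chain in order, making the if-then logic explicit.
levels = list(results_chain)
for lower, higher in zip(levels, levels[1:]):
    print(f"IF {lower}: {results_chain[lower]}")
    print(f"  THEN {higher}: {results_chain[higher]}")
```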
28. 28
Operational vs Developmental Results
[Diagram: Inputs → Activities → Outputs (short-term) → Outcomes (medium-term) → Impact (long-term), spanning program/project management]
Operational Results: the administrative and management products of an agency, its programs or projects.
Developmental Results: an actual change in the state of human development that is the logical consequence of a CIDA investment in a developing country.
29. 29
Operational vs Developmental Results
[Diagram: many outputs produced from resources and activities converge into a smaller number of outcomes, which together contribute to a single impact]
30. 30
Beneficiary Reach
Beneficiary Reach is the overarching term that includes all individuals, groups or organizations benefiting either directly or indirectly from a funded project or programme.
Direct beneficiaries are those populations, groups or organizations that are within the immediate reach of a funded programme or project and are expected to benefit at the output and outcome levels.
Indirect beneficiaries are those populations, groups or organizations that will benefit indirectly from the project and are outside the immediate reach of a given funded project or programme, yet are expected to benefit at the impact level.
33. 33
Indicator
An indicator is a pointer. It is a number, a fact or a perception that measures changes in a specific condition over time. Indicators are key to monitoring and evaluation. Indicators define the data required to compare actual results with planned results over time.
34. 34
What is a Result Indicator?
• An indicator provides evidence that a result
has been achieved.
• An indicator measures progress, by noting
changes at different points in time.
• Indicators can be qualitative or quantitative.
35. 35
Process and Result Indicators
• Results indicators are used to measure project performance:
– Input Indicators
– Process Indicators
– Output Indicators
– Outcome Indicators
36. 36
Indicator Types
• Quantitative: numbers, statistics, frequency, ratios, variances, etc.
• Qualitative: changes in attitudes, behaviours, skills, perceptions, quality, level of understanding, etc.
• Both quantitative and qualitative (QQTTP)
39. 39
Steps for Indicator Design
• For each output, outcome and impact statement, brainstorm a list of several key indicators (qualitative, quantitative, and gender-sensitive).
• Apply the criteria to narrow the list to a MAXIMUM of two to three indicators per result statement.
• Make sure that baseline information is available for each indicator.
(See the sketch below for the narrowing step.)
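A sketch of the narrowing step, using a hypothetical helper that keeps only candidates with baseline data and caps the list per result statement (the candidate indicators are invented for illustration):

```python
def select_indicators(candidates, has_baseline, max_keep=3):
    """Apply the slide's criteria: drop candidates without baseline data,
    then cap the list at `max_keep` indicators per result statement."""
    usable = [c for c in candidates if has_baseline.get(c, False)]
    return usable[:max_keep]

# Hypothetical brainstormed list for one outcome statement.
candidates = [
    "% of women reporting improved access to services",  # gender-sensitive
    "number of trained community health workers",        # quantitative
    "frequency of village committee meetings",
]
has_baseline = {candidates[0]: True, candidates[1]: True, candidates[2]: False}
print(select_indicators(candidates, has_baseline))  # the third candidate is dropped
```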
44. 44
The Performance Management Framework
Column headings: Performance Framework | Performance Indicators | Data Sources | Collection Methods | Frequency | Responsibilities
Rows: Impact; Outcomes 1 and 2; Outputs 1.1, 1.2, 2.1 and 2.2 (one row per result, each with its own indicators, sources, methods, frequency and responsibilities; a structural sketch follows below)
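A sketch of one framework row as a record (the field values are invented for illustration); the check at the end enforces the rule that every indicator needs a data source, collection method, frequency and named responsibility:

```python
# One Performance Management Framework row per indicator.
pmp = [
    {
        "result": "Output 1.1",
        "indicator": "number of judges completing the curriculum",
        "data_source": "training attendance records",
        "collection_method": "tally sheets",
        "frequency": "quarterly",
        "responsibility": "M&E officer",
    },
]
for row in pmp:
    missing = [key for key, value in row.items() if not value]
    assert not missing, f"incomplete framework row, blank cells: {missing}"
```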
45. 45
Performance Data Sources
• Sources of performance information include individual beneficiaries, groups of beneficiaries, organizations, partners, documents, etc.
• Identify data sources for each indicator.
• To ensure reliability, try not to change data sources over time.
46. 46
Source of Verification
• Sources of performance information include individual beneficiaries, groups of beneficiaries, organizations, partners, documents, etc.
• Identify data sources for each indicator.
• To ensure reliability, try not to change data sources over time.
47. 47
Performance Data Analysis
• Compare changes to baseline data.
• ‘Triangulate’ data by trying to collect similar information from different sources.
• Ensure that data are gender-disaggregated so that differences between men and women are clearly shown (see the sketch below).
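A sketch of the baseline comparison with gender-disaggregated data (all figures are invented for illustration):

```python
# Hypothetical indicator values: % of beneficiaries with access to services.
baseline = {"men": 40, "women": 25}
current = {"men": 55, "women": 30}

for group in baseline:
    change = current[group] - baseline[group]
    print(f"{group}: {baseline[group]}% -> {current[group]}% (change {change:+d}%)")
# Disaggregation shows men gained +15 points but women only +5,
# a gap that an aggregate figure would hide.
```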
48. 48
Roles and Responsibilities for Collecting Performance Data
• Who will collect the data?
• Who will analyze the data?
• Who will prepare the reports?
• How will the information on performance be
shared with stakeholders?
• How will the performance information be
used to make decisions?
49. 49
Frequency of Data Collection
[Diagram: performance indicators at the output, outcome and impact levels are reviewed at different frequencies: continuous self-assessment at the output level, annual project performance appraisal at the outcome level, and program impact evaluation at the impact level. Program/project management reviews performance in light of the assumptions made about causal links between activities, outputs, outcomes and impact.]
50. 50
Data Collection Methods
• Community-designed indicators can be measured by communities themselves through simple data collection instruments (records, meetings, interviews, PRA tools, testimonials, self-assessments, etc.).
• Technical indicators sometimes require that questionnaires or surveys be administered by specialized researchers.
51. 51
Data Collection Techniques
1. QUESTIONNAIRE
2. INTERVIEW
3. OBSERVATION
4. FOCUS GROUP DISCUSSION/OTHER PRA
TECHNIQUES
5. TALLY METHOD
52. 52
Sources of Information
• PRIMARY
– KEY INFORMANTS
– ORGANIZATIONS
– COMMUNITY/RELEVANT STAKEHOLDERS
– PROJECT BENEFICIARIES
• SECONDARY
– JOURNALS/NEWSPAPERS
– DOCUMENTS AND RECORDS
– CENSUS REPORTS
– ANNUAL REPORTS
55. 55
What are Risks?
• Risks are factors that could negatively affect the assumptions and negate the positive conditions required to produce results. Risks may or may not be within the control of project managers.
• External risks include the development context.
• Internal risks include such things as changes in key project personnel, availability of resources, etc.
56. 56
What is Risk Analysis?
• Risk analysis involves ‘rating’ assumptions based on their likelihood of holding true (a rating sketch follows below).
– Low-risk assumptions will probably hold true (little threat to the project)
– Medium-risk assumptions may or may not hold true (some threat to the project)
– High-risk assumptions will probably NOT hold true (high threat to the project)
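A sketch of the three-level rating as a function; the probability thresholds are illustrative assumptions, not taken from the slide:

```python
def risk_rating(p_holds_true: float) -> str:
    """Map an assumption's estimated likelihood of holding true to the
    slide's three-level rating (thresholds here are illustrative)."""
    if p_holds_true >= 0.8:
        return "Low risk"     # will probably hold true: little threat
    if p_holds_true >= 0.5:
        return "Medium risk"  # may or may not hold true: some threat
    return "High risk"        # will probably NOT hold true: high threat

print(risk_rating(0.9))  # Low risk
print(risk_rating(0.3))  # High risk
```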
57. 57
Identifying Assumptions and Analyzing Risks
[Diagram: assumptions sit between each level of the results chain. Moving from inputs/activities to outputs (low risk), outputs to outcomes (moderate risk), and outcomes to impact (high risk), risk increases as management control decreases.]
59. 59
EXERCISE: STAKEHOLDER ANALYSIS
Column headings: Stakeholders | Type of Effect | Risk | Potential Impact | Likelihood | Value | Plan
Worked row: Panchayat | Negative | Gender-biased approach in decision making | 6 | 6 | 36 | (plan to be completed)
Rows to complete: Institutions; Influential (Men/Women); Marginalized or Vulnerable groups (Men/Women)
(Note: the worked row implies Value = Potential Impact × Likelihood; see the sketch below.)
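A sketch of that scoring, computing and ranking the values for planning (the second entry and its scores are placeholders, as in the exercise):

```python
# (stakeholder, identified risk, potential impact, likelihood)
stakeholders = [
    ("Panchayat", "gender-biased approach in decision making", 6, 6),
    ("Marginalized or vulnerable groups", "risk to be assessed", 4, 3),  # placeholder scores
]

# Value = potential impact x likelihood; rank highest first for planning.
ranked = sorted(stakeholders, key=lambda s: s[2] * s[3], reverse=True)
for name, risk, impact, likelihood in ranked:
    print(f"{name}: value={impact * likelihood} ({risk})")
```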
62. 62
Gender Analysis
– Level of women’s participation (control, planning, implementation, monitoring)
– Women’s empowerment:
• Decision making
• Economic independence
• Mobility
– Ratio of women beneficiaries
63. 63
Environmental Analysis
– Determine environmental objectives (e.g. health clinic and environmental awareness)
– Which components could lead to environmental effects?
– What are these effects and their significance?
– What strategies and specific measures will be implemented to alleviate or eliminate the negative effects and increase the benefits?
– How will the project be monitored to ensure the effective implementation of these measures?
64. 64
Exercise: Work Plan and Budgeting
Column headings: Code | Results/Activities | Unit Type | Average Unit Cost | Targets (Q-I, Q-II, Q-III, Q-IV) | Total Targets | Budget (Q-I, Q-II, Q-III, Q-IV) | Grand Total | Budget Notes
Rows:
- Outcome 1:
  - Output 1: activities 1–3, Personnel Costs, Other Direct Costs, Total of Output 1
  - Output 2: activities 1–3, Personnel Costs, Other Direct Costs, Total of Output 2
  - Total of Outcome 1
- 5 Capital Costs
- 6 Personnel Costs (Admin)
- 7 Other Direct Costs (Admin)
- 8 5% Contingency Budget
- Grand Total
(A sketch of the roll-up arithmetic follows below.)
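A sketch of the worksheet arithmetic (all figures invented): quarterly budget figures roll up to a subtotal, and the template's 5% contingency line is added before the grand total:

```python
# Quarterly budget figures (Q-I..Q-IV) per line item, illustrative only.
budget_lines = {
    "Output 1 / Activity 1": [200, 300, 250, 250],
    "Output 1 / Personnel Costs": [400, 400, 400, 400],
    "Output 2 / Activity 1": [150, 150, 200, 100],
}

subtotal = sum(sum(quarters) for quarters in budget_lines.values())
contingency = 0.05 * subtotal          # the template's 5% contingency budget
grand_total = subtotal + contingency
print(f"subtotal={subtotal}  contingency={contingency:.0f}  grand total={grand_total:.0f}")
```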
65. 65
Salaries and Benefits
Column headings: S.No. | Name | Designation | Gross Monthly Salary | Gross Yearly Salary | Annual Tax Deduction | Annual Employer Cont. (PF) | (Entitled) Medical Allowance | Total Package (in PKR)
Rows:
- Staff Salaries – Output 1: staff 1–4, Total Staff Salaries
- Staff Salaries – Output 2: staff 1–4, Total Salaries Output 2
- Admin Staff Salaries: staff 1–4, Total Admin Costs
- Grand Total
66. 66
Other Direct Costs
Distribution of costs by initiative
S.No. Other Direct Costs | Output 1 | Output 2 | Admin | Total
1 Utility Bills (Electricity / Gas / Water)
2 Telephone / Mobile
3 Vehicle Running & Maintenance
4 Internet Charges
5 Entertainment
6 Printing and Stationery
7 Books and Newspapers
8 Postage and Courier
9 Repair and Maintenance
10 Audit Fee (Internal and External)
11 Misc. / Others
Grand Total (in USD)
67. 67
Budget Notes
S.No. Description of Line Items | Unit Cost | Days | Pax | Total Cost (in PKR)
1 Consultant/Trainers' Fee
2 Food and Accommodation
3 Material
4 Travel Costs
5 Communication
6 Misc. Costs
Grand Total
69. 69
Reporting
• All stakeholders need to be kept informed about the project/programme through periodic and extraordinary reports.
• MER systems have to be designed so that they can generate appropriate reports for the various audiences (e.g. governing body, donors, communities, etc.).
• Users of reports: communities / CBOs / local institutions / donors.
72. 72
Functions of M&E Section
• Guide and oversee the implementation of the
project’s MER systems and requirements;
• Ensure consistency in the objectives and indicators
in the program(s);
• Prepare and monitor the program’s Performance
Monitoring Plan (PMP);
• Spearhead and coordinate the development of
quarterly reports for the donor, as well as other
required reports;
• Coordinate and ensure timely and quality sub-grantee
reporting;
73. 73
Functions of M&E Section (continued)
• Design and implement MER training programs for program sub-grantees;
• Work with the field staff to develop program success stories;
• Design and implement a system to identify, analyze, document and disseminate lessons learned;
• Work with the budget and program coordinators to monitor program implementation;
• Acquaint programme staff and partners with MER tools, strategies and plans;
• Constantly work to improve MER strategies and reporting formats.