Content
• Definitions and types of monitoring and evaluation activities
• Design, Monitoring & Evaluation Cycles
• Logical framework for strategic planning
• Monitoring and Evaluation Methods: methods of program review, supervision and evaluation
Agenda .......
• Basic quantitative and qualitative data collection techniques
• Data Processing: analysis, interpretation and presentation of routine program data
• Structure of a technical report: understanding observation, interpretation, conclusion and recommendation
Objectives
Develop a common understanding of
monitoring and evaluation
Monitoring?
Or
Monitoring and Evaluation
Design Monitoring and Evaluation (DME)
or
DME-IS (Information System)
Participatory Monitoring and Evaluation (PME)
DME cycle
Project DM&E Cycle:
• Diagnosis - hypothesis, appraisal
• Design - analysis
• Log frame - focus strategy
• M&E plan - coherent information system
• Baseline
• Monitoring
• Evaluation - reflective practice
• Lessons learned
The Steps in a Survey Project
1. Establish the goals of the project - What you want to learn
2. Determine your sample - Whom you will interview
3. Choose interviewing methodology - How you will interview
4. Create your questionnaire - What you will ask
5. Pre-test the questionnaire, if practical - Test the questions
6. Conduct interviews and enter data - Ask the questions
7. Analyze the data - Produce the reports
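To make steps 6 and 7 concrete, here is a minimal sketch in Python/pandas of entering interview responses and producing a simple summary report. The question names (district, q1_age, q2_income) and the output file name are hypothetical placeholders for whatever the questionnaire actually defines.

```python
# A minimal sketch of steps 6-7: enter interview responses and produce a simple
# report with pandas. Column names are hypothetical placeholders.
import pandas as pd

# Step 6: data entry - one dictionary per completed interview
responses = pd.DataFrame([
    {"respondent_id": 1, "district": "North", "q1_age": 34, "q2_income": 1200},
    {"respondent_id": 2, "district": "North", "q1_age": 41, "q2_income": 900},
    {"respondent_id": 3, "district": "South", "q1_age": 29, "q2_income": 1500},
    {"respondent_id": 4, "district": "South", "q1_age": 52, "q2_income": 700},
])

# Step 7: analyze the data and produce the report
print("Completed interviews:", len(responses))
report = responses.groupby("district")[["q1_age", "q2_income"]].mean().round(1)
print(report)
report.to_csv("survey_summary_report.csv")   # the report file circulated to users
```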
M&E work plan
• Identify program goals and objectives
• Determine evaluation questions, indicators and their
feasibility
• Prepare design methodology – monitoring the process
and evaluating the effects
• Resolve implementation issues: who will carry out the
work and how will existing data and past evaluation
studies be used?
• Identify internal and external evaluation resources and
capacity
• Develop the M&E work plan matrix and timeline
• Develop plan to disseminate and use evaluation
findings
Monitoring Steps
•Setting the environment
•Setting performance standards
•Measuring indicators
•Data collection for the MIS
•Reporting
•Action
Steps in designing an M&E system
•Review project documents, goal and objectives
•LFA (logical framework) analysis
•Stakeholder analysis
•M&E strategy and methodology
•Select indicators
•Develop the M&E plan
•Scope of the monitoring system
•Data collection for M&E
oBaseline
oRegular monitoring
oRapid surveys
oMid-term and final evaluation
•Data management
•Data processing and analysis
•Reporting and presentation
•Communication
Areas of Monitoring
•Program activities
•Implementation process
•Pitfalls or shortcomings
•Finances and expenditure
•Training & follow-up
•Reported activities
Monitoring may be
Qualitative & quantitative
Participatory and non-participatory
Internal & external
Direct and indirect
Oral or written
Components of a Project Monitoring System
Financial monitoring
Monitoring project implementation
Assessing the efficiency of project implementation
Beneficiary contact monitoring
Developing the M&E matrix after project start-up involves six steps
•Identify performance questions
•Identify information needs and indicators
•Know what baseline information you need
•Select which data-gathering methods to use, by whom and how often
•Identify the necessary practical support for information gathering
•Organize analysis, feedback and change processes.
Monitoring and evaluation planning matrix
•Hierarchy of objectives
•Objectively verifiable indicators (OVI)
•Sources of information
•Methods of data collection
•Methods of data analysis
•Type of activity (regular monitoring, periodic evaluation)
•Frequency
•Application (expected uses and users)
•Circulation (expected information users)
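As an illustration only, one row of such a planning matrix can be held as a simple record in software. The sketch below, in Python, uses hypothetical field values that are not part of the original material.

```python
# A sketch of one row of an M&E planning matrix as a plain data structure.
# Field names mirror the list above; the example values are hypothetical.
from dataclasses import dataclass, asdict

@dataclass
class MEPlanRow:
    objective: str                 # hierarchy of objectives
    indicator: str                 # objectively verifiable indicator (OVI)
    information_source: str
    collection_method: str
    analysis_method: str
    activity_type: str             # regular monitoring, periodic, evaluation
    frequency: str
    application: str               # expected uses and users
    circulation: str               # who receives the information

row = MEPlanRow(
    objective="Output 1: households trained in improved practices",
    indicator="Number of households completing training",
    information_source="Training attendance registers",
    collection_method="Routine records review",
    analysis_method="Counts and percentages against targets",
    activity_type="Regular monitoring",
    frequency="Monthly",
    application="Project management staff adjust training plans",
    circulation="Project manager, donor quarterly report",
)
print(asdict(row))
```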
Sample of M&E Matrix (column headings)
• Performance Questions
• Information Needs and Indicators
• Baseline Information Requirements, Status and Responsibilities
• Data-Gathering Methods, Frequency and Responsibilities
• Required Forms, Planning, Training, Data Management, Expertise, Resources and Responsibilities
• Analysis, Reporting, Feedback, Change Processes and Responsibilities
Rows are filled in for each result, e.g. Project Key Outcome-1, Project Key Output-1.
Monitoring Cycle
 Selection of indicators
 Prepare checklist
 Design questionnaire
 Processing sheet
 Structure of presentation
 Field test questionnaire
 Finalize design
 Data collection
 Cross checking
 Share the emerging findings
 Data processing
 Data analysis
 Sharing the data findings
 Prepare analysis report
 Circulate the report
Limitations of Monitoring
•Inadequate resources
•Lack of knowledge and skill
•Limited funds
•Dilemmas and problems
•Misinterpretation
•Distortion of communication
•Lack of time
Importance of evaluation
•It is an 'end-of-the-day' activity: a look back over the shoulder to see what has been done or accomplished.
•It is an event that provides staff and participants with the opportunity to step back and take a deeper look at the outcomes.
•It refers to the periodic examination and analysis of project information that pertains to:
oProject design - goal, objectives and plans
oProject implementation - inputs and outputs
oResults - effects and impact.
•Evaluation attempts to identify project efficiency, effectiveness, sustainability and relevance.
Importance of evaluation cont ….
•It provides stakeholders with information they need to analyze whether the project is meeting its planned goals and objectives.
•Project and program planners and donors need information about the strengths and weaknesses, and the successes and failures of projects in order to maintain or improve the quality of design and strategic planning.
•Donors need information in order to assess the value of their financial contribution.
•Evaluation provides information on the core evaluation questions (Relevance, Effectiveness, Efficiency, Impact and Sustainability) to the decision makers (project and program planners and donors).
Purpose of evaluation
•Achievement - seeing what has been achieved.
•Measuring progress according to objectives.
•Improving monitoring for better management.
•Identifying strengths and weaknesses to improve the program.
•Seeing whether efforts were effective - what difference has the program made?
•Cost-benefit analysis to assess effectiveness and efficiency - were the costs reasonable?
•Collecting information - to plan and manage program activities better.
•Sharing experience - to prevent others from making similar mistakes or to encourage them to use similar methods.
•Revising/readjusting program strategy.
•Improving effectiveness to have more impact.
•Better/future planning.
The five core evaluation questions
Relevance: Was/is the project a good idea given the situation? Does it deal with target group priorities? Why or why not?
Effectiveness: Have the planned purpose, component objectives, outputs and activities been achieved? Why or why not?
Efficiency: Were inputs (resources and time) used in the best possible way to achieve outcomes? Why or why not? What could we do differently to improve implementation, thereby maximizing impact at an acceptable and sustainable cost?
The five core evaluation questions
Impact: To what extent has the project contributed towards its longer-term goals? Why or why not? What unanticipated positive or negative consequences did the project have? Why did they arise?
Sustainability: Will there be continued positive impacts as a result of the project once it has finished? Why or why not?
Information needs in different stages of a project

Before project - Project appraisal
Assessment: development needs; potential of the target group or area; inputs required; cost-benefit analysis
Key questions: What are the problems? What are the resources? What are the unmet needs? What is the cost-benefit ratio?

Project start-up - Baseline survey
Assessment: current situation of the target group
Key question: What is the current situation?

During implementation - Monitoring
Assessment: progress in implementation; project efficiency and effectiveness
Key question: Is the project proceeding according to plan and strategy?

Mid-project - Formative evaluation
Assessment: capacity of the project to achieve what it set out to do; strength of the implementing institution in terms of its ability to sustain the program
Key question: Are the project strategies working?

End of project - Summative evaluation
Assessment: changes in the lives of the target group
Key question: What effect(s) did the project have?

After project - Ex-post evaluation
Assessment: sustainable improvement in human condition or well-being
Key question: What impact did the project have on the lives of the people it was designed to affect?
Comparing Monitoring and Evaluation

Main purpose
Monitoring: to guide future implementation; to ensure accountability of management and main stakeholders; to provide a basis for evaluation and learning.
Evaluation: to maintain or improve the quality of project design and strategic planning; to guide future project planning.

Assessment
Monitoring: assess progress in implementation; identify project efficiency and effectiveness.
Evaluation: assess overall outputs, effects, impact and sustainability of the project; determine relevance of project design; identify lessons learned.

Who needs this information
Monitoring: target group; main stakeholders; project management staff; donors.
Evaluation: project planners; donors; government; organization's headquarters.

Timing
Monitoring: continuously during project implementation.
Evaluation: at specified times - during project implementation, at project completion, and 3-5 years after project completion.

Source of information
Monitoring: project plan, administrative records and reports, observation.
Evaluation: monitoring reports, participant or population-based surveys, observations, in-depth case studies, external information.

Type of data required
Monitoring: mostly quantitative data with some qualitative data to add meaning.
Evaluation: a combination of qualitative and quantitative data.

People involved
Monitoring: project-based monitoring team.
Evaluation: external organization, or external staff of the same organization.

Process of conducting
Monitoring: continuous.
Evaluation: periodic.

Data nature
Monitoring: mainly primary.
Evaluation: both primary and secondary.

Analysis
Monitoring: not in depth.
Evaluation: in depth.

Timeliness
Monitoring: quicker - compared with standards.
Evaluation: lengthier - with respect to objectives.
What is an indicator?
A set of data which, if collected on a regular basis, will indicate progress towards the specific aims and objectives of a project or piece of work. Indicators are evidence that something has happened or that an objective has been achieved: they are not proof.
Something which provides a basis to determine changes in a target population as a result of project activities.
An indicator is like a marker/milestone which shows what progress has been made. It may be qualitative or quantitative.
Criteria of a good indicator
Relevant: The indicators should be directly linked to the project
objectives, and to the appropriate levels in the hierarchy.
Technically feasible: The indicators should be capable of being assessed (or 'measured' if they are quantitative).
Reliable: The indicators should be verifiable and (relatively)
objective; i.e., conclusions based on them should be the same if they
are assessed by different people at different times and under different
circumstances.
Usable: People in the project should be able to understand and use the information provided by the indicators to make decisions or improve their work and the performance of the project.
Participatory: The steps for working with the indicator should be
capable of being carried out with the target community and other
stakeholders in a participatory manner: i.e., data collection, analysis
and use.
Other Criteria
comprehensible - the indicators should be worded simply and clearly so that
people involved in the project will be able to understand them.
valid – the indicators should actually measure what they are supposed to measure,
e.g., measuring effects due to project interventions rather than outside influences.
sensitive – they should be capable of demonstrating changes in the situation being
observed, e.g., measuring the GNP of Uganda doesn’t tell us much about the
individual households in one district.
cost-effective – the results should be worth the time and money it costs to collect,
analyse and apply them.
timely – it should be possible to collect and analyse the data reasonably quickly,
i.e., in time to be useful for any decisions that have to be made.
ethical – the collection and use of the indicators should be acceptable to the
communities (target populations) providing the information.
Easy to collect
Specific
Measurable
Attainable
Periodic
Redundancy
Example of Quantitative indicator
Employment & Income
Total family income
Source of income
Income stability
Type of employment
Number of people working
Labor force participation rate of particular group
Proportion of self-employed
Proportion working in formal and informal sectors
Savings in cash and kind
Demographic characteristics of the family
Family size and stability
Age composition
Education
Proportion of children attending school
Civil status of HH head (service, occupation)
Geographic mobility
Example of Quantitative indicator
Housing cost, quantity, value
Sale or rental value of house, housing land, land
Construction quality
House size
Access to service from house (distance, location)
Health
Maternal and child (MCH) mortality rates
Cause-specific mortality rates and leading causes of death
Time lost from work or school due to sickness
Access to medical services
Amount spent on medical services
Anthropometric measures of weight and height.
Example of Quantitative indicator
Consumption pattern
Amount spent on housing
Amount spent on food
Amount spent on education
Transportation
Medical
Saved
Community participation and attitude:
Number of community organizations in which the household participates
Number of friends in the community and project
Political, social and religious organizations
Participation in mutual help program
Satisfaction with the community
Satisfaction with the social, economic and political situation
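As a rough sketch of how several of the quantitative indicators listed above might be computed from household survey data, the Python example below uses a tiny invented table; the column names (hh_income, n_children, n_children_in_school, spend_food) are hypothetical.

```python
# Computing a few example quantitative indicators from a household survey table.
# The data and column names are invented for illustration.
import pandas as pd

households = pd.DataFrame({
    "hh_income":            [1200, 800, 1500, 600],
    "n_children":           [3, 2, 1, 4],
    "n_children_in_school": [2, 2, 1, 1],
    "spend_food":           [400, 350, 500, 300],
})

indicators = {
    "mean total family income": households["hh_income"].mean(),
    "proportion of children attending school":
        households["n_children_in_school"].sum() / households["n_children"].sum(),
    "average share of income spent on food":
        (households["spend_food"] / households["hh_income"]).mean(),
}
for name, value in indicators.items():
    print(f"{name}: {value:.2f}")
```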
Types of Question
Structured: quantitative focus --- close-ended
Non-structured: qualitative focus --- open-ended
Contingency: a filter question (if yes, ask a follow-up)
Matrix question: a large set of related questions
What are the considerations to develop a question
Review the output/outcome indicator.
Identify relevant information for specific indicator.
Identify relevant questions for quality information, considering the following:
Why (purpose)
For whom (level of respondent)
What (Requirement)
When/time
How (Method)
Who will do it (skills & competencies)
Information breakdown
Cost constraints
Simple, short & purposive
Logical
Sequential arrangement
Take feedback from all concerned
Field test to check the issue of validity and reliability
Finalize
Principles of question formulation
Catchy wording
Short (minimum topics needed to meet the objective)
Simple
Easy for respondents
Interesting for respondents
Have a complete guideline
Self-explanatory
Avoid superficial results
Each question focuses on a particular issue
Handle sensitive, emerging and emotional issues carefully
Sequential
Only essential questions
Valid supportive questions
Tabulation friendly
Leading and prompting questions should be avoided
Professional words, jargon and technical terms should be avoided
Data Collection Methods
Sampling-related methods
Random & non-random methods
Core M&E methods
Stakeholder analysis
Document review
Biophysical measurement
Direct observation
Cost-benefit analysis
Questionnaires & surveys
Semi-structured interviews
Case studies
Data Collection Methods
Discussion method
Brainstorming
FGD
Nominal group technique (between two groups)
SWOT analysis
Dreams realized or visioning
Drama & role play
Methods for spatially distributed information
Mapping
GIS mapping
Photograph
Video
Data Collection Methods
Methods for time-based patterns of change
Seasonal diagrams
Diary reviews
Trend analysis
Calendars
Significant change
Methods for analyzing linkages and relationships
Rich picture or mind map
Impact flow diagram
Institutional linkage diagram
Problem and objective trees
M&E wheel
Ranking and prioritizing
Social mapping
Well-being ranking
Matrix scoring
Relative scale
Ranking and pocket chart
Types of Reports
Technical reports
Summary of results
Nature of the study
Methods employed
Data (collection, sources, characteristics, limitations)
Analysis of data and presentation of findings
Conclusion (a summary of findings)
Bibliography
Technical appendices (giving full technical matters relating to the questionnaire, numerical derivations, elaboration on particular techniques of analysis, and the like)
Index
Types of Reports
Popular reports (simplicity and attractiveness, clarity, minimal technical terms, liberal use of charts and diagrams, user-friendly)
The findings and their implications
Recommendations for action
Objectives of the study
Methods employed
Results summary and details in terms of project objectives
Technical appendices.
Report Writing Style
Be brief, concise and to the point
Use simple and clear language
Follow a logical sequence of presentation
Avoid unsupported statements and recommendations
Be pragmatic and constructive
Make the final product
Outline of Research Report
Front page
Project name
Title/ Heading of research report
Done by
Preliminary page
Preface/Foreword
List of content page wise
Main text
Introduction
Statement of findings and recommendation
The implications drawn from the result
The summary
Brief summary
Restating in brief the research problem
The methodology
The major findings
The major recommendation
End matter
Questionnaire
Bibliography of sources
Appendix
Outline of Monitoring Report
Front page
Project name
Title/ Heading of monitoring report
Monitoring period
Content
Summary statement
Major indicators or variables
Objective and purpose of monitoring
Method and process used for data collection
Process of data analysis
Indicator-wise findings (standard vs. actual)
Recommendation
Conclusion
Appendix
Outline of Evaluation Report
Cover page
Title of the evaluation report
Name of the project or program evaluated
Location
Project duration
Name of evaluation
Evaluation period
Executive summary
Goal and objectives of the project
Project description
Project intervention
Purpose of evaluation
Major areas of focus of the evaluation
Evaluation design
Methodology
Sample and sampling techniques
Instruments used
Sources of information
Method and process used for data collection
Process of data analysis
Monitoring
Monitoring is a continuous internal
management activity whose purpose is to
ensure that the program achieves its
defined objectives within a prescribed
time-frame and budget.
Evaluation
Evaluation is an internal or external
management activity to assess the
appropriateness of a program's design and
implementation methods in achieving both
specified objectives and more general
development objectives
MIS
A management information system (MIS) is
the series of processes and actions involved
in capturing raw data, processing it into
usable information, and disseminating it to
users in the form needed.
Why M&E
•To improve program design and implementation
•To fulfill reporting requirements
•To know whether project implementation is on the right track in relation to objectives
•To plan for future programs
Levels of Evaluation Efforts (figure): plotted by number of projects, all projects carry out monitoring ("process evaluation"), while progressively fewer (most, some, few) carry out evaluation ("effectiveness evaluation").

Monitoring & Evaluation Pipeline (figure):
•Inputs - resources, staff, funds, materials, facilities, supplies, training
•Outputs - trained staff, quality of services, knowledge of CBO
•Outcomes (short-term and intermediate effects) - behavior change, attitude change, changes in livelihood, increase in CBO support
•Impact (long-term effects) - changes in income, nutrition, coping capacity in the community, economic impact
Monitoring & evaluation at different levels
The following box defines the common terms, with examples.
Inputs
The financial, human, and material resources used for the development intervention.
• Technical Expertise
• Equipment
• Funds
Activities
Actions taken or work performed.
• Training workshops conducted
Outputs
The products, capital goods, and services that result from a development intervention.
• Number of people trained
• Number of workshops conducted
Outcomes
The likely or achieved short-term and medium-term effects or changes of an
intervention’s outputs.
• Increased skills
• New employment opportunities
Impacts
The long-term consequences of the program; these may be positive or negative effects.
• Improved standard of living
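The results chain in the box above can also be held as a simple data structure in an M&E information system; the sketch below is a minimal Python illustration using the slide's own examples.

```python
# A sketch of the results chain above as a simple dictionary, useful as the
# backbone of an M&E information system. Entries are the slide's examples.
results_chain = {
    "inputs":     ["Technical expertise", "Equipment", "Funds"],
    "activities": ["Training workshops conducted"],
    "outputs":    ["Number of people trained", "Number of workshops conducted"],
    "outcomes":   ["Increased skills", "New employment opportunities"],
    "impacts":    ["Improved standard of living"],
}
for level, examples in results_chain.items():
    print(f"{level.capitalize():10s}: {', '.join(examples)}")
```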
Types and Objectives of M&E

1. Program design
Types of M&E: baseline; formative
Approaches to M&E: quantitative, qualitative, participatory, costing

2. Ongoing monitoring of process
Types of M&E: monitoring of inputs, process, outputs and quality
Approaches to M&E: as above

3. Determine whether objectives are achieved ("best practices")
Types of M&E: evaluation of outcome/impact; special studies; research
Approaches to M&E: as above

4. Program refinement/redesign - M&E for decision-making ("lessons learned")
Types of M&E: program review of monitoring data; special studies
Approaches to M&E: as above

5. Accountability at program level
Types of M&E: monitoring data; program review results; evaluation with comparison group
Approaches to M&E: as above
Qualitative methods
• Focus Group Discussions (FGD)
• Case Studies
• Interviews – semi‐structured, structured
• Participatory methods: Participatory Rural
Appraisal (PRA), Rapid Rural Appraisal (RRA)
• Most Significant Change (MSC)
• Observations
Impact evaluation methods
a. Pre‐Post
b. Simple Difference
c. Differences‐in‐Differences
d. Multivariate Regression
e. Statistical Matching
f. Instrumental Variables
g. Regression Discontinuity
h. Randomized Evaluations
Pre‐post (Before vs. After)
Method 1: Before vs. After (figure). The outcome rises from 25 in 2010 (before the program) to 67 in 2011 (after the program), a change of 42 points. Impact = 42 points?
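A minimal calculation of the pre-post estimate with the figures above (outcome 25 before, 67 after); as the following slides explain, this is an impact estimate only if nothing other than the program changed the outcome.

```python
# Pre-post estimate using the slide's figures: outcome 25 in 2010 (before the
# program) and 67 in 2011 (after). The whole change is attributed to the
# program, which is only valid if no other factor affected the outcome.
outcome_before = 25   # 2010, before the program
outcome_after = 67    # 2011, after the program

pre_post_impact = outcome_after - outcome_before
print(f"Pre-post (before vs. after) estimate: {pre_post_impact} points")  # 42
```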
How to measure impact?
Impact is defined as a comparison
between:
1. the outcome some time after the
program has been introduced
2. the outcome at that same point in time had the program not been introduced (the "counterfactual")
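Restated as a formula (a sketch of the definition above, with Y denoting the outcome):

```latex
% Impact at time t after the program is introduced: the observed outcome with
% the program minus the counterfactual outcome without it.
\[
  \text{Impact}(t) = Y^{\text{with program}}(t) - Y^{\text{without program}}(t)
\]
% The second term is the counterfactual: it is never observed directly and must
% be approximated, e.g. by a comparison group or a randomized control group.
```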
Impact: What is it? (figure, repeated in three steps): the outcome is plotted against time; the intervention occurs at some point, and the impact is the gap between the observed outcome after the intervention and the counterfactual outcome that would have occurred without it.
Simple Difference
A post-program comparison of outcomes between
the group that received the program and a “comparison”
group that did not
• Example:
– program is rolled out in phases leaving a cohort for
comparison, even though the assignment of the program
is not random
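A minimal sketch of the simple-difference estimate in Python; the outcome values are invented for illustration.

```python
# Simple difference: compare mean outcomes of participants and non-participants
# after the program. The numbers below are hypothetical.
participants     = [70, 65, 72, 68]   # outcomes in the group that got the program
non_participants = [60, 55, 58, 63]   # outcomes in the comparison group

simple_difference = (sum(participants) / len(participants)
                     - sum(non_participants) / len(non_participants))
print(f"Simple-difference estimate: {simple_difference:.1f} points")
```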
Difference‐in‐Differences (or Double Difference)
Comparison of outcome between
a treatment and comparison group (1st difference) and
before and after the program (2nd difference)
• Suitability:
– program is rolled out in phases leaving a cohort for
comparison, even though assignment of treatment is not
random
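A minimal sketch of the double-difference calculation from four group means; the numbers are invented for illustration.

```python
# Difference-in-differences from four group means (hypothetical numbers):
# the change over time in the treatment group minus the change over time
# in the comparison group.
treat_before, treat_after = 25.0, 67.0   # treatment group, before and after
comp_before,  comp_after  = 30.0, 41.0   # comparison group, before and after

first_difference  = treat_after - treat_before    # change for treatment group
second_difference = comp_after - comp_before      # change for comparison group
did_estimate = first_difference - second_difference
print(f"Difference-in-differences estimate: {did_estimate:.1f} points")  # 31.0
```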
Constructing the counterfactual
• Counterfactual is often constructed by selecting a
group not affected by the program
• Non‐randomized:
– Argue that a certain excluded group mimics the
counterfactual.
• Randomized:
– Use random assignment of the program to create a
control group which mimics the counterfactual.
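A minimal sketch of random assignment in Python; the village identifiers are hypothetical.

```python
# Random assignment: each eligible unit gets an equal chance of being in the
# treatment or control group, so the control group mimics the counterfactual.
import random

random.seed(42)                      # fixed seed so the assignment is reproducible
eligible_units = [f"village_{i:02d}" for i in range(1, 21)]

shuffled = eligible_units[:]
random.shuffle(shuffled)
half = len(shuffled) // 2
treatment_group = sorted(shuffled[:half])
control_group   = sorted(shuffled[half:])

print("Treatment:", treatment_group)
print("Control:  ", control_group)
```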
Conditions required

Pre-post
Comparison: program participants before the program.
Works if: the program was the only factor influencing any changes in the measured outcome over time.

Simple difference
Comparison: individuals who did not participate (data collected after the program).
Works if: non-participants are identical to participants except for program participation, and were equally likely to enter the program before it started.

Differences-in-differences
Comparison: same as above, plus data collected before and after.
Works if: had the program not existed, the two groups would have had identical trajectories over this period.

Multivariate regression
Comparison: same as above, plus additional "explanatory" variables.
Works if: omitted variables (not measured or not observed) do not bias the results because they are either uncorrelated with the outcome or do not differ between participants and non-participants.

Propensity score matching
Comparison: non-participants who have a mix of characteristics which predict that they would be as likely to participate as participants.
Works if: same as above.

Randomized evaluation
Comparison: participants randomly assigned to a control group.
Works if: randomization "works" - the two groups are statistically identical on observed and unobserved characteristics.
Method and impact estimate (compared across approaches):
(1) Pre-post
(2) Simple difference
(3) Difference-in-differences
(4) Regression with controls
Other Methods
• There are more sophisticated non‐experimental and
quasi‐experimental methods to estimate program
impacts:
– Multivariable Regression
– Matching
– Instrumental Variables
– Regression Discontinuity
• These methods rely on being able to “mimic” the
counterfactual under certain assumptions
• Problem: Assumptions are not testable
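As an illustration of "regression with controls", the sketch below simulates a small dataset and estimates the program effect with an OLS regression using statsmodels. The variable names and the true effect of 5 are invented, and, as noted above, the method still rests on untestable assumptions about omitted variables.

```python
# A sketch of regression with controls: estimate the program effect while
# holding observed covariates constant. Data and variable names are simulated
# for illustration; this is not a substitute for a credible counterfactual.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
data = pd.DataFrame({
    "treated":   rng.integers(0, 2, n),          # 1 if in the program
    "age":       rng.normal(35, 10, n),
    "education": rng.integers(0, 16, n),
})
# Simulated outcome with a true program effect of 5 units
data["outcome"] = (5 * data["treated"] + 0.3 * data["age"]
                   + 0.8 * data["education"] + rng.normal(0, 5, n))

model = smf.ols("outcome ~ treated + age + education", data=data).fit()
print(model.params["treated"])     # estimated program effect, close to 5
print(model.summary().tables[1])   # coefficient table
```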
MONITORING AND EVALUATION CRITERIA
There are different versions of monitoring and evaluation
criteria suggested by various development agencies.
However, all point to a common set of five (also adopted by UNDP). These criteria internalize key constraints on undertaking M&E functions: the time and resources available to accomplish the tasks are always limited, and the emphasis is on providing timely input for management and policy guidance to relevant stakeholders.
MONITORING AND EVALUATION CRITERIA
(1) Relevance
The first criterion in monitoring and evaluation should be whether the objective(s) of the program/project is valid. This can be judged in two ways. First, a relevance assessment needs to be made in the context of the original plan. Second, it should be assessed in the changing context due to socio-political-environmental changes not perceived in the original plan. Relevance needs to be addressed with respect to development issues, target groups, direct beneficiaries, the donor's mission and comparative advantage amongst development partners. The key question to ask is whether the program/project is relevant to beneficiaries' needs, national priorities and the donor's (e.g. World Bank) mandate.
MONITORING AND EVALUATION CRITERIA
(2) Effectiveness
Effectiveness addresses the extent to which a program/project
achieves its immediate objectives and produces expected
outcomes. The key question to ask is: have the program/project objective(s) been achieved, or are they expected to be achieved?
It is one of the performance measures. The program/project
activities and deliverables are also assessed against timeliness
of the inputs and results.
(3) Efficiency
Efficiency is the second measure of program/project
performance and it reflects optimum transformation of inputs
into outputs. It addresses resource utilization; the key question to ask is to what extent the program/project has efficiently used available resources (including human, capital, and physical/natural).
MONITORING AND EVALUATION CRITERIA
(4) Impact
Impact represents change over a given baseline (benchmark) over a specified time frame. Impact can occur in both planned and unplanned ways, and it can be positive or negative. The key question one would ask is: what are the positive and negative, intended and unintended changes brought about by the program/project intervention? Impact assessment is made over two points in time and effectively requires baseline data on key indicators (discussion on indicators to follow). Positive impacts are initial signs of program/project success, but these impacts need to be sustainable (refer to the next criterion).
MONITORING AND EVALUATION CRITERIA
(5) Sustainability
Sustainability is the critical element of monitoring and
evaluation criteria. One would expect that a given
program/project will be self-sustainable without external
assistance. A sustainable outcome from a given
program/project represents success of the project. The key question to ask is: will the benefits/activities of a given program/project continue after the end of the project?
Indicators
Indicators selected for M&E functions should be SMART indicators (ITAD, 1996):
Specific
Measurable
Attainable
Relevant
Trackable
Key Questions Pertaining to Data Collection and Analysis

What type of data do we need?
Justification: data needs are defined by key stakeholders so that the data can be effectively used for regular monitoring, impact assessment and program/project evaluation.

Where do we get data from?
Justification: this is important so that data gaps can be identified at the outset. Part of the data can be collected from secondary sources, and resources can then be used only for additional data to be collected from primary sources.

What method do we use to obtain data? How often do we collect data?
Justification: resources for data collection are always limited in any program/project. It is critical to identify the method of data collection based on the resources available and the time permissible. Data collected over a longer period may not suffice for program/project needs. A decision needs to be made on what data should be obtained from secondary sources and what should be obtained from primary sources. Primary sources may include surveys, PRA, RRA, observation, key informants, focus group discussions, etc. The context-relevant method will partially guide the frequency of data collection.

Who will undertake data collection and analysis?
Justification: usually data is collected by one group of people and analyzed by another. It is important to identify at the outset the responsible persons or groups so that data collection can proceed without delay.

Who will use the final results from the analysis of data?
Justification: users of information can be all or some of the key stakeholders. However, they need to be pre-identified so that when data is collected and analyzed, the required results can be provided to them on time. This permits timely use of results for management and follow-up decisions by stakeholders. Timely use of information based on appropriately analyzed data strengthens the program/project's capability to assess whether the program/project is achieving its objectives.
Mechanism for Evaluation
Internal Evaluation
External Evaluation
Mid-Term Evaluation
Terminal Evaluation
Ex-Post Evaluation
Contenu connexe

Tendances

Self-Assessment of Organizational Capacity in Monitoring & Evaluation
Self-Assessment of Organizational Capacity in Monitoring & EvaluationSelf-Assessment of Organizational Capacity in Monitoring & Evaluation
Self-Assessment of Organizational Capacity in Monitoring & EvaluationMEASURE Evaluation
 
Participatory Monitoring and Evaluation
Participatory Monitoring and EvaluationParticipatory Monitoring and Evaluation
Participatory Monitoring and EvaluationLen Fontanilla
 
Monitoring evaluation
Monitoring evaluationMonitoring evaluation
Monitoring evaluationCarlo Magno
 
Monitoring and Evaluation for Project management.
Monitoring and Evaluation for Project management.Monitoring and Evaluation for Project management.
Monitoring and Evaluation for Project management.Muthuraj K
 
Monitoring and evaluation presentatios
Monitoring and evaluation presentatios Monitoring and evaluation presentatios
Monitoring and evaluation presentatios athanzeer
 
Stakeholder Engagement: The art & science of winning the SE snakes and ladder...
Stakeholder Engagement: The art & science of winning the SE snakes and ladder...Stakeholder Engagement: The art & science of winning the SE snakes and ladder...
Stakeholder Engagement: The art & science of winning the SE snakes and ladder...Association for Project Management
 
Ngo project management
Ngo project managementNgo project management
Ngo project managementahmed hassan
 
M&E completion training report oct 142012
M&E completion training report oct 142012M&E completion training report oct 142012
M&E completion training report oct 142012dr-ayub
 
Training workshop on project cycle management
Training workshop on project cycle management Training workshop on project cycle management
Training workshop on project cycle management mohamed osama hussein
 
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-Projects
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-ProjectsDeveloping-Monitoring-And-Evaluation-Framework-for-Budget-Work-Projects
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-ProjectsNIDHI SEN
 
Monitoring and evaluation frameworks logical framework
Monitoring and evaluation frameworks logical frameworkMonitoring and evaluation frameworks logical framework
Monitoring and evaluation frameworks logical frameworkPreston Healthcare Consulting
 
Introduction to the Logical Framework Approach
Introduction to the Logical Framework ApproachIntroduction to the Logical Framework Approach
Introduction to the Logical Framework ApproachDamien Sweeney
 
Monitoring & Evaluating projects & programs: A stakeholder perspective
Monitoring & Evaluating projects & programs: A stakeholder perspectiveMonitoring & Evaluating projects & programs: A stakeholder perspective
Monitoring & Evaluating projects & programs: A stakeholder perspectiveJacques Myburgh
 
Introduction to monitoring and evaluation
Introduction to monitoring and evaluationIntroduction to monitoring and evaluation
Introduction to monitoring and evaluationMeshack Lomoywara
 
6 M&E - Monitoring and Evaluation of Aid Projects
6 M&E - Monitoring and Evaluation of Aid Projects6 M&E - Monitoring and Evaluation of Aid Projects
6 M&E - Monitoring and Evaluation of Aid ProjectsTony
 

Tendances (20)

Project Monitoring and Evaluation
Project Monitoring and Evaluation Project Monitoring and Evaluation
Project Monitoring and Evaluation
 
Self-Assessment of Organizational Capacity in Monitoring & Evaluation
Self-Assessment of Organizational Capacity in Monitoring & EvaluationSelf-Assessment of Organizational Capacity in Monitoring & Evaluation
Self-Assessment of Organizational Capacity in Monitoring & Evaluation
 
Participatory Monitoring and Evaluation
Participatory Monitoring and EvaluationParticipatory Monitoring and Evaluation
Participatory Monitoring and Evaluation
 
Monitoring evaluation
Monitoring evaluationMonitoring evaluation
Monitoring evaluation
 
Monitoring and Evaluation for Project management.
Monitoring and Evaluation for Project management.Monitoring and Evaluation for Project management.
Monitoring and Evaluation for Project management.
 
M&E Plan
M&E PlanM&E Plan
M&E Plan
 
Logical framework
Logical frameworkLogical framework
Logical framework
 
Monitoring and evaluation presentatios
Monitoring and evaluation presentatios Monitoring and evaluation presentatios
Monitoring and evaluation presentatios
 
Stakeholder Engagement: The art & science of winning the SE snakes and ladder...
Stakeholder Engagement: The art & science of winning the SE snakes and ladder...Stakeholder Engagement: The art & science of winning the SE snakes and ladder...
Stakeholder Engagement: The art & science of winning the SE snakes and ladder...
 
Ngo project management
Ngo project managementNgo project management
Ngo project management
 
Project evaluation
Project evaluationProject evaluation
Project evaluation
 
M&E completion training report oct 142012
M&E completion training report oct 142012M&E completion training report oct 142012
M&E completion training report oct 142012
 
Training workshop on project cycle management
Training workshop on project cycle management Training workshop on project cycle management
Training workshop on project cycle management
 
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-Projects
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-ProjectsDeveloping-Monitoring-And-Evaluation-Framework-for-Budget-Work-Projects
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-Projects
 
Monitoring and evaluation frameworks logical framework
Monitoring and evaluation frameworks logical frameworkMonitoring and evaluation frameworks logical framework
Monitoring and evaluation frameworks logical framework
 
Introduction to the Logical Framework Approach
Introduction to the Logical Framework ApproachIntroduction to the Logical Framework Approach
Introduction to the Logical Framework Approach
 
Monitoring & Evaluating projects & programs: A stakeholder perspective
Monitoring & Evaluating projects & programs: A stakeholder perspectiveMonitoring & Evaluating projects & programs: A stakeholder perspective
Monitoring & Evaluating projects & programs: A stakeholder perspective
 
Introduction to monitoring and evaluation
Introduction to monitoring and evaluationIntroduction to monitoring and evaluation
Introduction to monitoring and evaluation
 
6 M&E - Monitoring and Evaluation of Aid Projects
6 M&E - Monitoring and Evaluation of Aid Projects6 M&E - Monitoring and Evaluation of Aid Projects
6 M&E - Monitoring and Evaluation of Aid Projects
 
Logical framework analysis
Logical framework analysisLogical framework analysis
Logical framework analysis
 

Similaire à M & E Presentation DSK.ppt

Project Cycle Management WG5.ppt
Project Cycle Management WG5.pptProject Cycle Management WG5.ppt
Project Cycle Management WG5.pptMdFarhanShahriar3
 
Monitoring and evaluation
Monitoring and evaluationMonitoring and evaluation
Monitoring and evaluationSudipta Barman
 
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptxEDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptxwelfredoyu2
 
Organizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program EvaluationOrganizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program EvaluationINGENAES
 
Monitioring evaluations
Monitioring evaluationsMonitioring evaluations
Monitioring evaluationsmunas cheroor
 
Evaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation MethodsEvaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation MethodsDebbie_at_IDS
 
Collaborative 2 ingrid margarita and sandra
Collaborative 2 ingrid margarita and sandraCollaborative 2 ingrid margarita and sandra
Collaborative 2 ingrid margarita and sandraSandra Guevara
 
Construction Management in Developing Countries, Lecture 10
Construction Management in Developing Countries, Lecture 10Construction Management in Developing Countries, Lecture 10
Construction Management in Developing Countries, Lecture 10Hari Krishna Shrestha
 
Learning_Unit_3
Learning_Unit_3Learning_Unit_3
Learning_Unit_3Jack Ong
 
C8 logical framework approach (lfa)
C8 logical framework approach (lfa)C8 logical framework approach (lfa)
C8 logical framework approach (lfa)ocasiconference
 
INTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..pptINTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..pptmodestuseveline
 
CPE Monitoring and Evaluation
CPE Monitoring and EvaluationCPE Monitoring and Evaluation
CPE Monitoring and EvaluationSamuel Baker
 
IWEco Webinar: Monitoring & Evaluation of Communication Campaigns – Dr. Pete...
IWEco Webinar:  Monitoring & Evaluation of Communication Campaigns – Dr. Pete...IWEco Webinar:  Monitoring & Evaluation of Communication Campaigns – Dr. Pete...
IWEco Webinar: Monitoring & Evaluation of Communication Campaigns – Dr. Pete...iweco-project
 
Monitoring & evaluation presentation[1]
Monitoring & evaluation presentation[1]Monitoring & evaluation presentation[1]
Monitoring & evaluation presentation[1]skzarif
 
1 b, evaluation of project
1 b, evaluation of project1 b, evaluation of project
1 b, evaluation of projectDr.R. SELVAM
 

Similaire à M & E Presentation DSK.ppt (20)

Project Cycle Management WG5.ppt
Project Cycle Management WG5.pptProject Cycle Management WG5.ppt
Project Cycle Management WG5.ppt
 
Monitoring and evaluation
Monitoring and evaluationMonitoring and evaluation
Monitoring and evaluation
 
Project management
Project managementProject management
Project management
 
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptxEDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
 
Organizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program EvaluationOrganizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program Evaluation
 
Monitioring evaluations
Monitioring evaluationsMonitioring evaluations
Monitioring evaluations
 
Evaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation MethodsEvaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation Methods
 
Collaborative 2 ingrid margarita and sandra
Collaborative 2 ingrid margarita and sandraCollaborative 2 ingrid margarita and sandra
Collaborative 2 ingrid margarita and sandra
 
Construction Management in Developing Countries, Lecture 10
Construction Management in Developing Countries, Lecture 10Construction Management in Developing Countries, Lecture 10
Construction Management in Developing Countries, Lecture 10
 
Learning_Unit_3
Learning_Unit_3Learning_Unit_3
Learning_Unit_3
 
EMIS
EMIS EMIS
EMIS
 
C8 logical framework approach (lfa)
C8 logical framework approach (lfa)C8 logical framework approach (lfa)
C8 logical framework approach (lfa)
 
INTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..pptINTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
 
M&E Concepts.pptx
M&E Concepts.pptxM&E Concepts.pptx
M&E Concepts.pptx
 
M&E.ppt
M&E.pptM&E.ppt
M&E.ppt
 
CPE Monitoring and Evaluation
CPE Monitoring and EvaluationCPE Monitoring and Evaluation
CPE Monitoring and Evaluation
 
IWEco Webinar: Monitoring & Evaluation of Communication Campaigns – Dr. Pete...
IWEco Webinar:  Monitoring & Evaluation of Communication Campaigns – Dr. Pete...IWEco Webinar:  Monitoring & Evaluation of Communication Campaigns – Dr. Pete...
IWEco Webinar: Monitoring & Evaluation of Communication Campaigns – Dr. Pete...
 
M&E CLW 26Nov2015, MMM
M&E CLW 26Nov2015, MMMM&E CLW 26Nov2015, MMM
M&E CLW 26Nov2015, MMM
 
Monitoring & evaluation presentation[1]
Monitoring & evaluation presentation[1]Monitoring & evaluation presentation[1]
Monitoring & evaluation presentation[1]
 
1 b, evaluation of project
1 b, evaluation of project1 b, evaluation of project
1 b, evaluation of project
 

Dernier

Top Rated Pune Call Girls Saswad ⟟ 6297143586 ⟟ Call Me For Genuine Sex Serv...
Top Rated  Pune Call Girls Saswad ⟟ 6297143586 ⟟ Call Me For Genuine Sex Serv...Top Rated  Pune Call Girls Saswad ⟟ 6297143586 ⟟ Call Me For Genuine Sex Serv...
Top Rated Pune Call Girls Saswad ⟟ 6297143586 ⟟ Call Me For Genuine Sex Serv...Call Girls in Nagpur High Profile
 
AMBER GRAIN EMBROIDERY | Growing folklore elements | Root-based materials, w...
AMBER GRAIN EMBROIDERY | Growing folklore elements |  Root-based materials, w...AMBER GRAIN EMBROIDERY | Growing folklore elements |  Root-based materials, w...
AMBER GRAIN EMBROIDERY | Growing folklore elements | Root-based materials, w...BarusRa
 
Design Inspiration for College by Slidesgo.pptx
Design Inspiration for College by Slidesgo.pptxDesign Inspiration for College by Slidesgo.pptx
Design Inspiration for College by Slidesgo.pptxTusharBahuguna2
 
Top Rated Pune Call Girls Koregaon Park ⟟ 6297143586 ⟟ Call Me For Genuine S...
Top Rated  Pune Call Girls Koregaon Park ⟟ 6297143586 ⟟ Call Me For Genuine S...Top Rated  Pune Call Girls Koregaon Park ⟟ 6297143586 ⟟ Call Me For Genuine S...
Top Rated Pune Call Girls Koregaon Park ⟟ 6297143586 ⟟ Call Me For Genuine S...Call Girls in Nagpur High Profile
 
Recommendable # 971589162217 # philippine Young Call Girls in Dubai By Marina...
Recommendable # 971589162217 # philippine Young Call Girls in Dubai By Marina...Recommendable # 971589162217 # philippine Young Call Girls in Dubai By Marina...
Recommendable # 971589162217 # philippine Young Call Girls in Dubai By Marina...home
 
CBD Belapur Individual Call Girls In 08976425520 Panvel Only Genuine Call Girls
CBD Belapur Individual Call Girls In 08976425520 Panvel Only Genuine Call GirlsCBD Belapur Individual Call Girls In 08976425520 Panvel Only Genuine Call Girls
CBD Belapur Individual Call Girls In 08976425520 Panvel Only Genuine Call Girlsmodelanjalisharma4
 
(AISHA) Ambegaon Khurd Call Girls Just Call 7001035870 [ Cash on Delivery ] P...
(AISHA) Ambegaon Khurd Call Girls Just Call 7001035870 [ Cash on Delivery ] P...(AISHA) Ambegaon Khurd Call Girls Just Call 7001035870 [ Cash on Delivery ] P...
(AISHA) Ambegaon Khurd Call Girls Just Call 7001035870 [ Cash on Delivery ] P...ranjana rawat
 
VVIP Pune Call Girls Hadapsar (7001035870) Pune Escorts Nearby with Complete ...
VVIP Pune Call Girls Hadapsar (7001035870) Pune Escorts Nearby with Complete ...VVIP Pune Call Girls Hadapsar (7001035870) Pune Escorts Nearby with Complete ...
VVIP Pune Call Girls Hadapsar (7001035870) Pune Escorts Nearby with Complete ...Call Girls in Nagpur High Profile
 
VIP Call Girls Service Bhagyanagar Hyderabad Call +91-8250192130
VIP Call Girls Service Bhagyanagar Hyderabad Call +91-8250192130VIP Call Girls Service Bhagyanagar Hyderabad Call +91-8250192130
VIP Call Girls Service Bhagyanagar Hyderabad Call +91-8250192130Suhani Kapoor
 
Best VIP Call Girls Noida Sector 47 Call Me: 8448380779
Best VIP Call Girls Noida Sector 47 Call Me: 8448380779Best VIP Call Girls Noida Sector 47 Call Me: 8448380779
Best VIP Call Girls Noida Sector 47 Call Me: 8448380779Delhi Call girls
 
VIP Russian Call Girls in Gorakhpur Deepika 8250192130 Independent Escort Ser...
VIP Russian Call Girls in Gorakhpur Deepika 8250192130 Independent Escort Ser...VIP Russian Call Girls in Gorakhpur Deepika 8250192130 Independent Escort Ser...
VIP Russian Call Girls in Gorakhpur Deepika 8250192130 Independent Escort Ser...Suhani Kapoor
 
Best VIP Call Girls Noida Sector 44 Call Me: 8448380779
Best VIP Call Girls Noida Sector 44 Call Me: 8448380779Best VIP Call Girls Noida Sector 44 Call Me: 8448380779
Best VIP Call Girls Noida Sector 44 Call Me: 8448380779Delhi Call girls
 
WAEC Carpentry and Joinery Past Questions
WAEC Carpentry and Joinery Past QuestionsWAEC Carpentry and Joinery Past Questions
WAEC Carpentry and Joinery Past QuestionsCharles Obaleagbon
 
Chapter 19_DDA_TOD Policy_First Draft 2012.pdf
Chapter 19_DDA_TOD Policy_First Draft 2012.pdfChapter 19_DDA_TOD Policy_First Draft 2012.pdf
Chapter 19_DDA_TOD Policy_First Draft 2012.pdfParomita Roy
 
Dubai Call Girls Pro Domain O525547819 Call Girls Dubai Doux
Dubai Call Girls Pro Domain O525547819 Call Girls Dubai DouxDubai Call Girls Pro Domain O525547819 Call Girls Dubai Doux
Dubai Call Girls Pro Domain O525547819 Call Girls Dubai Douxkojalkojal131
 
Captivating Charm: Exploring Marseille's Hillside Villas with Our 3D Architec...
Captivating Charm: Exploring Marseille's Hillside Villas with Our 3D Architec...Captivating Charm: Exploring Marseille's Hillside Villas with Our 3D Architec...
Captivating Charm: Exploring Marseille's Hillside Villas with Our 3D Architec...Yantram Animation Studio Corporation
 
Punjabi Housewife Call Girls Service Gomti Nagar \ 9548273370 Indian Call Gir...
Punjabi Housewife Call Girls Service Gomti Nagar \ 9548273370 Indian Call Gir...Punjabi Housewife Call Girls Service Gomti Nagar \ 9548273370 Indian Call Gir...
Punjabi Housewife Call Girls Service Gomti Nagar \ 9548273370 Indian Call Gir...nagunakhan
 

Dernier (20)

Top Rated Pune Call Girls Saswad ⟟ 6297143586 ⟟ Call Me For Genuine Sex Serv...
Top Rated  Pune Call Girls Saswad ⟟ 6297143586 ⟟ Call Me For Genuine Sex Serv...Top Rated  Pune Call Girls Saswad ⟟ 6297143586 ⟟ Call Me For Genuine Sex Serv...
Top Rated Pune Call Girls Saswad ⟟ 6297143586 ⟟ Call Me For Genuine Sex Serv...
 
AMBER GRAIN EMBROIDERY | Growing folklore elements | Root-based materials, w...
AMBER GRAIN EMBROIDERY | Growing folklore elements |  Root-based materials, w...AMBER GRAIN EMBROIDERY | Growing folklore elements |  Root-based materials, w...
AMBER GRAIN EMBROIDERY | Growing folklore elements | Root-based materials, w...
 
Design Inspiration for College by Slidesgo.pptx
Design Inspiration for College by Slidesgo.pptxDesign Inspiration for College by Slidesgo.pptx
Design Inspiration for College by Slidesgo.pptx
 
Top Rated Pune Call Girls Koregaon Park ⟟ 6297143586 ⟟ Call Me For Genuine S...
Top Rated  Pune Call Girls Koregaon Park ⟟ 6297143586 ⟟ Call Me For Genuine S...Top Rated  Pune Call Girls Koregaon Park ⟟ 6297143586 ⟟ Call Me For Genuine S...
Top Rated Pune Call Girls Koregaon Park ⟟ 6297143586 ⟟ Call Me For Genuine S...
 
Recommendable # 971589162217 # philippine Young Call Girls in Dubai By Marina...
Recommendable # 971589162217 # philippine Young Call Girls in Dubai By Marina...Recommendable # 971589162217 # philippine Young Call Girls in Dubai By Marina...
Recommendable # 971589162217 # philippine Young Call Girls in Dubai By Marina...
 
CBD Belapur Individual Call Girls In 08976425520 Panvel Only Genuine Call Girls
CBD Belapur Individual Call Girls In 08976425520 Panvel Only Genuine Call GirlsCBD Belapur Individual Call Girls In 08976425520 Panvel Only Genuine Call Girls
CBD Belapur Individual Call Girls In 08976425520 Panvel Only Genuine Call Girls
 
young call girls in Vivek Vihar🔝 9953056974 🔝 Delhi escort Service
young call girls in Vivek Vihar🔝 9953056974 🔝 Delhi escort Serviceyoung call girls in Vivek Vihar🔝 9953056974 🔝 Delhi escort Service
young call girls in Vivek Vihar🔝 9953056974 🔝 Delhi escort Service
 
(AISHA) Ambegaon Khurd Call Girls Just Call 7001035870 [ Cash on Delivery ] P...
(AISHA) Ambegaon Khurd Call Girls Just Call 7001035870 [ Cash on Delivery ] P...(AISHA) Ambegaon Khurd Call Girls Just Call 7001035870 [ Cash on Delivery ] P...
(AISHA) Ambegaon Khurd Call Girls Just Call 7001035870 [ Cash on Delivery ] P...
 
VVIP Pune Call Girls Hadapsar (7001035870) Pune Escorts Nearby with Complete ...
VVIP Pune Call Girls Hadapsar (7001035870) Pune Escorts Nearby with Complete ...VVIP Pune Call Girls Hadapsar (7001035870) Pune Escorts Nearby with Complete ...
VVIP Pune Call Girls Hadapsar (7001035870) Pune Escorts Nearby with Complete ...
 
VIP Call Girls Service Bhagyanagar Hyderabad Call +91-8250192130
VIP Call Girls Service Bhagyanagar Hyderabad Call +91-8250192130VIP Call Girls Service Bhagyanagar Hyderabad Call +91-8250192130
VIP Call Girls Service Bhagyanagar Hyderabad Call +91-8250192130
 
Best VIP Call Girls Noida Sector 47 Call Me: 8448380779
Best VIP Call Girls Noida Sector 47 Call Me: 8448380779Best VIP Call Girls Noida Sector 47 Call Me: 8448380779
Best VIP Call Girls Noida Sector 47 Call Me: 8448380779
 
VIP Russian Call Girls in Gorakhpur Deepika 8250192130 Independent Escort Ser...
VIP Russian Call Girls in Gorakhpur Deepika 8250192130 Independent Escort Ser...VIP Russian Call Girls in Gorakhpur Deepika 8250192130 Independent Escort Ser...
VIP Russian Call Girls in Gorakhpur Deepika 8250192130 Independent Escort Ser...
 
Best VIP Call Girls Noida Sector 44 Call Me: 8448380779
Best VIP Call Girls Noida Sector 44 Call Me: 8448380779Best VIP Call Girls Noida Sector 44 Call Me: 8448380779
Best VIP Call Girls Noida Sector 44 Call Me: 8448380779
 
B. Smith. (Architectural Portfolio.).pdf
B. Smith. (Architectural Portfolio.).pdfB. Smith. (Architectural Portfolio.).pdf
B. Smith. (Architectural Portfolio.).pdf
 
young call girls in Pandav nagar 🔝 9953056974 🔝 Delhi escort Service
young call girls in Pandav nagar 🔝 9953056974 🔝 Delhi escort Serviceyoung call girls in Pandav nagar 🔝 9953056974 🔝 Delhi escort Service
young call girls in Pandav nagar 🔝 9953056974 🔝 Delhi escort Service
 
WAEC Carpentry and Joinery Past Questions
WAEC Carpentry and Joinery Past QuestionsWAEC Carpentry and Joinery Past Questions
WAEC Carpentry and Joinery Past Questions
 
Chapter 19_DDA_TOD Policy_First Draft 2012.pdf
Chapter 19_DDA_TOD Policy_First Draft 2012.pdfChapter 19_DDA_TOD Policy_First Draft 2012.pdf
Chapter 19_DDA_TOD Policy_First Draft 2012.pdf
 
Dubai Call Girls Pro Domain O525547819 Call Girls Dubai Doux
Dubai Call Girls Pro Domain O525547819 Call Girls Dubai DouxDubai Call Girls Pro Domain O525547819 Call Girls Dubai Doux
Dubai Call Girls Pro Domain O525547819 Call Girls Dubai Doux
 
Captivating Charm: Exploring Marseille's Hillside Villas with Our 3D Architec...
Captivating Charm: Exploring Marseille's Hillside Villas with Our 3D Architec...Captivating Charm: Exploring Marseille's Hillside Villas with Our 3D Architec...
Captivating Charm: Exploring Marseille's Hillside Villas with Our 3D Architec...
 
Punjabi Housewife Call Girls Service Gomti Nagar \ 9548273370 Indian Call Gir...
Punjabi Housewife Call Girls Service Gomti Nagar \ 9548273370 Indian Call Gir...Punjabi Housewife Call Girls Service Gomti Nagar \ 9548273370 Indian Call Gir...
Punjabi Housewife Call Girls Service Gomti Nagar \ 9548273370 Indian Call Gir...
 

M & E Presentation DSK.ppt

  • 1. Content • Definitions and types of monitoring and evaluation activities • Design, Monitoring & Evaluation Cycles • Logical framework for strategic planning • Monitoring and Evaluation Methods: Methods of Program Review, Supervision and Evaluation
  • 2. Agenda ....... • Basic quantitative and qualitative data collection techniques • Data Processing: analysis, interpret and presentation of routine program data • Structure of a technical report: understanding observation, interpretation, conclusion and recommendation
  • 3. Objectives Develop a common understanding of monitoring and evaluation
  • 4. Monitoring? Or Monitoring and Evaluation Design Monitoring and Evaluation (DME) or DME-IS (Information System) Participatory Monitoring and Evaluation (PME)
  • 6. Project DM&E Cycle Diagnosis Hypothesis, Appraisal Design Analysis Log frame Focus strategy M&E plan Coherent information system Baseline Monitoring Evaluation Reflective practice Lesson learned
  • 7. The Steps in a Survey Project 1. Establish the goals of the project - What you want to learn 2. Determine your sample - Whom you will interview 3. Choose interviewing methodology - How you will interview 4. Create your questionnaire - What you will ask 5. Pre-test the questionnaire, if practical - Test the questions 6. Conduct interviews and enter data - Ask the questions 7. Analyze the data - Produce the reports
  • 8. M&E work plan • Identify program goals and objectives • Determine evaluation questions, indicators and their feasibility • Prepare design methodology – monitoring the process and evaluating the effects • Resolve implementation issues: who will carry out the work and how will existing data and past evaluation studies be used? • Identify internal and external evaluation resources and capacity • Develop the M&E work plan matrix and timeline • Develop plan to disseminate and use evaluation findings
  • 9. Monitoring Steps •Setting environment •Setting performance standard •Measuring indicator •Data collection for MIS •Reporting •Action
  • 10. Steps in designing M&E system •Review project document, goal, objective. •LFA analysis •Stakeholders analysis •M&E strategy and methodology •Select indicator •Develop the M&E plan •Scope of monitoring system •Data collection for M&E oBaseline oRegular monitoring oRapid survey oMid term and Final evaluation •Data management •Data processing and analyzing •Reporting and presentation •Communication
  • 11. Area of Monitoring •Program activities •Implementation process •Pit fall or short coming •Financial and expenditure •Training & Follow-up •Reported activities
  • 12. Monitoring may be Qualitative & quantitative Participatory and Non- participatory Internal & External Direct and Indirect Oral or written
  • 13. Component of Project Monitoring system Financial monitoring Monitoring project implementation Assessing the efficiency of project implementation. Beneficiary contact monitoring
  • 14. Developing M&E matrix after project start-up involves six steps •Identify performance questions •Identify information needs and indicators •Know the what baseline information you need •Select which data gathering methods to use, by whom and how often •Identify the necessary practical support for information gathering •Organize, analysis, feedback and change.
  • 15. Monitoring and evaluation planning matrix •Hierarchy of objective •Objectively verifiable indicator (OVI) •Sources of information •Methods of data collection •Methods of data analysis •Type of activity (regular monitoring, periodic, evaluation) •Frequency •Application (expected uses and uses) •Circulation (expected information uses)
  • 16. Sample of M&E Matrix — columns: Performance Questions | Information Needs and Indicators | Baseline Information Requirements, Status and Responsibilities | Data-Gathering Methods, Frequency and Responsibilities | Required Forms, Planning, Training, Data Management, Expertise, Resources and Responsibilities | Analysis, Reporting, Feedback, Change Processes and Responsibilities. Rows: Project Key Outcome-1; Project Key Output-1.
  • 17. Sample of M&E Matrix
  • 18. Monitoring Cycle: Selection of indicators → Prepare checklist → Design questionnaire → Prepare processing sheet → Structure the presentation → Field-test the questionnaire → Finalize the design → Data collection → Cross-checking → Share the emerging findings → Data processing → Data analysis → Share the data findings → Prepare the analysis report → Circulate the report
  • 19. Limitations of Monitoring •Inadequate resources •Lack of knowledge and skills •Limited funds •Dilemmas and problems •Misinterpretation •Distortion of communication •Lack of time
  • 20. Importance of evaluation •It is an 'end-of-the-day' activity: a look back over the shoulder to see what has been done or accomplished. •It is an event that provides staff and participants with the opportunity to step back and take a deeper look at the outcome. •It refers to the periodic examination and analysis of project information that pertains to: oProject design – goal, objectives and plans oProject implementation – inputs and outputs oResults – effects and impact. •Evaluation attempts to identify project efficiency, effectiveness, sustainability and relevance.
  • 21. Importance of evaluation cont .... •It provides stakeholders with the information they need to analyze whether the project is meeting its planned goal and objectives. •Project and program planners and donors need information about the strengths and weaknesses, and the successes and failures, of projects in order to maintain or improve the quality of design and strategic planning. •Donors need information in order to assess the value of their financial contribution. •Evaluation provides decision makers (project and program planners and donors) with information on the core evaluation questions (relevance, effectiveness, efficiency, impact and sustainability).
  • 22. Purpose of evaluation •Achievement – seeing what has been achieved. •Measuring progress according to objectives. •Improving monitoring for better management. •Identifying strengths and weaknesses to improve the program. •Seeing if efforts were effective – what difference has the program made? •Cost-benefit analysis to assess effectiveness and efficiency – were the costs reasonable? •Collecting information – to plan and manage program activities better. •Sharing experience – to prevent others from making similar mistakes or to encourage them to use similar methods. •Revising/readjusting program strategy. •Improving effectiveness to have more impact. •Better/future planning.
  • 23. The five core evaluation questions Relevance: Was/Is the project a good idea given the situation it needed to address? Does it deal with target-group priorities? Why or why not? Effectiveness: Have the planned purpose, component objectives, outputs and activities been achieved? Why or why not? Efficiency: Were inputs (resources and time) used in the best possible way to achieve outcomes? Why or why not? What could we do differently to improve implementation, thereby maximizing impact at an acceptable and sustainable cost?
  • 24. The five core evaluation questions Impact: To what extent has the project contributed towards its longer-term goals? Why or why not? What unanticipated positive or negative consequences did the project have? Why did they arise? Sustainability: Will there be continued positive impacts as a result of the project once it has finished? Why or why not?
  • 25. Information needs in different stages of a project
  Before Project – Project Appraisal: assesses development needs, the potential of the target group or area, inputs required, cost-benefit analysis. Key questions: What are the problems? What are the resources? What are the unmet needs? What is the cost-benefit ratio?
  Project Start-up – Baseline survey: assesses the current situation of the target group. Key question: What is the current situation?
  During Implementation – Monitoring: assesses progress in implementation, project efficiency and effectiveness. Key question: Is the project proceeding according to plan and strategy?
  Mid-project – Formative Evaluation: assesses the capacity of the project to achieve what it set out to do, and the strength of the implementing institution in terms of its ability to sustain the program. Key question: Are the project strategies working?
  End of Project – Summative Evaluation: assesses changes in the lives of the target group. Key question: What effect(s) did the project have?
  After Project – Ex-post Evaluation: assesses sustainable improvement in the human condition or well-being. Key question: What impact did the project have on the lives of the people it was designed to affect?
  • 26. Comparing Monitoring and Evaluation
  Main purpose – Monitoring: to guide future implementation; to ensure accountability of management and main stakeholders; to provide a basis for evaluation and learning. Evaluation: to maintain or improve the quality of project design and strategic planning; to guide future project planning.
  Assessment – Monitoring: assess progress in implementation; identify project efficiency and effectiveness. Evaluation: assess overall outputs, effects, impact and sustainability of the project; determine the relevance of the project design; identify lessons learned.
  Who needs this information – Monitoring: target group, main stakeholders, project management staff, donor. Evaluation: project planners, donors, government, organization's headquarters.
  Timing – Monitoring: continuously during project implementation. Evaluation: at specified times – during project implementation, at project completion, 3–5 years after project completion.
  Source of information – Monitoring: project plan, administrative records and reports, observation. Evaluation: monitoring reports, participant or population-based surveys, observations, in-depth case studies, external information.
  Type of data required – Monitoring: mostly quantitative data with some qualitative data to add meaning. Evaluation: a combination of qualitative and quantitative data.
  People involved – Monitoring: project-based monitoring team. Evaluation: external organization, or external staff of the same organization.
  Process – Monitoring: continuous. Evaluation: periodic.
  Data nature – Monitoring: mainly primary. Evaluation: both primary and secondary.
  Analysis – Monitoring: not in depth. Evaluation: in-depth.
  Timeliness – Monitoring: quicker, compared against standards. Evaluation: lengthy, judged with respect to objectives.
  • 27. What is an indicator? A set of data which, if collected on a regular basis, will indicate progress towards the specific aims and objectives of a project or piece of work. Indicators are evidence that something has happened or that an objective has been achieved: they are not proof. Something which provides a basis for determining changes in a target population as a result of project activities. An indicator is like a marker/milestone which shows what progress has been made. It may be qualitative or quantitative.
  • 28. Criteria of a good indicator Relevant: The indicators should be directly linked to the project objectives, and to the appropriate levels in the hierarchy. Technically feasible: The indicators should be capable of being assessed (or ‘measured’ if they are quantitative). Reliable: The indicators should be verifiable and (relatively) objective; i.e., conclusions based on them should be the same if they are assessed by different people at different times and under different circumstances. Useable: People in the project should be able to understand and use the information provided by the indicators to make decisions or improve their work and the performance of the project Participatory: The steps for working with the indicator should be capable of being carried out with the target community and other stakeholders in a participatory manner: i.e., data collection, analysis and use.
  • 29. Other Criteria comprehensible - the indicators should be worded simply and clearly so that people involved in the project will be able to understand them. valid – the indicators should actually measure what they are supposed to measure, e.g., measuring effects due to project interventions rather than outside influences. sensitive – they should be capable of demonstrating changes in the situation being observed, e.g., measuring the GNP of Uganda doesn’t tell us much about the individual households in one district. cost-effective – the results should be worth the time and money it costs to collect, analyse and apply them. timely – it should be possible to collect and analyse the data reasonably quickly, i.e., in time to be useful for any decisions that have to be made. ethical – the collection and use of the indicators should be acceptable to the communities (target populations) providing the information. Other qualities often listed: easy to collect; specific; measurable; attainable; periodic; avoiding redundancy.
  • 30. Examples of Quantitative indicators — Employment & Income: Total family income; Source of income; Income stability; Type of employment; Number of people working; Labor force participation rate of a particular group; Proportion of self-employed; Proportion working in formal and informal sectors; Savings in cash and in kind. Demographic characteristics of the family: Family size and stability; Age composition; Education; Proportion of children attending school; Civil status of HH head (service, occupation); Geographic mobility.
  • 31. Examples of Quantitative indicators — Housing (cost, quantity, value): Sale or rental value of house and housing land; Construction quality; House size; Access to services from the house (distance, location). Health: Maternal and child (MCH) mortality rates; Cause-specific mortality rate and leading causes of death; Time lost from work or school due to sickness; Access to medical services; Amount spent on medical services; Anthropometric measures of weight and height.
  • 32. Examples of Quantitative indicators — Consumption pattern: Amount spent on housing; Amount spent on food; Amount spent on education; Transportation; Medical care; Amount saved. Community participation and attitude: Number of community organizations in which the household participates; Number of friends in the community and project; Political, social and religious organizations; Participation in mutual-help programs; Satisfaction with the community; Satisfaction with the social, economic and political situation.
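To illustrate how such quantitative indicators might be computed once survey data are in hand, here is a minimal sketch; the household records and field names (income, food_spend, children, children_in_school) are hypothetical and not taken from this material.

```python
# Hypothetical household records from a survey round
households = [
    {"income": 1200, "food_spend": 540, "children": 3, "children_in_school": 2},
    {"income": 800,  "food_spend": 460, "children": 2, "children_in_school": 2},
    {"income": 1500, "food_spend": 600, "children": 1, "children_in_school": 0},
]

# Consumption pattern: average share of income spent on food
food_share = sum(h["food_spend"] / h["income"] for h in households) / len(households)

# Education: proportion of children attending school
total_children = sum(h["children"] for h in households)
in_school = sum(h["children_in_school"] for h in households)
school_attendance = in_school / total_children

print(f"Average food share of income: {food_share:.1%}")
print(f"School attendance rate: {school_attendance:.1%}")
```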
  • 33. Types of Question Structured: quantitative focus – closed-ended. Non-structured: qualitative focus – open-ended. Contingency: asked only after a filter question, e.g., if yes – what is it? Matrix question: a large set of related questions sharing the same response categories.
  • 34. What are the considerations in developing a question? Review the output/outcome indicator. Identify relevant information for the specific indicator. Identify relevant questions for quality information, considering the following: Why (purpose); For whom (level of respondent); What (requirement); When/time; How (method); Who will do it (skills & competencies); Information breakdown; Cost constraints; Simple, short & purposive; Logical, sequential arrangement. Take feedback from all concerned. Field-test to check validity and reliability. Finalize.
  • 35. Principles of question formulation Catchy wording; Short (minimum topics needed to meet the objective); Simple; Easy for the respondent; Interesting for the respondent; Has a complete guideline; Self-explanatory; Avoids superficial results; Each question focuses on a particular issue; Handle sensitive, emerging and emotional issues with care; Sequential; Only essential questions; Valid supportive questions; Tabulation-friendly; Leading and prompting questions should be avoided; Professional words, jargon and technical terms should be avoided
  • 36. Data Collection Methods Sampling-related methods: random & non-random methods. Core M&E methods: Stakeholder analysis; Document review; Biophysical measurement; Direct observation; Cost-benefit analysis; Questionnaires & surveys; Semi-structured interviews; Case studies
  • 37. Data Collection Methods Discussion methods: Brainstorming; FGD; Nominal group technique (between two groups); SWOT analysis; Dreams realized or visioning; Drama & role play. Methods for spatially distributed information: Mapping; GIS mapping; Photographs; Video
  • 38. Data Collection Methods Methods for time-based patterns of change: Seasonal diagrams; Diary reviews; Trend analysis; Calendars; Significant change. Methods for analyzing linkages and relationships: Rich picture or mind map; Impact flow diagram; Institutional linkage diagram; Problem and objective trees; M&E wheel. Ranking and prioritizing: Social mapping; Well-being ranking; Matrix scoring; Relative scales; Ranking and pocket chart
  • 39. Types of Reports Technical reports: Summary of results; Nature of the study; Methods employed; Data (collection, sources, characteristics, limitations); Analysis of data and presentation of findings; Conclusion (a summary of findings); Bibliography; Technical appendices (giving full technical material relating to the questionnaire, numerical derivations, elaboration of particular techniques of analysis and the like); Index
  • 40. Types of Reports Popular reports (simple and attractive, clear, minimal technical terms, liberal use of charts and diagrams, user-friendly): The findings and their implications; Recommendations for action; Objective of the study; Methods employed; Results summary and details in terms of project objectives; Technical appendices
  • 41. Report Writing Style Be brief, concise and to the point; Use simple and clear language; Follow a logical sequence of presentation; Avoid unsupported statements and recommendations; Be pragmatic and constructive; Make the final product
  • 42. Outline of Research Report Front page: Project name; Title/heading of the research report; Done by. Preliminary pages: Preface/foreword; List of contents (page-wise). Main text: Introduction; Statement of findings and recommendations; The implications drawn from the results; The summary (a brief summary restating the research problem, the methodology, the major findings and the major recommendations). End matter: Questionnaire; Bibliography of sources; Appendix
  • 43. Outline of Monitoring Report Front page: Project name; Title/heading of the monitoring report; Monitoring period. Content: Summary statement; Major indicators or variables; Objective and purpose of the monitoring; Method and process used for data collection; Process of data analysis; Indicator-wise findings (standard vs. actual); Recommendations; Conclusion; Appendix
  • 44. Outline of Evaluation Report Cover page: Title of the evaluation report; Name of the project or program evaluated; Location; Project duration; Name of evaluator(s); Evaluation period. Executive summary: Goal and objectives of the project; Project description; Project interventions; Purpose of the evaluation; Major areas of focus of the evaluation. Evaluation design: Methodology; Sample and sampling techniques; Instruments used; Sources of information; Method and process used for data collection; Process of data analysis
  • 45. Monitoring Monitoring is a continuous internal management activity whose purpose is to ensure that the program achieves its defined objectives within a prescribed time-frame and budget.
  • 46. Evaluation Evaluation is an internal or external management activity to assess the appropriateness of a program's design and implementation methods in achieving both specified objectives and more general development objectives.
  • 47. MIS A management information system (MIS) is the series of processes and actions involved in capturing raw data, processing it into usable information, and disseminating it to users in the form needed.
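The capture–process–disseminate chain in this definition can be pictured with a minimal sketch; the record structure and the "people trained per site" summary are illustrative assumptions, not a prescribed MIS design.

```python
from collections import defaultdict

def capture(raw_rows):
    """Capture raw data: keep only complete, well-formed records."""
    return [r for r in raw_rows if r.get("site") and r.get("people_trained") is not None]

def process(records):
    """Process into usable information: total people trained per site."""
    totals = defaultdict(int)
    for r in records:
        totals[r["site"]] += r["people_trained"]
    return dict(totals)

def disseminate(summary):
    """Disseminate to users in the form needed: a simple monthly summary."""
    for site, total in sorted(summary.items()):
        print(f"{site}: {total} people trained this month")

raw = [{"site": "Site A", "people_trained": 25},
       {"site": "Site B", "people_trained": 18},
       {"site": "Site A", "people_trained": 12},
       {"site": None, "people_trained": 7}]      # incomplete record is dropped
disseminate(process(capture(raw)))
```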
  • 48. Why M&E •To improve program design and implementation •To fulfill reporting requirements •To know whether project implementation is on the right track in relation to objectives •To plan for future programs
  • 49. Monitoring & Evaluation Pipeline (levels of evaluation effort vs. number of projects): all projects monitor inputs (resources, staff, funds, materials, facilities, supplies, training); most track outputs (trained staff, quality of services, knowledge of the CBO); some measure short-term and intermediate effects/outcomes (behavior change, attitude change, changes in livelihood, increase in CBO support); few measure long-term effects/impact (changes in income, nutrition, coping capacity in the community, economic impact). Monitoring covers inputs and outputs ("process evaluation"); evaluation covers outcomes and impact ("effectiveness evaluation").
  • 50. Monitoring & Evaluation at different levels — the following defines the common terms with examples.
  Inputs: the financial, human and material resources used for the development intervention. Examples: technical expertise, equipment, funds.
  Activities: actions taken or work performed. Example: training workshops conducted.
  Outputs: the products, capital goods and services that result from a development intervention. Examples: number of people trained, number of workshops conducted.
  Outcomes: the likely or achieved short-term and medium-term effects or changes of an intervention's outputs. Examples: increased skills, new employment opportunities.
  Impacts: the long-term consequences of the program, which may be positive or negative. Example: improved standard of living.
  • 51. Types and Objectives of M&E (approaches for all objectives: quantitative, qualitative, participatory, costing)
  1. Program design – types: baseline, formative.
  2. Ongoing monitoring of process – type: monitoring of inputs, process, outputs and quality.
  3. Determine whether objectives are achieved ("best practices") – types: evaluation of outcome/impact, special studies, research.
  4. Program refinement/redesign (M&E for decision-making, "lessons learned") – types: program review of monitoring data, special studies.
  5. Accountability at program level – types: monitoring data, program review results, evaluation with a comparison group.
  • 52. Qualitative methods • Focus Group Discussions (FGD) • Case Studies • Interviews – semi‐structured, structured • Participatory methods: Participatory Rural Appraisal (PRA), Rapid Rural Appraisal (RRA) • Most Significant Change (MSC) • Observations
  • 53. Impact evaluation methods a. Pre‐Post b. Simple Difference c. Differences‐in‐Differences d. Multivariate Regression e. Statistical Matching f. Instrumental Variables g. Regression Discontinuity h. Randomized Evaluations
  • 54. Pre-post (Before vs. After) Method 1: Before vs. After. [Chart: participants' outcome rises from 25 in 2010 to 67 in 2012.] Impact = 42 points?
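A minimal sketch of the pre-post estimate using the slide's illustrative numbers; in code it is nothing more than a before-after subtraction, which is exactly why the attribution question ("is the impact really 42 points?") matters.

```python
# Pre-post (before vs. after) estimate, using the slide's illustrative numbers
outcome_before = 25   # participants' average outcome in 2010
outcome_after = 67    # participants' average outcome in 2012

impact_estimate = outcome_after - outcome_before
print(f"Pre-post impact estimate: {impact_estimate} points")  # 42 points?
```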
  • 55. How to measure impact? Impact is defined as a comparison between: 1. the outcome some time after the program has been introduced, and 2. the outcome at that same point in time had the program not been introduced – the "counterfactual"
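Written as a formula (a sketch of the definition above; the second term is never observed, which is why every method that follows is a different way of approximating it):

```latex
\text{Impact}_t \;=\; Y_t^{\text{with program}} \;-\; Y_t^{\text{without program (counterfactual)}}
```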
  • 56–58. Impact: What is it? [Three diagram slides: outcome plotted against time, marking the intervention point; the impact is the gap between the outcome with the program and the counterfactual outcome without it.]
  • 59. Simple Difference A post-program comparison of outcomes between the group that received the program and a "comparison" group that did not • Example: – the program is rolled out in phases, leaving a cohort for comparison, even though the assignment of the program is not random
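A minimal sketch, assuming post-program outcome data are already available for participants and for a comparison cohort; the numbers are made up for illustration.

```python
from statistics import mean

# Post-program outcomes (hypothetical data)
participants = [67, 72, 58, 70, 65]
comparison   = [48, 55, 50, 46, 52]

simple_difference = mean(participants) - mean(comparison)
print(f"Simple-difference impact estimate: {simple_difference:.1f} points")
```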
  • 60. Difference‐in‐Differences (or Double Difference) Comparison of outcome between a treatment and comparison group (1st difference) and before and after the program (2nd difference) • Suitability: – program is rolled out in phases leaving a cohort for comparison, even though assignment of treatment is not random
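A minimal sketch of the double difference, assuming before-and-after group means are available for both a treatment and a comparison group; the numbers are made up for illustration. The estimate is only credible if the two groups would have followed parallel trajectories without the program (see the conditions table below).

```python
# Hypothetical group means, before and after the program
treatment_before, treatment_after = 25.0, 67.0
comparison_before, comparison_after = 30.0, 55.0

first_difference  = treatment_after - comparison_after      # between groups, after
second_difference = treatment_before - comparison_before    # between groups, before
did_estimate = first_difference - second_difference         # removes the pre-existing gap

# Equivalent view: (change in treatment group) - (change in comparison group)
assert did_estimate == (treatment_after - treatment_before) - (comparison_after - comparison_before)
print(f"Difference-in-differences estimate: {did_estimate} points")
```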
  • 61. Constructing the counterfactual • Counterfactual is often constructed by selecting a group not affected by the program • Non‐randomized: – Argue that a certain excluded group mimics the counterfactual. • Randomized: – Use random assignment of the program to create a control group which mimics the counterfactual.
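A minimal sketch of the randomized option: randomly assigning eligible units so that the control group can stand in for the counterfactual. The village list, the fixed seed and the 50/50 split are hypothetical choices for illustration.

```python
import random

eligible_villages = [f"Village {i:02d}" for i in range(1, 21)]

random.seed(42)                      # fixed seed so the assignment is reproducible
random.shuffle(eligible_villages)

half = len(eligible_villages) // 2
treatment_group = sorted(eligible_villages[:half])   # receive the program
control_group   = sorted(eligible_villages[half:])   # mimic the counterfactual

print("Treatment:", treatment_group)
print("Control:  ", control_group)
```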
  • 62. Conditions required
  Pre-post – Comparison: program participants before the program. Works if: the program was the only factor influencing any changes in the measured outcome over time.
  Simple Difference – Comparison: individuals who did not participate (data collected after the program). Works if: non-participants are identical to participants except for program participation, and were equally likely to enter the program before it started.
  Differences-in-Differences – Comparison: same as above, plus data collected before and after. Works if: had the program not existed, the two groups would have had identical trajectories over this period.
  Multivariate Regression – Comparison: same as above, plus additional "explanatory" variables. Works if: omitted variables (not measured or not observed) do not bias the results because they are either uncorrelated with the outcome or do not differ between participants and non-participants.
  Propensity Score Matching – Comparison: non-participants whose mix of characteristics predicts they would be as likely to participate as participants. Works if: same as above.
  Randomized Evaluation – Comparison: participants randomly assigned to a control group. Works if: randomization "works" – the two groups are statistically identical on observed and unobserved characteristics.
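As an illustration of the propensity score matching row above, here is a toy sketch on simulated data: participation is predicted from covariates, each participant is matched to the nearest non-participant by propensity score, and outcomes of the matched pairs are compared. The data-generating process and the scikit-learn/numpy usage are illustrative assumptions, not part of the original material.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical data: two covariates, a participation flag and an outcome
n = 200
X = rng.normal(size=(n, 2))                                  # e.g. household size, baseline income
treated = (X[:, 0] + rng.normal(size=n) > 0).astype(int)     # selection depends on covariates
outcome = 2.0 * treated + X[:, 1] + rng.normal(size=n)       # true effect built in as 2.0

# 1. Estimate propensity scores: probability of participating given covariates
pscore = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. For each participant, find the nearest non-participant by propensity score
treated_idx = np.where(treated == 1)[0]
control_idx = np.where(treated == 0)[0]
matched_controls = [control_idx[np.argmin(np.abs(pscore[control_idx] - pscore[i]))]
                    for i in treated_idx]

# 3. Compare outcomes of participants with their matched non-participants
att = outcome[treated_idx].mean() - outcome[matched_controls].mean()
print(f"Matched estimate of the treatment effect: {att:.2f}")
```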
  • 63. Impact estimate by method: (1) Pre-post; (2) Simple Difference; (3) Difference-in-Differences; (4) Regression with controls
  • 64. Other Methods • There are more sophisticated non‐experimental and quasi‐experimental methods to estimate program impacts: – Multivariable Regression – Matching – Instrumental Variables – Regression Discontinuity • These methods rely on being able to “mimic” the counterfactual under certain assumptions • Problem: Assumptions are not testable
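As an illustration of "regression with controls" (method 4 in the earlier comparison), here is a minimal sketch on simulated data; the data-generating process is an assumption, with a true treatment effect of 3.0 built in so the estimate can be checked against it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: outcome depends on treatment and on a baseline covariate
n = 500
baseline = rng.normal(size=n)
treated = (rng.random(n) < 0.5).astype(float)
outcome = 5.0 + 3.0 * treated + 2.0 * baseline + rng.normal(size=n)

# Multivariate regression: outcome ~ intercept + treatment + baseline control
X = np.column_stack([np.ones(n), treated, baseline])
coefs, *_ = np.linalg.lstsq(X, outcome, rcond=None)

print(f"Estimated treatment effect (true value 3.0): {coefs[1]:.2f}")
```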
  • 66. MONITORING AND EVALUATION CRITERIA There are different versions of monitoring and evaluation criteria suggested by various development agencies. However, all point to a common set of five (also adopted by UNDP). These criteria internalize a key constraint in undertaking M&E functions – time and resources to accomplish the tasks are always limited – and the emphasis is on providing timely input for management and policy guidance to relevant stakeholders.
  • 67. MONITORING AND EVALUATION CRITERIA (1) Relevance The first criterion in monitoring and evaluation should be whether the objective(s) of the program/project remain valid. This can be judged in two ways. First, a relevance assessment needs to be made in the context of the original plan. Second, relevance should be assessed in the changing context of socio-political-environmental shifts not perceived in the original plan. Relevance needs to be addressed with respect to development issues, target groups, direct beneficiaries, the donor's mission and comparative advantage amongst development partners. The key question to ask is whether the program/project is relevant to beneficiaries' needs, national priorities and the donor's (e.g. World Bank) mandate.
  • 68. MONITORING AND EVALUATION CRITERIA (2) Effectiveness Effectiveness addresses the extent to which a program/project achieves its immediate objectives and produces expected outcomes. The key question to ask is whether the program/project objective(s) have been achieved or are expected to be achieved. It is one of the performance measures. The program/project activities and deliverables are also assessed against the timeliness of inputs and results. (3) Efficiency Efficiency is the second measure of program/project performance and reflects the optimum transformation of inputs into outputs. It addresses resource utilization, and the key question to ask is to what extent the program/project has efficiently used available resources (including human, capital and physical/natural).
  • 69. MONITORING AND EVALUATION CRITERIA (4) Impact Impact represents change over a given baseline (benchmark) over a specified time frame. Impact can occur in planned as well as unplanned ways, and it can be positive or negative. The key question to ask is what are the positive and negative, intended and unintended changes brought about by the program/project intervention. Impact assessment is made over two points in time and effectively requires baseline data on key indicators (discussion on indicators follows). Positive impacts are initial signs of program/project success, but these impacts need to be sustainable (refer to the next criterion).
  • 70. MONITORING AND EVALUATION CRITERIA (5) Sustainability Sustainability is a critical element of the monitoring and evaluation criteria. One would expect a given program/project to be self-sustaining without external assistance. A sustainable outcome from a given program/project represents success of the project. The key question to ask is whether the benefits/activities of a given program/project will continue after the end of the project.
  • 71. Indicators Indicators selected for M&E functions should be SMART indicators (ITAD, 1996): Specific, Measurable, Attainable, Relevant, Trackable.
  • 72. Key Questions Pertaining to Data Collection and Analysis
  Question: What type of data do we need? Justification: data needs are defined by key stakeholders so that the data can be used effectively for regular monitoring, impact assessment and program/project evaluation.
  Question: Where do we get data from? Justification: important so that data gaps can be identified at the outset; part of the data can be collected from secondary sources, so resources are used only for the additional data to be collected from primary sources.
  • 73. Key Questions Pertaining to Data Collection and Analysis
  Question: What method do we use to obtain data, and how often should we collect it? Justification: resources for data collection are always limited in any program/project, so it is critical to choose a data-collection method based on the resources available and the time permissible. Data collected over too long a period may not meet program/project needs. Decisions need to be made about what data should be obtained from secondary sources and what from primary sources; primary sources may include surveys, PRA, RRA, observation, key informants, focus group discussions, etc. The context-relevant method will partially guide the frequency of data collection.
  Question: Who will undertake data collection and analysis? Justification: usually data is collected by one group of people and analyzed by another. It is important to identify the responsible persons or groups at the outset so that data collection can proceed without delay.
  • 74. Key Questions Pertaining to Data Collection and Analysis
  Question: Who will use the final results from the analysis of the data? Justification: users of the information may be all or some of the key stakeholders, but they need to be identified in advance so that, once data is collected and analyzed, the required results can be provided to them on time. This permits timely use of results for management and follow-up decisions by stakeholders. Timely use of information based on appropriately analyzed data strengthens the program/project's capability to assess whether it is achieving its objectives.
  • 75. Mechanism for Evaluation Internal Evaluation External Evaluation Mid-Term Evaluation Terminal Evaluation Ex-Post Evaluation