What is ‘Evaluation’?

    Nathan Loynes
In this presentation:
1. Definitions and disagreements about evaluation.
2. Logic Models.
3. Outcomes, Indicators and Targets.
4. Measuring Outcomes.
5. Mark Friedman: Outcomes Based Accountability.
Working Definition of Programme Evaluation

The practice of evaluation involves thoughtful, systematic collection and analysis of information about the activities, characteristics, and outcomes of programmes, for use by specific people, to reduce uncertainties, improve effectiveness, and make decisions.
Scott & Morrison (2005)
Evaluation focuses on:
• Value and worth
• Education or social programmes
• Activities, characteristics and outcomes
• Policy implications (what should happen next?)
Pawson & Tilley, 1997 (in Scott & Morrison, 2005)
• Realistic Evaluation:
1. Takes into account the ‘institutional’ nature of programmes.
2. Should be scientific.
3. Should not be self-serving.
Chen (1996) (in Scott & Morrison, 2005)
Four types of evaluation:
1. Process-improvement
2. Process-assessment
3. Outcome-improvement
4. Outcome-assessment
Evaluation Strategy Clarification
All evaluations are:
• Partly social
• Partly political
• Partly technical
Both qualitative and quantitative data can be collected and used, and both are valuable.
There are multiple ways to address most evaluation needs.
Different evaluation needs call for different designs, types of data and data collection strategies.
Purposes of Evaluation

Evaluations are conducted to:
• Render judgement
• Facilitate improvements
• Generate knowledge

The purpose of an evaluation must be specified at the earliest stages of planning, with input from multiple stakeholders.
What are Logic Models?
To Construct a Logic Model, You Must Describe:
• Inputs: resources – money, staff/time, facilities, etc.
• Outputs: how a programme uses inputs to fulfil its mission – the specific strategies and service delivery.
• Outcomes: changes to individuals or populations during or after participation.

Inputs → Outputs → Outcomes
Here is an illustration that will help you create your own logic model.

Inputs – resources dedicated to or consumed by the programme, e.g.:
• money
• staff and staff time
• volunteers and volunteer time
• facilities
• equipment and supplies

Contextual Analysis – identify the major conditions and reasons why you are doing the work in your community.

Outputs – what the programme does with the inputs to fulfil its mission, e.g.:
• provide x number of classes to x participants
• provide weekly counselling sessions
• educate the public about signs of child abuse by distributing educational materials to all agencies that serve families
• identify 20 mentors to work with youth and opportunities for them to meet monthly for one year

Outcomes – benefits for participants during and after programme activities, e.g.:
• new knowledge
• increased skills
• changed attitudes
• modified behaviour
• improved condition
• altered status
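The inputs → outputs → outcomes chain lends itself to a very small data structure. The Python sketch below is not part of the original slides; the example programme and its values are loosely adapted from the illustration above and are purely illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal sketch of a programme logic model: inputs -> outputs -> outcomes."""
    inputs: list[str] = field(default_factory=list)    # resources dedicated to or consumed by the programme
    outputs: list[str] = field(default_factory=list)   # what the programme does with those inputs
    outcomes: list[str] = field(default_factory=list)  # benefits for participants during/after activities

# Hypothetical mentoring programme, with values echoing the illustration above.
mentoring = LogicModel(
    inputs=["money", "staff time", "volunteer time", "facilities", "equipment and supplies"],
    outputs=["recruit 20 mentors", "monthly mentor/youth meetings for one year"],
    outcomes=["new knowledge", "increased skills", "changed attitudes"],
)

for stage in ("inputs", "outputs", "outcomes"):
    print(f"{stage}: {getattr(mentoring, stage)}")
```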
Outcomes, Indicators, Targets




What is the difference between outcomes, indicators, and targets?
• Outcomes are changes in behaviour, skills, knowledge, attitudes, condition or status.
• Outcomes are related to the core business of the programme, are realistic and attainable, fall within the programme’s sphere of influence, and are appropriate.
• Outcomes are what a programme is held accountable for.
What is the difference between outcomes, indicators, and targets?
• Indicators are specific characteristics or changes that represent achievement of an outcome.
• Indicators are directly related to the outcome and help define it.
• Indicators are measurable and observable (they can be seen, heard or read) and make sense in relation to the outcome whose achievement they signal.
What is the difference between outcomes, indicators, and targets?
• Targets specify the amount or level of outcome attainment that is expected, hoped for or required.
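One way to keep the three terms straight is to treat an indicator as a measurable signal attached to an outcome, and a target as the level of that indicator you expect to reach. The Python sketch below is an illustration only, built around a hypothetical reading-skills outcome; none of the names or numbers come from the slides.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str        # something we can count, measure or observe
    target: float    # level of attainment expected, hoped for or required
    observed: float  # what the data actually show

    def target_met(self) -> bool:
        return self.observed >= self.target

@dataclass
class Outcome:
    description: str             # change in behaviour, skills, knowledge, attitudes, condition or status
    indicators: list[Indicator]  # measurable signals that the outcome has been achieved

    def achieved(self) -> bool:
        # Treat the outcome as achieved only if every indicator hits its target.
        return all(i.target_met() for i in self.indicators)

literacy = Outcome(
    description="Participants' reading skills improve",
    indicators=[
        Indicator("share of participants gaining one reading level", target=0.60, observed=0.65),
        Indicator("share of parents reporting weekly reading at home", target=0.50, observed=0.45),
    ],
)
print(literacy.achieved())  # False: the second indicator is below its target
```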
Why measure outcomes?
• To see if your programme is really making a difference in the lives of your clients.
• To confirm that your programme is on the right track.
• To be able to communicate to others what you’re doing and how it’s making a difference.
• To get information that will help you improve your programme.
Use Caution When Identifying Outcomes
There is no ‘right’ number of outcomes.
Be sure to think about when to expect outcomes:
1) Initial outcomes – the first benefits or changes participants experience.
2) Intermediate outcomes – link initial outcomes to longer-term outcomes.
3) Longer-term outcomes – the ultimate outcomes desired for programme participants.
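If it helps to keep the three time horizons explicit during planning, they can simply be recorded alongside each expected outcome. A trivial sketch, using an invented youth-mentoring example rather than anything from the slides:

```python
# Hypothetical outcomes chain for a youth mentoring programme,
# tagged by when each outcome is realistically expected to appear.
outcomes_chain = [
    {"horizon": "initial",      "outcome": "youth attend sessions and form a relationship with a mentor"},
    {"horizon": "intermediate", "outcome": "improved school attendance and engagement"},
    {"horizon": "longer-term",  "outcome": "improved attainment and sustained positive behaviour"},
]

for item in outcomes_chain:
    print(f"{item['horizon']:>12}: {item['outcome']}")
```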
How do you identify indicators?
• Indicators are specific characteristics or changes that represent achievement of an outcome.
• Indicators are directly related to the outcome and help define it.
• Indicators are measurable and observable (they can be seen, heard or read) and make sense in relation to the outcome whose achievement they signal.
• Ask the questions shown on the following slide.
Questions to Ask When Identifying Indicators
1. What does this outcome look like when it occurs?
2. What would tell us it has happened?
3. What could we count, measure or weigh?
4. Can you observe it?
5. Does it tell you whether the outcome has been achieved?
The BIG question is: what evidence do we need to see to be convinced that things are changing or improving?

The “I’ll know it (outcome) when I see it (indicator)” rule in action – an example:

I’ll know that retention has increased among home health aides involved in a career ladder programme
when I see a reduction in the employee turnover rate among aides involved in the programme
and when I see survey results that indicate that aides are experiencing increased job satisfaction.
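Read as data, the example above boils down to two indicator checks. Here is a hedged Python sketch with invented numbers, purely to show the “I’ll know it when I see it” logic in code:

```python
# Hypothetical data for the home health aide retention example above.
baseline_turnover = 0.32      # annual turnover rate before the career ladder programme
current_turnover = 0.24       # turnover rate among aides in the programme
baseline_satisfaction = 3.1   # mean job-satisfaction score (1-5 survey) before
current_satisfaction = 3.7    # mean score among aides in the programme

turnover_reduced = current_turnover < baseline_turnover
satisfaction_up = current_satisfaction > baseline_satisfaction

# "I'll know it (outcome) when I see it (indicators)":
retention_increased = turnover_reduced and satisfaction_up
print(f"Retention outcome signalled: {retention_increased}")
```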
Mark Friedman (2005)
• Outcomes Based Accountability.
• Frustrated by social programmes being ‘all talk; no action’.
• Need for a ‘common language’.
• Need for accurate data.
• Need for baselines.
• Need to differentiate between inputs, outputs and outcomes.
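Friedman’s insistence on baselines is usually about comparing what actually happened with what the historical trend would have predicted (often described as ‘turning the curve’). The sketch below is a minimal illustration with invented figures, not Friedman’s own tooling or anything taken from the slide.

```python
# Hypothetical yearly counts of school exclusions, used as a baseline.
history = [120, 126, 131, 138, 144]   # five years before the programme
observed_after = 140                  # first year with the programme running

# Naive baseline forecast: continue the average year-on-year change.
avg_change = (history[-1] - history[0]) / (len(history) - 1)
forecast = history[-1] + avg_change

# "Turning the curve": doing better than the baseline trend predicted,
# even if the raw number has not yet fallen below earlier years.
print(f"forecast {forecast:.0f}, observed {observed_after}, "
      f"curve turned: {observed_after < forecast}")
```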
Summary
• Evaluation is a systematic process.
• Evaluation considers inputs, outputs, and outcomes.
• Evaluation involves making qualitative and quantitative judgements.
• Effective evaluation requires you to be clear about what you are measuring and judging.
