Program
Evaluation:
Methods and
Case Studies
Emil J. Posavac and
Raymond G. Carey
7th Edition. 2007. New
Jersey: Pearson,
Prentice Hall.
Aung Thu Nyein
DA- 8020 Policy Studies
Content
 About the authors
 Chapter 1: Program Evaluation: An
  Overview
 Chapter 3: Selecting criteria and setting
  standards
About the authors
 Emil J. Posavac
Ph.D., University of Illinois; Professor Emeritus of Psychology at
Loyola University of Chicago;
Director of the applied social psychology graduate program.
Awarded the Myrdal Award by the American Evaluation Association.

 Raymond G. Carey
Ph. D., Loyola University of Chicago, principal of R. G. Carey
Associates.
Widely published in the field of health services and quality
assurance.
An Overview
   Evaluation is a natural, routine activity.

   “Program evaluation is a collection of methods,
    skills, and sensitivities necessary to determine
    whether a human service is needed and likely to
    be used, whether the service is sufficiently
    intensive to meet the unmet needs identified,
    whether the service is offered as planned, and
    whether the service actually does help people in
    need at a reasonable cost without unacceptable
    side effects.”
An Overview… Contd.
But program evaluation differs from natural, automatic
evaluation.

   First, organized efforts are carried out by teams. This
    specialization means that responsibility for program
    evaluation is diffused among many people.
   Second, most programs attempt to achieve objectives
    that can be observed only at some time in the future rather
    than in a matter of minutes. This raises the question of which
    criteria to choose.
   Third, when evaluating our own ongoing work, a single
    individual fills many roles: worker, evaluator, beneficiary,
    recipient of the feedback, etc.
   Last, programs are usually paid for by parties other than
    clients of the program.
Evaluation tasks that need to be done

Program evaluation is designed to assist some audience in
assessing a program’s merit or worth.

   Verify that resources would be devoted to meeting unmet
    needs
   Verify that implemented programs do provide services
   Examine the outcomes
   Determine which programs produce the most favorable
    outcomes
   Select the programs that offer the most needed types of
    services
   Provide information to maintain and improve quality
   Watch for unplanned side effects.
Common Types of Program
Evaluation
   Assess needs of the program participants
     Identify and measure the level of unmet needs,
     Some alternatives
   Examine the process of meeting the needs
     Extent of the implementation,
     the nature of people being served
     The degree to which the program operates as
       planned
   Measure the outcomes of the program
      Who received what?
      Did the program’s services change things for the better?
      Do people differ in their opinions of the outcomes?
   Integrate the needs, costs, and outcomes
     Cost-effectiveness
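The integration of needs, costs, and outcomes above can be sketched numerically. This is a hypothetical illustration, not an example from the book: the programs, costs, and outcome counts are invented.

```python
# Hypothetical cost-effectiveness comparison of two invented programs.

def cost_effectiveness(total_cost: float, outcomes_achieved: int) -> float:
    """Cost per unit of outcome, e.g. dollars spent per client helped."""
    return total_cost / outcomes_achieved

# Two hypothetical programs addressing the same need:
program_a = cost_effectiveness(total_cost=50_000, outcomes_achieved=200)  # 250.0
program_b = cost_effectiveness(total_cost=80_000, outcomes_achieved=400)  # 200.0

# The lower cost per outcome is the more favorable -- here, program B.
best = min([("A", program_a), ("B", program_b)], key=lambda pair: pair[1])
```

A single ratio like this only integrates costs with one outcome measure; it says nothing about unmet needs, implementation fidelity, or side effects, which the other evaluation types on this slide address.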
Activities often confused with
program evaluation
 Basic research
 Individual assessment
 Program audit


 Although these activities are valuable,
 program evaluation is different and more
 difficult to carry out.
Different Types of Evaluations
for Different Kinds of Programs
   No “one size fits all” approach.
   Organizations needing program evaluations
       Health care
       Criminal justice
       Business and Industry
       Government
   Time Frame of needs
       Short-term needs
       Long-term needs
       Potential needs
Extensiveness of the programs
 Some programs are offered to a small
  group of people with similar needs, but
  others are developed for use at many sites
  throughout the country.
 Complexities involved.
Purpose of program evaluation
   The overall purpose of program evaluation is to
    contribute to the provision of quality services to
    people in need.
   Feedback mechanism: formative evaluations or
    summative evaluations or evaluation for knowledge.
   A Feedback Loop
The roles of evaluators
   A variety of work settings
     Internal evaluators
     External: of governmental or regulatory
      agencies
     Private research firms
Comparison of internal and external
evaluators
  Factors related to competence
   Access and advantages
   Technical expertise

 Personal    qualities
      Evaluator’s personal qualities: objective, fair, and
       trustworthy.
  Factors related to the purpose of an
   evaluation
     Formative, summative or quality assurance
      evaluation?
Evaluation and service
 The role of the social scientist, concerned
  with theory, the design of research, and
  the analysis of data.
 The role of the practitioner, dealing with
  people in need.
Evaluation and related activities of
organizations
 Research
 Education   and staff development
 Auditing
 Planning
 Human   resources
Chapter 3:
Selecting Criteria and
Setting Standards
Useful criteria and standards
   Research design is important, but so are criteria and standards.

   Criteria that reflect a program’s purposes
     Programs may have immediate short-term effects but only marginal long-term ones.
   Criteria that the staff can influence
        An evaluation may meet resistance if the program staff feel
         that their program will be judged on criteria they cannot affect.
   Criteria that can be measured reliably and validly.
        Repeated observations should yield the same values.
   Criteria that stakeholders participate in selecting
        Chosen in consultation between the evaluator and stakeholders
Developing Goals and Objectives
   How much agreement on goals is needed?
       A number of issues to be addressed.
   Different types of goals
       Implementation goals
       Intermediate goals
       Outcome goals
   Goals that apply to all programs
       Treating the subjects with respect
       Personal exposure to the program
       Depending on surveys and records to provide
        evaluations, etc.
Evaluation criteria and
evaluation questions
 Does  the program or plan match the
  values of the stakeholders?
 Does the program or plan match the
  needs of the people to be served?
 Does the program as implemented fulfill
  the plans?
 Do the outcomes achieved match the
  goals?
Using Program Theory
 Why is a program theory helpful?
 How is a program theory developed?
 Implausible program theories


   Every program embodies a conception of the
    structure, functions, and procedures appropriate
    to attain its goals.
   The conception constitutes the “logic” or plan of
    the program, which is called “Program Theory”.
Source: Peter H. Rossi, Howard E. Freeman & Mark W. Lipsey. 1998. Evaluation: A Systematic
Approach, 6th Ed., SAGE Publications, Inc., London.
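The idea that a program's "logic" links resources and activities to intended outcomes can be made concrete as plain data. A minimal hypothetical sketch follows; the program and all of its elements are invented for illustration, and real logic models are developed with stakeholders rather than coded:

```python
# Hypothetical logic model for an invented counseling program,
# expressed as a mapping from stage to its defined elements.
logic_model = {
    "inputs":       ["funding", "trained counselors", "meeting space"],
    "activities":   ["weekly counseling sessions", "referrals to job training"],
    "outputs":      ["sessions delivered", "clients referred"],
    "intermediate": ["improved coping skills"],
    "outcomes":     ["stable employment", "reduced need for services"],
}

def undefined_stages(model):
    """Minimal check echoing Rossi et al.'s plausibility questions:
    which stages of the presumed change process are left undefined?"""
    stages = ["inputs", "activities", "outputs", "intermediate", "outcomes"]
    return [s for s in stages if not model.get(s)]

missing = undefined_stages(logic_model)  # [] -- every stage is filled in here
```

A model with gaps (say, outcomes asserted with no activities linking inputs to them) would surface those stage names, which is the flavor of question the assessment framework below asks in prose.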
Assessing program theory
Framework for assessing program theory
 In relation to social needs
 Assessment of logic and plausibility
   Are the program goals and objectives well defined?
   Are the program goals and objectives feasible?
   Is the change process presumed in the program theory plausible?
   Are the program procedures for identifying members of the target
    population, delivering service to them, and sustaining that service through
    completion well defined and sufficient?
   Are the constituent components, activities, and functions of the program
    well defined and sufficient?
   Are the resources allocated to the program and its various components
    and activities adequate?
   Assessment through comparison with research and practice
   Assessment via preliminary observation
Assessing program theory-2
   Program theory can be assessed in relation to the support for
    critical assumptions found in research or documented
    program practice elsewhere. Sometimes findings are
    available for similar programs.
   Assessment of program theory yields findings that can help
    improve the conceptualization of a program or affirm its basic
    design.




Source: Peter H. Rossi, Howard E. Freeman & Mark W. Lipsey. 1998. Evaluation: A Systematic
Approach, 6th Ed., SAGE Publications, Inc., London.
More questions…
   Is the program accepted?
   Are the resources devoted to the program
    being expended appropriately?
       Using program costs in the planning phase
       Is offering the program fair to all stakeholders?
       Is this the way the funds are supposed to be
        spent?
       Do the outcomes justify the resources spent?
       Has the evaluation plan allowed for the
        development of criteria that are sensitive to
        undesirable side effects?
Example: Program Theory
Example: Program Theory and theory failure
(Diagram slides; figures not reproduced here.)
Some practical limitations in
selecting evaluation criteria
 Evaluation budget: evaluation is not free.
 Time available for the project
 Criteria that are credible to the
  stakeholders.
Overlap in terminology in program evaluation, by
 Jane T. Bertrand

(Diagram slide; figure not reproduced here.)

Bertrand, Jane T. 2005. “Understanding the Overlap in Programme Evaluation Terminology.”
The Communication Initiative Network, May 2005.
Thanks for your attention.
