The Nature of Program
Evaluation
Carlo Magno, PhD
Counseling and Educational
Psychology Department
Answer the following questions:
• Why is program evaluation needed?
• What are the roles of a professional
program evaluator?
Program evaluation is needed
because…
• Policy makers need good information about the relative
effectiveness of the program.
– Which programs are working well?
– Which poorly?
– What are the program’s relative cost and benefits?
– Which parts of the program are working?
– What can be done with those parts that are not working well?
– Have all parts of the program been thought through carefully at
the planning stage?
– What is the theory or logic model for the program’s effectiveness?
– What adaptations would make the program more effective?
Program Evaluation
• Systematic investigation of the merit, worth or
significance of an object (Scriven, 1999), hence
assigning “value” to a program’s efforts means
addressing those three inter-related domains:
– Merit (or quality)
– Worth (or value, i.e., cost-effectiveness)
– Significance (or importance)
• The identification, clarification, and application of
defensible criteria to determine an object’s value
in relation to those criteria (Fitzpatrick,
Worthen, & Sanders, 2004).
Prerequisite to evaluation
• Need a program: an organized action
– Direct service interventions
– Community mobilization efforts
– Research initiatives
– Surveillance systems
– Policy development activities
– Outbreak investigations
– Laboratory diagnostics
– Communication campaigns
– Infrastructure building projects
– Training and education services
– Administrative systems
Inquiry and Judgment in Evaluation
• (1) Determining standards for judging
quality and deciding whether those
standards should be relative or absolute.
• (2) Collecting relevant information
• (3) Applying the standards to determine
value, quality, utility, effectiveness, or
significance.
Evidence of value and judgment:
• What will be evaluated? (i.e., what is "the program" and
in what context does it exist?)
• What aspects of the program will be considered when
judging program performance?
• What standards (i.e., type or level of performance) must
be reached for the program to be considered
successful?
• What evidence will be used to indicate how the program
has performed?
• What conclusions regarding program performance are
justified by comparing the available evidence to the
selected standards?
• How will the lessons learned from the inquiry be used to
improve public health effectiveness?
Difference between Research and
Evaluation
• Purpose
• Approaches
• Who sets the agenda?
• Generalizability of results
• Criteria and standards
• Preparation
Difference in Purpose
• Research
– Add knowledge in a field, contribute to theory
– Seeks conclusions
• Evaluation
– Help those who hold a stake in whatever is
being evaluated
– Leads to judgments
Difference in Approaches
• Research
– Quest for laws
– Explore and establish causal relationships
• Evaluation
– May use causal relationships when describing
a phenomenon
– Causal relationships will depend on the needs
of the stakeholders
Difference on who sets the agenda
• Research
– The hypothesis investigated is chosen by the
researcher and the appropriate steps in
developing the theory.
• Evaluation
– Questions to be answered come from many
sources (stakeholders).
– Consults with stakeholders to determine the
focus of the study.
Difference in generalizability of
results
• Research
– Methods are designed to maximize
generalizability to many different settings
• Evaluation
– Specific to the context in which the evaluation
object rests.
Difference in Criteria and standards
• Research
– Internal validity (causality),
– external validity (generalizability)
• Evaluation
– Accuracy (corresponding to reality)
– Utility (results serve practical information needs)
– Feasibility (realistic, prudent, diplomatic, frugal)
– Propriety (done legally and ethically)
Difference in Preparation
• Research
– In-depth training in a single discipline in their field of
inquiry.
• Evaluation
– Responds to the needs of clients and stakeholders
with many information needs and operating in many
different settings.
– Interdisciplinary: Sensitive to a wide range of
phenomena that they must attend to.
– Familiar with a wide variety of methods
– Establish personal working relationships with clients
(interpersonal and communication skills)
Competencies needed by professional
Evaluators (Sanders, 1999)
• Ability to describe the object and context of an evaluation
• Conceptualize appropriate purposes and framework for
evaluation
• Identify and select appropriate evaluation questions,
information needs, and sources of information
• Select means for collecting and analyzing information
• Determine the value of the object of an evaluation
• Communicate plans and results effectively to audiences
• Manage the evaluation
• Maintain ethical standards
• Adjust to external factors influencing the evaluation
• Evaluate the evaluation
Purposes of Evaluation
• Talmage (1982)
– Render judgment on the worth of the program
– Assist decision makers responsible for deciding policy
– Serve a political function
• Rallis and Rossman (2000)
– Learning, helping practitioners and others better
understand and interpret their observations
Purposes of Evaluation
• Weiss (1988) and Henry (2000)
– Bring about social betterment
• Mark, Henry, and Julnes (1999)
– Betterment – alleviation of social problems, meeting
of human needs
• Chelimsky (1997) – takes a global perspective:
new technologies, demographic imbalance,
environmental protection, sustainable
development, terrorism, human rights
Purposes of Evaluation
• House and Howe (1999)
– Foster deliberative democracy: work to help less
powerful stakeholders gain a voice and to stimulate
dialogue among stakeholders in a democratic fashion.
• Mark, Henry, and Julnes (1999)
– Assessment of merit and worth
– Oversight and compliance
– Program and organizational improvement
– Knowledge development
Roles of the Professional Evaluator
• Rallis and Rossman (2000)
– Critical friend: “someone the emperor knows
and can listen to. She is more friend than
judge, although she is not afraid to offer
judgment” (p. 83)
• Schwandt (2001)
– Helping practitioners develop critical judgment
Roles of the Professional Evaluator
• Patton (1996)
– Facilitator
– Collaborator
– Teacher
– Management consultant
– OD specialist
– Social-change agent
• Preskill and Torres (1999)
– Bringing about organizational learning and
instilling a learning environment
Roles of the Professional Evaluator
• Mertens (1999), Chelimsky (1998), and
Greene (1997)
– Including the stakeholders as part of the
evaluation process
• House and Howe (1999)
– Stimulating dialogue among various groups
Roles of the Professional Evaluator
• Bickman (2001) and Chen (1990)
– Take part in program planning
– Help articulate program theories or logic
models
• Wholey (1996)
– Help policy makers and managers select the
performance dimensions to be measured as
well as the tools to use in measuring those
dimensions
Roles of the Professional Evaluator
• Lipsey (2000)
– Provides expertise to track things down,
systematically observe and measure them,
and compare, analyze, and interpret with a
good faith attempt at objectivity.
Roles of the Professional Evaluator
• Fitzpatrick, Worthen, and Sanders (2004)
– Negotiating with stakeholder groups to define the purpose of
the evaluation
– Developing contracts
– Hiring and overseeing staff
– Managing budgets
– Identifying disenfranchised or underrepresented groups
– Working with advisory panels
– Collecting, analyzing, and interpreting qualitative and
quantitative information
– Communicating frequently with various stakeholders to seek
input into the evaluation and to report results
– Writing reports
– Considering effective ways to disseminate information
– Meeting with the press and other representatives to report on
progress and results
– Recruiting others to evaluate the evaluation
Examples of evaluation use in
Education
• To empower teachers to have more say about how
school budgets are allocated
• To judge the quality of the school curricula in specific
content areas
• To accredit schools that meet minimum accreditation
standards
• To determine the value of a middle school’s block
scheduling
• To satisfy an external funding agency’s demands for
reports on effectiveness of school programs it supports
• To assist parents and students in selecting schools in a
district with school choice
• To help teachers improve their reading program to
encourage more voluntary reading
Examples of evaluation use in other
public and Nonprofit sectors
• To decide whether to implement an urban development
program
• To establish the value of a job-training program
• To decide whether to modify a low-cost housing project’s
rental policies
• To improve a recruitment program for blood donors
• To determine the impact of a prison’s early release
program on recidivism
• To gauge community reaction to proposed fire-burning
restrictions to improve air quality
• To determine the cost-benefit contribution of a new
sports stadium for a metropolitan area
Examples of evaluation use in
Business and industry
• To improve a commercial product
• To judge the effectiveness of a corporate training
program on teamwork
• To determine the effect of a new flextime policy on
productivity, recruitment, and retention
• To identify the contributions of specific programs to
corporate profits
• To determine the public’s perception of a corporation’s
environmental image
• To recommend ways to improve retention among
younger employees
• To study the quality of performance-appraisal feedback
Formative and Summative
Evaluation
• Formative – provide information for
program improvement. Judgment of a part
of a program.
• Summative – concerned with providing
information to serve decisions or assist in
making judgments about program
adoption, continuation, or expansion.

Contenu connexe

Tendances

Implimenting the curriculum the roles of stakeholders ---hazel and jeric
Implimenting the curriculum   the roles of stakeholders ---hazel and jericImplimenting the curriculum   the roles of stakeholders ---hazel and jeric
Implimenting the curriculum the roles of stakeholders ---hazel and jeric
Phillip Murphy Bonaobra
 
Chapter 2 curriculum decision making (complete)
Chapter 2  curriculum decision making (complete)Chapter 2  curriculum decision making (complete)
Chapter 2 curriculum decision making (complete)
Rachel Blue Adkins Ü
 

Tendances (20)

Monitoring and evaluation presentatios
Monitoring and evaluation presentatios Monitoring and evaluation presentatios
Monitoring and evaluation presentatios
 
Action research
Action researchAction research
Action research
 
Lesson plan for Theory
Lesson plan for TheoryLesson plan for Theory
Lesson plan for Theory
 
Action Research for Teachers
Action Research for TeachersAction Research for Teachers
Action Research for Teachers
 
Monitoring and Evaluation
Monitoring and Evaluation Monitoring and Evaluation
Monitoring and Evaluation
 
Implimenting the curriculum the roles of stakeholders ---hazel and jeric
Implimenting the curriculum   the roles of stakeholders ---hazel and jericImplimenting the curriculum   the roles of stakeholders ---hazel and jeric
Implimenting the curriculum the roles of stakeholders ---hazel and jeric
 
Evalution of educational planning
Evalution of educational planningEvalution of educational planning
Evalution of educational planning
 
Results based planning and management
Results based planning and managementResults based planning and management
Results based planning and management
 
Participatory Monitoring and Evaluation
Participatory Monitoring and EvaluationParticipatory Monitoring and Evaluation
Participatory Monitoring and Evaluation
 
Education planning types
Education planning typesEducation planning types
Education planning types
 
Chapter 2 curriculum decision making (complete)
Chapter 2  curriculum decision making (complete)Chapter 2  curriculum decision making (complete)
Chapter 2 curriculum decision making (complete)
 
Capacity Development For Monitoring And Evaluation
Capacity Development For Monitoring And EvaluationCapacity Development For Monitoring And Evaluation
Capacity Development For Monitoring And Evaluation
 
Monotoring and evaluation principles and theories
Monotoring and evaluation  principles and theoriesMonotoring and evaluation  principles and theories
Monotoring and evaluation principles and theories
 
Conducting Programme Evaluation
Conducting Programme EvaluationConducting Programme Evaluation
Conducting Programme Evaluation
 
Importance of M&E
Importance of M&EImportance of M&E
Importance of M&E
 
Principles of assessment
Principles  of assessmentPrinciples  of assessment
Principles of assessment
 
Home - School Relations: Building Collaborative Relationships
Home - School Relations: Building Collaborative RelationshipsHome - School Relations: Building Collaborative Relationships
Home - School Relations: Building Collaborative Relationships
 
Monitoring and Evaluation: Lesson 2
Monitoring and Evaluation: Lesson 2Monitoring and Evaluation: Lesson 2
Monitoring and Evaluation: Lesson 2
 
The Theory of Change Approach
The Theory of Change ApproachThe Theory of Change Approach
The Theory of Change Approach
 
Plan Implementation
Plan ImplementationPlan Implementation
Plan Implementation
 

En vedette (10)

Evaluation – concepts and principles
Evaluation – concepts and principlesEvaluation – concepts and principles
Evaluation – concepts and principles
 
Business ethics & corporate governance
Business ethics & corporate governanceBusiness ethics & corporate governance
Business ethics & corporate governance
 
Examination and Evaluation
Examination and EvaluationExamination and Evaluation
Examination and Evaluation
 
DAO and Corporate Governance : conceptual approach (2)
DAO and Corporate Governance : conceptual approach (2)DAO and Corporate Governance : conceptual approach (2)
DAO and Corporate Governance : conceptual approach (2)
 
Training evaluation
Training evaluationTraining evaluation
Training evaluation
 
Business ethics and corporate governance (2)
Business ethics and corporate governance (2)Business ethics and corporate governance (2)
Business ethics and corporate governance (2)
 
Corporate governance ppt mba
Corporate governance ppt mbaCorporate governance ppt mba
Corporate governance ppt mba
 
Evaluation in Education
Evaluation in EducationEvaluation in Education
Evaluation in Education
 
Training evaluation ppt 6
Training evaluation   ppt 6Training evaluation   ppt 6
Training evaluation ppt 6
 
Business ethics and Corporate Governance
Business ethics and Corporate GovernanceBusiness ethics and Corporate Governance
Business ethics and Corporate Governance
 

Similaire à The nature of program evaluation

evaluation process
evaluation processevaluation process
evaluation process
Tracy Culkin
 
Training on Evaluation.pptx
Training on Evaluation.pptxTraining on Evaluation.pptx
Training on Evaluation.pptx
ssusere0ee1d
 
Community engagement - what constitutes success
Community engagement - what constitutes successCommunity engagement - what constitutes success
Community engagement - what constitutes success
contentli
 
2010 bush foundation inputs, process indicators, and intermediate outcomes
2010 bush foundation inputs, process indicators, and intermediate outcomes2010 bush foundation inputs, process indicators, and intermediate outcomes
2010 bush foundation inputs, process indicators, and intermediate outcomes
Christopher Thorn
 
June 20 2010 bsi christie
June 20 2010 bsi christieJune 20 2010 bsi christie
June 20 2010 bsi christie
harrindl
 

Similaire à The nature of program evaluation (20)

Organizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program EvaluationOrganizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program Evaluation
 
Evaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation MethodsEvaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation Methods
 
evaluation process
evaluation processevaluation process
evaluation process
 
Prt 595 week 2 lecture 2
Prt 595 week 2 lecture 2Prt 595 week 2 lecture 2
Prt 595 week 2 lecture 2
 
Training on Evaluation.pptx
Training on Evaluation.pptxTraining on Evaluation.pptx
Training on Evaluation.pptx
 
Evaluation in natural resources planning.pptx
Evaluation in natural resources planning.pptxEvaluation in natural resources planning.pptx
Evaluation in natural resources planning.pptx
 
Week3 rainey chapter_6
Week3 rainey chapter_6Week3 rainey chapter_6
Week3 rainey chapter_6
 
Street Jibe Evaluation Workshop 2
Street Jibe Evaluation Workshop 2Street Jibe Evaluation Workshop 2
Street Jibe Evaluation Workshop 2
 
Assessment 101 Part 3
Assessment 101 Part 3Assessment 101 Part 3
Assessment 101 Part 3
 
Evaluating Systems Change
Evaluating Systems ChangeEvaluating Systems Change
Evaluating Systems Change
 
Introduction to Policy Evaluation
Introduction to Policy EvaluationIntroduction to Policy Evaluation
Introduction to Policy Evaluation
 
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptxEDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
EDUCATIONAL PLANNING AND MANAGEMENT EDUC 712.pptx
 
Theory Driven Evaluation.pptx
Theory Driven Evaluation.pptxTheory Driven Evaluation.pptx
Theory Driven Evaluation.pptx
 
2014_10_17_HowtoWriteanEvaluationPlanSlides_ORE.ppt
2014_10_17_HowtoWriteanEvaluationPlanSlides_ORE.ppt2014_10_17_HowtoWriteanEvaluationPlanSlides_ORE.ppt
2014_10_17_HowtoWriteanEvaluationPlanSlides_ORE.ppt
 
Community engagement - what constitutes success
Community engagement - what constitutes successCommunity engagement - what constitutes success
Community engagement - what constitutes success
 
Evaluation Workshop
Evaluation WorkshopEvaluation Workshop
Evaluation Workshop
 
2010 bush foundation inputs, process indicators, and intermediate outcomes
2010 bush foundation inputs, process indicators, and intermediate outcomes2010 bush foundation inputs, process indicators, and intermediate outcomes
2010 bush foundation inputs, process indicators, and intermediate outcomes
 
June 20 2010 bsi christie
June 20 2010 bsi christieJune 20 2010 bsi christie
June 20 2010 bsi christie
 
Flexible Evaluation
Flexible EvaluationFlexible Evaluation
Flexible Evaluation
 
Evaluation research-resty-samosa
Evaluation research-resty-samosaEvaluation research-resty-samosa
Evaluation research-resty-samosa
 

Plus de Carlo Magno

Plus de Carlo Magno (20)

Assessment Using the SOLO Framework.pptx
Assessment Using the SOLO Framework.pptxAssessment Using the SOLO Framework.pptx
Assessment Using the SOLO Framework.pptx
 
Social and Emotional Learning
Social and Emotional LearningSocial and Emotional Learning
Social and Emotional Learning
 
Educational assessment in the 4 ir
Educational assessment in the 4 irEducational assessment in the 4 ir
Educational assessment in the 4 ir
 
The process of research mentoring
The process of research mentoringThe process of research mentoring
The process of research mentoring
 
Quality management services sustainability training
Quality management services sustainability trainingQuality management services sustainability training
Quality management services sustainability training
 
Managing technology integration in schools
Managing technology integration in schoolsManaging technology integration in schools
Managing technology integration in schools
 
Integrating technology in teaching
Integrating technology in teachingIntegrating technology in teaching
Integrating technology in teaching
 
Empowering educators on technology integration
Empowering educators on technology integrationEmpowering educators on technology integration
Empowering educators on technology integration
 
Designing an online lesson
Designing an online lessonDesigning an online lesson
Designing an online lesson
 
Curriculum integration
Curriculum integrationCurriculum integration
Curriculum integration
 
Accountability in Developing Student Learning
Accountability in Developing Student LearningAccountability in Developing Student Learning
Accountability in Developing Student Learning
 
The Instructional leader: TOwards School Improvement
The Instructional leader: TOwards School ImprovementThe Instructional leader: TOwards School Improvement
The Instructional leader: TOwards School Improvement
 
Guiding your child on their career decision making
Guiding your child on their career decision makingGuiding your child on their career decision making
Guiding your child on their career decision making
 
Assessing Science Inquiry Skills
Assessing Science Inquiry SkillsAssessing Science Inquiry Skills
Assessing Science Inquiry Skills
 
Assessment in the Social Studies Curriculum
Assessment in the Social Studies CurriculumAssessment in the Social Studies Curriculum
Assessment in the Social Studies Curriculum
 
Quantitative analysis in language research
Quantitative analysis in language researchQuantitative analysis in language research
Quantitative analysis in language research
 
Integrating technology in teaching
Integrating technology in teachingIntegrating technology in teaching
Integrating technology in teaching
 
Hallmarks of textbook
Hallmarks of textbookHallmarks of textbook
Hallmarks of textbook
 
managing the learner centered-classroom
managing the learner centered-classroommanaging the learner centered-classroom
managing the learner centered-classroom
 
Assessing learning objectives
Assessing learning objectivesAssessing learning objectives
Assessing learning objectives
 

Dernier

Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Safe Software
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
?#DUbAI#??##{{(☎️+971_581248768%)**%*]'#abortion pills for sale in dubai@
 

Dernier (20)

Manulife - Insurer Transformation Award 2024
Manulife - Insurer Transformation Award 2024Manulife - Insurer Transformation Award 2024
Manulife - Insurer Transformation Award 2024
 
Corporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptxCorporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptx
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
 
DBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor Presentation
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt Robison
 
MS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsMS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectors
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
 
Polkadot JAM Slides - Token2049 - By Dr. Gavin Wood
Polkadot JAM Slides - Token2049 - By Dr. Gavin WoodPolkadot JAM Slides - Token2049 - By Dr. Gavin Wood
Polkadot JAM Slides - Token2049 - By Dr. Gavin Wood
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?
 
GenAI Risks & Security Meetup 01052024.pdf
GenAI Risks & Security Meetup 01052024.pdfGenAI Risks & Security Meetup 01052024.pdf
GenAI Risks & Security Meetup 01052024.pdf
 
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
 
ICT role in 21st century education and its challenges
ICT role in 21st century education and its challengesICT role in 21st century education and its challenges
ICT role in 21st century education and its challenges
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of Terraform
 
Automating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps ScriptAutomating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps Script
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
 
AXA XL - Insurer Innovation Award Americas 2024
AXA XL - Insurer Innovation Award Americas 2024AXA XL - Insurer Innovation Award Americas 2024
AXA XL - Insurer Innovation Award Americas 2024
 
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWER
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWEREMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWER
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWER
 

The nature of program evaluation

  • 1. The Nature of Program Evaluation Carlo Magno, PhD Counseling and Educational Psychology Department
  • 2. Answer the following questions: • Why is program evaluation needed? • What are the roles of a professional program evaluator?
  • 3. Program evaluation is needed because… • Policy makers need good information about the relative effectiveness of the program. – Which programs are working well? – Which poorly? – What are the program’s relative cost and benefits? – Which parts of the program are working? – What can be done with those parts that are not working well? – Have al parts of the program been thought through carefully at the planning stage? – What is the theory or logic model for the program effectiveness? – What adaptations would make the program more effective?
  • 4. Program Evaluation • Systematic investigation of the merit, worth or significance of an object (Scriven, 1999), hence assigning “value” to a program’s efforts means addressing those three inter-related domains: – Merit (or quality) – Worth (or value, i.e., cost-effectiveness) – Significance (or importance) • The identification, clarification, and application of defensible criteria to determine an object’s value in relation to those criteria (Fitzpatrick, WEorthen, & Sanders, 2004).
  • 5. Prerequisite to evaluation • Need a program: - an organized action – Direct service interventions – Community mobilization efforts – Research initiatives – Surveillance systems – Policy development activities – Outbreak investigations – Laboratory diagnostics – Communication campaigns – Infrastructure building projects – Training and education services – Administrative systems
  • 6. Inquiry and Judgment in Evaluation • (1) Determining standards for judging quality and deciding whether those standards should be relative or absolute. • (2) Collecting relevant information • (3) Applying the standards to determine value, quality, utility, effectiveness, or significance.
  • 7. Evidence of value and judgement: • What will be evaluated? (i.e., what is "the program" and in what context does it exist?) • What aspects of the program will be considered when judging program performance? • What standards (i.e., type or level of performance) must be reached for the program to be considered successful? • What evidence will be used to indicate how the program has performed? • What conclusions regarding program performance are justified by comparing the available evidence to the selected standards? • How will the lessons learned from the inquiry be used to improve public health effectiveness?
  • 8. Difference between Research and Evaluation • Purpose • Approaches • Who sets the agenda? • Generalizability of results • Criteria and standards • Preparation
  • 9. Difference in Purpose • Research – Add knowledge in a field, contribute to theory – Seeks conclusion • Evaluation – Help those who hold a stake in whatever is being evaluated – Leads to judgments
  • 10. Difference in Approaches • Research – Quest for laws – Explore and establish causal relationships • Evaluation – Describing a phenomenon may use causal relationships – Causal relationships will depend on the needs of the stakeholders
  • 11. Difference on who sets the agenda • Research – The hypothesis investigated is chosen by the researcher and the appropriate steps in developing the theory. • Evaluation – Questions to be answered comes form many sources (stakeholders). – Consults with stakeholders to determine the focus of the study.
  • 12. Difference in generalizability of results • Research – Methods are designed to maximize generalizability to many different settings • Evaluation – Specific to the context which evaluation object rests.
  • 13. Difference in Criteria and standards • Research – Internal validity (causality), – external validity (generalizability) • Evaluation – Accuracy (corresponding to reality) – Utility (results serve practical information) – Feasibility (realistic, prudent, diplomatic, frugal) – Propriety (done legally and ethiocally)
  • 14. Difference in Preparation • Research – In depth training on a single discipline in their field of inquiry. • Evaluation – Responds to the needs of clients and stakeholders with many information needs and operating in many different settings. – Interdisciplinary: Sensitive to a wide range of phenomenon that they must attend to. – Familiar with a wide variety of methods – Establish personal working relationships with clients (interpersonal and communication skills)
  • 15. Competencies needed by professional Evaluators (Sanders, 1999) • Ability to describe the object and context of an evaluation • Conceptualize appropriate purposes and framework for evaluation • Identify and select appropriate evaluation questions, information needs, and sources of information • Select mans for collecting and analyzing information • Determine the value of the object of an evaluation • Communicate plans and results effectively to audiences • Manage the evaluation • Maintain ethical standards • Adjust to external factors influencing the evaluation • Evaluate the evaluation
  • 16. Purposes of Evaluation • Talmage (1982) – Render judgment on the worth of the program – Assist decision makers responsible for deciding policy – Serve a political function • Rallis and Rossman (2000) – Learning: helping practitioners and others better understand and interpret their observations
  • 17. Purposes of Evaluation • Weiss (1988) and Henry (2000) – Bring about social betterment • Mark, Henry, and Julnes (1999) – Betterment – alleviation of social problems, meeting of human needs • Chelimsky (1997) – takes a global perspective: new technologies, demographic imbalance, environmental protection, sustainable development, terrorism, human rights
  • 18. Purposes of Evaluation • House and Howe (1999) – Foster deliberative democracy: work to help less powerful stakeholders gain a voice and to stimulate dialogue among stakeholders in a democratic fashion. • Mark, Henry, and Julnes (1999) – Assessment of merit and worth – Oversight and compliance – Program and organizational improvement – Knowledge development
  • 19. Roles of the Professional Evaluator • Rallis and Rossman (2000) – Critical friend: “someone the emperor knows and can listen to. She is more friend than judge, although she is not afraid to offer judgment” (p. 83) • Schwandt (2001) – Helping practitioners develop critical judgment
  • 20. Roles of the Professional Evaluator • Patton (1996) – Facilitator – Collaborator – Teacher – Management consultant – OD specialist – Social-change agent • Preskill and Torres (1999) – Bringing about organizational learning and instilling a learning environment
  • 21. Roles of the Professional Evaluator • Mertens (1999), Chelimsky (1998), and Greene (1997) – Including the stakeholders as part of the evaluation process • House and Howe (1999) – Stimulating dialogue among various groups
  • 22. Roles of the Professional Evaluator • Bickman (2001) and Chen (1990) – Take part in program planning – Help articulate program theories or logic model • Wholey (1996) – Help policy makers and managers select the performance dimension to be measured as well as the tools to use in measuring those dimensions
  • 23. Roles of the Professional Evaluator • Lipsey (2000) – Provides expertise to track things down, systematically observe and measure them, and compare, analyze, and interpret with a good-faith attempt at objectivity.
  • 24. Roles of the Professional Evaluator • Fitzpatrick, Worthen, and Sanders (2004) – Negotiating with stakeholder groups to define the purpose of the evaluation – Developing contracts – Hiring and overseeing staff – Managing budgets – Identifying disenfranchised or underrepresented groups – Working with advisory panels – Collecting, analyzing, and interpreting qualitative and quantitative information – Communicating frequently with various stakeholders to seek input into the evaluation and to report results – Writing reports – Considering effective ways to disseminate information – Meeting with the press and other representatives to report on progress and results – Recruiting others to evaluate the evaluation
  • 25. Examples of evaluation use in Education • To empower teachers to have more say about how school budgets are allocated • To judge the quality of school curricula in specific content areas • To accredit schools that meet minimum accreditation standards • To determine the value of a middle school’s block scheduling • To satisfy an external funding agency’s demands for reports on the effectiveness of school programs it supports • To assist parents and students in selecting schools in a district with school choice • To help teachers improve their reading program to encourage more voluntary reading
  • 26. Examples of evaluation use in other public and Nonprofit sectors • To decide whether to implement an urban development program • To establish the value of a job-training program • To decide whether to modify a low-cost housing project’s rental policies • To improve a recruitment program for blood donors • To determine the impact of a prison’s early release program on recidivism • To gauge community reaction to proposed fire-burning restrictions to improve air quality • To determine the cost-benefit contribution of a new sports stadium for a metropolitan area
  • 27. Examples of evaluation use in Business and industry • To improve a commercial product • To judge the effectiveness of a corporate training program on teamwork • To determine the effect of a new flextime policy on productivity, recruitment, and retention • To identify the contributions of specific programs to corporate profits • To determine the public’s perception of a corporation’s environmental image • To recommend ways to improve retention among younger employees • To study the quality of performance-appraisal feedback
  • 28. Formative and Summative Evaluation • Formative – provides information for program improvement; judgments about parts of a program. • Summative – provides information to serve decisions or assist in making judgments about program adoption, continuation, or expansion.