WHAT IS PROGRAM EVALUATION?

 JENNIFER ANN MORROW, Ph.D.
  UNIVERSITY OF TENNESSEE
Summary of Current Projects

• Program Evaluation Projects
  – Project Weed & Seed
  – Project Fairchild Tropical Gardens
• Student Development Projects
  – Project Social Networking
  – Project Writing
• Teaching Research Methods/Statistics
  Projects
  – Project RLE
Program Evaluation Philosophy

• Utilization-focused evaluation (Patton, 1996)
  – Evaluations are situation specific
• Comprehensive evaluation designs
  – Formative and summative evaluation
  – Qualitative and quantitative data
• Useful and meaningful data
  – Simple versus sophisticated analyses
• Faculty-student evaluation teams
  – Students as evaluation apprentices
What is Program Evaluation?

• “Program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs for use by specific people to reduce uncertainties, improve effectiveness, and make decisions with regard to what those programs are doing and affecting” (Patton, 1986).
• “Evaluation research is the systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of social intervention programs” (Rossi & Freeman, 1993).
Types of Evaluation
• Formative Evaluation: focuses on identifying the
  strengths and weaknesses of a program or
  intervention.
  – Consists of implementation (process) and progress evaluation
  – Occurs during the entire life of the program/intervention
  – Is performed to monitor and improve the
    program/intervention
• Summative Evaluation: focuses on determining
  the overall effectiveness or impact of a program or
  intervention.
  – Also called impact evaluation
  – Assesses if the project/intervention met its stated goals
Preparing to Conduct an Evaluation
• Identify the program and its stakeholders
  – Get a description of the program (e.g., its curriculum)
  – Meet with all stakeholders (survey or interview
    them)
• Become familiar with information needs
  – Who wants the evaluation?
  – What is the focus of the evaluation? What
    resources do I have available to me?
  – Why is an evaluation wanted?
  – When is the evaluation wanted?
Program Provider/Staff Issues
• Expectation of a “slam-bang effect”.
• Fear that evaluation will inhibit creativity/innovation with regard to the program.
• Fear that the program will be terminated.
• Fear that information will be misused.
• Fear that evaluation will drain resources.
• Fear of losing control of the program.
• Fear among program staff that they are being monitored.
What is Program Theory?

• Program theory identifies key program
  elements and how they relate to each other.

• Program theory helps us decide what data
  we should collect and how we should
  analyze it.

• It is important to develop an evaluation plan
  that measures the extent and nature of each
  individual element.
Inadequate Program Evaluation Models
• Social Science Research Model: form two random groups, provide one with the service, and use the other as a control group.

• Black-Box Evaluation: an evaluation that
  only looks at the outputs and not the internal
  operations of the program.

• Naturalistic Model: utilizing only qualitative
  methods to gather lots of data.
Theory-Driven Model

• Theory-driven evaluations are more likely than methods-driven evaluations to discover program effects because they identify and examine a larger set of potential program outcomes (Chen & Rossi, 1980).
• Theory-driven evaluations are not limited to one
  method (i.e., quantitative or qualitative), one data
  source (i.e., program participants, artifacts,
  community indexes, program staff), or one type of
  analysis (i.e., descriptive statistics, correlational
  analyses, group difference statistics).
• Theory-driven evaluations utilize mixed-methods
  and derive their data from multiple sources.
Improvement-Focused Model

• Program improvement is the focus.
• Utilizing this type of model, evaluators can
  help program staff to discover discrepancies
  between program objectives and the needs
  of the target population, between program
  implementation and program plans, between
  expectations of the target population and the
  services actually delivered, or between
  outcomes achieved and outcomes projected
  (Posavac & Carey, 1997).
Goals of an Evaluation

• Implementation Goals
  – Equipment needs, staff hiring and training


• Intermediate Goals
  – Program is delivered as planned


• Outcome Goals
  – Is the program effective?
Questions to Ask in an Evaluation
• (1) Does the program match the values of the
  stakeholders/needs of the people being served?
• (2) Does the program as implemented fulfill the
  plans?
• (3) Do the outcomes achieved match the goals?
• (4) Is there support for program theory?
• (5) Is the program accepted?
• (6) Are the resources devoted to the program being
  expended appropriately?
Creating an Evaluation Plan

• Step 1: Creating a Logic Model

• Step 2: Reviewing the Literature

• Step 3: Determining the Methodology

• Step 4: Present a Written Proposal
Step 1: Creating a Logic Model
• Review program descriptions
  – Is there a program theory?
  – Who do they serve?
  – What do they do?
• Meet with stakeholders
  – Program personnel
  – Program sponsors
  – Clients of program
  – Other individuals/organizations impacted by the
    program
Step 1: Creating a Logic Model
• Logic models depict assumptions about the
  resources needed to support program
  activities and produce outputs, and the
  activities and outputs needed to realize the
  intended outcomes of a program (United
  Way of America, 1996; Wholey, 1994).

• The assumptions depicted in the model are
  called program theory.
Sample Logic Model

• INPUTS: Resources dedicated to or consumed by the program
• ACTIVITIES: What the program does with the inputs to fulfill its mission
• OUTPUTS: The direct products of program activities
• OUTCOMES: Benefits for participants during and after program activities
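
The same information can be held in a simple data structure so that every assumed element of the logic model has a matching entry in the data collection plan. The sketch below is a minimal illustration in Python; the program elements listed are hypothetical, not taken from an actual program.

```python
# A minimal sketch of a logic model as a plain data structure.
# All entries are hypothetical examples, not from a real program.
logic_model = {
    "inputs":     ["program staff", "curriculum materials", "grant funding"],
    "activities": ["weekly workshops", "one-on-one mentoring"],
    "outputs":    ["number of workshops delivered", "number of participants served"],
    "outcomes":   ["improved study skills", "higher retention after one year"],
}

# Review each component; every element should map to something we plan to measure.
for component, elements in logic_model.items():
    print(component.upper())
    for element in elements:
        print(f"  - {element}")
```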
Step 2: Reviewing the Literature
• Important things to consider:
  – In what ways is your program similar to other
    programs?
  – What research designs were utilized?
  – How were participants sampled?
  – Can previous measures be adopted?
  – What statistical analyses were performed?
  – What were their conclusions/interpretations?
• Creating Hypotheses/Research Questions
Step 3: Determining the Methodology
• Sampling Method
  – Probability vs. Non-probability
• Research Design
  – Experimental, Quasi-experimental, Non-experimental
• Data Collection
  – Ethics, Implementation, Observations, Surveys,
    Existing Records, Interviews/Focus Groups
• Statistical Analysis
  – Descriptive, Correlational, Group Differences
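
To make the sampling choice above concrete, the sketch below contrasts a probability sample (simple random sampling) with a non-probability (convenience) sample. It is illustrative only; the client list and sample size are invented.

```python
# A minimal sketch contrasting probability and non-probability sampling.
# The sampling frame and sample size are hypothetical.
import random

clients = [f"client_{i}" for i in range(1, 201)]  # hypothetical sampling frame

random.seed(42)  # reproducible example
probability_sample = random.sample(clients, k=30)  # every client has an equal chance
convenience_sample = clients[:30]                  # e.g., the first 30 who enrolled

print(probability_sample[:5])
print(convenience_sample[:5])
```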
Step 4: Present a Written Proposal
• Describe the specific purpose of the
  evaluation
  – Specific goals, objectives and/or aims of the
    evaluation
• Describe the evaluation design
  – Include theories/support for design
  – Methodology – participants, measures,
    procedure
• Describe the evaluation questions
  – Hypotheses and proposed analyses
• Present a detailed work plan and budget
Ethics in Program Evaluation

• Sometimes evaluators will have to deal with
  ethical dilemmas during the evaluation
  process
• Some potential dilemmas
  – Programs that can’t be done well
  – Presenting all findings (negative and positive)
  – Proper ethical considerations (e.g., informed consent)
  – Maintaining confidentiality of clients
  – Competent data collectors
• AEA ethical principles
Data Collection: Implementation Checklists
• Implementation checklists are used to
  ascertain if the program is being delivered as
  planned.
• Include questions after each program chapter/section for the program deliverer to fill out.
• This can then be used to create a new
  variable: Level of implementation (none, low,
  high).
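
One way to turn completed checklists into that level-of-implementation variable is sketched below. The sites, counts, and cut-offs are hypothetical; in practice they would be set with the program staff.

```python
# A minimal sketch: derive a level-of-implementation variable from checklist counts.
def implementation_level(sections_delivered, sections_planned):
    """Classify implementation as none, low, or high (hypothetical cut-offs)."""
    if sections_planned == 0 or sections_delivered == 0:
        return "none"
    proportion = sections_delivered / sections_planned
    return "low" if proportion < 0.5 else "high"

# One checklist per site: (sections delivered as planned, sections planned)
checklists = {"site_A": (9, 10), "site_B": (3, 10), "site_C": (0, 10)}
for site, (delivered, planned) in checklists.items():
    print(site, implementation_level(delivered, planned))
```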
Data Collection: Observations

• What should be observed?
  – Participants, Program staff


• Utilize trained observers.
  – Your observers should be trained on how to
    observe staff and/or clients.


• Use standardized behavioral checklists.
  – You should have a standard checklist that
    observers can use while observing.
Data Collection: Surveys

• To create or not to create?
  – Finding a valid/reliable instrument
• What can surveys measure?
  – Facts and past behavioral experiences
  – Attitudes and preferences
  – Beliefs and predictions
  – Current/future behaviors
• Types of questions
  – Closed versus Open-ended questions
• How to administer?
Data Collection: Existing Records
• School records
  – GPA, absences, disciplinary problems


• Health records
  – Relevant health information


• National surveys
  – National and state indices (e.g., census data)
Data Collection: Focus Groups/Interviews
• Focus groups or individual interviews can be
  conducted with program staff and/or clients.

• Can be used to obtain information on
  program effectiveness and satisfaction.

• Can also show if client needs are not being
  met.
Statistical Analysis: Quantitative
• Descriptive Statistics
  – What are the characteristics of our clients?
  – What % attended our program?
• Correlational Statistics
  – What variables are related to our outcomes?
  – How is implementation related to our outcomes?
• Group Difference Statistics
  – Is our program group different from our
    comparison group?
  – Are there group differences on outcomes?
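
To make the three families of analyses concrete, here is a minimal sketch using pandas and SciPy. The variables, scores, and group sizes are invented for illustration; a real evaluation would use the program's own data and choose analyses to match its evaluation questions.

```python
# A minimal sketch of descriptive, correlational, and group-difference statistics.
# The data below are hypothetical.
import pandas as pd
from scipy import stats

data = pd.DataFrame({
    "group":         ["program"] * 5 + ["comparison"] * 5,
    "sessions":      [8, 10, 9, 7, 10, 0, 0, 0, 0, 0],
    "outcome_score": [78, 85, 82, 74, 88, 70, 65, 72, 68, 71],
})

# Descriptive: who are our clients and what did they receive?
print(data.describe())

# Correlational: is implementation (sessions attended) related to the outcome?
print(data["sessions"].corr(data["outcome_score"]))

# Group differences: does the program group differ from the comparison group?
program = data.loc[data["group"] == "program", "outcome_score"]
comparison = data.loc[data["group"] == "comparison", "outcome_score"]
print(stats.ttest_ind(program, comparison))
```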
Statistical Analysis: Qualitative
• Transcribe interviews/focus groups/observations
   – Should be done verbatim in an organized
     fashion
• Summarizing all open-ended questions
   – Summarize and keep a tally of how many
     participants give each response
• Coding and analyzing all qualitative data
   – Utilize a theoretical framework for coding (e.g.,
     Grounded Theory)
   – Use a qualitative software package to organize data (e.g., NUD*IST, NVivo)
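
For the summarizing-and-tallying step above, a very small sketch is shown below. The responses and keyword categories are hypothetical, and this kind of first-pass count supplements, rather than replaces, systematic coding in a package such as NVivo.

```python
# A minimal sketch: tally open-ended responses by hypothetical keyword categories.
from collections import Counter

responses = [
    "The mentoring sessions were the most helpful part.",
    "I wanted more sessions in the evening.",
    "Mentoring helped, but the workbook was confusing.",
    "Evening sessions would fit my schedule better.",
]

# Hypothetical first-pass categories keyed by keywords found in responses.
categories = {
    "mentoring": "mentoring seen as helpful",
    "evening":   "wants evening sessions",
    "workbook":  "materials unclear",
}

tally = Counter()
for response in responses:
    for keyword, label in categories.items():
        if keyword in response.lower():
            tally[label] += 1

for label, count in tally.most_common():
    print(f"{label}: {count}")
```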
Writing the Evaluation Report

• This should be as detailed as possible. It
  should include both the formative and
  summative evaluation findings as well as an
  action plan for improvement to the
  design/program.
• Should be written in an easy-to-understand format (don’t be too technical). More technical information can go in an appendix.
• Include a lot of graphical displays of the
  data.
Presenting the Results

• You should present the results of the
  evaluation to all key stakeholders.
• Professional presentation (PowerPoint,
  handouts).

• Don’t just present findings (both positive and negative); explain them.
• Present an action plan for possible changes.
Working as a Program Evaluator
• Get more experience
  – Take classes
    • EP 533 (basic intro), EP 651/652 (Seminar), EP 670 (internship, can take up to 9 hours), EP 693 (Independent Study), as well as many others
    • Evaluation Certificate (12 hours)
  – Workshops
• Get Involved
• Working as a Program Evaluator
References

•   Chen, H.T., & Rossi, P.H. (1980). The multi-goal, theory-driven
    approach to evaluation: A model linking basic and applied social
    science. Social Forces, 59, 106-122.
•   Julian, D.A. (1997). The utilization of the logic model as a system level
    planning and evaluation device. Evaluation and Program Planning, 20,
    251-257.
•   Patton, M.Q. (1986). Utilization-focused evaluation (2nd ed.). Newbury
    Park, CA: Sage Publications.
•   Posavac, E.J., & Carey, R.G. (2003). Program evaluation methods and
    case studies (6th ed.). New Jersey: Prentice Hall.
•   Rossi, P.H., & Freeman, H.E. (1993). Evaluation: A systematic
    approach (5th ed.). Newbury Park, CA: Sage Publications.
•   United Way of America. (1996). Measuring program outcomes: A
    practical approach (Item No. 0989). Author.
Websites

• Evaluators’ Institute
   – http://www.evaluatorsinstitute.com/
• Guide to program evaluation
   – http://www.mapnp.org/library/evaluatn/fnl_eval.htm
• Evaluating community programs
   – http://ctb.lsi.ukans.edu/tools/EN/part_1010.htm
• Evaluation bibliography
   – http://www.ed.gov/about/offices/list/ope/fipse/biblio.html
• Higher Ed center evaluation resources
   – http://www.edc.org/hec/eval/links.html
• The Evaluation Center
   – http://www.wmich.edu/evalctr/
Websites

• American Evaluation Association (AEA)
   – http://www.eval.org/
• Southeast Evaluation Association (SEA)
   – http://www.bitbrothers.com/sea/
• Using logic models
   – http://edis.ifas.ufl.edu/WC041
• Resources for evaluators
   – http://www.luc.edu/faculty/eposava/resource.htm
• Various program evaluation publications (all pdf)
   – http://www.uwex.edu/ces/pdande/evaluation/evaldocs.html
• Evaluation toolkit – Kellogg Foundation
   – http://www.wkkf.org/Programming/Overview.aspx?CID=281
My Contact Information

          Jennifer Ann Morrow, Ph.D.
Assistant Professor of Evaluation, Statistics, and
                  Measurement

  Department of Educational Psychology and
                  Counseling
        The University of Tennessee
            Knoxville, TN 37996
         Email: jamorrow@utk.edu
        Office Phone: 865-974-6117
