10. Types of Evaluation Questions
- Needs Assessment
- Assessment of Program Process
- Assessment of Program Theory
- Impact Assessment
- Efficiency Assessment
Editor's notes
Focus on the questions to be answered – an evaluation can't answer EVERY question, so its purpose and the set of guiding questions that follow from it must be SPECIFIC. Questions should drive the methods: we use good social science – quantitative, qualitative, mixed methods, etc. – to answer our evaluation questions. Ultimately, though, the practicality of those methods matters – the evaluation needs to be DOABLE and its results USABLE by stakeholders! Finally, you must identify the relationship between the evaluator and the stakeholders – how will you be interacting with them?
Any of these are legitimate responses, but all have potential downsides: 1) it is difficult, 2) it can alienate key stakeholders or destroy credibility, and 3) it takes skill, time, and resources to accomplish. Case study: an evaluation of a life skills program designed to address sexual risk-taking – a hot-button issue!
In all of these cases, it is the evaluator's job to clarify ambiguous, vague, or untenable goals, objectives, and questions!
Formative evaluations – focused on program improvement; results are used immediately, addressing program need, design, implementation, impact, or efficiency; can be informal.
Summative evaluations – focused on accountability and on what the program does; usually conducted at the end of a program, for decisionmakers; usually formal.
Knowledge generation – asks whether a particular program or intervention works; usually conducted by academic researchers, government agencies, or foundations (e.g., the Robert Wood Johnson Foundation's Active Living Program).
It could fit into any of the three categories – formal purpose statements often tell us little about the actual purpose of the evaluation, who the primary audience for the results will be, or how the results will be used. It is important that we understand this at the beginning of the evaluation process!
The key here is that evaluation questions must be reasonable and appropriate – tied to the goals of the program. DMS example: how did afterschool enrichment programs impact grades in core subjects? It is not reasonable to expect that students who take karate or hip hop dance after school will automatically do better on tests or in core subjects. What are the goals of the program? Evaluation questions should stem directly from them.
Questions must also be answerable within the scope of the evaluation and its resources. Did housing redevelopment create economic impact? Too soon to answer that question!
Criteria for performance are determined by a number of things: the needs and wants of the target population, goals and objectives, professional standards, legal requirements, past performance, baseline levels, and so on. This is critical because, unlike academic research, we need to be able to judge the PRACTICAL VALUE of a program – to understand whether it met standards or not.
- 75% – based on program goals
- Regular attendees – those attending 80% or more of the program time
- 60 minutes of moderate or vigorous physical activity – CDC guidelines for childhood physical activity
- Needs Assessment – questions about the social conditions a program is intended to ameliorate and the need for the program
- Assessment of Program Theory (Evaluability) – questions about the program's conceptualization and design
- Assessment of Program Process – questions about program process, operations, implementation, and service delivery
- Impact Assessment – questions about program outcomes and impact
- Efficiency Assessment – questions about program cost and cost-effectiveness
We’ll discuss social science methods throughout the course – but it is helpful to understand the broad range of techniques that can be used to collect data to help answer evaluation questions!