Chapter 13
An evaluation framework
1. INTRODUCTION
2. DECIDE: A FRAMEWORK TO GUIDE EVALUATION
The main aims are:
Discuss the conceptual, practical, and ethical issues involved in evaluation.
Introduce and explain the DECIDE framework.
DECIDE: A framework to guide evaluation
1. Determine the goals.
2. Explore the questions.
3. Choose the evaluation approach and methods.
4. Identify the practical issues.
5. Decide how to deal with the ethical issues.
6. Evaluate, analyze, interpret, and present the data.
1. Determine the goals.
What are the high-level goals of the evaluation?
Who wants it and why?
The goals influence the approach
Examples:
1. Identify the best metaphor on which to base the design.
2. Check to ensure that the final interface is consistent.
3. Investigate how technology affects working practices.
4. Improve the usability of an existing product.
2. Explore the questions.
All evaluations need goals and questions to guide them.
What questions might you ask about the design of a cell phone?
3. Choose the evaluation approach and methods.
1. What do we mean by a design for the evaluation?
The design of the evaluation is the arrangement that will make sure it reliably tells you what you need to know
An appropriate design will show you whether you actually got results, and whether those results were likely due to your actions or the circumstances you created, or to other factors
2. Why should you choose a design for your evaluation?
So your evaluation will be reliable
So you can pinpoint areas you need to work on, as well as those that are successful
So your results are credible
So you can identify unintended consequences (both positive and negative) and correct for them
When should you choose a design for your evaluation?
Ideally, when you are planning the program and the evaluation, before you start any implementation
In reality, for many organizations, at the beginning of a program cycle, or as a new group of participants enters
Who should be involved in choosing a design for your evaluation?
It's extremely helpful to include someone with research experience to help you decide on an appropriate design, and to help you understand what each possible design entails
How do you select an appropriate design for your evaluation?
Take the necessary research considerations into account
Understand threats to internal validity (whether the intervention produced the change):
History
Maturation
The effects of testing or observation on participants
Changes in measurement
Regression toward the mean (see the sketch after this list)
The selection of participants
The loss of data or participants
The nature of change
A combination of the effects of two or more of these
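To make one of these threats concrete, here is a minimal Python sketch of regression toward the mean; the scenario and every number in it are illustrative assumptions, not from the chapter. Participants selected for extreme pre-test scores tend to score closer to average on the post-test even when no intervention occurred.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical scenario: each person has a stable "true" skill level,
# but any single test score adds random measurement noise.
n = 10_000
true_skill = rng.normal(loc=50, scale=10, size=n)
pre_test = true_skill + rng.normal(scale=10, size=n)   # noisy pre-test
post_test = true_skill + rng.normal(scale=10, size=n)  # noisy post-test, NO intervention

# Select the "worst" performers on the pre-test, as an evaluation
# might when targeting a usability fix at struggling users.
worst = pre_test < np.percentile(pre_test, 20)

print(f"Selected group, pre-test mean:  {pre_test[worst].mean():.1f}")
print(f"Selected group, post-test mean: {post_test[worst].mean():.1f}")
# The post-test mean is noticeably higher even though nothing changed:
# extreme scores partly reflect noise, so repeat scores drift back toward
# the mean. Without a comparison group this can look like a real effect.
```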
Understand threats to external validity (generalizability of your findings):
Interaction of testing or data collection and the program or intervention
Interaction of selection procedures and the program or intervention
The effects of the research arrangements
The interference of multiple programs or interventions
Understand common research designs (illustrated in the sketch below):
Pre- and post- single group design
Interrupted time series design with a single group (simple time series)
Interrupted time series with multiple groups (multiple time series)
Control or comparison group
Choose a design
Consider your evaluation questions
Consider the nature of your program
Consider what your participants and staff will consent to
Consider your time constraints
Consider your resources
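As a rough illustration of how the analysis differs between two of these designs (a sketch using made-up data, not a prescription from the chapter): a pre- and post- single group design compares each participant against themselves, while a control or comparison group design compares change across groups, which helps rule out the internal validity threats listed earlier.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical task-completion times (seconds); all numbers are invented.
pre = rng.normal(60, 8, size=30)            # before the redesign
post = pre - 5 + rng.normal(0, 6, size=30)  # after: ~5 s faster on average

# Pre- and post- single group design: paired t-test on the same people.
t_paired, p_paired = stats.ttest_rel(pre, post)
print(f"single-group pre/post: t={t_paired:.2f}, p={p_paired:.4f}")

# Control or comparison group design: compare improvement across groups.
control_change = rng.normal(0, 6, size=30)     # no intervention
treatment_change = rng.normal(-5, 6, size=30)  # redesign group
t_ind, p_ind = stats.ttest_ind(treatment_change, control_change)
print(f"two-group comparison: t={t_ind:.2f}, p={p_ind:.4f}")
# The comparison group helps rule out history, maturation, and
# regression-toward-the-mean explanations that a single group cannot.
```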
4. Identify the practical issues.
Select users
Stay on budget
Stay on schedule
Find evaluators
Select equipment
5. Decide how to deal with the ethical issues.
I. Develop an informed consent form
II. Participants have a right to:
1. Know the goals of the study
2. Know what will happen to the findings
3. Privacy of personal information
4. Leave when they wish
5. Be treated politely
6. Evaluate, analyze, interpret, and present the data.
The following need to be considered:
1. Reliability
2. Validity
3. Biases
4. Scope
5. Ecological validity
Reliability
Can the study be replicated?
A reliable study produces the same results on separate occasions under the same circumstances
Another evaluator or researcher who follows exactly the same procedure should get similar results
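One way to picture reliability quantitatively (an illustrative sketch with invented data, not the chapter's method) is to run the same procedure twice under the same circumstances and check how strongly the two sets of measurements agree, for example with a correlation coefficient:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical task times for 15 participants measured in two identical
# sessions: stable ability plus session-specific noise. Numbers are made up.
ability = rng.normal(45, 10, size=15)
session_1 = ability + rng.normal(0, 3, size=15)
session_2 = ability + rng.normal(0, 3, size=15)

# Test-retest reliability: correlation between the two sessions.
r = np.corrcoef(session_1, session_2)[0, 1]
print(f"test-retest correlation: r = {r:.2f}")
# Values near 1.0 suggest the procedure yields similar results when
# repeated; low values warn that the findings may not replicate.
```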
Validity, biases, and scope
I. Validity
Does it measure what you expected it to measure?
II. Biases
Is the process creating biases?
Bias occurs when the results are distorted
III. Scope
Can the findings be generalized?
The scope of an evaluation study refers to how much its findings can be generalized
Ecological validity
Is the environment influencing the findings?
Ecological validity concerns how the environment in which an evaluation is conducted influences or even distorts the results
Ecological validity is also affected when participants are aware of being studied; this is known as the Hawthorne effect