MIXED METHOD
EVALUATION
Dawit Wolde (MSc, Lecturer), ICME-JU
College of Health Sciences, Jimma University
E-mail: dave86520@gmail.com or dawit818@yahoo.com
Cell phone: (+251) 922489558 / 967657712
P.O. Box: 378, Jimma University, Jimma, Ethiopia
Presentation objectives
At the end of the presentation, participants will be able to:
Define Mixed Method Evaluation
Differentiate between Quantitative and Qualitative Evaluation methods
Explain the rationale for MM Evaluation
Discuss decisions for designing MM Evaluation
Describe data collection and analysis techniques for MM Evaluation
Clarify approaches to mixing methods
Discuss challenges/limitations of MM Evaluation
Design a Mixed Method Evaluation
27/02/2016 MM Evaluation 2
Presentation outline
• Definition and basic concepts of MM Evaluation
• Quantitative (QUANT) vs. Qualitative (QUAL) Evaluation methods
• Rationale for MM Evaluation
• Decisions for designing MM Evaluation
• Data collection and analysis techniques in MM Evaluation
• Approaches for MM Evaluation
• Limitations for MM Evaluation
27/02/2016 MM Evaluation 3
Training methods
• Interactive lectures,
• Group discussions (exercises),
• Plenary presentations
• Allocated time: 12 hours (1/2 days)
27/02/2016 MM Evaluation 4
BRAINSTORMING Q
What is Mixed Method Evaluation? What will be mixed?
(10 minutes)
27/02/2016 MM Evaluation 5
Definition and basic concepts of MM
Evaluation
• Mixed Method Evaluation is a methodology for conducting
evaluation that involves collecting, analyzing, and integrating (or
mixing) quantitative and qualitative evaluation and/or data in a
single study or a longitudinal program of inquiry.
• Involves mixing or combining of quantitative and qualitative
evaluation techniques, methods, approaches, concepts or language
into a single study.
• Is based on the claim that qualitative and quantitative evaluation, used in
combination, provide a better understanding of a research problem or
issue than either evaluation approach alone.
27/02/2016 MM Evaluation 6
Definition and basic concepts ….
• Mixed-method evaluation is the intentional or planned use
of diverse methods for particular mixed-method purposes.
• For example: if the purpose is to determine the level of effect of
our program and how and why that effect emerged, a randomized
controlled trial or quasi-experimental design can be combined
with a case study design.
• In the MM approach to evaluation, methods or designs are
integrated throughout the evaluation process, including
theory development, data collection, analysis and
interpretation.
27/02/2016 MM Evaluation 7
Definition and basic concepts ….
• Some of the common areas in which mixed-method
approaches may be used include:
o Initiating, designing, developing and expanding
interventions;
o Evaluation;
o Improving research design; and
o Corroborating findings, data triangulation or convergence
27/02/2016 MM Evaluation 8
GROUP EXERCISES 1
Be in your previous group and differentiate between
Quantitative (QUANT) and Qualitative (QUAL) Evaluation/Research
methods using the following attributes:
• Nature of reality
• Purpose
• Research approach
• Subjectivity and objectivity
• Group studied
• Variables used
• Type of data collected
• Data collection techniques
• Type of data analysis
• Results
Then present the outcome of your discussion to participants (20 minutes).
27/02/2016 MM Evaluation 9
Quantitative Vs. Qualitative research
methods
Attributes | QUANT | QUAL
Purpose | To test hypotheses, look at cause & effect, and make predictions | To understand and interpret social interactions
Group studied | Larger and randomly selected | Smaller and non-randomly selected
Variables | Specific variables studied | Study of the whole, not variables
Research approaches | Descriptive, correlational, quasi-experimental and experimental studies | Narratives, phenomenology, grounded theory, ethnography, case study
Type of data collected | Numbers and statistics | Words, images or objects
Form of data collected | Quantitative data based on precise measurements using structured and validated data collection instruments | Qualitative data such as open-ended responses, interviews, participant observations, field notes and reflections
Type of data analysis | Identify statistical relationships | Identify patterns, features, themes
27/02/2016 MM Evaluation 10
Quantitative Vs. Qualitative research
methods….
Attributes | QUANT | QUAL
Objectivity and subjectivity | Objectivity is critical | Subjectivity is expected
Role of researcher | The researcher and their biases are not known to participants in the study, and participant characteristics are deliberately hidden from the researcher (double-blind studies) | The researcher and their biases may be known to participants in the study, and participant characteristics may be known to the researcher
Results | Generalizable findings that can be applied to other populations | Particular or specialized findings that are less generalizable
Scientific method | Confirmatory or top-down; the researcher tests hypotheses and theory with the data (deductive reasoning) | Exploratory or bottom-up; the researcher generates new hypotheses and theory from the data collected (inductive reasoning)
View of human behavior | Regular and predictable | Dynamic, situational, social and personal
Most common research objectives | Describe, explain and predict | Explore, discover and construct
Focus | Narrow-angle lens; tests a specific hypothesis | Wide-angle lens; examines the breadth and depth of phenomena
27/02/2016 MM Evaluation 11
Quantitative Vs. Qualitative research
methods….
Attributes | QUANT | QUAL
Nature of observation | Study behavior under controlled conditions; isolate causal effects | Study behavior in a natural environment
Nature of reality | Single reality; objective | Multiple realities; subjective
Final report | Statistical report with correlations, comparison of means and statistical significance of findings | Narrative report with contextual description and direct quotations from research participants
27/02/2016 MM Evaluation 12
Rationale for MM Evaluation
Group exercise 2:
• Be in your previous group and discuss the reasons for
using Mixed Method Evaluation.
• Then present the outcome of your discussion to participants (20
minutes).
27/02/2016 MM Evaluation 13
Rationale for MM Evaluation
• The major reasons for conducting a mixed method evaluation are:
1. Triangulation of evaluation findings: refers to enhancing the
validity or credibility of evaluation findings by comparing
information obtained from different methods of data collection.
• Using different methods to address the same phenomenon
• Seeks convergence, corroboration and correspondence of results
from the different methods
• Design examples: Concurrent Triangulation designs
2. Development: refers to using the results of one method to help
develop the sample or instrumentation of another.
• Design examples: Sequential Exploratory and Sequential
Transformative designs
27/02/2016 MM Evaluation 14
Rationale …
3. Complementarity: refers to extending the
comprehensiveness of evaluation findings through results
from different methods that broaden and deepen the
understanding reached.
• Using different methods to address different parts of the
phenomenon
• Seeks elaboration, enhancement, illustration and
clarification of results
• Design examples: Sequential Exploratory, Sequential
Explanatory, and Sequential Transformative designs;
moreover, Concurrent Nested and Concurrent
Transformative designs can also be used.
27/02/2016 MM Evaluation 15
Rationale …
4. Initiation: refers to generating new insights into evaluation findings
through results from the different methods that diverge and thus call for
reconciliation through further analysis, reframing or a shift in perspective.
• Looking for contradictory results and using different methods to
collect data to explain the discrepancy
• Seeks the discovery of paradox and contradiction, or new perspectives
or frameworks
• Design examples: Concurrent Nested and Concurrent Transformative
designs
5. Value diversity: refers to incorporating a wider diversity of values
through the use of different methods that advance different values.
• Using different methods to address different parts of the phenomenon
• Seeks to extend the breadth and range of inquiry
• Design examples: Sequential Exploratory, Sequential Transformative,
Concurrent Nested and Concurrent Transformative designs
27/02/2016 MM Evaluation 16
Rationale ….
Operational benefits realized by using mixed-method
designs or data collection strategies:
• Reveal (disclose) unanticipated results.
• Provide a deeper understanding of why change is
or is not occurring as planned.
• Capture a wider range of perspectives than
might be captured by a single method.
• Provide flexibility for the evaluator in choosing the
most appropriate method.
27/02/2016 MM Evaluation 17
Rationale ….
• Strengthen the representativeness of in-depth qualitative studies
(by linking case study selection to the quantitative sampling
frame).
• Improve the generalizability and credibility of the
evaluation conclusions.
• Help to examine the interactions among the complex and
changing contextual factors that can influence program
implementation and impacts.
• Provide information to improve the sufficiency of the
program.
• Ensure buy-in from both QUANT- and QUAL-oriented
evaluators and users.
27/02/2016 MM Evaluation 18
DESIGNING MM EVALUATION
27/02/2016 MM Evaluation 19
Basic steps in Program M&E
1. Engage stakeholders
2. Describe the program
3. Focus the evaluation design
4. Gather credible evidence
5. Justify conclusions
6. Ensure use and share lessons learned
Evaluation standards: Utility, Feasibility, Propriety, Accuracy
Source: CDC's framework for program evaluation in public health, 1999.
27/02/2016 MM Evaluation 20
The evaluation standards require the gathering of credible/correct
information (Accuracy) and the use of that information (Utility).
MM Evaluation supports both: using multiple sources of data is one
strategy to ensure that evaluation users perceive the findings as
credible, and this enhances both standards.
Designing MM Evaluation
• In planning an MM evaluation approach, four decisions are
required:
a) Decision 1: At which stage or stages of the evaluation will
MM be used? An MM design is much stronger if QUANT and
QUAL approaches are integrated into several (or ideally
all) stages of the evaluation.
b) Decision 2: Will QUANT and QUAL methods be used
sequentially or concurrently?
c) Decision 3: Will QUANT and QUAL methods be given
relatively equal weight, or will one methodology be
dominant?
d) Decision 4: Will the design be single- or multilevel?
27/02/2016 MM Evaluation 21
Designing MM Evaluation…
27/02/2016 MM Evaluation 22
Fig 1: Single-level MM design: sequential MM design with dominant QUANT approach. Studying inter-household transfers as a survival strategy for low-income households in Cartagena, Colombia (Source: Michael Bamberger, 2012).
Designing MM Evaluation…
27/02/2016 MM Evaluation 23
Fig 2: Single-level MM design: sequential MM design with dominant QUAL approach. Evaluating the adoption of new seed varieties by different types of farmers (Source: Michael Bamberger, 2012).
Designing MM Evaluation…
27/02/2016 MM Evaluation 24
Fig 3: Multilevel MM design: evaluating the effect of a school feeding program on attendance and performance (Source: Michael Bamberger, 2012).
Designing MM Evaluation
• Designing and implementing an MM evaluation follows
procedures similar to those used for designing and
implementing either a QUANT or a QUAL evaluation.
• The steps include:
o Developing Evaluation questions
o Matching questions with appropriate information
gathering techniques
o Collecting data
o Analyzing the data and
o Providing information to interested audiences
27/02/2016 MM Evaluation 25
GROUP EXERCISE 3
Be in your previous group: identify the stages at which MM
evaluation can be applied and clarify how to apply it at each
stage. Then present the outcome of your discussion to
participants (20 minutes).
27/02/2016 MM Evaluation 26
Designing MM Evaluation….
• The MM approach can be applied during the following stages of an
evaluation (or research):
I. Formulation of hypotheses
II. Sampling
III. Evaluation design
IV. Data collection
V. Triangulation
VI. Data analysis
27/02/2016 MM Evaluation 27
Designing MM Evaluation….
I. Formulation of hypotheses
• QUANT evaluation usually draws hypotheses deductively
from existing theories or literature reviews,
• while QUAL evaluation develops hypotheses inductively
as the study evolves.
• MM combines both approaches.
• For example: a hypothesis developed deductively using a
QUANT approach can be explored and refined through
QUAL approaches such as interviews and observations. In
contrast, the initial stage of QUAL data collection may
describe processes and issues that a QUANT approach can
then test through data collected in a sample survey.
27/02/2016 MM Evaluation 28
Designing MM Evaluation….
II. Sampling
• QUAL evaluation uses a small number of subjects selected
purposively (theoretical sampling) to ensure that all groups
are covered.
• QUANT evaluation uses a relatively large, randomly
selected sample, permitting generalization to a larger
population and the statistical comparison of different groups.
• MM sampling can use the same sampling frame both to generate
a large QUANT survey sample and to select a small but
representative sample for in-depth QUAL analysis (a sketch of
this follows below).
• Ensuring that the QUAL samples are reasonably
representative of the total sample population is one of the
most important contributions of MM designs.
27/02/2016 MM Evaluation 29
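A minimal sketch of the MM sampling idea above, in Python/pandas. The data frame, column names and stratum size are hypothetical, invented only to illustrate drawing a QUAL subsample from the same frame as the QUANT survey:

```python
import pandas as pd

# Hypothetical QUANT survey sample drawn from the shared sampling frame.
survey = pd.DataFrame({
    "household_id": range(1, 401),
    "region": ["north", "south", "east", "west"] * 100,
    "income_quintile": [1, 2, 3, 4, 5] * 80,
})

# Draw a small QUAL subsample that mirrors the survey's composition by
# stratifying on region and income quintile (two cases per stratum here).
qual_cases = (
    survey.groupby(["region", "income_quintile"], group_keys=False)
          .apply(lambda g: g.sample(n=min(2, len(g)), random_state=42))
)

print(len(qual_cases), "households selected for in-depth QUAL follow-up")
```

Stratifying the QUAL cases on the same variables used to structure the QUANT sample is one simple way to keep the in-depth cases broadly representative of the survey population.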
Designing MM Evaluation….
III. Evaluation design
• For example: use of a QUAL approach to evaluate the project
implementation process and the influence of contextual
variables on project performance in communities where a
QUANT survey of project participants is being conducted.
IV. Data collection
• QUANT evaluations collect standardized numerical data,
whereas QUAL evaluations often use less structured data
collection methods that provide greater flexibility and seek to
understand the complexities of a situation.
• MM data collection builds on the strengths of QUANT data
while digging deeper, capturing sensitive data, and studying
processes and behavioral change.
27/02/2016 MM Evaluation 30
Designing MM Evaluation….
V. Triangulation
• MM evaluations tend to use triangulation more systematically and as an
integral part of the evaluation design.
• Information obtained through triangulation is used to:
o enhance the reliability and validity of estimates of key
indicators by comparing information from different sources;
o deepen the understanding of the meaning of statistical
relationships identified in the quantitative analysis; and
o ensure that the perspectives of all key stakeholders, with
particular emphasis on poor and vulnerable groups, are
captured and compared.
• If estimates obtained from different sources are consistent,
this increases the validity and credibility of the data (a sketch
of such a comparison follows below).
o What if the estimates differ?
27/02/2016 MM Evaluation 31
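To make the comparison step concrete, here is a small hedged sketch in Python. The indicator, source names, figures and tolerance are all invented for illustration; they are not from the source:

```python
# Hypothetical estimates of the same indicator (e.g. % of caretakers seeking
# care early) obtained from three different data collection methods.
estimates = {
    "household_survey": 38.0,
    "clinic_records": 45.5,
    "key_informant_interviews": 36.0,
}

spread = max(estimates.values()) - min(estimates.values())
TOLERANCE = 5.0  # assumed acceptable divergence, in percentage points

if spread <= TOLERANCE:
    print(f"Estimates converge (spread {spread:.1f} pp); confidence in the indicator increases.")
else:
    print(f"Estimates diverge (spread {spread:.1f} pp); follow up qualitatively to explain the gap.")
```

If the spread exceeds the tolerance, the divergence is itself a finding: it prompts further qualitative work to explain the discrepancy rather than simply averaging the numbers.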
Designing MM Evaluation….
VI.Data analysis
• In MM evaluation, both analysis techniques are employed.
o QUAL analysis helps to understand the meaning that
different subjects or groups give to the statistical
associations found in the QUANT analysis and
provides cases and examples to illuminate the findings.
o QUANT analysis, in turn, is used to assess how well the
cases included in the QUAL studies represent the total
population of interest and which sectors, if any, have not
been covered (a sketch of such a check follows below).
27/02/2016 MM Evaluation 32
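A small sketch of that representativeness check, assuming the hypothetical `survey` and `qual_cases` data frames from the earlier sampling sketch (names and columns are illustrative only):

```python
import pandas as pd

def coverage_check(survey: pd.DataFrame, qual_cases: pd.DataFrame, column: str) -> pd.DataFrame:
    """Compare the distribution of one characteristic in the QUAL cases
    against the full QUANT survey to see how representative the cases are."""
    full = survey[column].value_counts(normalize=True).rename("survey_share")
    sub = qual_cases[column].value_counts(normalize=True).rename("qual_share")
    out = pd.concat([full, sub], axis=1).fillna(0.0)
    out["gap"] = out["qual_share"] - out["survey_share"]
    return out.sort_values("gap")

# Example: are any regions missing or under-represented among the QUAL cases?
# print(coverage_check(survey, qual_cases, "region"))
```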
GROUP EXERCISE 4
Be in your previous group and:
• Identify data collection techniques in MM evaluation,
• Discuss their strengths and weaknesses, and
• Clarify the basis for choosing among them.
• Then present the outcome of your discussion to
participants (30 minutes).
27/02/2016 MM Evaluation 33
Designing MM Evaluation….
• In any particular evaluation, the choice between the various
data collection techniques and strategies depends on answers
to the following questions:
o Who is the information for, and who will use the findings
of the evaluation?
o What kinds of information are needed?
o How is the information to be used? For what purpose is
the evaluation being done?
o When is the information needed?
o What resources are available to conduct the evaluation?
27/02/2016 MM Evaluation 34
Designing MM Evaluation….
• Data are commonly collected through both qualitative and
quantitative methods.
• Qualitative approaches aim to address the 'how' and 'why' of a
program and tend to use unstructured methods of data
collection to fully explore the topic.
• For example, they try to answer: 'Why do participants enjoy
the program?' and 'How does the program help increase self-
esteem for participants?'
• Quantitative approaches, on the other hand, address the 'what' of
the program.
• They use a systematic, standardized approach and
• ask questions such as 'What activities did the program run?'
and 'What skills do staff need to implement the program
effectively?'
27/02/2016 MM Evaluation 35
Designing MM Evaluation….
Commonly used data collection techniques
Quantitative
• Structured surveys
• Structured observation guides
• Program MIS on input and output data
• Review of institutional data (clinic records, school records, etc.)
Qualitative
• In-depth interviews
• Key informant interviews
• Group interviews (focus groups, community meetings, etc.)
• Unstructured observation: participant and non-participant observation
• Video or audio recordings
• Photography
• Document analysis
• Artifacts/objects
• Participatory group techniques, e.g. PRA (participatory rural appraisal), most significant change (MSC)
27/02/2016 MM Evaluation 36
Designing MM Evaluation….
a) Observation:
• Helps to gather firsthand data on the programs, processes or
behaviors being studied.
• Provides evaluators with an opportunity to collect data
on a wide range of behaviors, to capture a great variety of
interactions, and to openly explore the evaluation topic.
• By directly observing operations and activities, the evaluator
can develop a holistic perspective, i.e., an understanding of
the context within which the project operates.
• Also allows the evaluator to learn about things the participants or
staff may be unaware of, or that they are unwilling or unable
to discuss in an interview or focus group.
27/02/2016 MM Evaluation 37
Designing MM Evaluation….
When to use observations?
• Can be useful during both the formative and
summative phases of evaluation
• For example, during the formative phase,
observations can be useful in determining whether
or not the project is being delivered and operated as
planned.
• Observations during the summative phase of
evaluation can be used to determine whether or not
the project is successful.
27/02/2016 MM Evaluation 38
Designing MM Evaluation….
• Types of information for which observations are a good source:
27/02/2016 MM Evaluation 39
The setting - The physical environment within which the project takes place.
The human, social environment - The ways in which all actors (staff, participants, others) interact and
behave toward each other.
Project implementation activities - What goes on in the life of the project? What do various actors (staff,
participants, others) actually do? How are resources allocated?
The native language of the program - Different organizations and agencies have their own language or
jargon to describe the problems they deal with in their work; capturing the precise language of all
participants is an important way to record how staff and participants understand their experiences.
Nonverbal communication - Nonverbal cues about what is happening in the project: the way all
participants dress, express opinions, physically space themselves during discussions, and arrange
themselves in their physical setting.
Notable nonoccurrences - Determining what is not occurring although the expectation is that it should
be occurring as planned by the project team, or noting the absence of some particular activity/factor that
is noteworthy and would serve as added information.
Designing MM Evaluation….
How many observations?
• In participant observation this may be a moot point (except with regard
to data recording), but when an outside observer is used, the question of
"how much" becomes very important.
• While most people agree that one observation is not enough, there is no
hard and fast rule regarding how many samples need to be drawn.
• Recommendation:
o avoid atypical situations,
o carry out observations more than once, and
o where possible and relevant, spread the observations out over time.
• Participant observation is often difficult to incorporate in evaluations;
therefore, the use of outside observers is far more common.
27/02/2016 MM Evaluation 40
Designing MM Evaluation….
Advantages of observations:
o Provide direct information about behavior of individuals and
groups
o Permit evaluator to enter into and understand situation/context
o Provide good opportunities for identifying unanticipated outcomes
o Exist in natural, unstructured, and flexible setting
Disadvantages of Observations:
o Expensive and time consuming
o Need well-qualified, highly trained observers; may need to be
content experts
o May affect behavior of participants (Hawthorne effect)
o Selective perception of observer may distort data
o Investigator has little control over situation
o Behavior or set of behaviors observed may be atypical
27/02/2016 MM Evaluation 41
Designing MM Evaluation….
b) Interviews:
• Interviews provide very different data from observations.
o They allow the evaluator to capture the perspectives of project
participants, staff, and others associated with the project.
o They are used with the assumption that the participants'
perspectives are meaningful, knowable, and able to be
made explicit, and that their perspectives affect the
success of the project.
o Compared with a survey, an interview is selected when
interpersonal contact is important and when opportunities for
follow-up of interesting comments are desired.
27/02/2016 MM Evaluation 42
Designing MM Evaluation….
Types of Interviews:
• Two types of interviews are commonly used in
evaluation research:
o Structured interviews
o In-depth(unstructured) interviews
27/02/2016 MM Evaluation 43
Designing MM Evaluation….
In-depth interview:
• Is a dialogue between a skilled interviewer and an
interviewee
• Its goal is to elicit rich, detailed material that can be
used in analysis (Lofland and Lofland, 1995).
• Such interviews are best conducted face to face,
although in some situations telephone interviewing
can be successful.
• In-depth interviews are characterized by extensive
probing and open-ended questions.
27/02/2016 MM Evaluation 44
Designing MM Evaluation….
When to use in-depth interviews?
• In-depth interviews can be used at any stage of the evaluation process.
• They are especially useful in answering questions such as those suggested
by Patton (1990):
o What does the program look and feel like to the participants? To
other stakeholders?
o What are the experiences of program participants?
o What do stakeholders know about the project?
o What thoughts do stakeholders who are knowledgeable about the program
have on its operations, processes, and outcomes?
o What are participants' and stakeholders' expectations?
o What features of the project are most salient to the participants?
o What changes do participants perceive in themselves as a result of
their involvement in the project?
27/02/2016 MM Evaluation 45
Designing MM Evaluation….
• Specific circumstances for which in-depth interviews are
particularly appropriate include:
o Complex subject matter;
o Detailed information sought;
o Busy, high-status respondents; and
o Highly sensitive subject matter.
27/02/2016 MM Evaluation 46
Designing MM Evaluation….
Advantages of in-depth interviews:
• Usually yield richest data, details, new insights
• Permit face-to-face contact with respondents
• Provide opportunity to explore topics in depth
• Afford ability to experience the affective as well as cognitive aspects of
responses
• Allow interviewer to explain or help clarify questions, increasing the
likelihood of useful responses
• Allow interviewer to be flexible in administering interview to particular
individuals or circumstances
Disadvantages of in-depth interviews:
• Expensive and time-consuming
• Need well-qualified, highly trained interviewers
• Interviewee may distort information through recall error, selective
perceptions, desire to please interviewer
• Flexibility can result in inconsistencies across interviews
• Volume of information too large; may be difficult to transcribe and
reduce data
27/02/2016 MM Evaluation 47
Designing MM Evaluation….
c)Focus Groups:
• Combine elements of both interviewing and participant
observation
• Focus groups capitalize on group dynamics.
• The hallmark of focus groups is the explicit use of the group
interaction to generate data and insights that would be
unlikely to emerge without the interaction found in a group.
• The technique inherently allows observation of group
dynamics, discussion, and firsthand insights into the
respondents' behaviors, attitudes, language, etc.
• Focus groups are typically a gathering of 8 to 12 people who share
some characteristics relevant to the evaluation (the group can be
smaller, depending on the interaction).
27/02/2016 MM Evaluation 48
Designing MM Evaluation….
When to use focus groups?
• Focus groups are useful in answering the same type of
questions as in-depth interviews, except in a social context.
• Specific applications of the focus group method in
evaluations include:
o identifying and defining problems in project
implementation;
o identifying project strengths, weaknesses, and
recommendations;
o assisting with interpretation of quantitative findings;
o obtaining perceptions of project outcomes and impacts;
and generating new ideas.
27/02/2016 MM Evaluation 49
Designing MM Evaluation….
What types of groups?
• The participants are usually a relatively
homogeneous group of people.
• Answering the question, "Which respondent
variables represent relevant similarities among the
target population?" requires some thoughtful
consideration when planning the evaluation.
• Respondents' social class, level of expertise, age,
cultural background, and sex should always be
considered.
27/02/2016 MM Evaluation 50
Designing MM Evaluation….
How many groups?
• Determining how many groups requires balancing
cost and information needs.
• A good rule of thumb is to conduct at least two
groups for every variable considered to be relevant
to the outcome (sex, age, educational level, etc.).
27/02/2016 MM Evaluation 51
Other Qualitative Methods
Reading assignment:
• Less common but potentially useful qualitative
methods for project evaluation include:
o Document studies(public records and personal
records),
o Key informants,
o Alternative (authentic) assessment or
performance assessment, and
o Case studies.
27/02/2016 MM Evaluation 52
Designing MM Evaluation….
Summary: Advantages and Disadvantages of MM data collection techniques
1.Qualitative
Advantages:
• Good for further exploring the effects and unintended consequences of a
program
Disadvantages:
• Expensive and time consuming to implement
• The findings cannot be generalized to participants outside of the program
and are only indicative of the group involved
2.Quantitative
Advantages:
• They are cheaper to implement,
• Are standardized so comparisons can be easily made and
• the size of the effect can usually be measured.
Disadvantages:
• Limited in their capacity for the investigation and explanation of similarities
and unexpected differences
Recommended: Combining both techniques
27/02/2016 MM Evaluation 53
GROUP EXERCISE 5
Be in your previous group and discuss the types of
triangulation used in MM evaluation. Then present the outcome of
your discussion to participants (20 minutes).
27/02/2016 MM Evaluation 54
Designing MM Evaluation….
Types of triangulation used in MM Evaluation

Type | Examples
Using different conceptual frameworks | Comparing feminist, human rights, social exclusion or economic (e.g. cost-benefit) analysis frameworks
Different methods of data collection (triangulation by DCT) | Comparing structured surveys, direct observation, secondary data, artifacts
Different interviewers | Comparing the effect of interviewer sex, age, ethnicity, economic status, form of dress, language, etc. on responses
Different times (triangulation by time) | Comparing responses or observations at different times of day, days of the week, or times of year
Different locations and contexts | Comparing responses and observations when interviews are conducted in the home when other people are present, in locations where respondents may be able to speak more freely, in the street and other public places, at work, or in the classroom
27/02/2016 MM Evaluation 55
GROUP EXERCISE 6
Be in your previous group and discuss the data analysis
techniques used in MM evaluation. Then present the outcome of
your discussion to participants (15 minutes).
27/02/2016 MM Evaluation 56
Designing MM Evaluation….
Data analysis techniques in MM Evaluation

Approach | Description
Parallel mixed method data analysis | Two separate analysis processes: QUANT data are analyzed using conventional QUANT methods (frequency tables, cross-tabulations, regression analysis, etc.) while a separate analysis of QUAL data is conducted using QUAL methods such as content analysis. The findings of the two sets of analysis are then compared.
Conversion mixed method data analysis | QUAL data are converted into QUANT indicators ("quantitizing") using rating, scoring and scaling so that QUANT analysis techniques can be used; or QUANT data are converted to QUAL indicators ("qualitizing") so that QUAL analysis procedures can be used.
Sequential mixed method data analysis | (a) QUAL data analysis is followed by QUANT analysis; (b) QUANT data analysis is followed by QUAL analysis; (c) iterative MM designs, in which the analysis includes sequential QUANT and QUAL steps.
Multilevel mixed method analysis | QUANT and QUAL analysis techniques are used at different levels of a multilevel evaluation design.
27/02/2016 MM Evaluation 57
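A minimal sketch of the "quantitizing" step named in the conversion row of the table above. The coded interview data and theme labels are hypothetical, invented only for illustration:

```python
import pandas as pd

# Hypothetical coded interview data: each row is a respondent, with the
# themes a coder assigned to their open-ended answer.
coded = pd.DataFrame({
    "respondent": ["R1", "R2", "R3", "R4"],
    "themes": [
        ["cost", "distance"],
        ["staff attitude"],
        ["cost"],
        ["distance", "staff attitude", "cost"],
    ],
})

# "Quantitize": turn each theme into a 0/1 indicator column so the coded
# QUAL material can enter cross-tabulations or regression models.
indicators = (
    coded.explode("themes")
         .assign(value=1)
         .pivot_table(index="respondent", columns="themes", values="value", fill_value=0)
)
print(indicators)
```

Each respondent's coded themes become indicator variables with a nominal scale, so the qualitative material can be analyzed alongside the survey data.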
APPROACHES TO MIXED
METHOD EVALUATION
Day 2
Types of Mixed Method approaches to
Evaluation
• Based on the timing of data collection, the emphasis given to each
type of data collected, and the mixing approach used, MM
evaluation designs are classified into six types:
o Sequential
1. Sequential Explanatory design
2. Sequential Exploratory design
3. Sequential Transformative design
o Concurrent
1. Concurrent Triangulation design
2. Concurrent Nested/Embedded design
3. Concurrent Transformative design
27/02/2016 MM Evaluation 59
Sequential explanatory design
• Quantitative data are collected and analyzed first, followed by the
collection and analysis of qualitative data
• That means qualitative and quantitative data are not combined
(mixed) in the data analysis; rather, integration takes place when
the findings are interpreted.
• In this case qualitative data are used to enhance, complement, and
in some cases follow up on unexpected quantitative findings.
• Its strength:
 separate phases of design, data collection, and reporting for
qualitative and quantitative data(easy to implement)
• Its weaknesses:
 the time and resources needed for separate data collection
phases
 the expertise needed to integrate the qualitative and
quantitative findings.
27/02/2016 MM Evaluation 60
Sequential explanatory design….
More weight to the quantitative component
27/02/2016 MM Evaluation 61
QUAN data collection → QUAN data analysis → qual data collection → qual data analysis → interpretation of the entire analysis
Sequential exploratory design
• The reverse of the sequential explanatory design, with
quantitative data used to enhance and complement
qualitative results.
• This approach is especially useful when the researcher's
interest is in enhancing generalizability, and it may or may
not be guided by a theoretical perspective.
• Instrument development is a typical example of this
approach.
• Its strengths and weaknesses are similar to those of the
sequential explanatory design.
27/02/2016 MM Evaluation 62
Sequential exploratory design…
More weight to the qualitative component
27/02/2016 MM Evaluation 63
QUAL data collection → QUAL data analysis → quan data collection → quan data analysis → interpretation of the entire analyses
Example: Sequential combinations pattern in
Explanatory and Exploratory design
27/02/2016 MM Evaluation 64
An evaluation conducted to determine whether a WASH project is leading
to a higher rate of hand washing in a particular community.
Examples of Explanatory and Exploratory
designs
Case I: Using one method to explain the findings of another
method: Explanatory
• An evaluation intended to measure the extent of implementation, and the
factors that affect implementation, of a youth vocational training
project can use both quantitative and qualitative methods.
• First, using a quantitative method, it determines the extent
of implementation of the project; next, using a qualitative
method, it explores why the project is or is not being
implemented in the way it is implemented.
27/02/2016 MM Evaluation 65
Examples of Explanatory and Exploratory designs….
Case II: Using one method to inform the design of another method:
Exploratory
• In some cases, one method can be used to help guide the use of another
method, or to explain the findings from another method.
For example:
• In the first case, imagine that the evaluation of a youth vocational training
project includes the evaluation question: "Why do youth choose to
participate in project activities?"
• The evaluator may wish to conduct a survey of participants, but be unsure how
to word the questions, or what answer choices to include. By first
conducting individual and focus group interviews with participants and non-
participants, the evaluator may be able to identify some common reasons for
participation among the target population, and then use these data to
construct the survey.
• In this way, the qualitative methods (individual and focus group interviews),
conducted first, can inform the quantitative method (survey), that comes
afterward. Because this use of mixed-method evaluation requires each method
to be sequenced, one after the other, these methods are often incorporated into
mixed-method evaluations using sequential processes.
• Again, the design choice has time and resource implications.
27/02/2016 MM Evaluation 66
Sequential transformative design
• Either qualitative or quantitative data may be collected first.
• Once again, qualitative and quantitative data are analyzed
separately, and the findings are integrated during the
interpretation phase.
• This approach is often used to ensure that the views and
perspectives of a diverse range of participants are
represented or when a deeper understanding of a process
that is changing as a result of being studied is sought.
• Its strengths and weaknesses are similar to those of the
sequential explanatory design.
27/02/2016 MM Evaluation 67
Sequential transformative design….
27/02/2016 MM Evaluation 68
Sequential transformative design….
Case III: To ensure that the views and perspectives of a diverse range of
participants are represented
• For example, after evaluating a counseling project working with students
at high schools, evaluators found that counselors were misguiding
students in choosing their courses at college level because of differential
advising: some counselors follow the standards-based curriculum and
others the traditional one. As a result, some students are advised to begin
their college mathematics with a course they should not take. Although
the majority of students disagreed with the recommendations, most
students followed the recommendation on which mathematics course to
take. In order to discover and understand students' experiences with the
advising process and its implications for their college experience (the
adversely affected students), a case study was conducted among
purposively sampled students (who began their college mathematics
course-taking at different difficulty levels). The information obtained
from interviews and students' academic records could then be used to
inform the construction of a survey to be sent to a representative sample
of students.
27/02/2016 MM Evaluation 69
Concurrent triangulation design
• Used when the focus is on confirming, cross-validating, or
corroborating findings from a single study.
• Qualitative and quantitative data are collected concurrently,
such that weaknesses of one kind of data are ideally offset by
strengths of the other kind.
• Typically, equal weight is given to the two kinds of data in
mixing the findings, although one kind of data can be
weighted more heavily.
• The qualitative and quantitative data are analyzed separately,
and mixing takes place when the findings are interpreted.
27/02/2016 MM Evaluation 70
Concurrent triangulation design…
27/02/2016 MM Evaluation 71
QUAN data collection → QUAN data analysis, conducted concurrently with QUAL data collection → QUAL data analysis; the results of the two strands are then compared (QUAN + QUAL).
Concurrent triangulation design…
27/02/2016 MM Evaluation 72
Analysis will take place separately.
Concurrent triangulation design…
Strength:
• the ability to maximize the information provided by a single
study (for example, when interest is in cross-validation), and
a shorter data collection period compared to the sequential
data collection approaches.
Weaknesses:
• the additional complexity associated with collecting
qualitative and quantitative data at the same time and the
expertise needed to usefully apply both methods.
• Discrepancies between the qualitative and quantitative
findings may also be difficult to reconcile.
27/02/2016 MM Evaluation 73
Examples of Concurrent triangulation
design…
Case IV: Using different methods to answer different
questions or to answer different parts of the same question
• For example: an evaluation was conducted on an ICCM
program and intended to answer the following questions:
o Is there a statistically significant difference in early child
health-seeking behavior of the community (caretakers)
between health posts with ICCM services and those
without?
o How do caretakers perceive the quality of the ICCM
services provided?
• Here, to answer the first question a quasi-experimental
design (pre-post) can be employed, and for the second
question (perception) a case study design can be used.
27/02/2016 MM Evaluation 74
Examples of Concurrent triangulation
design…
Case V: Using different methods to answer the same question
For example:
• Evaluators may use secondary data from health institutions to
measure the implementation status of long-term family planning
methods among clients after implementation of the project. But
they may also suspect that health institutions are either under- or
over-reporting. To help mitigate the risk of bias caused by under- or
over-reporting in the government data, the evaluation team may
conduct in-depth interviews or FGDs with key informants among FP
clients to obtain a more accurate picture of how the project was
implemented and how it reached program clients.
• The data generated from the qualitative method help to provide a
broad picture of the implementation of the project and how well
clients are accepting it.
27/02/2016 MM Evaluation 75
Concurrent nested/Embedded design
• Qualitative and quantitative data are collected concurrently
and analyzed together during the analysis phase.
• Greater weight is given to one kind of data, in the sense that
one kind of data is typically embedded in the other.
• Qualitative and quantitative data are mixed in the analysis
phase, a process that can take many different forms.
• Four strategies to mix qualitative and quantitative data in the
analysis stage:
o Data transformation
o Typology development
o Extreme case analysis
o Data consolidation/merging
27/02/2016 MM Evaluation 76
Concurrent nested/Embedded design…
Data transformation:
o In which qualitative data are transformed to quantitative data or
quantitative data are transformed into narrative, and the resulting
data are analyzed.
o Typically, the transformed qualitative data exhibit a nominal or
ordinal scale of measurement.
Typology development:
o In which the analysis of one kind of data produces a typology or
set of categories that is used as a framework in analyzing the
other kind of data.
o The analyses of the qualitative data could produce themes that
allow a variable with nominally scaled categories to be developed,
in which the categories provide an explanation of why things
happened in the way they happened, and if not, why not.
o This variable could then be used in the quantitative analysis
(a sketch follows below).
27/02/2016 MM Evaluation 77
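A small sketch of typology development, assuming a hypothetical QUAL-derived typology that has already been coded and attached to QUANT survey records (all names and values are illustrative only):

```python
import pandas as pd

# Hypothetical merged data: "barrier_type" is a nominal typology derived
# from interview themes, joined onto quantitative survey records.
merged = pd.DataFrame({
    "used_service": [1, 0, 0, 1, 0, 1, 0, 0, 1, 0],
    "barrier_type": ["none", "cost", "distance", "none", "cost",
                     "none", "distance", "cost", "none", "distance"],
})

# The QUAL typology becomes just another variable in the QUANT analysis,
# e.g. a cross-tabulation of service use by barrier type.
print(pd.crosstab(merged["barrier_type"], merged["used_service"], normalize="index"))
```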
Concurrent nested/Embedded design…
Extreme case analysis:
o In which extreme cases identified with one kind of data are
examined with the other kind, with the goal of explaining why
these cases are extreme.
o For example: Statistical outliers identified through quantitative
data can be explained with qualitative data during analysis.
Data consolidation/merging:
o In which a careful review of both kinds of data leads to the
creation of new variables or data sets expressed in a qualitative or
quantitative metric.
o The merged data are then used in additional analyses.
o A review of the qualitative and quantitative data may suggest
new variables.
27/02/2016 MM Evaluation 78
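A small sketch of the extreme case strategy described above, assuming a hypothetical table of facility-level outcome scores; the facility names, figures and 2-standard-deviation cut-off are invented for illustration:

```python
import pandas as pd

# Hypothetical facility-level outcome scores from the QUANT analysis.
scores = pd.DataFrame({
    "facility": [f"F{i}" for i in range(1, 21)],
    "coverage_pct": [62, 58, 65, 60, 97, 61, 59, 63, 18, 64,
                     60, 57, 66, 62, 61, 59, 63, 60, 58, 62],
})

# Flag statistical outliers (here: more than 2 standard deviations from the
# mean) so they can be followed up with qualitative work that explains
# *why* these cases are extreme.
mean, sd = scores["coverage_pct"].mean(), scores["coverage_pct"].std()
extreme = scores[(scores["coverage_pct"] - mean).abs() > 2 * sd]
print(extreme)
```

The flagged facilities would then be followed up qualitatively (interviews, site visits) to explain why they perform so differently from the rest.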
Concurrent nested/Embedded design…
27/02/2016 MM Evaluation 79
QUAN(qual): a qualitative strand embedded in a dominant quantitative study, or QUAL(quan): a quantitative strand embedded in a dominant qualitative study; in both variants the findings are analyzed together.
Concurrent nested/Embedded design…
27/02/2016 MM Evaluation 80
Example: QUAN pre-test data & results → intervention → QUAN post-test data & results, with qual process data (collected before, during and after the trial) embedded, followed by interpretation.
Concurrent nested/Embedded design….
Strength:
• The shorter data collection period and the multiple
perspectives embedded in the data,
Weaknesses:
• The level of expertise needed to execute the study
successfully, especially in mixing the qualitative and
quantitative data within the data analysis, and difficulties in
reconciling conflicting results from the qualitative and
quantitative analyses.
27/02/2016 MM Evaluation 81
Concurrent transformative design
• Qualitative and quantitative data are collected concurrently
and can be weighted equally or unequally during the
integration of findings.
• The design may have one method embedded in the other so
that diverse participants are given a choice in the change
process of an organization.
• Qualitative and quantitative data are typically mixed during
the analysis phase.
• The strengths and weaknesses of this approach are similar to
those of the other concurrent approaches.
27/02/2016 MM Evaluation 82
Concurrent transformative design….
27/02/2016 MM Evaluation 83
QUAN + QUAL, guided by a vision, advocacy, ideology or framework; or QUAL guided by a vision, advocacy, ideology or framework, with quan embedded.
Concurrent transformative design….
Strength:
• Strengths include a shorter data collection period.
Weaknesses:
• The need to transform data so that it can be mixed in the
analysis phase, and difficulties in reconciling conflicting
results from the qualitative and quantitative data.
27/02/2016 MM Evaluation 84
GROUP EXERCISE 7
In your group, discuss:
1. Operational considerations in deciding between
sequential and concurrent designs.
2. Limitations or challenges of MM evaluation.
Then present the outcome of your discussion to
participants (20 minutes).
27/02/2016 MM Evaluation 85
Challenges in using Mixed Methods in
Evaluations
• It is difficult to ensure scientific rigor in the evaluation.
• It is difficult to ensure that the two data collection methods
complement but do not duplicate each other, so that the cost of
gathering evaluation information is as efficient as possible.
• The methodological mix requires that evaluators have a
range of skills and abilities in both approaches: QUANT and
QUAL.
• Much more expensive and complex than either approach
alone; this might deter evaluation sponsors.
• Conflicting results sometimes occur, causing
disagreement and making interpretation difficult.
• Paradigm differences might create disagreement among
stakeholders.
27/02/2016 MM Evaluation 86
Challenges in using Mixed Methods in
Evaluations…
Summary:
• Increases the complexity of evaluations: they are complex
to plan and conduct.
• Relies on a multidisciplinary team of researchers:
quantitative and qualitative.
• Requires increased resources: they are labor intensive and
require greater resources and time than those needed to
conduct a single-method study.
27/02/2016 MM Evaluation 87
Summary of section
Features of MM Evaluation:
• Timing: concurrent or sequential
• Integration: at the data analysis phase (connecting, transforming or separating) and/or at the interpretation phase (separating, connecting or merging)
• Purpose: triangulation, complementarity, development, initiation and/or value diversity
• Priority: equal or unequal
27/02/2016 MM Evaluation 88
Summary of section
Mixed Method design | Methodological rationale
Sequential Explanatory design | Complementarity
Sequential Exploratory design | Development, complementarity and/or expansion
Sequential Transformative design | Complementarity, development and/or expansion/value diversity
Concurrent Triangulation design | Triangulation
Concurrent Nested design | Complementarity, initiation and/or expansion/value diversity
Concurrent Transformative design | Complementarity, initiation and/or expansion/value diversity
27/02/2016 MM Evaluation 89
Rationale and approaches of MM Evaluation
GROUP EXERCISE 8 (SECTION 3 END)
Be in your previous group and answer the following
questions. Then present the outcome of your discussion to
participants (60 minutes).
27/02/2016 MM Evaluation 90
Section 3 end group exercises
• For your evaluation questions from the previous sections
(Protocol Evaluation Questions), design an MM approach to
evaluation.
1. Which approach to MM evaluation is appropriate for your
evaluation question? Why?
2. At which stage of the evaluation will you apply MM
evaluation? Why?
3. List the appropriate data collection techniques you
will employ to answer your evaluation questions.
4. Which MM evaluation analysis technique will be used?
Why?
5. List the types of triangulation you will use, if any.
27/02/2016 MM Evaluation 91
Recommended readings
o John W. Creswell. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. Second edition.
o Jennifer C. Greene. Mixed Methods in Social Inquiry. Jossey-Bass, 2007.
o Stefan Cojocaru. Challenges in Using Mix Methods in Evaluation. Volume 3, September 2013.
o Michael Bamberger. Introduction to Mixed Methods in Impact Evaluation. Impact Evaluation Notes, August 2012.
o Michael Bamberger. The Mixed Methods Approach to Evaluation. Social Impact Concept Note Series, June 2013.
27/02/2016 MM Evaluation 92
Recommended readings…..
27/02/2016 MM Evaluation 93
THANK YOU
“…….good journey…..”
27/02/2016 MM Evaluation 94
Contenu connexe

Tendances

Qualitative and quantitative analysis
Qualitative and quantitative analysisQualitative and quantitative analysis
Qualitative and quantitative analysisNellie Deutsch (Ed.D)
 
Mixed methods research2012
Mixed methods research2012Mixed methods research2012
Mixed methods research2012Gus Cons
 
Mixed methods-research -design-and-procedures
Mixed methods-research -design-and-proceduresMixed methods-research -design-and-procedures
Mixed methods-research -design-and-proceduresABCComputers
 
Seminar on tools of data collection Research Methodology
Seminar on tools of data collection Research MethodologySeminar on tools of data collection Research Methodology
Seminar on tools of data collection Research Methodologyprajwalshetty86
 
Mixed Method Research Design
Mixed Method Research DesignMixed Method Research Design
Mixed Method Research Designbclassengdept
 
Mixed methods designs
Mixed methods designs Mixed methods designs
Mixed methods designs faisal khallab
 
Basic introduction to research methods
Basic introduction to research methodsBasic introduction to research methods
Basic introduction to research methodssmshaake
 
Qualitative vs Quantitative
Qualitative vs QuantitativeQualitative vs Quantitative
Qualitative vs Quantitativeeilire91
 
Good practice in evaluation research
Good practice in evaluation researchGood practice in evaluation research
Good practice in evaluation researchQSR International
 
Mixed research
Mixed researchMixed research
Mixed researchjeamroan
 
Scale development khalid-key concepts
Scale development khalid-key conceptsScale development khalid-key concepts
Scale development khalid-key conceptsKhalid Mahmood
 

Tendances (20)

Qualitative and quantitative analysis
Qualitative and quantitative analysisQualitative and quantitative analysis
Qualitative and quantitative analysis
 
Mixed methods research2012
Mixed methods research2012Mixed methods research2012
Mixed methods research2012
 
Mixed Method Research
Mixed Method ResearchMixed Method Research
Mixed Method Research
 
Bolouri qualitative method
Bolouri qualitative methodBolouri qualitative method
Bolouri qualitative method
 
Mixed methods-research -design-and-procedures
Mixed methods-research -design-and-proceduresMixed methods-research -design-and-procedures
Mixed methods-research -design-and-procedures
 
Week 12 mixed methods
Week 12   mixed methodsWeek 12   mixed methods
Week 12 mixed methods
 
Seminar on tools of data collection Research Methodology
Seminar on tools of data collection Research MethodologySeminar on tools of data collection Research Methodology
Seminar on tools of data collection Research Methodology
 
Mixed Method Research Design
Mixed Method Research DesignMixed Method Research Design
Mixed Method Research Design
 
Ch05 instrumentation
Ch05 instrumentationCh05 instrumentation
Ch05 instrumentation
 
Rm
RmRm
Rm
 
Mixed methods designs
Mixed methods designs Mixed methods designs
Mixed methods designs
 
Research method
Research methodResearch method
Research method
 
Ppt of mixed method design
Ppt of mixed method designPpt of mixed method design
Ppt of mixed method design
 
Basic introduction to research methods
Basic introduction to research methodsBasic introduction to research methods
Basic introduction to research methods
 
Qualitative vs Quantitative
Qualitative vs QuantitativeQualitative vs Quantitative
Qualitative vs Quantitative
 
Good practice in evaluation research
Good practice in evaluation researchGood practice in evaluation research
Good practice in evaluation research
 
Research methodology methods and techniques 2004
Research methodology   methods and techniques 2004Research methodology   methods and techniques 2004
Research methodology methods and techniques 2004
 
Mixed research
Mixed researchMixed research
Mixed research
 
02 mixed methods designs
02 mixed methods designs02 mixed methods designs
02 mixed methods designs
 
Scale development khalid-key concepts
Scale development khalid-key conceptsScale development khalid-key concepts
Scale development khalid-key concepts
 

Similaire à Day 6&7

MethodsofDataCollection.pdf
MethodsofDataCollection.pdfMethodsofDataCollection.pdf
MethodsofDataCollection.pdfMohdTaufiqIshak
 
Class 6 research quality in qualitative methods 3 2-17
Class 6 research quality in qualitative methods 3 2-17Class 6 research quality in qualitative methods 3 2-17
Class 6 research quality in qualitative methods 3 2-17tjcarter
 
MethodsofDataCollection.pdf
MethodsofDataCollection.pdfMethodsofDataCollection.pdf
MethodsofDataCollection.pdfssuser9878d0
 
MIXED METHODS REPORT ON MASTERS DEGREE YEAR 2023-2024 THE PHILIPPINE SETTINGS
MIXED METHODS REPORT ON MASTERS DEGREE YEAR 2023-2024 THE PHILIPPINE SETTINGSMIXED METHODS REPORT ON MASTERS DEGREE YEAR 2023-2024 THE PHILIPPINE SETTINGS
MIXED METHODS REPORT ON MASTERS DEGREE YEAR 2023-2024 THE PHILIPPINE SETTINGSMhenAcenas
 
Research methodology
Research methodologyResearch methodology
Research methodologyAnkita Kunwar
 
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...Institute of Development Studies
 
GBS MSCBDA - Dissertation Guidelines.pdf
GBS MSCBDA - Dissertation Guidelines.pdfGBS MSCBDA - Dissertation Guidelines.pdf
GBS MSCBDA - Dissertation Guidelines.pdfStanleyChivandire1
 
research design
 research design research design
research designkpgandhi
 
RESEARCH APPROACHES AND DESIGNS.pptx
RESEARCH APPROACHES AND DESIGNS.pptxRESEARCH APPROACHES AND DESIGNS.pptx
RESEARCH APPROACHES AND DESIGNS.pptxPRADEEP ABOTHU
 
Topic 4 Contemporary Research Design.pptx (2).pdf
Topic 4 Contemporary Research Design.pptx (2).pdfTopic 4 Contemporary Research Design.pptx (2).pdf
Topic 4 Contemporary Research Design.pptx (2).pdfSASIALRAJAMoe
 
R.M Evaluation Program complete research.pptx
R.M Evaluation Program complete research.pptxR.M Evaluation Program complete research.pptx
R.M Evaluation Program complete research.pptxtalhaaziz78
 
chapter 3, guide in making research papers
chapter 3, guide in making research paperschapter 3, guide in making research papers
chapter 3, guide in making research papersRONALDARTILLERO1
 
Designing and Planning a Research.pptx
Designing and Planning a Research.pptxDesigning and Planning a Research.pptx
Designing and Planning a Research.pptxDrHafizKosar
 
Community engagement - what constitutes success
Day 6&7
  • 1. MIXED METHOD EVALUATION Dawit Wolde ( MSc, Lecturer).ICME-JU College of Health Sciences of Jimma University E-mail:dave86520@gmail.com or dawit818@yahoo.com Cell phone:(+251)-922489558/967657712 P.O.Box:378,Jimma University Jimma Ethiopia
  • 2. Presentation objectives At the end of the presentations participants will able to: Define Mixed Method Evaluation Differentiate between Quantitative and Qualitative Evaluation methods Explain rationale for MM Evaluation Discuss on decision for designing MM Evaluation Describe data collection and analysis techniques for MM Evaluation Clarify approaches of Mixing Methods Discuss challenges/Limitations of MM Evaluation Design Mixed Method Evaluation 27/02/2016 MM Evaluation 2
  • 3. Presentation outline • Definition and basic concepts of MM Evaluation • Quantitative(QUANT) Vs. Qualitative(QUAL) Evaluation methods • Rationale for MM Evaluation • Decisions for designing MM Evaluation • Data collection and analysis techniques in MM Evaluation • Approaches for MM Evaluation • Limitations for MM Evaluation 27/02/2016 MM Evaluation 3
  • 4. Training methods • Interactive lectures, • Group discussion(exercises), • Plenary presentations  Allocated time:12 hours(1/2 days) 27/02/2016 MM Evaluation 4
  • 5. BRAINSTORMING Q What is Mixed Method Evaluation? What will be mixed? (10 minutes) 27/02/2016 MM Evaluation 5
  • 6. Definition and basic concepts of MM Evaluation • Mixed Method Evaluation is a methodology for conducting evaluation that involves collecting, analyzing, and integrating (or mixing) quantitative and qualitative evaluation and/or data in a single study or a longitudinal program of inquiry. • Involves mixing or combining of quantitative and qualitative evaluation techniques, methods, approaches, concepts or language into a single study. • Is based on the claim that both qualitative and quantitative evaluation, in combination, provides a better understanding of a research problem or issue than either evaluation approach alone. 27/02/2016 MM Evaluation 6
  • 7. Definition and basic concepts …. • Mixed-method evaluation is the intentional or planned use of diverse methods for particular mixed-method purposes. • For example: If the purpose is to determine the effect level of our program and how and why effect emerged- randomized control trial / quasi experimental design, and case study design can be used. • In MM approach to evaluation, methods or designs were integrated throughout the evaluation process including theory development, data collection, analysis and interpretation. 27/02/2016 MM Evaluation 7
  • 8. Definition and basic concepts …. • Some of the common areas in which mixed-method approaches may be used include: o Initiating, designing, developing and expanding interventions; o Evaluation; o Improving research design; and o Corroborating findings, data triangulation or convergence 27/02/2016 MM Evaluation 8
  • 9. GROUP EXERCISES 1 Be in your previous group and differentiate between Quantitative(QUANT) and Qualitative(QUAL) Evaluation/Research method with the following attributes:  Nature of reality  Purpose  Research approach  Subjectivity and Objectivity  Group studied  Variables used  Type of data collected  Data collection techniques  Type of data analysis  Results Then present outcome of your discussion to participants(20 minutes). 27/02/2016 MM Evaluation 9
  • 10. Quantitative vs. Qualitative research methods
Purpose: QUANT tests hypotheses, examines cause and effect, and makes predictions; QUAL seeks to understand and interpret social interactions.
Group studied: QUANT uses larger, randomly selected groups; QUAL uses smaller, non-randomly selected groups.
Variables: QUANT studies specific variables; QUAL studies the whole, not isolated variables.
Research approaches: QUANT uses descriptive, correlational, quasi-experimental and experimental studies; QUAL uses narrative, phenomenology, grounded theory, ethnography and case study.
Type of data collected: QUANT collects numbers and statistics; QUAL collects words, images or objects.
Form of data collected: QUANT collects data based on precise measurements using structured and validated data collection instruments; QUAL collects open-ended responses, interviews, participant observations, field notes and reflections.
Type of data analysis: QUANT identifies statistical relationships; QUAL identifies patterns, features and themes. 27/02/2016 MM Evaluation 10
  • 11. Quantitative vs. Qualitative research methods (continued)
Objectivity and subjectivity: in QUANT, objectivity is critical; in QUAL, subjectivity is expected.
Role of researcher: in QUANT, the researcher and their biases are not known to participants, and participant characteristics are deliberately hidden from the researcher (double-blind studies); in QUAL, the researcher and their biases may be known to participants, and participant characteristics may be known to the researcher.
Results: QUANT produces generalizable findings that can be applied to other populations; QUAL produces particular or specialized findings that are less generalizable.
Scientific method: QUANT is confirmatory or top-down; the researcher tests hypotheses and theory with the data (deductive reasoning). QUAL is exploratory or bottom-up; the researcher generates new hypotheses and theory from the data collected (inductive reasoning).
View of human behavior: QUANT sees behavior as regular and predictable; QUAL sees it as dynamic, situational, social and personal.
Most common research objectives: QUANT seeks to describe, explain and predict; QUAL seeks to explore, discover and construct.
Focus: QUANT uses a narrow-angle lens and tests a specific hypothesis; QUAL uses a wide-angle lens and examines the breadth and depth of phenomena. 27/02/2016 MM Evaluation 11
  • 12. Quantitative vs. Qualitative research methods (continued)
Nature of observation: QUANT studies behavior under controlled conditions and isolates causal effects; QUAL studies behavior in its natural environment.
Nature of reality: QUANT assumes a single, objective reality; QUAL assumes multiple, subjective realities.
Final report: QUANT produces a statistical report with correlations, comparisons of means and the statistical significance of findings; QUAL produces a narrative report with contextual description and direct quotations from research participants. 27/02/2016 MM Evaluation 12
  • 13. Rationale for MM Evaluation Group exercise 2: • Be in your previous group and discuss the reasons for using Mixed Method Evaluation. • Then present the outcome of your discussion to participants (20 minutes). 27/02/2016 MM Evaluation 13
  • 14. Rationale for MM Evaluation • The major reasons for conducting a mixed method evaluation are: 1. Triangulation of evaluation findings: refers to enhancing the validity or credibility of evaluation findings by comparing information obtained from different methods of data collection. - Uses different methods to address the same phenomenon. - Seeks convergence, corroboration and correspondence of results from the different methods. - Design example: concurrent triangulation designs. 2. Development: refers to using the results of one method to help develop the sample or instrumentation of another. - Design examples: sequential exploratory and sequential transformative designs. 27/02/2016 MM Evaluation 14
  • 15. Rationale … 3. Complementarity: refers to extending the comprehensiveness of evaluation findings through results from different methods that broaden and deepen the understanding reached. - Uses different methods to address different parts of the phenomenon. - Seeks elaboration, enhancement, illustration and clarification of results. - Design examples: sequential exploratory, sequential explanatory and sequential transformative designs; concurrent nested and concurrent transformative designs can also be used. 27/02/2016 MM Evaluation 15
  • 16. Rationale … 4. Initiation: refers to generating new insights into evaluation findings through results from the different methods that diverge and thus call for reconciliation through further analysis, reframing or a shift in perspective. - Looks for contradictory results and uses different data collection methods to explain the discrepancy. - Seeks the discovery of paradox and contradiction, and new perspectives or frameworks. - Design examples: concurrent nested and concurrent transformative designs. 5. Value diversity: refers to incorporating a wider diversity of values through the use of different methods that advance different values. - Uses different methods to address different parts of the phenomenon. - Seeks to extend the breadth and range of inquiry. - Design examples: sequential exploratory, sequential transformative, concurrent nested and concurrent transformative designs. 27/02/2016 MM Evaluation 16
  • 17. Rationale …. Operational benefits realized by using mixed-method designs or data collection strategies: - Reveal (disclose) unanticipated results. - Provide a deeper understanding of why change is or is not occurring as planned. - Enable the evaluator to capture a wider range of perspectives than might be reached by a single method. - Provide flexibility for the evaluator in choosing the most appropriate method. 27/02/2016 MM Evaluation 17
  • 18. Rationale …. - Strengthen the representativeness of in-depth qualitative studies (by linking case study selection to the quantitative sampling frame). - Improve the generalizability and credibility of the evaluation conclusions. - Help examine the interactions among the complex and changing contextual factors that can influence program implementation and impacts. - Provide information to improve the sufficiency of the program. - Ensure buy-in from both QUANT- and QUAL-oriented evaluators and users. 27/02/2016 MM Evaluation 18
  • 20. Basic steps in Program M&E 1. Engage stakeholders 2. Describe the program 3. Focus the evaluation design 4. Gather credible evidence 5. Justify conclusions 6. Ensure use and share lessons learned. Source: CDC's framework for program evaluation in public health, 1999. Evaluation standards: o Utility o Feasibility o Propriety o Accuracy. The evaluation standards require gathering credible, correct information (Accuracy) and ensuring that the information is used (Utility); MM Evaluation supports both. Using multiple sources of data is one strategy for ensuring that evaluation users perceive the findings as credible, which in turn enhances the use of findings. 27/02/2016 MM Evaluation 20
  • 21. Designing MM Evaluation • In planning an MM evaluation approach, four decisions are required: a) Decision 1: At which stage or stages of the evaluation will MM be used? An MM design is much stronger if QUANT and QUAL approaches are integrated into several (or ideally all) stages of the evaluation. b) Decision 2: Will QUANT and QUAL methods be used sequentially or concurrently? c) Decision 3: Will QUANT and QUAL methods be given relatively equal weight, or will one methodology be dominant? d) Decision 4: Will the design be single- or multilevel? 27/02/2016 MM Evaluation 21
  • 22. Designing MM Evaluation… 27/02/2016 MM Evaluation 22 Fig 1: Single-level MM design (sequential, with a dominant QUANT approach). Studying inter-household transfers as a survival strategy for low-income households in Cartagena, Colombia (Source: Michael Bamberger, 2012).
  • 23. Designing MM Evaluation… 27/02/2016 MM Evaluation 23 Fig 2: Single-level MM design (sequential, with a dominant QUAL approach). Evaluating the adoption of new seed varieties by different types of farmers (Source: Michael Bamberger, 2012).
  • 24. Designing MM Evaluation… 27/02/2016 MM Evaluation 24 Fig 3: Multilevel MM design. Evaluating the effect of a school feeding program on attendance and performance (Source: Michael Bamberger, 2012).
  • 25. Designing MM Evaluation • The steps for designing and implementing an MM evaluation follow procedures similar to those used for either QUANT or QUAL evaluations. • The steps include: o Developing evaluation questions o Matching questions with appropriate information-gathering techniques o Collecting data o Analyzing the data o Providing information to interested audiences 27/02/2016 MM Evaluation 25
  • 26. GROUP EXERCISE 3 Be in your previous group: identify the stages at which MM evaluation can be applied and clarify how to apply it at each stage. Then present the outcome of your discussion to participants (20 minutes). 27/02/2016 MM Evaluation 26
  • 27. Designing MM Evaluation…. • An MM approach can be applied during the following stages of an evaluation or research process: I. Formulation of hypotheses II. Sampling III. Evaluation design IV. Data collection V. Triangulation VI. Data analysis 27/02/2016 MM Evaluation 27
  • 28. Designing MM Evaluation…. I. Formulation of hypotheses • QUANT evaluation usually draws hypotheses deductively from existing theories or literature reviews, • while QUAL evaluation develops hypotheses inductively as the study evolves. • MM combines both approaches. • For example: a hypothesis developed deductively using a QUANT approach can be explored and refined through QUAL approaches such as interviews and observations. In contrast, an initial stage of QUAL data collection may describe processes and issues that a QUANT approach can then test through data collected in a sample survey. 27/02/2016 MM Evaluation 28
  • 29. Designing MM Evaluation…. II. Sampling • QUAL evaluation uses a small number of subjects selected purposively (theoretical sampling) to ensure that all groups are covered. • QUANT evaluation uses a relatively large, randomly selected sample, permitting generalization to the larger population and statistical comparison of different groups. • MM sampling uses the same sampling frame both to generate a large QUANT survey sample and to select a small but representative sample for in-depth QUAL analysis, as sketched below. • Ensuring that the QUAL samples are reasonably representative of the total sample population is one of the most important contributions of MM designs. 27/02/2016 MM Evaluation 29
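To make the shared-frame idea concrete, here is a minimal sketch in Python/pandas using an entirely hypothetical sampling frame and made-up column names (household_id, income_group). It is not part of the original training material; it only illustrates one way a QUANT survey sample and a stratified QUAL subsample could be drawn from the same frame.

```python
import pandas as pd

# Hypothetical sampling frame of 1,000 households with a stratification variable.
frame = pd.DataFrame({
    "household_id": range(1, 1001),
    "income_group": ["low"] * 400 + ["middle"] * 400 + ["high"] * 200,
})

# QUANT: a simple random sample of 300 households for the structured survey.
survey_sample = frame.sample(n=300, random_state=1)

# QUAL: a small subsample for in-depth interviews, drawn from the same survey
# sample and stratified by income group so the cases stay broadly representative.
qual_subsample = (
    survey_sample.groupby("income_group", group_keys=False)
    .apply(lambda g: g.sample(n=min(5, len(g)), random_state=1))
)

print(qual_subsample["income_group"].value_counts())
```

Because the qualitative cases are drawn from the quantitative sample itself, their representativeness can later be checked against the full survey.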
  • 30. Designing MM Evaluation…. III. Evaluation design • For example: use of a QUAL approach to evaluate the project implementation process and the influence of contextual variables on project performance in communities where a QUANT survey of project participants is being conducted. IV. Data collection • QUANT evaluations collect standardized numerical data, whereas QUAL evaluations often use less structured data collection methods that provide greater flexibility and seek to understand the complexities of a situation. • MM data collection builds on the strength of QUANT data while digging deeper, capturing sensitive data, and studying processes and behavioral change. 27/02/2016 MM Evaluation 30
  • 31. Designing MM Evaluation…. V. Triangulation • MM evaluations tend to use triangulation more systematically and as an integral part of the evaluation design. • Information obtained through triangulation is used to: o enhance the reliability and validity of estimates of key indicators by comparing information from different sources; o deepen the understanding of the meaning of statistical relationships identified in the quantitative analysis; and o ensure that the perspectives of all key stakeholders, with particular emphasis on poor and vulnerable groups, are captured and compared. • If estimates obtained from different sources are consistent, this increases the validity and credibility of the data; a simple comparison is sketched below. o What if the estimates differ??? 27/02/2016 MM Evaluation 31
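As a rough illustration of this kind of triangulation, the following sketch uses invented figures and source names to compare estimates of the same indicator from different sources and flag larger discrepancies for follow-up; the 10-point threshold is arbitrary and would be set by the evaluation team.

```python
# Hypothetical estimates of one key indicator from three data sources.
indicator = "households using an improved water source (%)"
estimates = {
    "structured survey": 62.0,
    "facility records": 58.5,
    "direct observation": 41.0,
}

# Compare each source against the survey estimate and flag large gaps.
reference = estimates["structured survey"]
for source, value in estimates.items():
    gap = abs(value - reference)
    status = "consistent" if gap <= 10 else "flag for follow-up"
    print(f"{indicator} | {source}: {value:.1f} ({status})")
```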
  • 32. Designing MM Evaluation…. VI. Data analysis • In MM Evaluation both analysis techniques are employed. o QUAL analysis helps to understand the meaning that different subjects or groups give to the statistical associations found in the QUANT analysis and provides cases and examples to illuminate the findings. o QUANT analysis is used to assess how well the cases included in the QUAL studies represent the total population of interest and which sectors, if any, have not been covered. 27/02/2016 MM Evaluation 32
  • 33. GROUP EXERCISE 4 Be in your previous group and: • Identify data collection techniques in MM Evaluation • Discuss their strengths and weaknesses • Clarify the basis for choosing among them. • Then present the outcome of your discussion to participants (30 minutes). 27/02/2016 MM Evaluation 33
  • 34. Designing MM Evaluation…. • In any particular evaluation, the choice among data collection techniques and strategies depends on answers to the following questions: o Who is the information for, and who will use the findings of the evaluation? o What kind of information is needed? o How is the information to be used? For what purpose is the evaluation being done? o When is the information needed? o What resources are available to conduct the evaluation? 27/02/2016 MM Evaluation 34
  • 35. Designing MM Evaluation…. • Data are commonly collected through both qualitative and quantitative methods. • Qualitative approaches aim to address the 'how' and 'why' of a program and tend to use unstructured methods of data collection to fully explore the topic. - For example, they try to answer: 'Why do participants enjoy the program?' and 'How does the program help increase self-esteem for participants?' • Quantitative approaches, on the other hand, address the 'what' of the program. - They use a systematic, standardized approach and - ask questions such as 'What activities did the program run?' and 'What skills do staff need to implement the program effectively?' 27/02/2016 MM Evaluation 35
  • 36. Designing MM Evaluation…. Commonly used data collection techniques
Quantitative: structured surveys; structured observation guides; program MIS input and output data; review of institutional data (clinic records, school records, etc.).
Qualitative: in-depth interviews; key informant interviews; group interviews (focus groups, community meetings, etc.); unstructured observation (participant and non-participant); video or audio recordings; photography; document analysis; artifacts/objects; participatory group techniques (e.g., participatory rural appraisal (PRA) and most significant change (MSC)). 27/02/2016 MM Evaluation 36
  • 37. Designing MM Evaluation…. a) Observation: • Helps to gather firsthand data on the programs, processes, or behaviors being studied. • Provides evaluators with an opportunity to collect data on a wide range of behaviors, to capture a great variety of interactions, and to openly explore the evaluation topic. • By directly observing operations and activities, the evaluator can develop a holistic perspective, i.e., an understanding of the context within which the project operates. • Also allows the evaluator to learn about things the participants or staff may be unaware of, or that they are unwilling or unable to discuss in an interview or focus group. 27/02/2016 MM Evaluation 37
  • 38. Designing MM Evaluation…. When to use observations? • Can be useful during both the formative and summative phases of evaluation • For example, during the formative phase, observations can be useful in determining whether or not the project is being delivered and operated as planned. • Observations during the summative phase of evaluation can be used to determine whether or not the project is successful. 27/02/2016 MM Evaluation 38
  • 39. Designing MM Evaluation…. • Types of information for which observations are a good source: 27/02/2016 MM Evaluation 39
The setting: the physical environment within which the project takes place.
The human and social environment: the ways in which all actors (staff, participants, others) interact and behave toward each other.
Project implementation activities: what goes on in the life of the project? What do the various actors (staff, participants, others) actually do? How are resources allocated?
The native language of the program: different organizations and agencies have their own language or jargon to describe the problems they deal with in their work; capturing the precise language of all participants is an important way to record how staff and participants understand their experiences.
Nonverbal communication: nonverbal cues about what is happening in the project, such as the way participants dress, express opinions, physically space themselves during discussions, and arrange themselves in their physical setting.
Notable nonoccurrences: determining what is not occurring although it was expected to occur as planned by the project team, or noting the absence of some particular activity or factor that is noteworthy and would serve as added information.
  • 40. Designing MM Evaluation…. How many observations? • In participant observation this may be a moot point (except with regard to data recording), but when an outside observer is used, the question of "how much" becomes very important. • While most people agree that one observation is not enough, there is no hard and fast rule regarding how many samples need to be drawn. • Recommendations: o avoid atypical situations, o carry out observations more than once, and o where possible and relevant, spread the observations out over time. • Participant observation is often difficult to incorporate in evaluations; therefore, the use of outside observers is far more common. 27/02/2016 MM Evaluation 40
  • 41. Designing MM Evaluation…. Advantages of observations: o Provide direct information about the behavior of individuals and groups o Permit the evaluator to enter into and understand the situation/context o Provide good opportunities for identifying unanticipated outcomes o Take place in natural, unstructured, and flexible settings Disadvantages of observations: o Expensive and time consuming o Need well-qualified, highly trained observers, who may also need to be content experts o May affect the behavior of participants (Hawthorne effect) o Selective perception of the observer may distort the data o The investigator has little control over the situation o The behavior or set of behaviors observed may be atypical 27/02/2016 MM Evaluation 41
  • 42. Designing MM Evaluation…. b) Interviews: • Interviews provide very different data from observations o They capture the perspectives of project participants, staff, and others associated with the project o They rest on the assumption that participants' perspectives are meaningful, knowable, and able to be made explicit, and that their perspectives affect the success of the project o Compared to a survey, an interview is selected when interpersonal contact is important and when opportunities for follow-up of interesting comments are desired. 27/02/2016 MM Evaluation 42
  • 43. Designing MM Evaluation…. Types of interviews: • Two types of interviews are commonly used in evaluation research: o Structured interviews o In-depth (unstructured) interviews 27/02/2016 MM Evaluation 43
  • 44. Designing MM Evaluation…. In-depth interview: • Is a dialogue between a skilled interviewer and an interviewee • Its goal is to elicit rich, detailed material that can be used in analysis (Lofland and Lofland, 1995). • Such interviews are best conducted face to face, although in some situations telephone interviewing can be successful. • In-depth interviews are characterized by extensive probing and open-ended questions. 27/02/2016 MM Evaluation 44
  • 45. Designing MM Evaluation…. When to use in-depth interviews? • In-depth interviews can be used at any stage of the evaluation process. • They are especially useful in answering questions such as those suggested by Patton (1990): o What does the program look and feel like to the participants? To other stakeholders? o What are the experiences of program participants? o What do stakeholders know about the project? o What thoughts do stakeholders who are knowledgeable about the program have concerning its operations, processes, and outcomes? o What are participants' and stakeholders' expectations? o What features of the project are most salient to the participants? o What changes do participants perceive in themselves as a result of their involvement in the project? 27/02/2016 MM Evaluation 45
  • 46. Designing MM Evaluation…. • Specific circumstances for which in-depth interviews are particularly appropriate include: o Complex subject matter; o Detailed information sought; o Busy, high-status respondents; and o Highly sensitive subject matter. 27/02/2016 MM Evaluation 46
  • 47. Designing MM Evaluation…. Advantages of in-depth interviews: • Usually yield the richest data, details, and new insights • Permit face-to-face contact with respondents • Provide the opportunity to explore topics in depth • Afford the ability to experience the affective as well as cognitive aspects of responses • Allow the interviewer to explain or clarify questions, increasing the likelihood of useful responses • Allow the interviewer to be flexible in administering the interview to particular individuals or circumstances Disadvantages of in-depth interviews: • Expensive and time-consuming • Need well-qualified, highly trained interviewers • Interviewees may distort information through recall error, selective perception, or a desire to please the interviewer • Flexibility can result in inconsistencies across interviews • The volume of information can be very large and difficult to transcribe and reduce 27/02/2016 MM Evaluation 47
  • 48. Designing MM Evaluation…. c) Focus groups: • Combine elements of both interviewing and participant observation • Focus groups capitalize on group dynamics. • The hallmark of focus groups is the explicit use of group interaction to generate data and insights that would be unlikely to emerge without the interaction found in a group. • The technique inherently allows observation of group dynamics, discussion, and firsthand insights into the respondents' behaviors, attitudes, language, etc. • A focus group is a gathering of 8 to 12 people who share some characteristics relevant to the evaluation (it can be smaller, depending on the interaction). 27/02/2016 MM Evaluation 48
  • 49. Designing MM Evaluation…. When to use focus groups? • Focus groups are useful in answering the same type of questions as in-depth interviews, except in a social context. • Specific applications of the focus group method in evaluations include: o identifying and defining problems in project implementation; o identifying project strengths, weaknesses, and recommendations; o assisting with interpretation of quantitative findings; o obtaining perceptions of project outcomes and impacts; and generating new ideas. 27/02/2016 MM Evaluation 49
  • 50. Designing MM Evaluation…. What type of groups? • The participants are usually a relatively homogeneous group of people. • Answering the question, "Which respondent variables represent relevant similarities among the target population?" requires some thoughtful consideration when planning the evaluation. • Respondents' social class, level of expertise, age, cultural background, and sex should always be considered. 27/02/2016 MM Evaluation 50
  • 51. Designing MM Evaluation…. How many groups? • Determining how many groups requires balancing cost and information needs. • A good rule of thumb is to conduct at least two groups for every variable considered to be relevant to the outcome (sex, age, educational level, etc.). 27/02/2016 MM Evaluation 51
  • 52. Other Qualitative Methods Reading assignment: • Less common but potentially useful qualitative methods for project evaluation include: o Document studies (public records and personal records), o Key informants, o Alternative (authentic) assessment or performance assessment, and o Case studies. 27/02/2016 MM Evaluation 52
  • 53. Designing MM Evaluation…. Summary: advantages and disadvantages of MM data collection techniques 1. Qualitative Advantages: • Good for further exploring the effects and unintended consequences of a program Disadvantages: • Expensive and time consuming to implement • The findings cannot be generalized to participants outside the program and are only indicative of the group involved 2. Quantitative Advantages: • Cheaper to implement • Standardized, so comparisons can easily be made • The size of the effect can usually be measured Disadvantages: • Limited in their capacity to investigate and explain similarities and unexpected differences Recommended: combining both techniques 27/02/2016 MM Evaluation 53
  • 54. GROUP EXERCISE 5 Be in your previous group and discuss the types of triangulation used in MM Evaluation. Then present the outcome of your discussion to participants (20 minutes). 27/02/2016 MM Evaluation 54
  • 55. Designing MM Evaluation…. Types of triangulation used in MM Evaluation
Different conceptual frameworks: comparing feminist, human rights, social exclusion or economic (e.g., cost-benefit) analysis frameworks.
Different methods of data collection (triangulation by data collection technique): comparing structured surveys, direct observation, secondary data and artifacts.
Different interviewers: comparing the effect of interviewer sex, age, ethnicity, economic status, form of dress, language, etc. on responses.
Different times (triangulation by time): comparing responses or observations at different times of day, days of the week, or times of year.
Different locations and contexts: comparing responses and observations when interviews are conducted in the home while other people are present, in locations where respondents may be able to speak more freely, in the street and other public places, at work, or in the classroom. 27/02/2016 MM Evaluation 55
  • 56. GROUP EXERCISE 6 Be in your previous group and discuss the data analysis techniques used in MM Evaluation. Then present the outcome of your discussion to participants (15 minutes). 27/02/2016 MM Evaluation 56
  • 57. Designing MM Evaluation…. Data analysis techniques in MM Evaluation
Parallel mixed method data analysis: two separate analysis processes. QUANT data are analyzed using conventional QUANT methods (frequency tables, cross-tables, regression analysis, etc.) while a separate analysis of QUAL data is conducted using QUAL methods such as content analysis. The findings of the two sets of analysis are then compared.
Conversion mixed method data analysis: QUAL data are converted into QUANT indicators ("quantitizing") using rating, scoring and scaling so that QUANT analysis techniques can be used; or QUANT data are converted to QUAL indicators ("qualitizing") so that QUAL analysis procedures can be used. A quantitizing sketch follows below.
Sequential mixed method data analysis: (a) QUAL data analysis is followed by QUANT analysis; (b) QUANT data analysis is followed by QUAL analysis; or (c) iterative MM designs, in which the analysis includes sequential QUANT and QUAL steps.
Multilevel mixed method analysis: QUANT and QUAL analysis techniques are used at different levels of a multilevel evaluation design. 27/02/2016 MM Evaluation 57
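As an illustration of the conversion ("quantitizing") approach, the sketch below uses hypothetical coded interview data and made-up variable names; it simply turns the presence of a qualitative theme into a 0/1 indicator that can enter a conventional quantitative cross-tabulation or regression.

```python
import pandas as pd

# Hypothetical interview cases: "themes" holds codes from the content analysis,
# "completed_program" is a quantitative outcome for the same participants.
cases = pd.DataFrame({
    "participant": ["P1", "P2", "P3", "P4", "P5"],
    "themes": [["cost barrier"], ["peer support"],
               ["cost barrier", "distance"], ["peer support"], ["distance"]],
    "completed_program": [0, 1, 0, 1, 1],
})

# Quantitize: convert the presence of a theme into a 0/1 indicator variable.
cases["mentions_cost_barrier"] = cases["themes"].apply(
    lambda t: int("cost barrier" in t)
)

# The quantitized variable can now be cross-tabulated with the QUANT outcome.
print(pd.crosstab(cases["mentions_cost_barrier"], cases["completed_program"]))
```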
  • 58. APPROACHES TO MIXED METHOD EVALUATION Day 2
  • 59. Types of Mixed Method approaches to Evaluation • Based on the timing of data collection, the emphasis given to each type of data collected, and the mixing approach used, MM evaluation is classified into six designs: o Sequential 1. Sequential explanatory design 2. Sequential exploratory design 3. Sequential transformative design o Concurrent 1. Concurrent triangulation design 2. Concurrent nested/embedded design 3. Concurrent transformative design 27/02/2016 MM Evaluation 59
  • 60. Sequential explanatory design • Quantitative data are collected and analyzed first, followed by the collection and analysis of qualitative data. • Qualitative and quantitative data are not combined (mixed) in the data analysis; rather, integration takes place when the findings are interpreted. • Qualitative data are used to enhance, complement, and in some cases follow up on unexpected quantitative findings. • Strengths: separate phases of design, data collection, and reporting for the qualitative and quantitative data (easy to implement). • Weaknesses: the time and resources needed for separate data collection phases, and the expertise needed to integrate the qualitative and quantitative findings. 27/02/2016 MM Evaluation 60
  • 61. Sequential explanatory design…. More weight is given to the quantitative component: QUAN data collection → QUAN data analysis → qual data collection → qual data analysis → interpretation of the entire analysis. 27/02/2016 MM Evaluation 61
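One practical step in this design is choosing whom to interview in the qual phase once the QUAN results are in. The sketch below is a hypothetical example (invented survey data and column names, not part of the original material) of selecting follow-up interviewees from respondents whose quantitative results were unexpected.

```python
import pandas as pd

# Hypothetical QUAN survey results from the first phase.
survey = pd.DataFrame({
    "respondent": [f"R{i}" for i in range(1, 11)],
    "attended_training": [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
    "adopted_practice":  [0, 0, 1, 0, 0, 1, 1, 0, 0, 0],
})

# Unexpected pattern worth explaining: trained respondents who did not adopt.
unexpected = survey[(survey["attended_training"] == 1) &
                    (survey["adopted_practice"] == 0)]

# Invite a handful of these cases to in-depth interviews in the qual phase.
follow_up = unexpected.sample(n=min(3, len(unexpected)), random_state=1)
print(follow_up["respondent"].tolist())
```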
  • 62. Sequential exploratory design • The reverse of the sequential explanatory design, with quantitative data used to enhance and complement qualitative results. • This approach is especially useful when the researcher's interest is in enhancing generalizability, and it may or may not be guided by a theoretical perspective. • Instrument development is a typical example of this approach. • Its strengths and weaknesses are similar to those of the sequential explanatory design. 27/02/2016 MM Evaluation 62
  • 63. Sequential exploratory design… More weight is given to the qualitative component: QUAL data collection → QUAL data analysis → quan data collection → quan data analysis → interpretation of the entire analyses. 27/02/2016 MM Evaluation 63
  • 64. Example: sequential combination patterns in explanatory and exploratory designs 27/02/2016 MM Evaluation 64 An evaluation conducted to determine whether a WASH project is leading to a higher rate of hand washing in a particular community.
  • 65. Examples of explanatory and exploratory designs Case I: Using one method to explain the findings of another method (explanatory) • An evaluation intended to measure the extent of implementation, and the factors that affect implementation, of a youth vocational training project can use both quantitative and qualitative methods. • First, the quantitative method determines the extent of implementation of the project; next, the qualitative method explores why the project is or is not being implemented in the way intended. 27/02/2016 MM Evaluation 65
  • 66. Examples of explanatory and exploratory designs…. Case II: Using one method to inform the design of another method (exploratory) • In some cases, one method can be used to help guide the use of another method, or to explain the findings from another method. For example: • Imagine that the evaluation of a youth vocational training project includes the evaluation question: "Why do youth choose to participate in project activities?" • The evaluator may wish to conduct a survey of participants, but be unsure how to word the questions, or what answer choices to include. By first conducting individual and focus group interviews with participants and non-participants, the evaluator may be able to identify some common reasons for participation among the target population, and then use these data to construct the survey. • In this way, the qualitative methods (individual and focus group interviews), conducted first, can inform the quantitative method (the survey) that comes afterward. Because this use of mixed-method evaluation requires each method to be sequenced, one after the other, these methods are often incorporated into mixed-method evaluations using sequential processes. • Again, the design choice has time and resource implications. 27/02/2016 MM Evaluation 66
  • 67. Sequential transformative design • Either qualitative or quantitative data may be collected first. • Once again, qualitative and quantitative data are analyzed separately, and the findings are integrated during the interpretation phase. • This approach is often used to ensure that the views and perspectives of a diverse range of participants are represented or when a deeper understanding of a process that is changing as a result of being studied is sought. • Its strengths and weaknesses are similar to those of the sequential explanatory design. 27/02/2016 MM Evaluation 67
  • 69. Sequential transformative design…. Case III: Ensuring that the views and perspectives of a diverse range of participants are represented • For example, an evaluation of a high school counseling project found that counselors were misguiding students in choosing their college courses because students were being advised differently: some counselors followed the standards-based curriculum and others the traditional one. As a result, some students were advised to begin their college mathematics with a course they should not take. Although the majority of students disagreed with the recommendations, most still followed the advice on which mathematics course to take. To discover and understand students' experiences with the advising process and its implications for their college experience (for the adversely affected students), a case study approach was conducted among purposively sampled students (who began their college mathematics course-taking at different difficulty levels). The information obtained from interviews and students' academic records could then be used to inform the construction of a survey to be sent to a representative sample of students. 27/02/2016 MM Evaluation 69
  • 70. Concurrent triangulation design • Used when the focus is on confirming, cross-validating, or corroborating findings from a single study. • Qualitative and quantitative data are collected concurrently, such that weaknesses of one kind of data are ideally offset by strengths of the other kind. • Typically, equal weight is given to the two kinds of data in mixing the findings, although one kind of data can be weighted more heavily. • The qualitative and quantitative data are analyzed separately, and mixing takes place when the findings are interpreted. 27/02/2016 MM Evaluation 70
  • 71. Concurrent triangulation design… 27/02/2016 MM Evaluation 71 QUAN and QUAL strands run concurrently: QUAN data collection → QUAN data analysis, and QUAL data collection → QUAL data analysis; the data and results are then compared.
  • 72. Concurrent triangulation design… 27/02/2016 MM Evaluation 72 Analysis will take place separately.
  • 73. Concurrent triangulation design… Strength: • the ability to maximize the information provided by a single study (for example, when interest is in cross-validation), and a shorter data collection period compared to the sequential data collection approaches. Weaknesses: • the additional complexity associated with collecting qualitative and quantitative data at the same time and the expertise needed to usefully apply both methods. • Discrepancies between the qualitative and quantitative findings may also be difficult to reconcile. 27/02/2016 MM Evaluation 73
  • 74. Examples of concurrent triangulation design… Case IV: Using different methods to answer different questions, or different parts of the same question • For example, an evaluation of an ICCM program intended to answer the following questions: - Is there a statistically significant difference in early care-seeking behavior for sick children between communities (caretakers) served by health posts with ICCM services and those without? - How do caretakers perceive the quality of the ICCM services provided? • To answer the first question a quasi-experimental (pre-post) design can be employed, and for the second question (perception) a case study design can be used; a sketch of the quantitative comparison follows. 27/02/2016 MM Evaluation 74
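For the first (QUANT) question in Case IV, a simple comparison of proportions could look like the sketch below. The counts are invented for illustration, and a real analysis would also account for the pre-post, clustered nature of the design.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of caretakers: [sought care early, did not seek care early]
with_iccm = [180, 70]      # communities served by health posts with ICCM
without_iccm = [140, 110]  # comparison communities without ICCM

# Chi-square test of the difference in early care-seeking between the groups.
chi2, p_value, dof, expected = chi2_contingency([with_iccm, without_iccm])
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")

# The second question (perceived quality of ICCM services) would be answered
# with the qualitative case-study component, not with this test.
```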
  • 75. Examples of concurrent triangulation design… Case V: Using different methods to answer the same question • For example, evaluators may use secondary data from health institutions to measure the implementation status of long-term family planning methods among clients after implementation of the project. But they may also suspect that health institutions are either under- or over-reporting. To help mitigate the risk of bias caused by under- or over-reporting in the government data, the evaluation team may conduct in-depth interviews or FGDs with key informants and FP clients to obtain a more accurate picture of how the project was implemented and how it reached program clients. • The data generated by the qualitative method help to provide a broad picture of the implementation of the project and of how well clients are accepting it. 27/02/2016 MM Evaluation 75
  • 76. Concurrent nested/Embedded design • Qualitative and quantitative data are collected concurrently and analyzed together during the analysis phase. • Greater weight is given to one kind of data, in the sense that one kind of data is typically embedded in the other. • Qualitative and quantitative data are mixed in the analysis phase, a process that can take many different forms. • Four strategies to mix qualitative and quantitative data in the analysis stage: o Data transformation o Typology development o Extreme case analysis o Data consolidation/merging 27/02/2016 MM Evaluation 76
  • 77. Concurrent nested/embedded design… Data transformation: o Qualitative data are transformed into quantitative data, or quantitative data are transformed into narrative, and the resulting data are analyzed. o Typically, the transformed qualitative data exhibit a nominal or ordinal scale of measurement. Typology development: o The analysis of one kind of data produces a typology or set of categories that is used as a framework in analyzing the other kind of data. o The analysis of the qualitative data could produce themes that allow a variable with nominally scaled categories to be developed, in which the categories explain why things happened the way they did (or did not). o This variable could then be used in the quantitative analysis. 27/02/2016 MM Evaluation 77
  • 78. Concurrent nested/embedded design… Extreme case analysis: o Extreme cases identified with one kind of data are examined with the other kind, with the goal of explaining why these cases are extreme. o For example, statistical outliers identified through quantitative data can be explained with qualitative data during analysis (see the sketch below). Data consolidation/merging: o A careful review of both kinds of data leads to the creation of new variables or data sets expressed in a qualitative or quantitative metric. o The merged data are then used in additional analyses. o A review of the qualitative and quantitative data may suggest new variables. 27/02/2016 MM Evaluation 78
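The outlier idea in extreme case analysis can be made concrete with a small sketch like the one below, using hypothetical site data and an arbitrary 1.5-standard-deviation cut-off; the flagged sites would then be studied qualitatively to explain why they are extreme.

```python
import pandas as pd

# Hypothetical QUANT indicator by site.
sites = pd.DataFrame({
    "site": ["A", "B", "C", "D", "E", "F", "G", "H"],
    "immunization_coverage": [72, 75, 70, 74, 73, 35, 71, 96],
})

# Standardize the indicator and flag cases far from the mean.
mean = sites["immunization_coverage"].mean()
std = sites["immunization_coverage"].std()
sites["z_score"] = (sites["immunization_coverage"] - mean) / std

# Sites more than 1.5 SD from the mean become candidates for in-depth
# qualitative follow-up to explain why they are extreme.
extreme_cases = sites[sites["z_score"].abs() > 1.5]
print(extreme_cases[["site", "immunization_coverage", "z_score"]])
```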
  • 79. Concurrent nested/embedded design… 27/02/2016 MM Evaluation 79 Either a dominant QUAN strand with an embedded qual component, or a dominant QUAL strand with an embedded quan component, followed by analysis of the findings.
  • 80. Concurrent nested/embedded design… 27/02/2016 MM Evaluation 80 QUAN pre-test data and results → intervention → QUAN post-test data and results, with an embedded qual process (before, during and after the trial) → interpretation.
  • 81. Concurrent nested/embedded design…. Strengths: • The shorter data collection period and the multiple perspectives embedded in the data. Weaknesses: • The level of expertise needed to execute the study successfully, especially in mixing the qualitative and quantitative data within the data analysis, and difficulties in reconciling conflicting results from the qualitative and quantitative analyses. 27/02/2016 MM Evaluation 81
  • 82. Concurrent transformative design • Qualitative and quantitative data are collected concurrently and can be weighted equally or unequally during the integration of findings. • The design may have one method embedded in the other so that diverse participants are given a voice in the change process of an organization. • Qualitative and quantitative data are typically mixed during the analysis phase. • The strengths and weaknesses of this approach are similar to those of the other concurrent approaches. 27/02/2016 MM Evaluation 82
  • 83. Concurrent transformative design…. 27/02/2016 MM Evaluation 83 Either QUAN + QUAL together, or a dominant QUAL strand with an embedded quan component, in both cases guided by a vision, advocacy stance, ideology or framework.
  • 84. Concurrent transformative design…. Strengths: • A shorter data collection period. Weaknesses: • The need to transform data so that they can be mixed in the analysis phase, and difficulties in reconciling conflicting results from the qualitative and quantitative data. 27/02/2016 MM Evaluation 84
  • 85. GROUP EXERCISE 7 In your group, discuss: 1. Operational considerations in deciding between sequential and concurrent designs. 2. Limitations or challenges of MM Evaluation. Then present the outcome of your discussion to participants (20 minutes). 27/02/2016 MM Evaluation 85
  • 86. Challenges in using mixed methods in evaluations - It can be difficult to ensure scientific rigor in the evaluation. - It is difficult to ensure that the two data collection methods complement but do not duplicate each other, so that the cost of gathering evaluation information is as efficient as possible. - The methodological mix requires evaluators to have the skills and abilities of both approaches: QUANT and QUAL. - It is much more expensive and complex than either approach alone, which might influence evaluation sponsors. - Conflicting results sometimes occur; these can cause disagreement and be difficult to interpret. - Paradigm differences might create differences among stakeholders. 27/02/2016 MM Evaluation 86
  • 87. Challenges in using mixed methods in evaluations… Summary: - Increases the complexity of evaluations: they are complex to plan and conduct. - Relies on a multidisciplinary team of researchers: quantitative and qualitative. - Requires increased resources: MM evaluations are labor intensive and require greater resources and time than a single-method study. 27/02/2016 MM Evaluation 87
  • 88. Summary of section: features of MM Evaluation
Timing: concurrent or sequential.
Integration: in the data analysis phase (connecting, transforming or separating) and/or in the interpretation phase (separating, connecting or merging).
Purpose: triangulation, complementarity, development, initiation and/or value diversity.
Priority: equal or unequal. 27/02/2016 MM Evaluation 88
  • 89. Summary of section: rationale and approaches of MM Evaluation
Sequential explanatory design: complementarity.
Sequential exploratory design: development, complementarity and/or expansion.
Sequential transformative design: complementarity, development and/or expansion/value diversity.
Concurrent triangulation design: triangulation.
Concurrent nested design: complementarity, initiation and/or expansion/value diversity.
Concurrent transformative design: complementarity, initiation and/or expansion/value diversity. 27/02/2016 MM Evaluation 89
  • 90. GROUP EXERCISE 8 (SECTION 3 END) Be in your previous group and answer the following questions. Then present the outcome of your discussion to participants (60 minutes). 27/02/2016 MM Evaluation 90
  • 91. Section 3 end group exercises • For your evaluation questions from the previous sections (protocol evaluation questions), design an MM approach to the evaluation. 1. Which MM Evaluation approach is appropriate for your evaluation question? Why? 2. At which stage(s) of the evaluation will you apply MM Evaluation? Why? 3. List the appropriate data collection techniques you will employ to answer your evaluation questions. 4. Which MM Evaluation analysis technique will be used? Why? 5. List the type(s) of triangulation you will use, if any. 27/02/2016 MM Evaluation 91
  • 92. Recommended readings o John W. Creswell. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. Second edition. o Jennifer C. Greene. Mixed Methods in Social Inquiry. Jossey-Bass, 2007. o Stefan Cojocaru. Challenges in Using Mixed Methods in Evaluation. Volume 3, September 2013. o Michael Bamberger. Introduction to Mixed Methods in Impact Evaluation. Impact Evaluation Notes, August 2012. o Michael Bamberger. The Mixed Methods Approach to Evaluation. Social Impact Concept Note Series, June 2013. 27/02/2016 MM Evaluation 92