Qualitative Research Methods
(QRM) – Unit 5
(Reliability & Validity)
Dr. Shriram S. Dawkhar
SIBAR, Kondhwa, Pune.
Syllabus
5. Quality Criteria in Qualitative Research:
Reliability, Validity, Objectivity, Alternative
Criteria, Criteria for Evaluating the Building of
Theories, Quality Assessment as a Challenge
for Qualitative Research, Triangulation,
Analytic Induction, Generalization in
Qualitative Research, The Constant
Comparative Method, Process Evaluation and
Quality Management. (5)
Let's first understand Reliability
and Validity for Quantitative
Research
Evaluating Measurement Tools
We have to test the validity and reliability of our research instrument (e.g., a questionnaire).
Criteria: Validity, Reliability, Practicality
Tests of sound measurement
• Sound measurement must meet the tests of validity, reliability, and practicality.
• Validity: the instrument is able to measure what it is designed to measure.
• Reliability: every time the instrument is used, it should give the same result.
• Practicality: the instrument is easy and practical to use.
• These are the three major considerations one should use in evaluating a measurement tool.
1) Test of Validity
• Validity refers to the extent to which a test
measures what we actually wish to measure.
• Validity is the most critical criterion and
indicates the degree to which an instrument
measures what it is supposed to measure.
• Validity can also be thought of as utility.
• In other words, validity is the extent to
which differences found with a measuring
instrument reflect true differences among
those being tested.
Reliability
• A measure is reliable to the degree that it supplies
consistent results.
• Reliability is a necessary contributor to validity
but is not a sufficient condition for validity.
• It is concerned with estimates of the degree to
which a measurement is free of random or
unstable error.
• Reliable instruments are robust and work well at
different times under different conditions. This
distinction of time and condition is the basis for
three perspectives on reliability – stability,
equivalence, and internal consistency.
Reliability Estimates
• Stability
• Equivalence
• Internal Consistency
Practicality
• Economy
• Convenience
• Interpretability
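To make the three reliability perspectives above concrete, here is a minimal sketch in Python using hypothetical Likert-scale questionnaire data (all numbers are illustrative, not from any real study): stability is estimated as a test-retest correlation and internal consistency as Cronbach's alpha.

```python
# Minimal illustrative sketch (hypothetical data, not from any real study):
# two of the three reliability perspectives for a questionnaire --
# stability (test-retest correlation) and internal consistency (Cronbach's alpha).
import numpy as np

def test_retest_stability(totals_t1, totals_t2):
    """Stability: correlate total scores from two administrations of the same instrument."""
    return np.corrcoef(totals_t1, totals_t2)[0, 1]

def cronbach_alpha(item_scores):
    """Internal consistency: respondents as rows, items as columns."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]                          # number of items
    item_vars = x.var(axis=0, ddof=1)       # variance of each item
    total_var = x.sum(axis=1).var(ddof=1)   # variance of the total score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses: 5 respondents x 4 Likert items, administered twice
time1 = np.array([[4, 5, 4, 5],
                  [2, 2, 3, 2],
                  [5, 4, 5, 5],
                  [3, 3, 2, 3],
                  [1, 2, 1, 2]])
time2 = np.clip(time1 + np.random.default_rng(0).integers(-1, 2, time1.shape), 1, 5)

print("Stability (test-retest r):", round(test_retest_stability(time1.sum(axis=1), time2.sum(axis=1)), 2))
print("Internal consistency (Cronbach's alpha):", round(cronbach_alpha(time1), 2))
```

Equivalence (agreement between alternate forms or raters) would be estimated analogously, by correlating scores from the two forms or coders.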
Reliability & Validity
• Qualitative vs. Quantitative
“The quantitative study must convince the reader that
procedures have been followed faithfully because very little
concrete description of what anyone does is provided.
The qualitative study provides the reader with a depiction in
enough detail to show that the author’s conclusion ‘makes
sense’. . . The quantitative study portrays a world of variables
and static states. By contrast the qualitative study describes
people acting in events.”
(Firestone, 1987, p. 19)
Reliability & Validity : Challenges
1. What can you possibly tell from an n (sample) of 1?
• 2. What is it worth to just get the researcher’s
interpretation of what is taking place?
• 3. How can you generalize from a small, nonrandom
sample?
Reliability & Validity : Challenges
• 4. If the researcher is the primary
instrument for data collection and analysis,
how can we be sure the researcher is a valid
and reliable instrument?
• 5. How will you know when to stop
collecting data?
• 6. Isn’t the researcher biased and just
finding out what he or she expects to find?
Reliability & Validity : Challenges
• 7. Without hypotheses, how will you know what
you’re looking for?
• 8. Don’t people lie to field researchers?
• 9. Doesn’t the researcher’s presence result in a
change in participants’ normal behavior, thus
contaminating the data?
• 10. If somebody else did the study, would they
get the same results?
Reliability & Validity
• Criteria
✓Some argue that qualitative researchers
should consider validity and reliability from
the philosophical assumptions underlying the
chosen paradigm.
Reliability & Validity
• Qualitative vs. Quantitative Criteria
Quantitative        →  Qualitative
Internal Validity   →  Credibility
External Validity   →  Transferability
Reliability         →  Dependability
Objectivity         →  Confirmability
Important Slide
Reliability & Validity : 1) Credibility:
•Credibility: the extent to which interpretations can be
validated as true, correct, and dependable
•Is the study believable from the perspective of those
observed, does it ring true to the people studied?
• Is the data complete?
• Some argue one can never truly discover the reality of a
situation and should look for what’s credible instead.
•Credibility:
- Validity is relative
- Humans are closer to the truth than if a data
collection instrument had been interjected between
us and the participants
Reliability & Validity : 1) Credibility:
•1) Use triangulation
•2) Have adequate engagement in
data collection, saturation
•3) Employ reflexivity
Reliability & Validity : 1) Credibility: How to bring it
1) Use triangulation to overcome inherent flaws
- data
- investigator
- interdisciplinary
- theory
2) Have adequate engagement in data collection, saturation
- Holistic
- Look for data that supports alternate explanations
Reliability & Validity : 1) Credibility: How to bring it
• 3) Employ reflexivity
Reflexivity: the process of reflecting
critically upon the self as the
researcher, the “human instrument”
Reliability & Validity: 2) Transferability:
•Transferability: degree to which the results
can be applied to other settings/situations
• - Researcher supplies thick (detailed)
descriptions
• - Pays careful attention to the
sample
•Transferability:
• “In qualitative research, a single case or
small, nonrandom, purposeful sample is
selected precisely because the researcher
wishes to understand the particular in
depth, not to find out what is generally true
of the many.” (Merriam, 2009, p. 224)
Reliability & Validity: 3) Dependability:
•Dependability: concerned with whether or
not the findings can be duplicated/repeated
•Describes changes in the setting
and how those changes affected the research
•Dependability:
✓ Difficult because human behavior is
constantly changing
- many interpretations
- no benchmarks or static means of
measurement
- similarity of answers does not
ensure accuracy
Reliability & Validity: 3) Dependability:
•Dependability:
✓ More important to ask whether the
results are consistent with the data collected
Reliability & Validity: 3) Dependability: How to achieve it
✓Strategies to help with dependability:
- triangulation (discussed in Unit 1)
- peer examination
- investigator’s position
- audit trail
Audit trail: independent readers can authenticate
the findings of a study by following the trail of
the researcher
Reliability & Validity: 4) Confirmability
•Confirmability: the degree to which the
results can be corroborated (Validated) by
others
• - Results should be well-reasoned
• - The results of the study vs. the
• researcher’s bias?
Reliability & Validity: 4) Confirmability
•Confirmability:
✓One concern is reactivity
• - How the act of observation
changes a situation
Reliability & Validity
✓Different criteria apply to
different methods
e.g., in narrative analysis look for what
“tells” a persuasive story in a narrative
way, vs. the thick description needed in an
ethnography of a cultural group
Objectivity
•Objectivity is considered as an ideal for scientific
inquiry, as a good reason for valuing scientific
knowledge, and as the foundation of the authority of
science in society.
•It expresses the thought that the claims, methods, and
results of science are not, or should not be, influenced
by particular perspectives, value commitments,
community bias, or personal interests, to name a few
significant factors.
•Scientific objectivity is a feature of scientific claims,
methods and results.
Objectivity in Qualitative Research
• The obstacles special to the social sciences are caused by the
special involvement of the investigator with his topic of study,
which relates to both his interests and his emotional make-up.
• Even though achieving complete objectivity in science is an
impossibility, aiming at it, or attaining as much of it as
reasonably possible, is a necessary condition for the conduct
of all scientific inquiry.
• The only way in which we can strive for ‘objectivity’ in
theoretical analysis is to expose the valuations to full light,
make them conscious, specific, and explicit, and permit them
to determine the theoretical research.
• A more balanced view of objectivity, both as a method
and as an ideal, must be considered.
Evaluating Research
1. Are the methods of research appropriate to the
nature of the question being asked?
2. Is the connection to an existing body of
knowledge or theory clear?
3. Are there clear accounts of the criteria used for
the selection of cases for study and of the data
collection and analysis?
• 4. Does the sensitivity of the methods match the
needs of the research question?
• 5. Were the data collection and record keeping
systematic?
• 6. Is reference made to accepted procedures for
analysis?
Evaluating Research
• 7. How systematic is the analysis?
• 8. Is there adequate discussion of how themes,
concepts and categories were derived from the
data?
• 9. Is there adequate discussion of the evidence
for and against the researcher’s arguments?
• 10. Is a clear distinction made between the data
and their interpretation?
Evaluating Research
Criteria for Evaluating the Building of Theories
• Corbin and Strauss (1990) mention four points of
departure for judging empirically grounded theories and
the procedures that led to them. According to their
suggestion, you should critically assess
• 1) the validity, reliability, and credibility of the data,
• 2) the plausibility and value of the theory itself,
• 3) the adequacy of the research process which has
generated, elaborated, or tested the theory, and
• 4) the empirical grounding of the research findings.
For evaluating the research process
itself, they suggest seven criteria:
• Criterion 1 How was the original sampling
selected? On what grounds (selective sampling)?
• Criterion 2 What major categories emerged?
• Criterion 3 What were some of the events,
incidents, actions, and so on that indicated some
of these major categories?
For evaluating the research process itself, they
suggest seven criteria
• Criterion 4 On the basis of what categories did
theoretical sampling proceed? That is, how did
theoretical formulations guide some of the data
collection? After the theoretical sampling was
carried out, how representative did these
categories prove to be?
• Criterion 5 What were some of the hypotheses
pertaining to relations among categories? On
what grounds were they formulated and tested?
• Criterion 6 Were there instances when hypotheses did not
hold up against what was actually seen? How
were the discrepancies accounted for? How did
they affect the hypotheses?
• Criterion 7 How and why was the core category
selected? Was the selection sudden or gradual,
difficult or easy? On what grounds were the final
analytic decisions made?
Criteria for Evaluating the Building of
Theories
• A central role is given to the question of whether
the findings and the theory are grounded in the
empirical relations and data—that is, whether it is
a grounded theory or not.
• For an evaluation of the realization of this aim,
Corbin and Strauss suggest seven criteria for
answering the question of the empirical grounding
of findings and theories:
Criteria for Evaluating the Building of
Theories
• Criterion 1 Are concepts generated?
• Criterion 2 Are the concepts systematically related?
• Criterion 3 Are there many conceptual linkages and
are the categories well developed? Do the categories
have conceptual density?
• Criterion 4 Is there much variation built into the
theory?
• Criterion 5 Are broader conditions that affect the
phenomenon under study built into its
explanation?
• Criterion 6 Has "process" been taken into
account?
• Criterion 7 Do the theoretical findings seem
significant and to what extent? (1990, pp. 17-18)
Criteria for Theory Development
in Qualitative Research
• 1 The degree to which generic/formal theory is
produced.
• 2 The degree of development of the theory.
• 3 The novelty of the claims made.
• 4 The consistency of the claims with empirical
observations and the inclusion of representative
examples of the latter in the report.
• 5 The credibility of the account to readers and/or
those studied.
• 6 The extent to which findings are transferable to
other settings.
• 7 The reflexivity of the account: the degree to
which the effects on the findings of the
researcher and of the research settings employed
are assessed and/or the amount of information
about the research process that is provided to
readers.
Quality Assessment as a Challenge for
Qualitative Research
• The question of how to assess the quality
of qualitative research is currently raised in three
respects.
• First, by the researchers who want to check and secure
their procedures and their results.
• Second, by the consumers of qualitative research—the
readers of publications or the funding agencies, who
want to assess what has been presented to them;
• and finally in the evaluation of research in reviewing
research proposals and in peer reviews of manuscripts
submitted to journals
• ("Are the results credible and
appropriate?"),
• "Do they address the research question(s)?"
• "Data collection procedures are fully
explained
Triangulation (explained in detail in Unit 1),
Analytic Induction,
Generalization in Qualitative
Research
Analytic Induction
• Znaniecki (1934) introduced analytic induction.
This strategy explicitly starts from a specific case.
According to Buhler-Niederberger it can be
characterized as follows:
• Analytic induction is a method of systematic
interpretation of events, which includes the
process of generating hypotheses as well as
testing them. Its decisive instrument is to analyze
the exception, the case, which is deviant to the
hypothesis. (1985, p. 476)
Steps of Analytic Induction
• 1 A rough definition of the phenomenon to be
explained is formulated.
• 2 A hypothetical explanation of the phenomenon
is formulated.
• 3 A case is studied in the light of this hypothesis
to find out whether the hypothesis corresponds to
the facts in this case.
• 4 If the hypothesis is not correct, either the
hypothesis is reformulated or the phenomenon to
be explained is redefined in a way that excludes
this case.
Steps of Analytic Induction
• 5 Practical certainty can be obtained after a small
number of cases have been studied, but the
discovery of each individual negative case by the
researcher or another researcher refutes the
explanation and calls for its reformulation.
• 6 Further cases are studied, the phenomenon is
redefined, and the hypotheses are reformulated
until a universal relation is established; each
negative case calls for redefinition or
reformulation.
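Read purely as a procedure, these steps form a loop over cases. The sketch below is a schematic illustration only; the functions hypothesis_fits and revise are hypothetical stand-ins for the researcher's interpretive work, not an automation of it.

```python
# Schematic sketch of the analytic-induction loop. The helper functions are
# hypothetical placeholders for interpretive judgments made by the researcher.
from dataclasses import dataclass

@dataclass
class Explanation:
    definition: str     # step 1: rough definition of the phenomenon
    hypothesis: str     # step 2: hypothetical explanation

def hypothesis_fits(explanation: Explanation, case: dict) -> bool:
    """Step 3 (hypothetical check): does the hypothesis correspond to the facts of this case?"""
    return case.get("supports_hypothesis", False)

def revise(explanation: Explanation, case: dict) -> Explanation:
    """Steps 4-5 (hypothetical revision): reformulate the hypothesis, or redefine the
    phenomenon so that the deviant case is excluded or accounted for."""
    return Explanation(explanation.definition,
                       explanation.hypothesis + f" [revised after case {case.get('id')}]")

def analytic_induction(cases: list, explanation: Explanation) -> Explanation:
    # Step 6: keep studying further cases; every negative case forces a reformulation.
    for case in cases:
        if not hypothesis_fits(explanation, case):
            explanation = revise(explanation, case)
    return explanation   # tentative universal relation once no negative case remains
```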
Generalization in Qualitative
Research
• The generalization of concepts and relations found
from analysis is another strategy of grounding
qualitative research.
• The central points to consider in such an evaluation
are first the analyses and, second, the steps taken to
arrive at more or less general statements.
• The problem of generalization in qualitative
research is that its statements are often made for a
certain context or specific cases and based on
analyses of relations, conditions, processes, etc., in
them.
Generalization in Qualitative
Research
• However, when attempts are made at
generalizing the findings, this context
link has to be given up in order to find
out whether the findings are valid
independently of and outside specific
contexts.
Generalization in Qualitative Research
• Correspondingly, various possibilities are discussed for mapping
out the path from the case to the theory in a way that will allow you
to reach at least a certain generalization.
• A first step is to clarify which degree of generalization you are
aiming at and which is possible to obtain with the concrete study, in
order to derive appropriate claims for generalization.
• A second step is the cautious integration of different cases and
contexts in which the relations under study are empirically
analyzed.
• The generalizability of the results is often closely linked to the way
the sampling is done.
• The third step is the systematic comparison of the collected
material. Here again, the procedures for developing grounded
theories can be drawn on.
The Constant Comparative Method
• In the process of developing theories, and
additional to the method of "theoretical
sampling", Glaser (1969) suggests the constant
comparative method as a procedure for
interpreting texts.
The Constant Comparative
Method
• “It basically consists of four stages:
• (1) comparing incidents applicable to each
category,
• (2) integrating categories and their properties,
• (3) delimiting the theory, and
• (4) writing the theory" (1969, p. 220).
The Constant Comparative
Method
• Although this method is a continuous
growth process—each stage after a time
transforms itself into the next—previous
stages remain in operation throughout the
analysis and provide continuous
development to the following stage until the
analysis is terminated. (1969, p. 220)
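As a rough mechanical analogy only (the actual method is interpretive), the sketch below illustrates stage 1, comparing each new incident with incidents already grouped into categories; the code-overlap similarity function and the threshold are hypothetical assumptions, not part of Glaser's method.

```python
# Rough mechanical analogy for stage 1 of the constant comparative method:
# each incoming incident is compared with incidents already assigned to
# categories and either joins the closest category or opens a new one.
# The code-overlap similarity measure and the threshold are hypothetical.
def similarity(incident_a: dict, incident_b: dict) -> float:
    """Hypothetical similarity: overlap of the codes attached to two incidents (0..1)."""
    a, b = set(incident_a["codes"]), set(incident_b["codes"])
    return len(a & b) / len(a | b) if (a | b) else 0.0

def constant_comparison(incidents: list, threshold: float = 0.3) -> list:
    categories = []    # each category is a list of incidents sharing properties
    for incident in incidents:
        best, best_score = None, 0.0
        for category in categories:
            score = max(similarity(incident, member) for member in category)
            if score > best_score:
                best, best_score = category, score
        if best is not None and best_score >= threshold:
            best.append(incident)          # compare and integrate into an existing category
        else:
            categories.append([incident])  # no close category: open a new one
    return categories

# Hypothetical coded incidents from interview transcripts
incidents = [{"id": 1, "codes": {"trust", "doctor"}},
             {"id": 2, "codes": {"trust", "nurse"}},
             {"id": 3, "codes": {"cost", "insurance"}}]
print([[i["id"] for i in cat] for cat in constant_comparison(incidents)])
```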
Process Evaluation and Quality
Management
• A) Process Evaluation :
• Qualitative research is embedded in a process in
a special way. It does not make sense to ask and
answer questions of sampling or concerning
special methods in an isolated way. Whether a
sampling is appropriate can only be answered
with regard to the research question, to the
results, and to the generalizations that are aimed
at and the methods used.
Process Evaluation :
• Abstract measures like the representativeness of a sample, which
can be judged generally, do not have any benefit here.
• A central starting point for answering such questions is the
sounding of the research process, which means whether the
sampling that was applied harmonizes with the concrete research
question and with the concrete process.
• Activities for optimizing qualitative research in the concrete case
have to start from the stages of the qualitative research process.
Correspondingly, note a shift in the accent of evaluating
qualitative methods and their use from mere evaluation of the
application to process evaluation.
• Thus, the aspect of grounding is shifted to the level of the research
process. The aim of this shift is also to underscore a different
understanding of quality in qualitative research and to relate it to a
concrete project.
Quality Management
• Impulses for further developments can be provided by the
general discussion about quality management (Kamiske and
Brauer 1995), which lies mainly in the areas of industrial
production but also of public services (Murphy 1994).
• But some of the concepts and strategies used in this discussion
may be adopted to promote a discussion about quality in
research, which is appropriate to the issues and research
concepts.
• 1) The concept of auditing: it provides a first point of connection: "An
audit is understood as a systematic, independent examination of
an activity and its results, by which the existence and
appropriate application of specified demands are evaluated and
documented" (Kamiske and Brauer 1995, p. 5).
Quality Management
• In particular, the "procedural audit" is
interesting for qualitative research.
• It should guarantee that "the pre-defined
demands are fulfilled and are useful for the
respective application .... Priority is always
given to an enduring remedy of causes of
mistakes, not only a simple detection of
mistakes" .
Quality Management
• 2) Concepts like "member checks" or
communicative validation explicitly take
this orientation into account.
Quality Management
• Designing the research process and
proceeding in a way which gives enough
room to those who are studied realizes this
orientation implicitly.
• For an evaluation, both aspects may be
analyzed explicitly: how far did the study
proceed in such a way that it answered its
research question?
Quality Management
• Make sure that the definition of the goals and standards of the
project is as clear as possible, and that all researchers and
co-workers integrate themselves in this definition.
• Define how these goals and standards and, more generally, the
quality are to be obtained; finally, a consensus about the way to
apply certain methods (perhaps through joint interview training)
and their analysis is a precondition for quality in the research process.
• Provide a clear definition of the responsibilities for obtaining
quality in the research process.
• Allow transparency of the judgment and assessment of
quality in the process.
References
• 1. An Introduction to Qualitative Research, Uwe Flick, 4th Edition, SAGE.
• 2. Research Methods in the Social Sciences, Bridget Somekh & Cathy Lewin, 5th Edition, SAGE India.
• https://www.slideshare.net/shoeb786/objectivity-in-social-science-research