Evidence based practice
This page offers a starting point for finding information on Evidence Based Practice (EBP).
There are many definitions of EBP with differing emphases. A survey of social work faculty even showed they have different ideas about just what makes up EBP; this can be a source of confusion for students and newcomers to this topic of study. Perhaps the best known is Sackett et al.'s (1996, pp. 71-72) now dated definition
from evidence based medicine: "Evidence based medicine is the
conscientious, explicit, and judicious use of current best evidence in
making decisions about the care of individual patients. The practice
of evidence based medicine means integrating individual clinical
expertise with the best available external clinical evidence from
systematic research. By individual clinical expertise we mean the
proficiency and judgment that individual clinicians acquire through
clinical experience and clinical practice. Increased expertise is
reflected in many ways, but especially in more effective and efficient
diagnosis and in the more thoughtful identification and
compassionate use of individual patients' predicaments, rights, and
preferences in making clinical decisions about their care. By best
available external clinical evidence we mean clinically relevant
research, often from the basic sciences of medicine, but especially
from patient centered clinical research into the accuracy and
precision of diagnostic tests (including the clinical examination), the
power of prognostic markers, and the efficacy and safety of
therapeutic, rehabilitative, and preventive regimens."
This early definition, however, proved to have some important
limitations in practice. Haynes et al. (2002) - Sackett's colleagues in the McMaster Group of physicians in Canada - pointed out that the definition did not pay enough attention to the traditional
determinants of clinical decisions. That is, it purposefully
emphasized research knowledge but did not equally emphasize the
client's needs and situation, nor the client's stated wishes and
goals, nor the clinicians' expertise in assessing and integrating all
these elements into a plan of intervention.
The contemporary definition of EBP is simply "the integration of
the best research evidence with clinical expertise and patient
values" (Sackett et al., 2000, p. x). This simpler, current definition
gives equal emphasis to 1) the patient's situation, 2) the patient's
goals, values and wishes, 3) the best available research evidence,
and 4) the clinical expertise of the practitioner. The difference is
that a patient may refuse interventions with strong research support
due to differences in beliefs and values. Similarly, the clinician may
be aware of factors in the situation (co-occurring disorders, lack of
resources, lack of funding, etc.) that indicate interventions with the
best research support may not be practical to offer. The clinician
may also notice that the best research was done on a population
different from the current client, making its relevance questionable,
even though its rigor is strong. Such differences may include age,
medical conditions, gender, race or culture and many others.
This contemporary definition of EBP has been endorsed by many
social workers. Gibbs and Gambrill (2002), Mullen and Shlonsky (2004), Rubin (2008), and Drisko and Grady (2012) all apply it in their publications. Social workers often add emphasis to client values and views as a key part of intervention planning. Many social workers also argue that clients should be active participants in intervention planning, not merely recipients of a summary of "what works" from an "expert" (Drisko & Grady, 2012). Actively involving clients in intervention planning may also be a useful way to enhance client motivation and to empower clients.
Some in social work view EBP as a mix of a) learning what treatments "work" based on the best available research (whether experimental or not), b) discussing client views about the treatment to consider cultural and other differences, and to honor client self-determination and autonomy, c) considering the professional's "clinical wisdom" based on work with similar and dissimilar cases that may provide a context for understanding the research evidence, and d) considering what the professional can, and cannot, provide fully and ethically (Gambrill, 2003; Gilgun, 2005). With
much similarity but some differences, the American Psychological
Association (2006, p. 273) defines EBP as "the integration of the
best available research with clinical expertise in the context of
patient characteristics, culture and preferences." Gilgun (2005)
notes that while research is widely discussed, the meanings of
"clinical expertise" and "client values and preferences" have not
been widely discussed and have no common definition.
Drisko & Grady (2012) argue that the EBP practice decision-making process defined by Sackett and colleagues seems to fit poorly with the way health care payers enact EBP at a macro, policy level. Clinical social workers point to lists of approved treatments that will be funded for specific disorders - and note this application of EBP does not include specific client values and preferences and ignores situational clinical expertise. Drisko & Grady point out that there is a conflict between the EBP model and how it is implemented administratively to save costs in health care. While cost savings are very important, this use of "EBP" is not consistent with the Sackett model. Further, the criteria used to develop lists of approved treatments are generally not clear or transparent - or even stated.
Payers very often appear to apply standards that are different from
multidisciplinary sources of systematic reviews of research like the
Cochrane Collaboration. Clinical expertise and client values too
often drop out of the administrative applications of EBP.
Evidence based practice is one useful approach to improving the
impact of practice in medicine, psychology, social work, nursing and
allied fields. Of course, professions have directed considerable
attention to "evidence" for many years (if not for as long as they
have existed!). They have also honored many different kinds of
evidence. EBP advocates put particular emphasis on the results of
large-scale experimental comparisons to document the efficacy of
treatments against untreated control groups, against other
treatments, or both. (See, for example, the University of Oxford's
Hierarchy of Evidence for EBM). They do this because well-conceptualized and completed experiments (also called randomized controlled trials, or RCTs) are a strong way to show that a treatment caused a specific change. The
ability to make cause and effect determinations is the great strength
of experiments. Note that this frames "evidence" in a very specific
and delimited manner. Scholars in social work and other
professions have argued for "Many Ways of Knowing" (Hartman,
1990). They seek to honor the knowledge developed by many
different kinds of research - and to remind clinicians, researchers
and the public that the conceptualization underlying research may
be too narrow and limited. Thus Drisko & Grady (2012) argue that
EBP, as summarized by researchers, may devalue non-
experimental research. Experiments are only as good as the
exploratory research that discovers new concepts, and the
descriptive research that helps in the development of tests and
measures. Emphasizing only experiments ignores the very
premises on which they rest. Finally, note that EBM/EBP
hierarchies of research evidence include many non-experimental
forms of research since experiments for some populations may be
unethical or impractical - or simply don't address the kinds of
knowledge needed in practice.
All the "underpinnings" of experimental research - the quality of conceptualizations, the quality of measures, the clarity and specificity of the treatments used, the quality of the samples studied, and the quality and completeness of the collected data - are assumed to be sound and fully adequate when used to determine "what works." There is also an assumption that the questions framing the research allow for critical perspectives and are fully ethical. Social workers would argue the studies should also sample social diversity well - since diverse kinds of people show up at real-world clinics.
International standards affirm basic ethical principles supporting
respect for persons, beneficence and social justice (see The
Belmont Report.)
Is EBP only about Intervention or Treatment Planning?
No. This may be the most common application of EBP for clinical
social workers, but the EBP process can also be applied to a)
making choices about diagnostic tests and protocols to ensure thorough and accurate diagnosis, b) selecting preventive or harm-
reduction interventions or programs, c) determining the etiology of a
disorder or illness, d) determining the course or progression of a
disorder or illness, e) determining the prevalence of symptoms as
part of establishing or refining diagnostic criteria, f) completing
economic decision-making about medical and social service
programs (University of Oxford Centre for Evidence-based
Medicine, 2011), and even g) understanding how a client
experiences a problem or disorder (Rubin, 2008).
EBP is also not the same as defining empirically supported
treatments (ESTs), empirically supported interventions (ESIs), or
'best practices.' These are different ideas and are based on
different models. These models do not include client values and preferences, nor clinical expertise, as EBP does.
EBP as a Social Movement
While EBP is most often described in terms of a practice decision-
making process, it is also useful to think of it as a much larger social
movement. Drisko and Grady (2012) argue that at a macro-level,
EBP is actively used by policy makers to shape service delivery and
funding. At a mezzo level, EBP is impacting the kinds of
interventions that agencies offer, and even shaping how supervision
is done. Drisko and Grady (2012) also argue that EBP is
establishing a hierarchy of research evidence that is privileging
experimental research over other ways of knowing. Experimental
evidence has many merits, but is not the only way of knowing of use
and importance in social work practice. Finally, the impact of EBP
may alter how both practice and research courses are taught in
social work. There are other aspects of EBP beyond the core
practice decision-making process that are re-shaping social work
practice, social work education, and our clients' lives. As such, it
may be viewed as a public idea or a social movement at a macro
level.
Why Evidence Based Practice or EBP?
It is one step toward making sure each client gets the best service
possible.
Some argue it helps keep your knowledge up to date, supplements clinical judgment, can save time, and, most importantly, can improve care and even save lives. It is a way to balance your own views with large-scale research evidence.
Some say it's unethical to use treatments that aren't known to work.
(Of course, services may need to be so individualized in unique circumstances that knowing "what works" in general may not be the most salient factor in helping any particular client. Still, using
the best available research knowledge is always beneficial.)
Several web sites serve as portals to bodies of research useful to
EBP. The focus of these organizations varies, but the emphasis
remains on (mainly) experimental demonstration of the efficacy of
treatments.
How is EBP Implemented in Practice?
Profiling research that informs professionals and clients about what
works is where evidence based practice starts. These summaries
tell us what we know about treatment and program efficacy based
on experimental work - as well as what we don't know or aren't
really sure about.
Having access to information on what works allows professionals, in
conjunction with clients, to select treatments that are most likely to
be helpful (and least likely to be harmful) before intervention is
begun. Practice evaluation is quite different in that it takes place at the start of treatment, during treatment, and after treatment.
Practice evaluation also uses single case methods rather than large
sample, experimental research designs. EBP and practice
evaluation work together very well, but they have different purposes
and use very different methods.
The creation of "User's Guides" is one way to make the results of
research more available to practitioners. In medicine, the idea is to
get research results to the practitioner in an easy to assimilate
fashion, though this often has a price.
Funding to support EBP is being offered by governments and private/insurance sources.
However, to understand and critically appraise this material, a lot of
methodological knowledge is needed. Sites offering introductions to
the technology of EBP are growing.
How is EBP Taught?
There are some useful resources for Teaching and Learning about
EBP. One fine example is offered by Middlesex University in the
United Kingdom, which includes good information on critical
appraisal of information in EBP.
The State University of New York's Downstate Medical Center offers
a (medically oriented) online course in EBP, including a brief but
useful glossary.
The Major Sources of Research for use in EBP:
The Cochrane Collaboration [ www.cochrane.org ] sets standards
for reviews of medical, health and mental health treatments and
offers "systematic reviews" of related research by disorder. The
Cochrane Reviews offer a summary of international published and
sometimes pre-publication research. Cochrane also offers
Methodological Abstracts to orient researchers and research
consumers alike.
The Campbell Collaboration [ www.campbellcollaboration.org ]
offers reviews of the impact of social service programs. "The
Campbell Collaboration (C2) is an organization that aims to help
people make well-informed decisions about the effects of
interventions in the social, behavioral and educational arenas. C2's
objectives are to prepare, maintain and disseminate systematic
reviews of studies of interventions. C2 acquires and promotes
access to information about trials of interventions. C2 builds
summaries and electronic brochures of reviews and reports of trials
for policy makers, practitioners, researchers and the public."
C2 SPECTR is a registry of over 10,000 randomized and
possibly randomized trials in education, social work and
welfare, and criminal justice.
C2 RIPE [Register of Interventions and Policy
Evaluation] offers researchers, policymakers,
practitioners, and the public free access to reviews and
review-related documents. These materials cover 4
content areas: Education, Crime and Justice, Social
Welfare and Methods.
The United States government also offers treatment guidelines
based on EBP principles at the National Guideline
Clearinghouse. [ http://www.guideline.gov/ ] This site includes
very good information on medication as well as very clear statements
of concern about medications indicated in guidelines which later
prove to have limitations.
The U.S. government provides information on ongoing, government-sponsored clinical trials.
Other Online Resources for EBP and Treatment Guidelines
Derived from EBP Criteria and Procedures:
The American Psychiatric Association offers Practice Guidelines. Please be aware that the number of practice guidelines is small.
Existing guidelines may be up to 50 pages in length. If you are not
allowed to enter via this hyperlink, paste the following URL into your
browser:
http://www.psych.org/psych_pract/treatg/pg/prac_guide.cfm
The Agency for Healthcare Research and Quality also offers
outcome research information. AHRQ offers an alphabetical listing of outcome studies.
Note that there are a growing number of commercial [.com] sites
that offer consultation regarding EBP. It is not always easy to determine their organizational structure and purposes, the basis of
their recommendations and any potential conflicts of interest. In this
regard, the sites of the government and of professional
organizations are "better" resources as their purposes, missions
and funding sources are generally more clear and publicly stated.
References:
American Psychological Association. (2006). APA presidential task force on evidence based practice. Washington, DC: Author.
Dobson, K., & Craig, K. (1998). Empirically supported therapies:
Best practice in professional psychology. Thousand Oaks,
CA: Sage.
Drisko, J. & Grady, M. (2012). Evidence-based practice in clinical
social work. New York: Springer-Verlag.
Elwood, J.M. (2007). Critical appraisal of epidemiological studies
and clinical trials (3rd ed.) New York: Oxford University Press.
Gambrill, E. (2003). Evidence-based practice: Implications for
knowledge development and use in social work. In A. Rosen & E.
Proctor (Eds.), Developing practice guidelines for social work
intervention (pp. 37-58). New York: Columbia University Press.
Gibbs, L. (2003). Evidence-based practice for the helping
professions. New York: Wadsworth.
Gilgun, J. (2005). The four cornerstones of qualitative research.
Qualitative Health Research, 16(3), 436-443.
Howard, M., McMillen, C., & Pollio, D. (2003). Teaching evidence-
based practice: Toward a new paradigm for social work education.
Research on Social Work Practice, 13, 234-259.
Mace, C., Moorey, S., & Roberts, B. (Eds.). (2001). Evidence in the
psychological therapies: A critical guide for practitioners.
Philadelphia, PA: Taylor & Francis.
Mantzoukas, S. (2008). A review of evidence-based practice,
nursing research and reflection: Levelling the hierarchy. Journal of
Clinical Nursing, 17(2), 214-223.
Roberts, A., & Yeager, K. (Eds.). (2004). Evidence-based practice
manual: Research and outcome measures in health and human
services. New York: Oxford University Press.
Sackett, D., Rosenberg, W., Muir Gray, J., Haynes, R., & Richardson, W. (1996). Evidence-based medicine: What it is and what it isn't. British Medical Journal, 312, 71-72. http://cebm.jr2.ox.ac.uk/ebmisisnt.html
Sackett, D., Richardson, W., Rosenberg, W., & Haynes, R. (1997).
Evidence-based medicine: How to practice and teach EBM. New
York: Churchill Livingstone.
Simpson, G., Segall, A., & Williams, J. (2007). Social work
education and clinical learning: Reply to Goldstein and Thyer.
Clinical Social Work Journal, 35, 33-36.
Smith, S., Daunic, A., & Taylor, G. (2007). Treatment fidelity in applied educational research: Expanding the adoption and application of measures to ensure evidence-based practice. Education & Treatment of Children, 30(4), 121-134.
Stout, C., & Hayes, R. (Eds.). (2005). The evidence-based practice:
Methods, models, and tools for mental health professionals.
Hoboken, NJ: Wiley.
Stuart, R., & Lilienfeld, S. (2007). The evidence missing from
evidence-based practice. American Psychologist, 62(6), 615-616.
Trinder, L., & Reynolds, S. (2000). Evidence-based practice: A
critical appraisal. New York: Blackwell.
Wampold, B. (2007). Psychotherapy: The humanistic (and
effective) treatment. American Psychologist, 62(8), 857-873.
text copyright by J. Drisko - page begun 3/11/04; updated 9/24/12