4. What do we consider evidence?
• Prospective economic analysis?
• Retrospective evaluation?
• Single RCT?
• Meta-analysis of RCTs?
• Non-experimental evaluations?
• Observational studies?
• Descriptive statistics and performance metrics?
• Constituent observation?
• Expert testimony?
• Case study?
5. Example: Saccharin as a
Hazardous Waste
• In 1980, EPA listed saccharin as a hazardous waste based on
the available scientific literature
• If we stop here, was the policy evidence-based?
• In 2000, the toxicology community determined that the
studies on which EPA’s decision had relied involved
biological mechanisms identified in rats but not
present in humans
• If we stop here, was the policy still evidence-
based?
• A public request led EPA to reconsider the
information underlying the listing
• In 2010, EPA de-listed saccharin as a hazardous
waste
• An interesting case study in evidence…
6. Evidence Defined
There are many definitions. How does your audience
define evidence?
For example:
“Evidence is any information that can be used to come
to a conclusion and support a judgment or…to make
decisions that will improve…policies, actions, and
outcomes.”
-- N. Bennett, “Using Perceptions as Evidence to Improve
Conservation and Environmental Management,”
Conservation Biology, 2016
7. Observations on Nomenclature
• A social construct
• Definitions influenced by epistemological and
ontological perspectives
• Context-dependent
• Ongoing dialogue in U.S. and internationally about
what constitutes “high quality” evidence
8. GAO’s Rules of Evidence
• Competence: Was the methodology used to collect
the evidence executed competently by qualified
professionals?
• Relevance: Does the evidence address the question?
• Sufficiency: Is the evidence convincing to the
customers/to a reasonable person?
Rules of Evidence from the GAO Yellow Book
10. What does this mean for
Qualitative Approaches?
• Research and context inform the methods
• “High quality” evidence can depend on the audience’s
perception of credibility, reliability, and validity
12. Word of Caution for Conducting
Case Studies
“Do case studies, but do them with the
understanding that your methods will be
challenged from rational (and irrational)
perspectives and that the insights resulting from
your case studies may be underappreciated”
– Robert Yin, p. xiii
13. What is a Case Study?
• A method for considering complex characteristics
and relationships of real-life events
• "A method for learning about a complex instance,
based on a comprehensive understanding of that
instance obtained by extensive description and
analysis of that instance taken as a whole and in its
context.” – GAO 1990
• Can be explanatory or descriptive
• Multiple case studies provide a means for cross-case
analysis (e.g., good/bad examples)
14. Classic Case Study Example:
Cuban Missile Crisis
• Allison & Zelikow developed 3 explanatory case
studies of the Missile Crisis:
– Rational actor
– Complex bureaucracy
– Politically motivated groups
• Demonstrated that case studies can serve
explanatory purposes, and that the approach transfers
to many types of complex governmental actions
15. Another Example:
Evaluation at EPA
• Hart (2016) developed 3 case studies of EPA programs to
understand what affects the agency’s ability to produce
program evaluation
• Cross-case analysis enabled synthesis of how factors
interacted in different situations
17. Reasons for Using Case Studies for
Program Justification & Reporting
1. Linkages between funding and results may not
be well understood
2. Contributing and mediating factors, including
lags, may be outside scope of control
3. Other methods may not be available based on
cost, time, etc.
18. “Failure” Cases
• Innovations are mixed with incidents of
“failures”
• They are invaluable learning experiences when
we take an honest look
19. “Failure” Cases
• e.g., EPA’s use of damage cases, NASA accident
investigations, NTSB investigations
20. Success “Stories”
• Help identify best practices for implementation
• Can aid in explaining how results are achieved
• Facilitate replication of “what works” in
specific contexts
21. Example: Malnutrition in Vietnam
• Jerry Sternin opened a new program in 1990 to
address malnutrition in Vietnam, without any
existing knowledge of the problem (or the
language)
• “We had no idea what we were going to do”
• Conventional wisdom: bad sanitation, high poverty,
ignorance about nutrition
Example from Heath and Heath, Switch
22. Example: Malnutrition in Vietnam
• Sent out researchers to villages to observe health
• Learned it’s possible to be in a poor, rural village
and have quality nutrition
• But why?
23. Example: Malnutrition in Vietnam
• Observed difference between homes:
– Homes with healthy kids mixed shrimp and sweet-
potato greens into foods, otherwise considered
“poor” foods
• Took the lesson and applied the practice through
community-level training
• Six months after the program started training
households, 2/3 of kids were better-nourished
24. Example: EPA Non-Point Source
Waterbodies Restored
• EPA’s CWA Sec. 319 program is developing case
studies for successful restorations
• Relates to justification of an “Agency Priority
Goal”
• EPA describes the purposes of the case studies as:
– Allowing states with successes to be highlighted
– Better enabling EPA to track restorations
• EPA provides states with templates and tools to
facilitate development of the cases
25. Example: EPA Non-Point Source
Waterbodies Restored (cont’d)
• The template:
– describe the problem
– highlight the project
– articulate the results
26. Example: EPA Non-Point Source
Waterbodies Restored (cont’d)
• Because of EPA’s level of documentation, these
data can be used for a number of other
purposes and studies
• For example, Hart (2016) analyzed water quality
outcomes in states that prioritized funding for
319 activities
27. Advice for Using Case Studies to
Discuss Program Results
• Know your audience
• Be prepared to explain why specific cases are
chosen
• Consider working with independent entities to
review, validate, and encourage objectivity in
case selection
– Aim to avoid the perception that cases are selected
only to look good, or that important, less favorable
details are masked
28. Advice for Using Case Studies to
Discuss Program Results (cont’d)
• Develop mixed methods case studies; rely on
multiple data sources for theme convergence to
improve validity (triangulation)
– Include appropriate descriptive statistics and relevant
data points
• Find compelling cases, but avoid temptation to
use the “best case” as a “representative case” –
maintain credibility
– The best story may not be the best case study
• Compare cases to identify differences in results
and articulate contextual differences
30. What is the Commission?
• CEP is the result of discussions between Congress and the
Executive Branch on improving how government uses
survey and administrative data
• Making better use of administrative data has tremendous
potential to improve how government programs operate
• Created by bipartisan legislation co-sponsored by Speaker
Paul Ryan and Senator Patty Murray, enacted March 30,
2016 (P.L. 114-140)
31. Who are the Commissioners?
• 15-member bipartisan commission of researchers,
administrators, and privacy experts, appointed by the
President, the Speaker of the House, the House Minority
Leader, the Senate Majority Leader, and the Senate
Minority Leader:
– Katharine Abraham, University of Maryland (Chair)
– Ron Haskins, Brookings Institution (Co-Chair)
– Hilary Hoynes, University of California, Berkeley
– Kenneth Troske, University of Kentucky
– Jeffrey Liebman, Harvard University
– Allison Orris, OMB
– Bruce Meyer, University of Chicago
– Sherry Glied, New York University
– Robert Shea, Grant Thornton LLP
– Kim Wallin, Wallin Ltd.
– Paul Ohm, Georgetown University
– Robert Hahn, University of Oxford
– Latanya Sweeney, Harvard University
– Kathleen Rice, Faegre Baker Daniels LLP
– Robert Groves, Georgetown University
32. What will the Commission work on?
• The Commission will have the opportunity to
– consider how data, research, and evaluation are currently used to
build evidence,
– and how to strengthen evidence-building in the Federal
government
• Key Areas of Focus:
1. Integrating Survey & Administrative Data
2. Supporting Data Infrastructure & Security
3. Incorporating Evaluation in Program Design
4. Considering a Federal Data Clearinghouse
33. What is the Commission’s timeline?
• CEP enacted: March 2016
• First Commission meeting: July 22, 2016
• Meetings, research & deliberations: July 2016–August 2017
• Final report: September 2017, with ¾ approval of the
Commission
• Commission ends: Sept. 30, 2017
34. How can you participate?
• Submit Written Public Comments:
– Request for Comments will be available soon
• Attend a Meeting of the Commission
– Check for notices on the soon-to-be-launched CEP.gov website