Envisioning Solutions to Problem Scenarios in Evaluation (AEA 2014)
1. Dr. Kathy Newcomer, Steve Mumford,
Yvonne Watson, and Nick Hart
George Washington University
2. Purpose: Share perspectives on thorny
evaluation dilemmas and cultivate a community
of practice
Introduction to systems thinking (10 min)
Small group dialogue by perspective (25 min)
Internal evaluator conducting evaluations (#1)
Internal evaluator overseeing external eval (#2)
External evaluator (#3)
Each group discusses problem scenarios from past
evaluation work
Debrief on applying systems thinking (10 min)
3. Systems: mental constructs that help us make
sense of the world and organize its complexity
Draws our attention to:
Interrelationships: how people, objects and
processes are connected (patterns) and with what
consequence within a particular context
Perspectives: how different people interpret a
situation, judge success, and imagine alternatives
Boundaries: how we can make things manageable
4. Be reflective: observe yourself as others
might, including your assumptions and values
See the world from others’ perspectives
Attend to emerging patterns
Explore indirect influence
Recognize your own involvement
Act responsibly (ethically, with wise
judgment) & take responsibility for actions
What are the perspectives of different actors?
What are important aspects of the context? Are
additional details needed?
What are ethical, practical, methodological, professional,
other factors to consider?
What are potential actions and their consequences?
Which would you take and why?
What assumptions and values apply?
How might the scenario and action choice look different
from a different perspective (e.g., internal vs. external
evaluator)?
6. How did aspects of systems thinking show
up in your group discussion?
To what extent was systems thinking a
helpful guide for thinking through scenarios?
What aspects were most or least useful? Why?
What other frameworks might add further value?
Is dialogue about generic problem scenarios a
useful method for generating diverse
approaches to resolving thorny dilemmas?
7. Picture this:
You produce an evaluation report for a small non-profit client that
presents mixed and inconclusive results.
While some areas of inquiry suggest possible positive outcomes,
the data are not very compelling or high in quality, and are
balanced by some negative outcomes in the same area.
The client claims to understand the report and the reasons for
your mixed conclusions.
Then the following occurs:
You attend a program-wide celebration hosted by the client.
You notice a promotional handout summarizing evaluation
findings, produced by the client without your knowledge.
You realize that the handout includes only positive results and
completely omits negative results and caveats.
What do you do?
8. Picture this:
You are conducting an evaluation for a client who leads a small, relatively
“young” non-profit engaged in a new grant.
Your evaluation contract and plan include a focus group with program
implementers about their experience with the program.
This is your only opportunity to collect program implementation data to
help interpret outcomes and suggest process improvements.
Then the following occurs:
A couple of weeks before the focus group, the client demands to videotape
the meeting, select complimentary quotes, compile them into a
promotional video, and place it on the non-profit’s website.
You are concerned with confidentiality and misconstruing participants’
comments, but the client gives an ultimatum that the first priority for the
90-minute focus group meeting is to produce a testimonial video.
The client controls scheduling and communication of the focus group
and insists implementers are not available for additional meetings.
What do you do?
9. Marty Monell (Pesticides)
Barnes Johnson (RCRA) OR Caroline Hoskinson (OUST)
Picture this:
The executive director (ED) of a non-profit engages you to survey staff members about
perceptions of the organizational culture and the benefits of more evaluation.
When analyzing responses, you see that some staff members provided harsh and
personal critiques of the ED’s leadership style in open-ended comments.
One particularly critical staff member accuses him of racist and sexist behavior.
The online survey is anonymous, but the agency has a very small staff.
What do you do?
Follow-up:
You alert the ED to the general themes of the critical comments in a one-on-one
meeting, and he insists on hearing more detail.
He shares that one staff member who completed the survey recently resigned
shortly after a negative annual performance review.
The ED believes this person supplied the harshest criticism out of spite.
What do you do now?
Picture this:
Your office has been charged to collaborate on an evaluation design and
implementation for a demonstration project with two other agencies.
When the funding was provided, senior leaders agreed on the policy-relevant
questions to address in the evaluation.
After participant recruitment begins:
You learn from one of the collaborating agencies that a critical data
element has not been included in a survey to address a key policy-relevant
question.
The collaborating partner refuses to modify the survey to include the
critical data element.
What do you do?
Picture this:
After collecting program data, you engage in a thoughtful analysis to draw up
findings and recommendations.
You send written findings and recommendations to your organization’s senior
leaders in advance of an in-person meeting.
During your meeting:
The senior leaders express interest in your findings and appear supportive of your
recommendations.
An economist invited to the meeting speaks up to say your analysis is incomplete
because you did not establish a causal relationship on which to base your
recommendations.
The senior leaders take note and immediately appear to become skeptical of your
findings.
What do you do?
Picture this:
Tomorrow you are scheduled to meet with your organization's senior leaders to
present your evaluation findings.
Your written analysis and findings are being routed through an internal
organizational clearance process in advance of the meeting.
During the clearance process:
Your immediate supervisor decides to edit your findings and change a key
recommendation before submitting the materials to clearance, but does not
discuss with you in advance.
The changes to the findings are factually erroneous and the altered
recommendation is not supported by the evaluation.
After discovering the issue, you approach your supervisor to discuss the changes.
Your supervisor insists the edits were correct and will not permit a retraction.
What do you do?
13. Picture this:
The HQ office administers a national program implemented by the
District office.
The District office initiates an evaluation to examine the efficiency and
effectiveness of the program.
During the course of the evaluation the following occurs:
The HQ office assigns a representative to the evaluation team who
believes the District’s program should be shut down.
Rumors circulate that the HQ office plans to shut down the program
while the evaluation is underway and 9 months from completion.
HQ personnel request a copy of the draft report to meet with
members of Congress to discuss the program without involving any
members from the District program or evaluation team.
What’s an internal evaluator to do?
14. Picture this:
An office initiates an evaluation of an innovative program.
The contractor leading the evaluation is highly respected
and has a solid track record for good quality work.
The evaluation results will be used to determine if the
approach should be scaled up.
During the evaluation the following occurs:
The contractor begins to submit work that is of questionable
quality.
The client office loses confidence in the contractor’s ability
to complete the work and begins to scrutinize every
deliverable.
What’s an internal evaluator to do?
15. Picture this:
An office initiates an evaluation of a program.
Three program staff are assigned to the evaluation team.
The evaluation seeks to determine program effectiveness
and, if at all possible, impact.
During the evaluation the following occurs:
We discover no data exist to assess impact questions.
One staff person insists on using more rigorous methods
despite the evaluation team’s agreement to focus on
outcomes and insists on revisiting the issue at every
meeting.
The contractor expresses concern regarding resource
constraints required to respond to the staffer’s comments.
What’s an evaluator to do?
Editor's notes
Handouts of all slides (~40) and scenarios for each small group (~15)
Steve starts off
Include brief bios of presenters here
Identify three group leaders (adjust group number and perspectives as necessary according to number of participants – may want to have them indicate perspectives by a show of hands to get sense of group sizes)
At end, opportunity to share contact info to keep in touch and continue dialogue on thorny evaluation problems
Systems are unique and individual ways of viewing the world, not necessarily reflective of any objective reality.
Boundaries are necessarily arbitrary, and we need to think through the implications of the boundaries we choose
Focus mostly on adopting different perspectives in discussing problem scenarios, but also consider patterns and boundaries
Discuss bullet items in pairs, going down the list. The first two relate to our outlook toward ourselves and others, the second two to opportunities to exert influence (avoiding attempts at control, which might provoke defensive reactions), and the last two to the actions we take (acknowledging that we are not simply neutral observers); all are interrelated
Small group facilitators will pose short, real-life situations from practice, including key players, context & a dilemma/choice
Apply systems thinking to a dialogue of potential actions you did/might take, possibly considering some or all of the following questions
Start with one scenario and practice applying the different questions. You might need to supply some additional detail, but only if absolutely necessary (and keep confidential). Then participants could offer their own scenarios; if no takers, move to #2. Probably time for about 3-4 total scenarios (~5 min each), depending on depth of discussion. Facilitators should monitor time and move on as needed, since scenarios are unlikely to reach a final consensus on an action to take. Kathy rotates around groups.
Free-flowing large group discussion, facilitated by Kathy – add intro points of common themes from small group discussions as needed. Invite participants to share email info to keep in touch after session via email list