Communicate the ‘Of What’ and ‘On What’ to the ‘For Whom’: An Introduction to Improving Extension Impact Reporting. Nathan J. Meyer, Extension Educator, ESE. Draft, May 2010.
Overview
So what do we need to know?
How can we craft better reports?
How can we understand and support people in using our reports?
Why do we care?
Our mission is impact Taking University research and education to the people of Minnesota ... discovering real-world solutions to real-life problems. University of Minnesota Extension Mission
Extension - A modern Pony Express?
The Extension Hedgehog Extension is the highest value higher education investment available to government and non-governmental funding agencies because Extension education programs have impact, and we can prove it with credible measures. And the positive impacts of our research and education programs are multiplied by thousands of volunteers that apply new knowledge and skills in service to their communities.  -McGrath, Conway & Johnson (2007)
Our future is impact No public entity can afford to offer programs that are nice and interesting but without significant impact and value. Dean Beverly Durgan (2009)
What are impacts and how do we evaluate for them? (A Brief Review)
Impact is… The positive & negative, primary & secondary long-term effects produced by an intervention, directly or indirectly, intended or unintended.
The difference in an indicator of interest with and without the intervention.
So impact evaluation is… Attribution. Effects produced by… Counterfactual. What would have happened otherwise….
So impact evaluation must… Articulate ‘of what’, ‘on what’, and ‘for whom’. Address attribution and the counterfactual. Articulate and use the program theory. Use a mixed-methods approach. Start early.
Program theory is important.
Still a challenge…? No clear program theory (explicit goals). Effects are typically small. Effects cannot be separated from other sources. Few standardized, consistent procedures for evaluating programs.
So how can we effectively communicate the  'of what' and 'on what' to the 'for whom'?
Provide 'real users' with 'information they can understand and apply' about our impacts for 'real purposes'.
It’s simple, right? Elements of an Impact Statement: Describe the problem in simple terms. Describe the Extension program response. Describe the results of the Extension program. Describe who was responsible.
Use is complicated…. Johnson (1998)
A straightforward approach….
Working backward….
Define a ‘real purpose’ and intended use. Some Key Evaluation Functions:
Warning
Guidance
Reconceptualization
Mobilization of support
And the related types of use:
Instrumental. Used directly to inform decisions or actions. (Warning, Guidance)
Conceptual. Influences how users think about a program or issue. (Guidance, Reconceptualization)
Symbolic. Use as a token to support a position or action. (Mobilize Support)
(Process.) Changes resulting from engagement in the evaluation. (Potentially All)
Get to know your 'real users'.
Get to know your ‘real users’. What decisions are the findings expected to inform? (Primary Uses) Who exactly will make these decisions? When? (Intended Users) Example: the ‘funder’ is Don at the Initiative Foundation; actually, Barb will be presenting findings to Don, and he trusts her opinion.
Get to know your 'real users'. The personal factor is the presence of an identifiable individual or group of people who personally care about the evaluation and the findings it generates. Patton (2008)
Different users, different relations…
Then give them a report they can believe and apply. Truth. Extent to which research adheres to canons of scientific method, and accords with previous knowledge of how the world works. Utility. Extent to which research provides actionable direction, or challenges the status quo.
Communicate with your users We all fall prey to this tendency [slinging jargon] from time to time, but rarely do we consider the costs of lapsing into this kind of lingo, costs measured in a failure to communicate with the people we most need to reach, and in the mental laziness that jargon permits. Zuckerman (2001/2002)
Format data for action (chart: mean # of participants)
Proxies can be powerful…
Completion of 8th grade algebra
Family dinners

Value of the volunteer hour
Involve your users
Control of technical decision making – stakeholder to evaluator
Diversity among stakeholders selected for participation – diverse to limited
Power relations among participating stakeholders – conflicting to neutral
Manageability of evaluation implementation – unmanageable to manageable
Depth of participation – deep to consultative
In summary: Design a thoughtful impact evaluation
Define a ‘real purpose’ and uses
Get to know your ‘real users’
Create a report for their specific uses
Support users in using the report

Speaker notes

  1. Briefly review the structure of the presentation.
  2. Describe the nature of the project, personal questions prompted by reading Patton.
  3. The mission of the University of Minnesota Extension is to make an impact on our public – to solve their problems. Impact has been and continues to be the foundation of the Land Grant Mission and their Extension programming.
  4. Impact is also a cornerstone of our sustainability. West, Drake and Londo (2009) claim in a Journal of Extension article that Extension may be headed for a fate similar to the Pony Express. In their words: “The Pony Express and Extension are two completely dissimilar organizations linked by a common problem: survival in changing times.” Though functionally dissimilar organizations, Extension, like the Pony Express, is now trying to sustain itself in a world much different than the one for which it was created. They propose a variety of challenges, and potential solutions. Among them is effective education design, and ultimately evaluation: “The fact remains, however, that Extension is dominated by individuals with subject-matter expertise but with little or no formal training in education, communication, psychology, or other fields relevant to Extension's mission of education. The stark reality is that we have limited evidence to demonstrate Extension's effectiveness and, in this day of heightened scrutiny and expectations for governmental programs, we must improve in this arena.” [Bold added.] Note their call for evidence to demonstrate Extension’s effectiveness. Photo from http://commons.wikimedia.org/wiki/File:Pony_express.jpg. Frank E. Webner, Pony Express rider, ca. 1861. This work is in the public domain in the United States because it is a work of the United States Federal Government under the terms of Title 17, Chapter 1, Section 105 of the US Code. See Copyright.
  5. Demonstrating impact is inherent to our competitive advantage. Summary statement of the Extension Hedgehog from McGrath, Conway & Johnson (2007) article in Journal of Extension. Building from Collins’ Good to Great concept, the authors described a thought experiment in which they tried to identify the one thing that Extension does best for its customers. I have bolded their assertion that Extension programs have impact, and can prove it through credible evaluation. Insofar as Extension staff agree with this summary of their work, they essentially concede the centrality of impact evaluation and reporting.
  6. And, finally, the demonstration of impacts is inherent to our sustainability in Minnesota. In the Extension Enews, January 7, 2009 issue, Dean Durgan wrote about impact in her Making a Measurable Difference in Minnesota. To survive in critical budgetary times, the University of Minnesota Extension must be diligent in demonstrating a significant impact and value.
  7. So, I suggest that impact evaluation/reporting bookends our 3-part strategic programming process – identifying and making explicit public value, program business planning, and impact evaluation/reporting. Essentially, these processes combine to ensure our programming is 1) worth public investment, 2) well-designed to achieve its purpose, and 3) actually achieves intended and important impacts.
  8. Processing and Reflecting: Ask to summarize key points. Ask clarification questions. Workbook: Provide space to jot questions, observations, Ah Ha's.
  9. The term ‘impact’ is variously defined (Evaluation Gap Working Group, 2006; Leeuw & Vaessen, 2009; Rossi, Lipsey, & Freeman, 2004; White, 2010) but typically refers to long-term effects produced by an intervention. White (2010) points out that evaluators typically define impact as the final level of a program (intervention) causal chain. He contrasts this with the definition more commonly used by those in the ‘impact evaluation field’: the difference in an indicator of interest with/without the intervention. The definition used here from Leeuw & Vaessen (2009) also points out that impacts can be positive or negative, direct or indirect, and intended or unintended.
  10. Bennett and Rockwell (1995) provided the useful TOP framework to help educators connect their planning and evaluation efforts. In this case, it can also be used to exemplify what IS and IS NOT considered a program impact. Along the planning side of the framework, we note that problems with SEE (Social, Economic, Environment) conditions can call for changes in practices (behavior), which denote changes in KASA (Knowledge, Aspirations, Skills, Attitudes) and so forth to the specific resources necessary to pull off an intervention. Along the evaluation side of the framework, we note that the presence of resources can be evaluated, then types and numbers of activities, numbers and demographics of participants, and so forth up to changes in practice and SEE outcomes. It is essentially the SEE outcomes, and potentially the changes in practice, that can be considered the long-term impacts of an intervention. KASA are typically considered short-term outcomes necessary for impacts. Other elements are functions of the intervention.
  11. Impact evaluation tends to be defined in various forms as the difference in SEE conditions for populations with/without a specific intervention (Evaluation Gap Working Group, 2006; Leeuw & Vaessen, 2009; Rossi, Lipsey, & Freeman, 2004; White, 2010). Leeuw & Vaessen (2009) and White (2010) both called attention to two premises underlying impact evaluation: attribution and a counterfactual. ‘Attribution’ implies an approach to evaluation that attributes changes in conditions to the intervention (as opposed to maturation or the effects of other interventions). ‘Counterfactual’ implies an approach to evaluation that includes an attempt to describe what would have happened in the absence of the intervention. White (2010) argued that there must always be a counterfactual in an impact evaluation, even if not a comparison group. He also argued that attribution needs to be a focus of the evaluation, even when there may be multiple contributing interventions to disentangle.
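To make the with/without logic concrete, here is a minimal sketch of a naive difference-in-means impact estimate, assuming a simple comparison-group design; all numbers are hypothetical, and a real impact evaluation would still need to address attribution threats such as selection and maturation.

```python
# A minimal, hypothetical sketch of the with/without comparison behind an
# impact estimate. Numbers are invented; "program" and "comparison" stand in
# for measured values of an indicator of interest.
program = [72, 68, 75, 80, 71]      # indicator values for participants (with intervention)
comparison = [65, 70, 62, 66, 68]   # indicator values for a counterfactual comparison group

mean_program = sum(program) / len(program)
mean_comparison = sum(comparison) / len(comparison)

# Naive impact estimate: the difference attributable to the intervention,
# relative to what would have happened otherwise (the counterfactual).
impact = mean_program - mean_comparison
print(f"Estimated impact: {impact:+.1f} points on the indicator")
```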
  12. There is no absolute design for impact evaluation, though RCTs and other comparative-group designs are often referred to as the ‘gold standard’ (Evaluation Gap Working Group, 2006; Leeuw & Vaessen, 2009; Rossi, Lipsey, & Freeman, 2004; White, 2010). However, it is worth giving thought to the appropriateness of our designs well before reporting (ideally when initially planning the program). Pertinent to this short review is the methodological guidance that Leeuw & Vaessen (2009) offered to those planning impact evaluations – 1) articulate the type and scope of the evaluation and agree what is valued, 2) clarify the program theory and address attribution, and 3) use a mixed-methods approach. White (2010, 2008) explored the use of mixed methods for impact evaluation. Notably, he suggested that the size of the intervention being evaluated may be an effective criterion for judging the applicability of qualitative or quantitative approaches (White, 2010).
  13. It is worth noting here the importance of program theory in guiding impact evaluation (and articulation of impacts). Many authors have explored the potential for program theory in helping evaluators develop appropriate foci, designs and measures (Chen, 2005; Patton, 2008; Rogers, 2008; Rossi, Lipsey, & Freeman, 2004). Germane to the focus of this presentation is the typical contention that Extension programs are too complicated, multi-faceted or intertwined with other interventions to measure impact. Insofar as this is becoming a more and more tenuous position, it is worth looking closely at how these authors can help us clarify the causal claims underlying our program theories, assess the kinds of evidence supporting these claims, and develop impact evaluations that assess critical gaps. Explain how this has been accomplished in the Best Practices for Field Days Program, Ulead and Youth Development Extension programming (Joyce Hoelting, Personal Communication).
  14. Smith & Straughn (1983) described the challenges entailed by a then-new Extension accountability and monitoring policy. After deconstructing implications of the new policy, they pointed out four significant challenges to evaluation of the impacts of Extension programming. And they suggested solutions for each of these challenges. Many programs, in their view, did not have explicit goals, objectives, or program theory. To mitigate this challenge, they suggested working with programs to make explicit their theories and/or abstaining from evaluation of programs lacking clarity. They suggested that programs may achieve relatively indistinguishable effects, potentially intertwined with other interventions. To mitigate these challenges, they suggested 1) focusing Extension programming on audiences where more significant impacts can be achieved more rapidly, and 2) evaluating groups of programs in multiple geographies to generate more convincing evidence of attributable impact. Finally, they suggested that Extension programs tended not to be evaluated for external audiences. To mitigate this challenge, they suggested developing more standardized, consistent evaluation procedures that were palatable to outside stakeholders. Nonetheless, they conceded the significance of the challenge, quoting Rossi and Freeman (1982): “We cannot overstress the difficulty of undertaking impact evaluations” (Smith & Straughn, 1983, p. 55). To what extent are these challenges inherent in our current call for evidence of impact? Can we solve them in the same manner? [Suggest yes, if we follow the guidelines presented earlier. An early start on impact evaluation planning will improve coordination with program theory planning. Considering impact evaluation, we can give more attention to our target audiences, and operationalization of SEE conditions. We can give more attention to careful program theory to make more resources available for evaluation of necessary claims.]
  15. Processing and Reflecting: Ask to summarize key points. Ask clarification questions. Workbook: Provide space to jot questions, observations, Ah Ha's.
  16. So, the big goal is to provide 'real users' with 'information they can understand and apply' for 'real purposes' (Fitzpatrick, Sanders, & Worthen, 2004; Patton, 2008). Seems easy enough. So, why do we think the results of evaluations are so rarely used as we intend them (Patton, 2008; Weiss, 1980)?
  17. Can it be so simple? Franz and McCann (2007) described a training/support process at Virginia Cooperative Extension for training staff to write and embrace straightforward impact statements. On their website (http://www.cals.vt.edu/communications/writingimpactstatements.html), they present concise definitions of program impacts, impact audiences and impact reporting. Germane to this presentation, they also present a stepwise formula for writing impact statements. In a general sense, this is the target for a good impact evaluation report. And it provides a good framework for starting our discussion. To get started, it will be helpful to sketch this skeleton report. But it is not really this simple to craft an effective report. As we learned in the last section of this presentation, there are elements of this report (e.g., the response, results, and who is responsible) that ultimately rely on an effective evaluation design. Moreover, we cannot assume that one report structure will work for different users or situations. In fact, the one-way method of writing may lead to reports that are not used at all, or misused.
  18. Scholars don’t know for sure why evaluations are seldom used as intended (or used at all). In fact, they are still really trying to understand how/why/when use happens. But, the bottom line is that use is complicated (Amara, Ouimet, & Landry, 2004; Cousins & Leithwood, 1986; Johnson, 1998; Lipton, 1992; Patton, 2008; Shulha & Cousins, 1997; Weiss, 1988). For instance, the following framework from Johnson (1998) depicts the interactions of ‘background’, ‘interactional’ and ‘utilization variables’ in the use process. There are influences inherent to the internal and external environments and contexts of the evaluation. There are interrelated feedback loops. While this is not the only model, nor necessarily the absolutely correct model of evaluation use, it is clear that our impact reports must account for and integrate within these complicated utilization processes.
  19. Unfortunately, evaluation reports are rarely tailored well to these complicated utilization processes (Fitzpatrick, Sanders, & Worthen, 2004; Patton, 2008). I suggest, however, that we can meet the challenge by following a fairly straightforward process to be thoughtful about our reports. Essentially, the production of good technical writing, evaluation report or otherwise, demands a clear sense of the ‘real purpose’ for a ‘real audience’ that encompasses ‘real content’. Following this straightforward approach to reporting, we essentially clarify our audience (user characteristics), purpose (outcomes) and content (program characteristics). We then connect these through specific attention to our impact evaluation design and report characteristics. The following slides will examine elements of this model in more detail.
  20. While the model is fairly straightforward, it is more practical to essentially work both ends to the middle. Thus, I suggest beginning by clarifying your potential (or even desired) outcomes, and the intended uses that will address these outcomes.
  21. It is relatively easy to develop a long list of potential outcomes of our impact reports: budget renewal, new program investments, improved abilities to describe the importance of a program, developing supporting attitudes toward a program, supporting a position about a program, etc. Weiss (1988), however, helps us understand that evaluations actually serve a much more manageable list of key functions to address these varied outcomes. Evaluations can serve as a warning signal. They can provide users with guidance in making decisions or informed actions. They can change/improve users’ concepts about programs and issues. They can also be useful in mobilizing support for programs and issues. So, working backward, it is helpful to define the ultimate outcome that our users (and/or we or our leadership) intend to accomplish through our report of impact. Then, we can identify which of the key evaluation functions are likely to best address this outcome. [DISCUSS SOME PURPOSES & FUNCTIONS]
  22. Finally, it will be necessary for users to make use of our impact reports in certain ways related to our appropriate evaluation functions. Though various taxonomies of uses differ in number and complexity (Cousins, 2004; Cousins & Leithwood, 1986; Johnson, 1998; Patton, 2008; Shulha & Cousins, 1997), it will be helpful for our purposes to distinguish among three common types of use: instrumental, conceptual, and symbolic. Evaluation results can be used directly to inform decisions or actions – instrumental use. This type of use is related to the ‘warning’ and potentially ‘guidance’ functions of evaluations. It is the type of use that we probably conceive resulting from our evaluations. But, it is actually relatively rare (Patton, 2008; Weiss, 1982). Evaluation results can influence how we think about a program or issue – conceptual use. This type of use is related to the ‘reconceptualization’ and potentially ‘guidance’ functions of evaluations. Weiss (1982) called this kind of use ‘enlightenment’, and described it as relatively more common in policy settings. Evaluation results can also be used as tokens to support a position or action – symbolic use. This type of use is related to the ‘mobilize support’ function of evaluations. However, there is some disagreement as to whether this is an appropriate use of evaluation, insofar as the findings may not be used or may be misconstrued to achieve support. Finally, the processes of involvement in an evaluation can also induce individual, organizational or other changes – process uses. This type of use is potentially related to all of the functions of evaluation. It is a relatively new area of focus in research on evaluation utilization (Patton, 2008).
  23. It is clear at this point that the ‘real purpose’ of your report relates directly to the function of your report, and how people will probably use it. In this chart, I make some subsequent suggestions about what you may need to know and pay attention to in formatting your report. For instrumental uses, I suggest it is important to format for utility/action. And it is therefore useful to know something about the decision/action, who will make the decision/action, and how the decision/action is likely to unfold. For conceptual uses, I suggest it is important to format for conceptual change. So, it is useful to know about who to involve, their concepts, and sources of learning. For symbolic uses, I suggest formatting to mobilize support. It is therefore helpful to know about the nature of the position to be supported, the context where calls to mobilize are likely to take place, and who is likely to make these calls. In this case, I also suggest care to prevent misuse.
  24. It is also helpful to learn as much as possible about your ‘real users’ and their evaluation-related characteristics. Patton (2008) and others have described different ways that users may intend to utilize results (more on this later) and their timelines for use. Weiss (1982) and others have discussed the importance of the context in influencing use. And we know from education studies that individuals have different information processing preferences. So once we identify ‘who cares’, we need to work with these people, our colleagues, leaders and regional directors to learn what works for them. [ASK FOR SUGGESTIONS OF QUESTIONS WE MIGHT WANT TO ANSWER ABOUT THEM.] For instance, we may ask about their prior experiences with evaluations, or their expectations for good evaluation. We might ask what they intend to do with results of the evaluation, and when they may need the results. Some of the things that we want to learn about our users are consistent among uses and reports – experience with evaluation, information processing preferences. Others are specific to our purposes and intended uses – context, timeline, learning setting.
  25. According to Patton (2008), we need to identify our specific users. It is quite typical to identify ‘stakeholders’ – in this example ‘funder’. To develop an effective report, however, we should be asking, “what funder?” – in this example ‘Don at Initiative Foundation’. We should ask, “who exactly…?” – in this example ‘Barb’. This will help us tailor our reports specifically for our users and their uses. Targeting more abstract definitions of our users (e.g., funder) can provide us some useful direction. But, as this example demonstrates, we risk being wildly off-target until we get more specific.
  26. Above all, your ‘real users’ are the people, the specific individuals, who care personally about the evaluation of your program. In a mid-1970s study of the utilization of 20 federal health evaluations, Patton (2008) and colleagues identified two factors that were consistently important: 1) considerations of politics, and 2) a ‘personal factor’. In Patton’s (2008, p. 66) words: “The personal factor is the presence of an identifiable individual or group of people who personally care about the evaluation and the findings it generates. Where such a person or group was present, evaluations were used; where the personal factor was absent, there was a correspondingly marked absence of evaluation impact.” The importance of the personal factor has since become well-accepted within evaluation practice. So, it is important for our purposes of reporting impacts to start by asking ‘who cares’? We need to work with our colleagues, leaders and regional directors to identify these specific people. And in the case that no one cares, we may need to 1) rethink the focus of our programming (i.e., the stool), or 2) work on cultivating the personal factor before reporting (i.e., building understanding of the relevance/importance of our programming).
  27. Graphic adapted from Patton (2008) Stakeholder Analysis: Power versus Interest Grid (p. 80). Patton (2008) also pointed out that we might cultivate different kinds of relationships with different kinds of users. Ideally, we will look for the personal factor in people who have a high interest in our program impacts, and high power to use the impact evaluation reports that we provide them. However, we may also want to cultivate evaluation reporting relationships with other people surrounding our evaluation. Some can increase the overall diversity of our reporting process. Others may be important elements of the context of our evaluations.[DESCRIBE examples like the Best Practices for Field Days program, Minnesota Master Naturalist. Ask participants for examples.]
  28. In drafting our reports (and ideally planning the evaluation design), it is important to learn as much as we can related to two tests to which our users are likely to subject our results. Weiss and Bucuvalas (1980) interviewed 155 researchers in mental health fields about their use of 50 research reports. Analyzing these results, they identified two key frames of reference for interpreting reports of research: a truth and a utility test. The truth test encompasses 1) the extent to which the research appears to adhere to canons of the scientific method (i.e., commonly controlled experiments), and 2) accords with the user’s previous knowledge about how the world works. NOTE: According to Weiss and Bucuvalas (1980), the more research seems to conform to users’ previous knowledge, the less important adherence to the canons of scientific method generally becomes. The utility test encompasses 1) the extent to which the research provides explicit, practical directions with which users can actually do something, and 2) the extent to which the results challenge current practices or conceptions. NOTE: Weiss and Bucuvalas (1980) suggested that challenging results tend to be perceived as more useful when results are less actionable.
  29. Armed with a deep understanding of our programs, evaluation designs and analysis, it is all too easy to write a report that users cannot easily understand (Patton, 2008; Valovirta, 2002). Lacking careful attention to the specifics associated with our impact evaluations, we risk slinging generalized concepts loose and unfettered at our users - inert ideas seemingly irrelevant or incomprehensible to their personal needs (Whitehead, 1959). In the words of Zuckerman (2001/2002): “We all fall prey to this tendency [slinging jargon] from time to time, but rarely do we consider the costs of lapsing into this kind of lingo, costs measured in a failure to communicate with the people we most need to reach, and in the mental laziness that jargon permits” (p. 34). To guard against this “mental dryrot,” Whitehead (1959) cautioned: “we must enunciate two educational commandments, ‘Do not teach too many subjects,’ and again, ‘What you teach, teach thoroughly’” (p. 3). We can adopt a similar caution for our impact reports: choose carefully the few messages to communicate, and communicate these thoroughly.
  30. Patton (2008) described the importance of formatting data for action (as opposed to however it spits out in an Excel, SPSS or other graph). In this (admittedly simplified) case, I imagine a Master Naturalist program asking how to improve the distribution of different age-classes of participants. In the first graph, these data are presented in a typical manner, requiring users to make interpretations or even additional calculations to answer the question. In the second graph, I have reorganized the categories to present data in rank order. A little more helpful. In the final graph, I present data in rank order as above/below a mean level of participation. This makes the data needed to answer their question immediately apparent to users. Participation in the first three groups needs to increase….
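As a rough illustration of formatting data for action, the sketch below uses invented Master Naturalist participation counts (the age-class labels and numbers are placeholders, not from any actual program) and re-presents them in rank order as deviations from the mean, so the groups needing recruitment attention are immediately visible.

```python
# Hypothetical illustration of "formatting data for action": participation
# counts by age class, re-presented in rank order as deviation from the mean.
import matplotlib.pyplot as plt

counts = {"18-25": 4, "26-35": 7, "36-45": 9, "46-55": 18, "56-65": 24, "66+": 16}
mean = sum(counts.values()) / len(counts)

ranked = sorted(counts.items(), key=lambda kv: kv[1])   # rank order, lowest first
labels = [age for age, _ in ranked]
deviations = [n - mean for _, n in ranked]              # above/below the mean

plt.barh(labels, deviations)
plt.axvline(0, color="black", linewidth=1)
plt.xlabel("Participants above/below the mean")
plt.title("Which age classes need recruitment attention?")
plt.tight_layout()
plt.show()
```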
  31. Many Extension programs address complicated or complex issues (Rogers, 2008) with inherently complex impacts. It can therefore be useful to designate ‘proxies’ or indicators that provide rational, research-supported evidence of impact. For example, a colleague described use of somatic cell count in cows as a proxy of their overall health. Completion of 8th grade algebra is commonly accepted as an important gateway into collegiate science. Therefore, the 8th grade completion rates may be a useful proxy (and focus) for STEM programming and evaluation. The value of the volunteer hour can be used to monetize the impact of master volunteer programs on SEE conditions. Proxies essentially help users make better sense of our complex/complicated program theories and impacts.
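A hedged sketch of the volunteer-hour proxy mentioned above: multiplying reported volunteer hours by a published value-of-a-volunteer-hour estimate monetizes the service delivered. The hourly rate, program names, and hours below are placeholders for illustration, not figures any program reported.

```python
# Hypothetical sketch of monetizing volunteer time with a value-of-a-volunteer-hour proxy.
VOLUNTEER_HOUR_VALUE = 21.36  # placeholder $/hour; substitute the current published estimate

volunteer_hours = {
    "water quality monitoring": 1200,
    "youth mentoring": 850,
    "trail restoration": 430,
}

total_hours = sum(volunteer_hours.values())
monetized = total_hours * VOLUNTEER_HOUR_VALUE
print(f"{total_hours} volunteer hours, roughly ${monetized:,.0f} in service delivered")
```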
  32. Walter, Nutley, and Davies (2003), Patton (2008) and others discussed the importance of reporting at the right place and time. It is fairly useless, for example, to submit a report related to Extension budgets two weeks after county commissioners have discussed/voted on their budget. From Weiss (1982), we can gather a few questions to answer about the time/place where our report will be useful: What are the boundaries that encompass the purpose of my report – actors, time, purpose? Who needs to see it? When and where? Do my users have a desired end-state in mind? Are they willing to consider alternatives? How should I frame my report? How do my users perceive my purpose as significant? How should I format my report? Are my users going to follow a clear, sequential order to use my report to address my purpose? How should I format my report? When and where is it most important for users to have it? NOTE that Weiss (1982) also pointed out that user decision-making processes seldom encompass all (or any) of the above characteristics. They more often tend to be diffuse, with decisions solidifying over time. Therefore, we should not be surprised if our answers to any of the above questions are fuzzy or nonexistent. Nonetheless, the process of thinking through these questions will help us to better position/format our reports. And we should plan reports that work in this fuzziness.
  33. A number of influential evaluation scholars and practitioners have also explored the potential benefit of user involvement in the evaluation (and/or reporting) process (Cousins & Earl, 1995, 1992; Cousins, Goh, & Clark, 2006; Patton, 2008; Walter, Nutley, & Davies, 2003). Cousins, Goh and Clark (2006) explored, for example, how evaluation data use in school contexts can lead to data valuing. Teachers and administrators in a group of 4 study schools were involved in a two-step interview process. Cousins, Goh and Clark (2006) identified through analysis of interview transcripts a series of supportive and inhibiting factors that impacted reliance on evaluative inquiry in their study schools. One of the more influential factors they dubbed ‘data use leads to data valuing’. In other words, it was through using data that some of their participant teachers tended to start recognizing its value. Cousins and Earl (1995, 1992) pointed out, however, that involving participants in evaluations does not always work out well. It is a relatively fragile marriage. So, we should proceed carefully into user involvement strategies. Patton (2008) provided a useful taxonomy of levels of user involvement to guide our thoughtful planning of participatory strategies. Essentially, we can involve users to a lesser (informing) or greater (collaborating) extent, or even support their own abilities to deploy evaluation (empowerment). For each of these strategies, we make certain promises to our users. Strategies are also well suited to different kinds of users. In the end, it may be more typical to collaborate with some users in crafting our impact reports (e.g., regional directors, program leaders, key county extension committee members) or consult others (e.g., a program specialist or county commissioner) and simply inform others (Extension leadership, state legislators, federal reporting staff). NOTE, however, that authors like Lipton (1992) and Patton (2008) have described the importance of deeper involvement (consulting to collaborating) in all aspects of the impact evaluation. Ideally, we involve our intended users in helping us determine the priority impacts that will focus our evaluation, designs and data that they can trust and use, and reporting formats that fit their needs. This harkens back to an earlier slide in the presentation – involvement will ideally begin early…not right before we craft our reports. But involvement even then can be important.
  34. Cousins and Weaver (2004) elaborated another useful taxonomy for planning involvement in evaluation and reporting. They identified 5 important aspects of evaluations that could be conducted in more or less collaborative/inclusive manners. They subsequently suggested that a given collaborative project could be analyzed by rating it Likert-style on each of the proposed scales for “a helpful way to think about collaboration” (Cousins & Weaver, 2004, p. 37). This taxonomy can help us be thoughtful about which parts of our impact evaluation and reporting processes involve our users to some extent.
  35. Walter, Nutley and Davies (2003) compiled a meta-analysis of research utilization literature, ultimately identifying a number of practices that support ‘research impact’ or use. In addition to concerns about formatting, content and placement of our impact reports, these practices enable us to support our users in making use of whatever report we produce. Walter, Nutley and Davies (2003) suggest re-presenting results over and over, both in print and orally. They suggest facilitating and/or educating users to use results. The social influence of others and/or collaborations in the evaluation process can catalyze use. And feedback/reminders are important. These supports should come as no surprise to many Extension field staff, as they are essentially hallmarks of effective adult education and behavior change practice. Walter, Nutley and Davies (2003) have, in some ways, suggested only that we practice in our evaluations that which we do in our primary education work.
  36. Weiss (1988) provided a helpful list of other channels beyond the written report that can be used to disseminate results of our impact evaluations. Considering Walter, Nutley and Davies' (2003) call for re-presentation of results and cultivation of social influence, we can imagine how these and potentially other channels may be useful secondary outlets (or potentially primary outlets) for reporting our impacts. [ASK for some descriptions of channels that participants have used in reporting their impacts.]
  37. Conceding Smith & Straughn’s (1983) call for standardized, palatable impact evaluation practices, it will be helpful to consider how our impact evaluation and reporting efforts coordinate with our established reporting practices. Due to the fluid nature of the federal reporting process, it can be conceived as an element central to coordinated reporting. In this report, we are relatively free to establish our own proposed impacts, outcomes and performance targets. We can also report on changes in plan and unintended outcomes. Therefore, the impacts and outcomes that become a focus of that process can evolve from our 3-fold program planning efforts (public value, program business planning, and impact evaluation planning). Moreover, they can be informed by expectations from our various policy/budgetary stakeholders. Data collected for the federal reporting process can then be used synchronously in fulfilling other reporting requests. Ideally, we could pull data from various scales, or comparative data as necessary for effect sizes and counterfactuals. Moreover, the federal process is responsive to the use of program theory in explaining program impacts. Therefore, it is recommended that program teams explore ways to improve their synchronization of federal reporting and other impact evaluation/reporting practices. [DESCRIBE a hypothetical example of how this may unfold. DISCUSS with participants benefits and challenges to coordinating our efforts in this manner.]
  38. In summary, this presentation has explored recommendations for improving our reports of impacts of Extension programming. In a brief overview, the presentation described the necessity of reporting impacts to sustain a viable Extension. We need subsequently to clarify and prioritize our impacts, and design thoughtful impact evaluation. I suggest even including this as one of the ‘three legs’ that support our program quality. In developing our reports of impacts, we need to define a ‘real purpose’, evaluation function and related uses. We need to get to know our users in some specific ways – looking for the personal factor, and high interest/power users. And we need to develop reports that account for users’ truth/utility tests, communication styles, and decision making contexts. Finally, we can employ different strategies to support their use. I concluded the presentation with a call for coordination of our impact evaluation and reporting practices. This can be accomplished by using the federal reporting process to build and gather our evidence.