
Fix it or flee it


1. Fix it or Flee it? Proven approaches for dealing with failing, flagging and floundering association programs
   Greg Melia, CAE (@gmeliacae)
2. Program Assessment is a lot of GRIEF
   (Photo credit: hebedesign on Flickr)
3. Program Assessment is a lot of GRIEF:
   • Goals
   • Research
   • Impact
   • Efficiency
   • Finances
   (Photo credit: Angie Torres on Flickr)
4. Defining the Goal of the Review
   – Work with key stakeholders and decision-makers
   – Solicit their key questions:
     • What do they want to know?
     • What information will help their decision-making?
     • What do they think needs to be evaluated?
   – Refine questions to be measurable and defined
5. Exploring Program Goals
   Investigate the Key “Whys”:
   • Why was the program originally established?
   • Why has it continued to be offered?
   • Why has it changed over time?
   Review the Original Intended Approach:
   • Who, what, when, where, why and how?
   • What was implemented?
   • What was not?
6. Lessons Learned
   • Verbalizing the goals of the review is an important part of preparing for change
   • Seek superordinate goals
   • Some agreements are more easily reached upfront
7. Research: 3 kinds of data (Bronislaw Malinowski)
   • Corpus Inscriptionum
   • Opinions and Descriptions
   • Imponderabilia of behavior
   (Credit: Photographie et Ethnologie: Les photographes français au XXème siècle devant d’autres formes de cultures.)
8. Lessons Learned
   • Historical minutes and documents can be VERY informative
   • Seek to understand program mutations
   • Good decisions based on bad data usually give bad results
   • Remember that today's stakeholders may have been involved in past decisions
9. Impact
   Measure:
   • Outputs = immediate (“I attended”)
   • Outcomes = short-term (“I gained knowledge”)
   • Impacts = long-term (“I served more members as a result”)
10. Efficiency
   • Staff and Volunteer Time
   • Complaints and Comments
   • Marketing/Communication
   • Registration/Orders
   • Technology
   (Photo credit: mansikka on Flickr)
11. Lessons Learned
   • How do people feel about it? Do they have suggestions to increase efficiency?
   • What are the trends in terms of level of effort?
   • What is the cost/benefit of doing it the same way versus trying a new approach?
   • Can improved efficiency save it?
12. Finances
   • Direct fixed expenses: incurred regardless of how many people participate, buy, or are served (e.g., office rent)
   • Variable expenses: incurred per each additional unit or person serviced (e.g., per-person meal charge)
   • Mixed expenses: incurred per each additional set of units or persons serviced (e.g., room rental, temporary help)
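The three expense types above can be sketched as a simple cost model. The function name and all dollar figures below are hypothetical illustrations, not figures from the deck; mixed expenses are modeled as a step cost per block of attendees.

```python
import math

def total_cost(attendees, fixed=5000.0, per_person=45.0,
               step_cost=800.0, step_size=50):
    """Estimate total program cost from the three expense types.

    fixed:      direct fixed expenses, incurred regardless of attendance
                (e.g., office rent)
    per_person: variable expenses, incurred per attendee
                (e.g., per-person meal charge)
    step_cost / step_size: mixed expenses, incurred per block of
                attendees (e.g., one extra room or temp worker
                per 50 registrants)
    """
    mixed = math.ceil(attendees / step_size) * step_cost
    return fixed + per_person * attendees + mixed

print(total_cost(50))   # 5000 + 45*50 + 1*800 -> 8050.0
print(total_cost(51))   # 5000 + 45*51 + 2*800 -> 8895.0
```

Note how one extra attendee past the block boundary (50 to 51) triggers a whole new mixed-expense step, which is why the "true full cost" per person is rarely obvious.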
13. Key Calculations
   • Revenue – Expense = Margin
     – Margin per attendee/unit
     – Margin per staff hour
     – Margin per impact
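The key calculations above reduce to a few divisions once margin is known. This is a minimal sketch with invented figures; `impacts` stands in for whichever impact measure the review chose.

```python
def margins(revenue, expenses, attendees, staff_hours, impacts):
    """Revenue - Expense = Margin, then normalize the margin
    per attendee/unit, per staff hour, and per impact."""
    margin = revenue - expenses
    return {
        "margin": margin,
        "per_attendee": margin / attendees,
        "per_staff_hour": margin / staff_hours,
        "per_impact": margin / impacts,
    }

# Hypothetical program: $12,000 revenue, $9,000 expenses,
# 100 attendees, 60 staff hours, 30 measured impacts.
print(margins(12000, 9000, attendees=100, staff_hours=60, impacts=30))
```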
14. Lessons Learned
   • One full flight is more profitable than two half-full ones.
   • Most do not realize the true full cost.
   • Understanding costs helps determine price.
   • Price increases (and closing programs) take time.
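The "one full flight" lesson follows directly from the fixed-expense arithmetic: running the program twice doubles the fixed cost without adding revenue. The numbers below are invented for illustration.

```python
# Hypothetical figures: $120 price, 100 total attendees,
# $4000 fixed cost per session run, $30 variable cost per attendee.
price, attendees, fixed, variable = 120.0, 100, 4000.0, 30.0

revenue = price * attendees          # same either way
var_cost = variable * attendees      # same either way

one_full_session = revenue - (fixed + var_cost)       # one session of 100
two_half_sessions = revenue - (2 * fixed + var_cost)  # two sessions of 50

print(one_full_session)   # 5000.0
print(two_half_sessions)  # 1000.0 -- fixed cost paid twice
```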
15. Implementation
   • Who to involve
     – Consultant?
     – Stakeholders
     – Staff
     – Non-users
   • Who to interview
     – Former volunteers
     – Former staff
     – Vendors
   (Photo credit: kierenmccarthy.co.uk)
16. Communication of Results
   • Restate the purpose of the review
   • Review what was done
   • Arrange key findings in logical order, highlighting interpretations where appropriate
   • Close with recommendations or issues to be addressed
   (Credit: EE.UTD.Events on Flickr)
17. Lessons Learned
   • Equivocal recommendations get equivocal results
   • Decision-makers need synthesized data, not the opportunity for data analysis
   • Survey data is important, but the proof is in historical performance
   • Plan and communicate transitions
18. Thank You!
   Greg Melia, CAE
   gmelia@asaecenter.org
   @gmeliacae
