Setting the Stage:
Workshop Framing and
Crosscutting Issues
Simon Hearn, ODI
Evaluation Methods for Large-Scale, Complex, Multi-National
Global Health Initiatives, Institute of Medicine
7-8 January, London, UK
What do we mean by complex
interventions?
The nature of the intervention:
1. Focus of objectives
2. Governance
3. Consistency of implementation
How it works:
4. Necessariness
5. Sufficiency
6. Change trajectory
Photo: Les Chatfield - http://www.flickr.com/photos/elsie/
What are the challenges of evaluating
complex interventions?
Describing what is being implemented
Getting data about impacts
Attributing impacts to a particular programme

Photo: Les Chatfield - http://www.flickr.com/photos/elsie/
Why a framework is needed

Wikipedia: Evaluation Methods
Why a framework is needed

Image: Simon Kneebone. http://simonkneebone.com/
The Rainbow Framework
DEFINE what is to be evaluated
Why do we need to start
with a clear definition?

Photo: Hobbies on a Budget / Flickr
1. Develop initial description
2. Develop program theory or logic model
3. Identify potential unintended results
Options for representing logic models
Pipeline / results chain
Logical framework
Outcomes hierarchy / theory of change
Realist Matrix
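The simplest of these options, the pipeline / results chain, can be sketched as an ordered sequence of stages; the stage names below are generic placeholders, not taken from the deck:

```python
# A pipeline / results chain models a program as an ordered sequence of
# stages, each expected to lead to the next.
RESULTS_CHAIN = ["inputs", "activities", "outputs", "outcomes", "impacts"]

def downstream_of(stage):
    """Return the stages a given stage is expected to lead to."""
    i = RESULTS_CHAIN.index(stage)
    return RESULTS_CHAIN[i + 1:]
```

For example, `downstream_of("outputs")` lists the outcomes and impacts the evaluation would then look for evidence of.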
FRAME what is to be evaluated
Frame Decision

Make Decision

Frame Evaluation

Design Evaluation

Source: Hobbies on a Budget / Flickr
1. Identify primary intended users
2. Decide purpose(s)
3. Specify key evaluation questions
4. Determine what ‘success’ looks like
DESCRIBE what happened
1. Sample
2. Use measures, indicators or metrics
3. Collect and/or retrieve data
4. Manage data
5. Combine qualitative and quantitative data
6. Analyze data
7. Visualize data
Combine qualitative and quantitative data
Enrich | Examine | Explain | Triangulate
Parallel | Sequential | Component | Integrated
UNDERSTAND CAUSES of outcomes and impacts
As a profession, we often either oversimplify causation or we overcomplicate it!
“In my opinion, measuring attribution is critical, and we can't do that unless we use control groups to compare them to.”
Comment in an expert discussion on The Guardian online, May 2013
1. Check that the results support causal attribution
2. Compare results to the counterfactual
3. Investigate possible alternative explanations
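The second of these steps, comparing results to a counterfactual, can be sketched in its most basic form as a difference in mean outcomes between a treated group and a comparison group; the outcome scores below are invented for illustration:

```python
def mean(xs):
    return sum(xs) / len(xs)

def estimated_effect(treated, comparison):
    """Naive counterfactual comparison: difference in mean outcomes.
    The other steps (checking the results support causal attribution and
    ruling out alternative explanations) still apply before reading this
    as a causal effect."""
    return mean(treated) - mean(comparison)

# Invented outcome scores, for illustration only.
effect = estimated_effect([72, 75, 78, 81], [70, 71, 69, 74])  # 76.5 - 71.0 = 5.5
```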
SYNTHESIZE data from one or more evaluations
Was it good? Did it work? Was it effective?
For whom did it work? In what ways did it work?
Was it value for money? Was it cost-effective?
Did it succeed in terms of the Triple Bottom Line?
How do we synthesize diverse evidence about performance?
[Figure: synthesis rubric combining judgements on "All intended impacts achieved", "Some intended impacts achieved" and "No negative impacts" into an overall synthesis rated from GOOD through ?? to BAD]
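A synthesis rubric of this kind can be made explicit as a rule that combines judgements on individual criteria into an overall rating; the combination rule below is an illustrative assumption, not the deck's:

```python
def synthesize(all_intended_achieved, some_intended_achieved, no_negative_impacts):
    """Toy synthesis rubric: combine per-criterion judgements into an
    overall rating. The thresholds here are illustrative assumptions."""
    if all_intended_achieved and no_negative_impacts:
        return "GOOD"
    if not some_intended_achieved:
        return "BAD"
    return "??"  # mixed evidence: needs explicit, documented judgement
```

Writing the rule down this way forces the "??" cases, where evidence is mixed, into the open instead of leaving them to implicit judgement.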
1. Synthesize data from a single evaluation
2. Synthesize data across evaluations
3. Generalize findings
REPORT and SUPPORT USE of findings
“I can honestly say that not a day goes by when we don’t use those evaluations in one way or another”
1. Identify reporting requirements
2. Develop reporting media
3. Ensure accessibility
4. Develop recommendations
5. Support use
MANAGE your evaluation
1. Understand and engage with stakeholders
2. Establish decision making processes
3. Decide who will conduct the evaluation
4. Determine and secure resources
5. Define ethical and quality evaluation standards
6. Document management processes and agreements
7. Develop evaluation plan or framework
8. Review evaluation
9. Develop evaluation capacity
Making decisions
Look at the type of questions:
DESCRIBE: Descriptive questions. Was the policy implemented as planned?
UNDERSTAND CAUSES: Causal questions. Did the policy change contribute to improved health outcomes?
SYNTHESIZE: Synthesis questions. Was the policy overall a success?
REPORT & SUPPORT USE: Action questions. What should we do?
Making decisions
Compare pros and cons
Making decisions
Create an evaluation matrix

Data sources: Participant Questionnaire | Key Informant Interviews | Project Records | Observation of program implementation

KEQ1 What was the quality of implementation? ✔ ✔ ✔
KEQ2 To what extent were the program objectives met? ✔ ✔ ✔
KEQ3 What other impacts did the program have? ✔ ✔
KEQ4 How could the program be improved? ✔ ✔ ✔ ✔
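An evaluation matrix can also be kept as a simple mapping from key evaluation questions to data sources, which makes coverage easy to check; the cell assignments below are illustrative assumptions, since the slide layout does not show which tick sits in which column:

```python
# Illustrative evaluation matrix: which data sources inform which KEQ.
# The specific assignments are assumptions for this sketch.
MATRIX = {
    "KEQ1 quality of implementation": {"questionnaire", "interviews", "observation"},
    "KEQ2 objectives met": {"questionnaire", "interviews", "project records"},
    "KEQ3 other impacts": {"interviews", "observation"},
    "KEQ4 improvements": {"questionnaire", "interviews",
                          "project records", "observation"},
}

def coverage(source):
    """List the KEQs a single data source informs."""
    return [keq for keq, sources in MATRIX.items() if source in sources]
```

A quick coverage check like this shows at a glance whether any key evaluation question rests on a single data source.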
www.betterevaluation.org
Documenting
Sharing
R&D
Events

Descriptions
Comments
Examples

Guides
Tools
Founding Partners

Financial Supporters
For more information:
www.betterevaluation.org/start_here

Contenu connexe

Tendances

9 Steps to Advocacy Evaluation
9 Steps to Advocacy Evaluation9 Steps to Advocacy Evaluation
9 Steps to Advocacy EvaluationInnovation Network
 
FoME Symposium 2015 | Workshop 8: Current Evaluation Practices and Perspectiv...
FoME Symposium 2015 | Workshop 8: Current Evaluation Practices and Perspectiv...FoME Symposium 2015 | Workshop 8: Current Evaluation Practices and Perspectiv...
FoME Symposium 2015 | Workshop 8: Current Evaluation Practices and Perspectiv...FOME2015
 
Understanding what works and why in peer and community based programs for HIV...
Understanding what works and why in peer and community based programs for HIV...Understanding what works and why in peer and community based programs for HIV...
Understanding what works and why in peer and community based programs for HIV...Australian Federation of AIDS Organisations
 
Terms, Tips, and Trends: Evaluation Essentials for Nonprofits
Terms, Tips, and Trends: Evaluation Essentials for NonprofitsTerms, Tips, and Trends: Evaluation Essentials for Nonprofits
Terms, Tips, and Trends: Evaluation Essentials for NonprofitsInnovation Network
 
Data for Impact: Lessons Learned in Using the Ripple Effects Mapping Method
Data for Impact: Lessons Learned in Using the Ripple Effects Mapping MethodData for Impact: Lessons Learned in Using the Ripple Effects Mapping Method
Data for Impact: Lessons Learned in Using the Ripple Effects Mapping MethodMEASURE Evaluation
 
RAPID perspective on Research into Use
RAPID perspective on Research into UseRAPID perspective on Research into Use
RAPID perspective on Research into UseSimon Hearn
 
Co Inst Eval Presentation 09
Co Inst Eval Presentation 09Co Inst Eval Presentation 09
Co Inst Eval Presentation 09June Gothberg
 
Research impact assessment workshop cara 170507 v140724
Research impact assessment workshop cara 170507 v140724Research impact assessment workshop cara 170507 v140724
Research impact assessment workshop cara 170507 v140724KMb Unit, York University
 
RiPPLE project communication and M&E
RiPPLE project communication and M&ERiPPLE project communication and M&E
RiPPLE project communication and M&EEwen Le Borgne
 
Evaluating conferences and events: new approaches and initiatives
Evaluating conferences and events: new approaches and initiativesEvaluating conferences and events: new approaches and initiatives
Evaluating conferences and events: new approaches and initiativesGlenn O'Neil
 
Framework and Toolkit to Strengthen Evaluation Capacity
Framework and Toolkit to Strengthen Evaluation CapacityFramework and Toolkit to Strengthen Evaluation Capacity
Framework and Toolkit to Strengthen Evaluation CapacityMEASURE Evaluation
 
FoME Symposium 2015 | Workshop 9: Story-telling and other New Methods of Eval...
FoME Symposium 2015 | Workshop 9: Story-telling and other New Methods of Eval...FoME Symposium 2015 | Workshop 9: Story-telling and other New Methods of Eval...
FoME Symposium 2015 | Workshop 9: Story-telling and other New Methods of Eval...FOME2015
 
Using Engineering Methods During Intervention Design to Increase Participant ...
Using Engineering Methods During Intervention Design to Increase Participant ...Using Engineering Methods During Intervention Design to Increase Participant ...
Using Engineering Methods During Intervention Design to Increase Participant ...Elizabeth (Lisa) Gardner
 
Monitoring stakeholder engagement
Monitoring stakeholder engagementMonitoring stakeholder engagement
Monitoring stakeholder engagementSimon Hearn
 
CR and Sustainability - Research methodology
CR and Sustainability - Research methodologyCR and Sustainability - Research methodology
CR and Sustainability - Research methodologyCEMCA
 
Humanitarian advocacy
Humanitarian advocacyHumanitarian advocacy
Humanitarian advocacyGlenn O'Neil
 
Small grants, big opportunities: Using the small grants mechanism to build l...
Small grants, big opportunities: Using the small grants mechanism  to build l...Small grants, big opportunities: Using the small grants mechanism  to build l...
Small grants, big opportunities: Using the small grants mechanism to build l...MEASURE Evaluation
 
Assessing the PEPFAR Global OVC Graduation Benchmarks
Assessing the PEPFAR Global OVC Graduation BenchmarksAssessing the PEPFAR Global OVC Graduation Benchmarks
Assessing the PEPFAR Global OVC Graduation BenchmarksMEASURE Evaluation
 

Tendances (20)

9 Steps to Advocacy Evaluation
9 Steps to Advocacy Evaluation9 Steps to Advocacy Evaluation
9 Steps to Advocacy Evaluation
 
FoME Symposium 2015 | Workshop 8: Current Evaluation Practices and Perspectiv...
FoME Symposium 2015 | Workshop 8: Current Evaluation Practices and Perspectiv...FoME Symposium 2015 | Workshop 8: Current Evaluation Practices and Perspectiv...
FoME Symposium 2015 | Workshop 8: Current Evaluation Practices and Perspectiv...
 
Understanding what works and why in peer and community based programs for HIV...
Understanding what works and why in peer and community based programs for HIV...Understanding what works and why in peer and community based programs for HIV...
Understanding what works and why in peer and community based programs for HIV...
 
Arizona systems
Arizona systemsArizona systems
Arizona systems
 
Terms, Tips, and Trends: Evaluation Essentials for Nonprofits
Terms, Tips, and Trends: Evaluation Essentials for NonprofitsTerms, Tips, and Trends: Evaluation Essentials for Nonprofits
Terms, Tips, and Trends: Evaluation Essentials for Nonprofits
 
Data for Impact: Lessons Learned in Using the Ripple Effects Mapping Method
Data for Impact: Lessons Learned in Using the Ripple Effects Mapping MethodData for Impact: Lessons Learned in Using the Ripple Effects Mapping Method
Data for Impact: Lessons Learned in Using the Ripple Effects Mapping Method
 
RAPID perspective on Research into Use
RAPID perspective on Research into UseRAPID perspective on Research into Use
RAPID perspective on Research into Use
 
Co Inst Eval Presentation 09
Co Inst Eval Presentation 09Co Inst Eval Presentation 09
Co Inst Eval Presentation 09
 
Research impact assessment workshop cara 170507 v140724
Research impact assessment workshop cara 170507 v140724Research impact assessment workshop cara 170507 v140724
Research impact assessment workshop cara 170507 v140724
 
RiPPLE project communication and M&E
RiPPLE project communication and M&ERiPPLE project communication and M&E
RiPPLE project communication and M&E
 
Evaluating conferences and events: new approaches and initiatives
Evaluating conferences and events: new approaches and initiativesEvaluating conferences and events: new approaches and initiatives
Evaluating conferences and events: new approaches and initiatives
 
Framework and Toolkit to Strengthen Evaluation Capacity
Framework and Toolkit to Strengthen Evaluation CapacityFramework and Toolkit to Strengthen Evaluation Capacity
Framework and Toolkit to Strengthen Evaluation Capacity
 
FoME Symposium 2015 | Workshop 9: Story-telling and other New Methods of Eval...
FoME Symposium 2015 | Workshop 9: Story-telling and other New Methods of Eval...FoME Symposium 2015 | Workshop 9: Story-telling and other New Methods of Eval...
FoME Symposium 2015 | Workshop 9: Story-telling and other New Methods of Eval...
 
Using Engineering Methods During Intervention Design to Increase Participant ...
Using Engineering Methods During Intervention Design to Increase Participant ...Using Engineering Methods During Intervention Design to Increase Participant ...
Using Engineering Methods During Intervention Design to Increase Participant ...
 
Monitoring stakeholder engagement
Monitoring stakeholder engagementMonitoring stakeholder engagement
Monitoring stakeholder engagement
 
CR and Sustainability - Research methodology
CR and Sustainability - Research methodologyCR and Sustainability - Research methodology
CR and Sustainability - Research methodology
 
Humanitarian advocacy
Humanitarian advocacyHumanitarian advocacy
Humanitarian advocacy
 
Small grants, big opportunities: Using the small grants mechanism to build l...
Small grants, big opportunities: Using the small grants mechanism  to build l...Small grants, big opportunities: Using the small grants mechanism  to build l...
Small grants, big opportunities: Using the small grants mechanism to build l...
 
Assessing the PEPFAR Global OVC Graduation Benchmarks
Assessing the PEPFAR Global OVC Graduation BenchmarksAssessing the PEPFAR Global OVC Graduation Benchmarks
Assessing the PEPFAR Global OVC Graduation Benchmarks
 
Eawag ke seminar_updated 170317
Eawag ke seminar_updated 170317Eawag ke seminar_updated 170317
Eawag ke seminar_updated 170317
 

Similaire à Setting the Stage Workshop: Complex Global Health Evaluation Framework

Chapter 13 An evaluation framework
Chapter 13 An evaluation frameworkChapter 13 An evaluation framework
Chapter 13 An evaluation frameworkvuongdq93
 
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...Institute of Development Studies
 
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...Institute of Development Studies
 
Ot5101 005 week 5
Ot5101 005 week 5Ot5101 005 week 5
Ot5101 005 week 5stanbridge
 
Human rights presentation at nyu
Human rights presentation at nyuHuman rights presentation at nyu
Human rights presentation at nyuPatrick Mphaka
 
Introduction To Evaluation
Introduction To EvaluationIntroduction To Evaluation
Introduction To EvaluationFacetoFace
 
Doing Better Evaluation with Mixed Methods
Doing Better Evaluation with Mixed MethodsDoing Better Evaluation with Mixed Methods
Doing Better Evaluation with Mixed MethodsWorldFish
 
R.M Evaluation Program complete research.pptx
R.M Evaluation Program complete research.pptxR.M Evaluation Program complete research.pptx
R.M Evaluation Program complete research.pptxtalhaaziz78
 
first-batch-me-training.pptx
first-batch-me-training.pptxfirst-batch-me-training.pptx
first-batch-me-training.pptxMaiwandHoshmand1
 
Evaluation research-resty-samosa
Evaluation research-resty-samosaEvaluation research-resty-samosa
Evaluation research-resty-samosaResty Samosa
 
Better evaluation series cbd139 - frame the boundaries of the evaluation
Better evaluation series   cbd139 - frame the boundaries of the evaluationBetter evaluation series   cbd139 - frame the boundaries of the evaluation
Better evaluation series cbd139 - frame the boundaries of the evaluationIbisconsultingMaroc
 
Workshop: Monitoring, evaluation and impact assessment
Workshop: Monitoring, evaluation and impact assessmentWorkshop: Monitoring, evaluation and impact assessment
Workshop: Monitoring, evaluation and impact assessmentWorldFish
 
FCAS M&E Seminar
FCAS M&E SeminarFCAS M&E Seminar
FCAS M&E SeminarItad Ltd
 
Webinar 6: Selecting strategies and techniques best suited to address barrier...
Webinar 6: Selecting strategies and techniques best suited to address barrier...Webinar 6: Selecting strategies and techniques best suited to address barrier...
Webinar 6: Selecting strategies and techniques best suited to address barrier...Canadian Patient Safety Institute
 
Webinar 5: Identifying barriers and enablers, and determinants, in practice
Webinar 5: Identifying barriers and enablers, and determinants, in practiceWebinar 5: Identifying barriers and enablers, and determinants, in practice
Webinar 5: Identifying barriers and enablers, and determinants, in practiceCanadian Patient Safety Institute
 
How Do We Evaluate That? Evaluation in the Uncontrolled World
How Do We Evaluate That? Evaluation in the Uncontrolled WorldHow Do We Evaluate That? Evaluation in the Uncontrolled World
How Do We Evaluate That? Evaluation in the Uncontrolled WorldMEASURE Evaluation
 
RESEARCH BOTANY.pptx
RESEARCH BOTANY.pptxRESEARCH BOTANY.pptx
RESEARCH BOTANY.pptxSauravMedhi1
 

Similaire à Setting the Stage Workshop: Complex Global Health Evaluation Framework (20)

Chapter 13 An evaluation framework
Chapter 13 An evaluation frameworkChapter 13 An evaluation framework
Chapter 13 An evaluation framework
 
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
 
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
 
Monitoring and Evaluation
Monitoring and EvaluationMonitoring and Evaluation
Monitoring and Evaluation
 
Ot5101 005 week 5
Ot5101 005 week 5Ot5101 005 week 5
Ot5101 005 week 5
 
Human rights presentation at nyu
Human rights presentation at nyuHuman rights presentation at nyu
Human rights presentation at nyu
 
Introduction To Evaluation
Introduction To EvaluationIntroduction To Evaluation
Introduction To Evaluation
 
Doing Better Evaluation with Mixed Methods
Doing Better Evaluation with Mixed MethodsDoing Better Evaluation with Mixed Methods
Doing Better Evaluation with Mixed Methods
 
10 things about_evaluation_final_with_hyperlinks
10 things about_evaluation_final_with_hyperlinks10 things about_evaluation_final_with_hyperlinks
10 things about_evaluation_final_with_hyperlinks
 
R.M Evaluation Program complete research.pptx
R.M Evaluation Program complete research.pptxR.M Evaluation Program complete research.pptx
R.M Evaluation Program complete research.pptx
 
first-batch-me-training.pptx
first-batch-me-training.pptxfirst-batch-me-training.pptx
first-batch-me-training.pptx
 
P process plus sa (1)
P process plus sa (1)P process plus sa (1)
P process plus sa (1)
 
Evaluation research-resty-samosa
Evaluation research-resty-samosaEvaluation research-resty-samosa
Evaluation research-resty-samosa
 
Better evaluation series cbd139 - frame the boundaries of the evaluation
Better evaluation series   cbd139 - frame the boundaries of the evaluationBetter evaluation series   cbd139 - frame the boundaries of the evaluation
Better evaluation series cbd139 - frame the boundaries of the evaluation
 
Workshop: Monitoring, evaluation and impact assessment
Workshop: Monitoring, evaluation and impact assessmentWorkshop: Monitoring, evaluation and impact assessment
Workshop: Monitoring, evaluation and impact assessment
 
FCAS M&E Seminar
FCAS M&E SeminarFCAS M&E Seminar
FCAS M&E Seminar
 
Webinar 6: Selecting strategies and techniques best suited to address barrier...
Webinar 6: Selecting strategies and techniques best suited to address barrier...Webinar 6: Selecting strategies and techniques best suited to address barrier...
Webinar 6: Selecting strategies and techniques best suited to address barrier...
 
Webinar 5: Identifying barriers and enablers, and determinants, in practice
Webinar 5: Identifying barriers and enablers, and determinants, in practiceWebinar 5: Identifying barriers and enablers, and determinants, in practice
Webinar 5: Identifying barriers and enablers, and determinants, in practice
 
How Do We Evaluate That? Evaluation in the Uncontrolled World
How Do We Evaluate That? Evaluation in the Uncontrolled WorldHow Do We Evaluate That? Evaluation in the Uncontrolled World
How Do We Evaluate That? Evaluation in the Uncontrolled World
 
RESEARCH BOTANY.pptx
RESEARCH BOTANY.pptxRESEARCH BOTANY.pptx
RESEARCH BOTANY.pptx
 

Plus de Simon Hearn

BetterEvaluation - Webinar number 2: Define
BetterEvaluation - Webinar number 2: DefineBetterEvaluation - Webinar number 2: Define
BetterEvaluation - Webinar number 2: DefineSimon Hearn
 
How to navigate the maze of evaluation methods and approaches
How to navigate the maze of evaluation methods and approachesHow to navigate the maze of evaluation methods and approaches
How to navigate the maze of evaluation methods and approachesSimon Hearn
 
Online faciliation
Online faciliationOnline faciliation
Online faciliationSimon Hearn
 
Introduction to Outcome Mapping
Introduction to Outcome MappingIntroduction to Outcome Mapping
Introduction to Outcome MappingSimon Hearn
 
OM for policy influencing
OM for policy influencingOM for policy influencing
OM for policy influencingSimon Hearn
 
OM and Complexity
OM and ComplexityOM and Complexity
OM and ComplexitySimon Hearn
 

Plus de Simon Hearn (9)

What is impact?
What is impact?What is impact?
What is impact?
 
BetterEvaluation - Webinar number 2: Define
BetterEvaluation - Webinar number 2: DefineBetterEvaluation - Webinar number 2: Define
BetterEvaluation - Webinar number 2: Define
 
How to navigate the maze of evaluation methods and approaches
How to navigate the maze of evaluation methods and approachesHow to navigate the maze of evaluation methods and approaches
How to navigate the maze of evaluation methods and approaches
 
Online faciliation
Online faciliationOnline faciliation
Online faciliation
 
Roma for seval
Roma for sevalRoma for seval
Roma for seval
 
Introduction to Outcome Mapping
Introduction to Outcome MappingIntroduction to Outcome Mapping
Introduction to Outcome Mapping
 
OM for policy influencing
OM for policy influencingOM for policy influencing
OM for policy influencing
 
OM and Complexity
OM and ComplexityOM and Complexity
OM and Complexity
 
OM Introduction
OM IntroductionOM Introduction
OM Introduction
 

Dernier

8447779800, Low rate Call girls in Saket Delhi NCR
8447779800, Low rate Call girls in Saket Delhi NCR8447779800, Low rate Call girls in Saket Delhi NCR
8447779800, Low rate Call girls in Saket Delhi NCRashishs7044
 
Call Girls In Radisson Blu Hotel New Delhi Paschim Vihar ❤️8860477959 Escorts...
Call Girls In Radisson Blu Hotel New Delhi Paschim Vihar ❤️8860477959 Escorts...Call Girls In Radisson Blu Hotel New Delhi Paschim Vihar ❤️8860477959 Escorts...
Call Girls In Radisson Blu Hotel New Delhi Paschim Vihar ❤️8860477959 Escorts...lizamodels9
 
Flow Your Strategy at Flight Levels Day 2024
Flow Your Strategy at Flight Levels Day 2024Flow Your Strategy at Flight Levels Day 2024
Flow Your Strategy at Flight Levels Day 2024Kirill Klimov
 
Keppel Ltd. 1Q 2024 Business Update Presentation Slides
Keppel Ltd. 1Q 2024 Business Update  Presentation SlidesKeppel Ltd. 1Q 2024 Business Update  Presentation Slides
Keppel Ltd. 1Q 2024 Business Update Presentation SlidesKeppelCorporation
 
Buy gmail accounts.pdf Buy Old Gmail Accounts
Buy gmail accounts.pdf Buy Old Gmail AccountsBuy gmail accounts.pdf Buy Old Gmail Accounts
Buy gmail accounts.pdf Buy Old Gmail AccountsBuy Verified Accounts
 
Pitch Deck Teardown: Geodesic.Life's $500k Pre-seed deck
Pitch Deck Teardown: Geodesic.Life's $500k Pre-seed deckPitch Deck Teardown: Geodesic.Life's $500k Pre-seed deck
Pitch Deck Teardown: Geodesic.Life's $500k Pre-seed deckHajeJanKamps
 
8447779800, Low rate Call girls in Shivaji Enclave Delhi NCR
8447779800, Low rate Call girls in Shivaji Enclave Delhi NCR8447779800, Low rate Call girls in Shivaji Enclave Delhi NCR
8447779800, Low rate Call girls in Shivaji Enclave Delhi NCRashishs7044
 
8447779800, Low rate Call girls in Tughlakabad Delhi NCR
8447779800, Low rate Call girls in Tughlakabad Delhi NCR8447779800, Low rate Call girls in Tughlakabad Delhi NCR
8447779800, Low rate Call girls in Tughlakabad Delhi NCRashishs7044
 
BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,noida100girls
 
NewBase 19 April 2024 Energy News issue - 1717 by Khaled Al Awadi.pdf
NewBase  19 April  2024  Energy News issue - 1717 by Khaled Al Awadi.pdfNewBase  19 April  2024  Energy News issue - 1717 by Khaled Al Awadi.pdf
NewBase 19 April 2024 Energy News issue - 1717 by Khaled Al Awadi.pdfKhaled Al Awadi
 
Global Scenario On Sustainable and Resilient Coconut Industry by Dr. Jelfina...
Global Scenario On Sustainable  and Resilient Coconut Industry by Dr. Jelfina...Global Scenario On Sustainable  and Resilient Coconut Industry by Dr. Jelfina...
Global Scenario On Sustainable and Resilient Coconut Industry by Dr. Jelfina...ictsugar
 
Marketplace and Quality Assurance Presentation - Vincent Chirchir
Marketplace and Quality Assurance Presentation - Vincent ChirchirMarketplace and Quality Assurance Presentation - Vincent Chirchir
Marketplace and Quality Assurance Presentation - Vincent Chirchirictsugar
 
8447779800, Low rate Call girls in New Ashok Nagar Delhi NCR
8447779800, Low rate Call girls in New Ashok Nagar Delhi NCR8447779800, Low rate Call girls in New Ashok Nagar Delhi NCR
8447779800, Low rate Call girls in New Ashok Nagar Delhi NCRashishs7044
 
Contemporary Economic Issues Facing the Filipino Entrepreneur (1).pptx
Contemporary Economic Issues Facing the Filipino Entrepreneur (1).pptxContemporary Economic Issues Facing the Filipino Entrepreneur (1).pptx
Contemporary Economic Issues Facing the Filipino Entrepreneur (1).pptxMarkAnthonyAurellano
 
Call Girls Miyapur 7001305949 all area service COD available Any Time
Call Girls Miyapur 7001305949 all area service COD available Any TimeCall Girls Miyapur 7001305949 all area service COD available Any Time
Call Girls Miyapur 7001305949 all area service COD available Any Timedelhimodelshub1
 
2024 Numerator Consumer Study of Cannabis Usage
2024 Numerator Consumer Study of Cannabis Usage2024 Numerator Consumer Study of Cannabis Usage
2024 Numerator Consumer Study of Cannabis UsageNeil Kimberley
 
Organizational Structure Running A Successful Business
Organizational Structure Running A Successful BusinessOrganizational Structure Running A Successful Business
Organizational Structure Running A Successful BusinessSeta Wicaksana
 
(Best) ENJOY Call Girls in Faridabad Ex | 8377087607
(Best) ENJOY Call Girls in Faridabad Ex | 8377087607(Best) ENJOY Call Girls in Faridabad Ex | 8377087607
(Best) ENJOY Call Girls in Faridabad Ex | 8377087607dollysharma2066
 
/:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In...
/:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In.../:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In...
/:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In...lizamodels9
 

Dernier (20)

Corporate Profile 47Billion Information Technology
Corporate Profile 47Billion Information TechnologyCorporate Profile 47Billion Information Technology
Corporate Profile 47Billion Information Technology
 
8447779800, Low rate Call girls in Saket Delhi NCR
8447779800, Low rate Call girls in Saket Delhi NCR8447779800, Low rate Call girls in Saket Delhi NCR
8447779800, Low rate Call girls in Saket Delhi NCR
 
Call Girls In Radisson Blu Hotel New Delhi Paschim Vihar ❤️8860477959 Escorts...
Call Girls In Radisson Blu Hotel New Delhi Paschim Vihar ❤️8860477959 Escorts...Call Girls In Radisson Blu Hotel New Delhi Paschim Vihar ❤️8860477959 Escorts...
Call Girls In Radisson Blu Hotel New Delhi Paschim Vihar ❤️8860477959 Escorts...
 
Flow Your Strategy at Flight Levels Day 2024
Flow Your Strategy at Flight Levels Day 2024Flow Your Strategy at Flight Levels Day 2024
Flow Your Strategy at Flight Levels Day 2024
 
Keppel Ltd. 1Q 2024 Business Update Presentation Slides
Keppel Ltd. 1Q 2024 Business Update  Presentation SlidesKeppel Ltd. 1Q 2024 Business Update  Presentation Slides
Keppel Ltd. 1Q 2024 Business Update Presentation Slides
 
Buy gmail accounts.pdf Buy Old Gmail Accounts
Buy gmail accounts.pdf Buy Old Gmail AccountsBuy gmail accounts.pdf Buy Old Gmail Accounts
Buy gmail accounts.pdf Buy Old Gmail Accounts
 
Pitch Deck Teardown: Geodesic.Life's $500k Pre-seed deck
Pitch Deck Teardown: Geodesic.Life's $500k Pre-seed deckPitch Deck Teardown: Geodesic.Life's $500k Pre-seed deck
Pitch Deck Teardown: Geodesic.Life's $500k Pre-seed deck
 
8447779800, Low rate Call girls in Shivaji Enclave Delhi NCR
8447779800, Low rate Call girls in Shivaji Enclave Delhi NCR8447779800, Low rate Call girls in Shivaji Enclave Delhi NCR
8447779800, Low rate Call girls in Shivaji Enclave Delhi NCR
 
8447779800, Low rate Call girls in Tughlakabad Delhi NCR
8447779800, Low rate Call girls in Tughlakabad Delhi NCR8447779800, Low rate Call girls in Tughlakabad Delhi NCR
8447779800, Low rate Call girls in Tughlakabad Delhi NCR
 
BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
 
NewBase 19 April 2024 Energy News issue - 1717 by Khaled Al Awadi.pdf
Setting the Stage Workshop: Complex Global Health Evaluation Framework

  • 1. Setting the Stage: Workshop Framing and Crosscutting Issues Simon Hearn, ODI Evaluation Methods for Large-Scale, Complex, Multi-National Global Health Initiatives, Institute of Medicine 7-8 January, London, UK
  • 2. What do we mean by complex interventions? The nature of the intervention: 1. Focus of objectives 2. Governance 3. Consistency of implementation How it works: 4. Necessariness 5. Sufficiency 6. Change trajectory Photo: Les Chatfield - http://www.flickr.com/photos/elsie/
  • 3. What are the challenges of evaluating complex interventions? Describing what is being implemented Getting data about impacts Attributing impacts to a particular programme Photo: Les Chatfield - http://www.flickr.com/photos/elsie/
  • 4. Why a framework is needed Wikipedia: Evaluation Methods
  • 5. Why a framework is needed Image: Simon Kneebone. http://simonkneebone.com/
  • 7. DEFINE what is to be evaluated
  • 8. Why do we need to start with a clear definition? Photo: Hobbies on a Budget / Flickr
  • 9. 1. Develop initial description 2. Develop program theory or logic model 3. Identify potential unintended results
  • 10. Options for representing logic models Pipeline / results chain Logical framework Outcomes hierarchy / theory of change Realist Matrix
  • 11. FRAME what is to be evaluated
  • 12. Frame Decision → Make Decision. Frame Evaluation → Design Evaluation. Source: Hobbies on a Budget / Flickr
  • 13. 1. Identify primary intended users 2. Decide purpose(s) 3. Specify key evaluation questions 4. Determine what ‘success’ looks like
  • 15. 1. Sample 2. Use measures, indicators or metrics 3. Collect and/or retrieve data 4. Manage data 5. Combine qualitative and quantitative data 6. Analyze data 7. Visualize data
  • 16. Combine qualitative and quantitative data Enrich Examine Explain Triangulate Parallel Sequential Component Integrated
  • 19. As a profession, we often either oversimplify causation or we overcomplicate it!
  • 20. “In my opinion, measuring attribution is critical, and we can't do that unless we use control groups to compare them to.” Comment in an expert discussion on The Guardian online, May 2013
  • 21. 1. Check that the results support causal attribution 2. Compare results to the counterfactual 3. Investigate possible alternative explanations
  • 22. SYNTHESIZE data from one or more evaluations
  • 23. Was it good? Did it work? Was it effective? For whom did it work? In what ways did it work? Was it value for money? Was it cost-effective? Did it succeed in terms of the Triple Bottom Line?
  • 24. How do we synthesize diverse evidence about performance? (The slide shows a synthesis table rating example programmes as GOOD, ??, or BAD against criteria: all intended impacts achieved; some intended impacts achieved; no negative impacts; overall synthesis.)
  • 25. 1. Synthesize data from a single evaluation 2. Synthesize data across evaluations 3. Generalize findings
  • 26. REPORT and SUPPORT USE of findings
  • 27. I can honestly say that not a day goes by when we don’t use those evaluations in one way or another
  • 28. 1. Identify reporting requirements 2. Develop reporting media 3. Ensure accessibility 4. Develop recommendations 5. Support use
  • 31. 1. Understand and engage with stakeholders 2. Establish decision making processes 3. Decide who will conduct the evaluation 4. Determine and secure resources 5. Define ethical and quality evaluation standards 6. Document management processes and agreements 7. Develop evaluation plan or framework 8. Review evaluation 9. Develop evaluation capacity
  • 33. Making decisions: look at the type of questions. DESCRIBE – descriptive questions (Was the policy implemented as planned?). UNDERSTAND CAUSES – causal questions (Did the policy change contribute to improved health outcomes?). SYNTHESIZE – synthesis questions (Was the policy overall a success?). REPORT & SUPPORT USE – action questions (What should we do?)
  • 35. Making decisions: create an evaluation matrix mapping each key evaluation question to its planned data sources – participant questionnaire, key informant interviews, observation of program implementation, and project records. KEQ1 What was the quality of implementation? (three sources) KEQ2 To what extent were the program objectives met? (three sources) KEQ3 What other impacts did the program have? (two sources) KEQ4 How could the program be improved? (all four sources)
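The evaluation matrix pairs each key evaluation question (KEQ) with the data sources expected to answer it. As a rough illustrative sketch only – the question texts come from the slide, but the specific source-to-question assignments below are hypothetical, since the original checkmark placement is not fully recoverable – such a matrix can be represented as a simple mapping and checked for gaps:

```python
# Hypothetical sketch of an evaluation matrix: each key evaluation
# question (KEQ) maps to the set of data sources planned to answer it.
# The source-to-question assignments are illustrative, not from the slide.
SOURCES = {
    "questionnaire",  # participant questionnaire
    "interviews",     # key informant interviews
    "observation",    # observation of program implementation
    "records",        # project records
}

MATRIX = {
    "KEQ1: What was the quality of implementation?":
        {"questionnaire", "interviews", "observation"},
    "KEQ2: To what extent were the program objectives met?":
        {"questionnaire", "interviews", "records"},
    "KEQ3: What other impacts did the program have?":
        {"interviews", "records"},
    "KEQ4: How could the program be improved?":
        {"questionnaire", "interviews", "observation", "records"},
}

def uncovered_questions(matrix):
    """Return KEQs with no data source assigned (gaps in the design)."""
    return [keq for keq, sources in matrix.items() if not sources]

def unknown_sources(matrix, known=SOURCES):
    """Return source names not in the agreed list (likely typos)."""
    return sorted({s for srcs in matrix.values() for s in srcs} - known)
```

A design review can then be partly mechanical: an empty result from `uncovered_questions` confirms every question has at least one planned data source, and `unknown_sources` flags misspelled source names before fieldwork begins.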

Editor's notes

  1. Focus – Are the objectives defined or emergent? Involvement – Is there clear governance or is it flexible and shifting? Consistency – Is there a standard delivery or does it adapt in context? Necessariness – Is the intervention the only way to achieve the intended impacts? And is it possible to know all other pathways? Sufficiency – Can the intervention produce the intended impacts by itself? And if not, can we predict the external factors required? Change trajectory – What is the relationship between inputs and outcomes, and how does it change over time?
  2. We’re here to discuss the practice of designing and conducting particular types of evaluations: mixed-method evaluations of large, complex interventions. But what makes these different? What do we especially need to pay attention to? I want to suggest three challenges that we might like to reflect on during the sessions in these two days. Perhaps your main challenge with these types of evaluations is being able to describe what is being implemented, when it varies across multiple sites, programme components and over time? Developing a meaningful programme theory for a complex intervention is no mean feat. Perhaps your main challenge is being able to get data about what impacts have been achieved, given time lags before they become evident? Our tried and tested toolkits are being stretched to their limits. Or maybe it’s being able to attribute the impacts to a particular program, or the activities of particular partners, given the multiple factors affecting impacts? Perhaps where statistically valid counterfactuals are not possible.
  3. There are a vast number of evaluation methods in the toolboxes of evaluators, researchers and managers. However, when development agencies, NGOs or in-country governments start to plan an evaluation, or to engage an evaluator, it is difficult to find accessible, comprehensive and credible information about how to choose the right combination of evaluation methods and how to implement them well – or how to manage these issues with a potential evaluator. Many guides to evaluation focus on only a few types of data collection methods and a few types of research designs. Usually missing are newer types of data collection (such as GIS logging on mobile phones, or tracking social media mentions), designs and strategies that can be used when experimental or quasi-experimental research designs are not feasible or appropriate, and methods to address issues such as clarifying and negotiating the values that should underpin the evaluation. There is information about some of these available on various websites, but it can be difficult to locate, of varying quality, and not co-ordinated. So we are either bamboozled by the massive range of options or…
  4. We stick to what we know, which can have alarming consequences, especially for complex evaluations where you really have to examine the nut you are dealing with before deciding on the tool to use.
  5. What we have found useful is to distinguish between different tasks in evaluation, which need different types of methods or processes. We have developed the Rainbow Framework, which sets out seven clusters of tasks. An evaluation starts by engaging stakeholders – those people who need to be involved in the decision making about an evaluation. This is part of MANAGING an evaluation. The next step is to describe the program – setting the boundaries and articulating its theory of change. This is part of DEFINING what is to be evaluated. The next step is to focus the evaluation design. We have found it useful to separate this into two clusters of tasks. FRAMING the evaluation sets the boundaries of the evaluation – making decisions about its purposes, the primary intended users, the key evaluation questions, and the money, time and other resources available. Then the design of the evaluation sets out how data will be collected and analysed to answer three different types of questions – questions that require DESCRIPTION of what has happened, questions that require methods so we can UNDERSTAND the CAUSES of what has happened, and questions about the overall SYNTHESIS of the value of the program. Finally there are methods and processes to report and support use of findings. While this process often comes at the end of the evaluation, it should be planned from the beginning.
  6. Developing an agreed definition of what is being evaluated is important to establish common understanding and to identify any disagreements or gaps which can lead to problems later. What is it that we are evaluating – the apples, the oranges, or the whole basket? For complex interventions, a good description identifies whether there are multiple components, multiple levels of activity, multiple stakeholders and multiple intended impacts – and whether there are emergent aspects, such as strategies and goals that will emerge as a better understanding of the situation and possibilities is developed.
  7. There are three important tasks involved in defining what is to be evaluated: to develop an initial description – what is to be evaluated?; to develop a program theory or logic model – what are the intended outcomes and impacts, and how will the program help to bring them about?; and to identify potential unintended results – so the evaluation can address these as well and take them into account in the overall judgement of success.
  8. Logic models, or theories of change, are increasingly popular in evaluation. But many of them are little more than a list of program activities and intended impacts with arrows in between. A good logic model actually has a theory of change – how is change understood to come about? – and a theory of action – how will the program activate this theory of change? For complex interventions, a simple linear pipeline, results chain or logical framework is unlikely to be sufficient to show the layers of activities and impacts, the multiple contributors and the emergent elements. Instead, outcomes hierarchies or realist matrices are more likely to be useful.
  9. The second cluster of tasks in an evaluation relates to framing the evaluation.
  10. The idea of framing an evaluation draws on research into decision making, which found that effective decision makers don’t rush to make the decision, but stop to frame it. They make sure they are clear about what needs to be taken into account, including who needs to be involved in making the decision, and when it is needed. With evaluation, there is often pressure to rush to design an evaluation – to choose metrics or a research design... But it is important to frame it, to be clear about what needs to be taken into account when designing it – and in particular to be clear about the intended uses of the evaluation.
  11. There are four tasks in framing that need to be addressed before you can design an evaluation. The first two are related – who is this evaluation for? And how are they going to use it? These two tasks are interconnected, and where you start will vary. Sometimes the primary intended users have already been identified, and the next task is to identify how they intend to use the evaluation. Other times the purpose of the evaluation has been identified and the next task is to identify the primary intended users. For example, if an evaluation is being conducted to improve how a program is working, we should identify who would make decisions about changing the way things are done, and who would therefore be the intended users. From this comes the third task – specifying key evaluation questions: descriptive, causal, synthesis, action. And the fourth is about values – determining what success looks like: processes, outcomes, costs and benefits; standards and criteria. For complex interventions, with multiple contributors, there are likely to be multiple primary intended users of an evaluation with different purposes. They may have different key evaluation questions and different ideas about what ‘success’ will look like. They might decide to undertake separate evaluations, or to develop a joint evaluation which acknowledges their different information needs and values.
  12. Combining qualitative and quantitative data – a topic very pertinent to this workshop, and very much a juggling act. Be very clear about purpose – why are you combining? Enrich quantitative data with qualitative. Examine a hypothesis from qualitative data with quantitative. Explain unexpected quantitative results with qualitative data. Triangulate results to verify or reject them. When gathering data: parallel data gathering – gathering qualitative and quantitative data at the same time; sequential data gathering – gathering one type of data first and then using this to inform the collection of the other type. When combining data: component design – collecting data independently and then combining at the end for interpretation and conclusions; integrated design – combining different options during the conduct of the evaluation to provide more insightful understandings.
  13. The definitions of outcomes and impacts differ across sectors, but they generally refer to the same thing: things that happen (to people, communities, the environment, etc.) as a result of the project, program, policy, or whatever you are evaluating. If we’re not attempting to understand what caused the outcomes or impacts, then all we are doing is documenting coincidences, and we can’t really call them outcomes or impacts at all. In all outcome and impact evaluation we must deal with causation.
  14. I think we can, as a profession, often fall into two traps: massively oversimplifying causation, and massively overcomplicating it. On oversimplification, there’s often an implicit assumption that if the expected changes occur, they must have been caused by the program. So the evaluation just documents those and calls them outcomes, without actually checking that they are. But we also have a tendency to overcomplicate. A lot of people seem to think that the only way to infer causation is with elaborate experimental designs, very large datasets, and sophisticated statistical techniques. And if you can’t do that, then you just throw your hands in the air and say, well, we can’t really be sure that any of these were influenced by the program. Experimental designs can indeed be very powerful, and the BetterEvaluation site has some good advice on when to choose this option and how to use it well. But experimental designs are not the be-all and end-all of causal inference. The BetterEvaluation site has several non-experimental options which can offer sufficient rigour with a straightforward design.
  15. Synthesis is part of the BetterEvaluation framework. This cluster looks at ways of answering synthesis questions – the overall evaluative questions that bring together data and values to make evaluative judgements about whether something is working well or not. Some level of synthesis is essential for all evaluations, yet it is often missed out of evaluation designs.
  16. The synthesis questions would be our main focus. We would want to ask questions such as “Was it good? Did it work? Was it effective?”. In many evaluations, just looking at the average effect would not be enough. If we are concerned about equity, we would want to know “For whom did it work?”. If it worked in terms of some criteria but not others, we’d want to know this too. And in many cases we’d want to relate this performance data to costs. Was it value for money? Was it cost-effective? Did it succeed in terms of the Triple Bottom Line – social costs and benefits, economic costs and benefits, and environmental costs and benefits? To answer these questions we need to draw on the answers to other types of questions – value questions about what success looks like, descriptive questions about what has happened, and causal questions about what has caused things to happen – but we need more than this. We need synthesis methods.
  17. As well as synthesizing data from a single evaluation, we can also synthesize data across evaluations, which can help answer questions like: does this type of intervention work? Which variations are more effective, and when?
  18. How can evaluation make a difference? How do you move from a situation where evaluations are published but just sit on the shelf – or worse, are misused as they are in this cartoon, where evaluation reports are used to prop the desk up and hold the window open – to a situation where they contribute to improved social conditions? That, I think, is what most of us believe evaluation should do.
  19. One of my favourite infographics is from the 1850s, by Florence Nightingale, who as well as being the grandmother of modern nursing was a keen statistician. She drew this graphic depicting the causes of death in the British Army in 1854. The graph shows that by far the biggest killer was preventable disease, not battle wounds as was previously thought. The data was so compelling that it contributed to a policy to improve conditions in all military hospitals. Pretty impressive for a diagram.
  20. The tasks in this cluster are relevant throughout the entire process of the evaluation, through each of the previous tasks – which is why we chose to brand it as white, like the light in a prism.
  21. BetterEvaluation breaks the management of an evaluation into nine key tasks. First, the manager of an evaluation needs to understand and engage with stakeholders, establish decision-making processes, decide on the contributors to the evaluation, determine and secure resources, define the ethical and quality evaluation standards to be employed, document the process, and ensure there is a clear evaluation plan or framework within which the evaluation rests. The last two tasks are the hallmark of well-managed, quality evaluations: evaluations that evaluate themselves against these tasks, and that also serve to build evaluation capacity.
  22. I hope you find the framework useful in our discussions here today – and in your work afterwards. I have handouts which provide two versions of the framework: an overview of the tasks, and a detailed version which lists the different methods and processes you can use for each task.
  23. Your evaluation plan will specify how you will collect and analyse data. It is useful to plan your data collection and analysis around a few key evaluation questions. These are high-level questions that the evaluation is intended to answer (for example, "How effective was the program?"). You will need different types of options for different types of evaluation questions.
  24. For each evaluation task you will find a range of options, each with its own advantages and disadvantages. On each option page, you will find advice about when it might be appropriate to choose that option, taking into account available time, expertise and other issues.
  25. www.betterevaluation.org