This document discusses social impact measurement and performance management. It presents frameworks for measuring social impact at different levels, from individual programmes to entire fields, and emphasises using data and feedback loops to continually evaluate and improve social services. Performance is measured through outcomes, benchmarks, and constituent voice. The goal is to understand what works, and for whom, and to drive better outcomes and collaboration.
CKX: Social Impact Measurement Around the Globe
1.
2. SOCIAL IMPACT MEASUREMENT
• Demonstrating that change is caused by an intervention
• Telling a story of what you do and why it matters
• Counting the amount or proportion of social change occurring
• Using data to make decisions
• Performance management: continuous evaluation and improvement
3. (Tris Lumley, NPC, April 2014) Three levels of data across the lifecycle of a service:
• FIELD: impact evidence
• ORGS: managerial / administrative data
• PEOPLE: constituent voice

BEFORE
• What are your priorities / needs / desires? → survey panel (constituent voice / feedback)
• What do we know? → field-level evidence / 'what works'
• What should our priorities be? → field-level priorities
• What should we do? → programme design

DURING
• How are we doing? What should we change? → performance management
• How are we doing compared to others? → shared measurement / benchmarks
• How are we doing collectively? What do we need to change? → data labs
• How should we collaborate? → field-level collaboration, pooled funding, recommended practices
• How can we serve you better? → constituent voice / feedback

AFTER
• How did we do? Where did we fail? How do we share what we've learned? → summative evaluation
• What have we learned that will improve services? What works for whom, in what circumstances? → impact evaluation
• How can we serve you better next time? → constituent voice / feedback
11. Respondents by sector:
• Nonprofit, NGO or community organisation but not charity: 19%
• Charity: 17%
• Consultancy: 15%
• Public sector: 14%
• Social enterprise: 12%
• Private sector: 10%
• Academic institution: 10%
• Network, association or membership organisation: 3%
• International mining company: 0%
12. Responses to the question "I can identify measures of social impact relating to a program or organisation", rated on a sliding scale where 5 is most expert, split between "I do it for my program or organisation" and "I do it for other programs or organisations". Sectors covered: social enterprise; private sector; nonprofit, NGO or community organisation but not charity; public sector; consultancy; charity; academic institution; network, association or membership organisation. [Bar chart of scores by sector.]
15. Measurement methods and tools in use (number of mentions):
• SROI: 47
• Surveys: 14
• Interviews: 8
• Outcomes Star: 7
• Theory of Change: 7
• Bespoke methods/tools: 7
• Data analysis: 7
• Logic Model: 6
• Before and after comparison: 6
• Results Based Accountability: 5
• Cost-Benefit Analysis: 4
• London Benchmarking Group: 3
• Outcomes-based evaluation: 3
• Warwick Edinburgh Scale: 3
• HACT Social Value Tool: 3
• Focus Groups: 3
• Validated scales: 3
• Administrative data: 2
• Outcome Mapping: 2
• External evaluation: 2
• de Bono's Thinking tools: 2
• Results Framework: 2
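Counts like those above come from tallying a multi-select survey question. A minimal sketch with invented responses (the method names are taken from the list above, the respondents are made up):

```python
from collections import Counter

# Hypothetical multi-select survey responses: each respondent lists
# the measurement methods they use.
responses = [
    ["SROI", "Surveys"],
    ["SROI", "Theory of Change"],
    ["Surveys", "Interviews", "SROI"],
]

# Tally how many respondents mention each method.
counts = Counter(method for answer in responses for method in answer)

# Report most-used first, as in the chart.
for method, n in counts.most_common():
    print(method, n)
```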
22. Public attitudes to uses of data (% saying it should happen / should not happen):
• Hospitals and GPs sharing records relating to care: 72% / 26%
• Tax and benefit records to catch fraud: 77% / 25%
• Pharmaceutical companies sharing with academic researchers: 53% / 24%
• Email and internet search traffic monitoring to identify terrorists: 57% / 15%
• GP health records shared with researchers: 57% / 12%
• Energy companies using data to predict energy needs: 38% / 48%
• Health records shared with private healthcare companies: 24% / 45%
• Technology companies monitoring searches for a flu epidemic: 32% / 42%
• Online retailers using data to target advertisements: 5% / 84%
• Health records sold to private healthcare companies: 13% / 71%
(Royal Statistical Society UK, 2014)
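Survey results like these are often summarised as net support: the share saying a use of data should happen minus the share saying it should not. A minimal sketch with two illustrative scenario rows (the pairing of figures to scenarios is my reading of the chart, not a published table):

```python
# Each scenario maps to (% "should happen", % "should not happen").
# Figures are my reading of the chart above, used only for illustration.
scenarios = {
    "Tax and benefit records to catch fraud": (77, 25),
    "Online retailers to target advertisements": (5, 84),
}

# Net support = support minus opposition, in percentage points.
net = {name: should - should_not
       for name, (should, should_not) in scenarios.items()}

# Rank scenarios from most to least supported.
ranked = sorted(net.items(), key=lambda kv: kv[1], reverse=True)
for name, score in ranked:
    print(f"{name}: {score:+d}")
```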
23. Delivering services to people that need them without a performance management system is like driving a car with no instruments on the dashboard, and with the windscreen obscured by ice. You know where you want to go, but you have no idea if you're getting there.
24. Service delivery → Data: daily data entry onto a shared IT system → Analysis: analyst support to collate and analyse data in order to produce data dashboards → Review meetings: regular meetings to discuss dashboards, resource allocation and service improvement.
1. Consent for data collection and use
2. Constructing a shared case management IT system
3. Building capacity of all service delivery workers to collect and use data
4. Performance analysts
5. Feedback loops and reporting
6. Adjusting services in response
(adapted from Social Finance)
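The Data → Analysis → Review loop above is, at its core, an aggregation of daily case records into a dashboard. A minimal sketch in Python, with made-up field names and data (nothing here reflects Social Finance's actual system):

```python
from collections import defaultdict
from datetime import date

# Hypothetical daily case-management records entered by service workers.
records = [
    {"worker": "A", "date": date(2014, 4, 1), "outcome_met": True},
    {"worker": "A", "date": date(2014, 4, 2), "outcome_met": False},
    {"worker": "B", "date": date(2014, 4, 1), "outcome_met": True},
]

# Collate records into a per-worker dashboard row: caseload and outcomes met.
dashboard = defaultdict(lambda: {"cases": 0, "met": 0})
for r in records:
    row = dashboard[r["worker"]]
    row["cases"] += 1
    row["met"] += r["outcome_met"]  # bools add as 0/1

# Print the dashboard: worker, caseload, success rate.
for worker, row in sorted(dashboard.items()):
    rate = row["met"] / row["cases"]
    print(worker, row["cases"], f"{rate:.0%}")
```

In a real deployment the records would come from the shared case-management system, and the dashboard would feed the regular review meetings rather than stdout.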
27. Pay for performance (also called pay for success, payment by results, performance-based contracting, or outcomes-based contracting) can define outcomes in two ways:

ATTRIBUTION / COHORT — comparing intervention participants to a control group:
• Peterborough: ex-offenders
• Massachusetts: high-risk young men
• New York State: employment for ex-offenders
• New South Wales Benevolent Society Social Benefit Bond: families with children in care

CONTRIBUTION / INDIVIDUAL — count or proportion of social change among participants:
• UK Department of Work and Pensions Innovation Fund: disadvantaged young people
• UK It's All About Me: adoption
• Manchester: children in care
• Granite School District: early childhood education
• Saskatchewan: single mothers
SIAA currently has six Country Impact Groups established in Austria, Bulgaria, Canada, Estonia, Hungary and Portugal, and is in conversation with eight other countries about setting up groups, including France, Ireland and Turkey.
There is more support for data-sharing within government "for the benefit of services and me", with varying safeguards, than for not sharing data at all because of privacy risks. Adding safeguards such as anonymisation of data, or punishment for data misuse, significantly improves the level of support, from 33% to around 51%.
When asked to choose between a positive and a negative statement with safeguards included, 49-55% were positive (agreeing that we should share all the data we can) and 28-34% were negative (agreeing that we should not share the data due to privacy risks). When asked without any reassurance on safeguards, only 33% chose the positive statement.
For programs that are implementing an established model, performance management is partly about ensuring adherence to practices that have been proven effective.
For programs that are developing over time, performance management is an essential part of understanding and adapting to the needs of individual clients.
The original Scared Straight! documentary won an Academy Award in 1978; its broadcast marked the first time the words "fuck" and "shit" were aired on many networks, and it began a movement of similar programmes.
In 1982 and 1983, two studies using the above evidence were published.
On January 13, 2011, A&E introduced the new series Beyond Scared Straight.
On August 18, 2011, the Disney-owned A&E premiered the second season of Beyond Scared Straight, once again in the midst of controversy. Joe Vignati, director of Justice Programs at the Governor's Office for Children and Families in Georgia, writes at the Juvenile Justice Information Exchange: "After becoming the highest rated program in the history of the Disney-owned (sic) A&E network, a new season of this 'reality' show returns to titillate the curious and misinformed."[6] Also, in light of the Speziale case, the Campaign for Youth Justice has petitioned A&E to cancel Beyond Scared Straight, as they claim that the show promotes "the spread of a noxious program" and may be in violation of federal law and standards set forth by the Office of Juvenile Justice and Delinquency Prevention (OJJDP).[5]
A third season of Beyond Scared Straight debuted on Monday, August 20, 2012 on A&E.
A fourth season of Beyond Scared Straight debuted on Thursday, October 3, 2013 on A&E. A&E renewed Beyond Scared Straight for a seventh season, which was to debut on Thursday, June 26, 2014.
Criticism
A 2002 meta-analysis (a study that combines the results of many studies in order to see the whole picture) of the results of a number of scared straight and similar programs found that they actively increased crime rates, leading to higher reoffence rates than in control groups that did not receive the intervention.[7]
Two Justice Department officials have written an op-ed piece describing scared straight programs as "not only ineffective but is potentially harmful" to the kids involved. The op-ed appears in the February 1, 2011 edition of the Baltimore Sun, written by OJJDP Acting Administrator Jeff Slowikowski and Laurie O. Robinson. They say that, "when it comes to our children," policymakers and parents should "follow evidence, not anecdote."[8]
In 2004 the Washington State Institute for Public Policy estimated that each dollar spent on Scared Straight programs incurred costs of $203.51.[9]
Please feel free to get in touch via my email, shown in white. The web address at the top is my blog, and my Twitter account is at the bottom.