2. Mission Australia is client focused. We strive for client-informed services that achieve positive outcomes with individuals and communities.
3. “How do we know what we’re doing is helping people?” and “how do we measure that impact and aggregate the results to show how much difference the organisation is making?”
4. Outputs are the products, services or facilities that result from an
organisation’s or project’s activities.
Outcomes are the changes, benefits, learning or other effects that result
from what the project or organisation makes, offers or provides.
Impact is the broader or longer-term effects of a project’s or organisation’s
outputs, outcomes and activities.
Outputs, outcomes and impact
5. Our approach
Our logic model links inputs, outputs and activities to outcomes, and outcomes to impact.
Organisational strategies (inputs, outputs and activities):
• Early intervention and prevention
• Advocacy and client voice
• Evidence-based best practice and case management
Outcomes:
• Self-efficacy and optimism
• Confidence and fulfilment
• Participation and inclusion
Impact:
• Independence
Evaluation methods:
• Input and output measurement using existing administrative systems
• Development of indicators
• Consultation with service and organisational staff
• Lessons from previous pilots
• MA Client Wellbeing Survey pilot
• Data analysis and reflection on the Theory of Change and Practice
• Reporting back to stakeholders and practitioners
7. Outcome measures must:
• Be externally credible
• Be used with the external population (to provide comparison data)
• Have communication power, internally and externally
• Demonstrate program effectiveness to funders and donors
Principles
8. 1. Outcome hierarchies developed and implemented
2. Indicators and collection methodologies developed
3. Consultation and approval with staff across the organisation
4. MA Client Wellbeing Survey pilot developed and piloted in two services:
PHaMs and JSA Stream 4:
• The Personal Helpers and Mentors service (PHaMs) is a non‐clinical
recovery‐focussed support service for people experiencing
mental health issues.
• Jobs Services Australia, now JobActive, was the Australian
Government’s employment program. Stream 4 clients were those with
multiple and complex needs.
Stages
9. Stages of the pilot:
1. Processes for data collection developed
2. Ethics, consent and protocols developed
3. Sampling framework – all new clients in a set period
4. Training – face to face and webinars
5. Repeat measures data collection implemented
6. Data analysis and reporting
Implementing impact measurement
11. Mission Australia Client Wellbeing Survey
Captures subjective wellbeing through a
combination of measures
• Personal Wellbeing Index (PWI)
• Single item measures on:
• Housing
• Financial security
• Health
• Self-esteem and control
• Demographics:
• Collected via MA systems
• Age, gender, ATSI status
• Service interventions
12. PWI is the foundation of the survey.
It measures subjective wellbeing by asking
about satisfaction in:
• Standard of living
• Health
• Achieving in life
• Relationships
• Safety
• Community
• Future security
• Life as a whole
Mission Australia Client Wellbeing Survey
14. Participants at intake: PHaMs and JSA
Many of our clients have very low levels of personal wellbeing.
• PHaMs: 94% high risk (PWI 0-50), 6% challenged (PWI 51-69)
• JSA: 50% high risk (PWI 0-50), 25% challenged (PWI 51-69), 25% normal range (PWI 70-100)
15. PHaMs participants
MA saw significant improvements for the PHaMs clients across a range of
domains and self-reported measures after eight months in the program.
Personal Wellbeing Index (0-100): Wave 1 = 35, Wave 2 = 48, Wave 3 = 50
17. PHaMs participants
[Chart: proportion of clients self-reporting poor outcomes at Waves 1, 2 and 3, falling over time on each measure, from as high as 82% at Wave 1 to as low as 18% by Wave 3]
• Control over own life: has less than mixed control
• Housing adequacy: is not adequate
• Coping: not at all or a little
• Not enough money to meet needs
19. JSA participants
Significant improvements were not seen in the wellbeing of
JSA Stream 4 job seekers.
Personal Wellbeing Index (0-100) remained stable across Waves 1 to 3 (scores between 51 and 55).
21. Mission Australia’s Client Wellbeing Survey has been
validated through the pilot as a useful tool to measure and inform
client journeys to independence in programs where
clients receive wrap-around intensive supports, such
as the PHaMs program.
The findings of the pilot indicated that where a
program has a narrow focus and a constrained delivery
model, such as the Job Services Australia model, the
tool is not well suited to capturing outcomes.
22. 1. Consider what is already known to
develop a sound framework
2. Gain management endorsement and
collaborate
3. Use repeat measures
Key lessons about measuring impact
23. Informing policy & practice
• Continuous quality improvement in our services.
• Where site results were reported and reflective practice undertaken,
service improvements have been identified and implemented.
• The pilot will be continued in selected sites to inform the design of a
model for the roll out of the NDIS.
• To inform a review of an outcomes measurement framework across MA
for reducing homelessness and strengthening communities.
• As an iterative process with government in tender and program design.
How Mission Australia uses outcome measurement
24. How Mission Australia uses outcome measurement
Reflecting on practice
PHaMs sites reflected on their practice. They were provided with site specific
outcome measures and questions to help them reflect on what the outcome
measures mean for practice:
• Are our clients better off?
• If not, what could be the reasons and are any of them reasons that we
can influence with our practice?
25. Feedback from the field:
“By having regular discussions the pilot has enabled us to identify:
• How we can continue to increase support for participants,
carers and families,
• How we can improve on group activities, external factors
which may be influencing participants’ daily lives etc.
As a team our purpose is to assist participants in pursuing the
best outcomes possible towards their recovery, as such being
able to participate in the Client Impact Pilot has been a valuable
experience.”
PHaMs Manager
26. • There is no common set of indicators being used across
Commonwealth and State Governments for outcomes-based funding
and program delivery.
• The modified PWI approach was adopted to measure our impact on
client independence. Future impact evaluation needs to measure
outcomes against MA’s current goals of reducing homelessness and
strengthening communities across our diverse programs and
services.
• Developing outcomes measurement systems and requisite IT
platforms requires staff capacity and resources.
Challenges
27. Outcome measurement is
important. It shows us and our
stakeholders how we are
achieving client outcomes.
It assists us to continually
reflect and improve.
Mission Australia is a national community service organisation.
We have been helping people to regain their independence for over 155 years.
Our goal is to reduce homelessness and strengthen communities.
We take a client-focused approach to this work.
MA believes a good quality of life is closely associated with independence, which includes the ability to achieve individual potential and the opportunity for social and economic participation.
“How do we know if what we’re doing is helping people?”
It’s the question community organisations seeking to help clients change their lives have to ask themselves every day. This is quickly followed by, “how do we measure that impact and aggregate the results to show how much difference the whole organisation is making?”
As part of grappling with these questions Mission Australia commenced an impact measurement pilot to better understand how our services contributed to clients’ journeys towards independence. The pilot was designed to inform funders and donors on program outcomes and MA on the effectiveness of its services.
The aim of the pilot was to enhance our understanding of the role of MA’s interventions (our outputs) in improving self-efficacy and optimism, confidence and fulfilment, and participation and inclusion (the outcomes we aim to achieve), and ultimately to measure clients’ journeys towards independence (the impact we hope to have).
So we wanted to better understand the role of interventions in improving wellbeing, and ultimately to measure clients’ journeys towards independence.
To develop a stronger understanding of what independence means for clients, we reflected on MA’s strategic goals, program logics and service models.
We found that self-efficacy and optimism, confidence and fulfilment, and participation and inclusion were common themes in MA’s rationales for what we do.
These contribute to our theory of change, which articulates the desired impact we hope our clients achieve through our services. It sets out the projected outcomes, while spelling out the underlying assumptions of how we work towards helping people get back on their feet.
Having identified these assumptions, we then work out how change will be manifested and articulate short, medium and long term outcomes.
Outcomes are achieved through our inputs and program outputs, which are, broadly speaking: early intervention and prevention, advocacy and client voice, evidence based practice and a case management approach.
From the theory of change and outcomes hierarchies, measurable indicators were developed that can be used to quantify client progress.
MA undertook an extensive literature review of both national and international indicators and measures which mapped against program specific outcomes.
During 2011/12 MA piloted over 100 outcome measures from sources such as the Australian Bureau of Statistics (ABS), Household Income and Labour Dynamics in Australia Survey (HILDA) and World Health Organisation (WHO), and the Personal Wellbeing Index (PWI) to get an understanding of which measures were effective in capturing change and which could be easily implemented and analysed.
From this experience we learnt that we needed to keep the process of capturing the data simple for both staff and clients: the administration had to be easy for the staff to understand; the tool easy for clients to complete; and the measurements meaningful for the service providers.
It was important that normative and comparative data be available for these measures to help service delivery staff and stakeholders ‘make sense’ of results.
This is particularly important when demonstrating program effectiveness to funders and donors.
Mission Australia developed a logical framework approach to our understanding of outcomes, as well as a methodology to measure these. Our services use program logics which map expected behavioural change at the program level.
Measurable indicators were developed to quantify client progress. This included international and national review; matching to outcomes frameworks; piloting with staff and clients; review and revision.
Consultation with teams across MA including: program managers, case workers, service design and delivery staff, business intelligence, and clients.
Approval by the Board’s Service Impact committee and executive leaders.
MA Client Wellbeing Survey developed and piloted in two services – PHaMs and JSA.
Processes included using online and paper surveys (which took no more than 10 minutes to be self completed by the client) as well as existing administrative systems.
The pilot underwent an organisational ethical review. Information sheets and consent forms allowed clients to provide informed consent. Client participation was voluntary, and data was de-identified at collection; no identifiable data was made available.
The sample included:
PHaMs: All new clients entering services between May and July 2014
JSA: all new Stream 4 clients, or newly placed (employment/education/training) clients, during the same time period.
Training was provided to all staff responsible for the data collection.
PHaMs: consultation occurred at the site level via program staff that validated measures and sought input from clients.
JSA: comprehensive data already collected for service reporting was deemed sufficient.
Collection:
PHaMs: Program managers were responsible for implementation. Data was collected by Program Managers and Case Workers
JSA: Employment Advisors offered the survey to eligible jobseekers.
Data captured from the client surveys, caseworker surveys and administrative sources were combined on the basis of service client ID. Wave-on-wave comparisons were made on data gathered from the same matched set of clients.
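The matched, wave-on-wave comparison described above can be sketched as follows. This is a minimal illustration only; the field names and toy records are assumptions, not MA's actual schema or data.

```python
# Minimal sketch of a matched wave-on-wave comparison: only clients who
# completed the survey in BOTH waves (matched on client ID) are compared.
# Field names and records are illustrative, not MA's actual schema.

def matched_wave_change(records, wave_a, wave_b):
    """Average PWI change between two waves for the matched client set."""
    # Index responses as {client_id: {wave: pwi_score}}
    by_client = {}
    for r in records:
        by_client.setdefault(r["client_id"], {})[r["wave"]] = r["pwi"]

    # Keep only clients present in both waves, then average the change
    changes = [
        waves[wave_b] - waves[wave_a]
        for waves in by_client.values()
        if wave_a in waves and wave_b in waves
    ]
    return sum(changes) / len(changes) if changes else None

records = [
    {"client_id": "A01", "wave": 1, "pwi": 30},
    {"client_id": "A01", "wave": 2, "pwi": 45},
    {"client_id": "B02", "wave": 1, "pwi": 40},
    {"client_id": "B02", "wave": 2, "pwi": 50},
    {"client_id": "C03", "wave": 1, "pwi": 35},  # no Wave 2 response, so excluded
]

print(matched_wave_change(records, 1, 2))  # average change over matched clients
```

Matching before comparing means attrition (clients like C03 who drop out) cannot distort the wave-on-wave averages.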
MA used the Personal Wellbeing Index (PWI) as the foundation of measuring client outcomes (more about this on next slide).
The survey tool also included some single item questions to ensure that all domains in MA’s outcomes hierarchy were properly covered. To achieve this we added questions from the HILDA survey, the Longitudinal Study of Australian Children (LSAC) and the WHO Quality of Life (WHOQOL) instrument to monitor the progress of our clients.
PHaMs program managers took part in defining appropriate performance measures as part of a training workshop. The measures developed were shared with other PHaMs staff to validate and shortlist. As well as being validated by case workers, the measures were validated by peer workers and clients. A questionnaire for case workers was developed to capture data on any remaining measures.
Administrative systems were used to match the responses with the demographic profile of the clients, and JSA performance measures.
The PWI was produced by Deakin University’s Australian Centre on Quality of Life and Australian Unity.
It has been used in Australia for the past 12 years and internationally in over 50 countries. Over this time it has been shown to be a valid and reliable measure of subjective wellbeing.
Subjective wellbeing doesn’t fluctuate or change much over time. This is because people tend to ‘self-regulate’ their wellbeing: a solid level of wellbeing is something we all tend to seek. In other words, we try to manage it so that it is stable.
If circumstances are particularly difficult (i.e. if someone doesn’t have housing, is being abused, etc.) then that self-regulation is more difficult and wellbeing may decrease.
The PWI is particularly sensitive, so if we address such circumstances we could reasonably expect it to register changes in wellbeing.
Survey participants rate their satisfaction with these domains on a scale of 0-10. The scores on these seven domains are averaged into a 0-100 scale, where 0 is completely dissatisfied and 100 is completely satisfied.
The following guidelines are given by the developers of the PWI:
70+ points: ‘Normal’: A person is likely to be experiencing a normal level of wellbeing.
51-69 points: ‘Challenged’: Personal wellbeing is likely to be challenged / compromised.
0-50 points: ‘High risk’: Very low personal wellbeing / strong likelihood of depression.
In Australia the average PWI is around 75
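The scoring and banding just described can be sketched in a few lines. This is an illustrative Python fragment; the function names and sample ratings are hypothetical, while the thresholds follow the published guidelines quoted above.

```python
# Sketch of PWI scoring and banding (assumed function names; thresholds
# follow the developers' published guidelines quoted in the text).

def pwi_score(domain_ratings):
    """Average seven 0-10 domain satisfaction ratings onto a 0-100 scale."""
    assert len(domain_ratings) == 7
    return sum(r * 10 for r in domain_ratings) / 7

def pwi_band(score):
    """Classify a 0-100 PWI score into the guideline bands."""
    if score >= 70:
        return "normal"
    if score >= 51:
        return "challenged"
    return "high risk"

# A hypothetical respondent's satisfaction with: standard of living, health,
# achieving in life, relationships, safety, community, future security
ratings = [4, 3, 2, 5, 4, 3, 2]
score = pwi_score(ratings)
print(round(score, 1), pwi_band(score))
```

A respondent rating most domains around 3 or 4 out of 10 lands well inside the high-risk band, which is where almost all PHaMs participants started.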
Almost all PHaMs participants had a PWI score which placed them in the ‘high risk’ category.
Half of JSA participants had a PWI which placed them in the ‘high risk’ category, and a further quarter had a PWI score which placed them in the ‘challenged’ category.
Clients also reported very low levels across the other measures of self-esteem, control and coping.
Clients also reported poorly against health, housing and finances.
Domains where PHaMs clients showed the most improvement were:
Standard of Living
Future Security
Health
Achieving in Life
Generally, the greatest improvements in domains of wellbeing occurred in the first four months. In the following four months wellbeing generally remained at improved levels or continued to increase but less sharply.
Note: Any ‘decreases’ between Wave 2 and Wave 3 were not statistically significant. This means that we interpret these results as having remained stable.
The pilot results suggest that the PHaMs service model delivers outcomes consistent with MA’s theory of change and that the service model and its delivery contribute to the client journey to independence.
The PHaMs program, which offers a recovery service rooted in a strengths based approach, is effective in supporting client wellbeing and paths to independence. It is expected that similar programs will also support client wellbeing and paths to independence.
Other areas also showed significant improvements.
Note: These are drops in the proportion of clients with each negative outcome. Although these outcomes look larger than the PWI outcomes on the previous slide, the scales are different and cannot be compared. We cannot compare a change in PWI to a change in per cent, just as we cannot compare a change in weight to a change in temperature.
Caseworkers assessed that over the eight months more participants were accessing appropriate services.
Generally JSA participants remained at the same levels of wellbeing.
It could be argued that providing opportunities for employment, education and training is a mechanism for improving wellbeing.
MA did not see significant improvements in the wellbeing of job seekers who were placed in employment, education or training.
However, MA was not able to follow-up with job seekers after they had exited the program to determine if employment, education and training had an impact on their wellbeing in the longer-term.
We also need to be careful when directly comparing the PHaMs and JSA results. The participants of the two programs were different people with different initial levels of subjective wellbeing and different barriers to independence.
Similarly, we did not see significant changes in the domains of the PWI, as we did for PHaMs.
Note: The increase in future security was not statistically significant. Statistical tests take into consideration how variable a measure is within a group. In this case there was a lot of variance in how much JSA participants’ ratings of future security changed. When there is a lot of variance we need larger sample sizes to tell whether a result is statistically significant.
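The point about variance can be illustrated with a toy paired t statistic. This is a sketch only, not MA's actual analysis; the numbers are invented to show that the same mean improvement can be significant in a consistent group and not in a highly variable one.

```python
# Illustrative only: the paired t statistic shrinks as the variance of the
# within-client changes grows, so a mean gain of the same size can be
# significant in one group and not in another.
from math import sqrt
from statistics import mean, stdev

def paired_t(changes):
    """t = mean(d) / (sd(d) / sqrt(n)) for within-client changes d."""
    n = len(changes)
    return mean(changes) / (stdev(changes) / sqrt(n))

low_variance = [3, 4, 5, 3, 4, 5, 4, 4]          # consistent small gains
high_variance = [20, -12, 15, -8, 14, -9, 6, 6]  # same mean, wildly varied

# Both groups improve by 4 points on average, but only the consistent
# group produces a large t statistic.
print(round(paired_t(low_variance), 2))
print(round(paired_t(high_variance), 2))
```

With the variable group, a much larger sample would be needed before the same 4-point mean gain could be called significant.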
Significant improvements were not seen in wellbeing of JSA Stream 4 jobseekers. While measuring wellbeing in this context offers an opportunity to understand the jobseeker and their needs better, the outcomes that the service model demands and the large case load means that employment advisers are not able to provide the sort of intensive, flexible and holistic support available in a program such as PHaMs.
Note:
Caution needs to be exercised in attributing the positive outcomes exclusively to MA interventions.
Proving a causal link is difficult when there are multifaceted external factors in play in each client’s circumstances which can influence the outcomes observed.
Before developing outcomes indicators, MA developed sound outcomes hierarchies and a theory of change. It is on these frameworks that the outcome indicators and measures rest. This highlights the importance of allocating time and resources upfront to comprehensively understand high level organisational goals, and those of the programs.
Management endorsement is critical to success in outcome measurement. The pilot also required internal resources from a number of teams and across different business units at all stages of development and implementation. Many staff were trained and supported to undertake this pilot and there has generally been excellent feedback from different levels of services staff.
MA utilised a repeat measures design for measuring outcomes, in which the same cohort of participants were measured over three waves. This type of research design allowed us to look at average improvement over time, as well as explore individual differences between participants in their journey towards independence.
MA staff reported significant value in understanding and reflecting on client outcomes. (More about this on following slides).
Mission Australia’s Strategic Plan commits the organisation to delivering evidence-based integrated services to achieve the goals of reducing homelessness and strengthening communities. The Plan also commits MA to delivering more relevant and effective services and integrating client outcomes across core services.
The Client Impact Pilot will be used to inform a review of an outcomes measurement framework across MA for reducing homelessness and strengthening communities. The framework to be used will include assessment of a number of measurement tools appropriate to funder requirements and service type.
As Commonwealth and State Governments move to outcomes based payments and measurement for program delivery, there is no common set of indicators being used. One purpose of the pilot has been to use this as an iterative process with government in tender and program design.
For the reflection exercise staff were asked:
• What does the data tell us: how are we doing on these measures?
• Pick one outcome measure/result that you feel is important to focus on.
• How can we optimise this outcome for our clients?
• Who are the partners who can help us?
• What works or could work to do better (low-cost and no-cost ideas)?
• What are we going to do?
• Mission Australia has a diverse range of programs and services and the modified PWI approach was adopted to measure how effective our case management interventions have been in improving client independence. MA’s current strategic plan has set organisational goals of reducing homelessness and strengthening communities and our further work on impact measurement will need to include additional indicators to measure outcomes against these goals.
• Governments and donors expect service providers to allocate program funding to front line service delivery and Mission Australia and other providers have limited resources to develop outcomes measurement systems with the requisite IT platforms. MA has therefore funded the pilot from existing resources.
Quote from the field:
“The experience has been very motivating for staff here who know that they are doing a good job but have never really seen hard evidence.”
For more information:
Please see Mission Australia’s Client Wellbeing Pilot Final Report
Or contact the Research and Evaluation team by emailing research@missionaustralia.com.au