Transforming End of Life Care in Acute Hospitals
PM Workshop 3: Vital Signs, 'Making Measurement Better: how well things are going and how to make it better'
Presented by Sean Manning, NHS England
3. Insights into
• Importance of measurement
• Traditions of measurement
• Knowing How We are Doing
• Pareto principle
• Structure ‐ process ‐ outcome measures
• Driver diagrams
• Model for improvement
• Tracking variation
• 7 steps to measurement
10. “I can make the last stage of
my life as good as possible
because everyone works together
confidently, honestly and
consistently to help me and the
people who are important to me,
including my carer(s).”
11. The traditions of measurement
• Research: e.g. A-B comparison, average, huge dataset
• Judgement: e.g. one-to-many benchmarking comparison, average, large dataset
• Improvement: e.g. continual analysis of a single changing process over time
12. Measurement mindsets

|                                | Research                              | Judgement                                                      | Improvement                                     |
|--------------------------------|---------------------------------------|----------------------------------------------------------------|-------------------------------------------------|
| Goal                           | New knowledge (not its applicability) | Comparison; reward/punishment; spur for change                 | Process understanding; evaluating a change      |
| Hypothesis                     | Fixed                                 | None                                                           | Multiple and flexible                           |
| Measures                       | Many                                  | Very few                                                       | Few                                             |
| Time period                    | Long, past                            | Long, past                                                     | Short, current                                  |
| Sample                         | Large                                 | Large                                                          | Small                                           |
| Confounders                    | Measure or control                    | Describe and try to measure                                    | Consider but rarely measured                    |
| Risks in improvement settings  | Ignores time-based variation; over-engineers data collection | Ignores time-based variation; over-reaction to natural variation | Incorrectly perceived as 'inferior statistics' |

Based on L. Solberg, G. Mosser and S. McDonald (1997) 'The Three Faces of Performance Measurement: Improvement, Accountability and Research', Journal on Quality Improvement, 23(3): 135-147.
13. What mindsets are at play here?
Research
Improvement
Judgement
20. What does it look like?
Interruptions in surgeries
Tally by GPs of the causes of interruptions while with a patient.
Category Count
Sign script ‐ contraception 72
Sign script ‐ minor illness nurse 18
Clinical query ‐ learner 18
Clinical query ‐ NP 24
Sign script ‐ urgent 78
Chaperone 198
Equipment search 60
Admin info 312
Clinical query ‐ GP 66
Panic button 6
Cancellation msg 588
Other 72
TOTALS 1512
[Bar chart: interruption count by category]
21. What does it look like?
Interruptions in surgeries
Tally by GPs of the causes of interruptions
while seeing patients.
Category Count % of Total
Cancellation msg 588 38.9
Admin info 312 20.6
Chaperone 198 13.1
Sign script ‐ urgent 78 5.2
Other 72 4.8
Sign script ‐ contraception 72 4.8
Clinical query ‐ GP 66 4.4
Equipment search 60 4
Clinical query ‐ NP 24 1.6
Clinical query ‐ learner 18 1.2
Sign script ‐ minor illness nurse 18 1.2
Panic button 6 0.4
TOTALS 1512 100
[Bar chart: % of total interruptions by category, sorted descending]
22. What does it look like?
[Pareto chart: % of total per category (bars) with cumulative % (line)]
23. What does it look like?
[Pareto chart: % of total per category (bars) with cumulative % (line), top three categories highlighted]
Three categories of interruption (17%) account for 73% of the problem
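The Pareto arithmetic behind the interruption slides can be sketched in a few lines. The tally values come from the slides; the ranking, percentage, and cumulative-share logic is a generic illustration:

```python
# Pareto analysis sketch: rank interruption categories by count, compute
# each category's share and the cumulative share, then check how much of
# the problem the top few categories cover. Counts are the GP tally above.

tally = {
    "Cancellation msg": 588, "Admin info": 312, "Chaperone": 198,
    "Sign script - urgent": 78, "Other": 72,
    "Sign script - contraception": 72, "Clinical query - GP": 66,
    "Equipment search": 60, "Clinical query - NP": 24,
    "Clinical query - learner": 18, "Sign script - minor illness nurse": 18,
    "Panic button": 6,
}

total = sum(tally.values())  # 1512, matching the slide
ranked = sorted(tally.items(), key=lambda kv: kv[1], reverse=True)

cumulative = 0
for category, count in ranked:
    cumulative += count
    print(f"{category:35s} {count:4d} {100 * count / total:5.1f}% "
          f"(cumulative {100 * cumulative / total:5.1f}%)")

# Share of all interruptions covered by the top three categories:
top3 = sum(count for _, count in ranked[:3]) / total
print(f"Top 3 of {len(ranked)} categories cover {top3:.0%} of interruptions")
# -> Top 3 of 12 categories cover 73% of interruptions
```

Sorting before accumulating is the whole trick: the cumulative column only tells the Pareto story once the categories are in descending order.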
24. What to measure?
Structure, process, outcome (Avedis Donabedian)
• Structure: the environment in which care occurs
• Process: what care is delivered, and how
• Outcome: the impact on patients and the population
'Outcomes remain the ultimate validators of the effectiveness and quality of medical care' but they 'must be used with discrimination'
25. What to measure?
Structure, process, outcome
o Outcomes are a worthy goal
o All three have pros & cons
o You should measure a selection of all three
Veena Raleigh
26. What to measure?
Structure Process Outcome
“Intermediate outcomes”
• a common solution
• properties of both process & outcome
• but be careful to acknowledge it’s
not ‘the ultimate outcome’
28. Metrics for different audiences
• Board: highest level outcome measures (focus on outcome)
• Service managers: higher level outcome measures
• Project managers: relevant process + outcome measures
• Frontline staff: relevant process + outcome measures (focus on process)
32. Driver Diagrams
Weight loss example
Aim: 2 stone weight loss in 6/12 (six months)
Primary drivers:
• Reduce calories in
• Increase calories out
Secondary drivers (under 'Reduce calories in'):
• Eat less
• Substitute lower calorie foods
• Drink less alcohol
Secondary drivers (under 'Increase calories out'):
• Be more active during the day
• Do sport
Change ideas:
• Buy only 1 sandwich
• Take packed lunch
• Low fat meals
• Fruit for dessert
• No pub weekdays
• Put away the large glasses
• Water bottle for work bag
• Pedometer
• Take stairs
• Get rid of Oyster card
• Put cycling days in diary
• Cycling kit out night before
• Gym work out 3 days
• Squash weekends
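One way to keep a driver diagram machine-checkable is to hold it as nested mappings from aim to primary drivers to secondary drivers to change ideas. The sketch below encodes the weight-loss example; the grouping of change ideas under particular secondary drivers is an illustrative guess, since the diagram's layout did not survive extraction:

```python
# Hypothetical encoding of the weight-loss driver diagram.
# Structure: aim -> primary drivers -> secondary drivers -> change ideas.
# The idea-to-driver assignments are illustrative, not from the slides.
driver_diagram = {
    "aim": "2 stone weight loss in 6 months",
    "primary_drivers": {
        "Reduce calories in": {
            "Eat less": ["Buy only 1 sandwich", "Take packed lunch"],
            "Substitute lower calorie foods": ["Low fat meals", "Fruit for dessert"],
            "Drink less alcohol": ["No pub weekdays", "Put away the large glasses",
                                   "Water bottle for work bag"],
        },
        "Increase calories out": {
            "Be more active during the day": ["Pedometer", "Take stairs",
                                              "Get rid of Oyster card"],
            "Do sport": ["Gym work out 3 days", "Squash weekends",
                         "Put cycling days in diary", "Cycling kit out night before"],
        },
    },
}

def change_ideas(diagram):
    """Flatten every change idea out of the nested diagram."""
    return [idea
            for secondaries in diagram["primary_drivers"].values()
            for ideas in secondaries.values()
            for idea in ideas]

print(len(change_ideas(driver_diagram)))  # 14 change ideas in this sketch
```

Holding the diagram as data makes it easy to attach a metric to each node later, which is exactly what slide 40 asks for.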
40. What to measure?
Add metrics to your driver diagram.
Measure types: structure, process, outcome, balance
1. Identify measures for your aim, a primary driver and a secondary driver, using each of these four measure types.
2. For each measure, record your answers to: How much? By when?
3. For each measure, record your answers to: What is our baseline? How do we get it?
41. How much & how often?
There is no precise science to guide decisions about how many metrics to use, or how often to collect them...
How many different things are you monitoring consciously? How frequently?
Compare driving on a straight motorway with reversing round a corner.
45. Change through small steps
Change ...
• with a clear purpose
• you can learn from (without fear of failure)
• which is less exhausting
• with fewer unintended consequences
• which builds engagement and optimism
54. The Problem with Averages
“If I stick my right foot in a bucket of boiling water and my left foot in a bucket of ice water, on the average, I’m pretty comfortable.”
55. The Problem
Aggregated data presented in tabular formats or with summary statistics will not help you measure the impact of improvement efforts. Aggregated data can only lead to judgement, not to improvement.
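A quick illustration with invented numbers: two units can share exactly the same average while heading in opposite directions, and only time-ordered data reveals it:

```python
# Illustrative (invented) monthly wait times in minutes. Both units
# average 50, but one is steadily improving and the other worsening.
unit_a = [80, 70, 60, 50, 40, 30, 20]   # improving over time
unit_b = [20, 30, 40, 50, 60, 70, 80]   # worsening over time

def mean(xs):
    return sum(xs) / len(xs)

def trend(xs):
    """Crude direction check: last point minus first point."""
    return xs[-1] - xs[0]

print(mean(unit_a), mean(unit_b))    # 50.0 50.0 -- the summary hides the story
print(trend(unit_a), trend(unit_b))  # -60 60 -- the time order tells it
```

This is why the workshop moves from summary tables to run charts: plotting values in time order preserves exactly the information the average throws away.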
57. Did things improve? What will happen next? Should we do something?
Smoking cessation: percentage of smokers who have quit smoking after a 4-week programme
[Chart of quit rates over time, with the intervention point marked]
61. Waiting time results
[Run charts of cycle time (min.) by month, Jan to Dec, for Units 1, 2 and 3, each annotated at the point where the change was made; alongside a bar chart of average wait time (min.) before vs after the change]
62. If we have 3 numbers in sequence, how do you report these?
• Downward trend
• Upward trend
• Some recovery
• Setback
• Collapse
• Dramatic recovery
Each pattern has an equal 1-in-6 chance. Let's get scientific!
Data has no meaning without context.
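The 1-in-6 figure is just a count of orderings: three distinct values can arrive in 3! = 6 possible sequences, one per pattern above, so any single named pattern is equally likely even in pure noise. A quick check:

```python
from itertools import permutations

# Three distinct measurements can appear in 3! = 6 orders, so each of
# the six storylines ("upward trend", "setback", ...) has a 1-in-6
# chance of occurring by chance alone.
orders = list(permutations([1, 2, 3]))

print(len(orders))          # 6
print((1, 2, 3) in orders)  # True -- the "upward trend" is just one of them
```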
67. Two ways to improve a process
If controlled variation (common cause):
• the process is stable
• variation is inherent to the process
• therefore the process itself must be changed, i.e. redesigned
If uncontrolled variation (special cause):
• the process is unstable
• variation is extrinsic to the process
• the cause should be identified and “treated”
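One common way to separate common-cause from special-cause variation is an XmR (individuals) chart, where control limits sit at roughly three sigma from the centre line, estimated via the average moving range. This is a generic sketch of that standard calculation, not necessarily the tool used in the workshop; the wait-time numbers are invented:

```python
# XmR (individuals) chart sketch: points outside
# mean +/- 2.66 * (average moving range) signal special-cause variation.
# The 2.66 constant is the standard XmR factor (3 / d2, with d2 = 1.128).

def xmr_limits(data):
    """Return (lower limit, centre line, upper limit) for an XmR chart."""
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    centre = sum(data) / len(data)
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

def special_causes(data):
    """Points falling outside the control limits."""
    lcl, _, ucl = xmr_limits(data)
    return [x for x in data if x < lcl or x > ucl]

# Invented weekly wait times: a stable process with one unusual week.
waits = [32, 35, 33, 36, 34, 31, 35, 62, 33, 34]
lcl, centre, ucl = xmr_limits(waits)
print(special_causes(waits))  # [62] -- only that week falls outside the limits
```

A point outside the limits (the 62) is the "treat the cause" case; points bouncing inside the limits are common cause, where only redesigning the process will shift performance.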
68. 7 steps to measurement
https://www.youtube.com/watch?v=Za1o77jAnbw&list=PL_V1d0Y94nv4u2yCCDnApxa9ykKmSG1oE
69. Insights into
• Importance of measurement
• Traditions of measurement
• Knowing How We are Doing
• Pareto principle
• Structure ‐ process ‐ outcome measures
• Driver diagrams
• Model for improvement
• Tracking variation
• 7 steps to measurement