Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 1
Finding out what could go wrong before it does
– Modelling Risk and Uncertainty
Bruce Edmonds
Centre for Policy Modelling
Manchester Metropolitan University
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 2
Classic Policy Modelling
Essential steps:
1. Decide on KPIs of policy success
2. List candidate policies
3. Predict impact of policies: cost and KPIs
4. Choose best policy
Sometimes this is embedded within a repeated
cycle of:
a) Decide on a policy (using steps 2-4 above)
b) Implement it
c) Evaluate the policy
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 3
Statistical Models
Approach (sketched below):
1. Regress KPIs on known inputs
2. Choose the inputs that maximise the KPIs
3. Hence choose the policy that might most closely implement
those inputs
• Assumes generic fixed relationship – average
success
• Straightforward to do
• Requires enough data relating the KPIs to the inputs
• Candidate policies and regressed inputs may not be
obviously relatable
• Not customisable to particular situations
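A minimal sketch of this approach, assuming synthetic data and hypothetical input/KPI names purely for illustration; a real analysis would use observed policy inputs and measured KPIs.

```python
# A minimal sketch of "regress KPIs on inputs, then pick the best inputs".
# The data and variable names here are hypothetical, just to make the steps
# concrete; note the built-in assumption of one fixed, average relationship.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(100, 2))                            # two policy-relevant inputs
kpi = 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.1, 100)   # observed KPI

# 1. Regress the KPI on the known inputs (ordinary least squares)
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, kpi, rcond=None)

# 2. Choose the inputs that maximise the predicted KPI over a candidate grid
grid = np.array([[a, b] for a in np.linspace(0, 1, 11)
                        for b in np.linspace(0, 1, 11)])
pred = np.column_stack([np.ones(len(grid)), grid]) @ coef
best_inputs = grid[np.argmax(pred)]
print("inputs predicted to maximise the KPI:", best_inputs)
```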
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 4
Micro-Simulation Models
Approach (sketched below):
1. Divide up the population into areas/groups
2. Choose simple statistical or other model for reaction
3. For each area/group regress/adjust model for their
own data
4. Maybe add some flows between areas/groups
5. Aggregate over areas/groups for overall assessment
• Requires detailed data for each area/group
• Good for heterogeneity of groups/areas
• Does not work so well when lots of interaction
between groups
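A minimal sketch of the micro-simulation idea: a simple per-group model calibrated to that group's own data, then aggregated. Group names, populations and response rates are invented for illustration only.

```python
# Hypothetical per-group calibrations; a real model would fit these to data.
groups = {
    "urban":   {"population": 50_000, "baseline": 0.30, "policy_effect": 0.05},
    "rural":   {"population": 20_000, "baseline": 0.25, "policy_effect": 0.02},
    "coastal": {"population": 10_000, "baseline": 0.35, "policy_effect": 0.08},
}

def group_outcome(g, policy_on):
    """Per-group response: baseline rate plus a (calibrated) policy effect."""
    rate = g["baseline"] + (g["policy_effect"] if policy_on else 0.0)
    return g["population"] * rate

# Aggregate over groups for the overall assessment, with and without the policy.
# Note that flows and interactions between groups are not represented here,
# which is exactly the weakness flagged above.
for policy_on in (False, True):
    total = sum(group_outcome(g, policy_on) for g in groups.values())
    print(f"policy={policy_on}: aggregate outcome = {total:,.0f}")
```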
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 5
Computable General Equilibrium
Models
Approach (sketched below):
1. Construct a simplified economic model of situation
with and without chosen policy
2. Calculate equilibrium without policy
3. Calculate equilibrium with policy
4. Compare the two equilibria and see if this represents
an improvement and how much of one
• Only simple models are calculable
• Uses strong economic assumptions
• The equilibrium is only one restricted and long-
term aspect of the outcomes
• Does not have a good predictive record
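A toy sketch of the recipe on a single market, just to make steps 2–4 concrete. Real CGE models cover many interlinked markets under strong economic assumptions; the demand/supply curves and the unit tax here are invented numbers.

```python
# Solve for the market-clearing price with and without a policy (a unit tax),
# then compare the two equilibria. Curves and tax are illustrative only.
def equilibrium(tax=0.0):
    # demand: q = 100 - 2p ; supply: q = 3(p - tax)
    # clearing: 100 - 2p = 3(p - tax)  =>  p = (100 + 3*tax) / 5
    p = (100 + 3 * tax) / 5
    q = 100 - 2 * p
    return p, q

base = equilibrium(0.0)
with_policy = equilibrium(5.0)
print("no policy  :", base)         # (20.0, 60.0)
print("with policy:", with_policy)  # (23.0, 54.0) - compare the two equilibria
```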
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 6
System Dynamic Models
Approach (sketched below):
1. Build relationships between key variables using a flow
and storage approach (maybe in a participatory way)
2. Add in equations and delays
3. Run the simulated system with probable inputs
4. Evaluate the results somehow
• Good for dynamics with delayed feedback
• Does not deal with heterogeneity of actors
• ‘Touchy-feely’ judgment of outcomes
• Can look more real than the evidence warrants
• Not good at predicting outcome values
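A minimal stock-and-flow sketch in the system-dynamics style: one stock, a constant inflow, and an outflow that depends on an earlier stock level (a simple fixed delay). The equations, rates and delay length are illustrative assumptions, not taken from any particular policy model.

```python
stock = 100.0
delay = 5                     # outflow depends on an earlier stock level
history = [stock] * delay
dt = 1.0

for t in range(50):
    inflow = 10.0                          # constant inflow per step
    outflow = 0.1 * history[-delay]        # delayed feedback on the stock
    stock += (inflow - outflow) * dt
    history.append(stock)

print(f"stock after 50 steps: {stock:.1f}")
```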
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 7
Simulation Models
Approach (sketched below):
1. Build a simulation reflecting how the parts of the system relate
2. Adjust parameters to reflect the particular situation/data
3. Check the simulation by running it for a known situation where
the outcomes and data are known (validation)
4. Produce different variations of simulation to reflect each
policy to be tested
5. Run each variation many times and measure the outcomes
• Simulation only as strong as knowledge of system
• Might have many unknown parameters
• Never enough data to sufficiently validate
• Policies can be directly implemented
• Outcomes assessed in many different ways
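A minimal sketch of step 5: run each policy variation many times and look at the distribution of outcomes rather than a single point estimate. The simulate() body and the policy parameters are stand-ins for whatever simulation is actually being used.

```python
import random
import statistics

def simulate(policy, seed):
    """Stand-in for one run of the full simulation under a given policy."""
    rng = random.Random(seed)
    return policy["effort"] * rng.uniform(0.5, 1.5)   # hypothetical outcome

policies = {"cautious": {"effort": 10}, "aggressive": {"effort": 30}}

# Run each variation many times; report the average and the worst case,
# since the outcomes can be assessed in many different ways.
for name, policy in policies.items():
    outcomes = [simulate(policy, seed) for seed in range(100)]
    print(name, "mean:", round(statistics.mean(outcomes), 1),
          "worst:", round(min(outcomes), 1))
```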
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 8
Some modelling tensions
[Diagram: a space of modelling trade-offs with four poles – precision (model not vague), generality of scope (works for different cases), lack of error (accuracy of results) and realism (reflects knowledge of processes). Stats/regression models, economic models, scenarios and agent-based models each occupy different positions in this space; ‘Reality’, and what is wanted for policy decisions, lie elsewhere again.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 9
Problem 1: System Complexity
• There is no guarantee that a simple model will be
adequate for representing complex
social/ecological/economic/technical systems
• How the parts and actors interact and react might
be crucial to the outcomes (e.g. financial markets)
• We may not know which parts of the system are
crucial to the outcomes
• We may not fully understand how the parts
interact and react
• System and model are both too complex to fully
explore and understand in time available
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 10
Problems 2&3: Error and Uncertainty
• The values of many key parameters might be
unknown or only approximately known
• Data might be patchy and of poor quality
• Tiny changes in key factors or parameters might
have huge consequences for outcomes (the
‘butterfly effect’)
• Levels of error may be amplified by the system
(as in automated trading in financial markets)
• There may be processes that we do not even
know are important to the outcomes
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 11
Problem 4: Structural Change
• System evolves due to internal dynamics
• For example, innovations might occur
• System might have several very different
behavioural ‘phases’ (e.g. bull and bear markets)
which it shifts between
• The rules of the system might change rapidly…
• ...and well before any equilibrium is reached
• Rule-change might be linked to system state
• Different parts of the system might change in
different ways
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 12
Prediction
• Given all these difficulties, for many situations
prediction is not only infeasible…
• ...but suggesting you can predict is dishonest
• and may give false comfort (e.g. Newfoundland
Cod Fisheries Collapse or 2007/8 financial crash)
• Most techniques only work in two cases, where:
1. There is lots of experience/data over many previous
episodes/cases
2. Nothing much changes (tomorrow similar to today)
• Often even approximate or probabilistic
prediction is infeasible and unhelpful
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 13
The key question….
How does one manage a system or
situation that is too complex to predict?
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 14
Lessons from robotics: Part I
Robots in the 70s and 80s tried to (iteratively):
1. build a map of its situation (i.e. a predictive
model)
2. use this model to plan its best action
3. then try to do this action
4. check it was doing OK, then go back to (1)
But this did not work in any realistic situation:
• It was far too slow to react to its world
• to make useable predictions it had to make too
many dodgy assumptions about its world
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 15
Lessons from robotics: Part II
Rodney Brooks (1991) Intelligence without
representation. Artificial Intelligence, 47:139–160
A different approach:
1. Sense the world in rich fast ways
2. React to it quickly
3. Use a variety of levels of reaction
a. low simple reactive strategies
b. switched by progressively higher ones
Do not try to predict the world, but react to it quickly
This worked much better.
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 16
Lessons from Weather Forecasting
• Taking measurements at a few places and trying
to predict what will happen using simple
models based on averages does not work well
• Understanding the weather improved with very
detailed simulations fed by rich and
comprehensive sensing of the system
• Even then they recognize that there is more
than one possibility for the outcomes
(using ensembles of specific outcomes)
• If these indicate a risk of severe weather they
issue a warning so mitigating measures can be
taken
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 17
Lessons from Radiation Levels
• The human body is a very complex system
• It has long been known that too much radiation
can cause severe illness or death in humans
• In the 30s & 40s it was assumed there was a
“safe” level of radiation
• However it was later discovered that any level of
radiation carried a risk of illness
• Including naturally occurring levels
• Although an increase in radiation might not seem
to affect many people, it did result in more
illnesses in some
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 18
Socio-Ecological Systems
• Are the combination of human society embedded
within an ecological system (SES)
• Many social and ecological systems are far too
complex to predict
• Their combination is doubly complex
• E.g. fisheries, deforestation, species extinctions
• Yet we still basically use the 1970s robotics
“predict and plan” approach to these…
• …as if we can plan optimum policies by
estimating/projecting future impact
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 19
Why simple models won’t work
• Simpler models do not necessarily get things
“roughly” right
• Simpler models are not more general
• They can also be very deceptive – especially with
regards to complex ways things can go wrong
• In complex systems the detailed interactions can
take outcomes ‘far from equilibrium’ and far from
average behaviour
• Sometimes, with complex systems, a simple
model that relies on strong assumptions can be
far worse than having no models at all
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 20
A Cautionary Tale
• On the 2nd July 1992 Canada’s fisheries minister
placed a moratorium on all cod fishing off
Newfoundland. That day 30,000 people lost their jobs.
• Scientists and the fisheries department throughout
much of the 1980s estimated a 15% annual rate of
growth in the stock – (figures that were consistently
disputed by inshore fishermen).
• The subsequent Harris Report (1992) said (among
many other things) that: “..scientists, lulled by false
data signals and… overconfident of the validity of
their predictions, failed to recognize the statistical
inadequacies in … [their] model[s] and failed to …
recognize the high risk involved with state-of-stock
advice based on … unreliable data series.”
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 21
What had gone wrong?
• “… the idea of a strongly rebuilding Northern cod
stock that was so powerful that it …[was]... read
back… through analytical models built upon
necessary but hypothetical assumptions about
population and ecosystem dynamics. Further, those
models required considerable subjective judgement
as to the choice of weighting of the input variables”
(Finlayson 1994, p.13)
• Finlayson concluded that the social dynamics
between scientists and managers were at play
• Scientists adapting to the wishes and worldview of
managers, managers gaining confidence in their
approach from the apparent support of science
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 22
Example 1: Fishing!
• …is a dynamic, spatial, individual-based ecological model
that has some of the complexity, adaptability and fragility
of observed ecological systems with emergent outcomes
• It evolves complex, local food webs, endogenous shocks
from invasive species, is adaptive but unpredictable as to
the eventual outcomes
• Into this the impact of humans can be imposed or even
agents representing humans ‘injected’ into the simulation
• The outcomes can be then analysed at a variety of levels
over long time scales, and under different scenarios
• Paper: Edmonds, B. (in press) A Socio-Ecological Test
Bed. Ecological Complexity.
• Full details and code at: http://openabm.org/model/4204
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 23
In this version
• Plants and higher order entities (fish)
distinguished (no photosynthesizing herbivores!)
• First a rich competing plant ecology is evolved
• Then single fish are injected until fish take hold and
evolve, until there is an ecology of many fish
species; the system is run for a while to allow ‘transients’ to die away
• This state is then frozen and saved
• From this point different ‘fishing’ policies are
implemented and the simulations then run
• with the outcomes then analysed
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 24
The Model
• A wrapped 2D grid of
well-mixed patches with:
– energy (transient)
– bit string of characteristics
• Organisms are represented
individually, each with their own
characteristics, including:
– bit string of characteristics
– energy
– position
[Diagram: each patch is well-mixed; every individual is represented separately; there is a slow random rate of migration between patches.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 25
Model sequence each simulation tick (sketched below)
1. Input energy. Equally divided between the patches.
2. Death. A life tax is subtracted, some die, age is incremented.
3. Initial seeding. Until a viable population is established, a random new individual is added.
4. Energy extraction from patch. Patch energy is divided among the individuals there whose bit-string scores positively when evaluated against the patch.
5. Predation. Each individual is randomly paired with a number of others on the patch; if it dominates them it gets a % of their energy and the other is removed.
6. Maximum store. Energy above a maximum level is discarded.
7. Birth. Those with energy > “reproduce-level” give birth to a new entity with the same bit-string as themselves, with a probability of mutation. The child has an energy of 1, taken from the parent.
8. Migration. Individuals randomly move to one of the 4 neighbouring patches.
9. Statistics. Various statistics are calculated.
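A minimal, self-contained sketch of the per-tick sequence above, not the released code (which is at openabm.org). All parameter values, the scoring of bit-strings against patches, and the simplified handling of some steps are assumptions made purely for illustration.

```python
import random

LIFE_TAX, REPRODUCE_LEVEL, MAX_STORE, MUTATE_P = 0.5, 5.0, 10.0, 0.01
PATCHES, INPUT_ENERGY = 16, 40.0

def new_individual():
    return {"bits": random.getrandbits(16), "energy": 1.0,
            "patch": random.randrange(PATCHES), "age": 0}

population = [new_individual() for _ in range(30)]

def tick(population):
    # 2. Death: life tax subtracted, some die, age incremented
    for o in population:
        o["energy"] -= LIFE_TAX
        o["age"] += 1
    population = [o for o in population if o["energy"] > 0]
    # 3. Initial seeding: keep injecting until a population is established
    if len(population) < 5:
        population.append(new_individual())
    # 1 & 4. Input energy divided between patches, then shared among the
    #        individuals there (here: equal shares; the real model shares
    #        according to how the bit-string fits the patch)
    for p in range(PATCHES):
        here = [o for o in population if o["patch"] == p]
        for o in here:
            o["energy"] += (INPUT_ENERGY / PATCHES) / max(len(here), 1)
    # 5. Predation (omitted in this sketch): random pairs on a patch; the
    #    dominant one takes a % of the other's energy and the loser is removed
    # 6. Maximum store: discard energy above the maximum level
    for o in population:
        o["energy"] = min(o["energy"], MAX_STORE)
    # 7. Birth: energy above the reproduce level produces a (possibly mutated) copy
    children = []
    for o in population:
        if o["energy"] > REPRODUCE_LEVEL:
            child = dict(o, energy=1.0, age=0)
            if random.random() < MUTATE_P:
                child["bits"] ^= 1 << random.randrange(16)
            o["energy"] -= 1.0
            children.append(child)
    population.extend(children)
    # 8. Migration: a slow random move to a neighbouring patch
    for o in population:
        if random.random() < 0.05:
            o["patch"] = (o["patch"] + random.choice([-1, 1])) % PATCHES
    # 9. Statistics would be collected here
    return population

for _ in range(100):
    population = tick(population)
print("individuals after 100 ticks:", len(population))
```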
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 26
First, evolve a rich mixed ecology
Evolve and save a suitable complex ecology with
a balance of trophic layers (final state shown
with a log population scale)
[Chart annotations: first successful plant; herbivores appear; carnivores appear; simulation “frozen”.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 27
This version designed to test possible
outcomes of fishing policies
• Complex aquatic plant ecology evolved
• Herbivore fish injected into ecology and whole
system further evolved
• Once a complex ecology with higher-order
predators then system is fixed as starting point
• Different extraction (i.e. fishing) policies can be
enacted on top of this system (sketched below):
– How much fish is extracted each time (either absolute
numbers or as a proportion of existing numbers)
– Whether extraction is uniformly at random or patch-by-patch
– How many ‘reserves’ are kept
– Is there a minimum stock level below which no fishing occurs
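A minimal sketch of the policy ‘knobs’ just listed and of how a quota might be applied each tick. The field names and the two extraction modes are illustrative; the released model defines the actual options.

```python
import random

policy = {
    "quota": 30,             # absolute number per tick (or use a proportion)
    "by_proportion": False,  # if True, quota is a % of the existing stock
    "mode": "uniform",       # "uniform" over the grid, or "by_patch"
    "reserves": {3, 7},      # patch indices never fished
    "min_stock": 200,        # no fishing below this total stock level
}

def extract(fish, policy):
    """Remove this tick's catch from the list of fish (each fish has a 'patch')."""
    if len(fish) < policy["min_stock"]:
        return []                                    # stock too low: no fishing
    quota = (int(policy["quota"] / 100 * len(fish))
             if policy["by_proportion"] else policy["quota"])
    catchable = [f for f in fish if f["patch"] not in policy["reserves"]]
    if policy["mode"] == "uniform":
        caught = random.sample(catchable, min(quota, len(catchable)))
    else:                                            # fish patch-by-patch
        caught, patches = [], sorted({f["patch"] for f in catchable})
        random.shuffle(patches)
        for p in patches:
            if len(caught) >= quota:
                break
            caught.extend(f for f in catchable if f["patch"] == p)
        caught = caught[:quota]
    for f in caught:
        fish.remove(f)
    return caught
```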
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 28
Demonstration of the basic model
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 29
Typical Harvest Shape (last 100 ticks) for
different catch levels over 20 different runs
[Chart: proportion of maximum harvest (y-axis) vs. catch level per tick (x-axis).]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 30
Decide in your groups
1. Amount of fish extraction (quota) per tick, either:
• Absolute number (0-200)
• Percentage of existing stock (0-100%)
2. The way fish is extracted, either:
• Randomly over whole grid
• Random patch chosen and fished, then next until
quota for tick is reached
3. How many patches will be kept as reserves (not
fished)
4. When to start fishing (0-999 ticks)
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 31
Total Extinction Prob. & Av. Total Harvest
(last 100 ticks) for different catch levels
[Chart: proportion of maximum (y-axis) vs. catch level per tick (x-axis).]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 32
Num Fish (all species, 20 runs) – catch
level 25
[Chart: number of fish (all species) over 1000 ticks for 20 runs; y-axis 0–6000.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 33
Num Fish (all species, 20 runs) – catch
level 35
[Chart: number of fish (all species) over 1000 ticks for 20 runs; y-axis 0–6000; annotated with catch target=30.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 34
Num Fish (all species, 20 runs) – catch
level 50
[Chart: number of fish (all species) over 1000 ticks for 20 runs; y-axis 0–6000.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 35
Average (over 20 runs) of fish at end of 5000
simulation ticks
[Chart: number of fish for different catch levels (0–100); y-axis 0–5000.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 36
Average (over 20 runs) of numbers of fish
species at end of 5000 simulation ticks
[Chart: number of fish species for different catch levels (0–100); y-axis 0–140.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 37
Average Number of Species vs. Catch
Level (from a different starting ecology)
[Chart: number of fish species vs. catch level (0–40); y-axis 0–14.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 38
Average Number of Species, Catch=20
[Chart: average number of species over 1000 ticks, comparing “by patches” and “uniform” extraction; y-axis 0–35.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 39
Average Number of Species, Catch=30
[Chart: average number of species over 1000 ticks, comparing “by patches” and “uniform” extraction; y-axis 0–35.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 40
Average Number of Species, Catch=40
[Chart: average number of species over 1000 ticks, comparing “by patches” and “uniform” extraction; y-axis 0–35.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 41
A risk-analysis approach
1. Give up on estimating future impact or “safe”
levels of exploitation
2. Make simulation models that include more of the
observed complication and complex interactions
3. Run these lots of times with various scenarios to
discover some of the ways in which things can
go surprisingly wrong (or surprisingly right)
4. Put in place sensors/measures that would give
us the earliest possible warning that these might
be occurring in real life
5. React quickly if these warnings emerge (a minimal sketch of steps 3–4 follows)
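A minimal sketch of steps 3–4: sweep scenarios, run each many times, and flag the runs where something goes badly wrong (here, a collapse of the stock). The simulate() function is a stand-in for the real simulation, and the collapse threshold is an invented warning level.

```python
import random
import statistics

def simulate(catch_level, seed):
    """Stand-in: returns the final stock for one run under a given catch level."""
    rng = random.Random(seed)
    stock = 1000.0
    for _ in range(100):
        stock += 0.05 * stock * rng.uniform(0.5, 1.5) - catch_level
        stock = max(stock, 0.0)
    return stock

COLLAPSE = 100.0   # a warning threshold, chosen for illustration

for catch_level in (20, 40, 60, 80):
    finals = [simulate(catch_level, seed) for seed in range(200)]
    p_collapse = sum(f < COLLAPSE for f in finals) / len(finals)
    print(f"catch={catch_level}: median stock {statistics.median(finals):7.0f}, "
          f"P(collapse) = {p_collapse:.2f}")
```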
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 42
Example 2: Social Influence and
Domestic Water Demand
• Produced for the Environment Agency/DEFRA
• Part of a bigger project to predict future domestic
water demand in the UK given some different
future politico-economic scenarios and climate
change
• The rest of the project consisted of detailed statistical
models to do the prediction
• This model was to examine the assumptions and
look at the envelope of possibilities
• Joint work with Olivier Barthelemy and Scott Moss
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 43
Monthly Water Consumption
[Histogram of REL_CHNG (relative change in monthly consumption), range roughly −.50 to .88: Std. Dev = .17, Mean = .01, N = 81.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 44
Relative Change in Monthly
Consumption
[Time series of REL_CHNG by month, Jun 1994 – Feb 2001; values range roughly from −.6 to 1.0.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 45
Purpose of the SI&DWD Model
• Not long-term prediction
• But to begin to understand the relationship of
socially-influenced consumer behaviour to
patterns of water demand
• By producing a representational agent model
amenable to fine-grained criticism
• And hence to suggest possible interactions
• So that these can be investigated/confirmed
• And this loop iterated
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 46
Model Structure - Overall Structure
[Diagram: households (activity, frequency, volume of micro-component use) respond to the ground/weather (temperature, rainfall, sunshine) and to a policy agent; together they produce the aggregate demand.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 47
Model Structure - Microcomponents
• Each household has a variable number of micro-
components (power showers etc.): bath,
other_garden_watering, shower, hand_dishwashing,
washing_machine, sprinkler, clothes_hand_washing,
toilets, power_shower
• Actions are expressed by the frequency and
volume of use of each microcomponent
• AVF distribution in model calibrated by data from
the Three Valleys
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 48
Model Structure - Household
Distribution
• Households distributed randomly on a grid
• Each household can copy from a set of
neighbours (currently those up to 4 units up, down,
left and right from them)
• They decide which neighbour is most similar
to themselves – this is the one they are most
likely to copy
• Depending on their evaluation of actions they
might adopt that neighbour’s actions (sketched below)
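A minimal sketch of this neighbourhood-copying mechanism. The grid size, the representation of households as small action vectors, and the copy probability are all illustrative assumptions, not the implementation actually used in the project.

```python
import random

GRID = 10
REACH = 4      # neighbours up to 4 units up, down, left and right

def similarity(a, b):
    """Fraction of matching entries in two households' action vectors."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

# each household holds a vector of 8 binary 'actions' (e.g. uses a sprinkler)
households = {(x, y): [random.randint(0, 1) for _ in range(8)]
              for x in range(GRID) for y in range(GRID)}

def neighbours(x, y):
    cells = [(x + d, y) for d in range(-REACH, REACH + 1) if d] + \
            [(x, y + d) for d in range(-REACH, REACH + 1) if d]
    return [c for c in cells if c in households]

def maybe_copy(x, y):
    """Copy one action from the most similar neighbour, with some probability."""
    me = households[(x, y)]
    best = max(neighbours(x, y), key=lambda c: similarity(me, households[c]))
    if random.random() < 0.3:                     # illustrative adoption chance
        i = random.randrange(len(me))
        me[i] = households[best][i]

for _ in range(50):                               # a few rounds of influence
    for cell in list(households):
        maybe_copy(*cell)
```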
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 49
An Example Social Structure
- Global Biased
- Locally Biased
- Self Biased
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 50
Household Behaviour - Endorsements
• Action Endorsements: recentAction neighbourhoodSourced
selfSourced globallySourced newAppliance
bestEndorsedNeighbourSourced
• 3 weights moderate the effective strengths of the
neighbourhoodSourced, selfSourced and globallySourced
endorsements, and hence the bias of households (sketched below)
• Can be characterised as 3 types of households
influenced in different ways: global-;
neighbourhood-; and self-sourced depending on
the dominant weight
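A minimal sketch of how weighted endorsements might pick an action: each candidate action collects endorsement tags, and a household scores it using its own weights for the three source types. The weights, extra tag values and scoring rule are illustrative assumptions only.

```python
WEIGHTS = {           # a 'neighbourhood-biased' household
    "neighbourhoodSourced": 3.0,
    "selfSourced": 1.0,
    "globallySourced": 0.5,
}
EXTRA = {"recentAction": 0.5, "newAppliance": 0.25,
         "bestEndorsedNeighbourSourced": 1.0}

def score(endorsements, weights):
    """Sum the weights of an action's tags; tags not in weights fall back to EXTRA."""
    return sum(weights.get(e, EXTRA.get(e, 0.0)) for e in endorsements)

actions = {
    "keep current watering": ["selfSourced", "recentAction"],
    "switch to neighbour's pattern": ["neighbourhoodSourced",
                                      "bestEndorsedNeighbourSourced"],
    "adopt policy-agent advice": ["globallySourced"],
}

best = max(actions, key=lambda a: score(actions[a], WEIGHTS))
print("household adopts:", best)   # the neighbourhood-sourced option wins here
```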
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 51
History of a particular action
from one agent’s point of view
Month 1: used, endorsed as self sourced
Month 2: endorsed as recent (from personal use) and neighbour
sourced (used by agent 27) and self sourced (remembered)
Month 3: endorsed as recent (from personal use) and neighbour
sourced (agent 27 in month 2).
Month 4: endorsed as neighbour sourced twice, used by agents 26 and
27 in month 3, also recent
Month 5: endorsed as neighbour sourced (agent 26 in month 4), also
recent
Month 6: endorsed as neighbour sourced (agent 26 in month 5)
Month 7: replaced by action 8472 (appeared in month 5 as neighbour
sourced, now endorsed 4 times, including by the most alike
neighbour – agent 50)
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 52
Policy Agent - Behaviour
• After the first month of dry conditions, suggests
AFV actions to all households
• These actions are then included in the list of those
considered by the households
• If the household’s weights predispose it, it may
decide to adopt these actions
• Some other neighbours might imitate these
actions etc.
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 53
Number of consecutive dry months in
historical scenario
[Chart: number of consecutive dry months by simulation date, Jan 1973 – Jan 1997; values 0–9.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 54
Simulated Monthly Water
Consumption
[Histogram of simulated REL_CHNG, range roughly −.050 to .075: Std. Dev = .01, Mean = −.000, N = 325.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 55
Monthly Water Consumption (again)
[Histogram of observed REL_CHNG (repeated for comparison), range roughly −.50 to .88: Std. Dev = .17, Mean = .01, N = 81.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 56
Simulated Change in Monthly
Consumption
[Time series of simulated REL_CHNG by month, Oct 1970 – Sep 1997; values range roughly from −.06 to .10.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 57
Relative Change in Monthly
Consumption (again)
[Time series of observed REL_CHNG (repeated for comparison), Jun 1994 – Feb 2001; values range roughly from −.6 to 1.0.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 58
30% Neigh. biased, historical scenario, historical innov. dates
(Aggregate demand series scaled so 1973=100)
[Chart: relative demand by simulation date, Jan 1973 – Jan 1997; y-axis 0–200.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 59
80% Neigh. biased, historical scenario, historical innov. dates
(Aggregate demand series scaled so 1973=100)
[Chart: relative demand by simulation date, Jan 1973 – Jan 1997; y-axis 0–200.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 60
80% Neigh. biased, medium-high scenario, historical innov. dates
(Aggregate demand series scaled so 1973=100)
[Chart: relative demand by simulation date, Jan 1973 – Jan 1997; y-axis 0–200.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 61
What did the model tell us?
• That it is possible that social processes:
– can cause a high and unpredictable variety in patterns
of demand
– can ‘lock-in’ behavioural patterns and partially ‘insulate’
them from outside influence (droughts only occasionally
had a permanent effect on patterns of consumption)
• and that the availability of new products could
dominate effects from changing consumption
habits
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 62
Conclusions of Example 2
• ABM can be used to construct fairly-rich
computational descriptions of socially-related
phenomena which can be used
– to replicate systems that analytic techniques can’t deal with
– to explore some of the possibilities
• especially those unpredictable but non-random possibilities
caused by human behaviour
– as part of an iterative cycle of detailed criticism
• validatable by both data and expert opinion
– to inform and be informed by good observation
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 63
A central dilemma – what to trust?
[Diagram: a policy maker deciding whether to trust their intuitions or a complex simulation.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 64
But Modeller to Policy Actor Interface
is not easy
• Analysts/modellers and policy actors have
different: goals, language, methods, habits…
• Policy Actors will often want predictions –
certainty – even if the analysts know this is
infeasible
• Analysts will know how difficult the situation is to
understand and how much is unknown, and will
want to communicate their caveats (which often
get lost in the policy process)
• So discussion between them does not necessarily
go easily
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 65
Many views of a model (I)
- due to syntactic complexity
• Computational ‘distance’ between the specification
and the outcomes means that…
• …there are (at least) two very different views of a
simulation
[Diagram: Specification → Simulation → Representation of Outcomes (a consequence of complexity).]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 66
Many views of a model (II)
- understanding the simulation
[Diagram: Specification → Simulation → Representations of Outcomes (I) and (II), which are then understood via analogies (Analogy 1, Analogy 2), theories (Theory 1, Theory 2) and summaries (Summary 1, Summary 2) – consequences of complexity.]
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 67
Four Meanings (of the PM)
Research World
1. The researcher’s idea/intention for the PM
2. The fit of the PM with the evidence/data
The idea ↔ validation relation extensively
discussed within the research world
Policy World
3. The usefulness of the PM for decisions
4. The communicable story of the PM
The goal ↔ interpretation relation extensively
discussed within the policy world
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 68
Two Worlds
Research (the modeller)
• Ultimate goal is agreement with the observed (Truth)
• The modeller also has an idea of what the model is and how it works
Policy (the policy advisor)
• Ultimate goal is in the final outcomes (Usefulness)
• Decisions justified by a communicable causal story
The Policy Model
• Labels/documentation may be different from all of the above!
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 69
Joining the Two Worlds
Empirical
• Ultimate goal is agreement with the observed (Truth)
• The modeller also has an idea of what the model is and how it works
Instrumental
• Ultimate goal is in the final outcomes (Usefulness)
• Decisions justified by a communicable causal story
The Model
• Labels/documentation may be different from all of the above!
A tighter loop between the two (e.g. via participatory modelling) helps join the worlds.
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 70
Conclusions
• Complex systems cannot be relied upon to behave in regular ways
• Often averages, equilibria etc. are not very informative
• Future levels cannot meaningfully be predicted
• Simpler models may well make unreliable assumptions and not be representative
• Rather, complex models can be part of a risk analysis
• Identify some of the ways in which things can go wrong, implement measures to watch for these, then be able to react quickly to them (‘driving policy’)
• A tight measure-react loop can be essential for driving policy – modelling might help in this – but this is hard!
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 71
The End!
Bruce Edmonds:
http://bruce.edmonds.name
These Slides: http://slideshare.net/bruceedmonds
Centre for Policy Modelling: http://cfpm.org
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 72
Some Pitfalls in Model Construction
Pitfalls Part 1
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 73
Modelling Assumptions
• All models are built on assumptions, but…
• They have different origins and reliability, e.g.:
– Empirical evidence
– Other well-defined theory
– Expert Opinion
– Common-sense
– Tradition
– Stuff we had to assume to make the model possible
• Choosing assumptions is part of the art of
simulation but which assumptions are used
should be transparent and one should be honest
about their reliability – plausibility is not enough!
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 74
Theoretical Spectacles
• Our conceptions and models constrain:
1. how we look for evidence (e.g. where and what kinds)
2. what kind of models we develop
3. how we evaluate any results
• This is Kuhn’s “Theoretical Spectacles” (1962)
– e.g. continental drift
• This is MUCH stronger for a complex simulation
we have immersed ourselves in
• Try to remember that just because it is useful to
think of the world through our model, this does
not make it valid or reliable
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 75
Over-Simplified Models
• Although simple models have many pragmatic
advantages (easier to check, understand etc.)…
• If we have missed out key elements of what is being
modelled it might be completely wrong!
• Playing with simple models to inform formal and
intuitive understanding is an OK scientific practice
• …but it can be dangerous when informing policy
• Simple does not mean it is roughly correct, or more
general or gives us useful intuitions
• Need to accept that many modelling tasks requested
of us by policy makers are not wise to do with
restricted amounts of time/data/resources
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 76
Underestimating model limitations
• All models have limitations
• They are only good for certain things: a model
that explains well might not predict well
• They may well fail when applied in a different
context than the one they were developed in
• Policy actors often do not want to know about
limitations and caveats
• Not only do we have to be 100% honest about
these limitations, but we also have to ensure that
these limitations are communicated with the
model
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 77
Not checking & testing a model
thoroughly
• Doh!
• Sometimes there is not a clear demarcation
between an exploratory phase of model
development and its application to serious
questions (whose answers will impact on others)
• Sometimes an answer is demanded before
thorough testing and checking can be done – “It’s
OK, I just want an approximate answer” :-/
• Sometimes researchers are not honest
• Depends on the potential harm if the model is
relied on (at all) and turns out to be wrong
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 78
Some Pitfalls in Model Application
Pitfalls Part 2
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 79
Insufficiently Validated Models
• One cannot rely on a model until it has been
rigorously checked and tested against reality
• Plausibility is nowhere NEAR enough
• This needs to be done on more than one case
• It’s better if this is done independently
• You cannot validate a model using one set of
settings/cases then rely on it in another
• Validation usually takes a long time
• Iterated development and validation over many
cycles is better than one-off models (for policy)
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 80
Promising too much
• Modellers are in a position to see the potential of
their work, and so can tantalise others by
suggesting possible/future uses (e.g. in the
conclusions of papers or grant applications)
• They are tempted to suggest they can ‘predict’,
‘evaluate the impact of alternative polices’ etc.
• Especially with complex situations (that ABM is
useful for) this is simply deceptive
• ‘Giving a prediction to a policy maker is like giving
a sharp knife to a child’
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 81
The inherent plausibility of ABMs
• Due to the way ABMs map onto reality in a
common-sense manner (e.g. people ↔ agents)…
• …visualisations of what is happening can be
readily interpreted by non-modellers
• and hence given much greater credence than
they warrant (i.e. the extent of their validation)
• It is thus relatively easy to persuade using a good
ABM and visualisation
• Only we know how fragile they are, and need to
be especially careful about suggesting otherwise
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 82
Model Spread
• One of the big advantages of formal models is that
they can be passed around to be checked, played
with, extended, used etc.
• However once a model is out there, it might get
used for different purposes than intended
• e.g. the Black-Scholes model of derivative pricing
• Try to ensure a released model is packaged with
documentation that warns of its uses and
limitations
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 83
Narrowing the evidential base
• The case of the Newfoundland cod indicates how
models can work to constrain the evidence base,
thereby limiting decision making
• If a model is considered authoritative, then the
data it uses and produces can sideline other
sources of evidence
• Using a model rather than measuring lots of stuff
is cheap, but with obvious dangers
• Try to ensure models are used to widen the
possibilities considered, rather than limit them
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 84
Other/General Pitfalls
Pitfalls Part 3
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 85
Confusion over model purpose
• A model is not a picture of reality, but a tool
• A tool has a particular purpose
• A tool good for one purpose is probably not good
for another
• These include: prediction, explanation, as an
analogy, an illustration, a description, for theory
exploration, or for mediating between people
• Modellers should be 100% clear under which
purpose their model is to be judged
• Models need to be justified for each purpose
separately
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 86
When models are used out of the
context they were designed for
• Context matters!
• In each context there will be many
conditions/assumptions we are not even aware of
• A model designed in one context may fail for
subtle reasons in another (e.g. different ontology)
• Models generally need re-testing, re-validating
and often re-developing in new contexts
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 87
What models cannot reasonably do
• Many questions are beyond the realm of models
and modellers because they are essentially:
– ethical
– political
– social
– semantic
– symbolic
• Applying models to these (outside the walls of
our academic asylum) can confuse and distract
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 88
The uncertainty is too great
• The achievable reliability of outcome values is too low
for the purpose
• Can be due to data or model reasons
• Radical uncertainty is when it’s not a question of
degree but the situation might fundamentally
change or be different from the model
• Error estimation is only valid in the absence of radical
uncertainty (which is not the case in almost all
ecological, technical or social simulations)
• Just got to be honest about this and not only
present ‘best case’ results
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 89
A false sense of security
• If the outcomes of a model give a false sense of
certainty about outcomes then the model can be
worse than useless; positively damaging to policy
• Better to err on the side of caution and say there
is no good model in this case
• Even if you are optimistic for a particular model
• Distinction here between probabilistic and
possibilistic views
Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 90
Not more facts, but values!
• Sometimes it is not facts and projections that are
the issue but values
• However good models are, the ‘engineering’
approach to policy (enumerate policies, predict
impact of each, choose best policy) might be
inappropriate
• Modellers caught on the wrong side of history
may be blamed even though they were just doing
the technical parts

A relative description on Sonoporation.pdfnehabiju2046
 
Formation of low mass protostars and their circumstellar disks
Formation of low mass protostars and their circumstellar disksFormation of low mass protostars and their circumstellar disks
Formation of low mass protostars and their circumstellar disksSérgio Sacani
 
Cultivation of KODO MILLET . made by Ghanshyam pptx
Cultivation of KODO MILLET . made by Ghanshyam pptxCultivation of KODO MILLET . made by Ghanshyam pptx
Cultivation of KODO MILLET . made by Ghanshyam pptxpradhanghanshyam7136
 
Biological Classification BioHack (3).pdf
Biological Classification BioHack (3).pdfBiological Classification BioHack (3).pdf
Biological Classification BioHack (3).pdfmuntazimhurra
 
Animal Communication- Auditory and Visual.pptx
Animal Communication- Auditory and Visual.pptxAnimal Communication- Auditory and Visual.pptx
Animal Communication- Auditory and Visual.pptxUmerFayaz5
 
Artificial Intelligence In Microbiology by Dr. Prince C P
Artificial Intelligence In Microbiology by Dr. Prince C PArtificial Intelligence In Microbiology by Dr. Prince C P
Artificial Intelligence In Microbiology by Dr. Prince C PPRINCE C P
 
Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)PraveenaKalaiselvan1
 
Hubble Asteroid Hunter III. Physical properties of newly found asteroids
Hubble Asteroid Hunter III. Physical properties of newly found asteroidsHubble Asteroid Hunter III. Physical properties of newly found asteroids
Hubble Asteroid Hunter III. Physical properties of newly found asteroidsSérgio Sacani
 
Biopesticide (2).pptx .This slides helps to know the different types of biop...
Biopesticide (2).pptx  .This slides helps to know the different types of biop...Biopesticide (2).pptx  .This slides helps to know the different types of biop...
Biopesticide (2).pptx .This slides helps to know the different types of biop...RohitNehra6
 

Dernier (20)

Boyles law module in the grade 10 science
Boyles law module in the grade 10 scienceBoyles law module in the grade 10 science
Boyles law module in the grade 10 science
 
Natural Polymer Based Nanomaterials
Natural Polymer Based NanomaterialsNatural Polymer Based Nanomaterials
Natural Polymer Based Nanomaterials
 
Isotopic evidence of long-lived volcanism on Io
Isotopic evidence of long-lived volcanism on IoIsotopic evidence of long-lived volcanism on Io
Isotopic evidence of long-lived volcanism on Io
 
STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCE
STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCESTERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCE
STERILITY TESTING OF PHARMACEUTICALS ppt by DR.C.P.PRINCE
 
Zoology 4th semester series (krishna).pdf
Zoology 4th semester series (krishna).pdfZoology 4th semester series (krishna).pdf
Zoology 4th semester series (krishna).pdf
 
Grafana in space: Monitoring Japan's SLIM moon lander in real time
Grafana in space: Monitoring Japan's SLIM moon lander  in real timeGrafana in space: Monitoring Japan's SLIM moon lander  in real time
Grafana in space: Monitoring Japan's SLIM moon lander in real time
 
Nanoparticles synthesis and characterization​ ​
Nanoparticles synthesis and characterization​  ​Nanoparticles synthesis and characterization​  ​
Nanoparticles synthesis and characterization​ ​
 
Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...
Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...
Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...
 
Broad bean, Lima Bean, Jack bean, Ullucus.pptx
Broad bean, Lima Bean, Jack bean, Ullucus.pptxBroad bean, Lima Bean, Jack bean, Ullucus.pptx
Broad bean, Lima Bean, Jack bean, Ullucus.pptx
 
Botany 4th semester file By Sumit Kumar yadav.pdf
Botany 4th semester file By Sumit Kumar yadav.pdfBotany 4th semester file By Sumit Kumar yadav.pdf
Botany 4th semester file By Sumit Kumar yadav.pdf
 
CALL ON ➥8923113531 🔝Call Girls Kesar Bagh Lucknow best Night Fun service 🪡
CALL ON ➥8923113531 🔝Call Girls Kesar Bagh Lucknow best Night Fun service  🪡CALL ON ➥8923113531 🔝Call Girls Kesar Bagh Lucknow best Night Fun service  🪡
CALL ON ➥8923113531 🔝Call Girls Kesar Bagh Lucknow best Night Fun service 🪡
 
A relative description on Sonoporation.pdf
A relative description on Sonoporation.pdfA relative description on Sonoporation.pdf
A relative description on Sonoporation.pdf
 
Formation of low mass protostars and their circumstellar disks
Formation of low mass protostars and their circumstellar disksFormation of low mass protostars and their circumstellar disks
Formation of low mass protostars and their circumstellar disks
 
Cultivation of KODO MILLET . made by Ghanshyam pptx
Cultivation of KODO MILLET . made by Ghanshyam pptxCultivation of KODO MILLET . made by Ghanshyam pptx
Cultivation of KODO MILLET . made by Ghanshyam pptx
 
Biological Classification BioHack (3).pdf
Biological Classification BioHack (3).pdfBiological Classification BioHack (3).pdf
Biological Classification BioHack (3).pdf
 
Animal Communication- Auditory and Visual.pptx
Animal Communication- Auditory and Visual.pptxAnimal Communication- Auditory and Visual.pptx
Animal Communication- Auditory and Visual.pptx
 
Artificial Intelligence In Microbiology by Dr. Prince C P
Artificial Intelligence In Microbiology by Dr. Prince C PArtificial Intelligence In Microbiology by Dr. Prince C P
Artificial Intelligence In Microbiology by Dr. Prince C P
 
Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)
 
Hubble Asteroid Hunter III. Physical properties of newly found asteroids
Hubble Asteroid Hunter III. Physical properties of newly found asteroidsHubble Asteroid Hunter III. Physical properties of newly found asteroids
Hubble Asteroid Hunter III. Physical properties of newly found asteroids
 
Biopesticide (2).pptx .This slides helps to know the different types of biop...
Biopesticide (2).pptx  .This slides helps to know the different types of biop...Biopesticide (2).pptx  .This slides helps to know the different types of biop...
Biopesticide (2).pptx .This slides helps to know the different types of biop...
 

  • 8. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 8 Some modelling tensions [diagram placing Economic Models, Scenarios, Agent-based models, Stats/regression models and Reality between four competing desiderata: precision (model not vague), generality of scope (works for different cases), lack of error (accuracy of results) and realism (reflects knowledge of processes), with a marker for what is wanted for policy decisions]
  • 9. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 9 Problem 1: System Complexity • There is no guarantee that a simple model will be adequate to represent complex social/ecological/economic/technical systems • How the parts and actors interact and react might be crucial to the outcomes (e.g. financial markets) • We may not know which parts of the system are crucial to the outcomes • We may not fully understand how the parts interact and react • System and model are both too complex to fully explore and understand in the time available
  • 10. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 10 Problems 2&3: Error and Uncertainty • The values of many key parameters might be unknown or only approximately known • Data might be patchy and of poor quality • Tiny changes in key factors or parameters might have huge consequences for outcomes (the ‘butterfly effect’) • Levels of error may be amplified by the system (as in automated trading in financial markets) • There may be processes that we do not even know are important to the outcomes
  • 11. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 11 Problem 4: Structural Change • System evolves due to internal dynamics • For example, innovations might occur • System might have several very different behavioural ‘phases’ (e.g. bull and bear markets) which it shifts between • The rules of the system might change rapidly… • ...and well before any equilibrium is reached • Rule-change might be linked to system state • Different parts of the system might change in different ways
  • 12. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 12 Prediction • Given all these difficulties for many situations, prediction is not only infeasible… • ...but suggesting you can predict is dishonest • and may give false comfort (e.g. Newfoundland Cod Fisheries Collapse or 2007/8 financial crash) • Most techniques only work in two cases, where: 1. There is lots of experience/data over many previous episodes/cases 2. Nothing much changes (tomorrow similar to today) • Often even approximate or probabilistic prediction is infeasible and unhelpful
  • 13. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 13 The key question…. How does one manage a system or situation that is too complex to predict?
  • 14. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 14 Lessons from robotics: Part I Robotics in the 70s and 80s tried to (iteratively): 1. build a map of its situation (i.e. a predictive model) 2. use this model to plan its best action 3. then try to do this action 4. check it was doing OK, then go back to (1) But this did not work in any realistic situation: • It was far too slow to react to its world • to make usable predictions it had to make too many dodgy assumptions about its world
  • 15. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 15 Lessons from robotics: Part II Rodney Brooks (1991) Intelligence without representation. Artificial Intelligence, 47:139–160 A different approach: 1. Sense the world in rich fast ways 2. React to it quickly 3. Use a variety of levels of reaction a. low simple reactive strategies b. switched by progressively higher ones Do not try to predict the world, but react to it quickly This worked much better.
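To make the contrast concrete, here is a minimal, illustrative Python sketch of this layered sense-react style of control; the sensor keys, the two behaviours and the layer ordering are invented for illustration and are not Brooks' actual architecture or code.

    def sense(world):
        """Cheap, fast readings of the immediate situation (hypothetical keys)."""
        return {"obstacle_ahead": world.get("obstacle_ahead", False),
                "battery_low": world.get("battery_low", False)}

    def avoid(percept):       # lowest layer: a simple reactive strategy
        return "turn_left" if percept["obstacle_ahead"] else "go_forward"

    def recharge(percept):    # higher layer: only switches behaviour when it applies
        return "seek_charger" if percept["battery_low"] else None

    LAYERS = [recharge, avoid]     # higher layers first; the first non-None action wins

    def act(world):
        """No world model, no plan: just sense and pick the first applicable reaction."""
        percept = sense(world)
        for layer in LAYERS:
            action = layer(percept)
            if action is not None:
                return action

    print(act({"obstacle_ahead": True}))   # -> turn_left
    print(act({"battery_low": True}))      # -> seek_charger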
  • 16. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 16 Lessons from Weather Forecasting • Taking measurements at a few places and trying to predict what will happen using simple, average-based models does not work well • Understanding the weather improved with very detailed simulations fed by rich and comprehensive sensing of the system • Even then forecasters recognise that there is more than one possible outcome (so they use ensembles of specific outcomes) • If these indicate a risk of severe weather they issue a warning so mitigating measures can be taken
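A minimal sketch of this ensemble-and-warning logic, with a trivial random walk standing in for a real weather simulation; the member count, severity threshold and risk level below are invented parameters, not any forecasting agency's actual criteria.

    import random

    def run_member(seed, horizon=24):
        """Stand-in for one detailed simulation run (here just a noisy random walk)."""
        rng = random.Random(seed)
        x = 0.0
        for _ in range(horizon):
            x += rng.gauss(0, 1)
        return x

    def severe_weather_warning(n_members=50, threshold=5.0, risk_level=0.2):
        """Issue a warning if enough ensemble members exceed the severe threshold."""
        outcomes = [run_member(seed) for seed in range(n_members)]
        fraction_severe = sum(o > threshold for o in outcomes) / n_members
        return fraction_severe >= risk_level, fraction_severe

    warn, frac = severe_weather_warning()
    print(f"warning issued: {warn} (fraction of members severe: {frac:.2f})")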
  • 17. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 17 Lessons from Radiation Levels • The human body is a very complex system • It has long been known that too much radiation can cause severe illness or death in humans • In the 30s & 40s it was assumed there was a “safe” level of radiation • However it was later discovered that any level of radiation carried a risk of illness • Including naturally occurring levels • Although an increase in radiation might not seem to affect many people, it did result in more illnesses in some
  • 18. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 18 Socio-Ecological Systems • Are the combination of human society embedded within an ecological system (SES) • Many social and ecological systems are far too complex to predict • Their combination is doubly complex • E.g. fisheries, deforestation, species extinctions • Yet we still basically use the 1970s robotics “predict and plan” approach to these… • …as if we can plan optimum policies by estimating/projecting future impact
  • 19. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 19 Why simple models won’t work • Simpler models do not necessarily get things “roughly” right • Simpler models are not more general • They can also be very deceptive – especially with regards to complex ways things can go wrong • In complex systems the detailed interactions can take outcomes ‘far from equilibrium’ and far from average behaviour • Sometimes, with complex systems, a simple model that relies on strong assumptions can be far worse than having no models at all
  • 20. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 20 A Cautionary Tale • On the 2nd July 1992 Canada’s fisheries minister placed a moratorium on all cod fishing off Newfoundland. That day 30,000 people lost their jobs. • Scientists and the fisheries department throughout much of the 1980s estimated a 15% annual rate of growth in the stock – (figures that were consistently disputed by inshore fishermen). • The subsequent Harris Report (1992) said (among many other things) that: “..scientists, lulled by false data signals and… overconfident of the validity of their predictions, failed to recognize the statistical inadequacies in … [their] model[s] and failed to … recognize the high risk involved with state-of-stock advice based on … unreliable data series.”
  • 21. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 21 What had gone wrong? • “… the idea of a strongly rebuilding Northern cod stock that was so powerful that it …[was]... read back… through analytical models built upon necessary but hypothetical assumptions about population and ecosystem dynamics. Further, those models required considerable subjective judgement as to the choice of weighting of the input variables” (Finlayson 1994, p.13) • Finlayson concluded that the social dynamics between scientists and managers were at play • Scientists adapting to the wishes and worldview of managers, managers gaining confidence in their approach from the apparent support of science
  • 22. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 22 Example 1: Fishing! • …is a dynamic, spatial, individual-based ecological model that has some of the complexity, adaptability and fragility of observed ecological systems with emergent outcomes • It evolves complex, local food webs, endogenous shocks from invasive species, is adaptive but unpredictable as to the eventual outcomes • Into this the impact of humans can be imposed or even agents representing humans ‘injected’ into the simulation • The outcomes can be then analysed at a variety of levels over long time scales, and under different scenarios • Paper: Edmonds, B. (in press) A Socio-Ecological Test Bed. Ecological Complexity. • Full details and code at: http://openabm.org/model/4204
  • 23. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 23 In this version • Plants and higher order entities (fish) distinguished (no photosynthesizing herbivores!) • First a rich competing plant ecology is evolved • Then single fish are injected until fish take hold and evolve, until there is an ecology of many fish species; this is run for a while longer to allow ‘transients’ to die away • This state is then frozen and saved • From this point different ‘fishing’ policies are implemented and the simulations then run • with the outcomes then analysed
  • 24. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 24 The Model • A wrapped 2D grid of well-mixed patches with: – energy (transient) – bit string of characteristics • Organisms represented individually, each with its own characteristics, including: – bit string of characteristics – energy – position [diagram: a well-mixed patch; each individual represented separately; slow random rate of migration between patches]
  • 25. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 25 Model sequence each simulation tick 1. Input energy equally divided between patches. 2. Death. A life tax is subtracted, some individuals die, and age is incremented. 3. Initial seeding. Until a viable population is established, a random new individual is added. 4. Energy extraction from patch. Patch energy is divided among the individuals there with a positive score when their bit-string is evaluated against the patch. 5. Predation. Each individual is randomly paired with a number of others on the patch; if it dominates them it gets a % of their energy and the other is removed. 6. Maximum store. Energy above a maximum level is discarded. 7. Birth. Those with energy > “reproduce-level” give birth to a new entity with the same bit-string as themselves, with a probability of mutation; the child has an energy of 1, taken from the parent. 8. Migration. Individuals randomly move to one of their 4 neighbouring patches. 9. Statistics. Various statistics are calculated. (An illustrative sketch of this tick sequence follows below.)
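The following is a condensed, illustrative Python sketch of that tick sequence. The data structures, parameter values and the fitness/dominance rules are simplifying assumptions made here for readability; the published model (linked on slide 22) differs in detail.

    import random

    GRID = 5              # wrapped GRID x GRID world of well-mixed patches
    BITS = 8              # length of characteristic bit-strings
    rng = random.Random(1)

    def new_individual(x, y):
        return {"bits": [rng.randint(0, 1) for _ in range(BITS)],
                "energy": 1.0, "x": x, "y": y, "age": 0}

    def match(bits, patch_bits):
        """Toy fitness score of an organism's bit-string against a patch."""
        return sum(b == p for b, p in zip(bits, patch_bits))

    patches = {(x, y): [rng.randint(0, 1) for _ in range(BITS)]
               for x in range(GRID) for y in range(GRID)}
    population = [new_individual(rng.randrange(GRID), rng.randrange(GRID))
                  for _ in range(30)]

    def tick(pop, energy_input=50.0, life_tax=0.2, reproduce_level=2.0,
             max_store=5.0, mutation=0.05, migration=0.02):
        per_patch = energy_input / len(patches)            # 1. energy input per patch
        for ind in pop:                                    # 2. life tax, death, ageing
            ind["energy"] -= life_tax
            ind["age"] += 1
        pop = [i for i in pop if i["energy"] > 0]
        if not pop:                                        # 3. seeding if nothing viable
            pop = [new_individual(rng.randrange(GRID), rng.randrange(GRID))]
        for (x, y), pbits in patches.items():              # 4. extraction: share patch energy
            locals_ = [i for i in pop if (i["x"], i["y"]) == (x, y)
                       and match(i["bits"], pbits) > 0]
            for i in locals_:
                i["energy"] += per_patch / len(locals_)
        for ind in list(pop):                              # 5. predation: pair on same patch
            others = [o for o in pop if o is not ind
                      and (o["x"], o["y"]) == (ind["x"], ind["y"])]
            if others:
                victim = rng.choice(others)
                if ind["energy"] > victim["energy"]:       # crude 'domination' rule
                    ind["energy"] += 0.5 * victim["energy"]
                    if victim in pop:
                        pop.remove(victim)
        newborn = []
        for ind in pop:                                    # 6. cap store; 7. birth with mutation
            ind["energy"] = min(ind["energy"], max_store)
            if ind["energy"] > reproduce_level:
                child = new_individual(ind["x"], ind["y"])
                child["bits"] = [1 - b if rng.random() < mutation else b
                                 for b in ind["bits"]]
                ind["energy"] -= 1.0
                newborn.append(child)
        pop += newborn
        for ind in pop:                                    # 8. migration on the wrapped grid
            if rng.random() < migration:
                dx, dy = rng.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
                ind["x"], ind["y"] = (ind["x"] + dx) % GRID, (ind["y"] + dy) % GRID
        return pop                                         # 9. statistics would be gathered here

    for t in range(20):
        population = tick(population)
    print(f"population after 20 ticks: {len(population)}")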
  • 26. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 26 First, evolve a rich mixed ecology Evolve and save a suitable complex ecology with a balance of trophic layers (final state shown to the left, with a log population scale) [chart annotations: First Successful Plant, Herbivores Appear, Carnivores Appear, Simulation “Frozen”]
  • 27. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 27 This version designed to test possible outcomes of fishing policies • Complex aquatic plant ecology evolved • Herbivore fish injected into ecology and whole system further evolved • Once a complex ecology with higher-order predators has emerged, the system is fixed as the starting point • Different extraction (i.e. fishing) policies can be enacted on top of this system (see the sketch below): – How much fish is extracted each time (either absolute numbers or as a proportion of existing numbers) – Whether extraction is uniformly at random or patch-by-patch – How many ‘reserves’ are kept – Whether there is a minimum stock level below which there is no fishing
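As an illustration of how such extraction options might be parameterised and applied each tick, here is a minimal Python sketch; the dictionary keys, the way reserve patches are chosen and the fish representation are assumptions made for this example, not the model's actual implementation.

    import random

    POLICY = {
        "quota": 30,              # fish removed per tick...
        "as_proportion": False,   # ...or, if True, quota is a % of the current stock
        "by_patch": True,         # fish patch-by-patch rather than uniformly at random
        "reserves": 3,            # number of patches never fished
        "min_stock": 500,         # no fishing below this total stock level
    }

    def apply_policy(fish, policy, rng=random.Random(0)):
        """Remove fish according to the policy; return the surviving fish."""
        if len(fish) < policy["min_stock"]:
            return fish                                    # stock too low: no fishing this tick
        quota = (int(policy["quota"] / 100 * len(fish))
                 if policy["as_proportion"] else policy["quota"])
        patch_ids = sorted({f["patch"] for f in fish})
        protected = set(patch_ids[:policy["reserves"]])    # crude choice of reserve patches
        catchable = [f for f in fish if f["patch"] not in protected]
        if policy["by_patch"]:
            rng.shuffle(patch_ids)
            caught = []
            for p in patch_ids:
                if len(caught) >= quota:
                    break
                caught += [f for f in catchable if f["patch"] == p][:quota - len(caught)]
        else:
            caught = rng.sample(catchable, min(quota, len(catchable)))
        caught_ids = {id(f) for f in caught}
        return [f for f in fish if id(f) not in caught_ids]

    fish = [{"patch": random.randrange(25)} for _ in range(1000)]   # 1000 fish on 25 patches
    print(len(apply_policy(fish, POLICY)))                          # roughly 970 remain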
  • 28. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 28 Demonstration of the basic model
  • 29. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 29 Typical Harvest Shape (last 100 ticks) for different catch levels over 20 different runs [chart: catch level (per tick) vs. proportion of maximum]
  • 30. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 30 Decide in your groups 1. Amount of fish extraction (quota) per tick, either: • Absolute number (0-200) • Percentage of existing stock (0-100%) 2. The way fish is extracted, either: • Randomly over whole grid • Random patch chosen and fished, then next until quota for tick is reached 3. How many patches will be kept as reserves (not fished) 4. When to start fishing (0-999 ticks)
  • 31. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 31 Total Extinction Prob. & Av. Total Harvest (last 100 ticks) for different catch levels [chart: catch level (per tick) vs. proportion of maximum]
  • 32. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 32 Num Fish (all species, 20 runs) – catch level 25 [time-series chart: total fish numbers (0–6000) over 1000 ticks]
  • 33. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 33 Num Fish (all species, 20 runs) – catch level 35 [time-series chart: total fish numbers (0–6000) over 1000 ticks; annotation: catch target = 30]
  • 34. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 34 Num Fish (all species, 20 runs) – catch level 50 [time-series chart: total fish numbers (0–6000) over 1000 ticks]
  • 35. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 35 Average (over 20 runs) of fish at end of 5000 simulation ticks [chart: Number Fish for Different Catch Levels, catch levels 0–100]
  • 36. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 36 Average (over 20 runs) of numbers of fish species at end of 5000 simulation ticks [chart: Num Fish Species with Catch Level, catch levels 0–100]
  • 37. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 37 Average Number of Species vs. Catch Level (from a different starting ecology) [chart: Num Species Fish, catch levels 0–40]
  • 38. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 38 Average Number of Species, Catch=20 [chart: average number of species over 1000 ticks, comparing "by patches" and "uniform" extraction]
  • 39. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 39 Average Number of Species, Catch=30 [chart: average number of species over 1000 ticks, comparing "by patches" and "uniform" extraction]
  • 40. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 40 Average Number of Species, Catch=40 [chart: average number of species over 1000 ticks, comparing "by patches" and "uniform" extraction]
  • 41. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 41 A risk-analysis approach 1. Give up on estimating future impact or “safe” levels of exploitation 2. Make simulation models that include more of the observed complication and complex interactions 3. Run these lots of times with various scenarios to discover some of the ways in which things can go surprisingly wrong (or surprisingly right) 4. Put in place sensors/measures that would give us the earliest possible warning that these might be occurring in real life 5. React quickly if these warnings emerge (a sketch of the scenario sweep in step 3 follows below)
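A minimal sketch of step 3 of this approach: run a (here toy) simulation many times per candidate policy and record how often a bad outcome occurs, rather than estimating a single expected value or "safe" level. The logistic stock model and all numbers below are stand-ins for the full agent-based runs, chosen only for illustration.

    import random

    def run_fishery(catch_level, seed, ticks=1000):
        """Toy stand-in for one full simulation run: returns the final stock."""
        rng = random.Random(seed)
        stock = 5000.0
        for _ in range(ticks):
            stock += 0.05 * stock * (1 - stock / 6000) - catch_level   # growth minus catch
            stock *= rng.uniform(0.97, 1.03)                           # environmental noise
            if stock <= 0:
                return 0.0
        return stock

    def collapse_risk(catch_levels, runs=20):
        """For each policy, the fraction of runs ending in total collapse."""
        return {c: sum(run_fishery(c, seed) == 0.0 for seed in range(runs)) / runs
                for c in catch_levels}

    for level, risk in collapse_risk([25, 50, 75, 100]).items():
        print(f"catch {level}/tick -> fraction of runs collapsing: {risk:.2f}")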
  • 42. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 42 Example 2: Social Influence and Domestic Water Demand • Produced for the Environment Agency/DEFRA • Part of a bigger project to predict future domestic water demand in the UK given some different future politico-economic scenarios and climate change • The rest of the project were detailed statistical models to do the prediction • This model was to examine the assumptions and look at the envelope of possibilities • Joint work with Olivier Barthelemy and Scott Moss
  • 43. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 43 Monthly Water Consumption [histogram of relative change (REL_CHNG): Std. Dev = .17, Mean = .01, N = 81]
  • 44. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 44 Relative Change in Monthly Consumption [time-series chart of REL_CHNG by date, Jun 1994 – Feb 2001, roughly -0.6 to 1.0]
  • 45. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 45 Purpose of the SI&DWD Model • Not long-term prediction • But to begin to understand the relationship of socially-influenced consumer behaviour to patterns of water demand • By producing a representational agent model amenable to fine-grained criticism • And hence to suggest possible interactions • So that these can be investigated/confirmed • And this loop iterated
  • 46. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 46 Model Structure - Overall Structure [diagram linking Households (Activity, Frequency, Volume), a Policy Agent, Ground conditions (Temperature, Rainfall, Sunshine) and Aggregate Demand]
  • 47. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 47 Model Structure - Microcomponents • Each household has a variable number of micro-components (power showers etc.): bath, shower, power_shower, washing_machine, hand_dishwashing, clothes_hand_washing, toilets, sprinkler, other_garden_watering • Actions are expressed by the frequency and volume of use of each microcomponent • AVF distribution in the model calibrated by data from the Three Valleys (a sketch of this representation follows below)
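A minimal sketch of this frequency-and-volume representation of a single household's demand; the microcomponent names come from the slide, but the frequencies and volumes below are invented for illustration and are not the calibrated Three Valleys figures.

    HOUSEHOLD = {
        # microcomponent: (uses per month, litres per use) -- illustrative numbers only
        "bath":                  (10, 80.0),
        "shower":                (20, 35.0),
        "power_shower":          (0, 65.0),     # not owned, so frequency is 0
        "washing_machine":       (12, 50.0),
        "hand_dishwashing":      (30, 8.0),
        "clothes_hand_washing":  (0, 30.0),
        "toilets":               (150, 7.5),
        "sprinkler":             (2, 100.0),
        "other_garden_watering": (4, 20.0),
    }

    def monthly_demand(household):
        """Monthly consumption = sum over microcomponents of frequency x volume."""
        return sum(freq * vol for freq, vol in household.values())

    print(f"household demand: {monthly_demand(HOUSEHOLD):.0f} litres/month")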
  • 48. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 48 Model Structure - Household Distribution • Households distributed randomly on a grid • Each household can copy from a set of neighbours (currently those up to 4 units up, down, left and right from them) • They decide which is the neighbour most similar to themselves – this is the one they are most likely to copy • Depending on their evaluation of actions they might adopt that neighbour’s actions (see the imitation sketch below)
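A minimal sketch of the copying step just described: each household finds its most similar neighbour within its neighbourhood and may adopt that neighbour's actions. The overlap-based similarity measure and the flat adoption probability are simplifying assumptions standing in for the model's endorsement-based evaluation.

    import random

    def similarity(a, b):
        """Overlap between two households' sets of current actions."""
        return len(a["actions"] & b["actions"])

    def maybe_copy(household, neighbours, adopt_prob=0.3, rng=random.Random(0)):
        """Identify the most alike neighbour and possibly adopt its actions."""
        if not neighbours:
            return
        most_alike = max(neighbours, key=lambda n: similarity(household, n))
        if rng.random() < adopt_prob:            # stands in for the endorsement-based evaluation
            household["actions"] |= most_alike["actions"]

    h = {"actions": {"short_shower", "water_garden_weekly"}}
    ns = [{"actions": {"short_shower", "use_water_butt"}},
          {"actions": {"bath_daily"}}]
    maybe_copy(h, ns, adopt_prob=0.9)
    print(h["actions"])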
  • 49. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 49 An Example Social Structure [network diagram with households marked as Global Biased, Locally Biased or Self Biased]
  • 50. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 50 Household Behaviour - Endorsements • Action Endorsements: recentAction, neighbourhoodSourced, selfSourced, globallySourced, newAppliance, bestEndorsedNeighbourSourced • 3 weights moderate the effective strengths of neighbourhoodSourced, selfSourced and globallySourced endorsements and hence the bias of households • Can be characterised as 3 types of households influenced in different ways: global-, neighbourhood- and self-sourced, depending on the dominant weight (a scoring sketch follows below)
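A minimal sketch of how such weights might combine to score a candidate action; the additive scoring rule and the particular weight values (here for a neighbourhood-biased household) are assumptions for illustration, not the model's actual endorsement scheme.

    WEIGHTS = {                        # the dominant weight characterises the household type
        "selfSourced": 1.0,
        "neighbourhoodSourced": 3.0,   # a neighbourhood-biased household
        "globallySourced": 0.5,
        "recentAction": 0.5,
        "newAppliance": 0.5,
        "bestEndorsedNeighbourSourced": 1.5,
    }

    def score(endorsements, weights=WEIGHTS):
        """Sum the weights of every endorsement currently attached to an action."""
        return sum(weights.get(e, 0.0) for e in endorsements)

    # An action used by two neighbours and remembered from the household's own past use:
    print(score(["neighbourhoodSourced", "neighbourhoodSourced", "selfSourced"]))    # 7.0
    # An action only suggested globally (e.g. by the policy agent):
    print(score(["globallySourced"]))                                                # 0.5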
  • 51. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 51 History of a particular action from one agent’s point of view Month 1: used, endorsed as self sourced Month 2: endorsed as recent (from personal use) and neighbour sourced (used by agent 27) and self sourced (remembered) Month 3: endorsed as recent (from personal use) and neighbour sourced (agent 27 in month 2). Month 4: endorsed as neighbour sourced twice, used by agents 26 and 27 in month 3, also recent Month 5: endorsed as neighbour sourced (agent 26 in month 4), also recent Month 6: endorsed as neighbour sourced (agent 26 in month 5) Month 7: replaced by action 8472 (appeared in month 5 as neighbour sourced, now endorsed 4 times, including by the most alike neighbour – agent 50)
  • 52. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 52 Policy Agent - Behaviour • After the first month of dry conditions, suggests AFV actions to all households • These actions are then included in the list of those considered by the households • If the household’s weights predispose it, it may decide to adopt these actions • Some other neighbours might imitate these actions etc.
  • 53. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 53 Number of consecutive dry months in historical scenario [chart: number of consecutive dry months (0–9) by simulation date, Jan 1973 – Jan 1997]
  • 54. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 54 Simulated Monthly Water Consumption [histogram of REL_CHNG: Std. Dev = .01, Mean = -.000, N = 325]
  • 55. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 55 Monthly Water Consumption (again) [histogram of REL_CHNG: Std. Dev = .17, Mean = .01, N = 81]
  • 56. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 56 Simulated Change in Monthly Consumption [time-series chart of REL_CHNG by date, Oct 1970 – Sep 1997, roughly -0.06 to 0.10]
  • 57. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 57 Relative Change in Monthly Consumption (again) [time-series chart of REL_CHNG by date, Jun 1994 – Feb 2001, roughly -0.6 to 1.0]
  • 58. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 58 30% Neigh. biased, historical scenario, historical innov. dates – aggregate demand series scaled so 1973=100 [chart: relative demand by simulation date, Jan 1973 – Jan 1997]
  • 59. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 59 80% Neigh. biased, historical scenario, historical innov. dates – aggregate demand series scaled so 1973=100 [chart: relative demand by simulation date, Jan 1973 – Jan 1997]
  • 60. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 60 80% Neigh. biased, medium-high scenario, historical innov. dates – aggregate demand series scaled so 1973=100 [chart: relative demand by simulation date, Jan 1973 – Jan 1997]
  • 61. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 61 What did the model tell us? • That it is possible that social processes: – can cause a high and unpredictable variety in patterns of demand – can ‘lock-in’ behavioural patterns and partially ‘insulate’ them from outside influence (droughts only occasionally had a permanent effect on patterns of consumption) • and that the availability of new products could dominate effects from changing consumption habits
  • 62. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 62 Conclusions of Example 2 • ABM can be used to construct fairly-rich computational descriptions of socially-related phenomena which can be used – to replicate systems analytic techniques can’t deal with – to explore some of the possibilities • especially those unpredictable but non-random possibilities caused by human behaviour – as part of an iterative cycle of detailed criticism • validatable by both data and expert opinion – to inform and be informed by good observation
  • 63. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 63 A central dilemma – what to trust? [diagram: Intuitions, A complex simulation, A policy maker]
  • 64. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 64 But Modeller to Policy Actor Interface is not easy • Analysts/modellers and policy actors have different: goals, language, methods, habits… • Policy Actors will often want predictions – certainty – even if the analysts know this is infeasible • Analysts will know how difficult the situation is to understand and how much is unknown, and will want to communicate their caveats (which often get lost in the policy process) • So discussion between them does not necessarily go easily
  • 65. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 65 Many views of a model (I) - due to syntactic complexity • Computational ‘distance’ between specification and outcomes means that • There are (at least) two very different views of a simulation (consequences of complexity) [diagram: Specification → Simulation → Representation of Outcomes]
  • 66. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 66 Many views of a model (II) - understanding the simulation (consequences of complexity) [diagram: Specification → Simulation → Representation of Outcomes (I) → Representation of Outcomes (II), with Analogy 1, Analogy 2, Theory 1, Theory 2, Summary 1 and Summary 2 as different ways of understanding the outcomes]
  • 67. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 67 Four Meanings (of the PM) Research World 1. The researcher’s idea/intention for the PM 2. The fit of the PM with the evidence/data The idea ↔ validation relation extensively discussed within the research world Policy World 3. The usefulness of the PM for decisions 4. The communicable story of the PM The goal ↔ interpretation relation extensively discussed within the policy world
  • 68. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 68 Two Worlds Research • Ultimate Goal is Agreement with Observed (Truth) • Modeller also has an idea of what the model is and how it works Policy • Ultimate Goal is in Final Outcomes (Usefulness) • Decisions justified by a communicable causal story Policy Model • Labels/Documentation may be different from all of the above! [diagram roles: Modeller, Policy Advisor]
  • 69. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 69 Joining the Two Worlds Empirical • Ultimate Goal is Agreement with Observed (Truth) • Modeller also has an idea of what the model is and how it works Instrumental • Ultimate Goal is in Final Outcomes (Usefulness) • Decisions justified by a communicable causal story Model • Labels/Documentation may be different from all of the above! Tighter loop (e.g. via participatory modelling)
  • 70. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 70 Conclusions • Complex systems cannot be relied upon to behave in regular ways • Often averages, equilibria etc. are not very informative • Future levels cannot meaningfully be predicted • Simpler models may well make unreliable assumptions and not be representative • Rather complex models can be part of a risk-analysis • Identifying some of the ways in which things can go wrong, implementing measures to watch for these, then being able to react quickly to them (‘driving policy’) • A tight measure-react loop can be essential for driving policy – modelling might help in this – but this is hard!
  • 71. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 71 The End! Bruce Edmonds: http://bruce.edmonds.name These Slides: http://slideshare.net/bruceedmonds Centre for Policy Modelling: http://cfpm.org
  • 72. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 72 Some Pitfalls in Model Construction Pitfalls Part 1
  • 73. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 73 Modelling Assumptions • All models are built on assumptions, but… • They have different origins and reliability, e.g.: – Empirical evidence – Other well-defined theory – Expert Opinion – Common-sense – Tradition – Stuff we had to assume to make the model possible • Choosing assumptions is part of the art of simulation but which assumptions are used should be transparent and one should be honest about their reliability – plausibility is not enough!
  • 74. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 74 Theoretical Spectacles • Our conceptions and models constrain how we 1. look for evidence (e.g. where and what kinds) 2. what kind of models we develop 3. how we evaluate any results • This is Kuhn’s “Theoretical Spectacles” (1962) – e.g. continental drift • This is MUCH stronger for a complex simulation we have immersed ourselves in • Try to remember that just because it is useful to think of the world through our models, this does not make them valid or reliable
  • 75. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 75 Over-Simplified Models • Although simple models have many pragmatic advantages (easier to check, understand etc.)… • If we have missed out key elements of what is being modelled it might be completely wrong! • Playing with simple models to inform formal and intuitive understanding is an OK scientific practice • …but it can be dangerous when informing policy • Simple does not mean it is roughly correct, or more general or gives us useful intuitions • Need to accept that many modelling tasks requested of us by policy makers are not wise to do with restricted amounts of time/data/resources
  • 76. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 76 Underestimating model limitations • All models have limitations • They are only good for certain things: a model that explains well might not predict well • They may well fail when applied in a different context from the one they were developed in • Policy actors often do not want to know about limitations and caveats • Not only do we have to be 100% honest about these limitations, but we also have to ensure that these limitations are communicated with the model
  • 77. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 77 Not checking & testing a model thoroughly • Doh! • Sometimes there is not a clear demarcation between an exploratory phase of model development and its application to serious questions (whose answers will impact on others) • Sometimes an answer is demanded before thorough testing and checking can be done – “It’s OK, I just want an approximate answer” :-/ • Sometimes researchers are not honest • Depends on the potential harm if the model is relied on (at all) and turns out to be wrong
  • 78. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 78 Some Pitfalls in Model Application Pitfalls Part 2
  • 79. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 79 Insufficiently Validated Models • One can not rely on a model until it has been rigorously checked and tested against reality • Plausibility is nowhere NEAR enough • This needs to be on more than one case • It’s better if this is done independently • You can not validate a model using one set of settings/cases then rely on it in another • Validation usually takes a long time • Iterated development and validation over many cycles is better than one-off models (for policy)
  • 80. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 80 Promising too much • Modellers are in a position to see the potential of their work, and so can tantalise others by suggesting possible/future uses (e.g. in the conclusions of papers or grant applications) • They are tempted to suggest they can ‘predict’, ‘evaluate the impact of alternative polices’ etc. • Especially with complex situations (that ABM is useful for) this is simply deceptive • ‘Giving a prediction to a policy maker is like giving a sharp knife to a child’
  • 81. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 81 The inherent plausibility of ABMs • Due to the way ABMs map onto reality in a common-sense manner (e.g. people → agents)… • …visualisations of what is happening can be readily interpreted by non-modellers • and hence given much greater credence than they warrant (i.e. the extent of their validation) • It is thus relatively easy to persuade using a good ABM and visualisation • Only we know how fragile they are, and need to be especially careful about suggesting otherwise
  • 82. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 82 Model Spread • One of the big advantages of formal models is that they can be passed around to be checked, played with, extended, used etc. • However once a model is out there, it might get used for different purposes than intended • e.g. the Black-Scholes model of derivative pricing • Try to ensure a released model is packaged with documentation that warns of its uses and limitations
  • 83. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 83 Narrowing the evidential base • The case of the Newfoundland cod, indicates how models can work to constrain the evidence base, therefore limiting decision making • If a model is considered authoritative, then the data it uses and produces can sideline other sources of evidence • Using a model rather than measuring lots of stuff is cheap, but with obvious dangers • Try to ensure models are used to widen the possibilities considered, rather than limit them
  • 84. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 84 Other/General Pitfalls Pitfalls Part 3
  • 85. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 85 Confusion over model purpose • A model is not a picture of reality, but a tool • A tool has a particular purpose • A tool good for one purpose is probably not good for another • These include: prediction, explanation, as an analogy, an illustration, a description, for theory exploration, or for mediating between people • Modellers should be 100% clear under which purpose their model is to be judged • Models need to be justified for each purpose separately
  • 86. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 86 When models are used out of the context they were designed for • Context matters! • In each context there will be many conditions/assumptions we are not even aware of • A model designed in one context may fail for subtle reasons in another (e.g. different ontology) • Models generally need re-testing, re-validating and often re-developing in new contexts
  • 87. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 87 What models cannot reasonably do • Many questions are beyond the realm of models and modellers because they are essentially – ethical – political – social – semantic – symbolic • Applying models to these (outside the walls of our academic asylum) can confuse and distract
  • 88. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 88 The uncertainty is too great • The achievable reliability of outcome values is too low for the purpose • Can be due to data or model reasons • Radical uncertainty is when it’s not a question of degree but the situation might fundamentally change or be different from the model • Error estimation is only valid in the absence of radical uncertainty (which is not the case in almost all ecological, technical or social simulations) • Just got to be honest about this and not only present ‘best case’ results
  • 89. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 89 A false sense of security • If the outcomes of a model give a false sense of certainty about outcomes then a model can be worse than useless; positively damaging to policy • Better to err on the side of caution and say there is no good model in this case • Even if you are optimistic for a particular model • Distinction here between probabilistic and possibilistic views
  • 90. Finding out what could go wrong before it does, Bruce Edmonds, Cambridge, Sept. 2018. slide 90 Not more facts, but values! • Sometimes it is not facts and projections that are the issue but values • However good models are, the ‘engineering’ approach to policy (enumerate policies, predict impact of each, choose best policy) might be inappropriate • Modellers caught on the wrong side of history may be blamed even though they were just doing the technical parts