A talk given at ICES 2018.
Scientific modelling can make things worse, as in the case of the North Atlantic cod fisheries collapse. Some of these failures have been attributed to the simplicity of the models used compared to what they were trying to model. Multi-Agent-Based Simulation (MABS) pushes the boundaries of what can be simulated, prompting many to assume that it can usefully inform policy, even in the face of complexity. That said, MABS also brings with it new difficulties and potential confusions. This paper surveys some of the pitfalls that can arise when analysts try to use MABS in this way. Researchers who claim (or imply) that MABS can reliably predict are criticised in particular. However, an alternative is suggested: using MABS for a kind of uncertainty analysis, identifying some of the possible ways a policy could go wrong (or indeed go right). A fisheries example is given. This alternative may widen, rather than narrow, the range of evidence and possibilities that are considered, which could enrich the policy-making process. We call this Reflexive Possibilistic Modelling.
Relates to paper at: http://cfpm.org/discussionpapers/215
Using agent-based modelling to inform policy – what could possibly go wrong?
1. The spectre of complexity and the value of pluralism
– A case for Reflexive Possibilistic Modelling
Bruce Edmonds and Lia ní Aodha
Centre for Policy Modelling
Manchester Metropolitan University
http://cfpm.org
ICES ASC 2018, Hamburg. 25-29 September.
SAF21 is a project financed under the EU Horizon 2020 Marie Skłodowska-Curie (MSC) ITN-ETN programme (project 642080).
2. “it makes quite a difference whether the world is viewed as a machine or as a turbulent stream” (Kwa 1994: 387).
“A recurrent theme of Western philosophy and science, including social science, has been the attempt to reformulate systems of knowledge in order to bracket uncertainty...” (Scott, 1998: 321).
3. The ‘Engineering’ approach to policy
The basic strategy is:
1. Decide how to measure attainment of goals
2. List possible approaches/strategies/policies
3. Evaluate these by predicting their measured benefits minus costs (see the sketch below)
4. Choose the best one
(Sometimes this is embedded within a cycle: (a) choose a policy, using the above or otherwise, (b) try it out, (c) wait, (d) measure/assess the outcomes. In practice, few successive iterations of this cycle are completed, because something else (politics, new science, ecological change) disrupts it.)
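To make the strategy concrete, here is a minimal Python sketch of steps 1–4. Everything in it is hypothetical (the policy list, the figures and the prediction function are placeholders); the later slides argue that the prediction in step 3 is precisely the weak link.

```python
# A minimal, hypothetical sketch of the 'engineering' strategy above.

def predicted_net_benefit(policy):
    # Step 3: predict measured benefits minus costs for a policy.
    # This prediction step is exactly what the talk questions.
    return policy["predicted_benefit"] - policy["predicted_cost"]

# Steps 1-2: a (hypothetical) measure and list of candidate policies.
policies = [
    {"name": "quota-A", "predicted_benefit": 120.0, "predicted_cost": 40.0},
    {"name": "quota-B", "predicted_benefit": 150.0, "predicted_cost": 90.0},
    {"name": "closure", "predicted_benefit": 80.0,  "predicted_cost": 10.0},
]

# Step 4: choose the policy with the best predicted net benefit.
best = max(policies, key=predicted_net_benefit)
print(best["name"])  # -> "quota-A"
```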
4. But what if we can’t predict outcomes of policies?
• i.e. not even approximately, in any kind of reliable way
• That is, not in theory (anyone can predict in theory) but with a track record of doing so in many independent cases in practice
• Many approaches work if nothing much changes, but miss cases where structural change occurs – where how things work changes
• Dynamic multispecies interactions occur in complex and chaotic ways that we do not fully understand
• Social, political and other factors complicate things
• Factors we are not even aware of can turn out to be crucial
5. New Paradigms for Old – the shift to ‘complexity science’
‘Linear’ science:
• Solvable equations
• Equilibrium assumptions
• Reductionism
• Strong assumptions
• Prediction
• Many failures (for fish, people, communities)
‘Complexity’ science:
• Complex, uncertain, non-linear
• Socio-ecological complex adaptive systems
• Adding in fishermen’s behaviour
• Integrative research
• New kinds of model, including (agent-based) simulation
An epistemological advance – more data, more complex models
6. Problems with this narrative
• Still assumes prediction is possible (for example, via increasingly complex models such as ABMs)
• Assumes a single model is desirable – a bias towards general models, frameworks and solutions
• Fails to recognise incommensurabilities between different communities (e.g. between scientists and some local groups)
• Not enough good data, and not varied enough in type and level
• The variety of values is missed out in the engineering approach
• Emphasis on a narrow range of evidence and approaches
7. Are we taking the complexity seriously?
“A simple system can be adequately described using a single perspective and a standard analytical model, such as Newtonian mechanics, but a complex adaptive system cannot. Rather, complex systems can be characterized by scale effects, nonlinearities and tipping points, inherent uncertainty or unpredictability, self-organization, connectivity, path-dependence, and emergent properties such as resilience that cannot be predicted from examining the parts of the system” (Berkes 2017: 1–2).
8. Responsive and open policy formation
• Don’t pretend to predict, even probabilistically
• Rather, try to understand the many possible ways a system could go, and how, drawing on as many sources as possible
• Put in place monitors that give early indications that these possibilities might be occurring, and gather evidence widely (see the sketch after this list)
• This allows policies to be reconsidered and changed – staying open to completely new possibilities
• A move from closed+predictive to open+responsive
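As a loose illustration of what such monitors could look like in code, here is a hedged Python sketch; the indicator names, thresholds and evidence fields are all placeholders, not part of any real monitoring system.

```python
# A hedged sketch of "monitors for early indication"; the indicator names,
# thresholds and evidence fields below are placeholders, not a real system.

# Each monitor maps the latest evidence to True if a previously identified
# possibility seems to be starting to occur.
monitors = {
    "stock_crash_signal": lambda ev: ev["spawning_biomass"] < 0.2 * ev["reference_biomass"],
    "effort_displacement": lambda ev: ev["displaced_effort_share"] > 0.5,
}

def triggered_warnings(evidence):
    """Return the warnings that fired, prompting the policy to be
    reconsidered rather than assuming a predicted outcome will hold."""
    return [name for name, check in monitors.items() if check(evidence)]

evidence = {"spawning_biomass": 900.0, "reference_biomass": 5000.0,
            "displaced_effort_share": 0.1}
print(triggered_warnings(evidence))  # -> ['stock_crash_signal']
```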
9. An example – how modelling might add to the possibilities being considered
• Make models that capture some of the complexity, chaos and sheer “mess” of what is happening
• See what possibilities these indicate, then go and understand them and their consequences
• But remember that many other sources of possibilities exist – crucially, stakeholder/societal/political input
10. One way to make a complex model – evolve it!
[Figure: timeline of an evolving ecology, with labels “First Successful Plant”, “Herbivores Appear”, “Carnivores Appear” and “Simulation Frozen”]
Evolve a complex ecology and save this state
11. Then explore the possibilities from there…
[Figure: the same timeline – first successful plant, herbivores appear, carnivores appear, simulation “frozen” – with branches from the frozen state]
Do multiple runs of the simulation starting from there for each condition to test. Afterwards, collect statistics or visualisations about what happened in the runs, to understand the possible paths.
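A minimal Python sketch of this “evolve, freeze, then branch” design, assuming a toy stand-in for the ecology (the class, its dynamics and all numbers are hypothetical; the actual model is linked on the final slide):

```python
# A toy sketch of the "evolve, freeze, then branch" experimental design.
# EcologyModel, its dynamics and all numbers are hypothetical stand-ins
# for the actual simulation (linked on the final slide).

import copy
import random

class EcologyModel:
    def __init__(self, seed):
        self.rng = random.Random(seed)
        self.fish = 5000            # toy state standing in for a whole ecology

    def step(self, catch_level):
        # Toy dynamics: noisy multiplicative growth, minus the imposed
        # catch, under a crude carrying-capacity cap.
        growth = self.rng.gauss(1.005, 0.1)
        self.fish = min(10000, max(0, int(self.fish * growth) - catch_level))

# 1. Evolve a complex ecology, then "freeze" (save) its state.
base = EcologyModel(seed=1)
for _ in range(1000):
    base.step(catch_level=0)
frozen = copy.deepcopy(base)

# 2. Branch: multiple runs per condition, all starting from the frozen
#    state and differing only in their randomness.
results = {}
for catch_level in (30, 35, 40):
    end_populations = []
    for run in range(20):
        model = copy.deepcopy(frozen)
        model.rng = random.Random(run)
        for _ in range(1000):
            model.step(catch_level)
        end_populations.append(model.fish)
    results[catch_level] = end_populations

# 3. Afterwards, collect statistics about what happened across the runs.
print({level: sum(pop) / len(pop) for level, pop in results.items()})
```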
12. What it looks like…
• A wrapped 2D grid of well-mixed patches (sketched as data structures below), each with:
• energy (transient)
• a bit string of characteristics
• Organisms represented individually, each with its own characteristics, including:
• a bit string of characteristics
• energy
• position
• stats recorders
[Figure: screenshot of the model – each individual represented separately within a well-mixed patch, with a slow random rate of migration between patches]
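The description above suggests data structures along these lines (a hedged Python sketch; the field names are ours, and the actual model at the comses.net link may organise things differently):

```python
# A hedged sketch of the data structures the slide describes; field names
# are ours and the actual model may differ.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Organism:
    characteristics: List[int]      # bit string of characteristics
    energy: float
    position: Tuple[int, int]       # (x, y) patch on the wrapped grid

@dataclass
class Patch:
    energy: float                   # transient energy available on the patch
    characteristics: List[int]      # bit string of patch characteristics
    organisms: List[Organism] = field(default_factory=list)

# A wrapped ("toroidal") 2D grid of well-mixed patches.
WIDTH, HEIGHT = 10, 10
grid = [[Patch(energy=1.0, characteristics=[0] * 8) for _ in range(WIDTH)]
        for _ in range(HEIGHT)]

def neighbour(x, y, dx, dy):
    """Wrap coordinates, so organisms migrating off one edge reappear
    on the opposite edge."""
    return grid[(y + dy) % HEIGHT][(x + dx) % WIDTH]
```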
13. Total Extinction Probability & Average Total Harvest (last 100 ticks) for different catch levels
[Figure: chart of proportion of maximum vs. catch level (per tick)]
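A hedged sketch of how statistics like these might be computed from the collected runs (the data layout here is our assumption, not the model’s actual output format):

```python
# A hedged sketch of aggregating run statistics like those plotted above;
# the data layout is an assumption, not the model's actual output format.

def extinction_probability(fish_series_per_run):
    """Proportion of runs in which the fish population had gone extinct
    by the end of the run."""
    extinct = sum(1 for series in fish_series_per_run if series[-1] == 0)
    return extinct / len(fish_series_per_run)

def average_total_harvest(harvest_series_per_run, last=100):
    """Total harvest over the last `last` ticks, averaged across runs."""
    totals = [sum(series[-last:]) for series in harvest_series_per_run]
    return sum(totals) / len(totals)

# E.g. two toy runs: one collapses, one survives.
print(extinction_probability([[5000, 200, 0], [5000, 4800, 4700]]))  # -> 0.5
```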
14. Zooming in on what can go wrong – 20 runs of this model from the same starting ecology (catch level 35)
[Figure: total number of fish vs. simulation time, labelled “Catch target=30”; each line is from a different run of the model]
15. Conclusion
• There are multiple sources of uncertainty (unknown unknowns)
• Any one model (formal or mental) will only capture a small aspect of it
• An ability to predict has not been shown, nor does it seem feasible in the near future
• Rather than closing down and focusing on a narrow range of (theoretically) measurable quantities, we need to take in a broad range of evidence, viewpoints and policies (including novel ones)
• Develop new approaches (including modelling approaches) that broaden the possibilities considered – but sceptically: they may well be wrong!
• But this needs to stay open to new ideas, approaches and evidence
• Accept incommensurable viewpoints into the discourse, and decide between them via politics rather than pretending the decision is science
16. Thanks!
Centre for Policy Modelling: http://cfpm.org
These slides available at: http://slideshare.net/BruceEdmonds
Accompanying paper is at: http://cfpm.org/discussionpapers/215
Basic model is freely available at: http://comses.net/codebases/4204