Tim Palmer, University of Oxford - OECD Workshop on “Climate change, Assumptions, Uncertainties and Surprises”, 3-4 September 2020
1. The Scientific Challenge of
Understanding and Estimating
Climate Change
Tim Palmer
Department of Physics
University of Oxford
tim.palmer@physics.ox.ac.uk
2019 PNAS
2. Why do we need climate models?
An answer from the 1970s.
• The laws of physics applied to climate are too complex to be solved by pencil
and paper
• There are no laboratory experiments we can do to replicate Earth’s climate.
• Hence we need climate models to understand Earth’s climate.
• In particular we need climate models to understand global warming due to
human emissions of carbon dioxide, the magnitude of which crucially depends
on how water vapour and cloud (i.e. the water cycle) respond to the warming
effects of increasing carbon dioxide.
3. Why do we need climate models?
A 21st Century answer.
• Mitigation (How bad is climate change going to get and how quickly will it get
there, with and without emissions cuts? Can we rely on sucking CO2 out of the
air at a later date, or will we pass a significant tipping point making negative
emissions technology completely ineffective?)
• Adaptation (What will be the regional effects of climate change – including
changes to extremes of weather? How can we make society more resilient to
these changes?)
• Geoengineering (Is this really a Plan B? Could we inadvertently divert the
monsoons, or the moisture supply to the rainforests, by spraying sulphuric acid
into the stratosphere?)
• Early-warning systems (How far ahead can we reliably predict extreme weather
events? How proactive can we be in providing aid ahead of possible extreme
weather events – e.g. Red Cross/Red Crescent Forecast-based Financing?)
• Attribution (How much more likely can we expect some observed extreme event
to become in the future?)
4. Are the current generation of climate models
adequate for these 21st Century tasks?
• No, in my (and Bjorn Stevens’) opinion.
• Climate models were developed by individual academic departments and by
government or research council labs from the 1960s onwards.
• About 30 such climate models contribute to the CMIP datasets, providing a
multi-model “ensemble of opportunity”.
• However, each of these models has significant systematic biases compared with
reality, some models more than others. On the regional scale, the biases are
typically as large as, or larger than, the signals we are trying to simulate.
• The unreliability of the CMIP5 ensemble of opportunity has become manifest by
the fact that a number of CMIP6 models have substantially larger climate
sensitivities than any of the CMIP5 models.
5. What’s the alternative to the CMIP “free for all”?
• A very small number (c. 1 per continent) of climate modelling centres with
dedicated exascale computing, and where countries pool human and computing
resources.
• With such resources, we can represent the laws of physics more accurately. E.g.
with a c. 1km horizontal grid, convective cloud systems, the effects of orography
and ocean eddy transports can be substantially better modelled.
• Use stochastic parametrisations to represent model uncertainty.
• There is a precedent for such a concept in Numerical Weather Prediction. The
European Centre for Medium-Range Weather Forecasts – which pools the
human and computational resources of c. 30 European Member States – has
consistently produced the best numerical weather forecasts in the world.
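The stochastic-parametrisation idea in the bullets above can be illustrated with a toy sketch, in the spirit of multiplicative-noise schemes such as ECMWF's SPPT: the parametrised tendency is scaled by (1 + r), where r evolves as an AR(1) process so that perturbations are correlated in time. The field size, AR(1) parameters, and clipping bounds below are illustrative assumptions, not values from any operational model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sppt_step(tendency, r_prev, phi=0.95, sigma=0.3):
    """One step of SPPT-style multiplicative noise (toy sketch).

    Scales the parametrised tendency by (1 + r), where r follows an
    AR(1) process: r_t = phi * r_{t-1} + sigma * sqrt(1 - phi^2) * eps.
    r is clipped to [-1, 1] so the perturbed tendency keeps its sign.
    """
    r = phi * r_prev + sigma * np.sqrt(1.0 - phi**2) * rng.standard_normal(r_prev.shape)
    r = np.clip(r, -1.0, 1.0)
    return tendency * (1.0 + r), r

# Toy example: perturb a fixed unit tendency field over 100 time steps.
tendency = np.ones(10)   # placeholder for a parametrised tendency field
r = np.zeros(10)
for _ in range(100):
    perturbed, r = sppt_step(tendency, r)
```

Running an ensemble of such perturbed integrations, each with independent noise, is one way a single model can express its own parametrisation uncertainty rather than pretending its tendencies are exact.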
2011 Physics World
6. According to a calculation by Danish climate scientist
Martin Stendel, the 2019 losses would be enough to
cover the entire UK with around 2.5 metres of melt
water.
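Stendel's figure can be checked with back-of-envelope arithmetic. The ~600 Gt mass loss and ~244,000 km² UK area used below are assumed round numbers for illustration, not his exact inputs.

```python
# Back-of-envelope check of the "2.5 m of melt water over the UK" figure.
mass_loss_gt = 600.0                # 2019 Greenland ice loss in gigatonnes (assumed round number)
uk_area_m2 = 244_000 * 1e6          # UK land area ~244,000 km^2 in m^2 (assumed round number)

# 1 Gt = 1e12 kg; melt water density ~1000 kg/m^3.
water_volume_m3 = mass_loss_gt * 1e12 / 1000.0
depth_m = water_volume_m3 / uk_area_m2
print(f"{depth_m:.1f} m")           # roughly 2.5 m
```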
Both last year and 2012 were marked by "blocking"
events, the researchers say, where disturbances in the
jet stream saw high pressure systems become stuck
over Greenland, resulting in enhanced melting.
"It's expected that something like the 2019 or 2012
years will be repeated. And we don't exactly know
how the ice behaves in terms of feedback
mechanisms in this vigorous range of melting."
"There could be... hidden feedbacks that we are not
aware about or that are maybe not perfectly
described in the models right now. That could lead to
some surprises."
7. Climate scientists can confidently tie
global warming to impacts such as
sea-level rise and extreme heat. But
ask how rising temperatures will
affect rainfall and storms and the
answers get a lot shakier. For a long
time, researchers chalked the
problem up to natural variability.
Now, however, a new analysis has
found that the problem is not with
the climate, it’s with the massive
computer models.
The study, which includes authors
from several leading modeling
centers, casts doubt on many
forecasts of regional climate change
that are crucial for policymaking. It
also means efforts to attribute
specific weather events to global
warming, now much in vogue, are
rife with errors.
8. The systematic error identified in
Smith et al. arises because the
current generation of climate
models does not simulate
persistent circulation regimes
(e.g. Greenland Blocking) well.
This systematic error can be
substantially reduced by
increasing model resolution.
9. Candidate for an EU Flagship Project for a “CERN for Climate Change”
Now hopefully to become part of the EU Green Deal and Digital Strategy in
collaboration with ESA, EUMETSAT and ECMWF.
10. How much for a (largely virtual) “CERN for
climate change”?
• $100m per year, including dedicated exascale
supercomputing.
• About the cost of one space satellite launch.
• Cost of International Space Station c. $150 billion.
• Mitigation and adaptation costs for meeting the 1.5°C
target – between $10 trillion and $100 trillion.
Surely a no-brainer!!