1. Evaluating global climate models using simple, explainable neural networks
@ZLabe
Zachary M. Labe
with Elizabeth A. Barnes
Colorado State University
Department of Atmospheric Science
17 December 2021
NG51A-06 – AGU Fall Meeting
Climate Variability Across Scales and Climate States and
Neural Earth System Modeling [Oral Session I]
4. THE REAL WORLD (Observations) vs. CLIMATE MODEL ENSEMBLES
Range of ensembles = internal variability (noise)
Mean of ensembles = forced response (climate change)
5. But let’s remove climate change (the forced response)…
6. After removing the forced response… anomalies/noise!
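The decomposition on these slides (ensemble mean = forced response; deviations from it = internal variability) can be sketched in a few lines of NumPy. The data here are synthetic stand-ins (an assumed 16 members × 100 years with an imposed warming trend), not the talk's actual large-ensemble output.

```python
import numpy as np

# Synthetic "large ensemble": each member = common warming signal + noise
# (assumption for illustration; sizes mirror the 16-member ensembles on slide 7)
rng = np.random.default_rng(42)
n_ens, n_years = 16, 100
forced = np.linspace(0.0, 1.5, n_years)                     # imposed warming trend
ensemble = forced + rng.normal(0.0, 0.3, (n_ens, n_years))  # members = signal + noise

# Mean across ensemble members ≈ forced response (climate change)
forced_response = ensemble.mean(axis=0)

# Removing the forced response leaves only anomalies: internal variability (noise)
anomalies = ensemble - forced_response
```

The ensemble-mean estimate of the forced response improves as members are added, since the independent noise realizations average toward zero.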
7. THERE ARE MANY CLIMATE MODEL LARGE ENSEMBLES…
Annual-mean 2-m temperature (°C)
7 global climate models, 16 ensembles each
ERA5-BE (observations)
8. STANDARD EVALUATION OF CLIMATE MODELS
Pattern correlation
RMSE
EOFs
Trends, anomalies, mean state
Climate modes of variability
9–11. PATTERN CORRELATION – T2M
[Figure: maps of correlation [R] between each climate model and observations, from negative to positive correlation]
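Two of the standard metrics listed above, pattern correlation [R] and RMSE, can be computed for a pair of latitude–longitude maps as below. This is a minimal sketch; the cosine-of-latitude area weighting is a common convention, not necessarily the exact choice used in the talk.

```python
import numpy as np

def pattern_stats(model_map, obs_map, lats):
    """Area-weighted pattern correlation [R] and RMSE between two
    (lat, lon) maps, weighting grid cells by cos(latitude)."""
    w = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(model_map)
    w = w / w.sum()                              # normalized area weights
    mx = (w * model_map).sum()                   # weighted spatial means
    ox = (w * obs_map).sum()
    cov = (w * (model_map - mx) * (obs_map - ox)).sum()
    var_m = (w * (model_map - mx) ** 2).sum()
    var_o = (w * (obs_map - ox) ** 2).sum()
    r = cov / np.sqrt(var_m * var_o)             # pattern correlation [R]
    rmse = np.sqrt((w * (model_map - obs_map) ** 2).sum())
    return r, rmse
```

A map compared against itself gives R = 1 and RMSE = 0; its negation gives R = −1, which is the span of the color bar in the correlation figures.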
13. ----ANN----
2 hidden layers, 10 nodes each
Ridge regularization, early stopping
TEMPERATURE: we know some metadata…
+ What year is it?
+ Where did it come from?
14. Train on data from the Multi-Model Large Ensemble Archive
15. NEURAL NETWORK CLASSIFICATION TASK
[Schematic: input layer → hidden layers → output classes]
19. LAYER-WISE RELEVANCE PROPAGATION (LRP)
Image classification examples: Volcano, Great White Shark, Timber Wolf (https://heatmapping.org/)
LRP heatmaps show regions of “relevance” that contribute to the neural network’s decision-making process for a sample belonging to a particular output category
[Schematic: neural network decision → “why?” → backpropagation of relevance (LRP)]
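The backward relevance pass can be made concrete with a minimal NumPy sketch of the LRP-epsilon rule on a tiny dense ReLU network. The weights and input here are random placeholders (assumptions, not the talk's trained model); the point is the mechanics: all relevance starts on the chosen output node and is redistributed layer by layer, proportional to each input's contribution to the pre-activations.

```python
import numpy as np

# Tiny dense ReLU network with assumed random weights (biases zero)
rng = np.random.default_rng(1)
W1 = rng.normal(size=(4, 6))   # input (4) -> hidden (6)
W2 = rng.normal(size=(6, 3))   # hidden (6) -> output (3)
x = rng.normal(size=4)

# Forward pass, keeping activations for the backward relevance pass
a1 = np.maximum(0.0, x @ W1)
out = a1 @ W2

def lrp_dense(a, W, R, eps=1e-6):
    """Epsilon rule: redistribute relevance R on a layer's outputs back to
    its inputs, in proportion to each input's contribution a_j * W_jk."""
    z = a @ W                                   # pre-activations of next layer
    z = z + eps * np.where(z >= 0, 1.0, -1.0)   # stabilizer avoids division by ~0
    return a * (W @ (R / z))

# Put all relevance on the winning output class, then backpropagate it
R_out = np.zeros_like(out)
R_out[out.argmax()] = out[out.argmax()]
R_hidden = lrp_dense(a1, W2, R_out)
R_input = lrp_dense(x, W1, R_hidden)   # the "heatmap" over input features
```

`R_input` plays the role of the heatmaps on this slide: one relevance value per input feature (here 4 numbers; for a temperature map, one value per grid point).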
22. LAYER-WISE RELEVANCE PROPAGATION (LRP) IS NOT PERFECT
Image classification example: Crock Pot (https://heatmapping.org/)
23. EXPLAINABLE AI IS NOT PERFECT. THERE ARE MANY METHODS.
[Adapted from Adebayo et al., 2020]
49. KEY POINTS
1. Explainable neural networks can be used to identify unique differences in simulated temperature between global climate model large ensembles
2. As a method of climate model evaluation, we input maps from observations into the neural network in order to classify each year of observations as the most similar climate model
3. The neural network architecture can also be applied to regions with known large biases, such as the Arctic, and to different methods of preprocessing climate data
Zachary Labe
zmlabe@rams.colostate.edu
@ZLabe