Reposted from Dr. Judith Curry’s Climate Etc.
by Judith Curry
Heat waves are the new polar bears, stoking alarm about climate change. Climate scientists addressing this in the media are using misleading and/or inadequate approaches. How should we approach assessing whether and how much manmade global warming has contributed to recent record breaking temperatures? Read on for some outside-the-box thinking on this.
Much has been written in recent weeks on the record-breaking heat wave in the US Northwest and Canada.
There have been four categories of scientific contributions to answering this question, appearing in the media, blog posts and publications:
I. Hot air: scientists spouting off in the media
Climate scientists are writing op-eds and spouting off on Twitter about AGW causing, or at least exacerbating, the heat wave. Scientists in this category use heat waves to advocate for their preferred climate policies, without having done any actual work on the topic.
For one high profile example, see this article in the NYTimes by Michael Mann: Climate change is behind the heat dome.
One argument in the hot air line of reasoning is based on the familiar shifted-distribution diagram: as average temperatures increase, the frequency of heat extremes increases also.
However, analysis of historical data belies this simple interpretation:
The changing shape of Northern Hemisphere summer temperature distributions
Need for caution in interpreting extreme weather statistics
For an easier-to-read summary, see this report by Prescient Weather, which shows that the higher moments of the temperature distributions are also critical, and that the variance may be decreasing.
The other piece of the hot air argument relates to a hypothesis that the jet stream is made ‘wavier’ by global warming, an argument made by Michael Mann among others. There are a number of recent papers debunking this idea, and some recent papers even suggest that high-pressure domes such as the one that occurred during the heat wave will weaken under global warming.
It is intellectually lazy for scientists to spout off on this (or any other topic) without actually having done some work on the topic or at least having read and analyzed recent research on the topic. A convenient, but unjustified, storyline that supports your activism and preferred policies is not helpful.
II. Scientists analyzing historical data
John Christy has provided the following analysis of historical data, included in Cliff Mass’ blog post:
(Did I mention that I HATE the new WordPress editor? See Cliff’s post for the figures prepared by Christy.)
As shown in Christy’s figures, there IS NO INCREASING TREND in record high temperatures over our region (Oregon, Washington) during the past century. In fact, the past decade (2011-2020) had no all-time records.
Average number of days with temperatures above 99F in OR, WA? Also no trend.
These results are consistent with what others have found. For example, the U.S. National Climate Assessment found the warmest day of the year over the Northwest actually COOLED between a historic (1901-1960) and a contemporary period (1986-2016).
Dr. Nick Bond, Washington State Climatologist, said that he and Associate State Climatologist Karin Bumbaco found similar results, published in a peer-reviewed paper.
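The record-counting that underlies analyses like Christy’s is straightforward to sketch. Below is a minimal toy version (the function name and the synthetic data are my own illustration; a real analysis would use quality-controlled daily station observations):

```python
def decadal_record_counts(years, tmax):
    """Count, per decade, the days that set a new all-time record high
    in a daily Tmax series. The record is a running maximum, so early
    decades are favored unless a warming trend keeps producing new records."""
    counts = {}
    running_max = float("-inf")
    for year, t in zip(years, tmax):
        decade = (year // 10) * 10
        counts.setdefault(decade, 0)
        if t > running_max:
            counts[decade] += 1
            running_max = t
    return counts

# Synthetic illustration (degrees F): records are set in 1950, 1955, 1963.
years = [1950, 1955, 1962, 1963, 1971]
tmax  = [100,  101,  99,   102,  100]
print(decadal_record_counts(years, tmax))  # {1950: 2, 1960: 1, 1970: 0}
```

Note the built-in bias: with a stationary climate, a running record is set less and less often over time, which is one reason the interpretive questions below are not trivial.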
A single heat wave event can be evaluated against the historical record of previous historical heat waves (e.g. past ~100 years). Apart from some technical disputes surrounding which data set, the perils of homogenization, etc., what exactly is the logic for using historical temperatures records in heat wave attribution arguments?
A. If a record is set, does that lead to a necessary conclusion that AGW was a major contributing cause?
B. If a record is not set, does that lead to a necessary conclusion that AGW was not a major contributing cause?
C. If there is an underlying trend in heat wave frequency at that location, does that lead to a necessary conclusion that AGW was a major contributing cause for a single heat wave event?
D. If there is no underlying trend in heat wave frequency at that location, does that lead to a necessary conclusion that AGW was not a major contributing cause for a single heat wave event?
E. If there is a global trend in the frequency/severity of heat wave events, does that say anything conclusive about a role (or not) of AGW in influencing a single local heat wave event?
F. Does the magnitude by which a temperature record is broken say anything at all about a role (or not) of AGW in influencing a single local heat wave event?
While providing a historical context for a local heat wave event is critical for understanding the situation, the answer to each of these questions is ‘no.’ A, C, E and F in combination would stack the deck in favor of a ‘yes’, but the data do not provide a quantitative answer as to how much of the heat wave’s warmth was caused by AGW. Getting to an unequivocal ‘no’ answer simply from analyzing the temperature record is more challenging. But if a local heat wave record is set, it is worth digging deeper to try to understand the proximate (weather) causes and any underlying climate influence (multi-decadal natural variability and/or AGW).
III. Scientists conducting climate model-based attribution analysis.
As described by Gavin Schmidt at RealClimate: https://www.realclimate.org/index.php/archives/2021/07/rapid-attribution-of-pnw-heatwave/
“The way that climate-model based attribution for extreme events works (as discussed previously on RealClimate here and here etc.) is that you look at the situation with and without the anthropogenic global warming signal and calculate the ratio of probabilities. If an event is say, twice as common with the GW, then one can give a fractional attribution of 50% to anthropogenic forcing and the return time is half what it used to be. If it is five times more likely, the attribution is 80% = 100*(5-1)/5 and the return time is a fifth of what it used to be. In this case, we are seeing probability ratios of 150 to 1000s, suggesting that these, improbable, temperatures can be almost entirely attributed to global warming. Without the anthropogenic signal, temperatures this extreme wouldn’t have happened in thousands to tens of thousands of years.”
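The arithmetic in the quoted passage is simple enough to sketch directly (a toy Python function of my own; the hard part, of course, is estimating the probability ratio itself):

```python
def fractional_attribution(prob_ratio):
    """Fraction of an event's probability attributed to forcing, given the
    ratio p_with / p_without of the event's probability in the factual vs.
    counterfactual climate. The return time scales as 1 / prob_ratio."""
    return (prob_ratio - 1.0) / prob_ratio

print(fractional_attribution(2.0))  # 0.5  (twice as likely  -> 50% attribution)
print(fractional_attribution(5.0))  # 0.8  (five times as likely -> 80%)
```

Everything therefore hinges on the probability ratio, which is where the problems discussed below come in.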
The rapid report from the European team is found [here]
This effort involves a massive amount of number crunching.
This report has gotten a lot of media attention, as an example see this article from Time. https://time.com/6079744/climate-weather-attribution/
So, what’s wrong with this picture?
1. A time series of order a hundred years (from observations or a model simulation) is insufficient to develop meaningful statistics about a 1-in-10,000-year event.
2. The atmospheric dynamics in global climate models are fairly ‘blah’; the coarse resolution of climate models is fundamentally unable to capture the kind of blocking events that cause heat waves, or to resolve hurricanes, or the extreme convective events that cause flooding, etc.
3. This approach implicitly assumes that all climate change is caused by emissions, and ignores or mischaracterizes multi-decadal natural internal variability (since climate models do not have the correct phasing and amplitudes).
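Point 1 can be illustrated numerically: fit an extreme-value distribution to a century of annual maxima and the estimated 1-in-10,000-year level swings substantially from sample to sample, even when the true distribution is known exactly. A sketch using scipy (the generating distribution and parameters here are arbitrary, chosen only for illustration):

```python
import numpy as np
from scipy.stats import genextreme, gumbel_r

rng = np.random.default_rng(0)
levels = []
for _ in range(20):
    # 100 synthetic "annual maxima" drawn from a known Gumbel distribution
    sample = gumbel_r.rvs(loc=35.0, scale=2.0, size=100, random_state=rng)
    shape, loc, scale = genextreme.fit(sample)
    # Estimated magnitude of the 1-in-10,000-year event from this sample
    levels.append(genextreme.isf(1.0 / 10_000, shape, loc, scale))

print(f"10,000-year level estimates span {min(levels):.1f} to {max(levels):.1f}")
```

The scatter comes mainly from the fitted shape parameter, which is poorly constrained by 100 points yet controls the far tail.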
Cliff Mass has done a masterful job of critiquing the report from the European group. I don’t disagree with anything he says.
This entire climate model-based approach to extreme event attribution is fundamentally flawed. Until climate models are actually able to resolve circulation features (requiring a horizontal resolution of ~20 km), they simply are not useful for attribution of extreme weather events.
IV. Scientists conducting process-based analyses
NOAA scientist Marty Hoerling has likened extreme weather event attribution to conducting an autopsy. You have some clues, but the conclusion requires linking them together in a mechanistic sequence of events.
Cliff Mass has provided the best autopsy report so far on the heat wave.
He provides the following summary of the proximate sequence of events leading to the heat wave:
“Record amplitude of a ridge/high pressure over our region, forced by a tropical disturbance in the western Pacific, that produced a downstream wave train. An environment that allowed the resulting wave to amplify. The ridge had to be in exactly the right position relative to our terrain. An upper-level trough had to develop in just the right location offshore and move in the optimal direction to cause strong southeasterly flow, fostering the supercharger noted above. We needed a period when the sun was very strong. And a summer stretch without smoke, which has a profound cooling effect.
The meteorological dice had to come up all sixes. And they did.”
Process-based analyses are different from hand waving ‘story lines’. Here is what Mass considered:
1. The state of Washington has warmed by 1.5C over the past 120 years
2. Whether the drought and dry soils contributed to the heat wave (no, as per regional model simulations and the fact that there is no trend in drought in the Pacific northwest)
3. Whether global warming produces stronger ridges of high pressure (no, as per data analysis and climate model simulations)
4. No observed trend in heat waves (Christy’s analysis)
5. Use of regional climate model (no, CO2 doesn’t produce more heat waves)
6. Analysis of regional weather dynamics supported with regional climate modeling results shows a paradoxical pathway for cooling in the region
So do we have an unambiguous ’cause of death’ here, i.e. an unambiguous ‘no’ answer to the question as to whether AGW was the cause, or at least had an influence, on the heat wave?
A simple consilience of this evidence does not lead to an unambiguous ‘no’ conclusion. However, Cliff’s analysis is arguably sufficient to infer that CO2 was not the sole, or even dominant, cause of the record temperatures.
V. A fifth way
We need a better logic for attributing extreme weather events to global warming, and some outside-the-box thinking on how to attribute the causes of extreme weather events.
Considering the strategies described above, I and III are unsatisfactory, and frankly not at all useful. Especially for III, a massive amount of resources and brain power are wasted on this approach, for which global climate models, at their current resolution, are simply not fit-for-purpose.
II is very useful, but the logic in evaluating this information for attribution is ambiguous. IV provides useful insights, but doesn’t provide a quantitative answer regarding attribution or a clear role of CO2‘s contribution.
We need a fifth way, that builds on II and IV, provides a better logic for conducting the autopsy, and considers some new approaches.
Extreme weather events can be extreme in terms of the magnitude of individual events, the frequency of events crossing some threshold, or clustering of extreme events. It needs to be acknowledged that extreme events are by definition rare, and short historical records (even century long records) are insufficient for formulating meaningful statistics about return times.
A thermodynamic and dynamical storyline of the extreme event needs to be assembled, similar to how Cliff Mass framed the problem. Here is an alternative approach for understanding and quantifying the effect of an increase in CO2 on severe weather systems. The example provided here is targeted at the NW US heat wave.
Single column models of the atmosphere coupled to the land surface can provide a quantitative assessment of the direct contribution of CO2 forcing to the surface temperatures. This is a better approach than looking at the historical record of annual average surface temperatures, and assuming that any increase is caused by CO2 and would increase the magnitude of any heat wave by that same amount.
Experiment #1. For this particular event, on the day of the maximum record-breaking temperature, a local vertical profile of temperature and humidity can be obtained from a radiosonde or the operational analysis from numerical weather prediction centers. This can be run through an atmospheric single-column model, with a radiative transfer model and land surface model, to calculate the surface temperature in response to pre-industrial, current, and future levels of CO2. This is a simple calculation that answers the question: all other things being equal, how much difference have emissions over the past 100 years made to the surface temperature for the heat dome event that emerged, just through the radiative effects of the CO2? Cold, dry situations with no clouds amplify the impact of CO2 on the surface temperature. It is fairly easy to calculate exactly what effect the increase in CO2 would have on surface temperature under the local conditions for Portland, OR. Without having done the calculation, an outcome of 1-3 F wouldn’t surprise me.
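Short of running an actual single-column model, the zeroth-order radiative piece can be ballparked with the standard simplified CO2 forcing expression (Myhre et al. 1998) and a no-feedback surface response. This is only a back-of-envelope stand-in for the calculation proposed above, ignoring the local profile, clouds, and land surface entirely:

```python
import math

def co2_forcing(c0_ppm, c_ppm):
    """Simplified CO2 radiative forcing (W/m^2), Myhre et al. (1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Pre-industrial (~280 ppm) to current (~415 ppm):
dF = co2_forcing(280.0, 415.0)  # ~2.1 W/m^2
dT_K = 0.3 * dF                 # no-feedback response, ~0.3 K per W/m^2
print(f"forcing ~{dF:.1f} W/m^2, no-feedback warming ~{dT_K * 1.8:.1f} F")
```

This crude global-mean number lands near the low end of the 1-3 F range guessed above; the point of the single-column experiment is precisely that the local answer depends on the actual temperature, humidity, and cloud conditions.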
Experiment #2. This experiment builds on #1 to address the impact of the fast thermodynamic feedbacks on the surface temperature change. This can be accomplished by using the shape of the temperature profile and relative humidity from the original radiosonde or operational analysis to adjust the temperature and humidity profiles to the resulting surface temperature for the calculations in experiment #1 for altered CO2. This provides a better assessment of the direct radiative effects of altered CO2 in this particular weather system.
The next set of experiments address the dynamical effects of increasing CO2 on the particular weather system that influenced the record high surface temperatures.
This heat wave was exceptionally well forecast as much as 10 days in advance by global ensemble weather forecast systems. A global ensemble weather forecast system with high resolution (at least 20 km) can be used to re-run the daily forecasts from 14 to 1 days in advance of the event, with a CO2 concentration of 300 ppm. It is not clear at this point whether a single forecast simulation at each lead time is adequate, or whether the full ensemble is needed.
Experiment #3. Make no change to the weather forecast model except to the CO2 concentration. Compare ‘forecasts’ with altered CO2 concentration with the original forecasts: 500 and 850 mb geopotential heights and temperatures in the vicinity of the heat dome, also the surface temperatures in the NW US and SW Canada. It may turn out that Experiment #3 is sufficient to infer the role of more/less CO2 on the evolution of the omega block, heat dome and record high temperatures. But experiments #4 and #5 should be considered, since there are caveats to interpreting experiment #3.
Experiment #4. Alter the global sea surface temperatures (SST) in a way that preserves the global pattern of SST for this period, but with magnitudes more consistent with a 300 ppm climate. I would use NOAA’s 20th century reanalyses for this: https://judithcurry.com/2011/08/17/reanalyses-org/
Subtract the annual average (or summer average) SST for each ocean grid point for a 300 ppm climate (around 1910) from the current gridded values; subtract the gridded difference from the SST field for this case used in the weather forecast models. Run the same set of experiments as in #3; compare with the original forecasts and the reduced CO2 forecasts from #3. Note: not clear how quickly the initialized atmospheric temperature profiles will adjust to the altered SST, and how much this would influence the evolution of the atmospheric dynamics.
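The SST adjustment described here is a simple gridded subtraction. A numpy sketch with made-up arrays (a real implementation would work with the reanalysis grids and handle land masks and sea-ice edges):

```python
import numpy as np

def sst_for_300ppm(sst_analysis, clim_now, clim_1910):
    """Remove the (current minus ~1910) climatological warming at each
    ocean grid point, keeping today's spatial SST pattern but with
    magnitudes closer to a 300 ppm climate."""
    return sst_analysis - (clim_now - clim_1910)

# Tiny 2x2 illustration (degrees C):
sst_analysis = np.array([[15.0, 16.0], [20.0, 21.0]])  # current analyzed SST
clim_now     = np.array([[14.0, 15.0], [19.0, 20.0]])  # current climatology
clim_1910    = np.array([[13.2, 14.3], [18.5, 19.4]])  # ~1910 climatology
print(sst_for_300ppm(sst_analysis, clim_now, clim_1910))
```

Each grid point keeps today’s anomaly relative to its own climatology; only the climatological baseline is shifted back.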
Experiment #5. For Experiment #4, the initialized atmospheric temperatures are too warm and specific humidity is too high relative to the lower SST values. Humidity initialization doesn’t really matter, since the model rapidly creates its own humidity field. However, the initial temperature field may matter. It’s the temperature gradients that influence the circulations. During summer, the pole-to-equator temperature gradient wouldn’t change much between high and low CO2; melting of Arctic sea ice would be just underway at the end of June in 1910, whereas it is well underway in June in the current climate. Convective lapse rates would also be different for high vs low CO2. I’m not sure how quickly the atmospheric temperatures would adjust to the altered SSTs in #4. Initialized atmospheric temperatures would be out of balance with the colder surface temperatures, making the marine atmosphere too stable. Somehow initializing with atmospheric temperatures more suited to 1910 while preserving all of the temperature gradients would be ideal. The team doing the 20th century reanalyses could maybe figure out how to do this.
What this set of numerical experiments would do is allow for inferences to be made that compare the thermodynamic and dynamical effects of reduced/increased CO2 on the surface temperatures and the dynamics of the heat dome. The exact logic of how such inferences should be made, with what caveats and uncertainties, would require more attention than I can give it here.
Such an analysis would only take us so far: the question remains as to whether increased CO2 is changing the overall hemispheric dynamics, making such heat dome events and omega blocks more or less frequent. Experiments with high-resolution (20 km horizontal resolution) coupled global climate models with increased/decreased CO2 can provide some insights (the essential ingredient is for the model to have sufficiently high resolution to resolve blocking patterns).
Analysis of global reanalysis data (ERA5 back to 1950, 20th century reanalysis actually back to the 19th century) can provide some important insights:
- Is increasing CO2 changing the multi-decadal ocean oscillations? I’ve done a literature survey and there is no evidence of this yet.
- Is additional warming changing ENSO? I’ve done a literature survey and yes ENSO has changed since 1950; whether these changes are CO2 caused is debated.
- Are the atmospheric teleconnection regimes (e.g. AO, PNA etc) changing? This is something I’ve looked at (since 1950), and no changes apart from minor variations associated with multi-decadal climate variability.
With regards to the wavy jet stream hypothesis and its influence on blocking, I follow the literature on this topic, but haven’t done a formal literature review on this. Basic dynamical reasoning does not support the wavy jet stream hypothesis. There is more theoretical research to be done, and the ERA5 and 20th Century Reanalysis should prove a good data set for this, but the value lies in how these data are interpreted.
And finally, machine learning and network based methods are increasingly being used for attribution analyses in a range of different fields.
So I’m tossing these ideas out for discussion, I look forward to your further outside-the-box ideas on how to approach this problem.
Heat versus cold
And finally, I address the alarm over heat waves. I was in Utah in late June, where the local temperature reached 112F. It is not pleasant. Fortunately I could mostly stay inside where it was cooler. There is no question that excessive and unusual heat causes health problems. But people have adapted to very hot temperatures (see this article about Pakistan). The issue is unexpectedly hot temperatures, for which broad segments of the population are unprepared and have no experience in dealing with. By this standard, the record-breaking temperatures in Portland were more difficult to deal with than the relatively routine and substantially higher temperatures in Pakistan.
While heat kills, cold temperatures kill more than an order of magnitude more people than heat. Pat Michaels has been on this issue for decades, and it’s not particularly controversial. This recent article in the Guardian is interesting:
Title – “Extreme temperature kills 5 million people a year with heat related deaths rising study finds”
Subtitle – “More people died of cold than heat in past 20 years but climate change is shifting the balance.”
The only conclusion I can draw here is that global warming is associated with fewer temperature-related deaths, which is completely at odds with the impression the Guardian article is trying to create with its alarming headline.