With climate models, simpler isn’t necessarily better

January 23, 2015
by Alexandra Cheung

Grantham Institute Co-Director Professor Joanna Haigh discusses a recent paper in Science Bulletin which argues that existing climate models ‘run hot’ and overstate the extent of manmade climate change.

It is perplexing that some climate change sceptics, who expend much energy in decrying global circulation (computer) models of the climate on the basis that they cannot properly represent the full complexity of the climate system and/or that they contain too many approximations, are now resorting to an extremely simplified model to support their arguments.

The model used in the Sci. Bull. article is a very useful tool for conceptualising the factors which contribute to the relationship between increasing concentrations of carbon dioxide in the atmosphere and global average temperature – indeed, we use such models as teaching aids for students studying atmospheric physics – but it is in no way fit for purpose as an accurate predictor of climate change. It requires as input the values of a number of parameters and, fundamentally, the choice of these values determines the predicted temperatures.

Key here is the “feedback parameter”, which represents the knock-on effects of changes in the atmosphere on the initial response to greenhouse gas warming. A positive feedback will make the temperature change larger and a negative one will reduce it. For example, as the atmosphere warms it can hold more water vapour, which is itself a greenhouse gas, acting to enhance the initial carbon dioxide-induced warming and thus giving a positive feedback. The physics of this process is very well understood. There are a number of other feedback processes, both positive and negative, but overall, analyses of meteorological observations, modelling and understanding of the physical processes point to a significantly positive value. In the present paper the authors choose a very small value, based on temperatures measured in ice cores over the 810,000 year period of ice ages and inter-glacials. Their analysis is incomplete and, in any case, not relevant to changes in global climate over decadal-to-century timescales.

Thus by choosing an inappropriate value of the feedback parameter, and also judicious choices of other parameters, the authors end up with their “models run hot” conclusion.  Must try harder.
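The point that the chosen parameter values determine the predicted temperatures can be illustrated with a minimal sketch of a zero-dimensional feedback model. The function name and parameter values below are illustrative assumptions, not those of the Sci. Bull. paper itself.

```python
# Minimal zero-dimensional feedback sketch: equilibrium warming from a
# forcing dF, amplified by a feedback parameter f. lambda0 ~ 0.31 K/(W/m2)
# is the standard no-feedback (Planck) response; all values illustrative.
def equilibrium_warming(forcing_wm2, f, lambda0=0.31):
    """Equilibrium temperature change (K): dT = lambda0*dF / (1 - lambda0*f)."""
    return lambda0 * forcing_wm2 / (1.0 - lambda0 * f)

F_2XCO2 = 3.7  # canonical radiative forcing for doubled CO2, W/m2

# A near-zero feedback parameter versus a significantly positive one gives
# very different answers from the same model:
print(equilibrium_warming(F_2XCO2, f=0.1))  # ~1.2 K
print(equilibrium_warming(F_2XCO2, f=2.0))  # ~3.0 K
```

Whatever the merits of any particular value of f, the output is fixed almost entirely by that input choice, which is exactly why the choice must be physically justified.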


11 Responses to “With climate models, simpler isn’t necessarily better”

  1. Monckton's Fundamentally Flawed Simple Climate Model - Real Sceptic says:

    […] for Space Studies, the paper is “complete trash.” Grantham Institute Co-Director Joanna Haigh noted the “inappropriate” and “judicious choices” of parameters and others had equally stinging […]

  2. Joanna Haigh says:

    In response to Lord Monckton’s comment:

    As explained by Ray Bates, the use of the term “feedback” in climate science is unhelpful as it does not refer to the same framework as that for which it is generally used in electronics and control theory.

    If you place a microphone near a speaker then a small sound picked up by the microphone will be amplified through the speaker, picked up again, reamplified and so on so that the system spirals out of control to a screeching denouement.

    In climate the warming introduced by a change in carbon dioxide concentration induces an increase in water vapour which enhances the initial warming. It does not (on the timescale of recent climate change, but see below wrt ice ages) introduce any significant further increase in CO2 so there is no feedback on the initial forcing, no spiralling out of control and the system moves towards a new equilibrium with higher concentrations of CO2 and H2O.

    The 100,000 year variation seen in the ice core records is introduced through changes in the Earth’s orbit around the Sun. The change in total solar irradiance arriving at Earth associated with this variation is very small, around 1 W/m2, implying a radiative forcing of less than 0.2 W/m2. From this one MIGHT deduce, given the 5K amplitude in temperature variation, a huge climate sensitivity of around 25K/(W/m2), but that would be to misunderstand the processes involved, which are related to changes in the distribution of the solar radiation over the Earth rather than its absolute value.

    During entry into an ice age there is not time for summer warming to melt the build-up of ice at high latitudes during winter, and the persistence of the ice enhances global albedo, introducing further cooling. Furthermore, the cooler oceans and Earth can take up more CO2 and CH4, so there is a reduction in greenhouse warming by these gases. Calculations using the CO2 and CH4 records can reproduce the observed temperature series quite well.

    Again there is a tendency towards a new equilibrium, no “feedback” on the initial forcing, and the feedback of control theory is not relevant.

    It is inappropriate to use the temperature and CO2 data from the ice cores on 100,000 year timescales to derive a value of climate sensitivity relevant to recent climate change when the relevant processes are so different.
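The back-of-envelope deduction warned against in the comment above can be made explicit. The 5 K amplitude and 0.2 W/m2 forcing are the rough figures quoted in the comment, not new estimates.

```python
# Naive climate sensitivity "read off" the ice-core record, as cautioned
# against above: dividing the glacial-interglacial temperature amplitude by
# the tiny orbital change in globally averaged forcing. Both numbers are the
# rough figures quoted in the comment.
temp_amplitude_k = 5.0       # K, glacial-interglacial temperature swing
orbital_forcing_wm2 = 0.2    # W/m2, upper bound on the orbital forcing

naive_sensitivity = temp_amplitude_k / orbital_forcing_wm2
print(naive_sensitivity)  # around 25 K per W/m2, the implausibly huge value
```

The arithmetic is trivial; the error lies in treating a redistribution of sunlight over the Earth as if it were a change in its absolute value.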

  3. Climate Pete says:

    Bishop Hill says:

    “Surely the fact that models overpredict warming is not in dispute? It’s not only the pause but the longer-term record too which shows that the models run too hot. You can hypothesise reasons for the overprediction (heat in the deep ocean etc etc) but that doesn’t change the fact that they run too hot.”

    See “Well-estimated global surface warming in climate projections selected for ENSO phase”, Risbey et al., Nature Climate Change, DOI: 10.1038/NCLIMATE2310.

    The diagram from that publication

    http://api.ning.com/files/YsTyeLVttoiD-1iyfblKlDufj-rlKqa1u4XN-q0rdsVr*XN5n8qcHhuLaIL3p5pUcPJGd4FQLpknjuqXb9GdBLRd*BiuZq2h/ClimateModelsVsActualsByENSOPhase.jpg

    clearly shows that climate model runs with ENSO 3.4 states similar to the actual ENSO 3.4 state over time closely mirror actual temperature trends. Further, climate models with ENSO 3.4 states most out of phase with actual ENSO 3.4 states produce the biggest temperature discrepancy between models and actuals. The ENSO 3.4 area represents only 2% of the earth’s surface area and thus per se has negligible effect on global average temperature. Further, there appears to be universal agreement that ENSO state acts as a leading driver for short term global temperature fluctuations, rather than a lagging indicator (i.e. cause rather than effect).

    Thus a lack of control for and discussion of the ENSO 3.4 state in the simple model resurrected by Monckton et al has to be a major weakness.

  4. Climate Pete says:

    Monckton said :

    “The RSS satellite record shows no global warming for 18 years 3 months, and all of the datasets are within statistical earshot of that. ”

    I have in front of me a spreadsheet giving the trend of an average of the four temperature data sets RSS, UAH 5.6, Cowtan & Way (hybrid V1), and GISTEMP. The trend calculated from a 12-month centrally smoothed rolling average from September 1996 to July 2013 is 0.8 K per century with a standard deviation of 0.14 K per century, which produces a statistically significant 2-sigma range of roughly 0.53 to 1.07 K per century.

    This is the time period of what Monckton would describe as “the pause”, a period of time starting with a huge El Nino and finishing in an ENSO neutral state, so if controlling the temperature trends for ENSO state the trend would be noticeably higher than this.

    Addition of 2014 data (taking the last date for the smoothed average to July 2014) would increase the trend still further, as this has been a record year in some of the surface temperature data sets.

    Further, these record temperatures have been achieved while in a neutral ENSO state. If the climate had been in an El Nino state then late 2014 average temperatures would have been higher by between 0.1 and 0.2 K.

    Thus Monckton’s statement is at best misinformed and at worst deliberately misleading.

    Climate Pete

  5. jsam says:

    What pause?
    http://www.woodfortrees.org/plot/hadcrut4gl/from:1996/plot/hadcrut4gl/from:1996/trend

    The models are doing pretty well.
    http://www.skepticalscience.com/contary-to-contrarians-ipcc-temp-projections-accurate.html

    The surface warms. The oceans warm, rise and acidify. Global ice levels recede.

    The surface warms.
    http://www.woodfortrees.org/plot/hadcrut4gl/from:1996/plot/hadcrut4gl/from:1996/trend

    The oceans warm.
    http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT/
    …and rise 3.2 mm per year, up from 1.9 mm per year a century ago.
    http://sealevel.colorado.edu/content/global-mean-sea-level-time-series-seasonal-signals-removed
    …and acidify by 30% since the industrial revolution.
    http://www.pmel.noaa.gov/co2/story/What+is+Ocean+Acidification%3F

    The earth is losing a trillion tons of ice per year:
    – 159 Gt Antarctic land ice, McMillan el al, GRL (2014)
    + 26 Gt Antarctic sea ice, Holland et al, J Climate (2014)
    – 261 Gt Arctic sea ice, PIOMAS
    – 378 Gt Greenland, Enderlin et al, GRL (2014)
    – 259 Gt other land based glaciers, Gardner et al. Science (2013)
    = – 1,031 Gt, total
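The budget above can be checked with a few lines of arithmetic; the figures are the commenter's cited values, with loss negative.

```python
# Summing the global ice budget quoted above, in Gt per year (negative = loss).
ice_budget_gt_per_yr = {
    "Antarctic land ice (McMillan et al. 2014)": -159,
    "Antarctic sea ice (Holland et al. 2014)": +26,
    "Arctic sea ice (PIOMAS)": -261,
    "Greenland (Enderlin et al. 2014)": -378,
    "Other land-based glaciers (Gardner et al. 2013)": -259,
}
total = sum(ice_budget_gt_per_yr.values())
print(total)  # -> -1031, i.e. roughly a trillion tonnes of ice lost per year
```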

  6. Monckton of Brenchley says:

    I am most grateful to Alexandra Cheung for her thoughtful critique of our paper in Science Bulletin, and to Joanna Haigh for her interesting comment.

    Perhaps the most startling feature of the inferred temperature record of the past 810,000 years is the near-perfectly thermostatic behaviour of the climate. Based on the ice-core records (Jouzel et al., 2007), and making the usual factor-of-two allowance for polar amplification, absolute global mean surface temperature over most of the last million years has varied by little more than 3 C, or 1%, either side of the long-run mean. Now, this is of course what one would expect from an object bounded by two near-infinite heat-sinks. But it is not an object in which strongly net-positive temperature feedbacks are at all likely to have acted: otherwise, temperature would have been far less stable than it is.

    Indeed, that thermostatic tendency seems to be very much at work today. The RSS satellite record shows no global warming for 18 years 3 months, and all of the datasets are within statistical earshot of that. The rate of global warming in the 25 years since 1990 is half of what the IPCC had predicted as its “business-as-usual” case, notwithstanding record rises in CO2 concentration to levels perhaps not seen for millions of years. It is these and suchlike considerations (see e.g. Lindzen & Choi, 2009, 2011; Spencer & Braswell, 2010, 2011) that lead us to regard the feedback-sum as more likely to be appreciably net-negative than strongly net-positive.

    The other principal problem we identify in the present climate-sensitivity methodology is the use of the Bode system-gain relation in association with strongly net-positive feedbacks. There are various problems with the Bode relation, which should be used very cautiously, if at all, in modeling the climate object (Bates, 2007). For one thing, the equation came from process engineering in the design of electronic circuitry, having its origin in the notion of negative feedback first posited by H.S. Black of Bell Labs on the back of a newspaper when traveling on the Lackawanna ferry. Bell Labs retain the newspaper in a place of honor in the labs today. Now, when the closed-loop gain in a circuit reaches unity, the point of striking singularity in the equation, the current transitions from the positive to the negative rail. There is no mechanism in the climate by which feedbacks that had been causing additional forcing and hence more warming can suddenly cause yet more forcing and yet result in cooling as strong as the warming a moment previously. Plainly, the values given by the Bode relation at loop gains >1 are unphysical in the climate: yet those are the values the relation prescribes, casting some doubt on whether the relation is applicable at all.

    Furthermore, in a circuit the output [the voltage] is a bare output: whatever its value, it does not act on the circuit object in any way to equilibrate it. In the climate, however, the temperature is the instrument of the object’s self-equilibration, which is why the singularity in the Bode equation – and hence the very high sensitivities it appears to engender – seems implausible, to say the least.
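The mathematical behaviour objected to in the two paragraphs above can be seen directly from the Bode closed-loop gain G = 1/(1 - g). The sketch below is purely an illustration of the singularity at unit loop gain, not a climate calculation.

```python
# Closed-loop gain of a simple feedback system: G = 1 / (1 - g), where g is
# the loop gain. Illustrates the singularity at g = 1 and the sign flip
# beyond it; purely mathematical, no climate parameters involved.
def bode_gain(g):
    """Closed-loop gain for loop gain g (singular at g = 1)."""
    if g == 1.0:
        raise ZeroDivisionError("singular at unit loop gain")
    return 1.0 / (1.0 - g)

for g in (0.5, 0.9, 0.99, 1.01, 1.1):
    print(g, bode_gain(g))
# The gain grows without bound as g approaches 1 from below, then flips to
# large negative values just above 1: the transition whose physical meaning
# in a climate context is disputed above.
```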

    Joanna Haigh is of course correct that getting the right answer for what seems to her to be the wrong reason would not be helpful. However, it is a model. One can put one’s own parameter values in it and get some notion of climate sensitivity. Of course it makes no claim to replace the far more complex general-circulation models: however, it does for the first time encompass in a short paper many of the principal elements in the determination of climate sensitivity.

    Joanna Haigh is also right that over so short a period as 25 years internal variability, combined with errors in estimating forcings (and, for that matter, feedbacks and the modalities of their mutual amplification), make it rather difficult for any model to make reliable predictions. And that is another purpose in our paper: to allow those previously baffled by the complexities of determining climate sensitivity to gain some appreciation of the formidable uncertainties in determining it. In the end, the only approach that is going to work for everyone is to wait and see.

    But if our model has let a little daylight in on the magic it will have served some purpose. In particular, if it leads to a reassessment of the applicability of the Bode system-gain relation to an object in which vigorously net-positive feedbacks are imagined, it will not have been entirely without merit. Like any simple model, it will not come as close to representing the real climate as the more complex general-circulation models. As we say in the paper, it has its limitations. But it has its uses too.

  7. Rob Honeycutt says:

    Perhaps Bishop Hill or Paul Matthews can tell us how well this new simplified model plays out over, say, the past few glacial-interglacial cycles. Or, perhaps even just the past 100 years.

  8. Joanna Haigh says:

    It depends what timescales you use. The Sci. Bull. paper only looks at the past 25 years, over which the models have overestimated the trend. Fyfe et al show that over this period the difference might be explained by a combination of errors in external forcing and internal climate variability as much as by model response. Kim et al look at decadal trends and show that the models predict too little warming or even cooling compared to the reanalysis over the 1960s and 1970s and too much warming in recent decades: again indicating the significance of natural variability. My main point, however, concerned not the fact of any error in the models but the use of an inappropriate method to assess them. Getting the right answer for the wrong reason is not helpful.

  9. Paul Matthews says:

    It is interesting that Prof Haigh appears to be in denial of the fact that climate models run too hot.

    As reported by Kim et al in their 2012 paper, Evaluation of short-term climate change prediction in multi-model CMIP5 decadal hindcasts, “Most of the models overestimate trends”.

    There is also the 2013 paper in Nature by Fyfe et al, with the self-explanatory title “Overestimated global warming over the past 20 years.”

  10. Joanna Haigh says:

    Even if they do run too hot – which is not actually the case: they have underestimated the trend during some previous decades – then use of a clearly implausible model to “explain” it does nothing to advance our understanding.

  11. Bishop Hill says:

    I haven’t read the paper you discuss, but it seems to me that you are on shaky ground. Surely the fact that models overpredict warming is not in dispute? It’s not only the pause but the longer-term record too which shows that the models run too hot. You can hypothesise reasons for the overprediction (heat in the deep ocean etc etc) but that doesn’t change the fact that they run too hot.
