Ruling out high deflation scenarios

Further to my series of posts on Deflategate, reader chrimony observed that my statistical analysis had shown that it was possible that there had been no tampering, but had not excluded the possibility of tampering.  This is a sensible observation, but it raises the question of whether and how one could use the available statistical information to exclude tampering.  This analysis ought to have been done in the Wells Report.  I’ve done it in this post, and the results are sharper than I’d anticipated.

For Logo initialization, the observations exclude any manual deflation exceeding a de minimis amount of, say, 0.1 psi.  For Non-Logo initialization, the statistical information rules out “high” deflation scenarios, i.e. deflation by more than the inter-gauge bias of 0.38 psi plus uncertainty, including the deflation levels of ~0.76 psi reported in Exponent’s deflation simulations.  Remarkably, for Non-Logo initialization, the only manual deflation amounts that are not precluded are those equal (within uncertainty) to the inter-gauge bias of ~0.38 psi.  Precisely why the Patriots would have deflated balls by an amount almost exactly equal to the bias between referee Anderson’s gauges is a bizarre coincidence, to say the least.  I think that one can safely say that it is “more probable than not” that referee Anderson used the Logo gauge than that such an implausible coincidence occurred.
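The exclusion logic can be sketched numerically. To be clear, this is an illustrative sketch, not the Wells Report data: only the 0.38 psi inter-gauge bias comes from the discussion above; the predicted pressure, observed average and 0.10 psi uncertainty are stand-in assumptions.

```python
# Sketch of the exclusion logic with stand-in numbers: only the 0.38 psi
# inter-gauge bias comes from the post; the predicted pressure, observed
# average and 0.10 psi uncertainty below are illustrative assumptions.
BIAS = 0.38   # psi, bias between referee Anderson's two gauges
SIGMA = 0.10  # psi, illustrative measurement uncertainty (assumed)

def consistent_deflations(predicted, observed, sigma=SIGMA):
    """Manual deflation amounts d (psi, 0 to 1) that are NOT excluded,
    i.e. those with |predicted - d - observed| <= sigma."""
    return [round(i * 0.01, 2) for i in range(101)
            if abs(predicted - i * 0.01 - observed) <= sigma]

# Non-Logo initialization: the prediction exceeds the observed average by
# roughly the inter-gauge bias, so the only surviving deflation amounts
# cluster around 0.38 psi; the ~0.76 psi simulated level is excluded.
surviving = consistent_deflations(predicted=11.70, observed=11.70 - BIAS)
print(0.38 in surviving, 0.76 in surviving)  # True False
```

The point of the sketch is that the set of non-excluded deflations is forced to sit within measurement uncertainty of the prediction–observation gap, which here equals the inter-gauge bias.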

Exponent’s Transients: Bodge or Botch?

In my first writeup, I observed that Exponent’s Logo transients appeared to be bodged too high, even with their unwarranted and adverse use of 67 deg F initialization (Exponent’s “temperature trick”). In today’s post, I’ve taken a closer look at the seemingly questionable calculation of the transients at 67 deg F, showing that the Patriot transients make sense only if the balls purportedly initialized at 12.5 psi on the Logo Gauge were not, in fact, initialized using that gauge (as stated, and as the diagram is meant to show).  My reverse engineering shows that the Patriot dry transient in Figure 27 only makes sense if the Logo Gauge read 12.81 psi at initialization or if the Master Gauge (not the stated Logo Gauge) was erroneously used for initialization.  If I’m correct, this is a very significant error – a botch, rather than a bodge – for which one would expect a prompt corrigendum, if not retraction, of the corresponding calculations.  In a postscript to today’s post, I’ve attached a note on conversion from the Logo and Non-Logo Gauge scales to the correctly calibrated Master Gauge scale.


NFL Officials Over-Inflated Patriot Balls

One of the ironies of the NFL’s conduct in this affair is that it can be established that NFL officials (under the supervision of NFL Executive Vice President Troy Vincent) over-inflated Patriot balls at half-time, the only proven tampering with Patriot balls. Brady and the Patriots were unaffected by the overinflation by NFL officials, as they destroyed the Colts in the second half.

Exponent must have noticed the over-inflation by officials, as it is implied by the post-game measurements, but failed to report or comment on it. Their avoidance becomes all the more conspicuous because many of the texts at issue in the Wells Report pertain to an earlier incident in which NFL officials had over-inflated Patriot balls, much to Brady’s frustration and annoyance at the time.

More on Deflategate

By converting football pressures to ball temperatures under the Ideal Gas Law, it is possible to conveniently show Colt and Patriot information – transients, simulations and observations – on a common scale. I’ve done this in the diagram shown below, and, in my opinion, it neatly summarizes the actual information. Commentary follows the figure.
[Image: implied_ball_temperatures]

Figure 1.  Transients as digitized from Figures 25 and 27, converted to temperature transients using the Ideal Gas Law. Red: Patriot; blue: Colt. Thick: dry; thin: wet. Solid: Logo; dashed: Non-Logo.  Simulations shown as open circles: large, Logo with 67 deg F initialization; small, Non-Logo with 71 deg F initialization.  Observed averages: solid circle, Non-Logo; +, Logo.
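The conversion used for the common scale can be sketched as follows. This is a minimal sketch assuming standard atmospheric pressure of 14.7 psi; the 12.5 psi / 67 deg F defaults correspond to the Logo initialization discussed above.

```python
# Convert a measured gauge pressure into the ball temperature implied by
# the Ideal Gas Law at constant volume: T2 = T1 * P2_abs / P1_abs.
# Assumes standard atmospheric pressure of 14.7 psi.
P_ATM = 14.7  # psi

def f_to_k(t_f):
    return (t_f - 32.0) * 5.0 / 9.0 + 273.15

def k_to_f(t_k):
    return (t_k - 273.15) * 9.0 / 5.0 + 32.0

def implied_temp_f(p_measured, p_init=12.5, t_init_f=67.0):
    """Ball temperature (deg F) implied by a measured gauge pressure,
    given the initialization pressure (psi gauge) and temperature."""
    return k_to_f(f_to_k(t_init_f) * (p_measured + P_ATM) / (p_init + P_ATM))

# A ball still at its initialization pressure implies the initialization
# temperature; lower measured pressures imply colder balls.
print(round(implied_temp_f(12.5), 1))  # 67.0
print(round(implied_temp_f(11.5), 1))  # about 47.6 deg F
```

Mapping every pressure measurement through this function is what puts the transients, simulations and observed averages on the single temperature axis of the figure.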

Deflategate and Errors in the Wells Report

Readers in the U.S. are doubtless aware of the “Deflategate scandal”, in which the NFL alleged that Tom Brady, the greatest quarterback of his generation, had conspired with an equipment manager and a locker room attendant to deflate a microscopic amount of pressure from footballs in the AFC championship game. The NFL seemed to be completely taken by surprise by the Ideal Gas Law and the fact that outside temperatures below calibration temperatures would result in much larger pressure drops without any tampering.
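The Gas Law point is easy to check numerically. The numbers below are illustrative but representative: balls set to 12.5 psi in a warm room and measured on a cold field, assuming standard atmospheric pressure of 14.7 psi.

```python
# Ideal Gas Law at constant volume: P_abs / T is constant, so
#   P2_gauge = (P1_gauge + P_atm) * (T2 / T1) - P_atm
# Temperatures must be absolute; assumes standard atmospheric pressure.
P_ATM = 14.7  # psi (assumed standard atmosphere)

def f_to_kelvin(t_f):
    return (t_f - 32.0) * 5.0 / 9.0 + 273.15

def gauge_at_temp(p1_gauge, t1_f, t2_f):
    """Gauge pressure after a fixed-volume ball moves from t1_f to t2_f."""
    return (p1_gauge + P_ATM) * f_to_kelvin(t2_f) / f_to_kelvin(t1_f) - P_ATM

# Illustrative: 12.5 psi set in a ~71 deg F room, measured at ~48 deg F
print(round(gauge_at_temp(12.5, 71.0, 48.0), 2))  # about 11.3 psi
```

A drop of well over 1 psi from temperature alone, i.e. far more than the alleged tampering, with no one touching the balls.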

The findings depend on the interpretation of statistical data by decision-makers – a topic that interests me.   I found the technical report by Exponent, Wells’ technical consultants, to be very unsatisfactory on numerous counts:

  • although they were reported by Wells to have considered “all permutations”, they hadn’t.  On important occasions, they omitted highly plausible possibilities that indicated no tampering and, on other occasions, they only considered assumptions that were most adverse to the Patriots;
  • on key occasions, it seemed to me that Exponent failed to properly characterize exculpatory results.

At the end of my analysis, I concluded that their key technical findings were simply incorrect and wrote up my analysis, now online here.

I watched both the AFC championship and the final. I have no fan commitment to the Patriots. As someone who’s played sports all his life and whose play has always been rushed, I am amazed at how time seems to stand still for great athletes such as Brady.

The summary is as follows.

Implications of recent multimodel attribution studies for climate sensitivity

Last year, a paper of mine (Lewis 2014) showed that the approach used in Frame et al (2005), which argued for using a uniform prior when estimating equilibrium (strictly, effective) climate sensitivity (ECS), in fact leads to a unique, objective Bayesian estimate for ECS upon undertaking a simple transformation (change) of variables. The estimate was lower, and far better constrained at the upper end, than the one resulting from use of a uniform prior in ECS, as recommended in Frame et al (2005). The only uniform priors involved were those for estimating posterior probability density functions (PDFs) for observational variables with Gaussian (normally distributed) data uncertainties, where they are totally noninformative and their use is uncontroversial. I wrote an article about Lewis (2014) at the time, and a version of the paper is available here.
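The change-of-variables point can be illustrated numerically. This is a generic sketch, not the actual method of Lewis (2014): it simply shows that a prior uniform in one parameterization (here a feedback-like variable lam) is far from uniform in ECS = k / lam, so “uniform prior” is never a prior-free choice. The value k = 3.7 W/m² and the range for lam are illustrative assumptions.

```python
import random

random.seed(0)
K = 3.7  # W/m^2 per CO2 doubling (illustrative stand-in value)

# A prior uniform in lam (a feedback-like variable)...
lam = [random.uniform(0.5, 3.0) for _ in range(100_000)]
# ...is far from uniform in ECS = K / lam after the change of variables.
ecs = sorted(K / x for x in lam)

median_ecs = ecs[len(ecs) // 2]
midpoint = (K / 0.5 + K / 3.0) / 2  # midpoint of the implied ECS range
# The implied ECS density piles up at low values: its median (~2.1 K)
# sits well below the midpoint (~4.3 K) of the range.
print(median_ecs < midpoint)  # True
```

The same mechanism operates in reverse: a prior uniform in ECS is strongly informative about the observable variables, which is the source of the inflated upper tails the paper addresses.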

I’ve now had a new paper that uses an essentially identical method to Lewis (2014), but with updated, higher quality data, published by Climate Dynamics, here. A copy of the accepted version is available on my web page, here.


Scientific American article: “How to Misinterpret Climate Change Research”

A Scientific American article concerning Bjorn Stevens’ recent paper “Rethinking the lower bound on aerosol radiative forcing” has led to some confusion. The article states, referring to a blog post of mine at Climate Audit, “The misinterpretation of Stevens’ paper began with Nic Lewis, an independent climate scientist.”. My blog post showed how climate sensitivity estimates given in Lewis and Curry (2014) (LC14) would change if the estimate for aerosol forcing from Stevens’ recent paper were used instead of the estimate thereof given in the IPCC 5th Assessment Working Group 1 report (AR5 WG1). To clarify, Bjorn Stevens has never suggested that my blog post misinterpreted or misrepresented his paper.

The article also states, paraphrasing rather than quoting, “Lewis had used an extremely rudimentary, some would even say flawed, climate model to derive his estimates, Stevens said.” LC14 used a simple energy budget climate model, described in AR5 WG1, to estimate equilibrium climate sensitivity (ECS) from estimates of climate system changes over the last 150 years or so. An essentially identical method was used to estimate ECS in Otto et al (2013), a paper of which Bjorn Stevens was an author, along with thirteen other AR5 WG1 lead authors (and myself). Energy budget models actually estimate an approximation to ECS, effective climate sensitivity, not ECS itself, which some people may regard as a flaw. AR5 WG1 states that “In some climate models ECS tends to be higher than the effective climate sensitivity”; this is certainly true. Since the climate system takes many centuries to equilibrate, it is not known whether or not this is the case in the real climate system. LC14 discussed the issues involved in some detail, and my Climate Audit blog post referred to estimating “equilibrium/effective climate sensitivity”.
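For concreteness, the energy budget method at issue can be sketched in a few lines. The formula ECS ≈ F_2x × ΔT / (ΔF − ΔQ) is the standard energy budget estimator used in Otto et al (2013)-style studies; the input values below are illustrative assumptions, not the actual LC14 estimates.

```python
# The standard energy budget estimator: ECS_eff = F_2x * dT / (dF - dQ).
# All input values below are illustrative, not the papers' estimates.
F_2X = 3.71  # W/m^2, forcing from a doubling of CO2 (commonly used value)

def energy_budget_ecs(dT, dF, dQ):
    """Effective climate sensitivity (K) from the changes, between a base
    and a final period, in global temperature dT (K), forcing dF (W/m^2)
    and climate system heat uptake dQ (W/m^2)."""
    return F_2X * dT / (dF - dQ)

# Illustrative inputs: dT = 0.75 K, dF = 1.95 W/m^2, dQ = 0.35 W/m^2
print(round(energy_budget_ecs(0.75, 1.95, 0.35), 2))  # about 1.74 K
```

Simple as it is, this is an energy balance constraint, not a curve fit: given the observed changes, the sensitivity estimate follows directly.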

I sent Bjorn Stevens a copy of the above wording and he has responded, saying the following:

“Dear Nic,

because I have reservations about estimates of ocean heat uptake used in the ‘energy-balance approaches’, and because of a number of issues (which you allude to) regarding differences between effective climate sensitivity estimates from the historical record and ECS, I am not ready to draw the inference from my study that ECS is low. That said, I do think what you write in the two paragraphs above is a fair characterization of the situation and of your important contributions to the scientific debate. The Ringberg meeting also made me confident that the open issues are ones we can resolve in the next few years.

Feel free to quote me on this.

Best wishes, Bjorn”

Update 26 April 2015

Gayathri Vaidyanathan tells me that the article has been changed at ClimateWire. Certainly, the title has been changed, and I presume the text has been amended per the version she sent me, which no longer suggests misinterpretation. But Scientific American is still showing the original version, so the situation is not very satisfactory.

Update 28 April 2015

The text of the article has now been changed at Scientific American, although the title is unaltered. The sentence referring to misinterpretation now reads “Stevens’ paper was analyzed by Nic Lewis, an independent climate scientist.*” At the foot of the article is the note:

Correction: A previous version of this story did not accurately reflect Lewis’ work. Lewis used Stevens’ study in an analysis that was used by some media outlets to throw doubt on global warming.

Pitfalls in climate sensitivity estimation: Part 3

A guest post by Nicholas Lewis

In Part 1 I introduced the talk I gave at Ringberg 2015, explained why it focussed on estimation based on warming over the instrumental period, and covered problems relating to aerosol forcing and bias caused by the influence of the AMO. In Part 2 I dealt with poor Bayesian probabilistic estimation and summarized the state of observationally based climate sensitivity estimation from instrumental-period warming. In this third and final part I discuss arguments that estimates from that approach are biased low, and that GCM simulations imply ECS is higher, partly because in GCMs effective climate sensitivity increases over time. I’ve incorporated one new slide here to help explain this issue.

Slide 19

[Slide image: ringSlide19]


Pitfalls in climate sensitivity estimation: Part 2

A guest post by Nicholas Lewis

In Part 1 I introduced the talk I gave at Ringberg 2015, explained why it focussed on estimation based on warming over the instrumental period, and covered problems relating to aerosol forcing and bias caused by the influence of the AMO. I now move on to problems arising when Bayesian probabilistic approaches are used, and then summarize, as I see it, the state of observationally based climate sensitivity estimation from instrumental-period warming. I explained in Part 1 why other approaches to estimating ECS appear to be less reliable.

Slide 8

[Slide image: ringSlide8]


Pitfalls in climate sensitivity estimation: Part 1

A guest post by Nicholas Lewis

As many readers will be aware, I attended the WCRP Grand Challenge Workshop: Earth’s Climate Sensitivities at Schloss Ringberg in late March. Ringberg 2015 was a very interesting event, attended by many of the best known scientists involved in this field and in areas of research closely related to it – such as the behaviour of clouds, aerosols and heat in the ocean. Many talks were given at Ringberg 2015; presentation slides are available here. It is often difficult to follow presentations just from the slides, so I thought it was worth posting an annotated version of the slides relating to my own talk, “Pitfalls in climate sensitivity estimation”. To make it more digestible and focus discussion, I am splitting my presentation into three parts. I’ve omitted the title slide and reinstated some slides that I cut out of my talk due to the 15 minute time constraint.

Slide 2

[Slide image: ringSlide2]

In this part I will cover the first bullet point and one of the major problems that cause bias in climate sensitivity estimates. In the second part I will deal with one or two other major problems and summarize the current position regarding observationally-based climate sensitivity estimation. In the final part I will deal with the third bullet point.

In a nutshell, I will argue that:

  • Climate sensitivity is most reliably estimated from observed warming over the last ~150 years
  • Most of the sensitivity estimates cited in the latest IPCC report had identifiable, severe problems
  • Estimates from observational studies that are little affected by such problems indicate that climate sensitivity is substantially lower than in most global climate models
  • Claims that the differences are due to substantial downwards bias in estimates from these observational studies have little support in observations.

