BBC Radio 4 on Climategate

http://www.bbc.co.uk/programmes/b01nl8gm

Karoly and Gergis vs Journal of Climate

On June 10, a few days after the Gergis-Karoly-Neukom error had been identified, I speculated that they would try to re-submit the same results, glossing over the fact that they had changed the methodology from that described in the accepted article. My cynical prediction was that a community unoffended by Gleick or upside-down Mann would not cavil at such conduct.

The emails http://www.climateaudit.info/correspondence/foi/gergis/Part%202a%20Journal%20Correspondence.pdf show that Karoly and Gergis did precisely as predicted, but Journal of Climate editors Chiang and Broccoli didn’t bite. Most surprising perhaps was that Karoly’s initial reaction was agreement with the Climate Audit criticism of ex post correlation screening. However, when Karoly realized that the reconstruction fell apart using the methodology of the accepted article, he was very quick to propose that they abandon the stated methodology and gloss over the changes. In today’s post, I’ll walk through the chronology. Continue reading

Gergis et al Correspondence

Michael Kottek writes in the comment section:

The results of my FOI request to the University of Melbourne can be seen here:

http://tinyurl.com/96ey5dt

I requested all correspondence between the authors and the journal regarding the paper. The referees reports were exempted as were documents relating to the resubmitted paper.

I also requested correspondence between the authors after the paper was accepted. Once again emails relating to the resubmitted paper were exempted, and personal material redacted.

I note that emails regarding the paper that were received by one author and not forwarded to the others would not have been covered by my request.

Despite the embarrassment of the withdrawn paper, the University is to be commended for its no-nonsense approach to this request. As an alumnus, I am pleased that the response is far more sensible than the approach taken by the UEA and UVa.

Chronology (Steve: Oct 28, 9 pm Eastern)
Here is a more detailed commentary which raises questions about Karoly’s claim that they had “independently” discovered the screening error on June 5. [Note: times in the emails are in multiple time zones. In the analysis below, it is my understanding that in June 2012, relative to UTC, Melbourne time was +10, Switzerland +2, Eastern -4, CA blog time -5.]

May 31 

As CA readers are aware, the issue of screening in Gergis et al 2012 was first raised in a May 31 CA blog post, a discussion that directly quoted the following paragraph of Gergis et al:

Our temperature proxy network was drawn from a broader Australasian domain (90E–140W, 10N–80S) containing 62 monthly–annually resolved climate proxies from approximately 50 sites (see details provided in Neukom and Gergis, 2011)… Only records that were significantly (p<0.05) correlated with the detrended instrumental target over the 1921–1990 period were selected for analysis. This process identified 27 temperature-sensitive predictors for the SONDJF warm season (Figure 1 and Table 1) henceforth referred to as R27.

The CA discussion, and, in particular, the May 31 Name and Shame blog post, was referred to on numerous occasions in the internal emails among Gergis, Neukom and others over the next few days, commencing almost immediately with an email from Gergis to Neukom and other coauthors as follows:

We should all be aware that this is unfolding: https://climateaudit.org/2012/05/31/myles-allen-calls-for-name-and-shame

In my original May 31 post, I had presumed that Gergis et al had used correlation screening against trending series (which I termed the “Screening Fallacy”), a topic discussed on a number of occasions at critical climate blogs (see references in original post.) Against this, Jim Bouldin and others argued that Gergis et al had employed detrending screening, thereby avoiding the CA criticism.
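The Screening Fallacy is easy to demonstrate on synthetic data. The sketch below is my own illustration (not the Gergis et al code, and the 0.3 correlation threshold is an arbitrary stand-in for their p<0.05 test): it screens 62 pure-noise random walks against a trending 1921–1990 target, first on raw data and then after detrending both sides.

```python
import numpy as np

rng = np.random.default_rng(0)

# 62 pure-noise "proxies" (random walks) and a trending instrumental target
# over 1921-1990. By construction, no proxy contains any temperature signal.
n_proxies, n_years = 62, 70
years = np.arange(1921, 1991)
target = 0.01 * (years - 1921) + rng.normal(0, 0.15, n_years)
proxies = np.cumsum(rng.normal(0, 1, (n_proxies, n_years)), axis=1)

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

def detrend(x):
    t = np.arange(len(x))
    return x - np.polyval(np.polyfit(t, x, 1), t)

# Ex post screening on raw data: a random walk whose drift happens to match
# the target's trend correlates strongly and is "selected".
raw_pass = [i for i in range(n_proxies) if abs(corr(proxies[i], target)) > 0.3]

# Detrended screening: the trend is removed first, so a proxy passes only if
# its year-to-year wiggles track the target's -- which pure noise rarely does.
det_pass = [i for i in range(n_proxies)
            if abs(corr(detrend(proxies[i]), detrend(target))) > 0.3]

print(len(raw_pass), len(det_pass))  # raw screening admits far more noise series
```

Averaging the raw-screened series then yields a spurious hockey stick from pure noise, which is the nub of the CA criticism of ex post screening against trending series.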

June 6 (Australia Time)

The CA discussion quickly led to Jean S checking the correlations of the available series.

At 07:42 AM June 6 Melbourne time (blog time June 5 16:42), Jean S reported (CA comments here) that Gergis’ claim that they used detrended correlations for screening was false (asking me and others to check). Within an hour, Jean S’ comment attracted online notice from Hu McCulloch (blog 17:39) and Kenneth Fritsch (blog 18:19).

About two hours after Jean S’ post – by now nearly 2 am in Switzerland (June 6 09:46 Melbourne; 01:46 Switzerland; blog 18:46 June 5), Neukom urgently notified his Australian associates of the same problem that Jean S had reported at CA a couple of hours earlier. Neukom had a skype discussion with Gergis, followed up by an email (2Gergis, page 77) to Gergis, Karoly and others. Neukom noted that the mistake was related to the proxy screening (then under discussion at Climate Audit) and thus a “delicate issue”:

As just discussed with joelle on skype, I found a mistake in our paper in journal of climate today. It is related to the proxy screening, so it is a delicate issue. In the paper we write that we do the correlation analysis for the screening based on detrended (instrumental and proxy) data, but in reality we did not use detrended data.

Meanwhile at CA (blog time June 5 20:11; Melbourne 11:11), CA reader HaroldW reported that he had confirmed Jean S’ results. I checked in at CA with a question to Jean S (blog time 20:49 June 5; Melbourne 11:49).

June 7 (Australia time)
The following morning (10 am June 6 blogtime; June 7 01:00 Melbourne), I reported that I had confirmed Jean S’ results, posting the discussion as a fresh post a little later (12:01 June 6 blog time; 03:01 June 7 Melbourne).

Five hours later (05:56 AM June 7 Melbourne; 14:56 blog time), then late evening June 6 in Switzerland, Neukom wrote an assessment of the situation to Karoly and expressed his desire to discuss matters with Karoly the following day. One hour later (06:48 AM June 7 Melbourne; June 6 22:48 Switzerland), Karoly wrote back to Neukom urging use of detrended data for their calculation (as stated in the article):

I think that it is much better to use the detrended data for the selection of proxies, as you can then say that you have identified the proxies that are responding to the temperature variations on interannual time scales, ie temp-sensitive proxies, without any influence from the trend over the 20th century. This is very important to be able to rebut the criticism that you only selected proxies that show a large increase over the 20th century ie a hockey stick.

The same argument applies for the Australasian proxy selection. If the selection is done on the proxies without detrending ie the full proxy records over the 20th century, then records with strong trends will be selected and that will effectively force a hockey stick result. Then Stephen McIntyre criticism is valid. I think that it is really important to use detrended proxy data for the selection, and then choose proxies that exceed a threshold for correlations over the calibration period for either interannual variability or decadal variability for detrended data.
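Karoly’s proposed rule – screen on detrended correlations, so that selection reflects interannual agreement rather than a shared 20th-century trend – can be sketched as follows. This is an illustrative reconstruction of the stated method, not the authors’ actual code; the p<0.05 threshold comes from the Gergis et al paragraph quoted above, and the toy series names are my own.

```python
import numpy as np
from scipy import stats

def detrend(x):
    """Remove the least-squares linear trend from a series."""
    t = np.arange(len(x))
    return x - np.polyval(np.polyfit(t, x, 1), t)

def passes_detrended_screen(proxy, target, alpha=0.05):
    """Admit a proxy only if its detrended values correlate significantly
    (p < alpha) with the detrended instrumental target, i.e. it tracks
    interannual variability rather than merely sharing the trend."""
    r, p = stats.pearsonr(detrend(proxy), detrend(target))
    return p < alpha, r

# Toy check: a proxy sharing the target's interannual wiggles passes, while
# an independent series that merely shares the trend generally does not.
rng = np.random.default_rng(1)
years = np.arange(1921, 1991)
target = 0.01 * (years - 1921) + rng.normal(0, 0.15, len(years))
good = target + rng.normal(0, 0.05, len(years))
trend_only = 0.01 * (years - 1921) + rng.normal(0, 0.15, len(years))

print(passes_detrended_screen(good, target))
print(passes_detrended_screen(trend_only, target))
```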

Another hour later (08:03 June 7 Melbourne; 00:03 June 7 Switzerland), Gergis asked Neukom whether he was “250% certain” of the problem. The emails are then surprisingly quiet through the rest of June 7 (Australia).

June 8 (Australia)
Early in the Australian morning of June 8 (06:47 AM Melbourne; 15:56 June 7 blog time), Karoly emailed Neukom and others, referring them to the CA post of about 27 hours earlier (12:01 June 6 blog time; 03:01 June 7 Australia; 19:01 June 6 Switzerland):

Someone has now tried to reproduce the screening of the 27 selected proxies against the target Australasian temp series and is unable to reproduce the claimed results in the paper. https://climateaudit.org/2012/06/06/gergis-significance/. I suggest that you look at this Stephen McIntyre post. Given that the error is now identified in the blogosphere, we need to notify the journal of the error and put the manuscript on hold

Although the CA post had cited Jean S’ results of June 5, Karoly disregarded these links back to the original provenance.

Within an hour (07:26 AM June 8 Melbourne), Gergis (2G:37, p 112; 2K:31) acknowledged Karoly’s email. At 08:24 AM Melbourne (00:24 Swiss), Gergis wrote to Neukom, but did not copy Karoly or other coauthors; she argued to Neukom that they had emails showing that they “became aware of the issue” prior to the “latest blogpost” because they had “contacted authors for permission to release their records”:

Hi Raphi, we have emails that predate this latest blogpost that indicate we became aware of the issue as we contacted authors for permission to release their records

CA readers will recognize that Gergis here is sliding over a couple of issues: the request for data had come before the detrended correlation issue had arisen, and she had only asked authors to release records because of the May 31 CA blogpost, in which screening had already been made an issue. Jean S’ results were reported in comments on June 5; my June 6 CA blogpost reported and linked to Jean S’ results from the previous day.

Neukom immediately wrote back (08:26 June 8 Melbourne; 00:26 Swiss) warning his coauthors that caution needed to be taken with detrended correlations.

A little later (08:42AM June 8 Melbourne; 00:42 Swiss), Neukom sent Gergis a reconstruction with the (only) eight proxies that passed detrended correlation. Karoly quickly noted (08:54AM Melbourne; 00:54 Swiss) that some of the correlations were now flipped!

Throughout the rest of June 8, Karoly and Gergis started notifying others of the problem.

At 10:38 AM Melbourne (02:38 Swiss), Gergis (2K:34; page 73) notified coauthors Gallant and Phipps (cc Neukom) of the problem. In this first notice, Gergis said that Neukom had identified the problem on the morning of June 6 (presumably referring to Neukom’s email received at 09:42 AM June 6 Melbourne; 01:42 Switzerland):

Following on from my attempt to gain permission to release non publically available records released and submitted online with NOAA over the weekend, on Wednesday [June 6] morning Raphi discovered an error in the Aus2K temperature analysis….

Meanwhile, Stephen McIntyre and co have located the error overnight (I was alerted through an intimidating email this morning): https://climateaudit.org/2012/06/06/gergis-significance . So instead of this being a unwanted but unfortunately normal part of science, we are likely to have an extremely negative online commentary about our work. Although it was an unfortunate data processing error, it does have implications for the results of the paper. We wish to alert you to this issue before the paper goes into final production.

Although Gergis refers here to a supposedly “intimidating email” (and uses the same phrase to the Journal of Climate later that day), no such email is included in the FOI emails. Nor did I send her any such email. Gergis also inaccurately notified her co-authors that “McIntyre and co have located the error overnight [June 8]”. In fact, the error had been identified at Climate Audit nearly two days earlier.

At 11:16 AM (Melbourne), Gergis sent Karoly a draft notice letter to the Journal of Climate. Karoly reverted at 11:47 AM with his edits, presumably those shown in the redlined version (2K:35, page 75). Karoly’s version dated the discovery of the error to June 6:

While attempting to release non-publicly available records used in our study with NOAA over the weekend [June 2-3], our team discovered an error in our paper…
When we went to recheck this on Wednesday [June 6], we discovered that the records used in the final analysis were not detrended for proxy selection, making this statement incorrect.

Back-dating to June 5

Soon afterwards (12:35 Melbourne), Gergis sent a revised notice letter to the Journal of Climate. In the revised letter, the discovery date was now said to be Tuesday, June 5, rather than Wednesday, June 6, as stated in the draft letters. This back-dating ostensibly coopered up their claim to discovery priority independent of Climate Audit (but is not supported in their own correspondence). Gergis et al also inaccurately told the Journal of Climate that Climate Audit had identified the error “overnight [June 8]”, more than 2.5 days after the actual time of the original report. Their letter stated:

While attempting to release non-publicly available records used in our study with NOAA this week, our team discovered an error in our paper….

When we went to recheck this on Tuesday [June 5], we discovered that the records used in the final analysis were not detrended for proxy selection, making this statement incorrect…

Meanwhile, independently of our team’s detection of this error, prominent climate change blogger Stephen McIntyre has identified the issue overnight (I was alerted through an intimidating email this morning): http://climateaudit.org/2012/06/06/gergis-significance. So instead of this being a unwanted but unfortunately normal part of science, we are likely to have an extremely negative online commentary about our work and possibly the journal.

At 14:19 Melbourne, Gergis sent a near-identically worded notice to PAGES 2K, again saying that they had discovered the error on June 5 (Tuesday), adding a warning to the PAGES 2K consortium that they might have to archive all their data:

In terms of the consortium paper, please run with the current version of the Aus2K temperature reconstruction but please note that it may change in coming weeks…

They have now demanded that the full network of records be made available. Over the past week I have been busy contacting authors of non publically available records that were not used in the final temperature reconstruction to attempt to release their data. Everyone managed to agree that just the C20th portions used for calibration be released, but some still do not want to make their full records available.

This issue has implications for other 2K groups: ANY mention of proxy ‘screening’ or selection criteria is likely to be heavily criticised. Although we attempted to be transparent about our methodology, this has backfired and caused a lot of trouble. I just thought you should be aware that it may not be enough that only the records used in the final analysis are already available. It is possible that every record from every region (those rejected from the analysis and those used in final reconstructions) will need to be made available once the consortium paper is published to help benefit the broader group.

During the Melbourne afternoon of June 8, Karoly worked with University of Melbourne public relations staff on a statement, sending a draft to Gergis and others at 15:57 (2K:38); Gergis reverted at 16:17. This statement adopted the date of “Tuesday 5 June” as the date on which the error was discovered:

While the paper states that “both proxy climate and instrumental data were linearly detrended over the 1921-1990 period”, it was discovered on Tuesday 5 June that the records used in the final analysis were not detrended for proxy selection, making this statement incorrect. Although this is an unfortunate data processing issue, it is likely to have implications for the results reported in the study. The journal has been contacted and the publication of the study has been put on hold.

At 17:56 Melbourne (09:56 Swiss), Karoly sent Neukom a short and long version of their statement. In the long version, they neutrally said that the error was discovered on “Tuesday June 5” (without attribution); no date was mentioned in the short statement. Karoly said that they planned to send a statement to me containing the above paragraph. Neukom reverted immediately (18:18 Melbourne), suggesting that they include the date in the short statement (2K, 168):

Maybe we can include the date when we discovered the error also in the short statement so that it is clear that we did not just do it as a reaction to the McIntyre blog?…

And I will try to write down everything that happened in the correct chronological order to be sure I can recall this all correctly. Because I think it may be interesting for some people to see how the error and its discovery developed and when/how we (re-)acted.

Neukom also requested that his work be checked:

I think all the analysis needs to be replicated by someone else (maybe Ailie or Steven) to make sure all other errors I made can be identified and eliminated.

Karoly (18:36) reverted to Neukom that he would put the date in the email to me, but doubted that I would “accept that we didn’t find the issue without his help, but that doesn’t matter”. Karoly additionally asked Neukom to keep “good records” of what happened.

I am about to go home and have some dinner, then I’ll send this to McIntyre, so that he gets it Friday morning. Melbourne Uni wanted as little detail in the short statement as possible. I’ll put the date in my email to McIntyre, which he will likely post, as well as the short statement. I doubt that he will accept that we didn’t find the issue without his help, but that doesn’t matter…

Please keep good records of what happened when, and what you did. Also, keep any records of emails you receive from McIntyre or other bloggers. Joelle is being sent hate emails.

If the FOI release is complete, then, while there are some critical emails, none appears to me to be fairly classified as “hate mail” – the term “hate mail”, as used by climate scientists, appears to include anything that is merely critical. Karoly sat on the notice to me overnight and sent me an email the following morning, Melbourne time, with the same paragraph. Karoly additionally noted that participants at CA had “also” identified this “data processing issue”:

We would be grateful if you would post the notice below on your ClimateAudit web site. We would like to thank you and the participants at the ClimateAudit blog for your scrutiny of our study, which also identified this data processing issue.

I reported this at the time, with eyebrows more than somewhat quizzically raised at the Gavinesque coincidence that, after months of peer review and after acceptance of their paper, they had supposedly “independently” discovered the error in screening on June 5 – the very day that the precise error was spelled out at CA (though the issue of Gergis screening had already been discussed for a few days.)

The removal of the Gergis paper had been noted in a comment at RC (June 8 15:50 blog time; June 9 03:50 Melbourne). Another RC commenter pointed out to Schmidt that the problem had been discovered at CA:

Gavin – you ought also to mention that the problem was discovered at the Climate Audit blog

Mann appears to have contacted Karoly soon afterwards, as, within 10 minutes of sending this email to me, Karoly forwarded the email to Mann, with a covering note that the comment at RC about removal was correct. Even though Karoly had told Mann about the error, Mann replied that he and the other RC authors would falsely tell RC readers that they had “no further information” on the retraction of the paper from the journal website, and that he would bring Schmidt and Steig into the plan:

We have simply noted at RC in the comments that the paper does appear to have been retracted from the AMS website, and we have no further information as to why. I will share this w/ Eric and Gavin so they know the status,

Mann also made defamatory remarks about me to Karoly:

Well I’m afraid McIntyre has probably already leaked this anyway. I probably don’t have to tell you this, but don’t trust him to behave ethically or honestly here, and assume that anything you tell him will be cherry-picked in a way that maximally discredits the study and will be leaked as suits his purposes.

Karoly pointed out to Mann (2K:55; 11:19 Melbourne) that there was discussion at CA of the announcement here. Karoly told Mann that they had a “fully-documented” record demonstrating their priority over Climate Audit:

PS We do have a fully-documented record of who, when and how the data processing issue was identified by a member of the author team independent of, and before, any posts on this issue at CA or other web sites.

Needless to say, no such “fully-documented record” was disclosed to Michael Kottek.

IPCC Check Kites Gergis

A few days ago, WUWT pointed out that the American Meteorological Society webpage showed that the Gergis et al paper had been officially “withdrawn”. However, readers should know better than to presume that this would have any effect on IPCC use of the reconstruction.

The withdrawal of the Gergis article hasn’t had the slightest impact on IPCC usage of the Gergis reconstruction, which continues to be used in the recently released AR5 Second Order Draft, thanks to academic check kiting reminiscent of Ammann and Wahl. Tim Osborn of CRU is a Lead Author of the AR5 chapter (as he was in AR4) and would be familiar with the technique from AR4.

Continue reading

Two Blogs on Climate Sensitivity

Two interesting blog posts on climate sensitivity. Troy CA here and Paul_K at Lucia’s here. I haven’t parsed either post, but both are by thoughtful commenters and deserve a read.

AGU Webinar on Michael Mann

An Inside Look at the Michael Mann Case
Featuring Peter Fontaine, counsel to Michael Mann and a leader of Cozen O’Connor’s Brownfield Development and Climate Change practices

To join the meeting:
http://agu.adobeconnect.com/legalwebinar2/

• Please login as a guest with your first and last name. The meeting does not require a password. The meeting hosts will authorize you to enter the meeting.
• We recommend you use the audio on your computer. You will be able to hear the presentation and ask questions via a chat box.

A Belated SI for D’Arrigo et al 2006

The other day, I noticed that the long dormant WDCP supplementary information (and here) for D’Arrigo et al 2006, of which Rob Wilson is a coauthor, had been updated on April 30, 2012. In 2005, D’Arrigo et al (then under review at JGR) had been cited by IPCC AR4. At the time, as an IPCC reviewer, I attempted to obtain both very rudimentary information about the sites used and unarchived measurement data from the authors, from the IPCC and from the journal (JGR, which was theoretically subject to AGU policies requiring data archiving.) My efforts were totally rebuffed. I was even threatened with expulsion as an IPCC reviewer for asking for data. I tried again in October 2009 and was once again rebuffed. So what accounted for this belated update nearly seven years later? The backstory proved interesting. The new SI is an improvement but still unsatisfactory and, unfortunately, contained errors on the long contentious Polar Urals data set. Continue reading

“Forensic Bioinformatics”

Pielke Jr has sent me the following two links on the longstanding dispute between Baggerly and Coombes, two biostatisticians, and a team of cancer researchers at Duke University led by young star Dr Potti. See CBS News here and a Baggerly 2010 lecture here.

Baggerly and Coombes had attempted to replicate a leading paper; their efforts ultimately led to the retraction of the papers. But the decisive step in the retraction did not arise from the proper operation of the peer review system or university investigations, but through something entirely fortuitous. Continue reading

Lewandowsky and “Hide the Decline”

Ethics bait-and-switcher Stephan Lewandowsky and his sidekick, Klaus Oberauer, have added hide the decline to their repertoire at the University of Western Australia blog.

As CA readers are well aware, the Briffa et al 2001 reconstruction, based on 387 tree ring density chronologies, goes down in the latter part of the 20th century – clearly contradicting the Mann reconstruction. The inconsistency between the two reconstructions ought to have troubled anyone with an actual scientific interest in the validity of these reconstructions. However, in order not to “dilute the message” in IPCC TAR, climate scientists chose to “hide the decline”, by simply deleting adverse data that went down. “Hiding the Decline” is the title of Andrew Montford’s forthcoming book.

Lewandowsky appears to be yet another person who has been “tricked” (TM – climate science) by IPCC and others hiding the decline in the Briffa reconstruction. In his post on replication, Lewandowsky claimed that the Briffa et al 2001 decline not only did not contradict the Mann hockey stick, but replicated it:

Replicable effects such as the conjunction fallacy are obviously not confined to cognitive science. In climate science, for example, the iconic “hockey stick” which shows that the current increase in global temperatures is unprecedented during the past several centuries if not millennia, has been replicated numerous times since Mann et al. published their seminal paper in 1998. (Briffa et al., 2001; Briffa et al., 2004; Cook et al. 2004; D’Arrigo et al., 2006; Esper et al., 2002; Hegerl et al., 2006; Huang et al., 2000; Juckes et al., 2007; Kaufman et al., 2009 ; Ljungqvist, 2010; Moberg et al., 2005; Oerlemans, 2005 ; Pollack & Smerdon, 2004; Rutherford et al., 2005; Smith et al., 2006).

One of the fundamental properties of proxy series is whether they go up or down in the 20th century – a seemingly elementary phenomenon that we reflected on in connection with Mann and upside-down Tiljander here, where a reader linked to an amusing video in which the protagonists didn’t care whether data went up or down, a video that seems apt for Lewandowsky as well:

Aside from satire, surely the contradiction between the Briffa reconstruction and the Mann reconstruction ought to be more worrying to anyone actually interested in proxy reconstructions. Both the Mann reconstruction and the Briffa reconstruction used very large networks of tree ring data: explaining why one series went up while the other went down ought to have been a priority for specialists. (The “consensus” explanation by the Hockey Team is simply incorrect and all-too-typical armwaving. They claim that the Briffa reconstruction, unlike the others, is from a small geographically unrepresentative subset. In fact, the Briffa reconstruction is from a very large network of 387 sites, while the other reconstructions cited above are from small (5-18 site) networks, in which bristlecones and/or Yamal are important components. The Mann reconstruction, like Briffa, is from a large network, but its methodology results in very high weighting to the bristlecones.)

The idea that present temperatures are “unprecedented during the past several centuries” was definitely not original to the Mann hockey stick, as this view dated back to at least Hubert Lamb and could be said to be a consensus view.

Nor does the “replication” claimed by Lewandowsky necessarily impress all specialists. Esper et al 2012 (with coauthors Zorita, Wilson and Timonen) recently criticized Lewandowsky’s preferred reconstructions in terms reminiscent of Climate Audit, as follows:

The missing millennial scale trends in existing TRW records as well as the increased cooling trend after removal of this proxy type from the Arctic-wide estimates both suggest that the widely cited hemispheric reconstructions underestimate pre-instrumental temperatures to some extent. This hypothesis seems to be important as most of the annually resolved, large-scale records are solely composed of or dominated (on longer timescales) by TRW data, and their spatial domain encompasses the Northern Hemisphere extratropics including northern boreal and Arctic environments. Inclusion of tree-ring data that lack millennial scale cooling trends, as revealed here (Fig. 3 and Supplementary Fig. S1), thus probably causes an underestimation of historic temperatures.

D’Arrigo et al 2006, one of the supposed confirmations of the Stick, spoke against the ability to draw conclusions of the medieval relative to modern period as follows:

we stress that presently available paleoclimatic reconstructions are inadequate for making specific inferences, at hemispheric scales, about MWP warmth relative to the present anthropogenic period and that such comparisons can only still be made at the local/regional scale

Social Priming
Lewandowsky, who has written in the past on “social priming”, noted in his post that Kahneman had recently slagged social priming theories (a development covered at CA here.) Lewandowsky’s post cited the following classic example of social priming:

For example, it has been reported that people walk out of the lab more slowly after being primed with words that relate to the concept “old age” (Bargh et al., 1996)

As partial support for the concept of “social priming”, it seems to me that there is a statistically significant increase in the incidence of drivel in writings by activists after being primed with words that relate to “climate skeptics”. This hypothesis will be more difficult to test among authors where the incidence of drivel is already high, even without social priming.

The Lewandowsky Ethics Switch

Lewandowsky had to obtain approval for his survey from the UWA Ethics Committee. Simon Turnill has just received remarkable information on this process through FOI, described in an excellent post here. Documents here.

The information showed that Lewandowsky used a bait-and-switch. Lewandowsky had obtained approval for a project entitled “Understanding Statistical Trends”. The original proposal had nothing to do with his bizarre online conspiracy theory. Lewandowsky switched the proposal in August 2010.

In addition to Simon’s points, note that Lewandowsky stated the following in his ethics proposal:

Because I am interested in soliciting opinions also from those folks, I would like to withhold my name from the survey as I fear it might contaminate responding

Nonetheless, Lewandowsky’s name was prominently displayed at some of the anti-skeptic blogs. Lewandowsky’s fears that the survey would be contaminated seem to have been justified.