Scotland Yard Interviewed

The Norfolk Constabulary report a “sophisticated and carefully orchestrated attack on the CRU’s data files, carried out remotely via the internet”.

CA sources have learned that UK police believe that the plan was carried out by a “mindermast”. The insidious methods of a mindermast are revealed in the interview here.

Update: Police Q&A here. H/t Hilary.

The Norfolk Police reported:

The nature and sophistication of the attack does not suggest that it was anyone at the UEA.

Apparently, one of the steps in the “attack” involved calculating a trend in Excel. The police, quite reasonably in the opinion of many observers, accordingly excluded anyone at UEA.

The vocabulary of the Norfolk briefing is pleasingly similar to that of Peter Cook’s plod. Cook’s police inspector solemnly pronounced the Great Train Robbery to be the work of thieves:

Q: Who perpetrated this awful crime?
A (deadpan): We believe this to be the work of thieves. And I’ll tell you why. The whole pattern is reminiscent of past robberies where we have found thieves to be involved: the telltale loss of property, the snatching away of money. It all points to thieves.

Life surely imitates art in the pronouncement of the Norfolk Constabulary that they believed the release of emails to be the work of criminals, criminals so competent that no one at the University of East Anglia could be suspected. In the words of Peter Cook, by a mindermast.

CRU Server Returned

The backup server has been returned to CRU. UEA informed the Information Tribunal today (in connection with my appeal of their refusal to provide the attachments to the notorious delete-all-emails email).

UEA stated to the Tribunal that:

it confirms its intention to preserve the server in a secure manner for the time being.

I doubt that it intends to preserve the server for a minute longer than it is forced to.

Norfolk Police Inquiry at East Anglia Ends

Andrew Montford reports that the East Anglia police inquiry has closed. The police say that it was a hack, rather than a leak or inadvertent exposure, but did not provide details of why they arrived at that conclusion.

Norfolk Constabulary has made the decision to formally close its investigation into the hacking of online data from the Climatic Research Unit (CRU) at the University of East Anglia (UEA) in Norwich.

The decision follows a comprehensive investigation by the force’s Major Investigation Team, supported by a number of national specialist services, and is informed by a statutory deadline on criminal proceedings.

While no criminal proceedings will be instigated, the investigation has concluded that the data breach was the result of a ‘sophisticated and carefully orchestrated attack on the CRU’s data files, carried out remotely via the internet’.

Senior Investigating Officer, Detective Chief Superintendent Julian Gregory, said: “Despite detailed and comprehensive enquiries, supported by experts in this field, the complex nature of this investigation means that we do not have a realistic prospect of identifying the offender or offenders and launching criminal proceedings within the time constraints imposed by law.

“The international dimension of investigating the World Wide Web especially has proved extremely challenging.

“However, as a result of our enquiries, we can say that the data breach was the result of a sophisticated and carefully orchestrated attack on the CRU’s data files, carried out remotely via the internet. The offenders used methods common in unlawful internet activity to obstruct enquiries.

“There is no evidence to suggest that anyone working at or associated with the University of East Anglia was involved in the crime.”

The security breach was reported to Norfolk Constabulary on 20 November 2009, following publication of CRU data on the internet from 17 November onwards.

An investigation was launched by the joint Norfolk and Suffolk Major Investigation Team, led by Det Chief Supt Gregory, with some support from The Met’s Counter Terrorism Command, the National Domestic Extremism Team and the Police Central e-crime Unit, along with consultants in online security and investigation.

The investigation, code-named Operation Cabin, focused on unauthorised access to computer material, an offence under the Computer Misuse Act 1990, which has a three year limit on proceedings from the commission of the original offence. It has been concluded by Norfolk Constabulary, in consultation with The Met, that due to outstanding enquiries this is now an unrealistic prospect.

Norfolk Assistant Chief Constable Charlie Hall, Protective Services lead, said: “Online crime is a global issue. While law enforcement agencies continue to develop our response to emerging threats, it falls upon individuals and organisations to be alert to this and take steps to mitigate risk as far as is practicable.”

Station Homogenization as a Statistical Procedure

Temperature stations are known to be affected by numerous forms of inhomogeneity. Allowing for such inhomogeneities is an interesting and not very easy statistical problem. Climate scientists have developed some homemade methods to adjust for such inhomogeneities, with Menne’s changepoint-based algorithm, introduced a few years ago in connection with USHCN, among the most prominent. Although the methodology was entirely statistical, they introduced it only in the climate literature, where peer reviewers tend to be weak in statistics and focused more on results than on methods.
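For readers unfamiliar with how such algorithms work, here is a toy sketch in Python of the underlying idea – compare a target station to a reference series, test the difference series for a step change, and shift the earlier segment accordingly. This is my own minimal illustration, not Menne’s actual pairwise algorithm; the station data, the threshold and the single-breakpoint assumption are invented for the example.

import numpy as np

def detect_step(diff, min_seg=10):
    # Scan candidate breakpoints in a target-minus-reference difference
    # series; return the split that maximizes a two-sample t statistic.
    best_t, best_k = 0.0, None
    for k in range(min_seg, len(diff) - min_seg):
        a, b = diff[:k], diff[k:]
        se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        t = abs(a.mean() - b.mean()) / se
        if t > best_t:
            best_t, best_k = t, k
    return best_k, best_t

def homogenize(target, reference, t_crit=4.0):
    # If a significant step is found, shift the earlier segment of the
    # target so that the two segments of the difference series line up.
    diff = target - reference
    k, t = detect_step(diff)
    adjusted = target.copy()
    if k is not None and t > t_crit:
        adjusted[:k] += diff[k:].mean() - diff[:k].mean()
    return adjusted

# Toy example: a 0.5 degree step (e.g. an instrument move) 60 "years" in.
rng = np.random.default_rng(0)
reference = rng.normal(0.0, 0.2, 120)
target = reference + rng.normal(0.0, 0.1, 120)
target[:60] -= 0.5
print(round((homogenize(target, reference) - target)[:60].mean(), 2))  # ~0.5

The statistical questions that actually matter – how the reference is chosen, how many breakpoints are allowed, what significance threshold is used, and what the adjustments do to long-term trends – are precisely the sort of issues on which professional statistical review would be valuable.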

Variants of this methodology have since been used in several important applied results. Phil Jones used it purportedly to show that the misrepresentations in the canonical Jones et al 1990 article about having inspected the station histories of Chinese stations “didn’t matter” (TM – climate science). More recently, the Berkeley study used a variant.

Anthony has today drawn attention to a statistical discussion of homogenization procedures, of which Demetris Koutsoyiannis is a coauthor. Continue reading

Another Untrue Allegation by Karoly

As CA readers are aware, David Karoly was a senior author of Gergis et al, which was withdrawn in June following criticism at Climate Audit. Despite (or perhaps because of) this experience, Karoly slagged me in a recent article reviewing Mann’s book, including an accusation that I was responsible for “promulgating misinformation”. I wrote Karoly stating that I try to write accurately and asked that he provide specific examples of “promulgating misinformation” or withdraw the allegation with an apology. I had hoped that such a request would trigger some sense of professional obligation on Karoly’s part to do the right thing. I made no mention of legal action in the letter, let alone any threat of it. As CA readers know, I’ve consistently discouraged those readers who regard litigation as a means of resolving problems.

Indeed, the request was partly successful, as Karoly proceeded to retract the article containing the untrue allegations against me. So I’m a bit surprised that Karoly falsely claimed that I had made a “threat of legal action”.

Here’s the correspondence.
Continue reading

“AGU Journals Should Ask Authors to Publish Results”

This is the title of a current op-ed in EOS, drawn to my attention by Leif Svalgaard. The policies advocated in the op-ed are obviously ones that I endorse.

AGU actually does have data policies that, on paper, would deal with many of the disputes that I’ve had with paleoclimate authors. From time to time, I’ve tried to get AGU editors to enforce even their present policy, but to date AGU editors have simply ignored such correspondence – not even acknowledging it.

For example, I tried to get Colin O’Dowd, editor of JGR, to enforce AGU policy on data. My exchange with O’Dowd is mentioned by Climategate correspondents, who felt confident that my initiative would be rebuffed. O’Dowd never even acknowledged any of my multiple emails (though I’m a member of AGU). I later wrote a member of the AGU board, who acknowledged my email but did nothing either.

More recently, in connection with Gergis et al, I asked Eric Calais, editor of GRL, to require one of the Gergis coauthors to archive data published in GRL (and considered in Gergis et al) in accordance with existing AGU data policies.

I am writing in respect to data for Neukom et al 2010, Multi-centennial summer and winter precipitation variability in southern South America, published in GRL.

There has obviously been considerable adverse publicity about authors of paleoclimate temperature reconstructions failing to archive data, and several committees have recommended that such practices end. This has occurred once again with Neukom et al 2010. Could you please ask the authors to archive the proxy data used in their reconstruction?

No answer. No acknowledgement.

The existing AGU policy is as follows:

1. Data sets cited in AGU publications must meet the same type of standards for public access and long-term availability as are applied to citations to the scientific literature. Thus data cited in AGU publications must be permanently archived in a data center or centers that meet the following conditions:

a) are open to scientists throughout the world.
b) are committed to archiving data sets indefinitely.
c) provide services at reasonable costs.

The World and National data centers meet these criteria. Other data centers, though chartered for specific lengths of time, may also be acceptable as an archive for this material if there is a commitment to migrating data to a permanent archive when the center ceases operation. Citing data sets available through these alternative centers is subject to approval by AGU.

2. Data sets that are available only from the author, through miscellaneous public network services, or academic, government or commercial institutions not chartered specifically for archiving data, may not be cited in AGU publications. This type of data set availability is judged to be equivalent to material in the gray literature. If such data sets are essential to the paper, authors should treat their mention just as they would a personal communication. These mentions will appear in the body of the paper but not in the reference list.

What tends to happen is that authors disregard rule (2) in the typical case where a dataset is described in a print publication but not archived, instead being passed hand to hand among pals. As I interpret AGU rule 2, the citation of the print article should not be permitted if the data itself has been obtained through gray channels. Unfortunately, neither AGU editors nor reviewers pay the slightest attention to the policy or take the slightest interest in breaches.

In addition to recommending that the existing rules be enforced, I would be inclined to toughen up rule 2 so that the obstacles to the use of gray versions were much more severe than at present.

Lonnie Thompson’s Legacy

In my last post, I observed that Ellen Mosley-Thompson’s archiving record was even worse than that of her husband, Lonnie Thompson, whose failure to adequately archive his ice core measurements has long been a subject of criticism at Climate Audit. In particular, I observed that Ellen had archived nothing from the 15 expeditions to Greenland and Antarctica that, according to her CV, she had led.

This post has occasioned fresh commentary on Lonnie Thompson’s archiving, which I will review in today’s post. In particular, I will assess Thompson’s statement in an email to a CA reader:

…our ice core data are archived at the World Data Center NOAA Paleoclimate data base in Boulder Colorado…

Despite Thompson’s claim, no data whatever is archived for many ice cores. For other cores, my issue is that the Thompson archive is completely inadequate, as I’ll discuss below. I remain mystified by Thompson’s resistance to establishing a comprehensive and meticulous archive of his measurement data, as, in my opinion, he should regard the establishment of such an archive as an essential part of his scientific legacy and, given his age and health, as his #1 priority.

Cores With No Archive Whatever
As noted in my recent post, according to her CV, Ellen has led “nine expeditions to Antarctica and six to Greenland to retrieve ice cores”. Antarctic sites include Plateau Remote, Dyer Plateau and Siple Station; Greenland sites include those in the PARCA program e.g. GITS, D2, D3, Raven, Tunu. Despite Lonnie Thompson’s claim that “our ice core data are archived at the World Data Center NOAA Paleoclimate”, no data from any of these Ellen-led expeditions has been archived at the NOAA Paleo website.

In my earlier post, I noted that one Greenland data set associated with Ellen had been archived, but pointed out that this came from a much earlier expedition that she had not led and that the data had been transcribed by third parties. For further clarification: in 1966, the first long ice core was drilled in Greenland at Camp Century (1387.4 m to bedrock). Lonnie Thompson studied dust concentrations in this core as part of his 1977 thesis entitled Microparticles, Ice Sheets and Climate (Ohio State University Institute of Polar Studies Report 64). These results were published in an academic journal in 1981 by Lonnie and his wife, Ellen Mosley-Thompson, as Temporal variability of microparticle properties in polar ice sheets (Journal of Volcanology and Geothermal Research 11). In 1990, this data was transcribed at WDC/NSIDC and eventually archived at NOAA (with the most recent deposit dated in 2007). This is the only archived Greenland/Antarctica data set associated with either Thompson and, as I stated in the earlier post, it does not come from one of the 15 expeditions to Antarctica/Greenland led by Ellen Mosley-Thompson.

Subsequent to this early consideration of dust in the Camp Century (Greenland) ice core, Lonnie’s career, as is well known, has been devoted to high-altitude tropical ice cores. From the mid-1980s to 2000, Thompson drilled a series of tropical ice cores: Quelccaya 1984, Dunde 1987, Guliya 1992, Huascaran 1993, Sajama 1997, Dasuopu 1997, Kilimanjaro 2000 and Puruogangri 2000. Each expedition was followed by a short article in Science, none of which, in my opinion, can really be considered a comprehensive technical report on the ice cores.

Since 2000, Thompson has conducted expeditions to Bona-Churchill (two cores 2002 – NSF award here), Quelccaya and Coropuna (four cores 2003 – NSF award here), Puncak Jaya, New Guinea and Nevado Hualcán, Peru (four cores 2007 – see NSF award here), and most recently Alto dell’Ortles, Italy (see NSF award here). See Thompson’s NSF award history here.

Nothing has been archived at NOAA from any of these expeditions despite Thompson’s assertion that “our ice core data are archived at the World Data Center NOAA”.

NSF funding for Puncak Jaya, Hualcan and Ortles has not expired (it continues in each case to March 2013), but NSF funding has expired for the other three programs (Bona-Churchill – May 2006; Quelccaya/Coropuna – June 2007; Naimona’nyi – March 2010). Nor has there been any publication of the Bona-Churchill ice core, a lacuna noted at CA for a number of years – see here.

Cores with Very Defective Archives
Let me now turn to the archival situation for the early cores, where Thompson has uniformly failed to provide anything like a comprehensive or definitive archive of measurement data, but has (in some cases, grudgingly) provided digital versions of a single figure in an article, which Thompson has then attempted to pass off as an adequate archive.

Thompson’s NSF award for Quelccaya/Coropuna contains a relevant description of the major component of an adequate archive (though, needless to say, Thompson has not established such an archive for these cores):

This award will help obtain and analyze four ice cores to bedrock from the Quelccaya and Coropuna ice caps in the Peruvian Andes. These new cores, covering the last 10,000 years will provide a more robust and detailed history of Pacific sea surface temperature variability than has been possible with previous cores. Approximately 6,000 samples will be obtained from each core for stable isotopes of oxygen and deuterium, insoluble dust concentration and size distribution, and soluble aerosol chemistry analyses. These and other proxy measurements will be used to reconstruct a regional temperature and precipitation history and produce a high-resolution record of El Nino Southern Oscillation (ENSO) variations

This is precisely in accordance with the sort of definitive archive that I’ve advocated for a number of years. Each Thompson ice core has several thousand samples, on which various measurements are typically taken: O18, dust, chemistry, etc. If Thompson’s ice cores are as important as represented, then they deserve nothing less than a definitive archive, i.e. the measurements for all 6000 or so samples for each core. These are not “large” archives. I presume that much of the data is already in digital form somewhere at Ohio State.
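To give a sense of the scale involved (my own back-of-the-envelope arithmetic, using the sample count from the NSF description quoted above and an assumed handful of measured fields per sample), a full sample-level archive for one core is a trivially small file:

samples = 6000          # per core, per the NSF award description above
fields = 6              # assumed: depth, O18, deuterium, dust, two chemistry species
chars_per_value = 10    # generous allowance per value in a plain CSV
size_mb = samples * fields * chars_per_value / 1e6
print(f"~{size_mb:.2f} MB per core")   # on the order of a third of a megabyte

Even multiplied across every core Thompson has ever drilled, this is a negligible amount of data by any modern standard.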

Other information that is required for a definitive archive is detailed information on layer thickness (including photographs of the ice core) used to date the core. The dating of rapidly thinning ice cores is subject to a variety of uncertainties. Subsequent investigators should be able to re-examine Thompson’s dating and, if they disagree, they should have the information that enables them to reassess the isotope history. (An example of such a re-consideration is Vinther’s recent reassessment of the 1970s and 1980s vintage Agassiz cores in light of information from more recent Greenland drilling.)
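As a toy illustration of why the layer-thickness information matters (this is the textbook Nye approximation for a steady-state ice cap, not the flow model actually used for these cores): if annual layers of surface thickness $\lambda_0$ thin linearly with depth $d$ in ice of total thickness $H$, then

$$\lambda(d) = \lambda_0\left(1 - \frac{d}{H}\right), \qquad t(d) = \int_0^d \frac{\mathrm{d}d'}{\lambda(d')} = \frac{H}{\lambda_0}\,\ln\frac{H}{H-d},$$

so the age–depth relation diverges as $d \to H$, and small changes in the assumed thinning produce large changes in the dates assigned to the oldest ice.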

Instead of this sort of definitive archive, Thompson’s archives at NOAA for Dunde, Guliya and Dasuopu were, until a few months ago, nothing more than decadal O18 chronologies for the period 1000–1990, with no detailed sample information, no ancillary dust or chemistry data, and nothing on the deeper portions of the cores.

Aside from mere craftsmanship, there is an important additional reason why Thompson should be expected to archive all measurement data for these cores. In multiproxy studies, the raw measurement data (by sample) is combined with the dating assigned to the samples to produce (for example) isotope “chronologies” (to borrow the term used by dendros). As discussed at CA on numerous occasions – see, for example, here – Thompson has published or distributed inconsistent chronologies (see the figure below). Some of the inconsistencies are hard to understand. The Dunde version of Thompson et al (PNAS 2006) is inconsistent with the version in Yao et al (Ann Glac 2006), of which Thompson was a coauthor. The Dunde version of Thompson et al (PNAS 2006) is consistent with the version used in Mann et al (1998), but not with the intermediate Thompson et al (Clim Chg 2004).


Dunde Versions – see here.
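To make concrete what combining sample data with a dating model involves (a toy sketch of the arithmetic only – the data and the age models below are invented, and this is not Thompson’s actual procedure): the same sample-level measurements averaged under two different age models yield two different decadal chronologies, which is precisely why both the sample data and the dating information are needed to sort out inconsistent versions.

import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
samples = pd.DataFrame({
    "depth_m": np.linspace(0.0, 120.0, 600),
    "d18O": rng.normal(-10.0, 1.0, 600),      # made-up isotope values
})

def decadal_chronology(df, years_per_metre):
    # Assign a calendar year to each sample from a hypothetical linear
    # age model, then average the isotope values by decade.
    year = 1990 - df["depth_m"] * years_per_metre
    decade = (year // 10) * 10
    return df["d18O"].groupby(decade).mean()

v1 = decadal_chronology(samples, years_per_metre=8)    # one dating assumption
v2 = decadal_chronology(samples, years_per_metre=10)   # a revised dating assumption
print(v1.head())   # same samples, but a different chronology under v2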

Correspondence with Thompson and Journals
Until 2003, Thompson had archived nothing from any of his Himalaya cores (Dunde, Guliya, Dasuopu), all of which have been important in the paleoclimate discussion. In October 2003, shortly prior to publication of our 2003 article on MBH, I requested data on these cores from the Thompsons, but got nowhere.

In 2004, I was asked to review a response by Mann et al to our 2003 article. In that capacity, I asked to see supporting data for their submission (data that they had refused to supply to me late in 2003). Stephen Schneider said that, in his 28 years as editor, no reviewer had ever requested supporting data. I replied that times change and that I wanted to see the supporting data. Schneider said that this would require approval by their editorial board. I asked that this be obtained. This incident is documented in 2004 Climategate emails. As a result, Climatic Change adopted a policy requiring authors to supply supporting data, but not code. I then insisted that Mann supply the requested supporting data; Mann appears to have withdrawn the submission rather than comply with the new data policy.

In March 2004, I requested supporting data for Thompson’s 2003 article under the new policy, a request that was supported by Climatic Change. However, instead of providing a comprehensive and definitive archive, Thompson simply archived decadal O18 chronologies for Dunde core 3, Guliya and Dasuopu for the top part of the cores from AD1000 to 1990.

Thompson has subsequently pointed to this absurdly inadequate archive as evidence that he has “archived his data”. However, for the various reasons set out above, archiving the published decadal O18 histories for these sites is only one small component of a definitive archive.

Over the years, I’ve attempted to persuade both Thompson himself and the various journals that a definitive archive is both required and long overdue. I’ve collated this correspondence here. In 2004 and 2005, I tried to persuade Climatic Change. In 2005 and 2006, I tried to persuade Science to require Thompson to provide a definitive archive of cores that Thompson had published in Science. This resulted in the supply of sample information for two Kilimanjaro cores, which was interesting (see contemporary CA discussion here) but not relevant to my request for sample data from Dunde, Guliya and Dasuopu. I then had a lengthy correspondence with Science reiterating my request for details on these three cores; the editors said that they corresponded with Thompson on the matter, but the requests were stymied and ultimately the Science editors stopped responding without ever resolving the matter.

In 2007, I tried to obtain measurement data for the three cores once again, first with the editor of PNAS and then with Ralph Cicerone, the president of NAS. Once again, the correspondence went nowhere. I asked for an archive of sample data, carefully explaining the difference between sample data and a decadal chronology. PNAS responded that Thompson had told them that he had already provided what I asked for and that I should simply go to the NOAA site – a site that I was obviously well aware of, which I had distinguished in my initial request, and which obviously didn’t contain the sample data that I had asked for. Both Thompson’s unresponsive answer and PNAS’ seemingly wilful obtuseness to the problem (think Gavin Schmidt or William Connolley on upside-down Tiljander) are all too characteristic of the climate community and have contributed to the present adverse atmosphere.

Rather than enforce journal policies, Cicerone told me to write Thompson, who had refused for years to provide this data. So I wrote to Thompson one more time, this time copying Cicerone, Gerald North and Brooks Hanson of Science. Once again, Thompson ignored my email.

In comments on the prior thread, CA reader Roger has criticized me for being insufficiently diligent in trying to obtain measurement data from Lonnie Thompson. I believe that the attached correspondence completely refutes his criticism.

As I observed above, it remains a source of great puzzlement to me why Lonnie Thompson does not regard the establishment of definitive archives of his data as an integral part of his scientific legacy. At this point, it is evident that it will be difficult, if not impossible, for him to carry out the definitive statistical analyses of his data himself. Over the past 10 years, he has fallen further and further behind. The logical solution to his quandary is simply to provide a comprehensive and definitive archive and thereby let any interested scientist carry out the analyses that Thompson himself will either never carry out or not carry out for many years. Nor, in my opinion, should Thompson, at his age and stage of career, spend his time haring off to yet another drill site. No vice president of exploration in a mineral exploration company would personally feel obliged to sit on a drill rig. Surely there are other people capable of supervising the next drill program. Thompson should let them do their job.

Lonnie and Ellen, A Serial Non-Archiving Couple

Recently, Geoffrey Boulton’s report and Nature editorial provided more pious language urging data archiving by hoarding scientists. As I mentioned in my initial comments on Boulton’s editorial, there have been many such pious pronouncements over the years without the slightest impact on, for example, the serial non-archiving couple of Lonnie Thompson and Ellen Mosley-Thompson, who, as it turns out, is an even worse offender than husband Lonnie, if such can be imagined. Their long career of non-archiving has flourished despite clear U.S. federal government policies dating back to 1991 which, on paper, require thorough data archiving by the climate community as a condition of receiving grants. Unfortunately, the U.S. climate funding bureaucracy has been thoroughly co-opted by the climate industry
and has failed to enforce regulations that, on paper, would require the Thompsons and others to archive data. Unfortunately, Boulton failed to do any assessment of why even apparently mandatory government policies have been insufficient to deter serial non-archivers. Continue reading

False Positives

Some readers may have noticed a Dutch scandal in the academic psychology industry. See here (h/t Pielke Jr).

The previously undisclosed whistleblower is said to be Uri Simonsohn, co-author of the article “False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant”. The authors set out the following sensible solution to the problem of false-positive publications:

Table 2. Simple Solution to the Problem of False-Positive Publications
Requirements for authors
1. Authors must decide the rule for terminating data collection before data collection begins and report this rule in the article.
2. Authors must collect at least 20 observations per cell or else provide a compelling cost-of-data-collection justification.
3. Authors must list all variables collected in a study.
4. Authors must report all experimental conditions, including failed manipulations.
5. If observations are eliminated, authors must also report what the statistical results are if those observations are included.
6. If an analysis includes a covariate, authors must report the statistical results of the analysis without the covariate.

Guidelines for reviewers
1. Reviewers should ensure that authors follow the requirements.
2. Reviewers should be more tolerant of imperfections in results.
3. Reviewers should require authors to demonstrate that their results do not hinge on arbitrary analytic decisions.
4. If justifications of data collection or analysis are not compelling, reviewers should require the authors to conduct an exact replication.

If these rules were applied by real_climate_scientists, most of the criticisms at Climate Audit would be eliminated.

However, there are no signs that real_climate_scientists have any intention of adopting these rules, as evidenced by Gavin Schmidt’s bilious outrage at the idea that Briffa should have reported the Yamal–Urals regional chronology that he had considered and discarded in favor of the known HS of the small Yamal chronology.

The language of false positives was also used by the Texas sharpshooters, Wahl and Ammann, in connection with the failed verification statistics from MBH98.
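For readers who want the formulas at issue (these are the standard definitions from the verification literature, not language specific to MBH98): with observed values $y_t$ and reconstructed values $\hat{y}_t$ over the verification period, the usual statistics are

$$\mathrm{RE} = 1 - \frac{\sum_t (y_t - \hat{y}_t)^2}{\sum_t (y_t - \bar{y}_{\mathrm{cal}})^2}, \qquad \mathrm{CE} = 1 - \frac{\sum_t (y_t - \hat{y}_t)^2}{\sum_t (y_t - \bar{y}_{\mathrm{ver}})^2}, \qquad r^2 = \operatorname{corr}(y, \hat{y})^2,$$

where $\bar{y}_{\mathrm{cal}}$ and $\bar{y}_{\mathrm{ver}}$ are the calibration-period and verification-period means. Much of the dispute turned on which of these statistics should be treated as decisive.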

Boulton’s Nature Editorial

Geoffrey Boulton, who did an execrable job on the Muir Russell “inquiry”, has written a good editorial in Nature here reporting on the recent Royal Society report that he chaired.

There have been a number of reports over the years urging improved data archiving, and yet the problems persist. Boulton’s report is merely one more. Whether it will have an impact when past reports have failed remains to be seen. In the U.S., there are quite sensible high-level federal policies on data archiving, but these are flouted in paleoclimate by the relevant NSF division. The AGU has sensible policies, but these are ignored by editors and reviewers. In the past, as evidenced in Climategate emails, members of the climate “community” have sneered at my efforts to ask AGU editors to enforce these policies, confident in the solidarity of the editors, and such efforts have proved fruitless.

Boulton’s report and editorial merely add one more voice to the chorus, and one more editorial isn’t going to affect someone like Lonnie Thompson.

The missing link in Boulton’s report is putting teeth into such recommendations. Enough teeth that recalcitrant journals like The Holocene (which proudly has no data archiving policy) have to adopt and enforce policies and recalcitrant scientists like Lonnie Thompson have to pay attention.

I’ve observed for many years that the funding agencies, e.g. some divisions of the U.S. National Science Foundation, bear much of the responsibility. One of the problems is that some NSF divisions have failed to ensure compliance with existing federal policy, instead leaving matters up to academic journals (which have no obligation to enforce U.S. federal policy).

As sometimes happens with regulatory agencies, the NSF climate divisions have become cheerleaders for the industry that they are supposed to regulate (in the sense of at least requiring compliance with data archiving policy). When challenged in the past, NSF administrators have said that authors met journal standards and that was the end of the story. Similarly with IPCC, which should also have standards for articles cited in IPCC assessment reports. I challenged Susan Solomon on this in 2005 and she said that the establishment of data policies by IPCC would be interfering with academic journals.

However, both funding agencies and IPCC have different obligations than academic journals. If they wish to rely on academic journal policies, then they need to ensure that such journals have data policies sufficient to ensure compliance with the funding agency’s and/or IPCC’s obligations. To put some teeth into this, if a journal (e.g. The Holocene) does not have such standards, then its eligibility for citation as a publication of record for an NSF-funded project or by IPCC should be withdrawn. If The Holocene, for example, was reduced to grey literature for NSF and/or IPCC purposes, maybe editor John Matthews would pay attention.

Similarly, the funding agencies need to pay some attention to their obligations to ensure data archiving. While they enjoy being cheerleaders for the scientists that they are funding – and are entitled to take some satisfaction in their accomplishments – they cannot become so close to the science industry that they abnegate their regulatory responsibilities.

In passing, Boulton endorses a longstanding Climate Audit position – that scientists should archive all the data, not just the data “used” in the final calculations.

Too often, we scientists seek patterns in data that reflect our preconceived ideas. And when we do publish the data, we too frequently publish only those that support these ideas. This cherry-picking is bad practice and should stop.

For example, there is strong evidence that the partial reporting of the results of clinical trials, skewed towards those with positive outcomes, obscures relationships between cause and effect. We should publish all the data, and we should explore them not just for preconceived relationships, but also for unexpected ones. Without rigorous use and manipulation of data, science merely creates myths.

I raised this issue originally with Jacoby at Climatic Change and my requests were rebuffed. Little has changed over the years: my recent similar requests in connection with Gergis to four different journals (Journal of Climate, Climate Dynamics, Holocene and GRL) likewise had no success.

This is, of course, the issue that was at the heart of the ongoing Yamal controversy. Gavin Schmidt and Real Climate sneered at the idea that CRU should have any obligation to do anything other than what Boulton describes as “partial reporting”. Perhaps Boulton’s new report will help change perceptions on this point. It’s too bad that Boulton’s epiphany came after his participation in the Muir Russell whitewash.