In a Twitter exchange among Jean S, Ronan Connolley and Tim Osborn, Ronan drew attention to an early spaghetti graph in a comment on MBH98 that Phil Jones published in Science on Apr 24, 1998, the day after publication of Mann et al 1998. The Briffa reconstruction is in purple below. Like IPCC 2001, it hides the decline in the Briffa reconstruction (here a 1998 version) by deleting late 20th century values – here after 1950.
Jones stated that all three reconstructions “clearly show” that the 20th century is the warmest in the 600-year period, with the most “dramatic feature” being the 20th century rise:
Despite the different methods of reconstruction and the different series used, or alternatively, because a few good ones are common to all three series, there is some similarity between the series. All clearly show the 20th century warmer than all others since 1400. The dramatic feature of all three records is the rise during the 20th century.
Mann et al published a Reply to Jones’ comment in June 1998 (with Jones as a coauthor of the Reply). They agreed that Jones’ spaghetti diagram “demonstrate[d] the robustness of the conclusion that the 20th-century warming is unusual in the context of the past several centuries”:
The comparison shown by Jones between Mann et al.’s Northern Hemisphere temperature reconstruction (1) and two other recent estimates is useful in several ways. For example, it demonstrates the robustness of the conclusion that the 20th-century warming is unusual in the context of the past several centuries, on the basis of largely independent estimates.
However, these claims are based on hide-the-decline: Jones deleted post-1940 values of the Briffa reconstruction, slightly enhancing the effect of the deletion by smoothing with the post-deletion values only. Jones noted this truncation in the caption to the figure, where he stated that “tree-ring density data show a decline since the 1940s unrelated to temperature [see (9) Briffa et al, in press; (10) Briffa et al 1998 (Phil Trans London) for more details], and the curve from (9) ends at 1940”, a precaution not taken by Mann in IPCC AR3. In the next graphic, I’ve done a blow-up of the 1900-2000 portion of the graphic to demonstrate this. I’ve also shown the deleted values of the Briffa reconstruction (using the nhlmt version from Briffa et al 1998). This hide-the-decline incident is a year earlier than hide-the-decline in Briffa and Osborn 1999 and Jones et al 1999.
Figure 2. Blow-up of Jones 1998 comment on MBH98. Briffa version is nhlmt from Briffa et al 1998.
In the next graphic, I’ve plotted the complete series – no hide-the-decline. When the decline is shown, one suspects that even a reviewer for Science or Nature would cavil at the assertion that the “dramatic feature of all three records is the rise during the 20th century”, or at the claim that the conclusion that “20th-century warming is unusual in the context of the past several centuries” is “robust”:
Figure 3. Jones 1998 diagram, showing the decline in the Briffa et al reconstruction (nhd2 version rescaled by the same ratio as nhlmt to nhd1 and re-centered to match visually).
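The effect of truncation-plus-smoothing is easy to illustrate with a toy calculation. The sketch below is a minimal Python example in which both the series and the smoother (a simple centered moving average; Jones’s actual smoothing method is not specified here) are assumptions purely for illustration:

```python
import numpy as np

def centered_smooth(x, window=11):
    """Centered moving average; mode='valid' uses only real data,
    i.e. the smooth is computed from the retained values only."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

# Hypothetical series: rises to 1940, then declines (the "decline")
years = np.arange(1900, 1995)
series = np.where(years <= 1940, (years - 1900) * 0.02,
                  0.8 - (years - 1940) * 0.015)

smooth_full = centered_smooth(series)                      # all values kept
smooth_truncated = centered_smooth(series[years <= 1940])  # decline deleted

# The truncated smooth ends near the series' peak, while the full
# smooth ends much lower, pulled down by the post-1940 decline.
print(smooth_full[-1], smooth_truncated[-1])
```

The point is generic: any low-pass smooth of the truncated series ends near the peak, because the deleted declining values never enter the average.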
Jones did make some sensible comments in his Comment that have not previously drawn attention. Jones observed that one should be able to easily extract the relative importance of the various proxies, speculating that “much of their success, in a statistical sense” must come from tree rings:
The mathematical technique used by Mann et al. (3) to produce the reconstructions could easily be adapted to show which proxy series are the most important. Although Mann et al. (3) do not explicitly rank the various proxies, much of their success, in a statistical sense, must come from the large number of tree-ring width series used.
Jones’s observation was correct, but no one seems to have attempted extracting the contribution of different proxy types until I did so in 2004-2005. My analysis showed that the contribution of all proxies except bristlecones was little different from white to low-order red noise, and that the HS came from a very small subset of all proxies. The climate community has chosen to ignore this point.
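For flavour, here is a rough Python sketch of the kind of red-noise benchmarking involved (not the actual 2004-2005 code; the network size, persistence and calibration length below are assumed purely for illustration). It shows how Mannian short-centering – centering on the calibration period only – pulls a hockey-stick-shaped PC1 out of pure low-order red noise:

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1(n, rho, rng):
    """Low-order red noise: AR(1) with lag-1 autocorrelation rho."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.standard_normal()
    return x

# 70 pseudo-proxies over 581 "years" (AD1400-1980); rho = 0.9 is an
# assumed, illustrative persistence value
n_years, n_series, calib = 581, 70, 79
X = np.column_stack([ar1(n_years, 0.9, rng) for _ in range(n_series)])

# Mannian short-centering: subtract the mean over the calibration
# period (last 79 "years") instead of the full-series mean
X_short = X - X[-calib:].mean(axis=0)
X_full = X - X.mean(axis=0)          # conventional centering

pc1_short = np.linalg.svd(X_short, full_matrices=False)[0][:, 0]
pc1_full = np.linalg.svd(X_full, full_matrices=False)[0][:, 0]

def blade(pc, calib=79):
    """Departure of the calibration-period mean from the rest, in
    units of the pre-calibration standard deviation."""
    return abs(pc[-calib:].mean() - pc[:-calib].mean()) / pc[:-calib].std()

# Short-centering produces a far more pronounced "blade" than
# conventional centering does on the very same noise
print(blade(pc1_short), blade(pc1_full))
```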
Jones also made the sensible observation that each new class of proxy had to prove itself – a precaution immediately abandoned in favor of Mannian armwaving.
Each paleoclimatic discipline has to come to terms with its own limitations (6, 7) and must unreservedly admit to problems, warts and all. A particular issue for all ice core and coral series and some new tree-ring work is what exactly an isotope series (be it O, H, or C) tells us about past temperature. Sensitivity to temperature cannot be assumed; it must be proved with instrumental data on both interannual, and, where possible, on longer (more than 20 years) time scales (7).
I must say that I was a little intrigued to find an example of hide-the-decline a full year before Briffa and Osborn 1999 or Jones et al 1999, previously the earliest hide-the-decline example. In previous analyses of hide-the-decline, I’ve repeatedly emphasized that the technique of deleting data to hide inconvenient results originated with CRU (I’ve previously termed it “Keith’s Science Trick”). Mann knew of CRU’s deletion of the decline, but, as Lead Author of IPCC AR3, Mann willingly and enthusiastically participated in the hide-the-decline scheme, because he didn’t want to “give fodder to the skeptics” or “dilute the message”. But the technique originated with CRU and the “exoneration” of Jones, Briffa and Osborn on hide-the-decline by Muir Russell and Oxburgh was totally undeserved. (Nor does assignment of blame to CRU excuse Mann’s participation in hide-the-decline as Lead Author of AR3, where Mann and CRU both were culpable.)
Postscript: the twitter exchange also discusses the provenance of the Briffa reconstruction version in IPCC AR3 Figure 2-21. IPCC cited Briffa 2000 (QSR), while the actual version used in AR3 comes from Briffa et al 2001. Tim Osborn stated that the IPCC version matched the green LFD curve in Briffa (2000) Figure 5. This seems to be only partly true: the LFD curve has the same shape as the AR3 curve, but appears to be scaled differently.
Some comments have been accumulating on an unrelated topic. Please comment on this thread. I’ll try to write something over the next couple of days. But in a quick first look, Mann has re-iterated his untrue claims that various listed inquiries investigated and exonerated Mann personally. The untruthfulness of these claims has been discussed in a number of CA posts. I read a comment that Mann had coopered up some false quotations, but haven’t checked these claims.
Also, Mann’s brief does not address the arguments by the ACLU and others against Mann’s assertion that the defendants were obligated to acquiesce in the findings of government inquiries, a proposition that the ACLU and numerous major news media describe as (though not using this exact word) unprecedented.
PCA was also performed on certain proxy sub-networks (spatially dense regional networks of tree-ring data available separately in different continents) as means of dimensional reduction of the predictor network. In this case, the procedure was performed separately for each independent step of the stepwise calibration/reconstruction procedure described in “3” below. A decreasing number of PCs of these sub-networks are retained increasingly further back in time, as dictated by application of objective selection criteria (consideration of results of both Preisendorfer’s Rule N and Scree test).
We employed a standard, objective criterion for determining how many PCs should be kept for each region.
Michael E. Mann in his book.
Let’s continue with another housekeeping post.
When Steve and Ross uncovered Mann’s flawed PCA, Mann’s defence was that MM had failed to use Preisendorfer’s Rule N in the selection of the PCs in the NOAMER AD1400 step (for the background of this story, see here or read it in Montford’s excellent book). Steve observed immediately that the use of “Rule N” was not mentioned in MBH98 in connection with the tree ring PCA. He then emulated the claimed rule, showing convincingly that it was all but impossible that the rule was actually used. Further, to this day Mann has failed to produce any code or documentary evidence for the supposed use of the rule. It is hard to imagine, even in the context of the Hockey Stick, any other argument with so little support that is still alive and well among the usual suspects. In fact, this fairy tale now seems to be the official story line in Wikipedia (citing Mann’s book, of course).
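For reference, here is a minimal Python sketch of a common reading of Preisendorfer’s Rule N (my reading, for illustration only – not Mann’s claimed implementation): a Monte Carlo test that retains the leading PCs whose normalized eigenvalues exceed the corresponding eigenvalue quantiles from random data of the same dimensions.

```python
import numpy as np

def rule_n(X, n_sims=200, quantile=0.95, seed=0):
    """Monte Carlo 'Rule N': keep PC k if the k-th normalized
    eigenvalue of the data exceeds the `quantile` level of the k-th
    normalized eigenvalue from Gaussian white noise of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    ev = np.linalg.svd(X - X.mean(axis=0), compute_uv=False) ** 2
    ev /= ev.sum()
    sims = np.empty((n_sims, p))
    for i in range(n_sims):
        R = rng.standard_normal((n, p))
        e = np.linalg.svd(R - R.mean(axis=0), compute_uv=False) ** 2
        sims[i] = e / e.sum()
    thresh = np.quantile(sims, quantile, axis=0)
    keep = 0
    while keep < p and ev[keep] > thresh[keep]:
        keep += 1
    return keep

# Example: ten noise series plus two sharing a strong common signal;
# the rule should retain the signal PC
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 300)
X = rng.standard_normal((300, 12))
X[:, 0] += 5 * np.sin(2 * np.pi * t)
X[:, 1] += 5 * np.sin(2 * np.pi * t)
print(rule_n(X))
```

Whatever the implementation details, the point below stands: applying any such rule to the identical PC set cannot retain two PCs in one step and six in another.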
When I got my hands on the MBH9X file archive contained in the Climategate files, among the first things I checked was whether it contained the code for the selection rule, or even a file indicating the use of such code. Nope. Later I observed a curious thing in the files. MBH9X is a stepwise procedure, and in every step (if there were enough proxies), one supposedly (as the Corrigendum statement indicates) calculated the tree ring PCs. However, quite a few calculation steps were missing from the archive. For instance, the Stahle & Cleveland Oklahoma/Texas (STAHLE OK) precipitation chronologies had a single calculation step (AD1700) although they were also used in the later steps (AD1730, AD1750, AD1760, AD1800, and AD1820). I then checked the actual MBH98 data, and voilà, I noticed that instead of calculating the PCs for the STAHLE OK network at, say, the AD1820 step, Mann had simply recycled the 1820-1980 part of the AD1700 step PCs (1700-1980)! The correspondence between the archive and the PCs actually used was almost one-to-one: every “missing” step in the archive matched the reuse of the PCs from the previous step in the MBH98 data. In other words, contrary to what is claimed in the Corrigendum, PCs were not calculated for every step.
So what does the above have to do with the PC selection rule? Well, Steve had observed that there are three (*) cases (SOAMER AD1750, STAHLE SW AD1750, and NOAMER AD1500) where MBH98 retains more PCs than in the previous step although the network does not change (i.e., the same proxies are inputs to the PCA). But it was not only that the network was (supposed to be) the same … the PCs were exactly the same as in the previous step! Of course, it is then impossible for any “rule” to retain a different number of PCs. The most striking example comes from the NOAMER AD1500 step, which uses PCs from the AD1450 step. Mann keeps 2 PCs in the AD1450 step, but in the AD1500 step 6 PCs are retained from the same PC set!
As Steve has said several times, MBH9X is the gift that keeps on giving! While preparing this post, I discovered yet another Mannian mystery. There is a single exception to the above-mentioned correspondence between the archive and the PCs actually used. Namely, PCs are calculated for the SOAMER AD1450 step, but no SOAMER PCs are used prior to the AD1600 step. The Corrigendum text cited at the beginning continues:
PCs were no longer calculated back in time once a given network contained fewer than 7 available series (with the exception that PCs were calculated for the ‘Stahle Southwest U.S./Mexico network’ with 6 series available). Thus, although some series may be available further back in time, they may not have been used to calculate PCs. For example, there are 110 series available back to 1400, but only 95 are used because PCs were not calculated on 6 Australian and 6 South American ITRDB series and 3 ‘Vaganov’ series.
SOAMER contains 7 series in the AD1450 step. Why were none of those PCs used?
(*) Actually there is a fourth case (AUSTRAL AD1750), which is immaterial here as it has the corresponding calculation step in the archive.
The Mann Statement of Claim prominently displayed, as one of only two quotations from the “inquiries”, an extended quotation from the Myths vs Facts webpage, which was included as one of three Resources accompanying the EPA decision denying various petitions for reconsideration of the Endangerment Finding (though Mann’s Statement of Claim falsely cited the gazetted (Federal Register) denial decision itself as its source). The Myths vs Facts document (like the Factsheet and Press Release, also linked as “Resources”) contained statements and assertions that were untrue and that were undocumented in the actual language of the “formal” documents they were supposedly supporting.
The identity of the authors (and reviewers, if any) is not disclosed in the supporting documents (RTP documents) for the denial decision – Jean S speculated a few days ago that Gavin Schmidt was involved in EPA’s supposed investigation and “exoneration” of Michael Mann; Schmidt has thus far refused to comment.
In today’s post, I’ll discuss who wrote (and possibly reviewed) the Myths vs Facts document.
This post is rather technical and is intended mainly for historical completeness. So unless you are very, very interested in the tiny technical details of the HS saga, you can safely skip it.
As most readers are aware, and as stated in my post a few hours after CG broke out, Mike’s Nature trick was first uncovered by UC here. He was able to replicate (visually perfectly) the smooths of MBH9x, thereby showing that the smooths involved padding with the instrumental data. The filter used by UC was the zero-phase Butterworth filter (an IIR filter), which has been Mann’s favourite since at least 2003. However, there was something else that I felt was odd: UC’s emulation required a very long (100 samples or so) additional zero padding. So about two years ago, I decided to take an additional look at the topic with UC.
Indeed, after digitizing Mann’s smooths we discovered that UC’s emulation was very, very good but not perfect. After long research, and countless hours of experimenting (I won’t bore you with the details), we managed to figure out the “filter” used by Mann in the pre-Mann (2004) era. Mann had made his own version of the Hamming filter (windowing method, an FIR filter)! Instead of using any kind of usual estimate for the filter order, which is usually derived from the transition bandwidth (see, e.g., Mitra: Digital Signal Processing) and typically has a length of a few dozen coefficients at most, he used a filter length equal to the length of the signal to be filtered! As Mann’s PCA was apparently just a “modern” convention, this must be a “modern” filter design. Anyhow, no digital signal processing expert I consulted about the matter had ever seen anything like it.
In order to see how absurd this “filter design” is, consider filtering a signal of length 999 samples. According to Mann, you should design a Hamming filter of the same length. One should always disregard half the filter length (i.e., 499 in our example) of filtered values from both ends, so in Mann’s case one would end up with a single smoothed value! In Mann’s implementation, however, one ends up with a filtered series of the same length as the original signal.
Another way to think of Mann’s “filter” is as a normal filter with a huge (half the signal length) zero padding at both ends of the signal. This interpretation also hints at why UC’s emulation was so successful. One can also speculate whether the similarity of the results between the zero-phase Butterworth and Mann’s original filters is the reason Mann chose the Butterworth filter in the first place.
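To make this concrete, here is a minimal Python sketch of the smoother as described above (our reconstruction for illustration, not Mann’s actual code): a Hamming-windowed sinc low-pass whose length equals the signal length, applied with “same”-mode convolution – which is exactly the huge implicit zero padding of half the signal length at each end.

```python
import numpy as np

def mannian_smooth(x, cutoff):
    """Hamming-window FIR low-pass with filter length equal to the
    signal length (the 'Mannian' choice described in the post).
    `cutoff` is in cycles/sample, e.g. 1/50 for 50-year smoothing
    of annual data."""
    n = len(x)
    m = np.arange(n) - (n - 1) / 2.0
    h = 2 * cutoff * np.sinc(2 * cutoff * m)  # ideal low-pass kernel
    h *= np.hamming(n)                        # Hamming window
    h /= h.sum()                              # unit gain at DC
    # mode="same" keeps the output the same length as the input,
    # i.e. it implicitly zero-pads x by ~n/2 samples at each end
    return np.convolve(x, h, mode="same")

# Example: a 581-"year" series (AD1400-1980) with 50-year smoothing
years = np.arange(1400, 1981)
rng = np.random.default_rng(3)
x = np.sin(2 * np.pi * years / 200) + 0.3 * rng.standard_normal(len(years))
s = mannian_smooth(x, 1.0 / 50)
```

Near both ends the smooth is dragged toward zero by the implicit zero padding, which is presumably why MBH9x cuts back half the “filter length” from the ends.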
If someone wants to explore this topic further, I’ve placed my Octave/Matlab implementation of Mann’s smoother here. The code includes references to the original Mann code I uncovered. Finally, the exact parameters of the trick in MBH9x were as follows. MBH98 has 50-year smoothing with padding by the 1981-1995 instrumental data. Additionally, the smoothing is cut back 25 samples (half of the “filter length”) from both ends. MBH99 used 40-year filtering with 1981-1997 (not 1998!) instrumental padding. The smooth is cut back 20 samples from the end but not from the beginning.
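The padding half of the trick can be illustrated the same way. The sketch below (Python with scipy; the pseudo-proxy, the “instrumental” segment and the cutoff are all made up for illustration) pads a declining series with rising instrumental values before zero-phase Butterworth filtering, which lifts the end of the smooth:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Hypothetical proxy that is flat and then declines (the "decline"),
# and a rising "instrumental" segment used as padding
proxy = np.concatenate([np.zeros(80), -0.02 * np.arange(1, 21)])
instrumental = 0.05 * np.arange(1, 16)

# Zero-phase Butterworth low-pass (the post's filter; order and
# cutoff here are arbitrary illustrative choices)
b, a = butter(4, 0.05)

smooth_plain = filtfilt(b, a, proxy)
padded = np.concatenate([proxy, instrumental])
smooth_padded = filtfilt(b, a, padded)[:len(proxy)]

# Padding with the rising instrumental data pulls the end of the
# smoothed proxy upward, hiding the decline
print(smooth_plain[-1], smooth_padded[-1])
```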
Jean S writes (transferred from a comment with the addition of a few headings):
A question for the experts: is it known who wrote and who were used as experts in the EPA documents? If not, is that information considered public (i.e., obtainable under FOIA or similar)?
The reason I ask is that I get a very, very eerie feeling when reading certain parts of the EPA decision. For instance, I think there are very, very few people in this world from whom the following paragraphs could have originated (considering style, content and an astonishing familiarity with Mann’s work):
In a recent post, I observed that Mann’s Statement of Claim contained a bizarre misrepresentation about the nature of Mann’s research, as it falsely credited Mann with being “one of the first” to document the increase in 20th century temperatures. Reader PhilH, a retired judge, observed that, on its own, the misrepresentation was merely odd and that it would have significance for the pleadings only if it could be connected to the narrative of the case. In today’s post, I’ll try to do exactly that.
The “money quote” chosen by Mann’s lawyers as supposed evidence of his “exoneration” by the “inquiries” is an EPA statement that manipulation of “temperature data and trends” is a “myth”. This quotation connects to the paragraph 2 misrepresentation, but not to Mann’s actual research. It’s a massive whiff by Mann’s lawyers.
One of the essential elements in Mann’s reliance on EPA findings is his assertion that his supposed exoneration by EPA had been “widely available and commented” on in the media and had been “read by the Defendants”:
All of the above reports and publications were widely available and commented upon in the national and international media. All were read by the Defendants.
The claim that all nine “inquiry” reports had been “read” by Steyn and the other defendants is surely a fantasy on the part of Mann and his lawyers. While Steyn seems to be a man of eclectic interests, somehow I can’t picture him poring through the dreck of the turgid “reports” from the various inquiries.
I am particularly dubious of Mann’s claim that Steyn (and Simberg) had read the EPA documents. Some climate blogs took notice of the EPA decision denying various petitions for reconsideration of the EPA Endangerment Finding when it was issued, but none understood it to be an “investigation” and “exoneration” of Michael Mann, something that would have occasioned great interest in July 2010, then only a few weeks after the Muir Russell and second Penn State reports. The EPA denial decision was first included in a list of inquiries in an unmarked Feb 2011 revision of a November 2009 post by SKS, but was really brought to public attention for the first time by Mann’s Statement of Claim itself. It seems very improbable to me that Steyn (or Simberg) was aware of EPA’s supposed findings in connection with Mann (not that Mann’s characterization is accurate, but that’s a story for another day), or that they would have been obligated to be familiar with them. Given Mann’s allegation that Steyn, Simberg and others were supposed to be aware of EPA’s investigation and “exoneration” of Mann, the contemporary unawareness of this supposed EPA investigation – especially at SKS and Real Climate – is really quite remarkable.
Michael Mann, now feigning sensitivity towards Mark Steyn’s use of the word “fraudulent”, used identical language in the Climategate emails against critics without the slightest compunction. Mann’s hypocrisy has been widely noted.
Unpublicized thus far is a discussion by EPA, in which EPA concluded that Mann’s accusations of “scientific fraud” were within the scope of “acceptable and appropriate” scientific exchange and that it is “entirely acceptable and appropriate for scientists to express their opinions and challenge papers that they believe are scientifically flawed” in such terms.
The EPA’s finding appears to be inconsistent with Gavin Schmidt’s recent tweet arguing that such language is “per se defamatory”:
Saying that ppl [people] are frauds is per se defamatory. Goes beyond disagreement/error/dislike.