A reader of this website recently sent a letter to Canada’s Minister of the Environment, Stéphane Dion, urging him to consider our 2005 articles. I have included here the Minister’s response, in which he states that “many of our arguments have been refuted by Jones and Mann in their recently published review paper (Review of Geophysics).”
First, I show that this paper did not consider any of the major points in our 2005 papers. It could hardly have done so: far from being “recently published” in this context, it was accepted almost a year before our 2005 articles (which accordingly were not “pre-butted” or “pre-futed” in Jones and Mann).
Second, I point out that the cited article does not itself “refute” any points; it relies on “findings” from a submission by Mann et al. to Climatic Change. However, this submission was rejected.
Third, I show that the points in the Reviews of Geophysics paper are incorrect, which may be the reason that the Mann et al. submission to Climatic Change was rejected.
Mann has not only failed to report the rejection of the Climatic Change paper, but has stated that they have not had papers rejected (whereas our submission to Nature was rejected).
Here is the letter from Minister Dion, sent on April 20, 2005:
Thank you for your e-mail of February 21, regarding climate change.
Despite contrary views from a small group, there is now a large body of scientific knowledge that makes it very clear that global warming is real and that mitigative action is necessary. Based on the comprehensive assessments of peer-reviewed scientific papers — a process that involved more than 2,500 international experts — the Intergovernmental Panel on Climate Change has concluded that the Earth’s surface has warmed by about 0.6°C over the past century and that most of the observed warming over the last 50 years is attributable to human activities (mainly through the emission of CO2 from fossil fuel burning).
With respect to the paper by McIntyre and McKitrick to which you referred, many of its arguments have been refuted by Jones and Mann in their recently published review paper (Review of Geophysics). More importantly, an independent study published in a recent peer-reviewed paper in Nature suggests that, while Mann’s analysis may have underestimated the degree of inter-century variability in the climate record, the 20th century remains the warmest of the past millennium. Thus, the rather conservative Intergovernmental Panel assessment that the warming of the 20th century is unusual and “unlikely to be entirely natural” in origin still holds true.
The United Nations Convention on Climate Change was established in 1992. It was this Convention that set 1990 as the "base year" for tabulating greenhouse gas emissions. Hence this also became the reference date for the Kyoto Protocol, which is part of the Convention. Canada’s One-Tonne Challenge is a small but vital step for every Canadian to contribute to slowing the increase of their atmospheric concentrations. Similar challenges are being presented to Canadian industry, including the cement industry. That is because all sectors of our society contribute to greenhouse gas emissions, and all therefore have the opportunity to make a difference. Furthermore, I would like to note that fulfilling Canada’s obligation under the Kyoto Protocol is not primarily about helping ourselves, but about protecting the well-being of the future generations who will live in this land.
I appreciate your interest in this important issue.
Original signed by:
Minister Dion’s comments parallel comments at the Environment Canada website, which state:
McIntyre, S. and R. McKitrick, 2005. Hockey sticks, principal components, and spurious significance. Geophysical Research Letters 32, L03710, doi:10.1029/2004GL021750.
In this long-awaited article in Geophysical Research Letters, McIntyre and McKitrick finally put their challenges to paleoclimate reconstructions of Northern Hemispheric climates published by Michael Mann and colleagues into a formal, peer reviewed paper. Their primary argument is that Mann et al. undertook some unusual data transformation which they contend strongly affected the statistical analysis undertaken to complete the climate reconstruction. They suggest that this method overstates the significance of one particular record based on bristlecone pine data from North America, and that the reconstruction does not pass significance tests for the critical 15th century period. Although there have as yet not been published rebuttals to this challenge, McIntyre and McKitrick’s arguments have already been dismissed by Jones and Jones [sic] in their review paper on climate trends (published late last year in Reviews of Geophysics). However, two other recent papers published by Von Storch (Science, Sept 30., 2004) and by Moberg et al. (Nature, Feb 10., 2005) suggest that Mann’s analysis may have underestimated the degree of inter-century variability in the climate record. Moberg et al. also show the 20th century as the warmest of the past millennium, but his trends look more like an undulating serpent than a hockey stick. These papers are, collectively, a reminder that science proceeds with hesitant steps, and that context is important. In this regard, the rather conservative IPCC TAR statement that the warming of the 20th century is unusual and ‘unlikely’ to be entirely natural in origin still holds true.
The tagline to the Reviews of Geophysics article states:
Received 20 October 2003; revised 4 February 2004; accepted 17 February 2004; published 6 May 2004.
Thus, the timeline given on the Environment Canada website is quite misleading. The paper by Jones and Mann [not Jones and Jones] was not published “late last year”, but was accepted (presumably in its final form) on February 17, 2004. While it is certainly possible that a paper published “late last year” could have considered the topics of our 2005 papers (accepted in January 2005) on a pre-emptive basis, the chances of a paper accepted in February 2004 doing so seem more remote.
There’s an interesting curiosity to the chronology here: our 2003 paper had not been published as at October 20, 2003, and so we presume that the paper as originally submitted did not contain a discussion even of our first article. Accordingly, we presume that the discussion of our 2003 article by Jones and Mann was inserted in the February 4, 2004 revision. The acceptance of Jones and Mann came quickly after the revision. It’s quite possible that the inserted discussion about our 2003 article was not re-submitted to referees.
Now let’s see what the Reviews of Geophysics article actually says:
A recent study [McIntyre and McKittrick, 2003] claims that revisions of the data and methods used in the Mann et al. [1998a, 1999] reconstruction shown here in Figure 5 yield a “corrected version” exhibiting 15th century NH mean temperatures warmer than those of the latter (1970s–1980s) 20th century. This claim is so clearly at odds with every other reconstruction discussed in this section (particularly Esper et al. and Huang et al.) that it should be dismissed on this basis alone. However, a careful analysis (M. E. Mann et al., Critical flaws in a recent criticism of the Mann et al. study, submitted to Climate Change, 2003) of the McIntyre and McKittrick result reveals that their anomalous 15th century warmth results from their elimination of over 70% of the 15th century proxy data used by Mann et al. [1998a]. Also, their reconstruction, unlike that of Mann et al. [1998a] or Mann and Jones, fails independent cross-validation tests. Their result can thus be dismissed as spurious on this basis also.
Review articles necessarily rely on other published articles. Note that the Reviews of Geophysics article does not itself “refute” any findings; it merely reports the results of another article, in this case: M. E. Mann et al., Critical flaws in a recent criticism of the Mann et al. study, submitted to Climate Change, 2003. This study has never been published, and one may conclude that it was rejected by Climatic Change, which is edited by Stephen Schneider, who can hardly be accused of being unsympathetic to Mann or uncritical of our work. If the submission was rejected by Climatic Change, it must have had some serious problems.
When we had our submission rejected by Nature, we reported this (although we were not obliged to do so). Contrast this behaviour with Mann’s; he stated at realclimate (here, at Comment #2):
Only one of the parties involved has (1) had their claims fail scientific peer-review,…
Given the rejection of M. E. Mann et al., Critical flaws in a recent criticism of the Mann et al. study, submitted to Climate Change, 2003, Mann’s statement is obviously untrue.
Topics in 2005 Articles
Let’s now look at whether the text of the Reviews of Geophysics article somehow pre-butted our 2005 articles.
The abstract to our GRL article stated:
The “hockey stick” shaped temperature reconstruction of Mann et al. [1998, 1999] has been widely applied. However it has not been previously noted in print that, prior to their principal components (PCs) analysis on tree ring networks, they carried out an unusual data transformation which strongly affects the resulting PCs. Their method, when tested on persistent red noise, nearly always produces a hockey stick shaped first principal component (PC1) and overstates the first eigenvalue. In the controversial 15th century period, the MBH98 method effectively selects only one species (bristlecone pine) into the critical North American PC1, making it implausible to describe it as the “dominant pattern of variance”. Through Monte Carlo analysis, we show that MBH98 benchmarks for significance of the Reduction of Error (RE) statistic are substantially under-stated and, using a range of cross-validation statistics, we show that the MBH98 15th century reconstruction lacks statistical significance.
The arguments here pertain to the existence of a bias in the MBH98 methodology, the dominance of the bristlecone pines in their calculations, an error in their RE benchmarking, and the lack of statistical significance of their result. Now let’s look at our E&E abstract:
The differences between the results of McIntyre and McKitrick and Mann et al. can be reconciled by only two series: the Gaspé cedar ring width series and the first principal component (PC1) from the North American tree ring network. We show that in each case MBH98 methodology differed from what was stated in print and the differences resulted in lower early 15th century index values. In the case of the North American PC1, MBH98 modified the PC algorithm so that the calculation was no longer centered, but claimed that the calculation was “conventional”. The modification caused the PC1 to be dominated by a subset of bristlecone pine ring width series which are widely doubted to be reliable temperature proxies. In the case of the Gaspé cedars, MBH98 did not use archived data, but made an extrapolation, unique within the corpus of over 350 series, and misrepresented the start date of the series. The recent Corrigendum by Mann et al. denied that these differences between the stated methods and actual methods have any effect, a claim we show is false. We also refute the various arguments by Mann et al. purporting to salvage their reconstruction, including their claims of robustness and statistical skill. Finally, we comment on several policy issues arising from this controversy: the lack of consistent requirements for disclosure of data and methods in paleoclimate journals, and the need to recognize the limitations of journal peer review as a quality control standard when scientific studies are used for public policy.
Again the Abstract is focused entirely on critical issues pertaining to the bias in the PC calculations, the crucial impact of bristlecones (and here Gaspé cedars), the lack of robustness and statistical skill.
The 2003 article did not discuss most of the key issues of the 2005 articles. In the earlier article, we pointed out the non-replicability of MBH98 principal components using conventional algorithms, but were then unable to precisely diagnose the defects in their method (as we did in the 2005 articles). Our 2003 article was not as narrowly focused on 15th century results, since it spent a considerable amount of time pointing out quality problems in the data set, not all of which contributed to 15th century results. Our 2003 article did not discuss bristlecones, as we were then unaware of their pernicious role in MBH98; nor did it discuss the problems with RE benchmarking, although the 2003 results can readily be construed as a demonstration of non-robustness.
Thus, the Jones and Mann paper not only failed to pre-butt (or pre-fute) the findings of our 2005 papers; it did not even discuss them.
The key problem in MBH98 principal components methodology is short-segment standardization, which causes the algorithm to mine for hockey sticks. We were not aware that this was the reason for the peculiar MBH98 principal component series at the time of our 2003 article, as Mann et al. had misrepresented their methodology in MBH98. We presented fresh PC calculations using the maximum period in which all series were available. We illustrated the difference with respect to the Australian dataset (not the North American dataset, which has become the subject of so much discussion).
Original Caption: Figure 5. (a) Australia PC1 in MBH98 (series #96) graphed over time (b) PC1 for the MBH98 Australia dataset calculated using standard algorithm.
Here we pointed out that it was impossible to determine from the existing record how Mann et al. had dealt with missing data in principal component calculations. We had attempted to obtain further particulars from Mann et al. on their methodology, a request that was refused. We therefore carried out calculations using the maximum period in which all series were represented in the network — a procedure not dissimilar to how Mann et al. handled temperature principal component calculations.
In fact, it turned out that, in addition to the unreported short-segment standardization, Mann et al. had an unreported stepwise PC method with a highly idiosyncratic (and, as far as I can tell, non-replicable) methodology for determining the number of retained PCs and determining which steps to use. The extensive new Supplementary Information archived at Nature in July 2004 provided a description of the actual retained PCs for the first time, although it did not provide a replicable method.
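The hockey-stick-mining effect of short-segment standardization is easy to demonstrate. The following is a sketch, not MBH98’s actual algorithm: the network dimensions, the AR(1) coefficient, and the “hockey stick index” used to score the result are my own illustrative assumptions. Standardizing each red-noise series by the mean and standard deviation of only its final (calibration) segment, rather than of the full period, tends to make the first principal component a hockey stick even though the input is pure noise:

```python
import numpy as np

def ar1_series(n, phi, rng):
    """Persistent 'red noise': x[t] = phi * x[t-1] + e[t]."""
    e = rng.standard_normal(n)
    x = np.empty(n)
    x[0] = e[0]
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    return x

def pc1(X, short_segment=None):
    """First principal component of the columns of X.

    With short_segment=None this is a conventional (centered) PCA.
    Otherwise each column is standardized by the mean/std of only its
    last `short_segment` values -- a short-segment standardization of
    the kind at issue in MBH98 (simplified here)."""
    if short_segment is None:
        mu, sd = X.mean(axis=0), X.std(axis=0)
    else:
        seg = X[-short_segment:]
        mu, sd = seg.mean(axis=0), seg.std(axis=0)
    Z = (X - mu) / sd
    U, S, _ = np.linalg.svd(Z, full_matrices=False)
    return U[:, 0] * S[0]

def hockey_stick_index(pc, short_segment):
    """How far the 'blade' (calibration-segment) mean of the PC sits
    from its full-series mean, in standard-deviation units."""
    return abs(pc[-short_segment:].mean() - pc.mean()) / pc.std()

# Dimensions loosely evoking the AD1400 network: 581 'years',
# 70 series, 79-year calibration segment (all illustrative).
rng = np.random.default_rng(0)
n_years, n_series, seg, trials = 581, 70, 79, 20
hsi_centered, hsi_decentered = [], []
for _ in range(trials):
    X = np.column_stack([ar1_series(n_years, 0.9, rng)
                         for _ in range(n_series)])
    hsi_centered.append(hockey_stick_index(pc1(X), seg))
    hsi_decentered.append(hockey_stick_index(pc1(X, seg), seg))

print(f"mean hockey stick index, centered PCA:      {np.mean(hsi_centered):.2f}")
print(f"mean hockey stick index, short-segment PCA: {np.mean(hsi_decentered):.2f}")
```

On typical runs the short-segment version yields a far larger hockey stick index than the centered version, consistent with the GRL abstract’s point that the method, applied to persistent red noise, nearly always produces a hockey-stick-shaped PC1.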
So when Jones and Mann stated:
However, a careful analysis (M. E. Mann et al., Critical flaws in a recent criticism of the Mann et al. study, submitted to Climate Change, 2003) of the McIntyre and McKittrick result reveals that their anomalous 15th century warmth results from their elimination of over 70% of the 15th century proxy data used by Mann et al. [1998a].
this is misleading on a variety of counts.
First, while the emulation in MM03 did not implement the then-unreported stepwise PC method, the issue of non-robustness in the 15th century did not specifically pertain to the stepwise method, but to the short-segment standardization and the resulting impact of the bristlecones. Mann was aware of this by February 2004.
Second, while it is true that a calculation without 15th century U.S. tree ring data results in high 15th century values, thereby demonstrating non-robustness, this is only because the calculation excludes the bristlecones. If the exact sensitivity is analysed further (and this was well known to Mann at the time), the issue was not the exclusion of U.S. tree ring data generally, but of the bristlecones specifically.
Given that Mann et al. had elsewhere claimed that their methodology was robust to the presence/absence of bristlecone pines in total, the presence/absence of 15th century U.S. tree ring data shouldn’t matter anyway.
Turn now to the issue of statistical skill, a major theme of our 2005 articles. Jones and Mann stated:
their reconstruction, unlike that of Mann et al. [1998a] or Mann and Jones, fails independent cross-validation tests. Their result can thus be dismissed as spurious on this basis also.
We have always gone to some pains to emphasize that our articles are entirely critical in nature and that we are not “presenting” a new interpretation of climate history. Any “reconstructions” illustrated in our articles serve only to show the non-robustness of MBH98 results.
In MM03, we explicitly stated that we did not endorse MBH98 methodology or data as a basis for reconstructing past climates. For example, we prominently stated in the conclusions of the article:
Without endorsing the MBH98 methodology or choice of source data, [our bold here] we were able to apply the MBH98 methodology to a database with improved quality control and found that their own method, carefully applied to their own intended source data, yielded a Northern Hemisphere temperature index in which the late 20th century is unexceptional compared to the preceding centuries, displaying neither unusually high mean values nor variability. More generally, the extent of errors and defects in the MBH98 data means that the indexes computed from it are unreliable and cannot be used for comparisons between the current climate and that of past centuries.
This caveat against MBH98 methodology and data was intended to forestall any idea that we were presenting “our own” reconstruction. Since controversialists (on both sides) sometimes misunderstood the counterfactual approach of our analysis, we added the following comment in the FAQ section of the Supplementary Information to MM03:
Your graph seems to show that the 15th Century was warmer than today’s climate: is this what you are claiming?
No. We’re saying that Mann et al., based on their methodology and corrected data, cannot claim that the 20th century is warmer than the 15th century — the nuance is a little different. To make a positive claim that the 15th century was warmer than the late 20th century would require an endorsement of both the methodology and the common interpretation of the results which we are neither qualified nor inclined to offer.
Our GRL article has been our most publicized article by far and it does not contain any discussion of 15th century results, other than a demonstration of the statistical insignificance of MBH98 results.
Again, in the FAQ to our 2005 articles, we stated once again:
Are you saying the 15th century was warmer than the present?
No, we are saying that the hockey stick graph used by IPCC provides no statistically significant information about how the current climate compares to that of the 15th century (and earlier). And notwithstanding that, to the extent readers consider the results informative, if a correct PC method and the unedited version of the Gaspé series are used, the graph used by the IPCC to measure the average temperature of the Northern Hemisphere shows values in the 15th century exceed those at the end of the 20th century. We do not think that we could be more explicit than this.
Our 2005 E&E article consistently discussed reconstructions only in terms of the non-robustness of MBH98 results to slight permutations of methods and proxies and never as a positive reconstruction of past climates. It refuted the MBH claim that their method was robust to the exclusion of all dendrochronological indicators, whereas it is obviously not robust even to the presence/absence of bristlecones. Richard Muller of Berkeley understood this nuance clearly in a widely disseminated discussion:
But science also advances when we learn that something we believed to be true isn’t. When solving a jigsaw puzzle, the solution can sometimes be stymied by the fact that a wrong piece has been wedged in a key place… In fact, McIntyre and McKitrick are careful to point out that it is hard to draw conclusions from these data, even with their corrections. Did medieval global warming take place? Last month the consensus was that it did not; now the correct answer is that nobody really knows. Uncovering errors in the Mann analysis doesn’t settle the debate; it just reopens it. We now know less about the history of climate, and its natural fluctuations over century-scale time frames, than we thought we knew… It is our responsibility as scientists to look at the data in an unbiased way, and draw whatever conclusions follow. When we discover a mistake, we admit it, learn from it, and perhaps discover once again the value of caution.
I reiterated the purely critical nature of our articles once more in my recent National Post article.
The issue is not whether our illustrations, presented as a reductio ad absurdum of MBH98 claims of “skill”, themselves have any skill, but whether MBH98 does. We addressed that issue squarely in our GRL article, where we stated explicitly that
… neither the R2 and other cross-validation statistics nor the underlying construction step have ever been reported for the controversial [MBH98] 15th century period. Our calculations have indicated that they are statistically insignificant. Timely reporting of these statistics (in the original article) might have led to an earlier consideration of the discrepancy between the apparently high RE value and the low values of other statistics, and thus enabled earlier identification of the underlying data transformation resulting in this problem.
Mann has continually tried to deflect consideration away from his own reconstruction by criticizing our sensitivity analyses of his own results. Wahl and Ammann continue this tradition of sleight-of-hand by criticizing the RE of various reconstructions used to illustrate non-robustness, while withholding the R2 and other statistics of their emulation of MBH98.
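For readers unfamiliar with the verification statistics at issue: RE (reduction of error) compares a reconstruction’s squared error in the verification period against a naive baseline that simply predicts the calibration-period mean for every year, while r2 measures correlation. The sketch below uses my own toy data, not MBH98’s; it illustrates how a “reconstruction” that captures only a mean shift can score a high RE while its r2 is near zero, which is why reporting RE while withholding R2 can conceal a lack of skill:

```python
import numpy as np

def reduction_of_error(obs, recon, calib_mean):
    """RE = 1 - SSE(recon) / SSE(baseline), where the baseline predicts
    the calibration-period mean for every year. RE = 1 is a perfect fit;
    RE <= 0 means no better than the naive baseline."""
    sse_recon = np.sum((obs - recon) ** 2)
    sse_base = np.sum((obs - calib_mean) ** 2)
    return 1.0 - sse_recon / sse_base

def r_squared(obs, recon):
    """Squared Pearson correlation over the verification period."""
    return np.corrcoef(obs, recon)[0, 1] ** 2

# Toy verification period: the true series sits 2 units above the
# calibration mean (taken as 0), plus noise; the 'reconstruction'
# matches the mean shift but tracks none of the year-to-year variation.
rng = np.random.default_rng(1)
n = 79
obs = 2.0 + 0.5 * rng.standard_normal(n)
recon = 2.0 + 0.5 * rng.standard_normal(n)   # independent noise

re = reduction_of_error(obs, recon, calib_mean=0.0)
r2 = r_squared(obs, recon)
print(f"RE = {re:.2f}, r^2 = {r2:.2f}")   # high RE, near-zero r^2
```

The point of the toy example is only that the two statistics answer different questions: RE rewards matching the change in mean level between calibration and verification periods, while r2 asks whether the year-to-year variations are tracked at all.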
The Reviews of Geophysics article obviously does not refute our 2005 articles. It relies on a rejected submission to Climatic Change and its main criticisms of our 2003 article are themselves readily refuted.
Jones, P. D. and M. E. Mann, 2004. Climate over past millennia. Reviews of Geophysics 42, paper number 2003RG000143, doi:10.1029/2003RG000143.