In recent discussion of the 2007 Weblog Awards, several commenters at other blogs have argued that our criticisms of the Mannian parlor tricks have been “thoroughly refuted and discarded by climatologists, published in a credible journal”; that other professionals in the field who have “looked in great detail at the problem at hand” have concluded that McIntyre’s findings, rather than being “valid and relevant”, are “without statistical and climatological merit”; and that CA “fluffed on the whole hockey stick thing”. See, for example, here.
Omitted from these references is the fact that the people described as “climatologists published in a credible journal” or “professionals in the field” are none other than Wahl and Ammann, serial coauthors with Michael Mann and students of Mann, who are not independent of the controversy. Indeed, they largely use (without citation, attribution or even acknowledgment to Michael Mann) arguments originally published at realclimate (and already responded to in MM2005b (EE)). Aside from their lack of independence, neither Ammann nor Wahl qualifies as a statistical authority. Ammann did his undergraduate work in geology; Wahl, in divinity. While this does not exclude them from having potential insight into the matter, it does mean that one should not necessarily expect a sure grasp of mathematical and statistical issues, and that their conclusions cannot be relied upon uncritically, even if Stephen Schneider accepted their article.
Readers interested in a third-party view of the matter are far better off consulting the North Report, the Wegman Report, (particularly) Wegman’s Reply to Questions and Richard Smith’s account of the 2006 American Statistical Association session. All of these individuals are vastly more eminent than Ammann and Wahl. Wegman, in particular, has been Chair of the National Academy of Sciences Committee on Theoretical and Applied Statistics and is a legitimate statistical expert. His comments on the Wahl and Ammann preprint are very acute and have not received appropriate consideration.
I’ve collated some of these remarks for the benefit of new readers who haven’t been following this particular story. Please read the comments below using the analogy from the previous post: see whether any of our criticisms of Mannian parlor tricks have actually been refuted, as opposed to someone arguing that you can re-tool the trick and still saw the woman in half a different way. (On this latter point, pay particular attention to Wegman’s comments on Wahl and Ammann later in the post.)
The Wegman Report
The original Wegman Report is online here. Here are some excerpts from this report:
The debate over Dr. Mann’s principal components methodology has been going on for nearly three years. When we got involved, there was no evidence that a single issue was resolved or even nearing resolution. Dr. Mann’s RealClimate.org website said that all of the Mr. McIntyre and Dr. McKitrick claims had been ‘discredited’. UCAR had issued a news release saying that all their claims were ‘unfounded’. Mr. McIntyre replied on the ClimateAudit.org website. The climate science community seemed unable to either refute McIntyre’s claims or accept them. The situation was ripe for a third-party review of the types that we and Dr. North’s NRC panel have done.
While the work of Michael Mann and colleagues presents what appears to be compelling evidence of global temperature change, the criticisms of McIntyre and McKitrick, as well as those of other authors mentioned are indeed valid.
Where we have commonality, I believe our report and the [NAS] panel essentially agree. We believe that our discussion together with the discussion from the NRC report should take the ‘centering’ issue off the table. [Mann’s] decentred methodology is simply incorrect mathematics …. I am baffled by the claim that the incorrect method doesn’t matter because the answer is correct anyway.
Method Wrong + Answer Correct = Bad Science.
The papers of Mann et al. in themselves are written in a confusing manner, making it difficult for the reader to discern the actual methodology and what uncertainty is actually associated with these reconstructions.
It is not clear that Dr. Mann and his associates even realized that their methodology was faulty at the time of writing the [Mann] paper.
We found MBH98 and MBH99 to be somewhat obscure and incomplete and the criticisms of MM03/05a/05b to be valid and compelling.
Overall, our committee believes that Mann’s assessments that the decade of the 1990s was the hottest decade of the millennium and that 1998 was the hottest year of the millennium cannot be supported by his analysis.
[The] fact that their paper fit some policy agendas has greatly enhanced their paper’s visibility… The ‘hockey stick’ reconstruction of temperature graphic dramatically illustrated the global warming issue and was adopted by the IPCC and many governments as the poster graphic. The graphics’ prominence together with the fact that it is based on incorrect use of [principal components analysis] puts Dr. Mann and his co-authors in a difficult face-saving position.
We have been to Michael Mann’s University of Virginia website and downloaded the materials there. Unfortunately, we did not find adequate material to reproduce the MBH98 materials. We have been able to reproduce the results of McIntyre and McKitrick.
Generally speaking, the paleoclimatology community has not recognized the validity of the [McIntyre and McKitrick] papers and has tended to dismiss their results as being developed by biased amateurs. The paleoclimatology community seems to be tightly coupled as indicated by our social network analysis, has rallied around the [Mann] position, and has issued an extensive series of alternative assessments most of which appear to support the conclusions of MBH98/99… Our findings from this analysis suggest that authors in the area of paleoclimate studies are closely connected and thus ‘independent studies’ may not be as independent as they might appear on the surface.
It is important to note the isolation of the paleoclimate community; even though they rely heavily on statistical methods they do not seem to be interacting with the statistical community. Additionally, we judge that the sharing of research materials, data and results was haphazardly and grudgingly done. In this case we judge that there was too much reliance on peer review, which was not necessarily independent.
Based on the literature we have reviewed, there is no overarching consensus on [Mann’s work]. As analyzed in our social network, there is a tightly knit group of individuals who passionately believe in their thesis. However, our perception is that this group has a self-reinforcing feedback mechanism and, moreover, the work has been sufficiently politicized that they can hardly reassess their public positions without losing credibility.
It is clear that many of the proxies are re-used in most of the papers. It is not surprising that the papers would obtain similar results and so cannot really claim to be independent verifications.
Especially when massive amounts of public monies and human lives are at stake, academic work should have a more intense level of scrutiny and review. It is especially the case that authors of policy-related documents like the IPCC report, Climate Change 2001: The Scientific Basis, should not be the same people as those that constructed the academic papers.
Wegman on Wahl and Ammann
Wegman’s Reply to Questions is a really excellent consideration of the efforts of Wahl and Ammann to re-tool Mann’s parlor trick and prove that the late 20th century was paranormal. I’ve given question 10 and the Wegman response in their entirety, as they are clear and concise. (Again, the issue is the narrow one of whether Mann et al. proved that the late 20th century was paranormal.) Stupak asked:
10. In the footnote of your report, you reference papers by Wahl and Ammann (2006) and Wahl et al. (2006) and note that they “are not to the point.” I understand that Wahl and Ammann actually examined, among other things, the problem of data decentering, the main focus of your report, and corrected the emulation of MBH98 by recentering the data.
a. Did you analyze this work by Wahl and Ammann prior to sending your final report to the Committee on Energy and Commerce? If so, why does your report not alert the reader that these researchers had conducted a reanalysis of the MBH98 that corrected the only statistical methodology error discussed in the “Finding” section of your report and that these researchers found that recentering the data did not significantly affect the results reported in the MBH98 paper?
To which, Wegman answered:
Ans: The Wahl and Ammann paper came to our attention relatively late in our deliberations, but was considered by us. Some immediate thoughts we had on Wahl and Ammann was that Dr. Mann lists himself as a Ph.D. coadvisor to Dr. Ammann on his resume. As I testified in the second hearing, the work of Dr. Ammann can hardly be thought to be an unbiased independent report. It would have been more convincing had this paper been written by a totally independent authority, but alas this is not the case. The Wahl and Ammann paper is largely an attempt to refute the criticisms of McIntyre and McKitrick (MM). The comment we made in our footnote about being “not to the point” refers to the fact that MM03 and MM05 were not attempting to portray themselves as doing a paleoclimate reconstruction, they not being paleoclimatologists themselves, but were merely pointing out the flaws in the MBH98 and MBH99 papers. There are several comments of interest in the Wahl and Ammann paper. They suggest three areas in which the MBH papers have been subject to scrutiny.
First, the MBH reconstruction has been examined in light of its agreement/lack of agreement with other long-term annual and combined high/low frequency reconstructions. Wahl and Ammann (2006, p.3 in the 24 February 2006 draft)
Their conclusion is:
“The comparison of the MBH reconstruction, derived from multi-proxy (particularly tree ring) data sources, with widespread bore-hole-based reconstructions is still at issue in the literature.” Wahl and Ammann (2006, p.4 in the 24 February 2006 draft)
In other words, the MBH reconstruction does not agree with other widely accepted methodologies for climate reconstruction. Bore hole methods measure a temperature gradient and calculate the diffusion of heat within the bore hole. This method does not have nearly as many confounding variables as do tree ring proxies. The second area of scrutiny involves comparison with results from modeling efforts.
“Second, a related area of scrutiny of the MBH reconstruction technique arises from an atmosphere-ocean general circulation model (AOGCM) study, which also examines the potential loss of amplitude [in the MWP] in the MBH method (and other proxy/instrumental reconstructions that calibrate by using least squares projections of the proxy vectors onto a single- or multi-dimensional surface determined by either the instrumental data or its [their] eigenvectors).” Wahl and Ammann (2006, p.4 in the 24 February 2006 draft)
Again, the MBH reconstructions do not correlate well with the model-based methods. Wahl and Ammann (2006) offer the following explanation.
“However, a number of issues specific to the modeling situation could arise in this context, including: how realistically the AOGCM is able to reproduce the real world patterns of variability and how they respond to various forcings; the magnitude of forcings and the sensitivity of the model that determine the magnitude of temperature fluctuations; and the extent to which the model was sampled with the same richness of information that is contained in the proxy records (not only temperature records, but series that correlate well with the primary patterns of variability including, for example, precipitation in particular seasons).” Wahl and Ammann (2006, p.5 in the 24 February 2006 draft)
This quotation has two interesting facets. First, it seems to call into question the very models that are predicting temperature increases based on CO2 forcings. If these models do not coincide with the MBH reconstructions, then which are we to believe? Second, the quotation implicitly admits what we have observed previously, namely that there are other covariates such as precipitation, which are not teased out in the temperature reconstructions. Thus, what are purported to be temperature reconstructions are contaminated with covariates that reflect temperature indirectly at best and not at all at worst. The third area of scrutiny involves the challenges made by MM.
“A third area of scrutiny has focused on the nature of the proxy data set utilized by MBH, along with the pre-processing algorithms used to enhance the climate signal-to-noise characteristics of the proxy data.” Wahl and Ammann (2006, p.5 in the 24 February 2006 draft)
We submit that both the mathematical analysis in Appendix A of our report to Congress together with our simulation demonstrate that the decentering method yields incorrect results. The critical issue then becomes the proxies themselves, which MM have challenged. A telling comment from Wahl and Ammann is the following.
“A further aspect of this critique is that the single-bladed hockey stick shape in proxy PC summaries for North America is carried disproportionately by a relatively small subset (15) of proxy records derived from bristlecone/foxtail pines in the western United States, which the authors [MM] mention as being subject to question in the literature as local/regional temperature proxies after approximately 1850. It is important to note in this context that because they employ an eigenvector-based CFR technique, MBH do not claim that all proxies used in their reconstruction are closely related to local-site variations in surface temperature.” Wahl and Ammann (2006, p.9 in the 24 February 2006 draft)
This together with the AOGCM quotation reinforces the notion that MBH are attempting to reconstruct temperature histories based on proxy data that are extremely problematic in terms of actually capturing temperature information directly. As we testified, it would seem that there is some substantial likelihood that the bristlecone/foxtail pines are CO2 fertilized and hence are reflecting not temperature at all but CO2 concentration. It is a circular argument to say increased CO2 concentrations are causing temperature increases when temperature increases are estimated by using proxies that are directly affected by increased CO2 concentrations.
It is our understanding that, when using the same proxies and the same methodology as MM, Wahl and Ammann essentially reproduce the MM curves. Thus, far from disproving the MM work, they reinforce it. The debate then is over the proxies and the exact algorithms, as it always has been.
That Wahl and Ammann (2006) admit the results of the MBH methodology do not coincide with those of other methods, such as borehole methods and atmosphere-ocean general circulation models, and that they adjust the MBH methodology to include the PC4 bristlecone/foxtail pine effects, are significant reasons why we believe the Wahl and Ammann paper does not convincingly demonstrate the validity of the MBH methodology.
The next part of the Stupak question was:
b. Do you agree or disagree with Wahl and Ammann’s finding that the time period used to center the data does not significantly affect the results reported in the MBH98 paper? If you disagree, please state the basis for your disagreement.
Ans: We do disagree. The fundamental issue focuses on the North American Tree Ring proxy series, which Wahl and Ammann admit are problematic in carrying temperature data. In the original MBH decentered series, the hockey-stick shape emerged in the PC1 series because of reasons we have articulated in both our report and our testimony. In the original MBH papers, it was argued that this PC1 proxy was sufficient. We note the following from Wahl and Ammann.
“Thus, the number of PCs required to summarize the underlying proxy data changes depending on the approach chosen. Here we verify the impact of the choice of different numbers of PCs that are included in the climate reconstruction procedure. Systematic examination of the Gaspé-restricted reconstructions using 2-5 proxy PCs derived from MM-centered, but unstandardized data demonstrates changes in reconstruction as more PCs are added, indicating a significant change in information provided by the PC series. When two or three PCs are used, the resulting reconstructions (represented by scenario 5d, the pink (1400-1449) and green (1450-1499) curve in Fig. 3) are highly similar (supplemental information). As reported below, these reconstructions are functionally equivalent to reconstructions in which the bristlecone/foxtail pine records are directly excluded [emphasis added] (cf. pink/blue curve for scenarios 6a/b in Fig. 4).
When four or five PCs are used, the resulting reconstructions (represented by scenario 5c, within the thick blue range in Fig. 3) are virtually indistinguishable (supplemental information) and are very similar to scenario 5b.” Wahl and Ammann, (2006, p.31, 24 February 2006 draft)
Without attempting to describe the technical detail, the bottom line is that, in the MBH original, the hockey stick emerged in PC1 from the bristlecone/foxtail pines. If one centers the data properly the hockey stick does not emerge until PC4. Thus, a substantial change in strategy is required in the MBH reconstruction in order to achieve the hockey stick, a strategy which was specifically eschewed in MBH. In Wahl and Ammann’s own words, the centering does significantly affect the results.
In passing, the results cited here by Wahl and Ammann had already been discussed in MM2005b, but Wahl and Ammann fail to acknowledge the earlier discussion and imply that their treatment is novel. (The approach actually originated in Mann’s 2004 response to our Nature submission.) The third part of the question was:
c. Dr. Gulledge included in his testimony a slide showing the graph of WA emulation of the MBH and MBH-corrected for decentering and the Gaspé tree-ring series. Were you aware of their reanalysis of MBH99 prior to the time you finalized your report? Do you agree or disagree with their reanalysis of MBH99? If you disagree, please state the basis for your disagreement.
To which Wegman answered (and note the bolded portion as well):
Ans: Yes, we were aware of the Wahl and Ammann simulation. We continue to disagree with the reanalysis for several reasons. Even granting the unbiasedness of the Wahl and Ammann study in favor of his advisor’s methodology and the fact that it is not a published refereed paper, the reconstructions mentioned by Dr. Gulledge, and illustrated in his testimony, fail to account for the effects of the bristlecone/foxtail pines.
Wahl and Ammann reject this criticism of MM based on the fact that if one adds enough principal components back into the proxy, one obtains the hockey stick shape again. This is precisely the point of contention. It is a point we made in our testimony and that Wahl and Ammann make as well. A cardinal rule of statistical inference is that the method of analysis must be decided before looking at the data. The rules and strategy of analysis cannot be changed in order to obtain the desired result. Such a strategy carries no statistical integrity and cannot be used as a basis for drawing sound inferential conclusions.
The NAS (North) Report
If North et al agreed with the Wegman findings, as they testified to the House Subcommittee under oath, how did this get expressed in the NAS panel report? My view, at the time, and it’s unchanged, was that their report was “schizophrenic”: they agreed with our specific criticisms of Mannian parlor tricks within the body of the report, while at the same time, reporting that there were other proofs that late 20th century climate was paranormal. Eduardo Zorita at the time characterized the NAS report as being as severe as could be contemplated under the circumstances:
in my opinion the Panel adopted the most critical position to MBH nowadays possible. I agree with you that it is in many parts ambivalent and some parts are inconsistent with others. It would have been unrealistic to expect a report with a summary stating that MBH98 and MBH99 were wrong (and therefore the IPC TAR had serious problems) when the Fourth Report is in the making. I was indeed surprised by the extensive and deep criticism of the MBH methodology in Chapters 9 and 11.
So is there any actual language in the NAS panel report that supports any suggestion that they had repudiated any of our published claims in respect to Mannian statistical methodology? In the quotes below, I’ve searched every reference in the report to McIntyre (or MM).
First, like Wegman, they specifically and categorically agree that Mann’s principal components methodology is biased towards mining for hockey-stick shaped series. This is not the only way of doing this parlor trick: Mannian principal components is a fancy way of performing the parlor trick of selecting HS-shaped series from a universe of noise, but you can do it the old-fashioned way and just pick them (a methodology adopted in both subsequent and previous studies). The NAS panel (STR Preprint, 86) has an extended discussion of Mann’s principal components error as follows:
Spurious Principal Components: McIntyre and McKitrick (2003) [actually McIntyre and McKitrick 2005a] demonstrated that under some conditions, the leading principal component can exhibit a spurious trendlike appearance, which could then lead to a spurious trend in the proxy-based reconstruction. To see how this can happen, suppose that instead of proxy climate data, one simply used a random sample of autocorrelated time series that did not contain a coherent signal. If these simulated proxies are standardized as anomalies with respect to a calibration period and used to form principal components, the first component tends to exhibit a trend, even though the proxies themselves have no common trend. Essentially, the first component tends to capture those proxies that, by chance, show different values between the calibration period and the remainder of the data. If this component is used by itself or in conjunction with a small number of unaffected components to perform reconstruction, the resulting temperature reconstruction may exhibit a trend, even though the individual proxies do not. Figure 9-2 shows the result of a simple simulation along the lines of McIntyre and McKitrick (2003) (the computer code appears in Appendix B)….
Principal components of sample data reflect the shape of the corresponding eigenvectors of the population covariance matrix. The first eigenvector of the covariance matrix for this simulation is the red curve in Figure 9-2, showing the precise form of the spurious trend that the principal component would introduce into the fitted model in this case. This exercise demonstrates that the baseline with respect to which anomalies are calculated can influence principal components in unanticipated ways. (STR Preprint, 86)
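The mechanism the panel describes is easy to reproduce. Below is a minimal sketch in the spirit of the panel’s Figure 9-2 simulation (their actual code is in Appendix B of the report; the series length, AR(1) coefficient and number of pseudo-proxies here are my own illustrative choices, not theirs):

```python
import numpy as np

def ar1(n, rho, rng):
    """Generate an AR(1) (red noise) series of length n."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.standard_normal()
    return x

def pc1(X):
    """First principal component time series of a (time x proxies) matrix."""
    u, s, vt = np.linalg.svd(X, full_matrices=False)
    return u[:, 0] * s[0]

def step_size(pc, cal_start):
    """Size of the step between the pre-calibration and calibration means of a PC."""
    return abs(pc[:cal_start].mean() - pc[cal_start:].mean())

def trial(seed, n=200, m=50, rho=0.9, cal=150):
    """Compare PC1 under full-period centering vs calibration-only centering."""
    rng = np.random.default_rng(seed)
    X = np.column_stack([ar1(n, rho, rng) for _ in range(m)])
    # Conventional centering: subtract each pseudo-proxy's full-period mean
    centered = X - X.mean(axis=0)
    # "Decentering": subtract only the calibration-period (last 50 steps) mean
    decentered = X - X[cal:].mean(axis=0)
    return step_size(pc1(decentered), cal), step_size(pc1(centered), cal)

steps = np.array([trial(s) for s in range(20)])
print("mean spurious step, decentered:", steps[:, 0].mean())
print("mean spurious step, centered:  ", steps[:, 1].mean())
```

On red noise alone, with no signal anywhere in the data, the calibration-only centering manufactures a step (hockey-stick) shape in PC1 that full-period centering does not.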
They comment approvingly on our criticisms on the inappropriate reliance on the RE statistic (and failed verification r2 statistic) and on non-robustness to bristlecones as follows:
A second area of criticism focuses on statistical validation and robustness. McIntyre and McKitrick (2003, 2005a,b) question the choice and application of statistical methods, notably principal component analysis; the metric used in the validation step of the reconstruction exercise; and the selection of proxies, especially the bristlecone pine data used in some of the original temperature reconstruction studies. These and other criticisms, explored briefly in the remainder of this chapter, raised concerns that led to new research and ongoing efforts to improve how surface temperature reconstructions are performed…. The more important aspect of this criticism is the issue of robustness with respect to the choice of proxies used in the reconstruction. For periods prior to the 16th century, the Mann et al. (1999) reconstruction that uses this particular principal component analysis technique is strongly dependent on data from the Great Basin region in the western United States. Such issues of robustness need to be taken into account in estimates of statistical uncertainties. (STR Preprint, 106-7)
Regarding metrics used in the validation step in the reconstruction exercise, two issues have been raised (McIntyre and McKitrick 2003, 2005a,b). One is that the choice of “significance level” for the reduction of error (RE) validation statistic is not appropriate. The other is that different statistics, specifically the coefficient of efficiency (CE) and the squared correlation (r2), should have been used (the various validation statistics are discussed in Chapter 9). Some of these criticisms are more relevant than others, but taken together, they are an important aspect of a more general finding of this committee, which is that uncertainties of the published reconstructions have been underestimated. Methods for evaluation of uncertainties are discussed in Chapter 9 [multiple stats recommended]. (STR Preprint, 107)
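For readers new to the verification-statistic issue, the definitions make the disagreement concrete: RE benchmarks the reconstruction against the calibration-period mean, while CE benchmarks it against the verification-period mean. A sketch (the function names and toy numbers are mine, not from MBH or the NAS report) shows how a reconstruction that captures only a level shift between the two periods can post a high RE while CE and r2 are near zero:

```python
import numpy as np

def re_stat(obs, rec, cal_mean):
    """Reduction of Error: skill measured against the calibration-period mean."""
    return 1 - np.sum((obs - rec) ** 2) / np.sum((obs - cal_mean) ** 2)

def ce_stat(obs, rec):
    """Coefficient of Efficiency: skill measured against the verification-period mean."""
    return 1 - np.sum((obs - rec) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r2_stat(obs, rec):
    """Squared Pearson correlation over the verification period."""
    return np.corrcoef(obs, rec)[0, 1] ** 2

# Toy verification period: observations sit well above the calibration mean,
# and the "reconstruction" matches that level but none of the year-to-year variation.
rng = np.random.default_rng(0)
obs = 1.0 + 0.3 * rng.standard_normal(50)    # verification-period observations
rec = 1.0 + 0.05 * rng.standard_normal(50)   # flat-ish, uncorrelated reconstruction
cal_mean = 0.0                               # calibration-period mean

re, ce, r2 = re_stat(obs, rec, cal_mean), ce_stat(obs, rec), r2_stat(obs, rec)
print(f"RE = {re:.2f}, CE = {ce:.2f}, r2 = {r2:.2f}")
```

Getting the level shift right inflates RE, while CE and r2 expose that the reconstruction tracks none of the variation within the verification period, which is why reliance on RE alone was at issue.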
Obviously none of these direct references to our work amount to anything like a repudiation. Quite the contrary. In every case where we were specifically mentioned, they agreed with our criticisms. In addition, they also made several specific findings on matters associated with our critique which, while not mentioning us (as perhaps they should have), supported the points with which we were associated. For example, they said that strip bark dendro chronologies should be “avoided” in temperature reconstructions:
While “strip-bark” samples should be avoided for temperature reconstructions, attention should also be paid to the confounding effects of anthropogenic nitrogen deposition (Vitousek et al. 1997), since the nutrient conditions of the soil determine wood growth response to increased atmospheric CO2 (Kostiainen et al. 2004). (STR Preprint, 50)
We had obviously criticized the failed verification r2, CE and other statistics in the MBH reconstruction, a result confirmed by Wahl and Ammann, despite their opposite characterization. The NAS panel saw through this characterization and observed the failed CE statistic in MBH, initially observed in MM2005 (GRL) (although they didn’t rub salt in the wound by also observing the failed verification r2 statistic, which had been the more prominent issue). They said:
Reconstructions that have poor validation statistics (i.e., low CE) will have correspondingly wide uncertainty bounds, and so can be seen to be unreliable in an objective way. Moreover, a CE statistic close to zero or negative suggests that the reconstruction is no better than the mean, and so its skill for time averages shorter than the validation period will be low. Some recent results reported in Table 1S of Wahl and Ammann (in press) indicate that their reconstruction, which uses the same procedure and full set of proxies used by Mann et al. (1999), gives CE values ranging from 0.103 to -0.215, depending on how far back in time the reconstruction is carried. (STR Preprint, 91)
Large-scale surface temperature reconstructions demonstrate very limited statistical skill (e.g., using the CE statistic) for proxy sets before the 19th century (Rutherford et al. 2005, Wahl and Ammann in press). (STR, 111)
Also without noting that it was us that had made the point, they also observed the non-robustness of these temperature reconstructions to small subsets:
Temperature reconstructions for periods before about A.D. 1600 are based on proxies from a limited number of geographic regions, and some reconstructions are not robust with respect to the removal of proxy records from individual regions (see, e.g., Wahl and Ammann in press). Because the data are so limited, different large-scale reconstructions are sometimes based on the same datasets, and thus cannot be considered as completely independent. …
Published information, although limited, also suggests that these statistics are sensitive to the inclusion of small subsets of the data. Some of the more regionally focused reconstructions (D’Arrigo et al. 2006) have better demonstrated skill back to the 16th century or so, and possibly earlier. To improve the skill of reconstructions, more data need to be collected and possibly new assimilation methods developed. (STR, 111)
So I would submit that there are no comments in the NAS Panel report that, in any way, refute, rebut or repudiate any claims from the McIntyre and McKitrick articles. This is not to say that they do not present their own spaghetti graph as supposed evidence for the paranormal. I’ve discussed defects with each of these other parlor tricks on various occasions as well and have observed the singular lack of due diligence by the NAS panel in investigating these supposed evidences of the paranormal. But for now, all I’m re-capping here is that the NAS panel did not rebut our claims with respect to the supposed Mannian evidence of the paranormal.
Did Wegman and North Disagree?
There’s obviously been a lot of spinning here, as Wegman’s language was much more forthright. The realclimate crowd have tried to marginalize the clear statements in Wegman.
At the July 19, 2006 House Energy and Commerce Subcommittee hearing, Barton asked North very precisely whether he disagreed with any of Wegman’s findings, and North (under oath) said no, as follows:
CHAIRMAN BARTON. I understand that. It looks like my time is expired, so I want to ask one more question. Dr. North, do you dispute the conclusions or the methodology of Dr. Wegman’s report?
DR. NORTH. No, we don’t. We don’t disagree with their criticism. In fact, pretty much the same thing is said in our report. But again, just because the claims are made, doesn’t mean they are false.
CHAIRMAN BARTON. I understand that you can have the right conclusion and that it not be–
DR. NORTH. It happens all the time in science.
CHAIRMAN BARTON. Yes, and not be substantiated by what you purport to be the facts but have we established–we know that Dr. Wegman has said that Dr. Mann’s methodology is incorrect. Do you agree with that? I mean, it doesn’t mean Dr. Mann’s conclusions are wrong, but we can stipulate now that we have–and if you want to ask your statistician expert from North Carolina that Dr. Mann’s methodology cannot be documented and cannot be verified by independent review.
DR. NORTH. Do you mind if he speaks?
CHAIRMAN BARTON. Yes, if he would like to come to the microphone.
MR. BLOOMFIELD. Thank you. Yes, Peter Bloomfield. Our committee reviewed the methodology used by Dr. Mann and his coworkers and we felt that some of the choices they made were inappropriate. We had much the same misgivings about his work that was documented at much greater length by Dr. Wegman.
Given these explicit statements by NAS panel officials, let’s take a look at what Wegman said about Mann et al and exactly what North, Bloomfield, Wallace and the others were agreeing with.
At the 2006 ASA meeting NAS Panel member Mike Wallace was reported as saying:
In Mike’s view, the two reports were complementary, and to the extent that they overlapped, the conclusions were quite consistent.
American Statistical Association Newsletter
Wegman’s presentation was also discussed in the American Statistical Association newsletter here, in Richard Smith’s account of a packed session at the 2006 ASA meeting, “What is the Role of Statistics in Public Policy Debates about Climate Change?”, at which Wegman, Mike Wallace of the NAS panel and Smith himself spoke.
At the core of the controversy is an incorrect use by Mann et al. of principal components (PCs).
Note that there is no nuance here: Smith agrees with Wegman that the Mann et al. method was incorrect. He then considered the argument that the error doesn’t “matter”, together with Wegman’s rebuttal:
A number of other commentators have acknowledged the flaws in the Mann reconstruction but have argued that this does not matter because the answers have been verified by other analyses. Ed’s own response to that was given in the equation:
Method Wrong + Answer Correct = Bad Science.
In other words, the fact that the answer may have been correct does not justify the use of an incorrect method in the first place.
Both Wegman’s talk and Smith’s account of it correctly noted that the issues with Mann et al were not just principal components, observing almost but not quite accurately:
Ed also touched on some of the other controversies in Mann’s work. Some of the proxies had been criticized as inappropriate. For example, bristlecone pines are known to be CO2 fertilized, creating a possible confounding problem if they are used in temperature reconstruction. A figure from Mann’s own website suggested that the medieval warm period reappeared if bristlecone pines were excluded from the reconstruction. Other studies had shown a “discomforting array of different results” in the reconstructions obtained with minor methodological variations.
As noted above, NAS panelist Wallace agreed that the NAS panel did not disagree with Wegman on common issues:
In Mike’s view, the two reports were complementary, and to the extent that they overlapped, the conclusions were quite consistent.
while there is undoubtedly scope for statisticians to play a larger role in paleoclimate research, the large investment of time needed to become familiar with the scientific background is likely to deter most statisticians from entering this field… In the end, it’s important not to lose sight of the forest for the trees, where the “forest” refers to the totality of scientific evidence for global warming.
As to the last sentence, I agree that it’s important not to lose sight of the forest for the trees. As a reviewer for AR4, it was my position that, if the paleoclimate issues were not relevant to the policy issues, then the Paleoclimate (and the hockey stick discussion) should be deleted from AR4 so that people could focus on what were the “real” arguments. The IPCC “consensus” was presumably that the paleoclimate arguments remained important and that’s why the chapter remained, despite my suggestions that it be deleted.
Wegman Report http://climateaudit.files.wordpress.com/2007/11/07142006_wegman_report.pdf
Wegman Reply to Questions http://www.uoguelph.ca/~rmckitri/research/StupakResponse.pdf
North (NAS Panel) Report http://books.nap.edu/openbook.php?isbn=0309102251
North and Wegman at the House Energy and Commerce Subcommittee hearing
Richard Smith url