Realclimate discovers checking is a "good thing"

It’s "interesting times" at realclimate.org where the authors feel the need suddenly to check their sources and the provenance of data in a way that they clearly didn’t before, and advise caution when the latest scare stories of global warming come in.

For students of psychology, this points to an internal conflict of the psyche between the conscience on the one hand and deeply held beliefs on the other. What, I wonder, would a psychologist have made of the post entitled "What if…the Hockey Stick were wrong?", posted on 27 January 2005, the day the MM05 results were published?

So it is that when realclimate admits that it posted incorrect information regarding a borehole study by Dr Huang (Huang, S., H. N. Pollack and P.-Y. Shen, "Temperature Trends Over the Past Five Centuries Reconstructed from Borehole Temperatures", Nature 403, 756–758, 2000), it says something psychologically revealing:

The Internet is nothing if not flexible, and unlike in journals where mistakes can persist an awfully long time, we are able to correct such problems very quickly. In this respect, Dr. Huang’s letter seems to indicate that things are actually working quite well here.

We would like to take this opportunity to re-iterate our commitment to getting the science right, and as importantly, getting it right in real-time. We welcome all corrections or clarifications and we will endeavour to fix any errors, great or small, as quickly as we can.

Now, don’t get me wrong. Everybody makes mistakes, climatologists included. I think it’s reasonable and fair that people should honourably make changes and apologize when errors are spotted. But only a month or so ago realclimate was banging the twin drums of "scientific consensus" and "peer review" as if its authors were above these sorts of errors, which makes the disconnect that much more uncomfortable.

For example…

When Richard Muller wrote his widely read article "Global Warming Bombshell", he was accused of scurrilously parroting the claims of McIntyre and McKitrick that (1) the use of non-centered PCA (as in MBH98) is somehow not statistically valid, and (2) "Hockey Stick" patterns arise naturally from the application of non-centered PCA to purely random "red noise".

Michael Mann (for it was he) wrote:

Both claims, which are of course false, were made in a comment on MBH98 by MM that was rejected by Nature, and subsequently parroted by astronomer Richard Muller in a non-peer-reviewed setting.

Ah, the negation of an informed scientific comment because it was not "peer-reviewed"! Those were the days…

Bear in mind that Muller is a believer in the anthropogenic greenhouse gas hypothesis, but he still advised that the scientific method is to reject faulty studies entirely and to be suspicious of results that confirm one’s own prejudices.

If you are concerned about global warming (as I am) and think that human-created carbon dioxide may contribute (as I do), then you still should agree that we are much better off having broken the hockey stick. Misinformation can do real harm, because it distorts predictions.

But regardless of his personal beliefs, Muller puts the scientific method first:

A phony hockey stick is more dangerous than a broken one – if we know it is broken. It is our responsibility as scientists to look at the data in an unbiased way, and draw whatever conclusions follow. When we discover a mistake, we admit it, learn from it, and perhaps discover once again the value of caution.

Perhaps realclimate is making that voyage of discovery as well.


12 Comments

  1. Willis Eschenbach
    Posted Oct 4, 2006 at 12:35 AM | Permalink

    Well, I don’t know where to put this; it should be a new thread, I suppose. This was about the only borehole thread, and even then only peripherally.

    UNDERGROUND PROBLEMS WITH MANNHOLES

    While researching ocean drill cores at the WCDC, I stumbled across Mann’s borehole data. One of the proxies used for historical temperature reconstruction is “borehole temperature”, the temperature down in the ground. In 2002, Michael Mann et al. published a study called Optimal surface temperature reconstructions using terrestrial borehole data. It is available here, for $9.00.

    In it they use all of the available weapons to construct the temperature proxy: EOFs, PCA, and of course that perennial favorite, "optimal fingerprinting", viz:

    We employ a spatial signal detection approach that bears a loose relationship with "optimal detection" approaches used in anthropogenic climate signal fingerprinting [Mitchell et al., 2001]. In such "optimal detection" approaches, one seeks to identify, through generalized linear regression, the estimate of a target signal (as predicted by a model) in empirical data. Detection is accomplished through rotation of the empirical data, in EOF state-space, away from the direction of maximal noise (as estimated from, e.g., a control model simulation).

    In our approach, an independent estimate of noise is not available. Rather, we employ an EOF rotation of the information in the borehole dataset toward an independent estimate of the target spatial SAT signal from the instrumental record, based on ordinary (potentially weighted) least squares spatial regression. Once an optimal rotation is found that provides maximal (and statistically significant) agreement between the spatial information in the borehole and instrumental record during the 20th century, the associated eigenvector rotation is used to project the estimated borehole SAT signal back in time.
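
    In rough outline, the rotation step amounts to something like the toy sketch below. To be clear, this is my own illustration with invented numbers, not their actual code or data; every name in it is hypothetical.

    ```python
    import numpy as np

    # Toy sketch of the EOF-rotation idea quoted above: decompose the
    # gridded borehole field into EOFs, find the least-squares combination
    # of the leading EOFs that best matches an instrumental-era target
    # pattern, then use that combination to carry the signal back in time.
    rng = np.random.default_rng(0)
    n_times, n_cells = 6, 50                  # century steps x grid cells
    borehole = rng.normal(size=(n_times, n_cells))   # invented data

    # EOF decomposition via SVD of the (time x space) anomaly matrix
    anom = borehole - borehole.mean(axis=0)
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    pcs = U * s                               # principal-component series
    eofs = Vt                                 # spatial patterns (EOFs)

    # Instrumental-era target SAT pattern (here just noise)
    target = rng.normal(size=n_cells)

    # OLS weights on the leading EOFs that best reproduce the target
    k = 3
    w, *_ = np.linalg.lstsq(eofs[:k].T, target, rcond=None)

    # "Rotated" reconstruction: the weighted combination of leading PCs
    recon = pcs[:, :k] @ w
    print(recon)
    ```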

    So I decided to see how well this "optimal detection" works compared to plain old linear regression. I regressed several of their results against the HadCRUT3 temperature dataset. I compared my own area-weighted (cos(latitude)) average of their raw gridded data, their calculated area-weighted average of their "optimal" gridded data, and their final "optimal reconstruction".
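
    The area weighting itself is nothing exotic. A toy version, with invented numbers standing in for the real gridded data, looks like this:

    ```python
    import numpy as np

    # Toy cos(latitude) area-weighted mean; 'lats' and 'field' are
    # hypothetical stand-ins for the actual gridded borehole data.
    rng = np.random.default_rng(1)
    lats = np.arange(-87.5, 90.0, 5.0)        # gridcell centre latitudes
    field = rng.normal(size=lats.size)        # one value per latitude band

    w = np.cos(np.deg2rad(lats))              # area weight per band
    print(np.sum(w * field) / np.sum(w))      # area-weighted average
    ```

    Here are the results: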

    A couple of things to note. First, the difference between just regressing the plain old raw average and their "optimal" result amounts to 0.06 degrees five hundred years ago … it seems like they wasted a lot of time checking for fingerprints.

    Second, this is as boring a paleo record as I could imagine, with very little detail.

    Finally, the lack of detail points to a very interesting sleight-of-hand manoeuvre … notice the kinks in the borehole reconstructions every hundred years in the graph above? Those kinks are present in Mann’s paper as well, only he’s hidden them by squishing the graphs down flat. But once you know where to look, you can still see them. Take a look … the black arrows show the big ones. The 1900 kink is the most prominent one, but they’ve made the graphs so flat that even it is hard to see.

    Why are there kinks in the reconstruction? Because they’ve infilled the data. In each grid cell, there are actually only six data points, one each for the years 1500, 1600, 1700, 1800, 1900, and 1980. The other 474 data points are just linearly filled in between those six points.
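
    A toy version of that infilling (invented temperatures, but the same six knot years) shows why the slope can only change at the century marks:

    ```python
    import numpy as np

    # Six century values linearly interpolated to annual steps, as in
    # the paper; the temperature values here are invented.
    knot_years = np.array([1500, 1600, 1700, 1800, 1900, 1980])
    knot_temps = np.array([-0.4, -0.35, -0.45, -0.2, -0.1, 0.3])

    years = np.arange(1500, 1981)
    infilled = np.interp(years, knot_years, knot_temps)

    # First differences are piecewise constant, jumping only at the
    # knots -- those jumps are the "kinks" in the plotted reconstruction.
    print(np.unique(np.round(np.diff(infilled), 6)))
    ```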

    Now, consider the true statistical significance of their data. Any average temperature will only have six data points … do they have a significant trend? No way; once you account for autocorrelation you’re down to only about three effective data points, and that’s one degree of freedom …

    None of their results have any statistical significance whatsoever. For the period 1900-1980, for example, they are regressing two data points (or EOFing and PCAing two data points) in each grid against the actual temperature.
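
    A back-of-envelope check of that degrees-of-freedom point, using the standard lag-1 autocorrelation adjustment to the sample size (the six values below are invented; only the shortness of the series matters):

    ```python
    import numpy as np

    # Effective sample size under lag-1 autocorrelation:
    #     n_eff = n * (1 - r1) / (1 + r1)
    x = np.array([-0.4, -0.35, -0.45, -0.2, -0.1, 0.3])
    x = x - x.mean()
    r1 = np.sum(x[:-1] * x[1:]) / np.sum(x * x)   # lag-1 autocorrelation
    n_eff = len(x) * (1 - r1) / (1 + r1)
    print(round(r1, 2), round(n_eff, 1))          # only a handful of points
    ```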

    Grrrr … the Mannomatic truly can chop, slice, and dice anything. Most likely, they’re now using the data from this “reconstruction” in their latest hockeystick emulation …

    Questions? The data is here; read ’em and weep …

    w.

  2. Peter Hearnden
    Posted Oct 4, 2006 at 2:47 AM | Permalink

    So, Willis, in your opinion, is there a temperature signal in the borehole data? Or, might your graphs just be ‘sleight of hand’?

  3. Willis Eschenbach
    Posted Oct 4, 2006 at 3:21 AM | Permalink

    #2, Peter, thanks for the question. Since there are only six data points per gridcell, we really don’t have enough data to say whether there is a temperature signal or not. Not one of them has a statistically significant trend, either up or down; there’s simply not enough data.
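
    You can check this kind of thing for yourself; a trend test on six points per gridcell has almost no power. A quick sketch (invented temperatures, not the actual borehole values):

    ```python
    import numpy as np
    from scipy.stats import linregress

    # OLS trend test on six century values; the temperatures are invented.
    years = np.array([1500, 1600, 1700, 1800, 1900, 1980])
    temps = np.array([-0.2, -0.5, 0.1, -0.3, 0.0, 0.4])

    fit = linregress(years, temps)
    print(round(fit.slope * 100, 3),   # trend in degrees per century
          round(fit.pvalue, 3))        # nowhere near significance
    ```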

    Here’s a sample of a few of the boreholes …

    These just happen to be the first ten gridcells in the dataset. Notice that each one only has six data points. Now, do you see a temperature signal in there?

    I don’t think there’s much of one, in part because they are all so radically different. Look at gridcell 3: it has cooled 2 degrees over the period of the record … is that a temperature signal?

    I also don’t think there’s a temperature signal there because of the monotonic nature of the signal, increasing every century. No other proxy dataset (including Mann’s ill-fated "hockeystick") shows a rise every century. All of them are coldest at about 1700. So no, I don’t think there’s a discernible temperature signal there.

    My graphs are just graphs, not sleight of hand. Mann never mentions in his paper that there are only six temperatures per gridcell, one per century, and he squeezed his graphs down so far that nobody would notice. That’s sleight of hand …

    w.

  4. Peter Hearnden
    Posted Oct 4, 2006 at 3:51 AM | Permalink

    Willis, I think ‘sleight of hand’ is a pretty unpleasant thing to say. It’s almost as if you think he had an ulterior motive (rules anyone?). I’m amazed a man as intelligent as you will convict without hearing from the defendant…

    Whatever, next time some sceptic quotes the borehole data to me I’ll mention your view 😉

  5. Willis Eschenbach
    Posted Oct 4, 2006 at 5:14 AM | Permalink

    Peter, how would you describe doing an entire study, as Mann did, based on just six data points that are inflated to make 480 data points, and not mentioning that little tiny unimportant detail even once in the study?

    Forthright? Transparent? Honest? Clear? Straightforward? Direct? Up-front? Candid? You tell me …

    One final point. I’m not talking about “the borehole data”, whatever that might entail. I’m just talking about Mann’s study.

    w.

  6. Peter Hearnden
    Posted Oct 4, 2006 at 5:47 AM | Permalink

    Willis, I’d like to hear the accused’s take on matters before I convict. Sorry about that.

  7. Willis Eschenbach
    Posted Oct 4, 2006 at 5:57 AM | Permalink

    Peter, sure, why not. I mean, after all, Michael Mann has never done anything underhanded before. Write and ask him to show you his data and methods …

    w.

  8. Peter Hearnden
    Posted Oct 4, 2006 at 6:47 AM | Permalink

    There you go again. Willis, you’re like a scratched record.

  9. Posted Oct 4, 2006 at 7:16 AM | Permalink

    No, "Ad Hom", that would be you.

    A screeching cry that people are being mean for criticizing people you agree with, followed by more pathetic appeals that the criticized should be allowed to respond, followed by a "poor me" act that everyone is "attacking" you for defending the people who can’t defend themselves.

    Then, a few "ad homs" later, you get cut off before you derail the thread (which is your sole motive for the intervention), and then you’ll protest about the evils of censorship on evil blogs like this one.

    A broken record, over and over.

    Willis, just ignore “ad hom” and get back to analysis.

  10. Posted Dec 29, 2007 at 11:45 AM | Permalink

    Peter, John A and Willis

    At least for his fellow climate scientists, in 2003/4 Mann tried to "defend" his use of proxy reconstructions using RegEM. He says he uses this approach in a stepwise fashion backwards in time to make increasingly better use of low-frequency info in the calibration process. I agree that if there’s none to be had, it’s extra work. My thesis advisor told me that’s what grad students are for.
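
    As I understand the stepwise part (and this is just a toy sketch with made-up proxy names and numbers, not RegEM itself), each step backwards recalibrates on whichever proxies reach far enough back:

    ```python
    import numpy as np

    # Stepwise calibration sketch: as you move back in time, fewer
    # proxies are available, so each step refits on the subset that
    # extends that far back. All names and data here are invented.
    rng = np.random.default_rng(2)
    n_cal = 100                               # calibration-period years
    target = rng.normal(size=n_cal)           # instrumental series
    proxies = {                               # name -> (first year, series)
        "tree_a": (1400, rng.normal(size=n_cal)),
        "tree_b": (1000, rng.normal(size=n_cal)),
        "ice_c":  (1600, rng.normal(size=n_cal)),
    }

    for step_start in (1600, 1400, 1000):     # steps backwards in time
        avail = [p for p, (y0, _) in proxies.items() if y0 <= step_start]
        X = np.column_stack([proxies[p][1] for p in avail])
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        print(step_start, avail, np.round(beta, 2))
    ```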

    If you’re interested, here is Mann’s own abstract from Journal of Climate 2004:

    Results are presented from a set of experiments designed to investigate factors that may influence proxy-based reconstructions of large-scale temperature patterns in past centuries. The factors investigated include (1) the method used to assimilate proxy data into a climate reconstruction, (2) the proxy data network used, (3) the target season, and (4) the spatial domain of the reconstruction. The comparisons support the generally robust nature of several, previously published, estimates of NH mean temperature changes in past centuries, and suggest that further improvements in reconstructive skill are most likely to arise from an emphasis on the quality, rather than quantity, of available proxy data.

    You can find Mann’s entire 68 pages at (Realclimate)

    I’m late with this post and a newcomer to climate science/scientists, but I was told in physics class that accuracy is more important than precision in measurement 😉

  11. Pat Keating
    Posted Dec 29, 2007 at 2:23 PM | Permalink

    4 Hearnden
    You said in your post #4

    Willis, I think ‘sleight of hand’ is a pretty unpleasant thing to say. It’s almost as if you think he had an ulterior motive

    You have quite a nerve saying that, when it was YOU in #2 who first introduced the phrase:

    Or, might your graphs just be ‘sleight of hand’?

  12. steven mosher
    Posted Dec 29, 2007 at 3:26 PM | Permalink

    RE 11.

    Late hit, Pat. Five minutes in the penalty box.
