I will make here a very simple suggestion: if IPCC or others want to use “multiproxy” reconstructions of world temperature for policy purposes, stop using data ending in 1980 and bring the proxies up-to-date. I would appreciate comments on this note as I think that I will pursue the matter with policymakers.
Let’s see how they perform in the warm 1990s, which should be an ideal period to show the merit of the proxies. I do not believe that any responsible policy-maker can base policy, even in part, on the continued use of obsolete data ending in 1980, when the cost of bringing the data up-to-date is inconsequential compared to Kyoto costs.
For example, in Mann’s famous hockey stick graph, as presented to policymakers and to the public, the graph used Mann’s reconstruction from proxies up to 1980 and instrumental temperatures thereafter (here, as in other similar studies, using Jones’ more lurid CRU surface history rather than the more moderate increases shown by satellite measurements). Usually (but not always), a different color is used for the instrumental portion, but the juxtaposition of the two series achieves the desired promotional effect. (In mining promotions, where there is considerable community experience with promotional graphics and statistics, securities commissions prohibit the adding together of proven ore reserves and inferred ore reserves, a policy which deserves a little reflection in the context of IPCC studies.)
Last week, a brand new multiproxy study by European scientists [Moberg et al., 2005] was published in Nature. On the very day of publication, I received an email from a prominent scientist telling me that Mann’s hockey stick was yesterday’s news, that the “community” had now “moved on” and so should I. That the “community” had had no opportunity to verify Moberg’s results, however meritorious they may finally appear, seemed to matter not at all.
If you look at the proxy portion of the new Moberg graphic, you see nothing that would be problematic for opponents of the hockey stick: it shows a striking Medieval Warm Period (MWP), a cold Little Ice Age and 20th century warming not quite reaching MWP levels by 1979, when the proxy portion of the study ends. (I’m in the process of examining the individual proxies, and the Moberg reconstruction is not without its own imperfections.) In the presentation to the public (see the figure in the Nature article itself), there is once again the infamous splice between the proxy reconstruction (up to 1980) and the instrumental record thereafter (once again Jones’ CRU record, rather than the satellite record).
One of the first questions that occurs to any civilian becoming familiar with these studies (and it was one of my first questions) is: what happens to the proxies after 1980? Given the presumed warmth of the 1990s, and especially 1998 (the “warmest year in the millennium”), you’d think that the proxy values would be off the chart. In effect, the last 25 years have provided an ideal opportunity to validate the usefulness of proxies and, especially, to test the confidence intervals of these studies, put forward with such assurance by the multiproxy proponents. What happens to the proxies used in MBH99, Moberg et al. or Crowley and Lowery in the 1990s and, especially, 1998?
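The out-of-sample check being asked for here is simple enough to sketch in a few lines of code. The sketch below uses purely synthetic, illustrative numbers (no real proxy or instrumental data, and no claim about what the actual series would show): calibrate a toy proxy against "instrumental" temperature up to 1980, then compute the correlation over the post-1980 verification period to see whether the proxy still tracks the warming.

```python
# Toy sketch of out-of-sample proxy validation. All series are synthetic
# illustrations; the post-1980 "signal loss" is assumed for the example,
# not derived from any real proxy record.
import random

random.seed(0)

years = list(range(1900, 1999))
# Synthetic "instrumental" anomaly: roughly flat, then warming after 1975.
temp = [0.1 * max(0, y - 1975) / 10 + random.gauss(0, 0.05) for y in years]
# Synthetic proxy that follows temperature up to 1980 but loses the signal
# afterwards (the hypothetical divergence being tested for).
proxy = [t + random.gauss(0, 0.05) if y <= 1980 else random.gauss(0, 0.05)
         for y, t in zip(years, temp)]

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

calib = [(p, t) for y, p, t in zip(years, proxy, temp) if y <= 1980]
verif = [(p, t) for y, p, t in zip(years, proxy, temp) if y > 1980]

r_calib = correlation(*zip(*calib))
r_verif = correlation(*zip(*verif))
print(f"calibration r (pre-1980):    {r_calib:.2f}")
print(f"verification r (post-1980):  {r_verif:.2f}")
```

A high calibration correlation paired with a near-zero verification correlation is exactly the pattern that would undermine confidence intervals fitted on the pre-1980 overlap; the point of updating the proxies is to find out which pattern the real series actually show.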
This question about proxies after 1980 was posed by a civilian to Mann in December at RealClimate. Mann replied:
Most reconstructions only extend through about 1980 because the vast majority of tree-ring, coral, and ice core records currently available in the public domain do not extend into the most recent decades. While paleoclimatologists are attempting to update many important proxy records to the present, this is a costly, and labor-intensive activity, often requiring expensive field campaigns that involve traveling with heavy equipment to difficult-to-reach locations (such as high-elevation or remote polar sites). For historical reasons, many of the important records were obtained in the 1970s and 1980s and have yet to be updated. [my bold]
Pause and think about this response. Think about the costs of Kyoto and then think again about this answer. Think about the billions spent on climate research and then try to explain to me why we need to rely on “important records” obtained in the 1970s. Far more money has been spent on climate research in the last decade than in the 1970s. Why are we still relying on obsolete proxy data?
As someone with actual experience in the mineral exploration business, which also involves “expensive field campaigns that involve traveling with heavy equipment to difficult-to-reach locations”, I can assure readers that Mann’s response cannot be justified and is an embarrassment to the paleoclimate community. The more I think about it, the more outrageous the comment seems, as does the fact that no one appears to have picked up on it.
It is even more outrageous when you look in detail at what is actually involved in collecting the proxy data used for the medieval period in the key multiproxy studies. The proxies used in MBH99 come from fewer than 40 sites (28 of them U.S. tree ring sites, represented by 3 principal component series).
As to the time needed to update some of these tree ring sites, here is an excerpt from Lamarche et al. on the collection of key tree ring cores from Sheep Mountain and Campito Mountain, which are the most important indicators in the MBH reconstruction:
“D.A.G. [Graybill] and M.R.R. [Rose] collected tree ring samples at 3325 m on Mount Jefferson, Toquima Range, Nevada on 11 August 1981. D.A.G. and M.R.R. collected samples from 13 trees at Campito Mountain (3400 m) and from 15 trees at Sheep Mountain (3500 m) on 31 October 1983.”
Now to get to Campito Mountain and Sheep Mountain, they had to get to Bishop, California, which is hardly “remote” even by Paris Hilton standards, and then proceed by road to within a few hundred meters of the site, perhaps proceeding for some portion of the journey on unpaved roads.
The picture below illustrates the taking of a tree ring core. While the equipment may seem “heavy” to someone used only to desk work using computers, people in the mineral exploration business would not regard this drill as being especially “heavy”, and I believe that people capable of operating such equipment can be found, even in out-of-the-way places like Bishop, California. I apologize for the tone here, but it is impossible for me not to be facetious.
There is only one relatively remote site in the entire MBH99 roster – the Quelccaya glacier in Peru. Here, fortunately, the work is already done (although, needless to say, it is not published). This information was updated in 2003 by Lonnie Thompson and should be adequate to update these series. With sufficient pressure from the U.S. National Science Foundation, the data should be available expeditiously. (Given that Thompson has not archived data from Dunde drilled in 1987, the need for pressure should not be underestimated.)
I realize that the rings need to be measured and that the field work is only a portion of the effort involved. But updating 28 tree ring sites in the United States is not a monumental enterprise nor would updating any of the other sites.
I’ve looked through lists of the proxies used in Jones et al., MBH99, Crowley and Lowery, Mann and Jones and Moberg et al. and see no obstacles to bringing all these proxies up to date. The only sites that might take a little extra time would be the Himalayan ice cores. Even here, it’s possible that taking very short cores or even pits would prove adequate for an update, and this might prove easier than one might think. Be that as it may, any delays in updating the most complicated location should not deter updating all the other locations.
As far as I’m concerned, this should be the first order of business for multiproxy studies.
Whose responsibility is this? While the costs are trivial in the scheme of Kyoto, they would still be a significant line item in the budget of a university department. I think that the responsibility here lies with the U.S. National Science Foundation and its equivalents in Canada and Europe. The responsibilities for collecting the proxy updates could be divided up in a couple of emails and budgets established.
One other important aspect: right now the funding agencies fund academics to do the work and are completely ineffective in ensuring prompt reporting. At best, academic practice will tie up reporting of results until the publication of articles in academic journals, creating a delay right at the start. Even then, in cases like Thompson or Jacoby, to whom I’ve referred elsewhere, the data may never be archived, or archived only after decades in the hands of the originator.
So here I would propose something more like what happens in a mineral exploration program. When a company has drill results, it has to publish them through a press release. It can’t wait for academic reports or for its geologists to spin the results. There’s lots of time to spin afterwards. Good or bad, the results have to be made public. The company has a little discretion so that it can release drill holes in bunches and not every single drill hole, but the discretion can’t build up too much during an important program. Here I would insist that the proxy results be archived as soon as they are produced; the academic reports and spin can come later. Since all these sites have already been published, people are used to the proxies and the updates will, to a considerable extent, speak for themselves.
What would I expect from such studies? Drill programs are usually a surprise and maybe there’s one here. My hunch is that the classic proxies will not show anywhere near as “loud” a signal in the 1990s as is needed to make statements comparing the 1990s to the Medieval Warm Period with any confidence at all. I’ve not surveyed proxies in the 1990s (nor, to my knowledge, has anyone else), but I’ve started to look, and many do not show the expected “loud” signal, e.g. some of the proxies posted on this site, such as Alaskan tree rings and TTHH ring widths; theories are starting to develop to explain this. But the discussions so far do not explicitly point out the effect of signal failure on the multiproxy reconstruction project.
But this is only a hunch and the evidence could be otherwise. The point is this: there’s no need to speculate any further. It’s time to bring the classic proxies up to date.