Tingley and Huybers: Varve Compaction

Specialist literature on varves, e.g. Besonen et al 2008 (coauthor Raymond Bradley), which is cited by Tingley and Huybers, makes the obvious observation that varves are compacted within a core. Besonen et al 2008 allow for compaction by estimating annual mass accumulation, a more appropriate measurement of varve “thickness” than uncompacted varve thickness. In their abstract, Besonen et al stated:

In many studies of lakes from the High Arctic, varve thickness is a good proxy for summer temperature and we interpret the Lower Murray Lake varves in this way. On that basis, the Lower Murray Lake varve thickness record suggests that summer temperatures in recent decades were among the warmest of the last millennium, comparable with conditions that last occurred in the early twelfth and late thirteenth centuries, but estimates based on the sediment accumulation rate do not show such a recent increase.

They report later in the article:

On the other hand, because of compaction, the thickness of recent varves is not directly comparable with those varves that are buried deeper in the sediment pile. This problem can be addressed by calculating a packing index (a simple ratio of the area occupied by sediment grains versus the area occupied by matrix in the varve BSE images) and then calculating the sediment accumulation rate on an annual basis (assuming a constant sediment density of 2.65 g/cm3, for quartz). This procedure compensates for compression of the sediment with depth, and results in a suppression of the trend over the last century seen in the varve thickness record (Figure 7).
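Their correction amounts to simple arithmetic. Here is a minimal Python sketch of the idea (the 2.65 g/cm3 density is theirs; the thickness and packing-index values are invented for illustration):

```python
QUARTZ_DENSITY = 2.65  # g/cm^3, assumed constant per Besonen et al

def mass_accumulation(thickness_cm, packing_index):
    """Annual mass accumulation (g/cm^2/yr) from varve thickness and
    packing index (fraction of BSE image area occupied by grains)."""
    return thickness_cm * packing_index * QUARTZ_DENSITY

# A thick but loosely packed recent varve vs a thin, compacted deep varve
recent = mass_accumulation(0.40, 0.30)
deep   = mass_accumulation(0.20, 0.60)
# Equal mass accumulation despite a 2x difference in raw thickness:
# compaction alone can create an apparent thickness trend.
```

The point of the toy numbers: a recent varve twice as thick as a deep one can carry exactly the same annual mass once packing is accounted for.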

An excerpt from Besonen’s Figure 7 is shown below. The top panel shows varve thickness unadjusted for compaction, while the bottom panel shows mass accumulation. The top panel (with no allowance for compaction) shows somewhat elevated 20th century levels, while the bottom panel (after allowing for compaction) does not – the phenomenon noted in their abstract.

besonen excerpt
Figure 1. Excerpt from Besonen et al Figure 7. Top – varve thickness unadjusted for compaction (but after removal of turbidites); middle – density; bottom – mass accumulation.

A related article the following year (Cook et al 2009, also with Bradley as coauthor), on a different Murray Lake core, made similar observations about compaction. Cook et al provided a temperature reconstruction using mass accumulation rate, showing a rather elevated MWP, as shown below. I’m not inclined to put much weight on simplistic reconstructions from varve thickness (see also my discussion of Gifford Miller’s observations on this topic), but I show this series as evidence that mass accumulation was used by this group as the relevant index.

cook excerpt re murray lake
Figure 2. Excerpt from Cook et al 2009.

Tingley and Huybers 2013 have three classes of proxy data: MXD data, which has the familiar divergence problem; ice core O18, which doesn’t have a Hockey Stick shape; and varves. The Murray Lake varve series is used. Tingley and Huybers provide an excellent SI, including exact URLs for data sets as used – a simple enough protocol that is unfortunately seldom observed. (They forgot to archive their actual reconstruction, though I presume that this is a mere oversight, since their intent is clearly to provide a comprehensive archive.)

In their SI, they state:

Details and references for the lake varve records used in the analysis are available in Table S.1. Unless the description of the data indicates otherwise, we use the total varve thickness.

Of the two Murray Lake versions (different cores taken by the same group), they cite the Besonen et al version.

For the Murray lake record, we use the unfiltered version of the shorter (1000 year) record posted at the NOAA Paleolimnology site [60].

However, when one compares their archive of data as used to the NOAA archive, one can immediately determine that they used varve thickness without compensating for compaction (the series match), rather than mass accumulation as used by Cook et al 2009 in their temperature reconstruction. This decision gives a pronounced upward bias, as shown by the difference between the two series (after taking logs and then scaling, as in Tingley and Huybers).

Murray uncompacted difference
Figure 3. Murray Lake. Difference between uncompacted varve thickness as used by Tingley and Huybers and mass accumulation. Both series logged and then scaled, before differencing.
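For readers wanting to replicate the comparison, here is a Python sketch of the log-then-scale-then-difference step (toy numbers, not the actual Murray Lake data):

```python
import math

def log_scale(series):
    """Log-transform, then scale to zero mean and unit standard deviation."""
    logged = [math.log(v) for v in series]
    m = sum(logged) / len(logged)
    sd = (sum((v - m) ** 2 for v in logged) / len(logged)) ** 0.5
    return [(v - m) / sd for v in logged]

thickness = [1.0, 1.1, 1.3, 1.8, 2.6]    # uncompacted thickness (toy, rising trend)
mass_acc  = [1.0, 1.05, 1.1, 1.1, 1.15]  # mass accumulation (toy, flatter)

diff = [a - b for a, b in zip(log_scale(thickness), log_scale(mass_acc))]
# Positive recent differences = upward bias from using raw thickness
```

Even after both series are scaled to unit variance, the accelerating thickness series ends more standard deviations above its mean than the flatter mass accumulation series, so the difference closes positive.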

I haven’t yet looked at how the other varve series handled compaction, but it seems like an important issue in any attempt to deduce temperatures from this sort of data. In the particular case of the Murray Lake series, it seems to me that the original data clearly “indicates” that mass accumulation be used as the index, rather than varve thickness unadjusted for compaction, and that mass accumulation should therefore have been used under Tingley and Huybers’ own stated methodology.

As previously noted, Tingley and Huybers also used the contaminated portion of the Korttajarvi sediment data, so there are multiple problems with their varve reconstruction. These are not complicated issues, but ones that ought to be within the scope of even Nature peer reviewers.

New Light on Svalbard

In 1997, the 121 m Lomonosovfonna ice core was drilled in Svalbard. As of mid-2009, when Hu McCulloch and I wrote CA posts on this core, nothing had been published on
O18 values prior to AD1400 nor had any Lomonosovfonna data been archived, even for the post-1400 period.

Both Hu McCulloch and I, in separate CA posts here and here, speculated that the withheld O18 values prior to AD1400 would show elevated values. A digital version of the pre-1400 data became available this week in connection with Hanhijarvi et al and confirmed our surmise, as shown below.

More from the Junior Birdmen

A new paper in Nature by Tingley and Huybers (h/t WUWT).

In keeping with the total and complete stubbornness of the paleoclimate community, they use the most famous series of Mann et al 2008: the contaminated Korttajarvi sediments, the problems with which are well known in skeptic blogs and which were reported in a comment at PNAS by Ross and me at the time. The original author, Mia Tiljander, warned against use of the modern portion of this data, as the sediments had been contaminated by modern bridgebuilding and farming. Although the defects of this series as a proxy are well known to readers of “skeptical” blogs, peer reviewers at Nature were obviously untroubled by the inclusion of this proxy in a temperature reconstruction.

tingley table s1

They stated:

For the Korttajarvi Lake record, we use the organic layer thickness, as the original publication indicates that a thicker organic layer “probably indicates a warmer summer and a relatively long growing season” [57- Boreas].

However, they didn’t mention the following:

This recent increase in thickness is due to the clay-rich varves caused by intensive cultivation in the late 20th century.

and again:

In the 20th century the Lake Korttajärvi record was strongly affected by human activities. The average varve thickness is 1.2 mm from AD 1900 to 1929, 1.9 mm from AD 1930 to 1962 and 3.5 mm from AD 1963 to 1985. There are two exceptionally thick clay-silt layers caused by man. The thick layer of AD 1930 resulted from peat ditching and forest clearance (information from a local farmer in 1999) and the thick layer of AD 1967 originated due to the rebuilding of the bridge in the vicinity of the lake’s southern corner (information from the Finnish Road Administration). Varves since AD 1963 towards the present time thicken because of the higher water content in the top of the sediment column. However, the gradually increasing varve thickness during the whole 20th century probably originates from the accelerating agricultural use of the area around the lake.

All of this was discussed ad nauseam following Mann et al 2008, though Mann stubbornly refused to concede anything. Kaufman et al 2009 also used the data and, on the advice of Overpeck, conceded the point and issued a corrigendum. Raymond Bradley was a coauthor of both papers and more or less simultaneously took the position that a corrigendum was required and not required.

I’m sure that we’ll be told that their use of contaminated Korttajarvi data doesn’t “matter” – nothing ever seems to. But why use it?

Steve Update Apr 11:
For R users, I’ve collated the Tingley proxies into a time series R-matrix called proxy.tab at http://www.climateaudit.info/data/multiproxy/tingley_2013 and their metadata as info_tingley.csv. A simple average of all the Tingley proxies is shown below. It has a divergence problem because the majority of proxies are MXD proxies.


Their Figure S34 top panel shows a reconstruction from MXD proxies alone. The reconstruction is very similar to an MXD average, as shown below.
Figure. Tingley and Huybers S34 top panel, showing one variation of their proxy-only reconstructions (MXD), with average of MXD proxies (green) for comparison.

Tingley has provided an exemplary archive. It requires a little collation. R users who wish to skip their own collation may use my collation as follows:

loc = "http://www.climateaudit.info/data/multiproxy/tingley_2013"
proxy = ts(read.table(file.path(loc, "proxy.tab")), start = 1400)  # collated proxies, annual grid
info = read.csv(file.path(loc, "info_tingley.csv"))                # metadata
count = ts(apply(!is.na(proxy), 1, sum), start = 1400)             # proxies reporting each year
annual = ts(apply(scale(proxy), 1, mean, na.rm = TRUE), start = 1400)  # average of normalized proxies
tsp(proxy)                    # 1400 2005
dim(info)                     # [1] 125  11
max(time(count)[count > 20])  # 1992
plot.ts(window(annual, end = max(time(count)[count > 20])), ylab = "SD Units")
title("Average of Normalized Tingley Proxies")

The Impact of TN05-17

TN05-17 is by far the most influential Southern Hemisphere core in Marcott et al 2013 – it’s Marcott’s YAD061, so to speak. Its influence is much enhanced by the interaction of short-segment centering in the mid-Holocene and non-robustness in the modern period. Marcott’s SHX reconstruction becomes worthless well before the 20th century, a point that they have not yet admitted, let alone volunteered.

Marcott’s TN05-17 series is a bit of an odd duck within his dataset. It is the only ocean core in which the temperature is estimated by the Modern Analogue Technique on diatoms; only one other ocean core uses the Modern Analogue Technique (MD79-257). The significance of this core was spotted early on by ^.

TN05-17 is plotted below. Rather unusually among Holocene proxies, its mid-Holocene values are very cold. Centering on 4500-5500 BP in Marcott style results in this proxy having very high anomalies in the modern period: closing at a Yamalian apparent anomaly of over 4 deg C.

TN05-17 base
Figure 1. TN05-17.

In the most recent portion of the Marcott SHX, there are 5 or fewer series, as compared to 12 in the mid-Holocene. Had the data been centered on the most recent millennium and extended back (e.g. by Hansen’s reference station method, a lowbrow method), then there would have been an extreme negative contribution from TN05-17 in the mid-Holocene, but its contribution to the average would have been less (divided by 12, instead of 4). As shown below, TN05-17 pretty much by itself contributes the positive recent values of the SHX reconstruction. Its closing anomaly (basis 4500-5500 BP) is 4.01 deg. There are 4 contributing series – so the contribution of TN05-17 to the SHX composite in 1940 is 4.01/4, more than the actual SHX value. The entire increase in the Marcott SHX from at least 1800AD on arises from the increased influence of TN05-17 – the phenomenon pointed out in my post on upticks.
TN05-17 contribution
Figure 2. Contribution of TN05-17 to the Marcott SHX reconstruction.
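The leverage effect is just arithmetic. A Python sketch using the closing anomaly and series counts quoted above:

```python
def contribution(anomaly, n_available):
    """A proxy's contribution to an unweighted composite is its anomaly
    divided by the number of series still reporting at that step."""
    return anomaly / n_available

# TN05-17 closing anomaly (basis 4500-5500 BP) per the post
mid_holocene = contribution(4.01, 12)  # when 12 series report
modern_1940  = contribution(4.01, 4)   # when only 4 remain
# Same proxy, triple the leverage: dropout of other series amplifies
# TN05-17's influence on the modern end of the composite.
```

Nothing about TN05-17 itself changed between the two cases; only the denominator did.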

Given the overwhelming importance of this proxy, one would like to know a little more about it. The next graphic compares TN05-17 to two other SHX proxies, also MAT proxies but from small lakes in southern New Zealand. The inconsistency of the proxies is evident. The New Zealand paleolimnological proxies have nothing resembling the mid-Holocene “cold period” that characterizes TN05-17. One thing that this graphic shows for sure: the residuals of these proxies from the “true” temperature history as translated to the respective sites do not remotely resemble a low-order AR1 process. To properly model the error distribution, one has to have an error model that permits excursions for millennia – not at all easy to specify.

SHX MAT proxies
Figure 3. Three SHX Modern Analogue Technique Proxies

It appears highly probable that there is some confounding influence on TN05-17. TN05-17 was cored in the Atlantic sector of the Southern Ocean south of Africa. As shown in the graphic below, it is located in very large scale “sediment drifts”.

agulhas drift annotated
Figure 4. Location map of TN05-17 (shown as red dot.)

Nielsen et al 2004 observed that the alkenone temperatures of the most recent samples are several degrees higher than ocean temperatures in the area. They speculated that some of the coretop might be missing – not particularly reassuring when this proxy is the most important contributor not just to 20th century but also to 19th century SHX Marcott warming. There is occasional discussion in the specialist literature of circumstances in which alkenone temperatures are warmer than local ocean temperatures, e.g. Ruhlemann et al, with currents carrying the alkenones from the warmer location in which they formed to the colder place where they settled:

We suggest that the southern samples are biased by suspended organic detritus originating from the cold subpolar waters of the northward flowing Malvinas Current, whereas the northern samples carry an UK’37 signal of tropical/subtropical origin, transported southward with the Brazil Current. On the basis of surface ocean transport pathways and velocities simulated with the large-scale geostrophic (LSG) ocean general circulation model, we identify areas of the world ocean where alkenone temperatures are potentially biased to higher or lower values due to long particle residence times and lateral advection by surface currents.

The area studied by Ruhlemann et al was in the western South Atlantic between 30 and S. Could something similar be going on in the eastern South Atlantic in the area of TN05-17 (50S, 6E)? It seems entirely possible to me. There is convincing evidence that there have been secular changes in the Agulhas currents over the Holocene. The TN05-17 history certainly suggests secular changes to me: it looks like ocean currents have changed in this sector over the Holocene, such that alkenone drift (along the lines of the South American alkenone drift) has contributed to the warm values in the early Holocene and later Holocene, while colder currents were present in the mid-Holocene.

Whatever is right or wrong about Marcott et al, merely from a perspective of craftsmanship, it is not particularly reassuring that the main (YAD061, even) contribution to modern SHX warming in the Marcott reconstruction appears to arise from a “cold” mid-Holocene interval at TN05-17, translated into modern warming through short-segment mid-Holocene centering and modern proxy dropout.

Alkenone Divergence offshore Iceland

The longest very high-resolution alkenone core that I’m aware of is Sicre et al’s MD99-2275 (plus splices) from offshore Iceland (67N 18W). It is 4550 years long, its most recent value is 2001AD and its resolution is 4 years. Marcott used nearby core JR51GC-35 (also at 67N 18W), also an alkenone record, which had a resolution of 110 years and a most recent Marcott date of 1836AD.

Here is how the two series compare over the 4500 years covered by the Sicre et al record (originally published in 2008, but updated in 2012). (The NOAA archive is unfortunately inadequate as it does not include depth or identify splice points.)

iceland 67N 18W modern
Figure 1. 67N 18W Offshore Iceland. Comparison of Marcott et al series to high-resolution series.

Taking a longer view, here are the two series compared over the Holocene.

Figure 2. 67N 18W Offshore Iceland. Comparison of Marcott et al series to high-resolution series.

Finally, here’s a zoom into the modern period.
iceland 67N 18W highres
Figure 3. As above. Zoom in.

In geophysical surveying, one tries to use the best quality surveys where available and benchmark lower quality surveys against the highest quality ones. If this methodology were used here, the errors in JR51GC-35 are obviously very large, much larger than those arising from the alkenone calibration equation by itself, though other factors could be at work as well.

MD99-2275 has well-dated core going deep into the Holocene. Hopefully, Sicre and other specialists will continue their commendable program. Seeing if these results can be replicated in another core would also do much to increase confidence.

The alkenone divergence problem is clearly present in this data. Alkenone-estimated temperatures in the 20th century continued to decline. In the Marcott reconstruction, JR51GC-35 makes its last (very negative) contribution in the 1820 step. By the act of no longer participating, it causes the Marcott composite to go up in the next period, even though it appears that the “true” alkenone-estimated temperature in the area continues to decline.

I mentioned MD99-2275 as a high-resolution core in my notes on AGU 2006. Like McGregor’s Cape Ghir (used inverted), it was one of the proxies in Trouet et al 2009 discussed at CA here. Here is a figure showing the updated Sicre version against the Trouet et al illustration. The Sicre version is in cyan (versus the “Iceland” series in blue). The divergence in the present series continues further than in the Trouet version.

Figure 4. Excerpt from Trouet et al 2009, showing updated Sicre et al series.

Alkenone Divergence in Peru

Gutierrez et al (GRL 2011; pdf here, data here) is another very high resolution alkenone series that is well-dated in the 20th century. It was taken in an upwelling zone offshore Peru at a similar latitude to Quelccaya.

Like the high-resolution series offshore Morocco and Namibia, it shows a sharp decline in alkenone-estimated SST in the 20th century, as illustrated below. (The archived data has a little more coverage – back to ~1750.)

B0406 alkenone
Excerpt from Gutierrez et al 2011.

The authors survey temperature data at nearby stations (Callao, Pisco and a few others) and report slight cooling in the late 20th century. They suggest “ERA 40 reanalysis indicates its link [cooling] with intensified alongshore winds driving upwelling in spring”.

The closest Marcott proxy is GeoB7139-2, taken offshore Chile at approximately 30S. (A closer comparison would be nice.) This proxy is shown below. It has only two radiocarbon dates in the entire Holocene and a resolution of only ~520 years. (Stated data selection criteria are that “at least four age-control points span or closely bracket the full measured interval” and “sampling resolution is typically better than ~300 yr”.)

Figure 2. GeoB 7139-2 per Marcott.

The next graphic shows GeoB7139-2 together with B0604 (the latter offset by 4 deg C).
B06045 comparison
Figure 3. B0604 (offset 4 deg C) and GeoB7139-2.

Although we’ve been reassured by Marcott apologists of the ability of their data and method to capture any past upspike, one feels that this particular series will not be much help in that enterprise.

In Lonnie Thompson’s recent Quelccaya publication, Thompson estimated Nino3-4 SST. Needless to say, they don’t bear much similarity to the alkenone SSTs shown here.

Stepping back from Marcott, increased upwelling in the late 20th century seems to have occurred all over the globe. In addition to the upwelling sites surveyed here in the last couple of days (Morocco, Namibia, Peru), I’ve also seen 20th century declines in alkenone data offshore Iceland and, of course, in the two Marcott series where the decline was deleted: MD03-2421 offshore Japan and OCE326-GGC offshore eastern Canada, both of which I’ll now reconsider with this in mind.

Alkenone Divergence

High-resolution alkenone ocean cores with 20th century coverage are disappointingly scarce, but a few exist. Given the importance of this class of proxy in Marcott et al, one would have thought the performance of high-resolution alkenones in the 20th century would have been of interest to Marcott et al, but they were silent on the topic.

Before Climategate, I’d considered high-resolution ocean cores from time to time at Climate Audit (see alkenone tag). In 2007, I reported on a very high-resolution alkenone series offshore Morocco (about 30N) by Helen McGregor (see here and here). This dataset had a serious divergence problem, i.e. the water was getting colder. McGregor worried that fish in the area might need swimming lessons to cope with the rapid change.

McGregor’s series was cited in Leduc et al (2010), a specialist presentation of another high-resolution alkenone series (GeoB 8331) taken in the Benguela upwelling zone offshore Namibia at about 30S. Like McGregor, they also found sharply cooler SST in the late 20th century as measured by well-dated alkenone data, indicating that the alkenone divergence problem was not unique to McGregor’s site:
leduc benguela sst
Figure 1. From Leduc et al 2010.

The closest Marcott series to McGregor’s Morocco series was the Iberian Margin (#51) D13822 alkenone series (Abrantes et al). The graphic below compares the two series: the shorter (warmer) high-resolution McGregor series in red and the longer series used by Marcott in black. The green “rug” marks at top are D13822 radiocarbon dates.


The next graphic shows the modern portion of these two series, both offset to facilitate comparison. In this graphic, I’ve shown both the published and Marcott dates for the Iberian Margin series, together with the high-resolution McGregor series. All show very pronounced closing downticks, with the well-dated McGregor series placing the Morocco decline in the 20th century and even the last half of the 20th century, Marcott dating the decline in the Iberian Margin to the 19th century (with a dating error of 100-150 years) and Rodrigues et al dating the decline to the 15th century. An obvious question is whether the downturns in the Iberian Margin and Morocco series are contemporary or phased. (Note that the removal of D13822 after the late 19th century contributed to the Marcott 20th century uptick.)


Both sites have very high accumulation rates. McGregor’s box core GeoB6008-1 accumulated 32 cm in less than a century (1912-1998), while the Iberian Margin D13822 is estimated to have accumulated about 25 cm per century (about 20 times higher than many cores). The top sample for D13822 is at 10 cm and is dated 57BP by Marcott and 442BP in the original publication (which I haven’t seen yet). The closest radiocarbon date is at 257 cm (calibrated 1511 BP), i.e. it is not closely dated. For comparison, the 10 cm sample in closely-dated GeoB6008-1 is dated to 1981AD, and to 1947AD in GeoB6008-2.
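To put these rates in perspective, a quick calculation (Python; rates as quoted above, assuming linear accumulation, which is my simplification):

```python
def years_per_cm(cm_per_century):
    """Years of deposition represented by each cm of core."""
    return 100.0 / cm_per_century

# GeoB6008-1: 32 cm in 86 years (1912-1998)
geob = (1998 - 1912) / 32.0            # ~2.7 yr/cm

# D13822: ~25 cm per century -> 4 yr/cm, so its 10 cm top sample
# spans only a few decades of deposition...
d13822_top_age_span = 10 * years_per_cm(25)

# ...yet the nearest radiocarbon control is 247 cm deeper, i.e. roughly
# a millennium of extrapolation at this accumulation rate:
extrapolation = (257 - 10) * years_per_cm(25)
```

Which is why a ~385-year discrepancy between Marcott’s date (57BP) and the published date (442BP) for the same 10 cm sample is unsurprising: the top of D13822 is dated by long extrapolation, not by direct age control.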

Trouet et al 2009

As an amusing sidebar, we also discussed the McGregor alkenone series as it was used in a multiproxy study by Trouet et al 2009 – see CA discussion here. Trouet “solved” the divergence problem in best Mann et al 2008 style: by turning the series upside down. The red series labelled “Cape Ghir” and going sharply upward in the excerpt from Trouet et al 2009 is the McGregor alkenone series with sharply colder temperatures.

Excerpt from Trouet et al 2009.

The practice of multiproxy authors turning indicators of cold SST upside down had been previously criticized at CA in connection with Moberg et al 2005, which used a proxy series showing increased presence of coldwater (polar) foraminifera (% G Bulloides) in the Arabian Sea as one of their most potent indicators of global warming.

The closest Marcott series to the Leduc et al 2010 Benguela series shown above is Farmer’s ODP1084B (also used in Loehle and McCulloch and discussed in Schmidt’s critique of Loehle and McCulloch.) I’ll discuss this interesting downspike on another occasion.

In general, 20th century downspikes in high-resolution alkenone series seem to be the rule, rather than the exception – a divergence problem that is not discussed in the multiproxy studies. The most plausible reason is that high resolution in the 20th century requires high accumulation rates, which require high biological productivity, which, in turn, is most characteristic of upwelling zones. Increased upwelling in upwelling zones seems to be rather pronounced in the 20th century. This is not inconsistent with overall warming, but neither is it an issue that multiproxy jockeys can simply brush aside.

Clearly distinguished

[Update 04/08/13: Josh once again has a brilliant view of what’s going on here and in some other cases. Enjoy!]


No researchers in this field have ever, to our knowledge, “grafted the thermometer record onto” any reconstruction. It is somewhat disappointing to find this specious claim (which we usually find originating from industry-funded climate disinformation websites) appearing in this forum. Most proxy reconstructions end somewhere around 1980, for the reasons discussed above. Often, as in the comparisons we show on this site, the instrumental record (which extends to present) is shown along with the reconstructions, and clearly distinguished from them.

Michael E. Mann


Marcott’s Dimple: A Centering Artifact

One of the longstanding CA criticisms of paleoclimate articles is that scientists with little-to-negligible statistical expertise too frequently use ad hoc and homemade methods in important applied articles, rather than first establishing their methodology in the applied statistical literature on examples other than the one that they’re trying to prove.

Marcott’s uncertainty calculation is merely the most recent example. Although Marcott et al spend considerable time and energy on their calculation of uncertainties, I was unable to locate a single relevant statistical reference either in the article or the SI (nor indeed a statistical reference of any kind.) They purported to estimate uncertainty through simulations, but simulations in the absence of a theoretical framework can easily fail to simulate essential elements of uncertainty. For example, Marcott et al state of their simulations: “Added noise was not autocorrelated either temporally or spatially.” Well, one thing we know about residuals in proxy data is that they are highly autocorrelated both temporally and spatially.

Roman’s post drew attention to one such neglected aspect.

Very early in discussion of Marcott, several readers questioned how Marcott “uncertainty” in the mid-Holocene could possibly be lower than uncertainty in recent centuries. In my opinion, the readers are 1000% correct in questioning this supposed conclusion. It is the sort of question that peer reviewers ought to have asked. The effect is shown below (here uncertainties are shown for the Grid5x5 GLB reconstruction) – notice the mid-Holocene dimple of low uncertainty. How on earth could an uncertainty “dimple” arise in the mid-Holocene?

Figure 1. “Uncertainty” from Marcott et al 2013 spreadsheet sheet 2. Their Figure 1 plots results with “1 sigma uncertainty”.

It seems certain to me that the “uncertainty” dimple is an artifact of their centering methodology, rather than of the data.

Marcott et al began their algorithm by centering all series between BP4500 and BP5500 – these boundaries are shown as dotted lines in the above graphic. The dimple of low “uncertainty” corresponds exactly to the centering period. It has no relationship to the data.
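The effect is easy to reproduce. The following Python simulation (my own illustrative sketch, not Marcott’s code; the random-walk pseudo-proxies and window choices are mine) centers each series on a fixed window and then measures cross-series spread at each time step:

```python
import random
random.seed(0)

T, N = 100, 20    # time steps, number of pseudo-proxies
w0, w1 = 45, 55   # centering window (analogue of 4500-5500 BP)

# Independent random walks standing in for proxy series
series = []
for _ in range(N):
    x, walk = 0.0, []
    for _ in range(T):
        x += random.gauss(0, 1)
        walk.append(x)
    series.append(walk)

# Center each series on its own mean over the window, Marcott-style
centered = []
for walk in series:
    base = sum(walk[w0:w1]) / (w1 - w0)
    centered.append([v - base for v in walk])

def spread(t):
    """Cross-series standard deviation at time t."""
    vals = [s[t] for s in centered]
    m = sum(vals) / N
    return (sum((v - m) ** 2 for v in vals) / N) ** 0.5

inside = sum(spread(t) for t in range(w0, w1)) / (w1 - w0)
outside = sum(spread(t) for t in range(T) if not w0 <= t < w1) / (T - (w1 - w0))
# inside << outside: the low-"spread" dimple tracks the centering
# window, not the data.
```

Run with any seed, the dimple sits exactly on the centering window, because centering forces every series through approximately zero there by construction.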

Arbitrary re-centering and re-scaling is embedded so deeply in paleoclimate that none of the practitioners even seem to notice that it is a statistical procedure with inherent estimation issues. In real statistics, much attention is paid to taking means and estimating standard deviation. The difference between modern (in some sense) and mid-Holocene mean values for individual proxies seems to me to be perhaps the most critical information for estimating the difference between modern and mid-Holocene temperatures, but, in effect, Marcott et al threw out this information by centering all data on the mid-Holocene.

Having thrown out this information, they then have to link their weighted average series to modern temperatures. They did this by a second re-centering, this time adjusting the mean of their reconstruction over 500-1450 to the mean of one of the Mann variations over 500-1450. (There are a number of potential choices for this re-centering, not all of which yield the same rhetorical impression, as Jean S has already observed.) That the level of the Marcott reconstruction should match the level of the Mann reconstruction over 500-1450 proves nothing: they match by construction.

The graphic shown above – by itself – shows that something is “wrong” in their estimation of uncertainty – as CA readers had surmised almost immediately. People like Marcott are far too quick to presume that “proxies” are a signal plus simple noise. But that’s not what one actually encounters: the difficulty in the field is that proxies all too often give inconsistent information. Assessing realistic uncertainties in the presence of inconsistent information is a very non-trivial statistical problem – one that Marcott et al, having taken a wrong turn somewhere, did not even begin to deal with.

I’m a bit tired of ad hoc and homemade methodologies being advanced by non-specialists in important journals without being established in applied statistical journals. We’ve seen this with the Mannian corpus. Marcott et al make matters worse by failing to publish the code for their novel methodology so that interested readers can quickly and efficiently see what they did, rather than try to guess at what they did.

While assembling Holocene proxies on a consistent basis seems a useful bit of clerical work, I see no purpose in publishing an uncertainty methodology that contains such an obviously bogus artifact as the mid-Holocene dimple shown above.

The Quelccaya Update

Lonnie Thompson has done a much better job of archiving data for his recent Quelccaya update – see NOAA here – both in terms of information and promptness.

Quelccaya is familiar territory for Thompson, as it was the location of his first tropical ice cores (1983) and his first publication of this type. Thompson published a first update of Quelccaya d18O values in 2006 (PNAS), but only 5-year average data and only to the late 1990s. The new dataset gives annual data from 226 to 2009 (annual data was previously available from the 1983 cores from 470 to 1983).

Below is a graphic showing the twentieth century on, compared to the PNAS 2006 five-year data. The extension covers the big 1998-99 El Nino, shown with a dotted red line. Since 1998-99 is known to be an exceptionally warm year, it is interesting to observe that it is manifested at Quelccaya as a negative downspike.


There has been a longstanding dispute about whether d18O at Quelccaya and other tropical glaciers is a proxy for temperature or for the amount of precipitation. In monsoon region precipitation, negative d18O values show rain-out. Quelccaya d18O has been (IMO plausibly) interpreted by Hughen as evidence of north-south migration of the ITCZ, with Hughen comparing Quelccaya information particularly to information from Cariaco, Venezuela.

It seems to me that, among specialists, Thompson is probably standing fairly alone in claiming that d18O at tropical glaciers is a proxy for temperature rather than amount effect. (Because of Thompson’s eminence, the contradiction of his results is mostly implied, rather than directly stated.) Despite these reservations among specialists, Thompson’s d18O records have been widely cited by Mann and other multiproxy jockeys (both directly and through the Yang composite) and are important contributors to some of the AR4 Hockey Sticks. “Dr Thompson’s Thermometer” was proclaimed in Inconvenient Truth as supposedly vindicating the Mann Hockey Stick, although the graphic shown in AIT was merely the Mann hockey stick wearing whiskers, so naturally it confirmed itself.

Because the 1998 El Nino was so big, it provides a good test case for temperature vs amount. It seems to me that the negative downspike for the big 1998 El Nino is decisive against Thompson.

The PNAS version of the data ended while showing a sort of uptick. The extension to 2009 does not seem to me to be going off the charts.

Update (Apr 8): Here is a comparison of Quelccaya d18O to HadCRU GLB (both scaled over the 20th century). I’ve used GLB because Quelccaya is used to deduce global temperatures in multiproxy studies, not temperatures at Cuzco. Quelccaya d18O values obviously do not capture the temperature trend. Marcott/Mann defenders say that we don’t need proxies to know that temperature has gone up in the 20th century. Quite so. Quelccaya was not a Marcott proxy, but it was important in Mann et al 2008 and other multiproxy reconstructions. What does this sort of thing really tell us?
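The “scaled over the 20th century” step can be sketched as follows. This is a minimal illustration assuming simple standardization (zero mean, unit variance over the calibration period); the series are dummy stand-ins, not the actual Quelccaya or HadCRU values, and the function name is my own.

```python
# Hedged sketch: standardize two series over a common calibration period so
# their trends can be compared on one plot. Dummy data only.
import statistics

def scale_over_period(series, calibration_values):
    """Standardize a series using the mean and standard deviation computed
    over a calibration period (e.g. the 20th-century overlap)."""
    mu = statistics.mean(calibration_values)
    sigma = statistics.stdev(calibration_values)
    return [(v - mu) / sigma for v in series]

# Illustrative dummy series (stand-ins for proxy and instrumental records):
proxy = [-18.0, -17.9, -18.3, -17.6, -18.1, -18.4, -17.8, -18.2, -17.7, -18.0]
temps = [-0.30, -0.25, -0.35, -0.20, -0.28, -0.40, -0.22, -0.33, -0.21, -0.26]

proxy_scaled = scale_over_period(proxy, proxy)
temps_scaled = scale_over_period(temps, temps)
# After scaling, both series have mean ~0 and stdev ~1 over the period,
# putting them in comparable units for a single overlay plot.
```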


Anthony’s coverage of the release of this data prompted some discussion of the Thompsons as serial non-archivers, referring to my post here. It is worth commending Thompson for prompt archiving of the present data, but that does not refute past criticism of both Ellen and Lonnie. (I note that Thompson has mitigated some of that criticism by archiving some data on old cores, even within the past year.)

The post in question actually was directed at Ellen Mosley-Thompson, who, as far as I can tell, has not archived a single data set in which she was lead PI in over 30 years in the business. I stated the following:

She has spent her entire career in the ice core business. According to her CV, she has led “nine expeditions to Antarctica and six to Greenland to retrieve ice cores”. However, a search of the NOAA paleo archive for data archived by Ellen Mosley-Thompson shows only one data set from Antarctica or Greenland associated with her. Lest this example be taken to mar her otherwise unblemished record of non-archiving, the data was published in 1981 while she was still junior and, according to its readme, it was transcribed by a third party and contributed in her name. I believe it is fair to say that she has not archived at NOAA (or, to my knowledge, elsewhere) any data from the “nine expeditions to Antarctica and six to Greenland”.

I did a fairly thorough review of Thompson’s non-archiving as of July 2012 here. Nick Stokes at WUWT claimed that my posts were refuted by his being able to locate Thompson data at NOAA. Unfortunately, this is the sort of misdirection that is all too prevalent in the field.

I am obviously aware of the NOAA archive. While, like anyone else, I make my share of mistakes, the odds of me being wrong in the trivial way that Stokes asserted are negligible. While Ellen is listed as a co-contributor on expeditions led by Lonnie, the above statement is true as written.

Nor does Nick’s location of NOAA archives (which I know intimately) refute my criticisms of Thompson’s archive here. The Lonnie situation is much less bad than when I started criticizing him: when I first got interested, no data for Dunde, Guliya or Dasuopu had been archived and Thompson blew off requests for data. Matters are less bad, but still very unsatisfactory. Inconsistent grey versions of Dunde and other series are in circulation. This can only be sorted out by archiving all samples together with dating criteria. I’ve characterized such an archive as Thompson’s legacy – something that he should be proud of and not resist.

I’ve also strongly criticized Thompson’s failure to archive the Bona-Churchill data, sampled long before the recent Quelccaya data. This data was already overdue in 2006, when I first criticized its non-publication and non-archiving. At the time, I observed (somewhat acidly, I’ll admit) that if the data had had a big upspike in the late 20th century, Thompson would have press-released and published it. Because the dog didn’t bark, I predicted that the data went the “wrong” way. Seven years later, Thompson still hasn’t published Bona-Churchill, though results shown at a workshop a number of years ago indicated that they did indeed go the “wrong” way, as I had surmised.