PAGES2K and Nature’s Policy against Self-Plagiarism

Nature’s policies on plagiarism state:

Duplicate publication, sometimes called self-plagiarism, occurs when an author reuses substantial parts of his or her own published work without providing the appropriate references.

The description of the Australasian network of PAGES2K (coauthors Gergis, Neukom, Phipps and Lorrey) is lifted almost entirely, in verbatim or near-verbatim chunks, from Gergis et al 2012 (withdrawn and under re-review), in apparent violation of Nature’s policy against self-plagiarism.



Gergis2K and the Oroko “Disturbance-Corrected” Blade

Only two Gergis proxies (both tree ring) go back to the medieval period: Oroko Swamp, New Zealand and Mt Read, Tasmania, both from Ed Cook. Although claims of novelty have been made for the Gergis reconstruction, neither of these proxies is “new”: both were illustrated in AR4, and Mt Read was used as early as Mann et al 1998 and Jones et al 1998.

In today’s post, I’ll look in more detail at the Oroko tree ring chronology, which was used in three technical articles by Ed Cook (Cook et al 2002 Glob Plan Chg; Cook et al 2002 GRL; Cook et al 2006) to produce temperature reconstructions. In the earliest of these (2002 Glob Plan Chg), Cook showed a tree ring chronology which declined quite dramatically after 1957. He reported a very high correlation to instrumental summer temperature (Hokitika, South Island NZ) between 1860 and 1957, followed by a “collapse” in correlation after 1957 – a decline that Cook attributed to logging at the site. For his reconstruction of summer temperature, Cook “accordingly” replaced the proxy estimate with instrumental temperature after 1957, an artifice clearly marked in Cook’s original articles, but not necessarily in downstream multiproxy uses.
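To make the nature of the “artifice” concrete, here is a minimal sketch of a Cook-style splice. All values below are invented placeholders; Cook’s actual chronology and the Hokitika instrumental record are not reproduced here.

    import numpy as np

    # Invented stand-ins for a tree ring chronology and an instrumental summer
    # temperature series on a common annual time axis (illustration only).
    years = np.arange(1860, 2000)
    rng = np.random.default_rng(0)
    chronology = rng.normal(size=years.size)     # proxy-based temperature estimate
    instrumental = rng.normal(size=years.size)   # instrumental temperature anomalies

    # Cook-style substitution: keep the proxy-based estimate up to 1957,
    # then replace the post-1957 portion with the instrumental record.
    spliced = np.where(years <= 1957, chronology, instrumental)

Any downstream user who treats the spliced series as a proxy is, after 1957, simply handling instrumental temperature.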

Gergis et al 2012 (which corresponds to PAGES2K up to a puzzling one-year offset) said that they used “disturbance-corrected” data for Oroko:

“for consistency with published results, we use the final temperature reconstructions provided by the original authors that includes disturbance-corrected data for the 213 Silver Pine record… (E. Cook, personal communication)”

By “disturbance correction”, do they mean the replacement of proxy data after 1957 by instrumental data? Or have they employed some other method of “disturbance correction”?

Assessment of this question is unduly complicated because Cook never archived Oroko measurement data or, for that matter, any of the chronology versions or reconstructions appearing in the technical articles.  Grey versions of the temperature reconstruction (but not chronology) have circulated in connection with multiproxy literature (including Mann and Jones 2003, Mann et al 2008, Gergis et al 2012 and PAGES2K 2013).  In addition, two different grey versions occur digitally in Climategate letters from 2000 and 2005, with the later version clearly labeled as containing a splice of proxy and instrumental data.  The Gergis version is clearly related to the earlier grey versions, but, at present, I am unable to determine whether the “disturbance correction” included an instrumental splice or not.

There’s another curiosity. As noted above, Cook originally claimed a high correlation to instrumental temperature up to at least 1957, and, based on their figures, the correlation to 1999 would still have been positive, even if attenuated. Yet Mann and Jones 2003 reported a negative correlation (-0.25) to instrumental temperature, while Gergis et al 2012 obtained the opposite result, once again asserting a statistically significant positive correlation to temperature. To the extent that instrumental data had been spliced into the Gergis version, one feels that claims of statistical significance ought to be qualified. Nonetheless, the negative correlation claimed in Mann and Jones 2003 is puzzling: how did they obtain the opposite sign to Cook’s original study?
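The concern about significance claims can be illustrated with entirely synthetic data: once the post-1957 portion of a series is literally the instrumental record, a correlation computed over any window including those years is partly a correlation of temperature with itself. This is an illustration of the concern only; whether the version screened by Gergis et al actually contained an instrumental splice is the open question.

    import numpy as np

    rng = np.random.default_rng(42)
    years = np.arange(1900, 2000)
    temp = rng.normal(size=years.size)    # synthetic "instrumental" temperature
    proxy = rng.normal(size=years.size)   # synthetic proxy, unrelated to temp by construction

    spliced = np.where(years <= 1957, proxy, temp)   # instrumental substituted after 1957

    print(round(np.corrcoef(proxy, temp)[0, 1], 2))    # near zero: no real relationship
    print(round(np.corrcoef(spliced, temp)[0, 1], 2))  # noticeably positive, driven by the spliced years alone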

As to the Oroko proxy itself, it does not have anything like an HS shape. It has considerable centennial variability. Its late 20th century values are somewhat elevated (about 1 sigma, smoothed, on a 1200-1965 basis), but nothing like the Gergis 4-sigma anomaly. It has no marked LIA or MWP. It has elevated values in the 13th century, but low values in the 11th century, the main rival to the late 20th century, and these low 11th century values attenuate reconstructions in which 11th and 20th century values are otherwise close. The HS-ness of the Gergis2K reconstruction does not derive from this series.

The Oroko Swamp site is on the west (windward) coast of South Island, New Zealand, at 43°S and low altitude (110 m). In December 2012, during family travel to New Zealand’s South Island, we visited a (scenic) fjord on the west coast near Manapouri (about 45°S). These are areas of constant wind and very high precipitation. They are definitely nowhere near altitude or latitude treelines. Cook himself expressed surprise that a low-altitude chronology would be correlated to temperature, but was convinced by the relationship (see below).

In today’s post, I’ll parse the various versions far more closely than will interest most (any reasonable) readers.  I got caught up trying to figure out the data and want to document the versions while it’s still fresh in my mind.

Gergis and the PAGES2K Regional Average

The calculation of the PAGES2K regional average contains a very odd procedure that has thus far escaped commentary. The centerpiece of the PAGES2K program was the calculation of regional reconstructions in deg C anomalies. Having done these calculations, most readers would presume that the area-weighted average (deg C) would simply be the weighted average of these regional reconstructions already expressed in deg C.

But this isn’t what they did. Instead, they first smoothed by taking 30-year averages, then converted the smoothed deg C regional reconstructions to SD units (basis 1200-1965) and took an average in SD units, converting the result back to deg C by “visual scaling”.
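Here is a sketch of the two recipes as I understand the description. The 30-year averaging, the conversion to SD units over a 1200-1965 basis and the area weights come from the PAGES2K text; the exact smoothing details and the final “visual scaling” back to deg C are not reproducible from the paper, so the functions below are placeholders.

    import numpy as np

    def smooth_30yr(x):
        # Placeholder for PAGES2K's 30-year averaging (here a centered moving average).
        return np.convolve(x, np.ones(30) / 30, mode="same")

    def to_sd_units(x, years, lo=1200, hi=1965):
        # Convert a deg C series to SD units over the 1200-1965 basis period.
        base = x[(years >= lo) & (years <= hi)]
        return (x - base.mean()) / base.std()

    def straight_average(regions, weights):
        # What most readers would expect: area-weighted average of the deg C reconstructions.
        return np.average(regions, axis=0, weights=weights)

    def pages2k_style_average(regions, weights, years):
        # Smooth, convert each regional series to SD units, then average in SD units.
        # The conversion of the result back to deg C by "visual scaling" is omitted here.
        scaled = [to_sd_units(smooth_30yr(r), years) for r in regions]
        return np.average(scaled, axis=0, weights=weights)

A region whose deg C amplitude is modest but whose 1200-1965 variability is also small gets stretched by the SD conversion, which is how a mild blade in deg C can emerge as a large one in SD units.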

This procedure had a dramatic impact on the Gergis reconstruction. Expressed in deg C and as illustrated in the SI, it has a very mild blade. But the peculiar PAGES2K procedure amplified this relatively small-amplitude reconstruction into a monster blade with a 4-sigma closing value. Following the Arctic2K non-corrigendum correction, it is the largest blade in the reconstruction (and carries the greatest area weight).

I’ll show this procedure in today’s post.


The Kaufman Tautology

The revised PAGES2K Arctic reconstruction used 56 proxies (down three from the original 59). Although McKay and Kaufman 2014 didn’t mention the elephant-in-the-room changes in their reconstruction (as discussed at CA here and here), they reported with some satisfaction that “decadal-scale variability in the revised [PAGES2K] reconstruction is quite similar to that determined by Kaufman et al. (2009)”, presumably thinking that this replication in the larger dataset was evidence of robustness, at least of this property of the data. However, while the decadal-scale similarity is real enough, it is more a tautology than evidence of robustness, as 16 of the most highly weighted PAGES2K proxies come from the Kaufman et al 2009 network (the 22 Kaufman 2009 proxies being assigned over 80% of the total weight and the other 34 proxies under 20%).
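The arithmetic of why this is close to a tautology can be seen with invented data; the only numbers carried over from the post are the 22/34 proxy split and the roughly 80/20 division of weight.

    import numpy as np

    rng = np.random.default_rng(7)
    n_old, n_new, n_years = 22, 34, 200                  # 22 Kaufman-2009 proxies, 34 others
    proxies = rng.normal(size=(n_old + n_new, n_years))  # synthetic proxies, mutually unrelated

    # Assign ~80% of the total weight to the 22 carried-over proxies and ~20% to the rest.
    weights = np.concatenate([np.full(n_old, 0.8 / n_old), np.full(n_new, 0.2 / n_new)])

    revised = weights @ proxies                   # weighted composite of all 56 series
    kaufman_only = proxies[:n_old].mean(axis=0)   # composite of the carried-over proxies alone

    print(round(np.corrcoef(revised, kaufman_only)[0, 1], 2))  # very high (~0.98 expected) despite unrelated series

With the shared proxies carrying most of the weight, close agreement between the two composites is built in, independent of anything in the data.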

Warmest since, uh, the Medieval Warm Period

The money quote in the PAGES2K abstract was that there wasn’t any worldwide Little Ice Age or Medieval Warm Period and that AD 1971–2000 temperatures were the highest in nearly 1,400 years, a period reaching back long before the Medieval period:

There were no globally synchronous multi-decadal warm or cold intervals that define a worldwide Medieval Warm Period or Little Ice Age … during the period ad 1971–2000, the area-weighted average reconstructed temperature was higher than any other time in nearly 1,400 years.

In today’s post, I’ll show that the knock-on impact of changes to the Arctic reconstruction on the area-weighted average also makes the latter claim untrue. Incorporating the revised Arctic reconstruction, one can however say that, during the period AD 1971–2000, the area-weighted average reconstructed temperature was higher than any other time since, uh, the Medieval Warm Period.

The Third Warmest Arctic Century

PAGES2K (2013) unequivocally stated that the Arctic was “warmest during the 20th century”:

The Arctic was also warmest during the twentieth century, although warmer during 1941–1970 than 1971–2000 according to our reconstruction.

McKay and Kaufman 2014 did not withdraw or amend the above statement, instead reporting that the revision amplified the cooling trend prior to the 20th century and had only a “fairly minor impact on the relative variability” of the reconstruction. However, in the corrected reconstruction, the 20th century is only the third warmest. (I do not conclude from their data and methods that this is necessarily true, only that the assertion in the original article is not supported by the revised reconstruction.)

Gavin Schmidt and the EPA Denial Decision

About eight weeks ago, Jean S postulated that Gavin Schmidt had been involved in writing the documents supporting EPA’s decision denying various petitions for reconsideration of the Endangerment Finding (the “RTP documents“), documents that Mann had cited to the D.C. Court as a supposedly  “independent” investigation into allegations against him. Obviously, if Schmidt had been involved in the evaluation of evidence for EPA, any claim to “independence” of the EPA’s supposed investigation would be risible.

Jean S directly asked Schmidt, but Schmidt ignored the question.

However, Jean S’ post led to the discovery of new and convincing evidence of Schmidt’s involvement in the RTP documents, which I’ll report today for the first time. Searching for an answer also revealed that EPA appears to have violated federal peer review policies in respect of the peer review of the RTP documents supporting the denial decision.

Millennial Quebec Tree Rings

In today’s post, I’m going to discuss an important new 1000-year chronology from northern treeline spruce in Quebec (Gennaretti et al 2014, PNAS here). The chronology is interesting on multiple counts. First, it is the first Quebec northern treeline chronology to include the medieval warm period. Second, it provides a long-overdue crosscheck against the Jacoby-D’Arrigo chronologies (including Gaspe) that have been embedded in a number of canonical reconstructions; its results are very different. Third, the Quebec (and Labrador) northern treeline is the closest treeline to the Baffin Island ice core and varve thickness series. I’ve observed on several occasions that the interpretation of the Baffin Island varve thickness series (Big Round Lake) is presently inconsistent with the interpretation of the similar Hvitarvatn series in Iceland and, in my opinion, there are serious questions about whether PAGES2K has oriented this series correctly.


Decomposing Paico

In today’s post, Jean S and I are going to show that the paico reconstruction, as implemented in the present algorithm, is very closely approximated by a weighted average of the proxies, in which the weights are proportional to the number of measurements.  Paico is a methodology introduced in Hanhijarvi et al 2013 (pdf here) and applied in PAGES2K (2013). It was discussed in several previous CA posts.

We are able to show this because we noticed that the contribution of each proxy to the final reconstruction can be closely estimated by half the difference between the reconstruction and a reconstruction in which that series is flipped over, one series at a time. This sounds trivial, but it isn’t: the decomposition has some surprising properties. The method would not work for algorithms which ignore knowledge of the orientation of the proxy, i.e. ones where it supposedly doesn’t “matter” whether the proxy is used upside down or not. In particular, the standard deviations of the contributions from the individual proxies vary by an order of magnitude, but in a way that has an interesting explanation. We presume that this decomposition technique is very familiar in other fields. The following post is the result of this joint work.
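The decomposition itself is simple to state in code. The sketch below uses a plain weighted average as a stand-in “reconstruction” (the actual paico algorithm of Hanhijarvi et al 2013 is not reproduced here); for a weighted average the flip trick recovers each proxy’s contribution exactly, which is the sense in which it sounds trivial.

    import numpy as np

    def recon(proxies, weights):
        # Stand-in for the reconstruction algorithm: a plain weighted average of the proxies.
        return np.average(proxies, axis=0, weights=weights)

    def contributions_by_flipping(proxies, weights):
        # Estimate each proxy's contribution as half the difference between the reconstruction
        # and the reconstruction computed with that one series flipped in sign.
        base = recon(proxies, weights)
        out = []
        for i in range(proxies.shape[0]):
            flipped = proxies.copy()
            flipped[i] = -flipped[i]
            out.append((base - recon(flipped, weights)) / 2.0)
        return np.array(out)

    rng = np.random.default_rng(0)
    proxies = rng.normal(size=(5, 100))
    weights = np.array([0.4, 0.3, 0.15, 0.1, 0.05])

    est = contributions_by_flipping(proxies, weights)
    print(np.allclose(est, weights[:, None] * proxies))  # True: recovers the weighted terms exactly

Applied to a non-linear algorithm such as paico, the same flip trick only approximates the contributions, which is what makes the closeness of the approximation (and the pattern in the standard deviations) informative.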

New Article on Igaliku

Shortly after the publication of PAGES2K, I pointed out that the Igaliku lake sediment proxy had been contaminated by modern agricultural runoff. The post attracted many comments.

Nick Stokes vigorously opposed the surmise that the Igaliku series had been contaminated by modern agriculture and/or that such contamination should have been taken into account by Kaufman and associates. Stokes:

I see earlier demands that selection criteria be declared for proxies. Kaufman has done that, and appears to have stuck with them. But when a spike appears, suddenly the CA throng has a thousand a posteriori reasons why Kaufman is a reprobate for not throwing it out.

or

I see no reason to disagree with the original authors, Massa et al in saying that “pollen accumulation appears to document climatic changes of the last millennia nonetheless”. The Betula/Salix counts are not contaminated.

Subsequent to my CA post, the Igaliku specialists published a new article entitled “Lake Sediments as an Archive of Land use and Environmental Change in the Eastern Settlement, Southwestern Greenland” (abstract here), which unambiguously connected soil erosion to agriculture, not just in the modern period but in the medieval period as well, observing that mechanization of agriculture in the 1980s had resulted in erosion at “five times” the pre-anthropogenic rate.

Palaeoenvironmental studies from continental and marine sedimentary archives have been conducted over the last four decades in the archaeologically rich Norse Eastern Settlement in Greenland. Those investigations, briefly reviewed in this paper, have improved our knowledge of the history of the Norse colonization and its associated environmental changes. Although deep lakes are numerous, their deposits have been little used in the Norse context. Lakes that meet specific lake-catchment criteria, as outlined in this paper, can sequester optimal palaeoenvironmental records, which can be highly sensitive to both climate and/or human forcing. Here we present a first synthesis of results from a well-dated 2000-year lake-sediment record from Lake Igaliku, located in the center of the Eastern Settlement and close to the Norse site Garðar. A continuous, high-resolution sedimentary record from the deepest part of the lake provides an assessment of farming-related anthropogenic change in the landscape, as well as a quantitative comparison of the environmental impact of medieval colonization (AD 985—ca. AD 1450) with that of recent sheep farming (AD 1920—present). Pollen and non-pollen palynomorphs (NPPs) indicate similar magnitudes of land clearance marked mainly by a loss of tree-birch pollen, a rise in weed taxa, as well as an increase in coprophilous fungi linked to the introduction of grazing livestock. During the two phases of agriculture, soil erosion estimated by geochemical proxies and sediment-accumulation rate exceeds the natural or background erosion rate. Between AD 1010 to AD 1180, grazing activities accelerated soil erosion up to ≈8 mm century⁻¹, twice the natural background rate. A decrease in the rate of erosion is recorded from ca. AD 1230, indicating a progressive decline of agro-pastoral activities well before the end of the Norse occupation of the Eastern Settlement. This decline could be related to possible climate instabilities and may also be indirect evidence for the shift towards a more marine-based diet shown by archaeological studies. Mechanization of agriculture in the 1980s caused unprecedented soil erosion up to ≈21 mm century⁻¹, five times the pre-anthropogenic levels. Over the same period, diatom assemblages show that the lake has become steadily more mesotrophic, contrary to the near-stable trophic conditions of the preceding millennia. These results reinforce the potential of lake-sediment studies paired with archaeological investigations to understand the relationship between climate, environment, and human societies.

I recently noticed that my criticism had been more or less conceded in McKay and Kaufman 2014, which purported to accommodate the contamination (or overprinting, as suggested by Mosher) by deleting the last two points. I was critical of this correction, arguing that it still leaves a heavily contaminated reading in 1970. (The next reading is dated circa 1910 – this is very low resolution and actually below the resolution standard of the study.)

It’s hard to tell whether this was intentional or not. I can see one way that they might have left in this value by accident. If they had deleted two points from the PAGES2K-2013 version, that would also have deleted the contaminated 1970 point. But the PAGES2K-2013 version had already omitted or removed one point from the underlying NOAA version. The new McKay and Kaufman version deleted two points from the NOAA version, and thus only one point from the PAGES2K-2013 version, still leaving the contaminated 1970 reading.
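The bookkeeping is easier to see with a schematic. The sample dates below are hypothetical placeholders (only the circa-1910 and circa-1970 readings are mentioned above), and I am assuming, as the arithmetic implies, that the point already dropped in PAGES2K-2013 was one of the two most recent.

    # Hypothetical sample dates, for the counting only (not the actual Igaliku chronology).
    noaa_version = [1850, 1910, 1970, 1985, 2000]   # underlying NOAA archive
    pages2k_2013 = noaa_version[:-1]                # PAGES2K-2013: one point already omitted
    mckay_kaufman_2014 = noaa_version[:-2]          # McKay and Kaufman 2014: two points off the NOAA version

    print(pages2k_2013)        # [1850, 1910, 1970, 1985]
    print(mckay_kaufman_2014)  # [1850, 1910, 1970] -- only one point fewer than PAGES2K-2013,
                               # and the circa-1970 reading survives the "correction"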

Or, if pressed, perhaps they would argue that the most recent article only expressly referred to mechanization “in the 1980s”. However, this hardly precludes the possibility that the elevated erosion observed in the sample dated circa 1970 could similarly be attributed to mechanization beginning earlier than the 1980s (farm mechanization obviously occurred throughout the world long before the 1980s) or to dating error.

The series should never have been used in a temperature reconstruction.

Note: Jean S and I have been doing some interesting analysis of paico and it is my present view that Igaliku does not have a large impact on the paico reconstruction, but does have a large impact on the “basic composite” reconstruction, one of the PAGES2K alternatives.
