Twisted Tree Heartrot Hill Revisited

Recently, while re-examining PAGES2K, the current paleoclimate darling, I noticed that PAGES2K (2019) reverted to a variation of the Twisted Tree Heartrot Hill (Yukon) [TTHH] tree ring chronology that we had already criticized in 2003 as being obsolete when used by Mann et al 1998. PAGES2K was supposed to be an improvement on the Mann et al 1998 data but, in many ways, it’s even worse. It was very strange to observe the 2019 re-cycling of a TTHH version that had been criticized in 2003 as already obsolete in 1998.

MM2003

In McIntyre and McKitrick (2003), we observed that MBH98 had used an obsolete version of the Twisted Tree Heartrot Hill (Yukon) [TTHH] tree ring chronology, for which measurement data ended in 1975, as compared to the chronology ending in 1992 available at the NOAA archive, as shown in the excerpt below. (I checked with NOAA and verified that the updated chronology was available at NOAA prior to submission of MBH98.)

The TTHH chronology declined precipitously in the late 1970s and 1980s, reaching its lowest value in the entire record in 1991. However, the MBH version ended in 1980 (extrapolating the 1975 value for the final 5 years). Below is a comparison from our 2003 article.

Mann et al 2003 Response

In our 2003 replication, we used the NOAA version of the TTHH chronology rather than the obsolete MBH version. In their contemporary (November 2003) response to our first article, Mann et al objected vehemently, claiming that we had wrongly substituted a “shorter version” of the TTHH chronology for the “longer” version used in MBH98. (The so-called “shorter” version used a larger dataset but began when 5 cores were available.)

Because the MBH98 proxy reconstruction ended in 1980, the difference between the two versions wasn’t an important issue in the main narrative of MBH hockey stick controversies, but it does become relevant for reconstructions ending in 2000 (such as PAGES2K).

PAGES2K (2019) Version

PAGES2K, including its 2019 version, reverted to the TTHH data version already obsolete in Mann et al 1998 – the data ending in 1975, not 1992. The figure below compares the TTHH version in PAGES2K (2019) – on the right – to the TTHH versions discussed above. The PAGES2K version uses the same measurement data (ending in 1975) as the MBH98 version. The PAGES2K chronology is very similar to the MBH98 version in the period of overlap (1550-1975) but is not exactly the same. Notice that the PAGES2K version (like MBH98) avoids the post-1975 data with the severe “decline”.

The precise provenance of PAGES2K chronology versions is not reported, and figuring them out by reverse engineering is a herculean effort (e.g. Soderqvist’s work on PAGES2K Asian chronologies). Amusingly, the PAGES2K version begins a little later (1550) than the version that Mann had criticized for being “shorter”.

 

Measurement Data

Although Jacoby and D’Arrigo’s contemporary NOAA archive included the TTHH chronology up to 1992 (with its decline), they never archived the measurement data corresponding to the 1992 chronology. Many years later (2014), as Jacoby was on his deathbed, they filed a large archive of measurement data with NOAA, including data for a Yukon regional chronology (cana326), a subset of which was the 1975 TTHH measurement data. This archive included the TTHH update, but did not include a concordance identifying which identifiers belonged to the TTHH update and which belonged to other Yukon locations.

By coincidence, Tom Melvin, Briffa’s associate at the University of East Anglia, had used a TTHH measurement data version (74 cores) as a benchmark for testing “signal free” methodology in 2010 and this measurement data proved to be available in an archive identified by Hampus Soderqvist in his investigations of signal-free methodology. It contained the 1975 measurement data (34 cores), 25 cores from 1987-1992 and 15 cores from 1999 sampling. 

As an exercise, I calculated a chronology using conventional methodology from the subset of cores collected in 1992 or earlier – shown below. It closely matches the chronology archived at NOAA in the mid-1990s.
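For readers who want to see the mechanics, a conventional chronology calculation of this kind can be sketched as follows. This is a toy illustration with simulated ring widths, not the TTHH measurement data; the detrending fits a modified negative exponential to each core, analogous to dplR’s ModNegExp option, then averages the resulting ring-width indices by year.

```python
import numpy as np
from scipy.optimize import curve_fit

def negexp(t, a, b, k):
    """Modified negative exponential growth curve: a*exp(-b*t) + k."""
    return a * np.exp(-b * t) + k

def detrend_core(widths):
    """Fit a negative exponential to one core; return ring-width indices."""
    t = np.arange(len(widths), dtype=float)
    p0 = (max(widths[0] - widths[-1], 0.1), 0.02, widths[-1])
    params, _ = curve_fit(negexp, t, widths, p0=p0, maxfev=20000)
    return widths / negexp(t, *params)

def build_chronology(cores):
    """Average ring-width indices across cores, by calendar year."""
    by_year = {}
    for first_year, widths in cores:
        for i, idx in enumerate(detrend_core(widths)):
            by_year.setdefault(first_year + i, []).append(idx)
    years = sorted(by_year)
    return years, np.array([np.mean(by_year[y]) for y in years])

# Simulated example: 10 cores sharing a common signal, staggered start years
rng = np.random.default_rng(0)
n_years = 200
signal = 1.0 + 0.1 * np.sin(np.linspace(0.0, 6.0, n_years))  # shared "climate" signal
cores = []
for _ in range(10):
    first = int(rng.integers(0, 50))
    t = np.arange(n_years - first, dtype=float)
    growth = 2.0 * np.exp(-0.03 * t) + 0.5               # biological age trend
    noise = rng.lognormal(0.0, 0.05, len(t))
    cores.append((1800 + first, growth * signal[first:] * noise))

years, chron = build_chronology(cores)
```

The chronology is dimensionless with mean near 1; the shared signal survives the per-core detrending because the age trend, not the common variance, is what the curve fit removes.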

TTHH 1999 Update

The TTHH measurement data was updated a second time in 1999. Its results were published in a 2004 article by D’Arrigo et al entitled “Thresholds for warming-induced growth decline at elevational tree line in the Yukon Territory, Canada” (link) in which they broached the problem of the “decline” in high-latitude tree ring widths in the late 20th century, despite observed warming in the Arctic. Climate Audit readers will recall D’Arrigo’s “explanation” of the “divergence problem” to the NAS panel in 2006, when she explained that you “have to pick cherries if you want to make cherry pie”.

The type case in D’Arrigo et al 2004 was TTHH as shown below. 

D’Arrigo et al never archived the measurement data or chronology for their 2004 article on the divergence problem. As another exercise, I calculated a chronology for the Melvin data, including cores from the 1999 update, using a conventional methodology (dplR ModNegExp): it closely replicated the D’Arrigo diagram from 1575 on, but not in the earliest portion (when there are fewer than 5 cores anyway).

Melvin Signal-Free Version

As a final exercise, I looked at the effect of Melvin’s “signal-free” methodology on the resulting tree ring chronology. (On previous occasions, we’ve discussed the perverse results of this methodology on multiple PAGES2K Asian tree ring chronologies, as articulated by Soderqvist.) In this case, the signal-free artifact at the end of the series increases closing values by about 20% – much less dramatic than the corresponding artifact for paki033, but an artifact nonetheless.
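For orientation, the core of the Melvin-style “signal-free” iteration can be sketched as follows. This is my own simplified toy, not Melvin’s code: simulated cores of equal length and straight-line detrending stand in for the curve-fitting options of the actual method. The point is the iteration structure: divide the measurements by the current chronology estimate, refit the growth curves to the resulting “signal-free” measurements, and recompute the chronology until it stabilizes.

```python
import numpy as np

rng = np.random.default_rng(3)
n_years, n_cores = 150, 12
t = np.arange(n_years, dtype=float)
signal = 1.0 + 0.2 * np.sin(np.linspace(0.0, 5.0, n_years))  # shared "climate" signal

# Simulated cores: declining linear age trend x common signal x noise
cores = [
    (2.0 - 0.008 * t + rng.uniform(-0.2, 0.2)) * signal
    * rng.lognormal(0.0, 0.05, n_years)
    for _ in range(n_cores)
]

chron = np.ones(n_years)
for _ in range(20):
    indices = []
    for w in cores:
        sf = w / chron                           # "signal-free" measurements
        slope, intercept = np.polyfit(t, sf, 1)  # refit growth curve to signal-free data
        fit = slope * t + intercept
        indices.append(w / fit)                  # detrend ORIGINAL widths by that fit
    new = np.mean(indices, axis=0)
    new = new / new.mean()                       # normalize chronology to mean 1
    if np.max(np.abs(new - chron)) < 1e-8:       # converged
        break
    chron = new
```

The rationale of the method is that refitting growth curves to measurements with the common signal divided out prevents the curve fits from absorbing that signal; the artifacts discussed in this post arise at the series ends, where the iteration interacts badly with the closing data.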

Conclusion

In any event, PAGES2K did not use the Melvin signal-free version – which went to 1999 and incorporated a decline after 1975. As noted above, PAGES2K reverted to the measurement data version already obsolete in Mann et al 1998, but in a novel version. At present, the provenance and exact methodology of the PAGES2K calculation are unknown. As readers are aware, it took a heroic effort by Soderqvist to deduce the methodology and provenance used in multiple PAGES2K Asian tree ring chronologies (a particular LDEO variation of “signal-free” methodology). My guess is that the TTHH chronology used in PAGES2K was also calculated by LDEO (Cook et al) using some variation of Melvin iteration: the closing uptrend in the difference is characteristic.

 

7 Comments

  1. Fred Harwood
    Posted Apr 18, 2024 at 10:07 AM | Permalink | Reply

    Thanks, Steve.

  2. Posted Apr 19, 2024 at 1:30 PM | Permalink | Reply

    After all this time I suppose I shouldn’t be surprised to see such obvious corruption in mainstream climate science, but somehow each new revelation comes as a shock.

    The most depressing part is that despite revelation after revelation that the whole Hockeystick enterprise is built on sand, fraud, and upside-down data, it is STILL being used by the IPCC and quoted by people who should know better.

    Here’s my contribution to the literature, which bears out Steve’s research showing that the whole Stick depends on just a few proxies, and the rest are just for show.

    Kill It With Fire

    Steve, can’t thank you enough for your endless research into this and other questionable climate practices.

    My best to you and yours,

    w.

  3. Jeff Alberts
    Posted Apr 19, 2024 at 9:32 PM | Permalink | Reply

    It still amazes me that, looking at each proxy series individually, there is almost never a hockey stick. But, when magically mixed together, they become hell on Earth in the last 50 years.

    Can anyone explain to me how that happens, without fudging the numbers?

    • DaveS
      Posted Apr 22, 2024 at 12:38 PM | Permalink | Reply

      This is referred to in Willis’ linked post (or perhaps in the comments below it – I’ve just come back from there but have forgotten already!). Essentially it is an artefact of averaging. If you have a number of proxies that meander about over time with no strong deviations and you average them together, they tend to cancel out leaving something pretty flat. Add a few with strong upticks and that’ll be reflected in the average.

      • Jeff Alberts
        Posted Apr 24, 2024 at 9:39 PM | Permalink | Reply

        Thanks Dave. I’m even more amazed that modern science does such things.

    • Stephen McIntyre
      Posted Apr 25, 2024 at 6:18 AM | Permalink | Reply

      “It still amazes me that, looking at each proxy series individually, there is almost never a hockey stick. But, when magically mixed together, they become hell on Earth in the last 50 years.”

      Yes. This is something that most of us have noticed over the years. It’s an issue that deserves much more attention than it’s received. Nor do I believe that it results from “averaging”. On the contrary, averaging would mitigate the problem.

      There is a well-known phenomenon in which you can precisely fit a sine curve over a calibration period with a sufficient number of white noise series. The same would apply to a trend. So if you did an inverse regression of a trend over 79 years onto 79 white noise series, you could reproduce the trend exactly, but the series would cancel out into a shaft outside the calibration period. If you had a couple of actual hockey sticks in the calibration period, the number of white noise series required would be much reduced.
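      A toy simulation makes the point (simulated white noise, nothing more; the 79-year figure is just the illustrative calibration length):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 79

# Calibration target: a pure linear trend over 79 "years"
trend = np.linspace(-1.0, 1.0, n)

# 79 independent white-noise "proxies", in and out of the calibration period
X_cal = rng.standard_normal((n, n))
X_out = rng.standard_normal((500, n))

# Inverse regression: with 79 series and 79 years, the square system
# has an exact solution -- the trend is reproduced perfectly in calibration
w = np.linalg.solve(X_cal, trend)
fit_cal = X_cal @ w

# Outside the calibration period, the same weights applied to fresh noise
# carry no trend at all
recon_out = X_out @ w

print(np.max(np.abs(fit_cal - trend)))  # essentially zero: exact in-sample fit
print(np.corrcoef(recon_out, np.linspace(-1.0, 1.0, 500))[0, 1])  # near zero
```

      The exact in-sample fit is pure overfitting: the weights encode the trend, and outside the calibration period the weighted combination is just reweighted noise.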

      Most of the present multivariate methods involve some sort of correlation weighting. If you recall the algebra of linear regression, it has the formula (X^T X)^{-1} (X^T y). X^T y (after standardization of data) is the vector of correlations. If the proxies are near-orthogonal (as we see for crummy proxies), then (X^T X)^{-1} is “near” the identity matrix.

      So even with correlation weighting (partial least squares) rather than ordinary least squares, you can get the same sort of overfitting as in the classic example.
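      The near-orthogonality is easy to check numerically (again simulated data; the record length and proxy count here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 4000, 20                    # 4000 "years", 20 noise proxies

X = rng.standard_normal((n, p))    # standardized white-noise proxies
w_true = np.zeros(p)
w_true[:2] = [0.5, -0.3]           # target loads on only two proxies
y = X @ w_true + rng.standard_normal(n)

C = X.T @ X / n                    # sample correlation matrix of the proxies
r = X.T @ y / n                    # vector of proxy/target correlations
beta = np.linalg.solve(C, r)       # OLS: (X^T X)^{-1} (X^T y)

off_diag = np.abs(C - np.diag(np.diag(C))).max()
print(off_diag)                    # small: C is near the identity matrix
print(np.abs(beta - r).max())      # OLS weights ~= correlation weights
```

      With C near the identity, the regression weights and the raw correlation weights are nearly the same vector, which is why correlation weighting inherits the overfitting behavior of the regression.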

      I’m convinced that this phenomenon explains the results of the various complicated multivariate methods in PAGES2K. The methods are quite opaque and I haven’t figured out how each of them works, but there are only so many things that you can do with linear operations and I’m confident that the issue is overfitting by the multivariate method.

      As I looked back at old Climate Audit posts, I noticed that I was working on this phenomenon in the period prior to Climategate (e.g. the Georgia Tech presentation) but then got wrapped up in Climategate matters and didn’t finish the topic.

      • Jeff Alberts
        Posted Apr 26, 2024 at 8:25 PM | Permalink | Reply

        Most of that is over my head, but I appreciate the explanation. It’s good to know that eyeballing it has some value.
