PAGES2K (2017): Antarctic Proxies

A common opinion (e.g., Scott Adams) is that the “other proxies”, not just Mann’s stripbark bristlecone tree rings, establish the Hockey Stick. In today’s post, I’ll look at PAGES2K Antarctic data – a very important example, since Antarctic isotope data (Vostok) is used in the classic diagram used by Al Gore (and many others) to illustrate the link between CO2 and the isotopes used to estimate past temperature.

Antarctic d18O is one of the few proxies that can be accurately dated both in very recent measurements and in the Holocene and deep time. However, rather against message, Antarctic d18O over the past two millennia (as in, for example, the PAGES2K 2013 compilation) has mostly gone the “wrong” way, somewhat diluting the IPCC message – to borrow a phrase.

PAGES2017 relaxed the PAGES2K ex ante quality control criteria to include 15 additional series (most of which are not new), but these, if anything, reinforce the earlier message of gradual decline over the past two millennia.

PAGES2K (2017) also added two borehole inversion series, which were given a sort of special exemption from PAGES2K quality control standards on resolution and dating. I suspect that readers already know why these series were given special exemption: one of them has a very pronounced blade. Long-time readers may vaguely recall that an (unpublished) Antarctic borehole inversion series also played an important role in the conclusions of the NAS 2006 report. I tried at the time to get the underlying measurement data, but was unsuccessful. A few years ago, when the PAGES2017 borehole inversion series was published, I managed (through an intermediary) to obtain much of the underlying data and even some source code for the borehole inversion. I’ve revisited the topic and conclude today’s post with a couple of teasers from an interesting analysis that is in the works.

PAGES2K (2013)

Here is a plot of the PAGES2K Antarctic temperature reconstruction. It shows a long decline from the mid-first millennium, with nearly all 19th, 20th and even early 21st century values below the long-term mean.

This series was used in IPCC AR5 (see below). Though its most recent portion is rather muddy in the IPCC diagram, the lack of any 20th century blade is clear.

 

PAGES2K authors used 11 datasets in their temperature reconstruction. According to their statement of methods, they applied sensible ex ante quality control procedures, aiming to use the “longest, highest resolution and best synchronized” of the available records:

Data for the Antarctic reconstruction were selected based on a restrictive approach aimed at using the longest, highest resolution and best synchronized of available records. All records were water isotope (d18O or dD) series from ice cores. The project aimed to maximize coherence by using records that could be synchronized through either high-resolution layer counting or alignment of volcanic sulfate records.

I very much endorse this sort of ex ante quality control, which is the opposite of the far-too-common practice of ex post selection of a subset of proxies. The 11 isotope series used by PAGES2K (2013) are shown below in a gif together with the reconstruction. The series, examined individually, also show the non-HS decline illustrated in the reconstruction composite.

Several of the high-resolution PAGES2K series extending back to the MWP were first archived as part of PAGES2K, including Law Dome (DSS) and Plateau Remote, both of which I had long and unsuccessfully sought from Tas van Ommen and Ellen Mosley-Thompson respectively.

Earlier versions of Law Dome had been used in Jones et al 1998 and Mann and Jones 2004, the latter including an illustration showing a high MWP. As an IPCC reviewer of AR4, I had asked that Law Dome d18O be included in their figure showing high-resolution Southern Hemisphere proxies. Climategate emails (see CA discussion) show that IPCC authors snickered at this request, knowing that I had asked that they show a proxy with high medieval values. There was no way that they were going to show the Law Dome series. Despite sneering at my request, they recognized that they had to cooper up their rationale for not showing such an important series and inserted the excuse that there was inconsistency between the isotope data and the reconstruction from inversion of subsurface temperatures.

Contrasting evidence of past temperature variations at Law Dome, Antarctica has been derived from ice core isotope measurements and from the inversion of a subsurface temperature profile (Dahl-Jensen et al., 1999; Goosse et al., 2004; Jones and Mann, 2004). The borehole analysis indicates colder intervals at around 1250 and 1850, followed by a gradual warming of 0.7°C to the present. The isotope record indicates a relatively cold 20th century and warmer conditions throughout the period 1000 to 1750.

I mention this incident and excuse because the inconsistency between isotope data and borehole inversions re-appears in PAGES2017.

 

Stenni 2017

Stenni et al 2017 (pdf; CA discussion) presented a much expanded database of high-resolution Antarctic isotope data in response to PAGES2K. They presented 112 records (94 d18O; 18 dD), many of which were short (36 limited to the last 50 years or less). 15 records went back to AD1000 and 9 went back to AD0. However, 4 of the additional series did not come up to the present or even to AD1950. Four series (TALDICE, DML07, DML17 and Berkner Island) dated from the 1990s; the reason for their exclusion from PAGES2K (2013) is unclear. The database included a much lengthened version of WDC06A, a companion hole to WAIS WDC05A. If a site had both d18O and dD records, they used the d18O record and did not double up. There was only one new long series, Roosevelt Island, which showed the long gradual two-millennium decline evident in other records.

Stenni et al produced a reconstruction which, as pointed out at CA previously, used ex post screening to select series with a positive correlation to the upward-trending instrumental temperature data.

Even with this bias, their temperature reconstruction had a pronounced downward trend over the past two millennia – entirely consistent with the Law Dome d18O that IPCC had refused to show in AR4 a decade ago.

 

PAGES2K (2017)

The PAGES2K (2017) dataset consisted of 27 series. They used 10 of the 11 PAGES2K (2013) series (one of which was updated), added 15 isotope series and two VERY low-resolution borehole temperature reconstructions. 13 of the 15 new isotope series had previously been used in Stenni et al 2017; the other two were dD series at sites where d18O series had already been used, a duplication that the earlier compilations had avoided.

PAGES (2017) said that their standards for Antarctic ice core isotope series had been relaxed to include “shorter and decadal-scale-resolution” records:

for some proxy types, the standards in this version were broadened compared to the criteria used previously by PAGES2k regional groups. In most regions, records have been added that have become available since the publication of PAGES2k-2013, or that were not used in the continental-scale reconstructions because they are not annually resolved and therefore did not conform to the reconstruction method used by a particular regional group. In Antarctica, for example, PAGES2k-2013 included only the longest annually resolved ice cores, whereas the present version includes shorter and decadal-scale-resolution records.

Of the 15 new isotope series, 5 begin after the medieval period, 5 end before 1940 and 4 have decadal resolution or coarser. None of the new isotope series meets all three criteria of beginning prior to AD1000, ending after AD1950 and having better than decadal resolution. Three series that begin at exactly AD1000 meet the other two criteria. Of these three, two (DML07, DML17) are from the same campaign and author as the 2013 series DML05 and add little new information. I mentioned the third, an isotope series from Berkner Island, five and seven years ago in connection with the SH network of Neukom, Gergis and Karoly (see here, here). The new isotope data show the same two-millennium decline as PAGES2K (2013) and Stenni et al 2017.

The two borehole series invert downhole thermometer temperatures to (supposedly) estimate past surface temperature. These inversions require inverting extremely ill-conditioned matrices – an issue that doesn’t seem to be clearly understood by proponents – and their resolution is far lower than PAGES2K standards. (PAGES2017 falsely asserts that one of the two series has annual resolution, and that the other has 100-year resolution.)
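To illustrate why this matters, here is a minimal sketch of a generic borehole forward model – a half-space step-response calculation with an assumed ice diffusivity, not the model or code used by Orsi et al. Each column of the matrix is the present-day profile response to a unit surface-temperature change over one past interval, and the condition number of such a matrix is enormous, so small measurement errors translate into wild swings in any unregularized inversion.

```python
import numpy as np
from scipy.special import erfc

# Illustrative half-space step-response forward model (assumed parameters,
# not the published WAIS Divide model).
kappa = 1.2e-6 * 3.15e7                 # assumed ice thermal diffusivity, m^2/yr
z = np.linspace(20.0, 300.0, 50)        # measurement depths (m)
edges = np.linspace(10.0, 1000.0, 21)   # interval edges, years before present

# Column j: response at each depth to a unit surface-temperature boxcar
# lasting from edges[j+1] (older) to edges[j] (more recent) years ago.
A = np.column_stack([
    erfc(z / (2.0 * np.sqrt(kappa * edges[j + 1])))
    - erfc(z / (2.0 * np.sqrt(kappa * edges[j])))
    for j in range(len(edges) - 1)
])

print("condition number of forward matrix:", np.linalg.cond(A))
# The condition number is many orders of magnitude above 1: the columns for
# older intervals are nearly collinear, so recovering the surface history
# from a noisy profile is impossible without heavy regularization.
```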

PAGES (2017) acknowledged that quantifying the resolution of borehole inversions was “less straightforward” than for other proxies – an understatement – but nonetheless asserted, waving their arms wildly, that the records were “appropriate for examining decadal to multi-centennial variability”:

PAGES2K scientific questions focus on centennial and finer time scales. Terrestrial and lacustrine records were included with average sample resolution of 50 years or finer. However, such records are rare from marine sediments, and thus a minimum average sample resolution of 200 years was accepted for this database. We also included 4 borehole records, although quantifying median resolution is less straightforward in boreholes than in other archives. The borehole records in the database are appropriate for examining decadal to multi-centennial scale variability, depending on the timeframe of interest [21 – Orsi et al, Little Ice Age cold interval in West Antarctica: Evidence from borehole temperature at the West Antarctic Ice Sheet (WAIS) Divide. Geophysical Research Letters 39, L09710 (2012). pdf]

There is, of course, a different and real reason for the PAGES (2017) insertion of borehole records that didn’t meet PAGES2K ex ante quality standards: the borehole inversions, especially at WAIS Divide (shown in the gif below), have a pronounced 20th century blade, which is absent in the Antarctic isotope data. Cynical readers might reasonably conclude that this had something to do with the PAGES2K decision to abandon its quality control standards for these records.

Discussion of Antarctic Borehole Data

I’m going to write a detailed analysis of the WAIS Divide borehole inversion in a separate post. Antarctica played a surprisingly prominent role in the conclusions of the 2006 NAS paleoclimate report, but NAS provided no citations for its assertions about Antarctica. I challenged those assertions and, in a surprise appearance in Climate Audit comments, Eric Steig agreed with my criticisms (while slagging me either for making the criticisms or, more likely, for existing). I was later able to determine from a NAS panelist that their assertions about Antarctica were based on unpublished borehole inversion data. I tried to get the underlying data (measured in 1994-95) from USGS, but it could not be provided to me because it lacked “official USGS approval”, which had thus far not been obtained due to other pressing obligations. (Twelve years later, the data remains unarchived.) In 2009, I looked at inversion techniques for downhole temperatures in “boreholes” in rock (which come almost entirely from mineral exploration). I noted that the techniques required inversion of very ill-conditioned matrices and that some properties looked like Chladni-type artifacts.

When Orsi et al published their borehole inversion in 2012, I asked an associate to request the data and code (figuring that it would be pointless to request the data myself). Orsi courteously sent both data and code to the associate, who sent them to me. Much of the code had been written in 1990 in an antique dialect of Fortran; the rest was in Matlab. I spent some time in 2012 trying to figure it out, but put it to one side after a while. I’ve re-visited the topic with some interesting results, which I’ll write up at length; for now, here are two teasers.

First, the downhole temperature curve was both convex and smooth. (Convex here means that there were no changes in the direction of curvature.) However, the reconstruction had three major changes in curvature direction and, in detail, many small ones. Mathematically, this is very unsettling: without some very peculiar conditions, the inverse of a convex and smooth curve ought to be convex (or concave) and smooth as well. So how do the changes in curvature in the reconstruction arise? Are they real or an artifact? (In some prior CA posts, I’ve discussed changes in curvature in connection with Chladni patterns arising from principal components of tree ring networks – so there are some interesting connections to a long-standing mathematical interest.) The full analysis, however, is a little long and detailed for this post.
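As a quick numerical check – a generic sketch, not the analysis planned for the follow-up post – the number of changes in curvature direction can be counted directly from sign changes in the second differences of a series:

```python
import numpy as np

def curvature_sign_changes(y, tol=1e-12):
    """Count changes in curvature direction via discrete second differences.

    A smooth convex (or concave) curve has second differences of a single
    sign, hence zero sign changes; every reversal of curvature adds one."""
    d2 = np.diff(np.asarray(y, dtype=float), n=2)
    s = np.sign(d2[np.abs(d2) > tol])      # drop (near-)zero curvature points
    return int(np.sum(s[1:] != s[:-1]))

# Toy example: a convex profile versus the same profile with wiggles added
x = np.linspace(0.0, 10.0, 200)
print(curvature_sign_changes(0.1 * x**2))                        # 0
print(curvature_sign_changes(0.1 * x**2 + 0.2 * np.sin(2 * x)))  # > 0
```

Applied to a measured profile and to its reconstruction, the same count makes the contrast between a convex input and a wiggly output explicit.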

While I was trying to figure out the code, I noticed that the authors had excluded the top 15 meters of their data “because of the influence of the weather on surface measurements”. This raises an obvious question: what did the excluded data look like?

Orsi’s unpublished data package didn’t include a file named “WAIStemp2009c.txt”, but did include a file entitled “WDC05A_BoreholeTemp_300m_2009.txt” (header: “% as measured in January 2009”), which contained the downhole temperature measurements taken in January 2009, including six excluded measurements between 8 and 15 m. The excluded data is shown in red in the figure below: it continued upward a little further, then declined, retracing about half the increase. Given that the overarching conclusion of the article was a rapid recent increase in temperature, it seemed unsettling that they had deleted the most recent data (which went down). The text of the article also cited 2008 measurements, which had not been included in the data package. These turned out to be online at USAP here and are plotted in the right panel: ice sheet temperature in the topmost 2 meters reversed the decline, increasing by more than 16 deg C – an effect that was clearly “weather”, not “climate”.

 

Van Ommen et al (1999) contained an informative graphic (replicated below) showing the dramatic annual variation in near-surface ice sheet temperature: in the top meter or so, temperatures ranged from ~-30 deg C in winter to ~-13 deg C in January, with the amplitude of the variation attenuating to near zero by ~15 meters depth. The shape of the temperature profile in the top 15 meters is distinctly that of a damped sinusoid; one can reasonably see a damped sinusoid in the top few meters of the WAIS data as well.
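The damped sinusoid is just the textbook solution for a sinusoidal surface temperature conducting into a half-space: the amplitude decays as exp(-z/d) and the phase lags by z/d radians, where d = sqrt(2*kappa/omega). A minimal sketch with assumed, not site-specific, parameters (an ice-like diffusivity and a ~17 deg C annual range) shows why the annual cycle has essentially vanished by ~15 m:

```python
import numpy as np

# Textbook periodic-heating solution: T(z,t) = Tmean + A*exp(-z/d)*sin(w*t - z/d)
# Parameter values below are illustrative assumptions, not Law Dome or WAIS values.
kappa = 1.2e-6                    # assumed thermal diffusivity of ice, m^2/s
w = 2.0 * np.pi / 3.15e7          # annual angular frequency, 1/s
d = np.sqrt(2.0 * kappa / w)      # e-folding ("skin") depth, ~3.5 m here
A = 8.5                           # assumed surface amplitude, deg C (~17 C range)

for z in (0.0, 5.0, 10.0, 15.0):
    amp = A * np.exp(-z / d)                      # attenuated amplitude
    lag_months = (z / d) / w / (30.4 * 86400.0)   # phase lag converted to months
    print(f"{z:4.0f} m: amplitude {amp:5.2f} C, lag ~{lag_months:.1f} months")
# The ~17 C annual cycle at the surface shrinks to roughly a tenth of a degree
# by 15 m, lagged by several months -- the damped-sinusoid shape in the graphic.
```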

 

The problem with the top ~15 meters or so is the effect of ordinary (average) annual variation, not “weather”. One can see how eliminating the top ~15 meters of data sidesteps the thorny problem of disentangling these annual variations, but this surely comes at a heavy cost. Ice cores can be accurately dated by layer counting (based on visual appearance and annual d18O cycles), and layers at 15-18 meters date back to the 1960s. Orsi et al purport to reconstruct temperature up to 2007, but they do so without using any layers dated from ~1965 to 2007: the calculation is done entirely from ice core layers dated prior to the 1960s.

Conclusion

I plan a separate post on the curvature issues, which are of mathematical interest (to me at least). I’m very dubious of these borehole inversions in general and am extra dubious of this borehole inversion in particular. From the perspective of PAGES2K (2017 version),  it seems transparent that they plan to include even questionable borehole inversions in their composite in an effort to goose the inconveniently declining isotope data into a Hockey Stick.

 

29 Comments

  1. pdtillman
    Posted Feb 1, 2019 at 4:15 PM | Permalink

    Glad to see you’ve found the energy to return to this long-running topic. Thank you.

    Borehole data:
    “Given that the overarching conclusion of the article was rapid recent increase in temperature, it’s a little unsettling that they deleted the most recent data (which went down).”
    More than a little, I’d say!

    And, as to Climate vs. Weather, I think Upton Sinclair has earned the last line:
    “It is difficult to get a man to understand something when his job depends on not understanding it.”

  2. Follow the Money
    Posted Feb 1, 2019 at 6:08 PM | Permalink

    Long-time readers may vaguely recall that an (unpublished) Antarctic borehole inversion series also played an important role in conclusions of the NAS 2006 report.

    I recall that IPCC 2013 AR5 literally fabricated a southern hemisphere borehole data set, using a very tricky and wrong citation to a published study as cover. It doesn’t say much for the quality of the Antarctic set the NAS used if the IPCC (!) later ignored it in favor of an utter fabrication.

    • Climate Audit
      Posted Feb 1, 2019 at 9:36 PM | Permalink

      can you locate the citation?

      Antarctic borehole inversion also played a role in AR4. As a reviewer, I had asked IPCC authors to include the (declining) Law Dome d18O series. IPCC refused – there are some laughable Climategate emails on their snickering refusal. Their ultimate excuse was that the isotope and borehole inversion data were inconsistent.

      • Posted Feb 1, 2019 at 11:09 PM | Permalink

        Presumably this is reference to a prior (2013) comment

        The IPCC Southern Hemisphere Reconstructions

        IPCC AR5 WG1 Fig 5.7 refers to Pollack & Smerdon 2004 [ https://agupubs.onlinelibrary.wiley.com/doi/epdf/10.1029/2003JD004163 ] as “PS04bore” and includes it in all 3 panels (NH,SH,global). Table 5.A.6 indicates that PS04bore provides land-only reconstructions for NH, SH, and global. However, the paper only discusses a NH reconstruction.

        • Climate Audit
          Posted Feb 1, 2019 at 11:17 PM | Permalink

          Pollack and Smerdon deal in inversions of drill-holes in rock.

        • FTM
          Posted Feb 2, 2019 at 12:52 PM | Permalink

          That’s the citation Harold!

          Correct, Steve, PS2004 does not use ice boreholes, and I guess I should have thought “ice” automatically with your references to “Antarctic” boreholes. But I still think it is relevant to the issue of the sketchy use of borehole temperature readings in general to posit anything about global ground-surface temps.

          Anyway the AR5 graph is another example of Climate “Science” bad practices. PS2004 does not gauge or graphically picture southern hemisphere borehole data. But AR5 Fig. 5.7 plots data for the SH in the name of PS2004 and as the same exact curve in form. But the curve is a wee colder, so for the “global” curve, another fabrication, the PS2004 NH curve and the IPCC fake SH curve are averaged to get a middling curve on the temps scale.

        • Climate Audit
          Posted Feb 2, 2019 at 1:15 PM | Permalink

          that’s interesting. I hadn’t noticed that.

          I recall commenting adversely on IPCC SH curves which relied on Mann et al 2008 GLB proxies – i.e. stripbark bristlecones were used not only as NH thermometers, but as SH thermometers. It’s all very absurd.

        • Jaap Titulaer
          Posted Feb 3, 2019 at 5:17 AM | Permalink

          I recall commenting adversely on IPCC SH curves which relied on Mann et al 2008 GLB proxies – i.e. stripbark bristlecones were used not only as NH thermometers, but as SH thermometers. It’s all very absurd.

          Wait, what?
          Surely you are joking!
          Does that mean that they used an ex-post selected NH proxy series (a low % of the total NA Bristlecone Pine data), which diverges from its official selection criteria (so the last part had to be chopped off and smoothed using temperature measurements) but which was selected ex-post purely because it was in line with the conclusion they wanted to draw, as a proxy for the SH????

        • FTM
          Posted Feb 5, 2019 at 3:52 PM | Permalink

          Steve,

          The Fig. 5.7(b) Southern Hemisphere graph exhibits three Mann 2008 proxy sets, one glacial tongue length study, and the fabricated SH PS04bore data.

          Fig. 5.7(c) represents the whole world! What does it contain? Two Mann proxy sets, the glacial length one, and the fabricated PS04bore.

          These uber-influential Mann series may well contain NH bristlecones, as I remember from your discussion back then about Mann 2008 products. That would mean that where the IPCC says “SH” and “Global” we should read “more NH bristlecones” and “yes, NH bristlecones again!”

          Would it not be fascinating to have the IPCC AR5 chapter 5 preparers reveal who created Fig. 5.7?

  3. Norman Yarvin
    Posted Feb 2, 2019 at 10:40 AM | Permalink

    Inverting the heat equation (what they’re doing here for boreholes) is a textbook example of an ill-posed problem. Indeed, at the moment, it’s the Wikipedia example:

    https://en.wikipedia.org/wiki/Well-posed_problem

    So it’s not just that matrices used in their particular method are ill-conditioned. It’s that even prior to discretization, the problem is ill-posed, so the more data you try to extract, the worse the condition number gets. And putting in a smooth curve and getting out a mess is not a mathematical surprise; it’s expected.

    Now, one can get a _little_ bit of good data out, via (as that Wikipedia page mentions) “regularization” — basically stomping on the small eigenvalues of the matrix to be inverted, so one isn’t even trying to get that data out. You’d want to see this done a priori: our data has X amount of noise, and so we can accept eigenvalues down to 1/X (or something very roughly like that: not saying that should be the exact criterion; indeed I’d stop long before bringing the noise to the same level as the signal). X can be determined by looking at the residues after local curve fitting, and should be corroborated by what one knows about the intrinsic noise of the sensors used in the experiment. Even then, long-term biases can creep in, as in “we hurried a bit when drilling the last stretch, and didn’t wait quite as long for temperatures to equilibrate around our sensor”. (Not a direct quote, just an example of something that could easily happen.) But even language like “layer dating to the 1960s” gives the wrong impression of the numerical uphill battle here: the main problem is that there are no layers, since heat is constantly diffusing in all directions. This isn’t like sediment in a lake or d18O, where layers laid down remain layers. You can compute “the layer to which the 1960s make their maximum contribution”, but that’s different from there really being a 1960s layer.

    I was once taught the ill-posed nature of this problem in a class, and afterwards mentioned to the professor that I’d seen it used as evidence for global warming. His response was “You don’t need to be a flaming conservative like so-and-so to have doubts about global warming.”

    • Climate Audit
      Posted Feb 2, 2019 at 12:36 PM | Permalink

      for inversion of downhole temperatures from drill-holes in rock, regularization by truncated SVD is a common technique. See my discussion from some years ago

      Truncated SVD and Borehole Reconstructions

      while it’s easy enough to say that it’s ill-conditioned, my interest in that post was descriptive: disregarding temporarily that it was ill-conditioned, how did matters change as the number of eigenvectors increased? The patterns matched those of Chladni patterns.

      In the present case, they used a quite different method, still ill-conditioned, and without saying so and perhaps without realizing it, used a form of ridge regression. Their final answer looks like it’s constructed from 3-4 eigenvectors and it’s possible to show that, as I plan to do in a more technical post.
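      For anyone who wants to experiment, here is a minimal sketch of truncated-SVD regularization on a generic ill-conditioned toy system – an illustration of the technique only, not the Orsi code and not a borehole forward model:

```python
import numpy as np

def tsvd_solve(A, y, k):
    """Least-squares solution of A x ~ y keeping only the k largest
    singular values; components along the small singular values, which
    amplify noise in an ill-conditioned inversion, are discarded."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[:k] = 1.0 / s[:k]
    return Vt.T @ (s_inv * (U.T @ y))

rng = np.random.default_rng(0)
A = np.vander(np.linspace(0.0, 1.0, 40), 12, increasing=True)  # badly conditioned
x_true = rng.standard_normal(12)
y = A @ x_true + 1e-3 * rng.standard_normal(40)                # small noise added

for k in (2, 4, 8, 12):
    err = np.linalg.norm(tsvd_solve(A, y, k) - x_true)
    print(f"k = {k:2d}  error = {err:.3g}")
# Keeping every singular value (k = 12) lets the tiny noise blow up the
# solution; truncating trades a little bias for a large cut in variance.
```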

    • Climate Audit
      Posted Feb 2, 2019 at 1:13 PM | Permalink

      You say: “You can compute “the layer to which the 1960s make their maximum contribution”, but that’s different from there really being a 1960s layer.”

      Keep in mind that the ice core has other measured properties besides its temperature. It has annual layers defined jointly by the annual d18O isotope cycle (which either doesn’t diffuse or diffuses at much, much slower rates than heat) and physical appearance. While I may not have expressed it quite as felicitously as I might have, the idea that layers can be dated to specific decades and even years is universally accepted among specialists and makes sense to me. Because heat diffuses, I don’t imply that those layers are thermometers.

      However, it seems obvious to me that, even if one places more weight on borehole temperatures than I’m inclined to, removing layers dated more recently than the 1960s removes essential evidence on post-1960s temperatures.

      • Norman Yarvin
        Posted Feb 2, 2019 at 11:19 PM | Permalink

        Okay, so you meant “layers dating to the 1960s” as being dated via those other methods (appearance, d18O). That hadn’t been clear to me. And yes, temperature from 2007, by the time it has gotten down to the 1965 layer, has at the very least been mixed with temperatures between 2007 and 1965, plus an equal interval on the other side (1923-1960), and to a lesser extent temperatures from more distant years. (Whether it would even have gotten down that far yet is not immediately obvious to me; I’d have to run some numbers.)

        In any case, of course truncating those top layers is a mistake: removing “weather”, if desired, should be done after the numerical inversion process. Indeed, taking an axe to the top levels of data before feeding it in must complicate the inversion process. I wonder if they even realized that their equations have to change — that they can’t just use the usual inversion equations. Those equations in any case should be subtly different when dealing with a surface that is continually deposited than when dealing with hard rock, but truncating the top is even more of a change.

      • TimTheToolMan
        Posted Feb 8, 2019 at 7:09 AM | Permalink

        Steve writes “if you remove layers dated more recently than the 1960s, you remove essential evidence on post 1960s temperatures.”

        Surely to understand the layers you’re keeping, you need to take into account the layers above and how their temperatures impact the lower layers over the years?

        Is this where we get the recent temperature record spliced onto the borehole data for the purposes of that understanding? He says, with irony spliced onto sarcasm.

  4. EdeF
    Posted Feb 2, 2019 at 11:27 AM | Permalink

    Using ice borehole temperatures to reconstruct recent surface temperatures – this is the first I’ve heard of that. I will have to do some background reading to see what they are trying to do.

  5. Jaap Titulaer
    Posted Feb 3, 2019 at 5:10 AM | Permalink

    The last part of the borehole reconstruction looks very, very suspicious. And of course the base data is also quite suspect – way too much of a smooth curve.
    And knowing that they left out the last part of the actual data, which went down, it is clear that this recent part of the reconstruction is not merely wrong, it is totally fraudulent! What an utter and total unscientific joke this is.

    And this seems to be, again, one of the few ‘proxies’ which (after this manipulation) can be used to derive a hockey-stick, when given enough weight over all others which don’t…

    Can’t wait to see the follow-up post 🙂

  6. ccscientist
    Posted Feb 4, 2019 at 4:42 PM | Permalink

    If they had included the red (deleted) data from your last graph, it might have completely blown up the inversion, and THAT was why they deleted it, not because of “weather”.

    I try to follow a few rules to respect the numbers:
    1) If it is ill-conditioned, stop (or run all sorts of tests)
    2) If you can get chaos out of the dynamics, be careful
    3) Don’t use small sample sizes
    4) Don’t do regression of 70 data points with 20 variables
    5) Don’t use any tool you don’t understand

  7. Posted Feb 4, 2019 at 5:03 PM | Permalink

    Thanks for researching and writing another very interesting post, Steve.

    I look forward to your follow-up post on borehole temperature inversion. It is a technique that I have always been dubious about, because of the ill-posedness of the problem. I recall writing to Andrew Revkin saying that the Orsi borehole trend of 0.80 +/- 0.06 C/decade over 1987 to the start of 2007 looked highly dubious, and that the error band was absurdly low. But at the time I was too busy to investigate the data and its processing in detail and I subsequently moved on to other things. I’m very pleased that you have now returned to the issue.

  8. EdeF
    Posted Feb 4, 2019 at 9:24 PM | Permalink

    https://www.researchgate.net/profile/Gary_Clow/publication/13521226_Past_temperatures_directly_from_Greenland_Ice_Sheet/links/00b7d52c05803c5926000000/Past-temperatures-directly-from-Greenland-Ice-Sheet.pdf?origin=publication_detail

    I have started reading this report from 1998, which describes how to reconstruct past surface temperatures from ice core temperature readings. Note that we don’t know the past geothermal conditions at the base of the glacier with certainty. These cores show a Little Ice Age and a Medieval Warm Period. How gauche.

  9. Climate Audit
    Posted Feb 7, 2019 at 1:00 AM | Permalink

    I added some text in the section of this post about boreholes. The Orsi data package only had the 2009 measurements, which I showed in a figure. I located the original 2008 measurements at USAP and they showed an enormous uptick of 14 deg C in the top meter (due to the measurement being taken in January). Warming from summer temperatures penetrates as deep as 15 meters, according to a 1999 diagram at Law Dome.

    The author clearly seems to have had annual variation in mind, not weather – an infelicitous choice of word, as annual variation is not “weather”. The issue is a little different than it first appeared. It is necessary to try to disentangle the impact of annual variation – not that easy if you only have summer measurements. It’s not obvious that this is best done by deleting data, but their motive for doing so appears not to have been venal – as it was in Mann’s notorious IPCC TAR diagram.

  10. paul courtney
    Posted Feb 10, 2019 at 9:45 AM | Permalink

    I’m grateful to our host for posting on this, your presentation continues to be clear to those of us less gifted in math. Have you seen that the Mueller indictment of Roger Stone appears to rely on Russian attribution of Podesta emails released by Wiki? Mueller may have to prove this, but my recollection of your column, and particularly the comment string with jaap, raised doubts re: attribution. Maybe jaap could guest post?

    P.S. Don’t blame you for getting exhausted with climate wars. People talking past each other, just like politics at its worst. It’ll keep going that way so long as NYT et al refuse to report on the absurdities that are plain to see.

    • Jaap Titulaer
      Posted Apr 17, 2019 at 2:27 PM | Permalink

      Hi Paul,

      Maybe jaap could guest post?

      Yeah, maybe I could … 🙂
      Where do I send text for such an article? @Stephen: PM me?

      Have you seen that the Mueller indictment of Roger Stone appears to rely on Russian attribution of Podesta emails released by Wiki?

      I have seen that as well. And it is false. I mean the allegation that Stone lied about that is false. Based upon what he knew then his answer should have been NO to a compound question. And today it would be the same.
      As far as I am aware and I’m sure as far as Mr. Stone is aware, there is no evidence for any Russian government involvement in the phishing scam, the theft of the emails (+attachments) of Podesta or the publication of them by WikiLeaks.

      The Podesta (@gmail.com) email inbox never resided on any DNC system, but on Google GMail servers, obviously. His emails were simply retrieved after an unknown party (A) succeeded in misleading him into providing his password, by making him believe he needed to reset it & by providing a link to a fake google account password reset page. We already discussed that.

      The DNC Finance people emails seem to have been retrieved in the same manner, by an unknown party (B). Simply by knowing one or more passwords.
      Because the DNC emails published by WikiLeaks belong to 10 people (from DNC Finance) and only them, they would seem to be retrieved from either: the 10 individual inboxes OR from a single shared DNC Finance group email box. Because the To and From in the emails indicates 10 different accounts (see recent post by the Forensicator), the former is perhaps more likely (unless they had forwarding activated on their personal inboxes; forwarding to a group inbox).
      At first sight it would be much easier to just get the email account password from the single group email inbox. I mean what are the odds that all 10 people fell for the same password phishing scam. Unless of course they all got the same password phishing email, and one of the leaders decided to reset his account then and there (via the link provided in the phishing email) and told the rest to do the same, ASAP…

      We know (as we discussed) that at least one of the DNC Finance people also received an email very similar to the one that Podesta received, and we also know that he clicked on the link provided. Such emails were sent to quite a number of people, but exact details have not been published.

      [Stone Indictment]


      3. From in or around July 2016 through in or around November 2016, an organization (“Organization 1”), which had previously posted documents stolen by others from U.S. persons, entities, and the U.S. government, released tens of thousands of documents stolen from the DNC and the personal email account of the chairman of the U.S. presidential campaign of Hillary Clinton (“Clinton Campaign”).

      This part: “of documents stolen from the DNC and the personal email account of the chairman” is materially misleading. WikiLeaks (“Organization 1”) has only published emails in the original internet email format plus their attachments, if any. They did NOT publish any individual documents coming from any other source like a file server.

      And that is important, as there does not seem to be any certain link between the theft of the emails from Podesta (or DNC Finance) and the (alleged) hacking of DNC computer servers.

      Also, the description in the GRU Indictment of how they are accused of having broken into the DNC network and servers seems incorrect.
      For two reasons:
      1) groups like Fancy Bear normally use a very different approach. That approach is getting a victim to start some malware on a laptop or other computer which is already a member of the computer network, using an account also known to the network; i.e. working from the inside.
      and
      2) the way described in the indictment (remote login from a non-DNC computer; i.e. working from the outside) does not work. Which happens to be the reason for #1: using a different approach.

      • Posted Apr 21, 2019 at 1:16 AM | Permalink

        Jaap, if Steve would rather stick only to climate I would try Lucia at the Blackboard here: http://rankexploits.com/musings

        If it happens please post a link so others will follow. Thanks for your investigation!
        -Ron

      • paul courtney
        Posted May 4, 2019 at 11:16 AM | Permalink

        Jaap: Thx for your reply, your post is what I recall from earlier posts that attribution evidence was very thin. I hope Stone has enough money to put them to the proof and we get to see Crowdstrike testify. This is still the only site I’ve seen that took up the subject. I’ll peek at Ron’s link.

  11. DaveS
    Posted Feb 14, 2019 at 8:09 AM | Permalink

    The Douma comment threads now appear to be closed, but the latest revelation concerning the supposed chemical attack can be found on twitter

  12. TomRude
    Posted Mar 28, 2019 at 11:23 AM | Permalink

    https://www.nature.com/articles/s41586-019-1060-3

    This study’s result is going against the grain of all palaeoclimatic reconstructions: warm periods are on the contrary humid and cold periods are quite dry in mid-latitudes and tropics.

  13. Tip for tweets
    Posted Apr 2, 2019 at 4:55 PM | Permalink

    About Butina — In his November 14, 2017 House committee testimony Glen Simpson talks about researching the NRA and Butina. Pages 142-4.

    Looks like Simpson could be the source of the DOJ’s interest in Butina.

    • Follow the Money
      Posted May 17, 2019 at 9:24 PM | Permalink

      Re: Napolitano — Steve, you are putting up an admirable effort to help educate the Twitter people who believe the Chris Steele “intelligence” product was sourced from real people like Millian, Sater, and all the Kremlin folks who confessed, rather than being just imaginative garbage.

      I think I can help you with the Napolitano-Papadopoulos matter. Napolitano’s May 2016 talk about a “debate” in the Kremlin about having and releasing Hillary’s emails is swiping from a May 6 2016 “EUTimes.net” article, which was spread around twitter a lot, then rehashed in a “Gateway Pundit” article dated May 10, which was spread more.

      The EUTimes.net article, which is either creative fantasy or disinformation, is the first to print the idea there was a “debate” inside the Russian government whether to release the emails. This article may be the basis of such “intelligence” in the US govt reports, and sounds not unlike the Steele dossier’s reports of a debate inside the Russian govt over the emails.

      May 6, 2016 EUtimes.net article

      Twitter search at the time

      Gateway Pundit repeated EUTimes story

      Napolitano on “Fox” talking of debate in Kremlin

      https://video.foxbusiness.com/v/4886982949001/#sp=show-clips

  14. Posted Apr 20, 2019 at 4:15 AM | Permalink

    Reblogged this on Climate- Science and commented:
    Don’t worry about climate change. Al Gore and the UN are dead wrong on climate fears. Their global warming scare is not driven by science – no, it is driven by politics.

7 Trackbacks

  1. […] cold waves are caused by global warming, why are they decreasing? Also, Climate Audit considers the Pages2K Antarctic temperature data. It’s a highly technical blog, but if you really want to get into the science (whatever that […]

  2. […] Reposted from Climate Audit by Steve McIntyre […]

  3. By PAGES2K (2017): Antarctic Proxies | on Feb 7, 2019 at 5:04 PM

    […] Reposted from Climate Audit by Steve McIntyre […]

  4. […] https://climateaudit.org/2019/02/01/pages2k-2017-antarctic-proxies/ […]

  5. By Weekly Climate and Energy News Roundup #347 | on Feb 11, 2019 at 5:10 AM

    […] https://climateaudit.org/2019/02/01/pages2k-2017-antarctic-proxies/ […]

  6. […] Link: https://climateaudit.org/2019/02/01/pages2k-2017-antarctic-proxies/#more-24072 […]

  7. […] Link: https://climateaudit.org/2019/02/01/pages2k-2017-antarctic-proxies/#more-24072 […]