Whitewashing IPCC Renewables: the Carbon Brief

The Carbon Brief, an advocacy site funded by the European Climate Foundation, as part of the ongoing whitewashing of the IPCC’s deceptive press release on renewables, today purported to blame journalists for being tricked by the press release, stating:

Journalists were also under no obligation to adopt the framing of the IPCC’s press release. The media’s practices – including constraints on journalists’ time – must therefore be held partially responsible for presenting the misleading impressions identified above.

Elsewhere in their article, the Carbon Brief attempts a limited hangout, conceding a few small points. Although the IPCC’s handling of the Greenpeace scenario was presented at Climate Audit, in keeping with standard Team practice, they do not cite Climate Audit, referring only vaguely, when necessary, to “critics”. Nor do they rebut the criticisms as expressed here. Nor did they even fully quote the critical part of the press release.


Lynas’ Questions

As most CA readers know by now, the following widely-disseminated lead statement to the IPCC press release announcing the Renewables Report was untrue.

Close to 80 percent of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies a new report shows.

On June 16, Mark Lynas asked four sensible questions about how IPCC came to make the untrue statement as follows:

Dear Dr Edenhofer,

I would also like to have a look at the archive of review comments, as requested by Steve McIntyre earlier. In addition I would ask for a response to the following questions, to which I will happily post your responses online for clarification:

1. what was the process for writing the press release, and who decided whether it faithfully represented the main conclusions of the SPM/main report?
2. why was the SPM released more than a month before the full report?
3. was Sven Teske in any way involved in the decision to highlight Teske et al, 2010 as one of the four ‘illustrative scenarios’ explored in greater depth as per Section 10.3.1?
4. what is the IPCC conflict of interest policy with regard to lead authors reviewing their own work, and having affiliations to non-academic institutions, whether campaign groups or companies?

I will post a note on my blog informing that these questions are with you and awaiting a response. Many thanks for your attention on this.

Mark

Later on June 16, Andy Revkin endorsed Lynas’ request and asked to be copied on any response. On June 17, Oliver Morton of the Economist added himself to the group asking about the press release.

May I add myself to the group asking for an account of how the press release was drafted, and who was involved?

While Edenhofer copied both Lynas and me on his two emails, he conspicuously did not address either of us directly. Yesterday, as reported here by Mark Lynas, IPCC WG3 chair Ottmar Edenhofer replied to Morton as follows:

Dear Oliver,

As I have written to Andrew Revkin, the press release was drafted by the WGIII and the Secretariat. Nick Nutall, spokesperson of the United Nations Environment Programme was acting IPCC spokesperson at the time of the Abu Dhabi meeting, because this position was vacant. He has drafted the first version, which was then reviewed by the Secretariat, the WGIII co-chairs, and the WGIII TSU. Sven Teske was not involved in the process of writing the press release.

It was based on the SPM but supplemented from the underlying chapters, for example with the numbers that describe the upper and the lower one of the four scenarios that have been analyzed in-depth:

“Over 160 [164] existing scientific scenarios on the possible penetration of renewables by 2050, alongside environmental and social implications, have been reviewed with four analyzed in-depth. These four were chosen in order to represent the full range. […]

The most optimistic of the four, in-depth scenarios projects renewable energy accounting for as much as 77 percent of the world‘s energy demand by 2050, amounting to about 314 of 407 Exajoules per year. […]

77 percent is up from just under 13 percent of the total primary energy supply of around 490 Exajoules in 2008. Each of the scenarios is underpinned by a range of variables such as changes in energy efficiency, population growth and per capita consumption. These lead to varying levels of total primary energy supply in 2050, with the lowest of the four scenarios seeing renewable energy accounting for a share of 15 percent in 2050, based on a total primary energy supply of 749 Exajoules.”

Best regards,

Ottmar

As too often in climate science, you have to watch the pea under the thimble. The statement at issue is the lead statement that ‘Close to 80 percent of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies a new report shows.’

Instead of squarely addressing the claim at issue, Edenhofer quotes a paragraph buried in the press release that wasn’t at issue.

So we know a little bit more about the process. The press release was drafted first by Nick Nutall of UNEP, and the press release with the opening false statement was ‘reviewed’ – and presumably signed off on – by the IPCC Secretariat, the WG3 co-chairs (including Edenhofer) and the WG3 TSU, with Teske apparently not being involved. Teske was, however, very fast off the mark, to say the least, as he issued a press release from Abu Dhabi on May 9 that couldn’t have postdated the IPCC press release by very long.

They say that ‘it was based on the SPM but supplemented from the underlying chapters’. However, I, for one, cannot find support for the claim that ‘close to 80 percent of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies a new report shows.’ I’ve requested a reference from Edenhofer, thus far without success.

Thus, in respect to Lynas’ questions:

1. what was the process for writing the press release, and who decided whether it faithfully represented the main conclusions of the SPM/main report?

See the immediately preceding comments. It seems to have been the WG3 co-chairs and the Secretariat.

2. why was the SPM released more than a month before the full report?

No answer.

3. was Sven Teske in any way involved in the decision to highlight Teske et al, 2010 as one of the four ‘illustrative scenarios’ explored in greater depth as per Section 10.3.1?

No answer.

4. what is the IPCC conflict of interest policy with regard to lead authors reviewing their own work, and having affiliations to non-academic institutions, whether campaign groups or companies?

No answer – other than Pachauri’s statement to Oliver Morton that AR5 authors are not obliged to comply with recently passed IPCC conflict of interest policies.

Pachauri: No Conflict of Interest Policy for AR5

Yesterday, IPCC chairman Pachauri told Oliver Morton of The Economist at an IPCC event in Brussels that conflict of interest policies would not apply to AR5 authors. IPCC thereby sabotaged recommendations from the Interacademy Council and announced its plans to evade the conflict of interest policies passed at the 33rd IPCC plenary only a month ago.


IPCC Sabotages an Interacademy Recommendation

In the wake of Climategate, IPCC was more or less forced to establish a review of its procedures, carried out by the Interacademy Council. One of its key recommendations was on conflict of interest – more on this later. A related recommendation called for the formation of an Executive Committee, with at least 3 members not being IPCC insiders:

The IPCC should establish an Executive Committee to act on its behalf between Plenary sessions. The membership of the Committee should include the IPCC Chair, the Working Group Co-Chairs, the senior member of the Secretariat, and 3 independent members, including some from outside of the climate community. Members would be elected by the Plenary and serve until their successors are in place.

A pretty sensible and long overdue recommendation. Independent board members are commonplace in far less prominent organizations.

In researching the conflict of interest policy, I noticed that even this mild recommendation was sabotaged by the IPCC. Here is the resolution on governance passed at the recent 33rd session in Abu Dhabi:

2.3.3 The Composition of the Executive Committee will be as follows:
a. Members:
IPCC Chair (who will chair the Executive Committee)
IPCC Co-Chairs of Working Groups I, II and III and of the Task Force on Inventories
IPCC Vice Chairs

b. Advisory Members:
Head of Secretariat
The four Heads of the Technical Support Units

No independent members on the Executive Committee. Instead of independent members, including ones not from the climate community, staff members will serve as ‘advisory members’.

Responses from IPCC SRREN

Some follow-up on responses to yesterday’s post by IPCC and others.

My interest in SRREN had been attracted by the following lead to the IPCC press release announcing SRREN:

Close to 80 percent of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies a new report shows.

This claim was widely covered as googling ’80 percent of the world‘s energy supply could be met by renewables’ will show.

I commented acidly on the execrable IPCC policy of issuing the press release before the report itself became available. I’d made this criticism previously, but this time Andy Revkin agreed. In my post, I observed that the scenario highlighted by IPCC was an extreme case among the scenarios, that the extreme scenario came from Greenpeace and that a Greenpeace employee was a Lead Author of the chapter supposedly carrying out an independent assessment.

The post was covered by Mark Lynas and Andy Revkin among others and has led to responses from IPCC. None of the responses rebut any of the criticisms.

Again, let’s start with the statement that had originally caught both my eye and the eye of the world’s media:

Close to 80 percent of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies a new report shows.

Based on my reading of the document so far (and it’s only been available a short time), this statement is untrue on its face. As far as I can tell, the report does NOT show that ‘close to 80 percent of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies’. Yes, it lists a scenario from Greenpeace in which 77% of world energy is supplied by renewables, but the report itself did not conduct any independent assessment of the validity of the Greenpeace scenario and did not ‘show’ that the claim in the press release was true.

Ottmar Edenhofer of IPCC sent an email to Revkin cc me and Lynas, in which he said that later statements in the press release put the 80 percent claim into ‘perspective’.

It is important to note that the press release put the 80% figure into perspective:

“Over 160 [164] existing scientific scenarios on the possible penetration of renewables by 2050, alongside environmental and social implications, have been reviewed with four analyzed in-depth. These four were chosen in order to represent the full range. […]

The most optimistic of the four, in-depth scenarios projects renewable energy accounting for as much as 77 percent of the world‘s energy demand by 2050, amounting to about 314 of 407 Exajoules per year. […]

77 percent is up from just under 13 percent of the total primary energy supply of around 490 Exajoules in 2008. Each of the scenarios is underpinned by a range of variables such as changes in energy efficiency, population growth and per capita consumption. These lead to varying levels of total primary energy supply in 2050, with the lowest of the four scenarios seeing renewable energy accounting for a share of 15 percent in 2050, based on a total primary energy supply of 749 Exajoules.”

A couple of points here.

This later ‘perspective’ still doesn’t support the rash assertion that they led with and that interested so many people. The later ‘perspective’ concedes that there were other scenarios. However, the lead assertion was that the report ‘showed’ that close to 80 per cent of 2050 energy could be provided by renewables with the right public policy. All Edenhofer has done here is point to a section of the press release that wasn’t untrue. No one said that everything in the press release was untrue. The critical issues are (1) whether the lead statement – the one that was actually covered – was untrue and (2) whether it was based on a Greenpeace scenario that had not been independently assessed.

The other line of supposed rebuttal has been that some other lead authors in the SRREN were drawn from industry. Sven Teske of Greenpeace stated:

With Exxon, Chevron and the French nuclear operator EDF also contributing to the IPCC, trying to suggest that this expert UN body is a wing of Greenpeace is preposterous.

I doubt that any Exxon, Chevron or EDF employees were Lead Authors of IPCC chapters assessing their own articles or scenarios. If any were, that is unacceptable as well. But none of this is really relevant to the situation at hand. The issue is whether Teske should have been involved in the direct assessment of the Greenpeace scenario. Did anyone actually do any independent due diligence on the Greenpeace scenario to see if it made any sense beyond being numbers pulled out of the air? My surmise right now is that IPCC didn’t do any independent due diligence on the Greenpeace scenario – this doesn’t mean that they are ‘wrong’, only that the IPCC claims on its behalf were unsupported.

Edenhofer also stated that Teske had been nominated by the German government.

Sven Teske was nominated as an author by the German government and selected by the WGIII as Lead author in the IPCC’s continuous effort to draw on the full range of expertise, and this includes NGOs and business as well as academia.

In the case at hand, I am unfamiliar with German politics, but it seems odd to me that the German government nominated a Greenpeace activist to SRREN. Even if Teske was nominated by the German government, that does not justify IPCC’s decision to let Teske be involved in the assessment of his own scenarios. Would a more thorough assessment have been done if Teske had not been involved? We don’t know.

While these backstories are interesting, I urge readers not to lose sight of the original point. The following IPCC claim appears to be untrue:

Close to 80 percent of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies a new report shows.

Yesterday I asked Edenhofer to support this assertion or withdraw it as follows:

The opening sentence of your May 9, 2011 press release says:

Close to 80 percent of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies a new report shows.

Quite aside from the matter of a Greenpeace author assessing his own work, the above assertion – one that was widely covered in the world press – appears to be untrue based on my reading of the report itself to date. I am unable to see anything in the report that ‘shows’ that 80% of the world’s energy could be met by renewables ‘if backed by the right enabling public policies’. The Greenpeace scenario merely asserts this, but does not ‘show’ this result. Nor, to my knowledge, is this assertion ‘shown’ in any section of the report. If I am incorrect in my reading, I would appreciate a reference to the section where the report ‘shows’ that ‘close to 80 percent of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies’.

Otherwise, you should issue a new press release, withdrawing the above apparently untrue statement from your press release.

No response yet.

IPCC WG3 and the Greenpeace Karaoke

On May 9, 2011, the IPCC announced (archive)

Close to 80 percent of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies a new report shows.

In accompanying interviews, IPCC officials said that the obstacles were not scientific or technological, but merely a matter of political will.

Little of the increase was attributed to ‘traditional’ renewables (hydro and ‘traditional’ biomass, mostly dung); most was to come from solar, wind and non-traditional biomass.

I, for one, was keenly interested in how IPCC got to its potential 80%. Unfortunately, in keeping with execrable IPCC practices, the supporting documents for the Renewables Study were not made available at the time of the original announcement. (Only the Summary for Policy-makers was made available at the time.) Even the SPM revealed one worrying aspect of the announcement. The report was based on 164 ‘scenarios’, and the ‘up to 80%’ scenario in the lead sentence of their press release was not representative of the scenarios, but the absolute top end. This sort of press release is not permitted in mining promotions, and it remains a mystery to me why it is tolerated in academic press releases or press releases by international institutions.

The underlying report was scheduled for release on June 14 and was released today on schedule. Naturally, I was interested in the provenance of the 80% scenario and in determining precisely what due diligence had been carried out by IPCC to determine the realism of this scenario prior to endorsing it in their press release. I hoped against hope that it would be something more than an IPCC cover version of a Greenpeace study but was disappointed.

The scenarios are in Chapter 10 of the report. The authors of the chapter are as follows (mainly German):

CLAs – Manfred Fischedick (Germany) and Roberto Schaeffer (Brazil). Lead Authors: Akintayo Adedoyin (Botswana), Makoto Akai (Japan), Thomas Bruckner (Germany), Leon Clarke (USA), Volker Krey (Austria/Germany), Ilkka Savolainen (Finland), Sven Teske (Germany), Diana Ürge‐Vorsatz (Hungary), Raymond Wright (Jamaica).

The 164 scenarios are referenced to a just-published and paywalled article by two of the Lead Authors (Krey and Clarke, 2011, Climate Policy). Update – Since this article has been relied upon in an IPCC report, it is liberated here.

Chapter 10 isolated four scenarios for more detailed reporting, one of which can be identified with the scenario featured in the IPCC press release. The identification is on the basis of Table 10.3, which shows 77% renewables in 2050 for the ER-2010 scenario attributed to Teske et al., 2010 (Teske being another Chapter 10 Lead Author). This scenario is described as follows:

Low demand (e.g., due to a significant increase in energy efficiency) is combined with high RE deployment, no employment of CCS and a global nuclear phase-out by 2045 in the third mitigation scenario, Advanced Energy [R]evolution 2010 (Teske et al., 2010) (henceforth ER-2010).

Teske et al 2010 – online here – is cited as follows:

Teske, S., T[homas] Pregger, S[onja] Simon, T[obias] Naegler, W[ina] Graus, and C[hristine] Lins (2010). Energy [R]evolution 2010—a sustainable world energy outlook. Energy Efficiency, doi:10.1007/s12053-010-9098-y.

However, googling the title led me first to a different article with almost the same title, ‘energy [r]evolution: A SUSTAINABLE GLOBAL ENERGY OUTLOOK’, online here. This version is a joint publication of Greenpeace and the European Renewable Energy Council, self-described as the ‘umbrella organisation of the European renewable energy industry’. The title page shows:

project manager & lead author – Sven Teske
EREC Oliver Schäfer, Arthouros Zervos,
Greenpeace International – Sven Teske, Jan Béranek, Stephanie Tunmore
research & co-authors
DLR, Institute of Technical Thermodynamics, Department of Systems Analysis and
Technology Assessment, Stuttgart, Germany: Dr. Wolfram Krewitt, Dr. Sonja Simon, Dr. Thomas Pregger.
DLR, Institute of Vehicle Concepts, Stuttgart, Germany: Dr. Stephan Schmid
Ecofys BV, Utrecht, The Netherlands: Wina Graus, Eliane Blomen.

The preface to the Greenpeace report is by one R.K. Pachauri, who stated:

This edition of Energy [R]evolution Scenarios provides a detailed analysis of the energy efficiency potential and choices in the transport sector. The material presented in this publication provides a useful basis for considering specific policies and developments that would be of value not only to the world but for different countries as they attempt to meet the global challenge confronting them. The work carried out in the following pages is comprehensive and rigorous, and even those who may not agree with the analysis presented would, perhaps, benefit from a deep study of the underlying assumptions that are linked with specific energy scenarios for the future.

Dr. R. K. Pachauri
DIRECTOR-GENERAL, THE ENERGY AND RESOURCES INSTITUTE (TERI) AND CHAIRMAN, INTERGOVERNMENTAL PANEL ON CLIMATE CHANGE (IPCC)

Returning now to the original lead to the IPCC Press Release on renewables:

Close to 80 percent of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies a new report shows.

The basis for this claim is a Greenpeace scenario. The Lead Author of the IPCC assessment of the Greenpeace scenario was the same Greenpeace employee who had prepared the Greenpeace scenarios, the introduction to which was written by IPCC chair Pachauri.

The public and policy-makers are starving for independent and authoritative analysis of precisely how much weight can be placed on renewables in the energy future. They expect more from IPCC WG3 than a karaoke version of the Greenpeace scenario.

It is totally unacceptable that IPCC should have had a Greenpeace employee as a Lead Author of the critical Chapter 10, that the Greenpeace employee, as an IPCC Lead Author, should (like Michael Mann and Keith Briffa in comparable situations) have been responsible for assessing his own work and that, with such inadequate and non-independent ‘due diligence’, IPCC should have featured the Greenpeace scenario in its press release on renewables.

Everyone in IPCC WG3 should be terminated and, if the institution is to continue, it should be re-structured from scratch.

Lindzen’s PNAS Reviews

Chip Knappenberger has published Lindzen’s review correspondence with PNAS at Rob Bradley’s blog here. Most CA readers will be interested in this and I urge you to read the post, taking care to consult the attachments. (I would have preferred that the post include some excerpts from the attachments.)

The post focuses to a considerable extent on PNAS’ departures from their review policy, but there are some other interesting features in the correspondence, which I’ll discuss here, referring readers to the original post for the PNAS issues.

A PNAS letter to its members observes:

very few Communicated and Contributed papers are rejected by the Board. Last year approximately 800 Communicated and 800 Contributed papers were submitted, of which only 32 Communicated and 15 Contributed papers were rejected. These numbers are not exceptional by historical standards extending at least the past 15 years

The rejection of Lindzen’s paper is an unusual event. NAS members submitting a paper are asked to provide two reviews (they are permitted to select their own reviewers.) NAS policy on referees says:

we have adopted the NSF policy concerning conflict of interest for referees (http://www.pnas.org/site/misc/coi.shtml), which states that individuals who have collaborated and published with the author in the preceding four years should not be selected as referees.

Both Happer and Chou, according to Lindzen, met this criterion. (One of the overlooked implications of Wegman’s analysis is that the extensive collaboration between paleoclimate authors makes it that much harder to find referees who meet NSF standards – it’s too bad that they didn’t add this criterion to the analysis.)

PNAS rejected the referees as follows:

Both scientists are formally eligible for refereeing according to the PNAS rules, but one of them (WH) is certainly not an expert for the topic in question and the other one (MDC) has published extensively on the very subject together with Lindzen. So, in a sense, he is reviewing his own work…

it is good scientific practice to involve either some of those who have raised the counter-arguments (and may be convinced by an improved analysis) in the review or to solicit at least the assessment of leading experts that have no direct or indirect affiliation with the authors.

Instead of their normal cozy practices, PNAS replied by suggesting that the submission be reviewed by “Susan Solomon, Kevin Trenberth, Gavin Schmidt, James G. Anderson and Veerabhadran Ramanathan”, saying that “the Board will seek the comments of at least one of these reviewers unless you have any specific objections to our contacting these experts”. Lindzen disputed PNAS’s characterization of Happer and Chou, saying it was not factual. In the end, PNAS obtained four reviews, two of which were respectful, recommending reworking, and two of which were acrimonious. Lindzen surmised that PNAS, contrary to its standard practices, had retained reviewers to whom he had objected.

Some of the comments in the reviews – see here – are intriguing. For example, Reviewer 2 stated:

The poor state of cloud modeling in GCMs has been amply demonstrated elsewhere and the effect of this on climate sensitivity is well documented and acknowledged.

While cloud uncertainties are mentioned in IPCC AR4, I would not say that the effect of various cloud parameterizations on climate sensitivity is ‘well documented’ in IPCC. Quite the opposite. IPCC’s description of clouds is, in my opinion, far too cursory given the importance of the problem.

The reviewer continues with the following list of problems with theory in the area:

While the stated result is dramatic, and a remarkable departure from what analysis of data and theory has so far shown, I am very concerned that further analysis will show that the result is an artifact of the data or analysis procedure. The result comes out of a multi-step statistical process. We don’t really know what kind of phenomena are driving the SST and radiation budget changes, and what fraction of the total variance these changes express, since the data are heavily conditioned prior to analysis. We don’t know the direction of causality – whether dynamically or stochastically driven cloud changes are forcing SST, or whether the clouds are responding to SST change. Analysis of the procedure suggests the former is true, which would make the use of the correlations to infer sensitivity demonstrably wrong, and could also explain why such a large sensitivity of OLR to SST is obtained when these methods are applied.

Let’s stipulate that all of this is true. Shouldn’t this then be stated prominently in IPCC? The IPCC SPM says “Cloud feedbacks remain the largest source of uncertainty” but this hardly does justice to the long list of problems that worry reviewer 2.

And doesn’t reviewer 2 prove too much here? If all of these problems need to be solved prior to publishing an article in the field, wouldn’t this apply to all articles, not just ones whose implications are low sensitivity?

Reviewer 2 complains that methodological details are inadequate:

Sufficient description is necessary so that another experimenter could reproduce the analysis exactly. I don’t think I could reproduce the analysis based on the description given. For example, exactly how were the intervals chosen? Was there any subjectivity introduced?

Look, I’m highly supportive of this type of criticism. Lindzen disputes the criticism. But it is hardly standard practice in climate science to provide adequate methodology, let alone data. I’ve unsuccessfully sought assistance from journals in getting data. I’m all in favor of replication and hope that this precedent extends to the Team as well. Several years ago, I asked PNAS to require Lonnie Thompson to provide a detailed archive of Dunde and other data so that inconsistent versions could be reconciled. PNAS refused.

The more sympathetic reviewers wanted to understand why Lindzen’s results differed from Trenberth’s and asked for a reconciliation:

I feel that the major problem with the present paper is that it does not provide a sufficiently clear and systematic response to the criticisms voiced following the publication of the earlier paper by the same authors in GRL, which led to three detailed papers critiquing those findings.

and

If the paper were properly revised, it would meet the top 10% category. 2) The climate feedback parameter is of general interest. 3) I answered no, because the exact same data have been used by others to get an opposing answer and I do not see any discussion or evidence as to why one is correct and the other is not.

That point seems reasonable enough to me. However, when I asked that IPCC provide similar reconciliation of Polar Urals versus Yamal, Briffa said that it would be “inappropriate” to do so, and that was that.

While Lindzen could have accommodated the last two reviewers, he decided that it would be impossible to accommodate the first two reviewers and he submitted elsewhere.

Compare these reviews to Jones’ puffball reviews, which were some of the most important Climategate documents. Prior to Climategate, people may have suspected that close collaborators were reviewing one another’s work (as Wegman had hypothesized), but no one knew for sure. People may have suspected that pals gave one another soft reviews, but no one knew for sure. Jones’ reviews of submissions by Mann, by Schmidt, by Santer were proof.

McShane and Wyner Weights on Mann 2008 Proxies

Most CA readers are aware that proxy reconstructions use linear methods and that, accordingly, all the huffing and puffing of complicated multivariate methodologies simply ends up assigning a vector of weights. Surprisingly, this obvious point was not understood by paleos when I started in this field.

Because one can assign a vector of weights, it’s possible to make highly informative maps showing the weights of proxies by the size of the disk at the proxy location, designating the sign by a different color. Unfortunately, this sensible practice of examining proxy weights has not been adopted by paleos. Their failure to show proxy weights inevitably leads to quite a bit of (in my opinion) aimless thrashing, with the Smith paper being a recent example.
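To make the point concrete, here is a minimal sketch – toy data and a plain principal-components regression standing in for the published methods, not the actual McShane-Wyner code – showing that such a reconstruction collapses to a single weight per proxy:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_proxies, n_pcs = 150, 20, 3

X = rng.standard_normal((n_years, n_proxies))  # proxy matrix (years x proxies)
temp = rng.standard_normal(n_years)            # instrumental target (toy data)

# principal-components regression: project proxies onto k PCs, regress target
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Xc @ Vt[:n_pcs].T
beta, *_ = np.linalg.lstsq(pcs, temp - temp.mean(), rcond=None)

# because every step is linear, the whole procedure reduces to one
# weight per proxy: recon = Xc @ w
w = Vt[:n_pcs].T @ beta
recon = Xc @ w
assert np.allclose(recon, pcs @ beta)
```

Plotting each element of `w` as a disk at its proxy’s location, sized by magnitude and colored by sign, gives exactly the kind of map described above.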

Smith was perplexed by the difference between two McShane-Wyner reconstructions. The figure below shows what bothered him. In this case, retaining one PC led to an MBH-style Stick, while retaining 10 PCs produced a pronounced MWP.


Figure 1. Mc-W Figure 14. Red – with one PC; green – with 10 PCs.

Last fall, I tweaked the McShane-Wyner code so that the weights of the various proxies were extracted in the process. I prepared the graphics below last October but didn’t post them at the time – I guess that I must have gotten distracted by something else.

The figure below shows the weights for the “red” reconstruction (made from one retained PC). The overwhelming weighting of US southwestern tree ring chronologies is evident. The largest weights are in gridcells with bristlecones. Nothing else really contributes. This is a classic distribution of weights in a Mann network. The MBH reconstruction in its various guises also weights the bristlecones. The reason why supposedly “independent” reconstructions look so similar is that the proxies are commonly not independent. The Mann et al 2008 network of 1209 proxies contains the same Graybill bristlecone chronologies as MBH98.

Notice that the Central American lake sediment series (which have a prominent MWP) are flipped over. This is a result of the PC algorithm. Flipping them over makes them line up better with the bristlecones.
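This flipping is the familiar sign indeterminacy of principal components: an SVD-based decomposition is only defined up to the sign of each component, so a series can enter the reconstruction upside-down whenever that lines it up better with the dominant pattern. A toy illustration (made-up data, not the actual proxy network):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
Xc = X - X.mean(axis=0)

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# flip the sign of the first component; the factorization is equally valid
flip = np.ones(5)
flip[0] = -1
U2 = U * flip             # flip the matching columns of U
Vt2 = Vt * flip[:, None]  # ... and the matching rows of Vt

# the data are reproduced exactly either way ...
assert np.allclose(U2 @ np.diag(s) @ Vt2, Xc)
# ... but every series' loading on PC1 has changed sign
assert np.allclose(Vt2[0], -Vt[0])
```

Nothing in the algorithm anchors the orientation to the physics of the proxy; the sign is chosen by the data, which is how a series with a prominent MWP can end up flipped to cohere with the bristlecones.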


Figure 2. Weights for Red Reconstruction. The orange dot is used only because a red dot was too big. Red – positive weight; blue – negative weight.

Update – For amusement, here are the weights for Gavin Schmidt’s selection of 55 of 93 proxies using the McShane Wyner Figure 14 method. Needless to say, Gavin’s reconstruction is fully addicted to bristlecones. For greater certainty, Gavin flipped over the Central American lake sediment series that offended Smith.

The green reconstruction is not dominated by bristlecones. More prominent are sediment series in Central America, a speleothem in Yemen and some Chinese speleothem series. This time, there are a number of negatively oriented series: some tree ring series in the southeast US that were originally reported as precipitation proxies, and a speleothem in Scotland. The Tiljander series are flipped over to cohere better with the lake sediments in Central America.


Figure 3. Weights for Green Reconstruction.

Obviously lots of readers will “like” the green series, but it’s not clear to me that it makes any more sense than the red reconstruction.

In my opinion, the problem is that you can’t simply throw a bunch of inconsistent time series into a multivariate mannomatic and expect to get a statistically significant response. If a scientist cannot specify the sign of a proxy in advance, then the proxy shouldn’t be used.

Richard Smith (2011) and the Graybill Bristlecones

Richard Smith’s new paper doesn’t mention Graybill bristlecones, but once again, his paper does nothing more than discover what we already knew – that Graybill bristlecones have a HS shape. In the process, Smith amusingly discovers a “divergence” problem with lake sediments.

Smith’s new paper applies the methodology of his earlier paper to the “new” dataset used in McShane and Wyner 2010. Smith says that his earlier paper “used the NOAMER tree ring dataset, which consists of 70 temperature series constructed from tree rings for 581 years (1400-1980).” In the preceding post, I observed that the data set in question consisted of tree ring chronologies, which cannot be assumed to be “temperature series”.

The McShane-Wyner dataset considered by Smith consists of 93 Mann et al 2008 series that go back to 1000. Smith attributes the seeming instability of reconstructions to lake sediment records (observing that there are 12 within the dataset), pondering the possibility of “divergence” problems in lake sediments – a possibility that, according to Smith’s belief, had evaded the keen eye of paleos.

Smith defines divergence as follows:

Paleoclimatologists have coined the term “divergence” to describe cases in which the stationarity assumption appears to be breaking down within the timescale of observational data.

While this is a sensible definition, I’m not sure that it accurately characterizes the term’s application by the Team, where the phenomenon is in practice limited to series that don’t go up. Smith’s definition would also include series that go up too much. Far from being perceived as examples of “divergence”, such series are welcomed by the Team.

Smith continues:

The best known example of divergence concerns trees; see for example Briffa et al. 1998 or pages 48-52 of North et al. (2006). However, the problem does not (so far as is known) apply uniformly to all tree-ring proxies; the specific class of proxies for which it is known to be a problem are tree-ring latewood density records. However, most of the known records of this type go back no further than AD 1400; in particular, none of them are among the 93 proxies used in the present analysis (Dr. Michael Mann, personal communication). Therefore, it appears that the known divergence problem with tree rings is not responsible for the results in the present paper.

Smith attributes the instability to lake sediments, of which there are 12 in the McShane-Wyner dataset. Two of these, as CA readers are well aware, are upside-down Tiljander series, the modern portion of which is hugely contaminated by bridge-building and agriculture. Smith suggests that non-stationarity in lake sediment series might be a problem, noting that, to his knowledge, this possibility had not been previously considered:

To the best of my knowledge, no previous study has explicitly identified lake sediment records as subject to this problem, though with the benefit of hindsight, it seems obvious that lake sediment deposits in the late 20th century would be affected by anthropogenic activity other than increasing CO2.

Thousands of blog readers around the world are familiar with the fact that Mann used the modern (contaminated) portion of the Tiljander series, ironically upside down. Ross and I even went to the trouble of reporting this in a short comment in PNAS (not cited by Smith on this point.)

In this case, Smith is not complaining about the Tiljander sediments going up too much. His complaint is actually the opposite: some of the sediment series in the Mann 2008 data set have a pronounced medieval warm period.

Smith therefore examines a reduced dataset of 81 proxies using inverse regression on principal components and once again gets a characteristic HS shape – one that looks for all the world like the original Mann reconstruction.

There’s a simple reason. Smith once again has created a bristlecone reconstruction. The 81 series in the new data set include 18 Graybill bristlecone chronologies, ALL of which were in the 70-series NOAMER dataset of his previous paper.

Last time, Smith had 20 Graybill bristlecones out of 70. This time, the Graybill bristlecones constitute 18 of 81 series in the data set. Surprise, surprise, he gets the same answer.

Richard Smith on PC Retention

Richard Smith, a prominent statistician, has recently taken an interest in multiproxy reconstructions, publishing a comment on Li, Nychka and Ammann 2010 (JASA) here and submitting another article here. I’ll try to comment more on another occasion.

Today I want to express my frustration at the amount of ingenuity expended by academics on more and more complicated multivariate methods, without any attention or consideration to the characteristics of the actual data. For example, Smith (2010) attempts to analyze the MBH98 North American tree ring network without even once mentioning bristlecones.

Smith starts off as follows:

In this discussion, we use principal components analysis, regression, and time series analysis, to reconstruct the temperature signal since 1400 based on tree rings data. Although the “hockey stick” shape is less clear cut than in the original analysis of Mann, Bradley, and Hughes (1998, 1999), there is still substantial evidence that recent decades are among the warmest of the past 600 years.

Smith refers to MM2003, MM2005a and MM2005b, describing only one of a number of issues raised in those articles – Mannian principal components. Smith describes the network as follows:

The basic dataset consists of reconstructed temperatures from 70 trees for 1400–1980, in the North American International Tree Ring Data Base (ITRDB).

The dataset is located at Nychka’s website here and is a mirror image of the MBH tree ring network that we archived in connection with MM2005a. 20 of these are Graybill strip bark chronologies – the ones that were left out in the CENSORED directory.

Pause for a minute here. Leaving aside the quibble that we are talking about tree ring chronologies rather than “70 trees”, Smith has, without reflection, taken for granted that the 70 tree ring chronologies are 70 examples of “reconstructed temperature”. They aren’t. They are indices of tree growth at these 70 sites, which, in many cases, are more responsive to precipitation than to temperature. Academics in this field are far too quick to assume that things are “proxies” when this is something that has to be shown.

The underlying question in this field is whether Graybill strip bark bristlecone chronologies have a unique capability of measuring world temperature. We discussed this in MM2005b as follows:

While our attention was drawn to bristlecone pines (and to Gaspé cedars) by methodological artifices in MBH98, ultimately, the more important issue is the validity of the proxies themselves. This applies particularly for the 1000–1399 extension of MBH98 contained in Mann et al. [1999]. In this case, because of the reduction in the number of sites, the majority of sites in the AD1000 network end up being bristlecone pine sites, which dominate the PC1 in Mann et al. [1999] simply because of their longevity, not through a mathematical artifice (as in MBH98).

Given the pivotal dependence of MBH98 results on bristlecone pines and Gaspé cedars, one would have thought that there would be copious literature proving the validity of these indicators as temperature proxies. Instead the specialist literature only raises questions about each indicator which need to be resolved prior to using them as temperature proxies at all, let alone considering them as uniquely accurate stenographs of the world’s temperature history.

Most “practical” readers of this blog have no difficulty in understanding this point, whereas academics in this field prefer to consider the matter via abstract policies on PC retention, with Smith being no exception.

Smith’s approach was to regress world temperature against principal components of the MBH tree ring network (with all 20 Graybill chronologies), varying the number of retained principal components and examining the fit. Smith described the problem as an inverse regression, i.e. regressing the “cause” (world temperature y) against the “effects” (the proxies, denoted x).

While Smith says that this is a “natural” way to look at the data, I don’t think that OLS regression of cause against a very large number of series is “natural” at all. On the contrary, if one looks at this methodology even with relatively simple pseudoproxies, it is a very poor method. (There’s a 2006 CA post on these issues that IMO is a very good treatment.)
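The pseudoproxy failure mode is easy to reproduce. In the sketch below (numpy, pure-noise “proxies” containing no signal at all; the dimensions are arbitrary), OLS on a large X matrix fits the calibration period almost perfectly and then fails completely out of sample:

```python
import numpy as np

rng = np.random.default_rng(2)
n_cal, n_val, p = 80, 80, 70          # short calibration period, many "proxies"

# Pure-noise pseudoproxies and a pure-noise "temperature": no signal anywhere
X = rng.standard_normal((n_cal + n_val, p))
y = rng.standard_normal(n_cal + n_val)
Xc, yc, Xv, yv = X[:n_cal], y[:n_cal], X[n_cal:], y[n_cal:]

# OLS of "cause" against all 70 "effects" over the calibration period
A = np.column_stack([np.ones(n_cal), Xc])
beta, *_ = np.linalg.lstsq(A, yc, rcond=None)

r2_cal = 1 - np.sum((yc - A @ beta) ** 2) / np.sum((yc - yc.mean()) ** 2)
pred = np.column_stack([np.ones(n_val), Xv]) @ beta
r2_val = 1 - np.sum((yv - pred) ** 2) / np.sum((yv - yv.mean()) ** 2)

# r2_cal is close to 1 despite the absence of any signal, while r2_val is
# negative, i.e. worse than simply predicting the mean
```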

In my opinion, if the tree ring series truly contain a “signal”, a much more “natural” approach is to calculate an average – an alternative that is seldom considered by academics in this field.
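Under the assumption that the series genuinely share a signal, the averaging alternative is trivial to implement and needs no tuning. A sketch on synthetic data (the noise level here is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(3)
n_years, n_proxies = 200, 30
signal = np.cumsum(rng.standard_normal(n_years))   # the common "climate" signal

# Each proxy = the common signal plus independent noise
X = signal[:, None] + 2.0 * rng.standard_normal((n_years, n_proxies))
X = (X - X.mean(axis=0)) / X.std(axis=0)

# The composite is just the equal-weighted average of the standardized series
composite = X.mean(axis=1)
r = np.corrcoef(composite, signal)[0, 1]   # the composite tracks the signal
```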

Reducing the number of proxy series in the X matrix makes the OLS regression problem less bad. Smith characterizes the OLS problem as one of overfitting and says that a “standard method for dealing with this problem” is to transform into principal components. Smith then turns to the problem of how many principal components to retain.

I don’t think that one can assume that principal components applied to the NOAMER tree ring network will automatically lead to good results.

Preisendorfer, a leading authority on principal components who was cited in MBH98, provided the following advice in his text – advice quoted at CA here:

The null hypothesis of a dominant variance selection rule [such as Rule N] says that Z is generated by a random process of some specified form, for example a random process that generates equal eigenvalues of the associated scatter [covariance] matrix S…

One may only view the rejection of a null hypothesis as an attention getter, a ringing bell, that says: you may have a non-random process generating your data set Z. The rejection is a signal to look deeper, to test further. One looks deeper, for example, by drawing on one’s knowledge and experience of how the map of e[i] looks under known real-life synoptic situations or through exhaustive case studies of e[i]‘s appearance under carefully controlled artificial data set experiments. There is no royal road to the successful interpretation of selected eigenmaps e[i] or principal time series a[j] for physical meaning or for clues to the type of physical process underlying the data set Z. The learning process of interpreting [eigenvectors] e[i] and principal components a[j] is not unlike that of the intern doctor who eventually learns to diagnose a disease from the appearance of the vital signs of his patient. Rule N in this sense is, for example, analogous to the blood pressure reading in medicine. The doctor, observing a significantly high blood pressure, would be remiss if he stops his diagnosis at this point of his patient’s examination. ….Page 269.

A ringing bell.

Applying Preisendorfer’s advice, the next scientific task is to determine whether Graybill bristlecone chronologies truly have a unique ability to measure world temperatures and, if so, why. This is the step urged on the field in MM2005b.

Instead of grasping this nettle – one that has been outstanding for a long time – Smith, like Mann and Wahl and Ammann before him, purported to argue that inclusion of bristlecones could be mandated “statistically” without the need to examine whether the proxies had any merit or not.

Smith’s approach was a little different from similar arguments by Mann and by Wahl and Ammann. Smith did a series of such regressions varying K, calculating the Akaike Information Criterion and other similar criteria for each regression, ultimately recommending 8 PCs – still giving a HS, though one that is not as bent as the original. Smith begged off consideration of the bristlecones as follows:

I have confined this discussion to statistical aspects of the reconstruction, not touching on the question of selecting trees for the proxy series (extensively discussed by M&M, Wegman, Scott, and Said and Ammann/Wahl) nor the apparent recent “divergence” of the relationship between tree ring reconstructions and measured temperatures (see, e.g., NRC 2006, pp. 48–52). I regard these as part of the wider scientific debate about dendroclimatology but not strictly part of the statistical discussion, though it would be possible to apply the same methods as have been given here to examine the sensitivity of the analysis to different constructions of the proxy series or to different specifications of the starting and ending points of the analysis.

I strongly disagree with Smith’s acquiescence in failing to grasp the nettle of the Graybill chronologies. The non-robustness of results to the presence/absence of bristlecones should have been clearly reported and discussed.
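For concreteness, the selection procedure described above (regress y on the first K PC scores, score each fit with AIC, keep the minimizing K) can be sketched as follows. This is a toy illustration on invented factor-structured data, not Smith’s actual code:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p, k_true = 150, 40, 3

# Synthetic network: k_true latent modes plus noise; y loads on the modes
scores = rng.standard_normal((n, k_true))
X = scores @ rng.standard_normal((k_true, p)) + rng.standard_normal((n, p))
y = scores @ np.array([1.0, 0.7, 0.4]) + 0.5 * rng.standard_normal(n)

X = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, _ = np.linalg.svd(X, full_matrices=False)
pcs = U * S                                    # all PC scores

def fit(K):
    """OLS of y on the first K PCs; return (RSS, AIC)."""
    A = np.column_stack([np.ones(n), pcs[:, :K]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = float(np.sum((y - A @ beta) ** 2))
    # parameter count: K slopes + intercept + error variance
    return rss, n * np.log(rss / n) + 2 * (K + 2)

results = {K: fit(K) for K in range(1, 16)}
best_K = min(results, key=lambda K: results[K][1])   # K minimizing AIC
```

Note that nothing in the criterion asks whether any individual series is a valid proxy; it only trades off fit against the number of retained PCs, which is the limitation argued above.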

Ron Broberg commented on Smith (2010) here. Broberg referred to a number of my posts on principal components and commented acidly on my failure to propose a “good rule for the retention of PCs”:

I’m listing some of Steve McIntyre’s posts on the number of PCs to retain. If, after reading these, you still don’t know what McIntyre believes to be a good rule for the retention of PCs, then at least I know I’m not alone. If I have missed something, please let me know.

While Broberg may be frustrated, our original approach to the problem was one of auditing and verification, i.e. we began by examining the MBH policy for retention of principal components. We tried strenuously to figure out what Mann had done and were unable to do so. Mann’s criteria for PC retention remain unexplained and unknown to this day. Broberg may be frustrated, but I can assure readers that I am far more frustrated that this important step in MBH remains unexplained.

In the case at hand, until one resolves whether Graybill bristlecone chronologies are a valid temperature proxy, I see no point in trying to opine on the “right” number of retained principal components. It seems to me that Smith begged the question with his initial statement that the 70 series in the NOAMER network were “reconstructed temperatures”. Maybe they are, maybe they aren’t. Surely that needs to be demonstrated scientifically, rather than assumed.