In #134, he said:

Re 133: On this site I think “undisclosed” usually means McIntyre has misunderstood something. He certainly uses it frequently where there is no reason to think anyone is hiding anything.

I replied:

Martin, you’ve made another accusation. Let’s work through an example – which is probably the one that’s been the most prominent. I originally said that the MBH reconstruction had adverse verification r2 results for the AD1400 step and that Mann failed to disclose the adverse results. You’ve made the accusation that this is just my “misunderstanding”. What have I misunderstood?

To which Juckes replied:

Re 140: see for example comment 15 on page 888: “Now there has to be some still undisclosed re-weighting.”

Also note that “You’ve made the accusation that this is just my “misunderstanding”” is clearly untrue.

So let’s look at #15, which discusses Juckes’ reconstruction mr_mbh_1000_cvm_nht_01.02.001_pc. The suffix “pc” is explained in Juckes’ SI as using the **unadjusted** first proxy PC as follows:

The optional suffix is used to describe variants of the MBH proxy collection:

pc: using the unadjusted first proxy PC.

The reverse engineering clearly shows that Juckes used the adjusted first proxy PC in this particular proxy reconstruction.

Juckes’ main article stated:

for the proxy principal components in the MBH collection the sign is arbitrary: these series have, where necessary, had the sign reversed so that they have a positive correlation with the northern hemisphere temperature record

Now in this particular example, it appears almost certain that the sign of the first proxy PC was not reversed so that it has a positive orientation to NH temperature. This example was re-visited with other examples in the post Replicating Juckes’ CVM.

It appears to me that in the “pc” reconstruction, 1) Juckes has used the adjusted proxy PC1 and 2) has not oriented the adjusted proxy PC1 to have a positive orientation with NH temperature. Juckes says that I’ve “misunderstood” what he did. OK Martin, what did I misunderstand? For the record, will you categorically assert that:

1) this “pc” reconstruction used the “unadjusted” proxy PC1;

2) this “pc” reconstruction used the “unadjusted” proxy PC1 with its orientation reversed so that it had a positive correlation with NH temperature?

These are pretty simple questions.
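The sign convention at issue can be made concrete. Here is a minimal Python sketch, with made-up series standing in for the proxy PC and the NH temperature record (nothing here is Juckes’ actual data), of the orientation rule his article describes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical series: a proxy PC (whose sign is arbitrary in PCA) and an
# instrumental NH temperature record over a common calibration period.
pc1 = rng.standard_normal(120)
nh_temp = -0.6 * pc1 + rng.standard_normal(120)  # negatively oriented here

def orient_to_target(series, target):
    """Flip the sign of `series` if it correlates negatively with `target`.

    This mirrors the stated convention for proxy PCs: since the sign of a
    principal component is arbitrary, reverse it where necessary so that
    it correlates positively with the NH temperature record.
    """
    r = np.corrcoef(series, target)[0, 1]
    return -series if r < 0 else series

oriented = orient_to_target(pc1, nh_temp)
print(np.corrcoef(oriented, nh_temp)[0, 1] > 0)  # True once oriented
```

If this step is skipped for a PC that happens to come out negatively oriented, the series enters the composite upside down, which is exactly the possibility at issue in questions 1) and 2).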

Thanks, I hadn’t seen that. (Still, the question was whether this was ever submitted for publication.) Presenting fruity flavors at AGU is a good thing. They publish their abstracts?

BTW I’ve got some good new stuff on Indigirka which I’ll post up tomorrow. Juckes said that he didn’t use it because it was “unpublished data”. Actually I just discovered that it has an alter ego which I’ve even referred to without realizing that they were the same.

If you are right about what these guys are doing – cherry-picking data-snooped series to deliver a pre-determined result – then there are two alternatives in response. One approach, which I had advocated from the start, was using bootstrapping to build an honest confidence interval. This would be innovative. A different approach, much simpler, would be to cherry-pick your own data-snooped series to obtain your own pre-determined, opposite result. This would not be innovative and constructive, but it would certainly make the counterpoint. You would have to know the proxies pretty well to do this (i.e. not just the proxy data stream, but all the sub-data streams that go into each proxy), and you apparently know your proxies. Possibly as well or better than the best in the world.

My question is: have you ever tried publishing your own cherry-picked, data-snooped, counterpoint recon? It would have an impact.

Secondly: I notice that on page 888 you are having trouble interpreting mr_mbh_1000_cvm_nht_01.02.001_pc: This figure could be interpreted as a coding error or as an illustration of the pitfalls of combining the use of proxy PCs and the composite approach. The problem is the arbitrary sign of the PCs. If this is not adjusted, the composite is likely to be meaningless because of the arbitrariness; if it is adjusted, estimates of significance can be compromised. Some such reconstructions (using adjusted PCs) are included in the discussion for comparison purposes, but for the main conclusions we avoid proxy PCs so as to avoid this problem. The curve you show on page 888 has an unadjusted PC, so it is basically meaningless.

I think that I’ve diagnosed what’s going on. One technique that I use for detective work on reconstructions is simply regressing the reconstruction against candidate proxies and looking at the coefficients. If you do this for Juckes #7, you get an exact match. Here’s the barplot of coefficients. Now there has to be some still undisclosed re-weighting. The coefficients are all sort of similar, but for #7, they were exact. So it’s hard to say what Juckes actually did with his weighting here. Reconstruction #9 is constructed somehow from these coefficients, but there’s something still missing from Juckes’ methodological description. Maybe he’ll tell us. But that’s not the main issue. You’ll notice that all the regression coefficients are positive.

Barplot of regression coefficients of Juckes reconstruction 9 ( mr_mbh_1000_cvm_nht_01.02.001_pc ) against Juckes archive of MBH99 proxies 20-32. The three left-most series are archived PC1s, with the negligibly weighted leftmost series the "unfixed" PC1. (The unfixed PC1 is used in Juckes #7.)
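The regression diagnostic described above can be sketched in a few lines of Python. The proxy matrix and weights here are synthetic stand-ins, not Juckes’ archive; the point is only that if a reconstruction really is a fixed linear combination of the candidate proxies, least squares recovers the weights exactly, whereas an inexact fit signals some undisclosed extra step:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins: a matrix of proxy series (columns) and a
# reconstruction that is secretly a fixed linear combination of them.
n_years, n_proxies = 600, 13
proxies = rng.standard_normal((n_years, n_proxies))
true_weights = rng.uniform(0.2, 1.0, n_proxies)
recon = proxies @ true_weights

# The diagnostic: regress the reconstruction on the candidate proxies.
# When the reconstruction is exactly a weighted sum of these series, the
# least-squares coefficients recover the weights to floating-point error.
coefs, *_ = np.linalg.lstsq(proxies, recon, rcond=None)
print(np.allclose(coefs, true_weights))  # True for an exact linear combo
```

A barplot of `coefs` is then the sort of figure shown here: near-zero bars flag proxies that carry negligible weight, and the signs of the bars show each proxy’s orientation in the composite.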

Now here’s a plot of the PC1-fixed. The PC1-fixed is the biased Mannian PC1 slightly shaved. It has the HS-ness of the correlation PC1. There are multiple problems with Mann’s purported “fixing” of his PC1. (The only real solution to biased bristlecones is the one recommended by the NAS panel – avoid them in temperature reconstructions. Of course Juckes didn’t do this.) In the AD1000 network, the network has a disproportionate number of bristlecones, so the vagaries of PC methodology don’t make a whole lot of difference (as we observed in MM05 EE). You still get a bristlecone shape under any method. The Mannian PC, of course, makes the HS more extreme; the "fixing" just takes it back to being close to an un-"fixed" correlation PC.

But here’s what’s amusing. In Juckes’ reconstruction #9, he forgot to flip the PC1. This is presumably the “coding error” mentioned above. Juckes undoubtedly realized what happened as soon as I pointed it out. Of course, he put the blame on me saying that it was me that was "having trouble interpreting" the reconstruction – as though this were my fault. I think that I’m interpreting this just fine.

Plot of Juckes PC1-"fixed" from archive.

Now if the issue is a "coding error", then it’s not that "I’m having trouble interpreting" the graphic.

Osborn et al 2004 has:

To continue with the approach selected here requires a choice to be made between optimising the reconstruction at either local or regional-average scales (or some intermediate scale). The results reported in the remainder of this paper relax the requirement of optimising the fit for each grid box (which the local linear regression achieved) by instead scaling each grid-box MXD series so that its variance matches the variance of the observed temperature series for that grid box. A “variance-matching” approach has been used by other studies, typically at a hemispheric scale (e.g., Jones et al., 1998; Crowley and Lowery, 2000). Though the variance matching does not give the optimal fit at the grid-box scale (i.e., the root-mean-squared error is not minimised), it yields a spatially-resolved reconstruction that, when regionally-averaged, matches the regionally-averaged temperature much more closely.

Essentially, regression usually has the effect that the model output for the dependent variable will have a reduced variance compared with the original data. Variance matching scales the model output to offset this. Whether it is valid or not is unclear, but, particularly when there is a time trend, it certainly worsens the explanatory power of the model.
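The variance-shrinkage point can be illustrated numerically. In this sketch (simulated temperature and proxy series, not real data), the OLS fit of temperature on the proxy has reduced variance, variance matching restores the variance by rescaling, and the cost is a larger squared error, as Osborn et al acknowledge:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical calibration: temperature T and a noisy proxy X.
T = rng.standard_normal(200)
X = 0.7 * T + rng.standard_normal(200)

# OLS of T on X: the fitted values have variance r^2 * var(T) < var(T).
slope = np.cov(X, T, bias=True)[0, 1] / X.var()
fitted = slope * (X - X.mean()) + T.mean()
print(fitted.var() < T.var())  # True: regression shrinks the variance

# Variance matching: rescale the proxy so its variance equals var(T).
matched = (X - X.mean()) * (T.std() / X.std()) + T.mean()
print(np.isclose(matched.var(), T.var()))  # True by construction

# The cost: variance matching no longer minimises the squared error.
print(((matched - T) ** 2).mean() >= ((fitted - T) ** 2).mean())  # True
```

Any linear rescaling away from the least-squares slope must increase mean squared error, which is why the RMSE caveat in the quoted passage is unavoidable rather than incidental.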
