2. You should have (at least) 4 settings. The grid-square displacement and the single-gridcell case are separate issues.

3. The 1930s impact is interesting. Good catch.

4. I wonder how much the RE is impacted by the methodology choices (minor, I assume, but I'm just curious).

5. Is your example here only the 1000 AD proxies or does it include the more recent stuff? Just want to make sure that you can emulate the more complicated Mann treatments. Why not emulate figures from his paper?

6. If you have questions about how the uncertainty was calculated, ask Mann. I would have told you to move on too, were I the reviewer. Also, the whole issue deserves its own post, or at least its own paragraph! Don't throw it in at the end.

require("Rcompression")

Is there an R package that already has this built in?

Re: Hu McCulloch (#9),

An ideal calibration would then take into account the empirical variance of the instrumental random walk, plus the empirical variance of the calibration equation, and then do a type of Kalman smoother under the assumption that the unobserved reconstruction temperature is continuing this random walk.

Yes, a random walk or any other prior for the signal. CCE + Kalman, that's what we need. In published dendro work, ICE gets through (Briffa98, see http://www.climateaudit.org/?p=4475#comment-314389 ; the implicit assumption is an i.i.d. signal, not a random walk), and so does variance matching + Butterworth smoothing (Mann). And Briffa's variance adjustment shows that they are not even interested in taking this topic seriously 😉
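The ICE-vs-CCE distinction above can be shown with a small simulation. ICE (inverse calibration) regresses temperature directly on the proxy, which shrinks the reconstruction toward the calibration-period mean; CCE (classical calibration) regresses the proxy on temperature and inverts the fitted equation. This is only a sketch with made-up numbers (signal-to-noise, coefficients), not data from any of the papers mentioned:

```r
# Sketch: ICE vs. CCE on synthetic data (all values illustrative assumptions).
# ICE regresses temperature on the proxy; CCE regresses the proxy on
# temperature and inverts. ICE shrinks amplitude; CCE does not.
set.seed(2)
n     <- 500
temp  <- rnorm(n, sd = 1)             # "true" temperature signal
proxy <- 2 * temp + rnorm(n, sd = 2)  # noisy linear proxy

ice <- lm(temp ~ proxy)               # inverse calibration estimator
cce <- lm(proxy ~ temp)               # classical calibration estimator

ice_rec <- fitted(ice)                           # ICE reconstruction
cce_rec <- (proxy - coef(cce)[1]) / coef(cce)[2] # inverted CCE reconstruction

# ICE understates the variance of the signal; CCE, if anything, inflates it
cat("sd(temp):", sd(temp),
    " sd(ICE):", sd(ice_rec),
    " sd(CCE):", sd(cce_rec), "\n")
```

The shrinkage is why an ICE-based reconstruction tends to look too flat outside the calibration period when the proxy is noisy.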

BTW, I think the Butterworth filter is at least an approximate solution to some Wiener-filter problems.

Mark


Our discussions of calibration here tend to think in terms of calibrating one date at a time, in isolation from all other dates. However, the instrumental data are, to a first approximation, roughly a random walk, so that adjacent dates are more alike than distant ones, and adjacent calibration data, and even instrumental data, can tell us a lot about any given date in question.

An ideal calibration would then take into account the empirical variance of the instrumental random walk, plus the empirical variance of the calibration equation, and then do a type of Kalman smoother under the assumption that the unobserved reconstruction temperature is continuing this random walk. The near end can even be pinned down with "perfect" certainty (relatively speaking) at the oldest instrumental date.

I’m still mulling over how to do this with Thompson’s CC03 ice core data, which is only a Z-mometer as it stands. The problem is somewhat complicated by the fact that, even given the variances, the slope uncertainty makes the pointwise reconstruction a ratio of two normals, and then the variance uncertainty also has to be integrated out, presumably numerically. But it’s not insurmountable, I think.
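The ratio-of-two-normals problem is straightforward to handle by Monte Carlo: draw the intercept and slope from their (approximate) normal sampling distributions, form the inverted reconstruction, and read off quantiles. The standard errors and proxy value below are made-up illustrations; integrating out the residual variance would just mean drawing it from its own posterior (e.g. scaled inverse chi-square) instead of fixing it:

```r
# Sketch: Monte Carlo interval for the pointwise reconstruction
# x_hat = (y - a_hat - e) / b_hat, a ratio involving normal draws.
# All numbers are illustrative assumptions, not from CC03.
set.seed(3)
a_hat <- 0.2; b_hat <- 1.5   # assumed calibration estimates
se_a  <- 0.1; se_b  <- 0.2   # assumed standard errors of those estimates
y0    <- 2.0                 # one proxy value to reconstruct
sig   <- 1.0                 # residual sd, here taken as known for simplicity

M <- 1e5
a_draw <- rnorm(M, a_hat, se_a)
b_draw <- rnorm(M, b_hat, se_b)
e_draw <- rnorm(M, 0, sig)
x_draw <- (y0 - a_draw - e_draw) / b_draw  # ratio of normals

quantile(x_draw, c(0.025, 0.5, 0.975))     # pointwise interval
```

With the slope well separated from zero, as here, the ratio behaves; when b_hat is only a few standard errors from zero the interval blows up, which is the Z-mometer problem in a nutshell.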
