Here’s my proposal for a multiproxy generalization of the univariate CI that (unlike Brown’s!) never comes up empty or imaginary:

This might be a relevant article:

Conservative confidence regions in multivariate calibration, Thomas Mathew and Wenxing Zha, Ann. Statist., Vol. 24, No. 2 (1996), 707–725.

In the multivariate calibration problem using a multivariate linear model, some conservative confidence regions are constructed. The regions are nonempty and invariant under nonsingular transformations.

…We see from Table 2 that the likelihood-based region due to Brown and Sundberg (1987) is the shortest, followed by our region (4.9) and Brown’s (1982) region (in cases where it is nonempty). Note that Brown’s region is exact and hence its coverage probability is the assumed confidence level of 95%.

Sent it to you.

]]>I tried implementing this and got very inconsistent results when testing beta = 0 using the F test or my t-based test.

I think my problem is here:

For any q×1 vector c of known constants,

c^T Y’ ~ c^T a + (c^T b) x’ + N(0, sigma^2(x’) c^T Gamma c).

In particular, since we know what b and G are, we may set c^T = b^T G^-1, so that

b^T G^-1 (Y’ − a) ~ (b^T G^-1 b) x’ + N(0, sigma^2(x’) b^T G^-1 Gamma G^-1 b).
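If a, b, and G really were known constants, dividing through by b^T G^-1 b would give a one-line point estimate of x’. A minimal sketch of that calculation (all parameter values hypothetical, with G taken equal to Gamma for simplicity):

```python
import numpy as np

# Hypothetical known calibration parameters (q = 3 proxies); all numbers
# are made up for illustration.
rng = np.random.default_rng(0)
a = np.array([0.1, -0.2, 0.05])     # q x 1 intercepts
b = np.array([0.8, 0.5, 1.1])       # q x 1 slopes
Gamma = np.diag([0.4, 0.3, 0.5])    # q x q disturbance covariance
G = Gamma                           # take G = Gamma for this sketch

x_true = 1.5
Y = a + b * x_true + rng.multivariate_normal(np.zeros(3), Gamma)

# b' G^-1 (Y - a) ~ (b' G^-1 b) x' + noise, so divide through to get x_hat.
Ginv = np.linalg.inv(G)
x_hat = b @ Ginv @ (Y - a) / (b @ Ginv @ b)
print(x_hat)
```

With truly known parameters this estimate is unbiased for x’; the retraction below is about what happens when b is itself estimated.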

In fact, since b is random (and c = b^T G^-1 is therefore perfectly correlated with b itself), it’s not really legitimate to set c^T = b^T G^-1, or at least when we do, b^T G^-1 (Y’ − a) does not really have a normal distribution. Rather, it is the convolution of a noncentral chi-squared distribution with q DOF with the normal distribution of the disturbance term, which is way too messy for me.
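The non-normality is easy to verify by simulation. A quick Monte Carlo sketch (all numbers hypothetical, with the estimation error in b made deliberately large so the departure from normality is visible; excess kurtosis of a normal is 0):

```python
import numpy as np

rng = np.random.default_rng(1)
q, n_rep = 3, 200_000
b = np.array([0.3, 0.2, 0.4])      # hypothetical true slopes
x_prime = 1.0                       # hypothetical true x'

# Estimation error in b_hat and disturbance in Y' - both unit variance
# here so the non-Gaussian cross-term dominates.
u = rng.standard_normal((n_rep, q))
e = rng.standard_normal((n_rep, q))
b_hat = b + u

# stat = b_hat' G^-1 (Y' - a) with G = I; the u'e cross-term is a sum of
# products of independent normals, so stat is not normally distributed.
stat = np.einsum('ij,ij->i', b_hat, b * x_prime + e)

z = (stat - stat.mean()) / stat.std()
excess_kurtosis = (z**4).mean() - 3.0   # ~0 for a normal distribution
print(round(excess_kurtosis, 2))
```

The clearly positive excess kurtosis confirms that treating the statistic as normal is not legitimate once b is random.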

So I’ll retract my proposal in #78. However, I still think Brown’s sometimes empty CI answers the wrong question. The answer must lie in the change in the F statistic from its minimum (as in #68 above), the LR statistic (which may amount to the same thing as the change in F asymptotically), or the posterior distribution (calculable for q > 2).

Back to the drawing board…

]]>Empty CI and infinite (or huge) CI are separate issues

That’s true. A more correct way to represent those empty cases would be no CIs at all (classical interpretation, #71). That post needs some updating, but the main point remains: calibration-residual-based CIs are not valid. And the IPCC’s explanation, with its ‘considerable 0.5 deg’ and ‘minimum uncertainties’, is just silly.

In MBH9x, I can see that each proxy should correlate better with its own local temperature than with global (or NH) temperature,

Not necessarily the local temperature; a proxy can be teleconnected to anywhere.

but does this step (with p > 1) really serve any purpose if all that is ultimately desired is an estimate of global (or NH) temperature?

They wanted more, local reconstructions as well; see for example http://www.ncdc.noaa.gov/cgi-bin/paleo/mannplot2.pl

]]>those floor-to-ceiling peaks are empty CI-cases.

Empty CI and infinite (or huge) CI are separate issues. Brown’s CI can come up empty because it asks the wrong question, namely whether the observation was possible or not in terms of the model. In this case, even the point estimate should be rejected if this is your question, IMHO.

My CI (in #78) instead asks how high or low x could have been, given the observation that somehow or other *did* occur. This can be huge, as in the univariate case, and will even be unbounded on at least one side if you ask for a confidence level that exceeds your confidence that the relationship is invertible (i.e. **beta** ≠ **0**).
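In the univariate case this inversion CI can be written down directly: collect the prediction-interval inequality into a quadratic in x’, and the solution set is a finite interval when the slope is significant at the chosen level, and unbounded (but never empty) otherwise. A sketch, with all inputs hypothetical:

```python
import numpy as np

def calibration_interval(a, b, s2, n, xbar, Sxx, y_new, t_crit):
    """Set of x' not rejected by the prediction-interval t-test.

    Inverts (y' - a - b x')^2 <= t^2 s2 (1 + 1/n + (x' - xbar)^2 / Sxx),
    collected as the quadratic inequality A x'^2 + B x' + C <= 0.
    """
    d = y_new - a
    t2s2 = t_crit**2 * s2
    A = b**2 - t2s2 / Sxx
    B = -2.0 * b * d + 2.0 * t2s2 * xbar / Sxx
    C = d**2 - t2s2 * (1.0 + 1.0 / n) - t2s2 * xbar**2 / Sxx
    if A <= 0:
        # Slope not significant at this level: the set is unbounded.
        return ('unbounded', None, None)
    r = np.sqrt(B**2 - 4.0 * A * C)   # discriminant > 0 whenever A > 0,
    return ('interval', (-B - r) / (2.0 * A), (-B + r) / (2.0 * A))
    # because the quadratic is negative at the naive estimate x' = d/b.

# Strong slope: a finite interval around the naive estimate (y' - a)/b = 0.5.
kind, lo, hi = calibration_interval(a=0.0, b=1.0, s2=0.04, n=30,
                                    xbar=0.0, Sxx=30.0, y_new=0.5, t_crit=2.05)
print(kind, round(lo, 2), round(hi, 2))

# Weak slope: unbounded, but never empty.
print(calibration_interval(a=0.0, b=0.05, s2=0.04, n=30,
                           xbar=0.0, Sxx=30.0, y_new=0.5, t_crit=2.05)[0])
```

Note that the set always contains the naive point estimate, which is exactly why this construction, unlike Brown’s, can never come up empty.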

Note also that p=1 does not hold in MBH9x, except for AD1000 and AD1400 steps.

In MBH9x, I can see that each proxy should correlate better with its own local temperature than with global (or NH) temperature, but does this step (with p > 1) really serve any purpose if all that is ultimately desired is an estimate of global (or NH) temperature?

I’m planning to try it out with Thompson’s 2006 CC data and will let you know what turns up. So far it looks like there really is a HS if you use his choice of core for each site, albeit a very noisy one.

]]>Confirm that there is no significant serial correlation in the errors.

If (and only if) we can reject beta = 0 at some sufficiently small test size pmin, we may proceed with prediction.

See Mann07,

Under the assumption of moderate or low signal-to-noise ratios (e.g., lower than about SNR 0.5 or “80% noise”), which holds for the MBH98 proxy network as noted earlier, the value of ρ for the “noise” closely approximates that for the “proxy” (which represents a combination of signal and noise components).

Errors are as autocorrelated as the proxies; sounds like beta = 0 to me :)

Mann07: Robustness of proxy-based climate field reconstruction methods, Journal of Geophysical Research, Vol. 112
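For a single proxy, the pre-test quoted above takes only a few lines: fit OLS, compute the t-statistic for beta = 0, and check the residuals for first-order serial correlation with the Durbin–Watson statistic. A sketch on simulated data (all numbers hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
x = np.linspace(0.0, 1.0, n)                       # calibration temperatures
y = 0.2 + 1.5 * x + 0.3 * rng.standard_normal(n)   # one proxy, beta = 1.5

# OLS slope and its t-statistic for H0: beta = 0.
xc = x - x.mean()
Sxx = xc @ xc
b = xc @ y / Sxx
a = y.mean() - b * x.mean()
resid = y - a - b * x
s2 = resid @ resid / (n - 2)
t_beta = b / np.sqrt(s2 / Sxx)

# Durbin-Watson statistic: near 2 means no first-order serial correlation.
dw = np.sum(np.diff(resid)**2) / np.sum(resid**2)
print(round(t_beta, 1), round(dw, 2))
```

Only if t_beta is large (beta = 0 rejected) and dw is near 2 (uncorrelated errors) should one proceed to prediction.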

]]>Simplifying Brown to the case p = 1, assume

The p = 1 case is seen in reconstructions where a global average is used as the target. I tried Brown’s formula for Juckes’ reconstructions; see this post,

http://signals.auditblogs.com/2007/07/09/multivariate-calibration-ii/ ; those floor-to-ceiling peaks are the empty-CI cases. You can download the data from here,

http://www.cosis.net/members/journals/df/article.php?a_id=4661 if you want to try your method. It seems that conventional statistical methods give wider CIs than those you see in MBH9x, Juckes, etc.

Note also that p=1 does not hold in MBH9x, except for AD1000 and AD1400 steps.

]]>First, if one or more proxies are missing for a particular reconstruction date, there is no need to recalibrate the whole system (as there would be with direct multiple regression of temperature on the proxies). All you need to do is omit the relevant elements of a and b (which, I neglected to mention, are q×1 vectors to start with, as are alpha and beta), and remove the corresponding rows and columns of G before inverting it. This will gradually widen the CIs as proxies drop out, but there is no need for a sudden change in the system every couple of centuries (as I gather MBH did).
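A sketch of this drop-and-reinvert step (parameter values hypothetical; the observation is noiseless so the point estimate is exact with any subset of proxies):

```python
import numpy as np

# Hypothetical calibrated system with q = 4 proxies.
a = np.array([0.0, 0.1, -0.1, 0.2])
b = np.array([0.9, 0.6, 1.2, 0.4])
G = np.diag([0.5, 0.4, 0.6, 0.3])       # q x q; any PD covariance works

def estimate_x(Y, available):
    """Point estimate of x' using only the proxies flagged available.

    Instead of recalibrating, drop the corresponding elements of a and b
    and the corresponding rows/columns of G before inverting.  The CI
    width scales like 1/sqrt(b' G^-1 b), so it can only widen as
    proxies drop out.
    """
    idx = np.flatnonzero(available)
    Ginv = np.linalg.inv(G[np.ix_(idx, idx)])
    bs, as_, Ys = b[idx], a[idx], Y[idx]
    return bs @ Ginv @ (Ys - as_) / (bs @ Ginv @ bs)

Y = a + b * 1.0                          # noiseless observation at x' = 1
print(estimate_x(Y, [True, True, True, True]))    # all proxies
print(estimate_x(Y, [True, False, True, False]))  # two proxies missing
```
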

And second, if you smooth the reconstruction by averaging over m periods, the reconstruction using the smoothed proxies is the same as the smoothed reconstruction using the unsmoothed proxies. However, the CI of the smoothed reconstruction is considerably narrower than what you would get by just smoothing the CI bounds of the unsmoothed reconstruction. The coefficient uncertainty, measured by the 1/n and x’^2/xTx terms in sigma^2(x’), is the same in both cases, but the leading term 1, which represents the disturbance uncertainty, gets replaced by 1/m, assuming, as in Brown’s benchmark case, that the disturbances are serially uncorrelated.

(Note that the X’s and Y’s can both be highly serially correlated without the disturbances being serially correlated. I don’t know about annual treerings, but this is the case with a few decadal ice core series I have looked at.)
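The bookkeeping above can be checked in a couple of lines (calibration quantities hypothetical; assumes, as stated, serially uncorrelated disturbances):

```python
import numpy as np

s2, n, Sxx = 0.09, 100, 8.5        # hypothetical calibration quantities
x_prime, m = 0.8, 10                # reconstruct x', then average m periods

# Unsmoothed prediction variance: disturbance term 1 plus coefficient terms.
var_unsmoothed = s2 * (1.0 + 1.0 / n + x_prime**2 / Sxx)

# Averaging m serially uncorrelated periods shrinks only the disturbance
# term (1 -> 1/m); the coefficient-uncertainty terms 1/n and x'^2/xTx are
# common to all m periods and do not average out.
var_smoothed = s2 * (1.0 / m + 1.0 / n + x_prime**2 / Sxx)

print(round(np.sqrt(var_unsmoothed), 4), round(np.sqrt(var_smoothed), 4))
```
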

]]>See also

http://www.climateaudit.org/?p=2563#comment-191063

and

http://wmbriggs.com/blog/2008/03/09/you-cannot-measure-a-mean/

]]>The issues with multivariate-calibration-based reconstruction uncertainties in MBH98, MBH99, and many more reconstructions are independent of the PC stuff and other proxy problems discussed at CA. That is, even if you spend many blog threads explaining that treeline11.dat and pc01.out are OK to use, you still need some 10 more blog threads (*) to explain why MBH9x does not use conventional multivariate calibration methods, well described in the statistical literature.

Note how the latest IPCC report deals with this issue (Ch. 6, page 472):

The considerable uncertainty associated with individual reconstructions (2-standard-error range at the multi-decadal time scale is of the order of ±0.5°C) is shown in several publications, calculated on the basis of analyses of regression residuals (Mann et al., 1998; Briffa et al., 2001; Jones et al., 2001; Gerber et al., 2003; Mann and Jones, 2003; Rutherford et al., 2005; D’Arrigo et al., 2006). These are often calculated from the error apparent in the calibration of the proxies. Hence, they are likely to be minimum uncertainties, as they do not take into account other sources of error not apparent in the calibration period, such as any reduction in the statistical robustness of the proxy series in earlier times

(Briffa and Osborn, 1999; Esper et al., 2002; Bradley et al., 2003b; Osborn and Briffa, 2006).

Hu, your x’^2/xTx term is not mentioned; I wonder why…
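The omitted term is easy to quantify: a residual-based ±2 SE band is flat, while the full prediction half-width grows through the x’^2/xTx term as the reconstruction moves away from the calibration mean. A sketch with hypothetical numbers:

```python
import numpy as np

s2, n, Sxx = 0.09, 100, 8.5        # hypothetical calibration-period quantities
two_se_resid = 2 * np.sqrt(s2)     # residual-based band: constant everywhere

# Full prediction-interval half-width at increasing distance from the
# calibration mean; the x'^2 / xTx term drives the growth.
two_se_full = {xp: 2 * np.sqrt(s2 * (1 + 1.0 / n + xp**2 / Sxx))
               for xp in (0.0, 1.0, 3.0)}
print(round(two_se_resid, 3), {k: round(v, 3) for k, v in two_se_full.items()})
```

The residual-based band really is a minimum uncertainty: it is the x’ = x̄ floor of the full half-width.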

(*) Hi Tamino ;)
