I used the temperature anomaly approach, i.e. comparing to the 1951-1980 mean. The graph is here. Clearly 3 stations are not enough, but are 12?

Randomly picking temperature stations, shouldn’t you get a one-sigma error of about

20/sqrt(3) = 11.547 degrees?

However, if we frame the problem as an estimation problem for T_global:

T1 - E[T1 - T_global] = T_global

…

Tn - E[Tn - T_global] = T_global

I would suggest computing E[Tn - T_global] from satellite data. The term should be computed assuming the process Tn - T_global is ergodic, so that its time average converges to the expectation.

http://www.ualberta.ca/~cdeutsch/images/Lec03-StatModels.pdf#search=%22ergodic%20statistics%22

That is, lim as N goes to oo of (1/N) sum{ Tn(i) - T_global(i), i = 1…N } = E[Tn - T_global].

We could even use satellite estimates to get an initial estimate of the noise covariance.
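A minimal sketch of both steps above, using synthetic stand-ins for the satellite data (the function name, array shapes, and demo numbers are all hypothetical): the per-station bias E[T_k - T_global] is taken as a time average, per the ergodic assumption, and the noise covariance is the sample covariance of the de-biased residuals.

```python
import numpy as np

def bias_and_covariance(station_series, satellite_global):
    """Estimate per-station bias E[T_k - T_global] as a time average
    (the ergodic assumption: time average -> ensemble mean), plus the
    sample covariance of the de-biased residuals.

    station_series : (n_stations, n_times) station temperatures
    satellite_global : (n_times,) satellite-based global mean series
    """
    residuals = station_series - satellite_global  # T_k(i) - T_global(i)
    bias = residuals.mean(axis=1)                  # time-average bias per station
    y = residuals - bias[:, None]                  # de-biased noise, one row per station
    noise_cov = y @ y.T / (y.shape[1] - 1)         # sample estimate of E[Y Y^T]
    return bias, noise_cov

# Hypothetical demo with synthetic data (3 stations, 500 time steps)
rng = np.random.default_rng(0)
true_global = rng.normal(0.0, 0.5, 500)
true_bias = np.array([1.0, -0.5, 2.0])
stations = true_global + true_bias[:, None] + rng.normal(0.0, 0.3, (3, 500))
bias, cov = bias_and_covariance(stations, true_global)
```

With enough satellite-era samples, `bias` recovers the per-station offsets and `cov` gives the initial noise covariance estimate.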

Let

Y = [ T1 - E[T1 - T_global] - T_global, …, Tn - E[Tn - T_global] - T_global ]^T

Then the noise covariance is

E[Y Y^T]

Again we assume the noise is ergodic, so this initial estimate of the noise covariance can be used in a weighted least-squares estimate of the global mean temperature.
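A minimal sketch of that weighted least-squares step, assuming the bias and noise covariance estimates are already in hand (function name and demo numbers are hypothetical): each de-biased reading z_k = T_k - E[T_k - T_global] is an unbiased estimate of T_global, and the covariance-weighted combination is (1' C^-1 z) / (1' C^-1 1).

```python
import numpy as np

def gls_global_mean(station_temps, bias, noise_cov):
    """Weighted least-squares estimate of T_global from n de-biased
    station readings, weighting by the inverse noise covariance.

    station_temps : (n,) station temperatures at one time step
    bias : (n,) estimates of E[T_k - T_global]
    noise_cov : (n, n) noise covariance E[Y Y^T]
    """
    z = station_temps - bias              # each z_k estimates T_global
    ones = np.ones_like(z)
    w = np.linalg.solve(noise_cov, ones)  # C^{-1} 1
    return (w @ z) / (w @ ones)           # (1' C^-1 z) / (1' C^-1 1)

# Hypothetical demo: independent noise, so weights are ~ 1/variance
temps = np.array([11.0, 9.5, 10.5])
bias = np.array([1.0, -0.5, 0.5])
cov = np.diag([0.04, 0.25, 1.0])
print(gls_global_mean(temps, bias, cov))  # → 10.0
```

With a diagonal covariance this reduces to the familiar inverse-variance weighting; a full covariance additionally discounts stations that share correlated noise.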

Thanks! We need more; I tried a 3-station average (your links + Calgary, no cherry-picking 🙂 ). The reconstruction error 2-sigma for 1901-2002 is about 1.1 C. Need to find how many stations are needed until we get down to 0.5 C.
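A rough back-of-envelope for that station count, assuming independent stations with a common per-station sigma (a strong assumption — real stations are spatially correlated, so this is only a lower bound): the 2-sigma of an n-station mean is 2*sigma/sqrt(n), and the per-station sigma is backed out from the 3-station result above.

```python
import math

def stations_needed(sigma_station, target_two_sigma):
    """Smallest n with 2*sigma/sqrt(n) <= target, for independent
    stations sharing one per-station standard deviation."""
    return math.ceil((2.0 * sigma_station / target_two_sigma) ** 2)

# From the 3-station trial: 2-sigma ~ 1.1 C, so per-station
# sigma ~ 1.1 * sqrt(3) / 2 under the independence assumption
sigma = 1.1 * math.sqrt(3) / 2.0
print(stations_needed(sigma, 0.5))  # → 15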

#25

Some adjustments could be made because some proxies are more accurate than others.

But thermometers are equally accurate, so only area-weighting is needed.

“If you could replace the 12 proxies by 12 thermometers, would the results be more accurate?”

#20

The obvious answer is of course yes… well, at least if you use proper weighting.

1) Define proper weighting

2) Remember that the MBH99 2-sigma is 0.5 C. You have to do better than that.

A long instrumental record would be useful for testing, but I couldn’t find the data (old CA topic here)

We are trying to construct the northern hemisphere mean, which is an integral of the temperature over the northern hemisphere divided by the surface area of the northern hemisphere. Had one first fit the temperature proxies to the grid points and then averaged them, one would have used a crude Riemann sum to estimate the northern hemisphere temperature.

Numerical integration works by dividing the integral into regions small enough that the integral over each region can be approximated by the integral of a function with an analytic solution. Thus the proper approach to the problem would be to identify an interpolation function between the proxies and the temperature over the space we wish to integrate. If we wish to be crude, we could divide the globe up into triangular planes with vertices at the locations of the proxies. If we wish to be advanced, we could consider the topology of the earth in our interpolating function and sum up over much smaller regions than allowed by our proxies. If we wish to be even more advanced, we could consider the uncertainty in our interpolation and perhaps use this to adjust the weighting in our average of surface temperatures.
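The crude triangular-plane version can be sketched in a few lines, using the fact that the integral of a linear interpolant over a triangle equals the triangle’s area times the mean of its three vertex values. Everything here is a toy (planar coordinates, hand-picked triangles); a real version would triangulate the sphere.

```python
def triangle_area(p0, p1, p2):
    """Area of a planar triangle (shoelace formula)."""
    return 0.5 * abs((p1[0] - p0[0]) * (p2[1] - p0[1])
                     - (p1[1] - p0[1]) * (p2[0] - p0[0]))

def mean_over_triangulation(points, values, triangles):
    """Area-weighted mean of a piecewise-linear interpolant.

    points : list of (x, y) proxy/station locations
    values : temperatures at those locations
    triangles : (i, j, k) vertex-index triples covering the region
    """
    total_area = 0.0
    total_integral = 0.0
    for i, j, k in triangles:
        a = triangle_area(points[i], points[j], points[k])
        # integral of the linear interpolant over one triangle
        # = area * mean of the three vertex values
        total_integral += a * (values[i] + values[j] + values[k]) / 3.0
        total_area += a
    return total_integral / total_area

# Hypothetical demo: unit square split into two triangles
pts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
vals = [10.0, 12.0, 14.0, 12.0]
tris = [(0, 1, 2), (0, 2, 3)]
print(mean_over_triangulation(pts, vals, tris))  # → 12.0
```

Note how the result differs from a plain station average only when the triangulation covers unequal areas — which is exactly the area-weighting argument made above.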

Anyway, getting back to proxies: it is unlikely that proxies in the southern hemisphere provide much useful information for determining northern hemisphere temperatures. Thus for numerical reasons it would be unlikely that we would use them when constructing an interpolation function for a region of the northern hemisphere. The only reason I see to include them would be to remove an unwanted common-mode signal.

Actually though, thinking of teleconnections: if it is a principle that works, then it should be much more effective to use it first to find a fit to grid-cell temperatures, and then use those grid-cell fits to find a fit for global temperatures.

Global temperature is the area-weighted average of grid cell temperatures, right? So, *without* teleconnections, it would make more sense to fit each proxy to its nearest grid cell, one by one. That would be quite easy using the equation in #15.
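For completeness, "average of grid cell temperatures" on a regular latitude-longitude grid means a cos(latitude)-weighted average, since a cell’s area shrinks toward the poles. A minimal sketch (function name and demo values are hypothetical):

```python
import numpy as np

def grid_mean(cell_temps, cell_lats_deg):
    """Area-weighted mean of grid-cell temperatures on a regular
    lat-lon grid: cell area is proportional to cos(central latitude)."""
    w = np.cos(np.radians(cell_lats_deg))
    return np.sum(w * cell_temps) / np.sum(w)

# Hypothetical demo: two cells, equator vs 60N (weights 1.0 vs 0.5)
temps = np.array([30.0, 0.0])
lats = np.array([0.0, 60.0])
print(grid_mean(temps, lats))  # → 20.0
```

An unweighted mean of the same two cells would give 15.0, so the weighting matters whenever coverage is uneven in latitude.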

I’m trying to compress my MBH99 confusion into one single question; at the moment it would be something like:

‘If you could replace the 12 proxies by 12 thermometers, would the results be more accurate?’