It does not seem to me appropriate to seek to reconstruct a GMST estimate from land station data only, as it is well known that, for good physical reasons, land areas warm substantially faster than the ocean.

In both CMIP5 GCMs and observations, a typical land-ocean warming ratio is in the 1.4x-2x range. The main reason is the more limited availability of moisture at the land surface. The claim in a number of published papers that the (primary) reason is that the ocean has a greater heat capacity than land is wrong; that has a fairly minor effect.
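As a back-of-envelope illustration (a sketch with made-up numbers, not from any specific dataset), here is what a given land/ocean warming ratio implies for how much a land-only index would overstate GMST, assuming a land fraction of about 0.29:

```python
# Illustrative sketch (hypothetical numbers): what a land/ocean warming
# ratio implies for a land-only index, with a land fraction of ~0.29.

def gmst_trend(land_trend, ratio, land_frac=0.29):
    """Area-weighted global trend implied by a land trend and a land/ocean ratio."""
    ocean_trend = land_trend / ratio
    return land_frac * land_trend + (1.0 - land_frac) * ocean_trend

land = 0.30  # C/decade, an illustrative land-only trend
for r in (1.4, 2.0):
    g = gmst_trend(land, r)
    print(f"ratio {r}x: GMST {g:.3f} C/dec, land-only overstates by {land / g:.2f}x")
```

With a ratio anywhere in the 1.4x-2x range, a land-only trend overstates the implied global trend by a substantial factor.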

This post is interesting, but I believe that the issue with the 250 km, 1200 km (or other) influence radius is even more important in the met-station-only index, Gistemp dTs.

Almost all attention is on the blended surface indices nowadays, but I have reason to believe that Gistemp dTs is much closer to the true global 2 m SAT than Gistemp loti or the other blended indices.

Since you have the Gistemp code up and running, you could tweak Gistemp dTs to a 2000 km influence radius, or more, to achieve complete global coverage.

Right now dTs has a trend of 0.218 C/decade for 1970-2016, whereas the CMIP5 multi-model mean is 0.211 C/decade. Gistemp loti is only 0.182 C/decade.

It looks like the Gistemp dTs trend will converge to about 0.215-0.216 C/decade when the globe is completely infilled.
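For reference, trend figures like these are ordinary least-squares slopes over annual anomalies, expressed per decade; a minimal sketch (with synthetic data standing in for dTs) is:

```python
# Minimal sketch of the trend calculation: an OLS slope over annual
# anomalies, scaled to C/decade. The series below is synthetic,
# constructed to have a trend near the quoted 0.218 C/decade.
import numpy as np

def trend_per_decade(years, anomalies):
    """OLS slope of anomalies on years, scaled to degrees per decade."""
    return 10.0 * np.polyfit(years, anomalies, 1)[0]

years = np.arange(1970, 2017)                 # 1970-2016 inclusive
rng = np.random.default_rng(0)
anoms = 0.0218 * (years - 1970) + rng.normal(0.0, 0.1, years.size)
print(f"trend: {trend_per_decade(years, anoms):.3f} C/decade")
```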

The ultimate test of the global SAT representation by Gistemp dTs would be to mask model data in space and time to mimic that of dTs, then run this data with the dTs code, and see how much it differs from the complete global model dataset.
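A toy version of that masking test, with a synthetic field and random pseudo-coverage standing in for model output and station availability, might look like:

```python
# Sketch of the masking idea: take a "model truth" field on a lat-lon
# grid, blank out cells without coverage, and compare the area-weighted
# mean of the masked field with the full-field mean. Grid, mask, and
# field are all synthetic stand-ins, not real model or station data.
import numpy as np

def area_weighted_mean(field, lats, mask=None):
    """Cos(lat)-weighted mean of a (lat, lon) field over unmasked cells."""
    w = np.cos(np.radians(lats))[:, None] * np.ones_like(field)
    if mask is not None:
        w = w * mask
    return (field * w).sum() / w.sum()

lats = np.linspace(-87.5, 87.5, 36)                 # 5-degree grid, cell centres
rng = np.random.default_rng(1)
field = 0.5 + 0.3 * rng.standard_normal((36, 72))   # stand-in "model truth"
mask = (rng.random((36, 72)) < 0.7).astype(float)   # 70% pseudo-coverage

full = area_weighted_mean(field, lats)
masked = area_weighted_mean(field, lats, mask)
print(f"full {full:+.3f}, masked {masked:+.3f}, coverage bias {masked - full:+.3f}")
```

The real test would repeat this every month with the actual dTs coverage mask and run the result through the dTs code, as described above.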

I am not the man to do such a complex analysis, so I have done it in a simpler way: I have made a simple dTs-emulating global index by subsampling 100 evenly distributed dTs gridcells (more or less the RATPAC method), and also made an exact model equivalent of it:
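A sketch of the subsampling idea (using a Fibonacci lattice to get near-evenly distributed, equal-area points; the actual gridcell selection here may well have differed):

```python
# Hypothetical sketch: ~100 near-evenly distributed points on the sphere
# via a Fibonacci lattice. Because the points are equal-area, a plain
# mean over them estimates the true area-weighted global mean.
import numpy as np

def fibonacci_sphere(n):
    """Latitudes/longitudes (degrees) of n near-evenly spaced, equal-area points."""
    i = np.arange(n)
    lat = np.degrees(np.arcsin(2.0 * (i + 0.5) / n - 1.0))  # uniform in sin(lat)
    lon = (i * 137.50776405003785) % 360.0                   # golden-angle steps
    return lat, lon

lat, lon = fibonacci_sphere(100)
# Toy field whose true global mean is 0.5 (sin(lat) integrates to zero
# over the sphere), so the subsample should recover 0.5:
field = 0.5 + 0.3 * np.sin(np.radians(lat))
print(f"100-point estimate: {field.mean():.3f} (truth 0.500)")
```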

https://drive.google.com/open?id=0B_dL1shkWewaSUx2S21SWXgwLTQ

The conclusion is that dTs exaggerates the global SAT trend by less than 0.01 C/decade (best guess 0.006), making it a much nearer estimate than Gistemp loti.

You have given no numbers relevant to the integration of temperatures on Earth. I have, and I have been putting them into practice on my blog, calculating the average from raw data every month for over six years now. I post this each month in advance of the others, and it agrees very well with what they get by other means. And I have extensively analysed and tested the errors.

In terms of your “examples”, yes, one observation of a sinusoid in a period will give an unreliable mean. One hundred, randomly placed, on the other hand, will do very well. And I have dealt with your spherical harmonics example above, with link.
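That claim is easy to check numerically; a quick Monte Carlo sketch:

```python
# One random sample of a sinusoid over a period is a poor estimate of its
# mean (zero); the average of 100 random samples is far better, with the
# error shrinking roughly like 1/sqrt(N). Numbers here are illustrative.
import numpy as np

rng = np.random.default_rng(42)
trials = 2000
x1 = rng.uniform(0.0, 2.0 * np.pi, size=(trials, 1))      # one sample per trial
x100 = rng.uniform(0.0, 2.0 * np.pi, size=(trials, 100))  # a hundred per trial

err1 = np.abs(np.sin(x1).mean(axis=1))    # |estimate - true mean of 0|
err100 = np.abs(np.sin(x100).mean(axis=1))

print(f"median |error|: N=1 {np.median(err1):.3f}, N=100 {np.median(err100):.3f}")
```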

Why generate climate data? There is plenty of synthetic data from models, reanalyses, or real data with global coverage that one can play with.

Like this example: throw away 99.83 percent of the spatial information, and see if you can reconstruct the whole.
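For scale: on a 2-degree grid (10368 cells), keeping 18 cells discards about 99.83% of them. A toy sketch of why a fixed spatial pattern, however large, cannot hide a uniform trend from 18 fixed sample points (all numbers made up):

```python
# 18 fixed cells out of a 72 x 144 grid. The field is a uniform 0.02 C/yr
# trend plus a large but *stationary* spatial pattern; since the pattern
# does not change in time, it cannot bias the trend seen at fixed points.
import numpy as np

n_lat, n_lon = 72, 144                        # ~2-degree grid: 10368 cells
print(f"kept fraction: {18 / (n_lat * n_lon):.4%}")   # about 0.17%

lats = np.linspace(-88.75, 88.75, n_lat)
lons = np.linspace(1.25, 358.75, n_lon)
pattern = np.sin(np.radians(lats))[:, None] * np.cos(np.radians(lons))[None, :]

rng = np.random.default_rng(7)
ii = rng.integers(0, n_lat, 18)               # 18 fixed sample cells
jj = rng.integers(0, n_lon, 18)

years = np.arange(47)
series = np.array([(0.02 * t + pattern)[ii, jj].mean() for t in years])
slope = np.polyfit(years, series, 1)[0]
print(f"18-point trend: {slope:.4f} C/yr (truth 0.0200)")
```

Time-varying weather noise would add scatter, of course, but over the long run it averages down while the trend does not.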

Do you still have doubts about sparse sampling? Do you believe that global warming in the long run can hide between those 18 dots?

Try your interpolation scheme or other gimmick on my counterexample and tell me what the mean is. Any interpolation method also relies on the smoothness of the function being interpolated, and that smoothness is used by standard numerical analysis to determine the error. What is the numerical error in your interpolation method? The only person doing hand-waving is you.
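The standard numerical-analysis result being alluded to: for piecewise-linear interpolation the error is bounded by (h^2/8) * max|f''|, so an error estimate requires a bound on the smoothness. A quick numerical check of that bound:

```python
# Piecewise-linear interpolation error vs the standard (h^2/8)*max|f''|
# bound, for a smooth test function. Illustrative, not a climate field.
import numpy as np

f = np.sin                       # smooth test function with max|f''| = 1
x = np.linspace(0.0, np.pi, 21)  # 21 equally spaced nodes, spacing h
h = x[1] - x[0]

xf = np.linspace(0.0, np.pi, 10001)                   # fine evaluation grid
err = np.max(np.abs(np.interp(xf, x, f(x)) - f(xf)))  # actual interp error
bound = h * h / 8.0                                   # (h^2/8) * max|f''|

print(f"max error {err:.2e} <= theoretical bound {bound:.2e}")
```

Without a smoothness bound (as with a sampled field of unknown roughness), the classical error estimate is unavailable, which is the point at issue.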

Jerry

In fact we did compare the accuracy of the spectral method when the typical dissipation used in climate models is added. (I had to insist that the accuracy comparison of the three methods be added for this case.) And as you can see in the included BHS manuscript, the accuracy of the spectral method was reduced by two orders of magnitude.

https://drive.google.com/open?id=0B-WyFx7Wk5zLR0RHSG5velgtVkk

So adding unrealistically large dissipation substantially degrades the accuracy of the numerical method, i.e., the dissipated system is a poorer approximation of the differential equations (and a better approximation of the heat equation 🙂)
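The effect can be seen analytically in a toy case (a sketch with made-up numbers, not the BHS comparison itself): for u_t + c u_x = nu u_xx with u0 = sin(kx), the exact solution is exp(-nu k^2 t) sin(k(x - ct)), so the departure from the inviscid equation grows with nu:

```python
# For advection plus added dissipation nu, the exact solution's amplitude
# decays as exp(-nu k^2 t); relative to the inviscid (nu = 0) equation the
# amplitude error after time t is 1 - exp(-nu k^2 t). Larger nu means a
# larger departure from the original PDE. Illustrative numbers only.
import numpy as np

def amplitude_error(nu, k, t):
    """Relative amplitude error, 1 - exp(-nu k^2 t), caused by dissipation nu."""
    return 1.0 - np.exp(-nu * k * k * t)

k, t = 8.0, 1.0   # illustrative wavenumber and integration time
for nu in (1e-4, 1e-3, 1e-2):
    print(f"nu = {nu:g}: amplitude error {amplitude_error(nu, k, t):.4f}")
```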

Jerry

That is a more reader-friendly topic sentence. You don’t have to overload the first sentence with every caveat and qualifier. After all, several sentences will follow.

Also, it is a bad idea to have such a long, meandering phrase as a subject. The reader has to read far too long to see where the actual verb is in relation to the subject (actually has to read forever to see that the whole phrase is acting as an 18-word noun), and has to parse 18 words and scratch their skull to see that there is no action taking place within that monstrosity.

Again, the topic is interesting. I would love to learn why the different temperature records differ. But the topic is poorly explained. Having to deal with a 31-word monstrosity of a first sentence to open an essay will just turn many people off. And it doesn’t have to be that way.

I misplaced my response on the sub-thread above.

Thanks for releasing. The sameness is because I was exploring for the limit on links – I thought three might work where four didn’t.

It comes back to my initial statement – you don’t have a derivative. I’m pretty sure that you and Jerry are thinking about integrating things determined by PDEs. Those tell you about derivatives – a pressure gradient implies acceleration, etc. But here we don’t have that. We just have a sampled field variable.

Numerical integration here basically builds an interpolation function from the samples and integrates it. So the question is how good the interpolation is. The JB formulae would say: fit a polynomial based on the derivatives, and the error is attributable to the neglected higher-order terms. Here we can’t do that. The basis for saying interpolation is possible is correlation. That is why Nic’s post talks about 250 km, 1200 km, etc. Correlation doesn’t have to work at all times (e.g. fronts) – it just has to work well enough on average for integration.
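A concrete instance of "integrate the interpolant": the trapezoid rule is exactly the integral of the piecewise-linear interpolant through the samples, so its accuracy is the interpolation's accuracy. A sketch with an artificial sampled field:

```python
# The trapezoid rule as "integrate the linear interpolant", applied to
# irregularly spaced samples of sin on [0, pi] (true integral = 2).
# The samples are synthetic stand-ins for a sampled field variable.
import numpy as np

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0.0, np.pi, 40))
x = np.concatenate(([0.0], x, [np.pi]))   # pin the endpoints
y = np.sin(x)                             # sampled field; true integral is 2

# Integral of the piecewise-linear interpolant, segment by segment
# (this is exactly the trapezoid rule):
seg = 0.5 * (y[1:] + y[:-1]) * np.diff(x)
print(f"interpolant integral {seg.sum():.4f} (truth 2.0000)")
```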

Jerry refers to his spherical harmonics example. I think that is instructive – I use SH integration extensively, and find it gives results similar to (and as good as) a triangle mesh. I posted here

moyhu.blogspot.com.au/2017/03/residuals-of-monthly-global-temperatures.html

a study of residuals, with a WebGL globe picture, especially of the SH fit (the following posts also cover integration accuracy). The number of SHs is limited, because the scalar product is not an exact integral (unobtainable) but a product over the observation points, and so orthogonality fails. When the high-order SHs start to have wavelengths on the order of the gaps in the data, the failure is total (ill-conditioning) and you have to stop. When you do, the residuals are still large, and if you use a random model for them, you will deduce a large error.

But the residuals I show are clearly not random. They actually look very like a combination of the higher-order SHs that couldn’t be used. And the point is that, though their amplitude is large, those SHs have exactly zero integrals. The large residuals have been pushed by the fitting process into a space that makes almost no contribution to the integral.
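The zero-integral point has a simple 1D Fourier analogue (sin/cos standing in for SHs; all numbers here are made up): fit low-order modes to scattered samples of a field containing an unresolved high mode. The residuals are large, but the unresolved mode integrates to exactly zero over the period, so the fitted mean is still accurate.

```python
# Least-squares fit of constant + sin/cos up to wavenumber 5 to scattered
# samples of a field that also contains an unresolvable k=25 mode.
import numpy as np

rng = np.random.default_rng(5)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 400))    # scattered sample points
truth_mean = 0.4
field = truth_mean + np.sin(3 * x) + 0.8 * np.sin(25 * x)  # k=25 unresolved

cols = [np.ones_like(x)]
for k in range(1, 6):
    cols += [np.sin(k * x), np.cos(k * x)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, field, rcond=None)

resid = field - A @ coef
# Residuals are large, but sin(25x) integrates to exactly zero over the
# period, so the fitted mean (coef[0]) is barely affected.
print(f"fitted mean {coef[0]:.3f} (truth {truth_mean}); "
      f"max |residual| {np.abs(resid).max():.2f}")
```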
