We’ve had a couple of discussions about handling end-points in smoothing. Here’s a little note about smoothing algorithms, which I think is pretty funny. It’s hard to imagine a note about smoothing algorithms being funny, but see what you think.
We talked last summer about the pinned end-points in Emanuel 2005. Landsea 2005 had objected to this and Emanuel accepted the criticism. As it happened, Emanuel had something of an out: he used a very short smooth (1,4,6,4,1), and the combined 2004-2005 seasons still left the short smoothing at very high levels.
On to 2007. Recently I discussed Mann’s “explanation” of the Divergence Problem: that it was an artifact of IPCC smoothing and that, had they applied the “correct” method (set out, by coincidence, in Mann’s own article), the Divergence Problem would not exist. Recapping, the poster had asked:
What we are interested in in this thread, however, are the error bars on the temperature reconstructions from proxies. What is striking from the IPCC chart is that the “instrumental record” starts diverging seriously upwards from the “proxies” around 1950, and is in the “10%” overlap range by about 1980. The simple read on this, surely, is that the proxies are not reflecting current temperatures (calibration period) and so cannot be relied upon as telling what past temperatures were either?
Here’s Mann’s answer again:
[Response: Actually, you have mis-interpreted the information provided because you have not considered the implications of the smoothing constraints that have been applied at the boundaries of the time series. I believe that the authors of the chapter used a smoothing constraint that forces the curves to approach the boundary with zero slope (the so-called 'minimum slope' constraint). At least, this is what it is explicitly stated was done for the smoothing of all time series in the instrumental observations chapter (chapter 3) of the report. Quoting page 336 therein, “This chapter uses the 'minimum slope' constraint at the beginning and end of all time series, which effectively reflects the time series about the boundary. If there is a trend, it will be conservative in the sense that this method will underestimate the anomalies at the end.” So the problem is that you are comparing two series, one which has an overly conservative boundary constraint applied at 1980 (where the proxy series terminate), tending to suppress the trend as the series approaches 1980, and another which has this same constraint applied far later (at 2005, where the instrumental series terminates). In the latter case, the boundary constraint is applied far enough after 1980 that it does not artificially suppress the trend near 1980. A better approach would have been to impose the constraint which minimizes the misfit of the smooth with respect to the raw series, which most likely would in this case have involved minimizing the 2nd derivative of the smooth as it approaches the terminal boundary, i.e. the so-called 'minimum roughness' constraint (see the discussion in this article). However, the IPCC chose to play things conservatively here, with the risk of course that the results would be mis-interpreted by some, as you have above. -mike]
[Response: Well, no, actually the proper read on this is that you should make sure to understand what boundary constraints have been used any time you are comparing two smoothed series near their terminal boundaries, especially when the terminal boundaries are not the same for the two different series being compared. -mike]
[Response: p.s. just a point of clarification: Do the above represent your views, or the views of Shell Oil in Houston Texas (the IP address from which your comment was submitted)? -mike]
As I noted in the earlier post, Mann’s “minimum roughness” constraint, when translated from inflated Mannian language, boils down to a reflection of the series both horizontally and vertically around the final value. Mann:
Finally, to approximate the 'minimum roughness' constraint, one pads the series with the values within one filter width of the boundary reflected about the time boundary, and reflected vertically (i.e., about the 'y' axis) relative to the final value.
This implementation can be observed in some code here.
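As a rough sketch of that padding recipe (my own illustration, not the code referred to above; the helper name and the half-width parameter m are my assumptions):

```python
import numpy as np

def min_roughness_pad(x, m):
    """Pad the end of a series per Mann's description: take the last m values,
    reflect them about the time boundary (horizontally) and about the final
    value (vertically), and append the result."""
    x = np.asarray(x, dtype=float)
    # x[-2 : -m-2 : -1] walks backwards from the second-to-last value,
    # giving the last m pre-boundary values in reversed (reflected) time order
    tail = 2.0 * x[-1] - x[-2 : -m - 2 : -1]
    return np.concatenate([x, tail])

# e.g. padding [1, 2, 3, 4, 5] with m = 2 appends [6, 7]:
# the reflected values 4 and 3 flipped vertically around the closing value 5
print(min_roughness_pad([1, 2, 3, 4, 5], 2))  # [1. 2. 3. 4. 5. 6. 7.]
```

A symmetric filter can then be run over the padded series right up to the original end-point.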
When I wrote a little routine to implement Mannomatic smoothing, I noticed something really funny. I know that it seems bizarre that there can be humor in smoothing algorithms, but hey, this is the Team. Think about what happens with the Mannomatic smooth: you reflect the series around the final value both horizontally and vertically. Accordingly, with a symmetric filter (as these things tend to be), each padded value 2x(n) - x(n-j) pairs off against x(n-j), and everything cancels out except the final value. The Mannomatic pins the series on the end-point exactly the same as Emanuel’s “incorrect” smoothing.
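The cancellation can be written out explicitly. For a symmetric filter with weights w_{-m}, …, w_m summing to one, and end-padding x_{n+j} = 2x_n - x_{n-j}, the smoothed value at the final point n is:

```latex
\hat{x}_n = w_0 x_n + \sum_{j=1}^{m} w_j \bigl[\, x_{n-j} + (2x_n - x_{n-j}) \,\bigr]
          = x_n \Bigl( w_0 + 2\sum_{j=1}^{m} w_j \Bigr) = x_n .
```

The x_{n-j} terms drop out in each bracket, so the smooth at the boundary is the closing value itself, no matter what the (symmetric, normalized) filter weights are.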
Just for fun, I applied the Mannomatic smooth with the short (1,4,6,4,1) filter to a hurricane series (category 3 days) and obtained the following result, where I’ve emphasized the closing value to show that the Mannomatic smooth pins at exactly the closing value. It’s crazy, but it’s so. IPCC Chapter 3 Appendix 1 discusses smoothing algorithms drawing on Mann’s analysis, but doesn’t mention this.
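Readers who want to check the pinning themselves can do so in a few lines. This is a minimal self-contained sketch using the same (1,4,6,4,1) filter; the series below is an illustrative toy, not the actual category 3 hurricane data, and the function name is my own:

```python
import numpy as np

def mannomatic_smooth_end(x, w):
    """Smooth the final point of x with a symmetric filter w (length 2m+1),
    after 'minimum roughness' end-padding: reflect the last m values about
    both the time boundary and the closing value."""
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    w = w / w.sum()          # normalize the filter weights to sum to one
    m = len(w) // 2
    tail = 2.0 * x[-1] - x[-2 : -m - 2 : -1]   # the reflected padding
    padded = np.concatenate([x, tail])
    n = len(x) - 1
    # centered filter applied at the original end-point
    return float(np.dot(w, padded[n - m : n + m + 1]))

x = [3.0, 7.0, 2.0, 9.0, 4.0, 6.0]   # toy series, illustrative numbers only
w = [1, 4, 6, 4, 1]                  # the short binomial filter (divided by 16)
print(mannomatic_smooth_end(x, w))   # prints 6.0 -- pinned at the closing value
```

Whatever toy series you substitute, the smoothed end-point comes back as exactly the raw closing value.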