## The New Mann Paper

The cat has finally dragged in Mann et al, Robustness of proxy-based climate field reconstruction methods, url 😉 Supplementary info. This article was first cited by the rather bilious 😈 Referee #2 for Bürger and Cubasch, as though Bürger should have been aware of its findings. The coauthors are the “independent” authors: Wahl, Ammann and Rutherford.

Perhaps responding in part to prior criticism, Mann has provided an extensive supplementary information, including code for many steps. (Whether the code works and whether it’s complete are different questions, but on the surface at least, it’s a big improvement.)

Jean S writes:

#99: It’s already been a while 😉 Supplementary info is available here.

Please, could someone check if I got this right:
Mann is reporting that his (in)famous North-American PC1 is orthogonal (r=0.011422767) to local temperature?
(MBHandMXDcorr.xls, MBH1980-sheet, row 89 and MBHHandMXDcorrNoInstr.xls, MBH1980-sheet, row 67)
Robustness paper:

Under the assumption of moderate or low signal-to-noise ratios (e.g., lower than about SNR $\approx$ 0.5 or “80% noise”), which holds for the MBH98 proxy network as noted earlier, the value of $\rho$ for the “noise” closely approximates that for the “proxy” (which represents a combination of signal and noise components).

UC writes:

Ah, that’s the way to estimate the redness of proxy noise. But isn’t this obvious? We have a model

$P=\alpha T + n$

and as $\alpha$ is zero, we can estimate $\rho$ of $n$ directly from the proxy data 🙂
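A toy simulation makes UC’s point concrete (entirely synthetic numbers; the AR(1) coefficient, series length and seed are made up for illustration, not MBH parameters): when $\alpha$ is zero, the lag-1 autocorrelation estimated from the proxy is exactly that of the noise.

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 sample autocorrelation of a 1-D series."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

rng = np.random.default_rng(0)
n = 500
rho = 0.6                        # assumed "redness" of the proxy noise
eps = rng.standard_normal(n)
noise = np.zeros(n)
for t in range(1, n):            # AR(1) red noise
    noise[t] = rho * noise[t - 1] + eps[t]

temp = rng.standard_normal(n)    # stand-in local "temperature"
alpha = 0.0                      # the degenerate case under discussion
proxy = alpha * temp + noise     # P = alpha*T + n

# With alpha = 0, rho estimated from the proxy IS rho of the noise
print(lag1_autocorr(proxy), lag1_autocorr(noise))
```

With any small nonzero $\alpha$ the two estimates differ only slightly, which is the paper’s low-SNR argument in miniature.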

I haven’t had time to read it yet, but will try to do so soon; in the meantime, a few quick comments.

I noticed that Mann has continued to use his PC methodology without changing a comma, notwithstanding the strong statement of Wegman that his PC methodology was simply “wrong”, the statement by the NAS panel that it should be avoided, and North’s testimony at the House E&C hearing that he agreed with Wegman. In effect, Mann is saying that, using his PC1, he can “get” a hockey stick not only with the Partial Least Squares regression of MBH98, but with the RegEM variation (and I recall UC pointing out some odd de-centering of his RegEM method).

It’s one thing for Mann to keep using his PC methodology in the face of criticism from the Wegman and North panels, but why did the JGR reviewers acquiesce in the continued use of Mannian PCs? Pretty pathetic. Actually, it’s not just the JGR reviewers – Mann’s PC1 has been used recently by Osborn and Briffa 2006, Hegerl et al 2006 and Juckes et al – it’s as though the Team is brazenly showing solidarity with Mann to spite Wegman and others.

The word “bristlecone” is not mentioned anywhere in Mann’s new paper. So it’s a strange sort of “robustness” that Mann is proving. It’s already been agreed that, if you take the bristlecones out of the network, you can’t get a HS. So the original claim that the reconstruction is “robust” to the presence/absence of dendroclimatic indicators is false, although you won’t see a hint of that in this paper. Again, what were the reviewers doing? This has been a topical issue – why didn’t they ask Mann to consider it?

I checked Jean S’ comment about the PC1 correlation or lack of correlation and Jean S is right. The correlation of the MBH98 PC1 to the gridcell chosen here is 0.01 – not an imposing correlation for the one series that is essential to the reconstruction.

Here’s something else that’s amusing and shows Mann’s ridiculously perverse stubbornness and the ineptness of climate science referees. It’s been known for over 4 years that Mann mis-located a Paris precipitation series in a New England gridcell (and a Toulouse precipitation series in South Carolina) and that the “Bombay” precipitation series does not come from Bombay. The mislocation of the Paris precipitation series was not corrected in the 2004 Corrigendum and, in the new SI, Paris precipitation is still shown with a New England location (“The rain in Maine falls mainly in the Seine.”) Mann and the mini-Manns duly report that the mis-located precipitation series has a positive correlation to New England gridcell temperatures. You’d think that they would try to fix this sort of stuff at some point, but nope, the rain in Maine still falls in the Seine.

Update: I downloaded the reported correlation from the Mannian SI and then re-calculated correlations between the proxies in the AD1820 network and HadCRU3. Their SI stated:

Correlations were calculated between all 112 proxy indicators and both (1) local temperatures (average over the 4 nearest 5 degree lon x lat temperature gridpoints) … during overlapping intervals.

I then did a scatterplot comparing the reported correlations to the ones calculated using HadCRU3 – I used the single gridcell in which the record was located. The high correlations in the top right corner are correlations of actual temperature data to gridcell temperature – something that doesn’t seem like much of an accomplishment. Mann reported that all the correlations were positive, but I got no fewer than 60 out of 112 with negative correlations. Some prominent series, e.g. Gaspé ring widths, had negative correlations with HadCRU3 gridcell temperatures.

I presume, if a correlation was negative, that Mann just changed it to positive. For a few series, e.g. Quelccaya accumulation, there’s a plausible reason for this, but, in such cases, it would be better policy to invert the series ahead of time on a priori grounds. But there are some real puzzlers. For example, the Gaspé series (#53 – treeline11;St Anne) has a negative correlation (-0.11) with HadCRU3 gridcell, but Mann reports a positive correlation of 0.34. It’s hard to tell what’s going on – maybe Mann “inverted” negative correlations for reporting purposes, but didn’t invert the series for use in reconstructions.
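For what it’s worth, the check described above amounts to something like the following (a sketch with synthetic stand-ins for the proxy matrix and matched gridcell temperatures; the shapes, the 0.1 signal weight and the seed are assumptions for illustration, not anything from the Mannian SI):

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation of two equal-length series."""
    a = a - a.mean()
    b = b - b.mean()
    return np.dot(a, b) / np.sqrt(np.dot(a, a) * np.dot(b, b))

rng = np.random.default_rng(1)
n_years, n_proxies = 120, 112    # 112 proxies, ~120 overlap years (assumed)
gridcell = rng.standard_normal((n_years, n_proxies))   # matched gridcell temps
proxies = 0.1 * gridcell + rng.standard_normal((n_years, n_proxies))

r = np.array([pearson(proxies[:, j], gridcell[:, j]) for j in range(n_proxies)])
n_negative = int((r < 0).sum())
print(f"{n_negative} of {n_proxies} correlations are negative")

# Reporting |r| instead of r would make every entry come out "positive"
r_reported = np.abs(r)
print("all reported positive:", bool((r_reported >= 0).all()))
```

The last two lines are the “inverted for reporting” hypothesis in code form: take absolute values for the table, leave the series themselves untouched.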

1. Dave Dardinger
Posted Jul 9, 2007 at 12:20 PM | Permalink

One thing still bothers me about Mann’s CFR method. Since it’s intended to measure the global “climate field” rather than a local climate indicator / field, how do you decide what proxies are best except by cherry picking them? That is, cherry picking them via a target shape, in this case a hockey-stick shape?

And since they’re cherry-picked to produce a global field matching the desired pattern, that destroys any validity in using the normal statistics. Whether the picking is done via some pattern-matching program (the Mannomatic), an explicit human matching technique, or even a Darwinian survival-of-the-fittest of peer-reviewed articles, the true number of independent points you’re dealing with is quite small. This should mean it’s impossible to say that the selected proxies have much if any reconstruction skill.

I’ll believe that they’ve actually done what they claim when Steve M, Jean S, etc. conclude that they have. Not on their own say-so.
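The screening point above is easy to demonstrate with a toy simulation (all synthetic; the candidate count, AR(1) coefficient and 0.3 cutoff are arbitrary illustration choices, not anyone’s actual procedure): screen pure red noise against a target shape, and the composite of the survivors “matches” the target by construction.

```python
import numpy as np

rng = np.random.default_rng(2)
n_years, n_candidates = 150, 1000
target = np.linspace(0.0, 1.0, n_years) ** 3       # a hockey-stick-ish target

# Candidate "proxies": pure AR(1) red noise, containing no signal at all
eps = rng.standard_normal((n_candidates, n_years))
noise = np.zeros((n_candidates, n_years))
for t in range(1, n_years):
    noise[:, t] = 0.5 * noise[:, t - 1] + eps[:, t]

def corr(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return np.dot(a, b) / np.sqrt(np.dot(a, a) * np.dot(b, b))

r = np.array([corr(series, target) for series in noise])
picked = noise[r > 0.3]                            # "screening" step
composite = picked.mean(axis=0)

print(len(picked), "pure-noise series pass screening")
print("composite-vs-target correlation:", round(corr(composite, target), 2))
```

Standard significance tests applied to the composite would not know the selection step happened, which is exactly the degrees-of-freedom problem described above.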

2. Joe Ellebracht
Posted Jul 9, 2007 at 3:30 PM | Permalink

My football predictions are orthogonal to the game results, but they are skillfully done nevertheless.

3. Steve McIntyre
Posted Jul 9, 2007 at 3:44 PM | Permalink

I tried to replicate the reported correlations with HadCRU3 and couldn’t.

4. Posted Jul 9, 2007 at 3:51 PM | Permalink

This should be interesting. At least MM was mentioned. Out of interest, how do you calculate the 4 nearest 5-deg grids? There are either 1 or 8, but not a unique 4.
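One possible reading (only a guess; the SI doesn’t spell out the convention): if the 5° temperature values sit at gridcell centers (…, 2.5°, 7.5°, …), then any point not lying exactly on a center has a unique 2×2 block of bracketing centers, which would make “the 4 nearest gridpoints” well defined. The helper below is a hypothetical sketch of that convention, not Mann’s code:

```python
def four_nearest_centers(lat, lon):
    """Centers of the 2x2 block of 5-degree cells bracketing (lat, lon).
    Assumes cell centers at ..., -2.5, 2.5, 7.5, ... on both axes
    (an assumed grid convention, not taken from the Mann SI)."""
    def bracket(x):
        lower = 5 * ((x - 2.5) // 5) + 2.5   # nearest center at or below x
        return (lower, lower + 5)
    return [(la, lo) for la in bracket(lat) for lo in bracket(lon)]

# A proxy at 48.35N, 64.5W (roughly Gaspé) would average these four centers:
print(four_nearest_centers(48.35, -64.5))
# → [(47.5, -67.5), (47.5, -62.5), (52.5, -67.5), (52.5, -62.5)]
```

For a point sitting exactly on a center the bracketing choice becomes arbitrary, which is essentially the 1-or-8 ambiguity noted above.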

5. Posted Jul 9, 2007 at 3:55 PM | Permalink

RE: #2 – See me in August …. LOL!!!

6. Steve McIntyre
Posted Jul 9, 2007 at 3:56 PM | Permalink

Hmmm, I didn’t notice that on the first pass. It’s possible that they locate the proxy within a gridcell and then go to the nearest corner. I redid the calcs on that basis and they are similar to what I posted.

7. John Lang
Posted Jul 9, 2007 at 4:27 PM | Permalink

I believe the negative correlation problem is that Mann is not looking at local temperatures but his spooky action at a distance correlation with hemispheric or global temperature.

[4] The [non-Mann] CPS approach makes the potentially quite restrictive
assumption that all proxy data used are local indicators
of the particular climate field (e.g., surface temperature) for
which a reconstruction is sought. The [Mann] CFR approach avoids
this potentially restrictive assumption, providing a reconstruction
of the entire climate field of interest (e.g., surface
temperature field) from a spatially distributed network of
climate proxy indicators containing a diverse range of
climate signals. The spatial reconstructions can be averaged
to yield, e.g., a hemispheric or regional mean temperature
series.

Thus, there is no point making this “assumption” unless it means you are not correlating the proxy indicators to local gridcell temperatures but to the hockey stick to start with.

And Mann needs to take writing lessons, because I have never seen such gibberish in my life except from conmen.

8. Sam Urbinto
Posted Jul 9, 2007 at 4:31 PM | Permalink

How many gridcells are there in total? And what percent of them are like this?

9. Jean S
Posted Jul 9, 2007 at 4:32 PM | Permalink

Steve, I think they might have used “the infilled” grid temp data from Rutherford and SPL data from Zhang (see third bullet here). Rutherford’s gridded temp data is available here: http://fox.rwu.edu/~rutherfo/supplements/jclim2003a/

BTW, Rutherford’s code has been updated:

UPDATE (June 20, 2007):
Since the original publication of this work, we have revised the method upon discovering a sensitivity to the calibration period.
Revised code and methods are available here

Hmmm, I wonder if they also had problems with centering 😉

10. Jean S
Posted Jul 9, 2007 at 4:34 PM | Permalink

I meant the second bullet on p. 3 (Section 2A)

11. Steve McIntyre
Posted Jul 9, 2007 at 4:46 PM | Permalink

Rutherford et al 2005 used HadCRU1 – which is intermediate between the version used in MBH98 (Jones 1994) and HadCRU2. So the various new homogeneity adjustments introduced in HadCRU3 have completely screwed up all the correlations in Mann et al 2007.

Think about all the bile against Swindle for using an obsolete GISS version. I wonder if Bob Ward will file a complaint about Mann using an obsolete CRU version. Don’t referees care about these things?

12. Jean S
Posted Jul 9, 2007 at 4:53 PM | Permalink

#11: Additionally, the “temperature field” of Rutherford et al is “infilled”, that is, they used their version of RegEM to fill in the missing values of HadCRU1. If I recall correctly, Bürger complained about this (i.e., using the same algorithm to “reconstruct” the instrumental values that the final proxy reconstruction is validated against).

13. Steve McIntyre
Posted Jul 9, 2007 at 5:24 PM | Permalink

I just tried this with HadCRU1 and it was closer to the reported figures, but there are still a lot of discrepancies. It’s amazing how hard it is to replicate even the simplest Mannian calculation. One of the biggest discrepancies was at Gaspé. Mann reports a correlation of 0.34, while it has a correlation of -0.11 with HadCRU3. I got a correlation of 0.46 to the current HadCRU1. The differences arise primarily from some additional 19th century values inserted into HadCRU3 that are non-homogeneous to say the least. I wonder what HadCRU3 did?

14. Mark T.
Posted Jul 9, 2007 at 5:38 PM | Permalink

I haven’t had a chance to review the code, Jean S., but have they continued to use the (assumed) ergodic mean, rather than the vector mean, in their processing?

Mark

15. Steve McIntyre
Posted Jul 9, 2007 at 5:46 PM | Permalink

Maybe someone could check out the issue discussed here: http://www.climateaudit.org/?p=519 . It appeared to me at the time that Rutherford et al had spliced their temperature with proxies one year off. Given that they don’t usually fix errors, I presume that they’ve done the same thing here.

Jean S, this post observes that the period for the archived temperature data is 1854-1993, which doesn’t tie in with the 1856 start date. Hey, it’s the gang that can’t shoot straight, why would anything match?

16. Kristen Byrnes
Posted Jul 9, 2007 at 7:57 PM | Permalink

To me, it’s still comparing the temperature of a shady forest against a temperature record of blacktop, air conditioning exhausts, bbq’s and incinerators. You will still get a hockey stick except now the handle of the stick will not be as straight.

17. Hans Erren
Posted Jul 10, 2007 at 4:05 AM | Permalink

Kristen, when do you plan to remove your error that “at most, man-made greenhouse gases are 1/10,000 of Earth’s atmosphere”?
You should know that 300 ppm of CO2 absorbs 16% of the heat.
Check this site:
http://www.sciencebits.com/OnClimateSensitivity

18. John Lang
Posted Jul 10, 2007 at 6:23 AM | Permalink

To Hans #17 (not that Kristen needs any defending) “man-made greenhouse gases” are about 100 ppm of CO2 (383 current minus 280 natural) and a few hundred ppb of other greenhouse gases, which works out to about 1/10,000th of the atmosphere.

How much the “man-made greenhouse gases” add to the overall greenhouse effect is a different matter and is, currently, subject to debate: between 0.3 C and 2.0 C of the total 33.0 C of greenhouse effect.

19. Hans Erren
Posted Jul 10, 2007 at 7:07 AM | Permalink

Which proves my point: emphasising the role of CO2 as a minute trace gas doesn’t say anything about CO2 as a strong greenhouse gas.

20. Kristen Byrnes
Posted Jul 10, 2007 at 7:12 AM | Permalink

Hans # 17,

John #18 has it right. During interglaciations the normal CO2 concentration in the atmosphere is about 280 ppm. Currently there is about 380 ppm. 380 – 280 = 100 ppm (the maximum that can be blamed on man). 100 ppm = 1/10,000.
John’s statement about how much temperature change (0.3 C to 2.0 C), however, is now up for more debate because, like the tip of the hockey stick, those temperatures are based on flawed data. We cannot know what amount of temperature change can be blamed on greenhouse gases until Phil Jones, James Hansen and the rest adjust for rooftops, pavement, air conditioning exhausts, trash burn bins, etc.

21. Jaye
Posted Jul 10, 2007 at 7:17 AM | Permalink

which proves my point that emphasising the role of CO2 as a minute trace gas doesn’t say anything about CO2 as a strong greenhouse gas.

Just do the arithmetic for GHGs. Then weight them by “greenhouse power”… however that is defined.

22. Hans Erren
Posted Jul 10, 2007 at 7:29 AM | Permalink

Kristen, with all respect, you are spinning. Read Nir Shaviv’s exposé about the theoretical value of climate sensitivity, which is unrelated to recent observations of temperature. http://www.sciencebits.com/OnClimateSensitivity

Also have a look at my page, where I compare Nir Shaviv’s climate sensitivity with temperatures from satellites.
http://home.casema.nl/errenwijlens/co2/howmuch.htm

Jaye, water vapour is a weak greenhouse gas, but it is very abundant; you need percentages of it to get the same absorption as a few ppm of CO2. “It’s the dose that makes the poison.”

23. Edouard
Posted Jul 10, 2007 at 7:54 AM | Permalink

@22

Hello Mr Erren,

If one compares satellite temperatures with CO2 ppm, what do we know about the natural climate response? If the activity of the sun has been very high for more than 60 years, isn’t it possible that the whole climate system responds only slowly to this high solar activity? During the discussion last year about the loss of heat content, the AGW climatologists explained that the heat had disappeared into the deep sea. What happens to the heat gathered from the higher activity of the sun? Why was it warmer, or as warm as today, in the medieval warm period, when the sun was much less active? Perhaps the climate responds only very slowly, over more than a hundred years?

And a last question: if one adds up all the forcings we know, does one get a result called “climate”, or do you think that the climate can change naturally, without changes in greenhouse gases and albedo from ice coverage?

best regards

Edouard

24. John Lang
Posted Jul 10, 2007 at 7:55 AM | Permalink

If the math in the climate sensitivity models is as bad as Mann’s math (see above), then Kristen is right. What was the temperature of the earth when the CO2 level was 7,000 ppm?

25. John A
Posted Jul 10, 2007 at 7:56 AM | Permalink

Jaye, water vapour is a weak greenhouse gas, but it is very abundant; you need percentages of it to get the same absorption as a few ppm of CO2. “It’s the dose that makes the poison.”

Unfortunately the ice cores do not support your contention that carbon dioxide enrichment of the atmosphere warms the Earth’s climate at all. This means there is either some unknown factor that negates the effect of carbon dioxide enrichment, or the treatment given to carbon dioxide sensitivity is simply wrong.

26. John Lang
Posted Jul 10, 2007 at 8:01 AM | Permalink

Global temperatures as measured by the satellites have declined by 0.8C over the past 100 months while CO2 levels have increased by 20 ppm.

27. Hans Erren
Posted Jul 10, 2007 at 8:22 AM | Permalink

John Lang, please read the multivariate analysis, where it is shown that ENSO is the dominant factor in tropospheric temperature.
http://home.casema.nl/errenwijlens/co2/howmuch.htm

John A, CO2 contribution to temperature in the ice ages is hidden in the noise (pink line vs green line). So there is plenty of tweaking space in the ice ages as albedo feedback is a huge fiddle factor.

28. Kristen Byrnes
Posted Jul 10, 2007 at 8:42 AM | Permalink

Hans # 22

I am not sure how we got to all of this, you started off by assuming that my 1/10,000 figure was incorrect.

Despite that, I looked at Nir Shaviv’s climate sensitivity article (and I think Nir Shaviv is about as good as you will find) and all I saw was how much temperature change is caused by a forcing (I assume mainly CO2 here). And that is the point. You can calculate all you want, but you still have to confirm your calculations by observation. But the observations are flawed by pavement, burn bins, air conditioning units and asphalt rooftops.

Then I went to your page as you asked and looked it over. You used the satellite temps, which are not biased by air conditioning vents, burn bins, pavement and the like. I noticed your graph where you eliminated much of the chaos from the climate system (ENSO, volcanic and solar) and were left with residuals (I’m not sure what your “unknown linear effect” was). It would be interesting if you went further and adjusted for GCR, UV, thermal inertia of the oceans and albedo. Nevertheless, I did not see a trend in the residuals.
If I can ask a favor, it would be nice if you updated the graph, since it seems to end in 2001.

29. MarkW
Posted Jul 10, 2007 at 9:23 AM | Permalink

Hmm,

During the ice ages, the effects of CO2 were hidden in the noise. But today, it’s the only thing that matters; everything else is just noise.

What happened to CO2 over the last 100,000 years to so increase its ability to absorb heat?

30. Jonathan Schafer
Posted Jul 10, 2007 at 9:25 AM | Permalink

17 – 28,

No offense, but can this stuff be moved to a different thread? It has nothing to do with the Mann paper.

31. MarkW
Posted Jul 10, 2007 at 9:28 AM | Permalink

He started it.

32. Posted Jul 10, 2007 at 9:46 AM | Permalink

RE: #22 and numerous others. CO2 is a strong GHG. However, it is not as well mixed as assumed by many. Furthermore, although GHGs (including CO2) do constitute a notable positive forcing, what seems to escape the comprehension of defenders of the hyper-AGW clique are the following (among others): parasitic terms, certain negative forcing terms, system resonances and destructive interferences, convective cloud systems, maritime stratiform cloud systems, non-linear, CO2-concentration-dependent sinking mechanisms, the true behavior and trends regarding dust and aerosols, etc, etc, etc. This is the essence of the debate. The AGW fanatics do not want to face this music.

33. Sam Urbinto
Posted Jul 10, 2007 at 9:50 AM | Permalink

Yeah, plus you’re not even talking about the same thing. Or maybe something along the lines of arguing about whether a bowl is convex or concave.

Interglaciations, 280. Gone up recently, 100, and thought to be mostly all manmade. If all, 1/10,000 atmosphere. 100 absorbs 5% of heat though, due to its forcing strength. Thought to cause current possible warming. However, warming might be measurement, adjustment or model errors.

The details are always a point of argument, a huge circular one that never ends and seems to always be matter of either semantics, perception of the meaning of the details, or preconceptions.

There’s a reason stuff from the team is so obfuscated all the time.

34. John A
Posted Jul 10, 2007 at 11:41 AM | Permalink

I then did a scatterplot comparing the reported correlations to the ones calculated using HadCRU3 – I used the single gridcell in which the record was located. The high correlations in the top right corner are correlations of actual temperature data to gridcell temperature – something that doesn’t seem like much of an accomplishment. Mann reported that all the correlations were positive, but I got no fewer than 60 out of 112 with negative correlations. Some prominent series, e.g. Gaspé ring widths, had negative correlations with HadCRU3 gridcell temperatures.

I presume, if a correlation was negative, that Mann just changed it to positive. For a few series, e.g. Quelccaya accumulation, there’s a plausible reason for this, but, in such cases, it would be better policy to invert the series ahead of time on a priori grounds. But there are some real puzzlers. For example, the Gaspé series (#53 – treeline11;St Anne) has a negative correlation (-0.11) with HadCRU3 gridcell, but Mann reports a positive correlation of 0.34. It’s hard to tell what’s going on – maybe Mann “inverted” negative correlations for reporting purposes, but didn’t invert the series for use in reconstructions.

Then what Mann has done is not science, but something else.

35. Vince Causey
Posted Jul 10, 2007 at 1:59 PM | Permalink

RE #33. I don’t know what you mean when you say “100 absorbs 5% of heat though. . ” Estimates of forcing for CO2, or any other GHG for that matter are given in watts/metre squared. Can you explain how you arrived at this 5%?

36. Posted Jul 10, 2007 at 2:08 PM | Permalink

Apologies for hijacking this thread.
Kristen, please suggest a public forum where we can continue the discussion.

37. Sam Urbinto
Posted Jul 10, 2007 at 4:08 PM | Permalink

#35 It’s just an estimate, based upon the claim that 300 ppm (the 1850 level) absorbs 16% of heat, so 100 ppm is about 5%.

I’ll post the rest in unthreaded #14 and take it out of here, since it’s not about proxies or models or anything really about Mann’s paper.

38. Edouard
Posted Jul 11, 2007 at 2:39 AM | Permalink

@all

I don’t really understand the point of this blog if the holy “climatologists”, like Mr Erren for instance, NEVER answer questions from laymen. :-((( This is really a bit annoying. The AGW’ers at least answer general questions on their blog.

Please don’t be surprised if no one takes anything related to climate “science” seriously anymore.

and thank you SO much for the nice answers :-(((

Ciao

39. Hans Erren
Posted Jul 11, 2007 at 2:45 AM | Permalink

I don’t really understand the point of this blog if the holy “climatologists”, like Mr Erren for instance, NEVER answer questions from laymen

Huh? you can ask me at ukweatherworld Edouard.
http://www.ukweatherworld.co.uk/forum/forums/forum-view.asp?fid=30

40. Edouard
Posted Jul 11, 2007 at 2:59 AM | Permalink

Huh? you can ask me at ukweatherworld Edouard.
http://www.ukweatherworld.co.uk/forum/forums/forum-view.asp?fid=30

Ok, Mr Erren, that’s a very nice proposal. :-))) I will do it immediately.

Thanks a lot

Edouard

41. Don Keiller
Posted Jul 11, 2007 at 3:09 AM | Permalink

Steve, you say
“I then did a scatterplot comparing the reported correlations to the ones calculated using HadCRU3 – I used the single gridcell in which the record was located. The high correlations in the top right corner are correlations of actual temperature data to gridcell temperature – something that doesn’t seem like much of an accomplishment. Mann reported that all the correlations were positive, but I got no fewer than 60 out of 112 with negative correlations. Some prominent series, e.g. Gaspé ring widths, had negative correlations with HadCRU3 gridcell temperatures.”

So looking at this plot, the BEST correlations (apart from correlations of actual temperature with gridcell, which should be good) are, in the main, lower than 0.3 (+ or -). Not a lot to base a paper on, let alone political and economic policy.

42. Jaye
Posted Jul 11, 2007 at 3:52 AM | Permalink

Jaye, water vapour is a weak greenhouse gas, but it is very abundant; you need percentages of it to get the same absorption as a few ppm of CO2. “It’s the dose that makes the poison.”

I thought that was what I said.

43. TCO
Posted Aug 6, 2007 at 7:17 PM | Permalink

“I haven’t had time to read it yet, but will try to do so soon, but a few quick comments.”

Well, have you “had time to read” what you snark-labelled as “what the cat dragged in”? You seem to have all this time for asphalt and such, but never have time to really grapple with issues all the way to understanding.

44. Laws of Nature
Posted Nov 23, 2009 at 4:00 AM | Permalink

Dear Steve,

I have tried to follow the hockey-stick debate for quite a while now. I thought that one of the key points of your argument was the falsification of the original algorithm with simulated data.

Now in this paper Mann et al. also seem to model data (e.g. chapter 4.5). What is the difference?
It seems to me that you were assuming just red noise in order to test your data, whereas Mann assumed a hockeystick present in the data plus red noise for his tests. Am I correct in this guess?

All the best regards and thank you for an answer,

LoN
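The distinction LoN asks about can be sketched like this (schematic pseudo-proxies with made-up parameters, not the actual MM05 or Mann et al benchmarks): a null test feeds the method pure red noise and asks whether it manufactures a hockey stick, while a signal-plus-noise test embeds a known shape and asks whether it is recovered at a given SNR.

```python
import numpy as np

def ar1(n, rho, rng):
    """AR(1) ('red noise') series of length n."""
    x = np.zeros(n)
    eps = rng.standard_normal(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + eps[t]
    return x

rng = np.random.default_rng(3)
n_years = 581                    # e.g. AD 1400-1980

# Null test (MM-style): the pseudo-proxy is red noise only, so any
# hockey stick the method produces is an artifact of the method.
null_proxy = ar1(n_years, 0.2, rng)

# Signal-plus-noise test: a known shape is embedded at a chosen SNR,
# and the question becomes whether the method recovers that shape.
signal = np.concatenate([np.zeros(n_years - 80), np.linspace(0.0, 1.0, 80)])
snr = 0.5
pseudo_proxy = snr * signal / signal.std() + ar1(n_years, 0.2, rng)
```

The two setups test different failure modes: the first asks whether the method creates a shape from nothing, the second whether it can find a shape that is actually there.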

1. By It's on... on Jan 17, 2009 at 11:54 PM

[…] on 9 Jul 2007: Here’s something else that’s amusing and shows Mann’s ridiculously perverse […]