Mann 2008: Impact of the Missing Data

Jeff Id has an interesting post in which he examines the 148 “missing” series. Check it out. I haven’t verified the calculation, but will do so. I submitted a comment at his blog observing that the two “Fisher” series are actually Dahl-Jensen borehole reconstructions from Greenland. These two series were said to have been used in Mann and Jones 2003, but the construction of the Mann and Jones 2003 composite remains, at present, considerably more mysterious than MBH.


26 Comments

  1. shs28078
    Posted Sep 15, 2008 at 8:47 AM | Permalink

    Aren’t you beating a dead horse?

  2. ToSH
    Posted Sep 15, 2008 at 8:57 AM | Permalink

    If you don’t beat the dead horse you’ll end up with a zombie-horse, which the case of Mann 2008 clearly demonstrates.

  3. Jeff Alberts
    Posted Sep 15, 2008 at 10:14 AM | Permalink

    Time to cut off its head.

    • DeWitt Payne
      Posted Sep 15, 2008 at 10:32 AM | Permalink

      Re: Jeff Alberts (#3),

      Stuff the mouth with salt and sew the lips together. Burn the heart and scatter the ashes in a river. I think I’m mixing zombies and vampires here, but it should work for either.

  4. Craig Loehle
    Posted Sep 15, 2008 at 10:54 AM | Permalink

    Proxies are not necessarily scaled directly to temperature. They can often have an inverse relationship. For example, there is MORE ice rafted debris in the North Atlantic when it is colder. There may be LESS pollen of a species or deposits of a plankton when it is warmer. Plotting all the proxies on a graph and looking for the MWP or whatever is simply meaningless until they are converted to temperature.

    • Thor
      Posted Sep 15, 2008 at 11:41 AM | Permalink

      Re: Craig Loehle (#5),

      So, I’m trying to catch up and understand how things are done. This is my understanding of it:

      First, convert data from each individual proxy to temperature using an algorithm that is appropriate for the type of proxy and the physical behaviour of the proxy in question. Then calibrate the temperature estimate against a local temperature record (scaling, offset etc), and then validate against a different part of the local temperature record (not part of the calibration period). The output after all this is an estimate of past temperatures, for one proxy in one location.

      The validation procedure and acceptance criteria are determined in advance and followed strictly. Proxies failing validation are not considered to be valid for temperature calculations and are not used for further processing.

      After this procedure has been repeated for ALL proxies, the temperature estimates for each proxy can then be used to create local, regional or global temperature estimates of the past by some sort of averaging procedure.

      Is this how things are done in Mann2008?
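A minimal sketch in Python of the per-proxy pipeline Thor outlines above. The linear fit, the withheld validation window, and the 0.5 correlation threshold are all illustrative assumptions for this sketch, not the actual Mann 2008 procedure:

```python
import numpy as np

def calibrate_and_validate(proxy, local_temp, cal_idx, val_idx, r_min=0.5):
    """Fit proxy -> temperature over the calibration period, then check
    correlation over a withheld validation period. Returns the reconstructed
    temperature series, or None if the proxy fails the acceptance criterion
    (which is fixed in advance, before any proxy is examined)."""
    slope, intercept = np.polyfit(proxy[cal_idx], local_temp[cal_idx], 1)
    reconstruction = slope * proxy + intercept
    r_val = np.corrcoef(reconstruction[val_idx], local_temp[val_idx])[0, 1]
    if r_val < r_min:
        return None  # failed validation: excluded from further processing
    return reconstruction

# Accepted proxies would then be combined by some averaging procedure, e.g.:
# composite = np.nanmean(np.vstack(accepted_reconstructions), axis=0)
```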

      • Craig Loehle
        Posted Sep 15, 2008 at 1:32 PM | Permalink

        Re: Thor (#7), Thor: you captured MY preferred method. This is NOT how Mann does it, as far as I can tell; rather, he uses a fancy algorithm that again weights different proxies differently, or something.

        • Thor
          Posted Sep 16, 2008 at 7:42 AM | Permalink

          Re: Craig Loehle (#13),

          Oh, I would have thought the new paper was different and improved.
          But really, it is just “Same stick, new wrapping” then :(

    • Posted Sep 15, 2008 at 12:36 PM | Permalink

      Re: Craig Loehle (#5), Amen.

      • Posted Sep 15, 2008 at 1:03 PM | Permalink

        Re: captdallas2 (#10), Tom, thanks. And sorry to everyone: I had not realised this was limited to the UK (so much for our BBC World Service!). I see Steve has now overtaken my comment anyway in the latest post. If you can find/see the second episode, it is certainly worth a look.

  5. Jeff Alberts
    Posted Sep 15, 2008 at 10:58 AM | Permalink

    DeWitt, I think a shotgun is much more effective ;)

  6. Posted Sep 15, 2008 at 12:01 PM | Permalink

    Please forgive me for raising a very simplistic question (I am a layman and an admirer of your blog, especially your desire for rigorous analysis). I saw the second part of a TV documentary hosted by Iain Stewart on the BBC last night (a geologist at a UK university, who I must say seems a reasonable chap to me at least). He suggested that because there are now so many proxies producing a “hockey stick”, all seeming to agree on one thing, i.e. the massive uptick in the last century well above anything we have seen in history, then no matter how many statisticians quibble with the analysis (as produced by Mann originally), the climate is heading for catastrophic warming, and anyone who doubts this is now very much in the minority (and in his view probably wrong). Interestingly, Mann appears on the programme saying how surprised he has been by the attack on his analysis.

    I have seen the analysis here, and although I find some of the maths hard to follow beyond my basic understanding of statistics, I sensed that you and others had, at the very least, shown significant areas of doubt.

    Is it true that there are now numerous analyses of proxy data that all show the same picture as the Mann analysis?

    Is it true that there are now so many studies that show a hockey stick that the evidence is becoming (as Iain Stewart seemed to suggest at least) incontrovertible?

    For the international viewers at least you can watch the episode in question at: http://www.bbc.co.uk/iplayer/episode/b00dm7d5/

    • Tom
      Posted Sep 15, 2008 at 12:24 PM | Permalink

      Re: Paul H Clark (#8),
      international viewers cannot watch the video, because the BBC permits only viewers from the UK. But the first episode can be watched on Google Video: http://video.google.de/videoplay?docid=8547224522119252436

    • Posted Sep 15, 2008 at 3:58 PM | Permalink

      Re: Paul H Clark (#8), Showing something doesn’t mean there is any truth to what is shown. Mann can show something he wants shown, or believes should be shown, but that doesn’t prove it is the truth. Only reconstructions that have some reasonable correlation to temperature should be used in a temperature reconstruction. What I find frustrating is that low frequency reconstructions, which have a better shot at determining past temperature, are dismissed in favour of high frequency reconstructions that are assumed to have temperature relationships. The whole paleoclimate deal has jumped the track.

  7. Steve McIntyre
    Posted Sep 15, 2008 at 1:15 PM | Permalink

    I just watched the first episode, which was pretty interesting and I’ll look forward to the next episode.

    I’m going to look at Carl Wunsch’s interpretation of the Greenland δ18O fluctuations, which, as I recall, arrives at a different conclusion than Stewart’s.

  8. Chris H
    Posted Sep 15, 2008 at 1:32 PM | Permalink

    Completely unrelated (so please delete if this isn’t allowed), but I only just read an article on how all the GCMs assume a constant relative humidity:

    http://www.friendsofscience.org/assets/documents/The_Saturated_Greenhouse_Effect.htm

    This model assumption is confirmed on Real Climate, of all places:

    http://www.realclimate.org/index.php?p=142

    Unfortunately NOAA’s own data shows that this is wrong, for example in the tropics at the crucial 300mb (9km) height:

    http://www.cdc.noaa.gov/cgi-bin/Timeseries/timeseries.pl?ntype=1&var=Relative+Humidity+(up+to+300mb+only)&level=300&lat1=0&lat2=0&lon1=-180&lon2=180&iseas=1&mon1=0&mon2=11&iarea=0&typeout=2&Submit=Create+Timeseries

    My own back-of-the-envelope calculations show that a 1C rise in temperature (from 15C to 16C) would cause an approx. 6.7% increase in specific humidity, **assuming** a constant relative humidity. This would cause a further (approx.) 1.33C rise due to the greenhouse gas effect of water (33C*0.60*0.067=1.33C). Obviously a massive (actually run-away) greenhouse effect, not far off what is seen in the GCMs – except of course it doesn’t seem to happen in reality! (But neither is relative humidity really constant…)
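The back-of-the-envelope numbers above can be reproduced in a few lines of Python. The Magnus formula for saturation vapor pressure is a standard approximation; the 33 C total greenhouse effect and the 60% water-vapor attribution are taken from the comment itself, not from any model:

```python
import math

def saturation_vapor_pressure(t_c):
    """Magnus approximation to saturation vapor pressure over water (hPa)."""
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

# Fractional rise in specific humidity for 15 C -> 16 C, assuming constant
# relative humidity (specific humidity then scales with saturation pressure).
frac = saturation_vapor_pressure(16.0) / saturation_vapor_pressure(15.0) - 1.0

# Linear feedback estimate from the comment: 33 C total greenhouse effect,
# 60% attributed to water vapor, scaled by the humidity increase.
extra_warming = 33.0 * 0.60 * frac
print(f"humidity increase: {frac:.1%}, extra warming: {extra_warming:.2f} C")
```

With this formula the result is roughly a 6.6% humidity increase and about 1.31 C of extra warming, close to the 6.7% and 1.33 C quoted.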

    • DeWitt Payne
      Posted Sep 15, 2008 at 2:54 PM | Permalink

      Re: Chris H (#14),

      My own back-of-the-envelope calculations…

      That linear thing doesn’t work very well at all. You can use a real radiative transfer model instead. It’s a little out of date, but it should still be fairly close. Using the 1976 standard atmosphere with no clouds, it takes an increase from 375 to 820 ppm CO2 to produce a 1 C change in surface temperature at constant water vapor pressure (rebalance OLR, 100km looking down). Keep CO2 at 820 and change to constant RH and the surface temperature must go up by 1.48 C to rebalance OLR. But that’s clear sky which is somewhat unrealistic. Add low clouds and the CO2 must go to 1130 ppm to get a 1 C change at constant water vapor pressure while the surface temperature only needs to go up by 1.27 C at constant RH. Of course, the forcing in the tropics is greater, but the tropics will export a lot of the extra heat to high latitudes etc. which is why you need a GCM to do the job right, if only you could write one that gets the tropical troposphere time series over the last 28 years right.

      • Chris H
        Posted Sep 15, 2008 at 5:55 PM | Permalink

        Re: DeWitt Payne (#15)
        Thanks for the link to a real radiative transfer model! I will have a play… But just in case any other non-climate-scientist here wants to have a go, you missed one vital piece of info: A 1C rise in temperature is caused by a 3.3 W/m2 drop in the Earth’s radiative output (Iout). Thus in your first example, increasing the CO2 from 375 to 820 caused Iout to drop from 255.565 to 258.799 (i.e. drops by 3.234 which is approx 3.3).

        • Chris H
          Posted Sep 16, 2008 at 5:37 AM | Permalink

          Re: Chris H (#19),
          Please ignore my last post (or even delete it), as I had misunderstood how to use the radiative transfer model. One simply needs to increase the “Ground T offset, C” (by say 1C), to bring Iout back to 258.799. i.e. “rebalancing the OLR” (Outgoing Longwave Radiation).
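The “rebalancing the OLR” step described above can be illustrated with a toy model: solve for the surface temperature offset that cancels a CO2 forcing so OLR returns to its original value. The logarithmic forcing (3.7 W/m2 per doubling) and the linear Planck response (3.3 W/m2 per K, the figure quoted above) are textbook stand-ins for the real radiative transfer code:

```python
import math

def toy_olr(dt_surface, co2_ppm, co2_base=375.0, olr_base=255.6,
            planck=3.3, forcing_2x=3.7):
    """Toy outgoing longwave radiation (W/m2): linear Planck response to a
    surface temperature offset minus a logarithmic CO2 forcing."""
    return (olr_base + planck * dt_surface
            - forcing_2x * math.log2(co2_ppm / co2_base))

# Step 1: raise CO2 at fixed temperature -> OLR drops (radiative imbalance).
# Step 2: find the ground-temperature offset that restores the original OLR,
# here by bisection rather than by hand-tuning the offset field.
def rebalance(co2_ppm, lo=0.0, hi=10.0, tol=1e-6):
    target = toy_olr(0.0, 375.0)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if toy_olr(mid, co2_ppm) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"{rebalance(820.0):.2f} C")  # about 1.3 C with these toy constants
```

With these stand-in constants the answer differs somewhat from the figures quoted above; the point is the rebalancing procedure, not the number.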

  9. Arthur Dent
    Posted Sep 15, 2008 at 4:15 PM | Permalink

    Re: Paul H Clark (#8) I didn’t see the programme, but one critical point would be whether Iain explained the divergence problem.

  10. hswiseman
    Posted Sep 15, 2008 at 4:43 PM | Permalink

    Speaking of Dead Horses, this conversation was recently overheard between Dr. Mann and an “associate” of SM (with Apologies to FFC),

    “You don’t understand, Stevie Mac never gets that Hockey-Stick. The Stick is perfect for him;
    it’ll make him a big star. And I’m gonna run him out of the business, and let me tell you why.
    Stevie Mac ruined one of the Team’s most valuable proteges. For five years
    we had her under training. Singing lessons; acting lessons, dancing lessons. I spent hundreds
    of thousands of dollars on her. I was gonna make her a big star! And let me be even more
    frank, just to show you that I’m not a hard-hearted man, and it’s not all dollars and cents.
    She was beautiful; she was young, she was innocent. She had the greatest Bristlecone Core
    Sample I ever had seen, and I had seen ’em all over a very small portion of North America! And
    then Stevie Mac comes along with his maple syrup voice, and Canuck charm. And she runs off.
    She threw it all away just to make me look ridiculous! And a man in my position can’t afford to
    be made to look ridiculous! Now you get the hell outta here!”

  11. Pete
    Posted Sep 15, 2008 at 9:19 PM | Permalink

    At the end of the 1st episode Stewart says he’ll talk about the skeptics and how science will be shown to be the winner in episode 2. That should be real interesting.

  12. Posted Sep 15, 2008 at 10:13 PM | Permalink

    I did another post, much longer this time, comparing the original 1209 (deleted data) with the data actually used.

    I think you’ll find it interesting.

    http://noconsensus.wordpress.com/2008/09/16/mann-08-variable-data/

  13. Ernie
    Posted Sep 16, 2008 at 1:23 AM | Permalink

    I just watched the 2nd episode, “Earth: The Climate Wars: Fightback”. It’s alarmist, with selective editing to dramatize issues. I rate it as bad as “Swindle”, which I also didn’t think much of. The Hockey Stick is there in all its glory, with some short interview snippets from Mann. Iain Stewart portrays the HS as proven beyond much doubt; he doesn’t appear to have researched the controversy in great detail.

    – Ernie.

  14. STAFFAN LINDSTROEM
    Posted Sep 16, 2008 at 3:45 AM | Permalink

    #8 Paul H Clark … I’m sorry, but international viewers can’t normally see this BBC content … However, there are methods to do so(?), which I of course do not endorse, described on YouTube … Firefox 3 seems to help?!

  15. Posted Sep 18, 2008 at 9:44 AM | Permalink

    This is kind of a dead thread but I don’t want it to be off topic.

    I think this is my best work so far. There are some monster problems in the Mann08 data.

    http://noconsensus.wordpress.com/2008/09/18/the-all-important-blade-of-the-stick-uses-less-than-5-of-the-data/
