Unthreaded #36

869 Comments

  1. fred
    Posted Jul 9, 2008 at 1:20 AM | Permalink

    In a more serious vein, it’s very hard to understand where Tamino is coming from. On the one hand, he apparently makes a living as a time series analyst at a proper scientific institution, he shows at the very least considerable familiarity with math and stats, and he’s published peer-reviewed papers on both math and climate.

    On the other hand, there are episodes like his series on PCA, culminating in that absurd defence of MBH, which only worked if you thought it made no difference whether the HS was the first PC or the fourth or fifth. It started out as a model of clear exposition, educational and constructive, but suddenly took a sharp left turn into total absurdity at the end – absurdity of a sort that was incomprehensible. One felt: this is so bad, so misleading about the merits of the case and the application of the stats to it, that it really doesn’t matter whether it’s deliberate or not; either way it’s equally discreditable.

    Then you have the odd lapses into pure venom which occur from time to time, and the egging on of a claque of similarly poisonous posters.

    The blog had a lot of promise, whether one agreed with it or not, and it has large patches of clear exposition and explanation. He has a talent, when he chooses to use it, for clear exposition of technical subjects. But as in the case of the final installment on PCA, he all too frequently misuses it in ways that mislead, and all too frequently lapses into substituting a torrent of abuse for argument and evidence.

    I have finally found him unreadable and unrewarding, and it’s a pity. It’s an object lesson in the importance of keeping it clean, calm and honest if you want to persuade. Ah well, as Aristotle says – or is it Aquinas – it’s human nature. We know the better and we choose the worse.

  2. Steve McIntyre
    Posted Jul 9, 2008 at 6:10 AM | Permalink

    I got stung once again by (another) really annoying habit of Windows. I write scripts in Notepad and don’t always save what I’m working on. Normally I shut down my computer at night, but I left it running last night as I was scraping some data. When I got up, some update process had taken effect; my computer had been automatically shut down by Windows and rebooted. Things in Word are saved, but things in Notepad are lost. So I lost quite a bit of work, for the second time in six weeks.

  3. Posted Jul 9, 2008 at 6:22 AM | Permalink

    #21 David,

    The difference illustrates how unsettling the science can be: adjusted data from a network of thousands of surface stations versus satellite data adjusted for altitude.

  4. Tony Edwards
    Posted Jul 9, 2008 at 6:31 AM | Permalink

    Joel, #6, in the spirit of accuracy, shouldn’t a female bulldog be “Tamina”?
    À la The Magic Flute. Also, if I remember it right, Tamino remained silent in the face of adversity.

  5. Smokey
    Posted Jul 9, 2008 at 6:34 AM | Permalink

    In #22’s post, the key word is “adjusted.” You either have raw data or cooked data.

  6. Posted Jul 9, 2008 at 6:39 AM | Permalink

    Bummer. Try doing more in Word; just remember to save as plain text before posting. Better yet, write a post on how idiotic it is that Microsoft Word doesn’t have a blog-friendly mode.

  7. Cliff Huston
    Posted Jul 9, 2008 at 6:44 AM | Permalink

    #21 David Janes,

    Tamino explains:

    tamino // July 5, 2008 at 7:42 pm

    John Lederer:
    The graph in this article is of annual average temperature; the graph in your link is of monthly averages. The graph in this article covers a span of nearly 130 years; the graph in your link covers a bit less than 30 years.

    If you are going to make cherry pie, you need to pick your cherries – in this case, you are being served two different flavors of cherry pie. Choose the one you like, but remember that other versions are possible, and pay attention to the details to understand the differences.

    Also, Tamino is using GISS data and Climate Skeptic is using UAH data.

    Cliff

  8. PaulM
    Posted Jul 9, 2008 at 6:59 AM | Permalink

    David, yes – what Tamino doesn’t tell you about his graph is that it is the data produced and adjusted by Hansen himself, from a network of stations whose numbers have dropped very sharply over the time period. The Climate Skeptic one comes from UAH satellite data, measured in the same way over the whole period. Also, the Climate Skeptic one includes the latest 2008 cooling.

    See Lucia’s Rank Exploits blog for more haikus and poems.

  9. Pat Keating
    Posted Jul 9, 2008 at 7:26 AM | Permalink

    Yes, I find the temporary take-over of my computer by outside forces very irritating. I guess we didn’t have that problem in the dial-up days, but now we leave the internet connection on. Perhaps we should turn off the electrical/wireless connection to the modem when we are not actually using the internet, but that’s a PITA, too.

  10. SMcE
    Posted Jul 9, 2008 at 7:27 AM | Permalink

    I have used a couple of text editors with great success over the years.
    V-Edit by Greenview Data Systems has compiler support and can be configured for all sorts of languages. It is extremely fast and file size is virtually unlimited. They have versions for Windows, Unix, Linux and, I believe, Mac. It is written in assembly language, so there are no spurious DLL files to get tangled up. I believe it has an auto-save feature as well. Lots to like.

    UltraEdit is similar but written in C, so it is a bit more Windows-like. Nice feature set, but not nearly as fast as V-Edit.

    You could deal with Word, but I have found that it just does not handle plain ASCII text very well at all. Lots of extra little hidden control characters get embedded and saved, with potentially nasty consequences.

    I believe both V-Edit and UltraEdit will let you download a trial version. You may or may not like them, but I have found them useful and very stable, so it is worth a try. Good luck.

  11. Jim
    Posted Jul 9, 2008 at 7:56 AM | Permalink

    Hi Steve

    I am not sure if you are aware of this, but I looked up

    http://www.sourcewatch.org/index.php?title=Climate_change_skeptics

    And there is your name, listed as an individual skeptic, along with ExxonMobil as an organizational skeptic. I guess they don’t believe your oft-repeated denials that you are a denier!

    Your efforts here are achieving recognition. :-)

  12. David Charlton
    Posted Jul 9, 2008 at 8:21 AM | Permalink

    I believe that this KB item will instruct you on how to disable automatic restarts under XP Pro. Look about halfway down in the article.

    This behavior is a real nuisance. I suppose another poem is called for…

    David Charlton

    Steve:
    I located the article. Thanks.

  13. zap123
    Posted Jul 9, 2008 at 9:44 AM | Permalink

    Steve
    Did you try system restore?
    You might be able to get your work back.

  14. Jeremy
    Posted Jul 9, 2008 at 9:54 AM | Permalink

    Re: Unthreaded #35, post #446

    jae:

    “By a “core” AGW-theory paper, I presume you mean one that provides a first-principles physics-based exposition of how CO2 is supposed to warm the Earth, including how the “positive water vapor feedback” thing works.”

    Yes, that’d be great, though I wasn’t intending to hold my breath. I was going to assume that AGW theory would require multiple papers to form a core, since it’s dealing with such a complicated system.

    “Steve Mc has been trying to locate this paper for years, and last I knew, he hadn’t found it. As an IPCC reviewer, he asked IPCC to include one in AR4. IPCC did not do so. He has challenged people on this blog many times to point out such a paper. Silence. So, first, we have to locate the core paper.”

    This I really find hard to believe. How can there be no papers explaining AGW theory when so many scientists are willing to use AGW theory to explain their results?

  15. Nick
    Posted Jul 9, 2008 at 10:53 AM | Permalink

    System Restore won’t get anything back. You simply need to turn off automatic updates. Go to your Control Panel and open “Automatic Updates” or “Windows Updates”. Go to settings and select either “Download but let me choose to install” or “Notify me but let me choose what to download and install”. Really simple. No more auto-restarts.

  16. Ian Random
    Posted Jul 9, 2008 at 11:55 AM | Permalink

    I use Vim on Linux. For Windows there is a version called Cream that adds menus for a lot of the commands. It saves what you edit as you type, and when you resume after a reboot/disconnect, it offers you the chance to recover those changes.

    http://www.vim.org/

    Also, there is an add-in you can enable in Excel to make it autosave.

  17. zap123
    Posted Jul 9, 2008 at 12:54 PM | Permalink

    Nick
    It depends on whether there was a system checkpoint after he made the file and before the automatic update. If there was, then the file should be there after a System Restore.

  18. Leon Brozyna
    Posted Jul 9, 2008 at 2:03 PM | Permalink

    Well Steve, I’ve had that happen to me a couple of times. But I save everything before leaving the computer on unattended. If it makes you feel any better, there was a huge worldwide effort on patches to fix a major glitch in the Internet – probably still ongoing. Story from Breitbart:

    http://www.breitbart.com/article.php?id=080709124916.zxdxcmkx&show_article=1

  19. Scott Lurndal
    Posted Jul 9, 2008 at 3:06 PM | Permalink

    Re #2

    The geeks would point out that that’s not a problem with Linux: it won’t reboot unless you tell it to, even after an update of the operating system or support components. It runs OpenOffice, which provides Word, Excel and PowerPoint analogues. You can use gvim instead of Notepad and it will automatically save work if the system crashes or is rebooted. It even runs R. (Plus you’d then be able to use gfortran and uncompress to attempt to replicate the GISS adjustments. :-)

    You can load the free “VMware Player” from http://www.vmware.com and they have some pre-built Linux VMs you can download and play with. VMware Player runs under Windows (or Linux) and lets you run Linux (or Windows) as a ‘guest’ operating system. I’d suggest an Ubuntu-based VMware Player appliance for ease of use for new users. This way you can try it out without reinstalling your current Windows system. (You can also get live CDs for most of the major distributions for free: Fedora Core, OpenSUSE and Ubuntu are the majors, with Ubuntu being best for new users.)

  20. jae
    Posted Jul 9, 2008 at 3:25 PM | Permalink

    14, Jeremy:

    This I really find hard to believe. How can there be no papers explaining AGW theory when so many scientists are willing to use AGW theory to explain their results?

    Hard to believe, but true, IMHO. As I understand the situation, there are only computer models, measurements of CO2 increases in the atmosphere, and observations that temperatures have crept up since the Little Ice Age (and many of these are suspect). Welcome to this strange science.

  21. Martin Å
    Posted Jul 9, 2008 at 3:27 PM | Permalink

    Steve,

    Notepad++ is a really good open-source Notepad replacement for Windows. It can be just as simple as ordinary Notepad, but it also has a lot of useful additional features. Apparently it can also be turned into an R script editor.

  22. Nick
    Posted Jul 9, 2008 at 3:42 PM | Permalink

    zap123… only if he saved the file. And since it is Notepad, anything recovered would only be from the last save point anyway.

  23. DeWitt Payne
    Posted Jul 9, 2008 at 3:59 PM | Permalink

    cba,

    There isn’t any significant non-radiative energy transfer outside the troposphere. Convective transfer to and from the surface is bounded by the temperature inversion at the tropopause on average. There are exceptions, but they are localized and don’t amount to much globally. There are no significant sources of energy other than solar EM radiation which is accurately represented by the solar constant and spectrum. If there were, then they would have been observed not only by their effect on the Earth’s atmosphere, but the atmospheres of the other planets as well. If your model is not in radiative balance above the tropopause, it’s wrong. It’s not a matter of LTE, it’s an error in the programming or the data or both. Considering how many people use that data, my bet is on the programming.

    The excess emission integrated over the stratosphere in your graphic must be tens of W/m2. No one has ever observed anything like this. MODTRAN 1976 standard atmosphere at 40 km looking up gives less than 2 W/m2. Are you saying that your model is right and MODTRAN is wrong by an order of magnitude? I find that impossible to believe, and reminiscent of the climate modelers’ faith in their models over actual data. Unfortunately, I don’t have $3,000 to spare to purchase a copy of HITRAN-PC to prove my point.

  24. ad
    Posted Jul 9, 2008 at 5:25 PM | Permalink

    Re #19. If you want to use virtual machines check out Virtual Box. It’s just better, IMHO, than the others.

  25. ad
    Posted Jul 9, 2008 at 5:30 PM | Permalink

    Oh yeah, and get Notepad++. Wouldn’t be without it.

  26. Ernie
    Posted Jul 9, 2008 at 6:49 PM | Permalink

    cba says:
    This doesn’t explain the 57 km factor. This is something like the D-layer bottom of the ionosphere, which absorbs radio waves. It has a deficit of about 4.5 W/m^2 for its T compared to what is supposed to be absorbed. It has a good 10 million collisions per second and should be LTE. I can’t believe there is 4.5 W/m^2 worth of RF energy being generated on earth, or that there could be that much radiated to that area outside of my wavelength range.

    I guess that would equate to many billions of watts of RF required. Which RF wavelengths are you talking about that could cause such an effect?

  27. Terry
    Posted Jul 9, 2008 at 7:27 PM | Permalink

    Notepad++ is invaluable. It is also a good HTML editor.

  28. Buddenbrook
    Posted Jul 9, 2008 at 7:35 PM | Permalink

    Hope it’s ok to raise this question here.

    How serious do you believe the Ocean Acidification issue to be?

    I have now read a number of articles on it, and it seems that there is a near-100% consensus among marine scientists that anthropogenic CO2 has caused a 0.1 pH drop since pre-industrial times. And the acidification is calculated to increase by a further 0.2-0.3 in the business-as-usual scenario, in which CO2 emissions remain high for the next 50+ years. The scientists underline that, unlike in climate science, no modelling is required to calculate the results, as these can be measured directly. A lot more study is required on the effects this will have on marine life, but we know from the geological past that ocean acidification and mass extinctions have occurred simultaneously and can be inter-linked. Key ocean life-forms are feared to be in danger. Many marine scientists are warning that the impact of ocean acidification could far surpass the (assumed) ills of climate change.

    How do you feel about this question? If these marine scientists are correct (and a lot of that research is very new, from 2006-2008), won’t it render the climate change debate a rather moot issue, if CO2 emissions have to be reduced in any case because of ocean acidification? Oceans absorb about 1/3 of anthropogenic CO2 emissions.

    I have no opinion on this. Anyone here who is familiar with this question? Should we be worried?

  29. dan
    Posted Jul 9, 2008 at 7:56 PM | Permalink

    Get a Macintosh. I never have problems.

  30. Colin Davidson
    Posted Jul 9, 2008 at 8:15 PM | Permalink

    #28 try http://www.co2science.org/articles/V11/N21/EDIT.php

    That is a link to a summary of research into the effect of increased HCO3- ions in seawater on the growth of coral reefs. The summary is that coral reefs thrive in increased CO2. I believe the same is true of most marine plants, and likely also animals.

  31. jae
    Posted Jul 9, 2008 at 8:20 PM | Permalink

    28, Buddenbrook:

    How serious do you believe the Ocean Acidification issue to be?

    LOL. I think this “crisis” fits in with all the other “crises” associated with AGW. IOW, I would not lose too much sleep over it. The oceans have a HUGE buffering capacity, or they would have long ago gone sterile.

  32. jae
    Posted Jul 9, 2008 at 8:23 PM | Permalink

    Buddenbrook: I would add that you have to pay very close attention to how they are measuring this acidification. I would bet my horse that it’s determined through the magic of computer models, rather than actual measurements.

  33. Bill Marsh
    Posted Jul 9, 2008 at 8:37 PM | Permalink

    Steve,

    That’s the MS autoupdate system. MS now releases updates on the 2nd Tuesday of every month. Some require restarts, some don’t. In any case you can reasonably expect your computer to reboot the night of the 2nd Tuesday of every month.

  34. John M
    Posted Jul 9, 2008 at 8:39 PM | Permalink

    Buddenbrook,

    Ocean pH isn’t as easy to measure as advertised. Sure, there are spectroscopic techniques that can give results to a couple of decimal places, but natural variability of ocean waters and extreme sensitivity to sample collection and handling lead to large variances in actual measurements. You’ll find plenty of pretty multi-colored maps showing the pH of the oceans, but when you get right down to it, the 0.1 pH drop is based on calculations using atmospheric CO2 concentrations, equilibrium constants, and buffering assumptions.

    Honest efforts are being made to get accurate pH measurements, but in reality, it’s really no easier than trying to get a “global temperature”.

    As others have alluded to, the vast majority of pH studies are modeling studies.

    Here’s what appears to be a thoughtful attempt at a critical assessment, although I haven’t read it carefully. I did see that the author thinks pH measurements to 0.1 units are challenging, which is true with a pH meter, but spectroscopic techniques can do better. It’s the natural variability and sampling errors that I worry more about.

    I saw a paper (Nature?) within the last two years where they showed ranges of pHs obtained experimentally. Pretty large variation, measurement-to-measurement and site-to-site. Can’t find the exact reference though.

  35. John M
    Posted Jul 9, 2008 at 8:47 PM | Permalink

    #34

    Hmmm, scanned through a little more. Definitely a skeptical point of view, but perhaps still a good starting point.

  36. bernie
    Posted Jul 9, 2008 at 8:53 PM | Permalink

    On acidification of the oceans – I looked at the NOAA report briefly and was struck by the lack of actual up-to-date references associated with the otherwise scary maps and charts. Do the ocean buoy systems measure pH values?

  37. Paul S
    Posted Jul 9, 2008 at 8:54 PM | Permalink

    I don’t know if this is the proper place to ask this question but it has been bugging me for a long time and I have never seen it really addressed.

    So here goes. News releases over the last few years state that the Antarctic is losing mass. According to GRACE measurements, it is losing approx. 152 cubic km a year (±80 cubic km). And the Antarctic ice sheet is approx. 30,000,000 cubic km.

    My question is: to determine whether the Antarctic is losing ice, doesn’t that imply scientists are able to measure the total mass of the Antarctic very accurately? Does anyone know what the precise size of the Antarctic is, and what margin of error is associated with that measurement?

    I’m not a science guy, just someone interested in the issue. Thanks in advance for any responses.
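
    For scale, here is a quick back-of-the-envelope check in R using the figures quoted above. (One caveat worth noting: GRACE measures the change in mass directly, so the trend itself does not require knowing the total volume precisely.)

      # Scale check on the GRACE figures quoted above (all rough numbers)
      loss_rate <- 152      # estimated ice loss, cubic km per year
      loss_err  <- 80       # quoted uncertainty, cubic km per year
      total_ice <- 30e6     # approximate Antarctic ice volume, cubic km

      100 * loss_rate / total_ice   # ~0.0005 percent of the ice per year
      total_ice / loss_rate         # ~200,000 years to melt at this rate
      loss_err / loss_rate          # the rate itself is uncertain by ~53%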

  38. Dishman
    Posted Jul 9, 2008 at 9:01 PM | Permalink

    Hmmm… I did a rigorous analysis which shows that the oceans absorbing CO2 from the atmosphere causes a rise in sea level of 1.1 mm/year without changing ocean temperatures. This closely matches the satellite-measured rise in sea level and buoy-measured ocean temperature, validating my results.

    I won’t disclose my data, or methods, though. Trust me, they were robust and skilled.

    (I don’t think I’m off by more than a factor of 10)

  39. John M
    Posted Jul 9, 2008 at 9:06 PM | Permalink

    Bernie,

    While it’s theoretically possible to place pH probes on buoys, the issues with trying to continuously and accurately measure pH to 0.1 units remotely and under harsh conditions would make Anthony Watts’ Surface Stations look like semiconductor cleanrooms.

    Here’s what seems to me to be the state of the art for measuring seawater pH to the precision and accuracy necessary to see the effect of increasing atmospheric CO2. Attaining that precision requires thermostatted instrumentation and careful sample handling.

  40. jae
    Posted Jul 9, 2008 at 9:11 PM | Permalink

    Buddenbrook: John M said almost all of it. You also have to wonder how they established the “baseline” pH. Did they have accurate methods of measuring “average ocean pH” 50-100 years ago? I doubt it.

  41. CMD
    Posted Jul 9, 2008 at 10:28 PM | Permalink

    Steve and the regulars here-

    Thanks for entertaining a naive lurker’s suggestion:

    I would be curious to see you apply the level of scrutiny/auditing you give to IPCC’s “core” contributors to the more core “skeptics” such as Lindzen, Christy, etc. Those names might not be the right ones – I’m no expert, but I think you get my point. Shouldn’t we be reviewing their findings/conclusions/archiving skills/methods/reproducibility/stonewalling skills with the same level of skepticism that we have for the IPCC? I’m no IPCC worshipper; I just thought it would be interesting to see if you find the same level of scientific sloppiness in the “skeptics” camp. It would add a new and interesting balance to your site – unless I missed all those audits elsewhere on your blog, which is possible.

    Thanks for your interesting work here, all – it’s a daily stop in my surfing routine.

    Steve: I’ve explained on many occasions that there’s only so much that I can do as an individual. I’m already swamped with topics. I really don’t see how I can handle much more. I try to do things that no one else is doing. There’s already lots of criticism of the authors that you mention, so trying to do lengthy analyses of these articles can hardly be justified as a priority. If the IPCC were relying on these studies, then I would definitely spend the time to look at them. But they aren’t. I realize that this leaves an impression that may seem a bit one-sided, but I don’t see how I can do much about it as one person. Leif Svalgaard is very critical of most skeptical solar theories and I’ve provided him with a platform to criticise some of the solar theories that I can’t cover.

    Of the names you mention, though, I’ve had contact with Christy, who places his data online meticulously and who provided his code to Mears and Wentz (who criticized it). I have no reason to give him anything other than a clean bill of health. If Team scientists were like Christy, none of the frustrating archiving problems would exist.

  42. Jeremy
    Posted Jul 9, 2008 at 10:49 PM | Permalink

    Re #41.

    I’m not a regular poster. However, I would say I agree with you in that the same level of skepticism is required in all cases.

    I would say the reason that some people are seemingly questioned constantly, and others are not, lies solely in the differences between their claims. Extraordinary claims require extraordinary proof. The more famous skeptics of AGW claim little, other than a lack of real knowledge, and as a result of their belief in a lack of knowledge, their conclusion is that there can be no meaningful conclusions. The prominent flag-wavers of global man-made doom claim the sky is falling, and consistently offer incomplete science as proof that action is needed.

  43. Philip_B
    Posted Jul 9, 2008 at 11:27 PM | Permalink

    Some perspective on the issue of scepticism in science, which is and should be the norm.

    My brother is a prominent US scientist in a field unrelated to climate. A few years ago he was required to advise the US Federal Government on what to do about a common and irreversible medical procedure that had a high failure rate. He told me that dozens of studies had been performed, many of which claimed to identify the cause or causes of the high failure rate. However, all of them were flawed in his professional opinion (and he is a world-leading expert in this field). As a result, he advised the government that we didn’t know with sufficient confidence what the cause of the high failure rate was, and therefore should do nothing until further studies were performed and a clearer picture emerged. This was despite considerable pressure to do ‘something’.

    That is how scientific scepticism should work, especially in relation to public policy.

    Not only is this level of scientific scepticism seemingly absent from much of climate science, but many climate scientists are indulging in advocacy to the detriment of science, and this IMO is an abuse of their positions.

  44. Philip_B
    Posted Jul 9, 2008 at 11:46 PM | Permalink

    Extraordinary claims require extraordinary proof. The more famous skeptics of AGW claim little, other than a lack of real knowledge, and as a result of their belief in a lack of knowledge, their conclusion is that there can be no meaningful conclusions

    That is a gross travesty of the facts, and typical of the reality inversion practiced by the Left. Ref: Gore and Flannery, qualified in divinity and English literature respectively.

    Anyway, the extraordinary claim requiring extraordinary proof is that CO2 can raise global temperatures by between 2°C and 5°C without any remotely plausible physical explanation of how this happens – and this despite many requests by Steve for a scientific paper explaining it.

  45. david
    Posted Jul 10, 2008 at 12:09 AM | Permalink

    #28 Budd. Try http://www.jennifermarohasy.com/blog/archives/003223.html

  46. david
    Posted Jul 10, 2008 at 12:20 AM | Permalink

    #28 Budd. Also, rain generally has a pH less than about 6, so depending on the surrounding materials you would expect fresh-water lakes to be at least sometimes that acidic. Ocean pH is around 8.1 (so the oceans are not so much acidifying as, if anything, becoming less alkaline). So it is obvious that marine life is compatible with a wide range of pH.

  47. Richard Hill
    Posted Jul 10, 2008 at 12:46 AM | Permalink

    On Tuesday 8 July, 2008 “The Age”, a newspaper published in Melbourne, Vic, Australia, printed an article by William Kininmonth, former senior climate scientist in the Australian Public Service.

    http://business.theage.com.au/why-so-much-climate-change-talk-is-hot-air-20080707-34iz.html

    From the article:
    “Frank Wentz and colleagues of Remote Sensing Systems, California, published a paper in the prestigious international journal Science. This paper reported a finding of the international Working Group on Numerical Experimentation that the computer models used by the IPCC significantly underestimated the rate of increase of global precipitation with temperature. The computer models give a 1-3% increase of precipitation with every degree centigrade, while satellite observations, in accordance with theory, suggest that atmospheric water vapour and precipitation both increase at a rate of 7% for each degree centigrade rise.”
    Link to Wentz paper.

    http://www.sciencemag.org/cgi/content/abstract/317/5835/233

    Link to comment by Takahashi et al

    http://www.atmos.washington.edu/~ken/PUB/comment_on_wentz_et_al_2007.pdf

    (It is hard for an outsider to tell whether Takahashi is politely demolishing Wentz or not.)
    I understand that this is OT for Climate Audit, but if someone could direct me to more analysis of Wentz’s 2007 work, it would be appreciated.
    P.S. I made the same request at RC, but no response there so far.

  48. Andrey Levin
    Posted Jul 10, 2008 at 1:42 AM | Permalink

    Re#30, Colin Davidson:

    CO2science has numerous summaries on the subject in its “calcification” chapter:

    http://www.co2science.org/subject/c/calcification.php

    http://www.co2science.org/subject/c/calcificationother.php

  49. crosspatch
    Posted Jul 10, 2008 at 2:12 AM | Permalink

    Steve, if you open IE (won’t work with Firefox or other browsers) and navigate to update.microsoft.com you should see a shield on the right side of the page that might say something like “Automatic Updates Turned ON”. Click the link in that frame that says “Pick a time to install updates”. That will launch an “Automatic Updates” panel. Select “More Options”. Select either the “Download but don’t install” or “Notify but don’t download” buttons and select “Apply”.

  50. Ed Hinton
    Posted Jul 10, 2008 at 4:32 AM | Permalink

    Steve,

    If you are running XP, go to Control Panel and open ‘Automatic Updates’. You can turn it off there. You don’t want to do that normally, as the updates are important, but if you have a long-running process overnight, it should prevent the updates from occurring. You can also control when they occur. I hope this helps.

  51. Pliny
    Posted Jul 10, 2008 at 5:06 AM | Permalink

    #28 Budd and others
    The chemistry of ocean acidification is not so simple. pH decrease is not the most important or sensitive indicator, and there are more accurate measures.

    In acid-base terms, the ocean is buffered by the presence of carbonate and bicarbonate in solution. If CO2, or any other acid, is added, the net effect is to convert carbonate to bicarbonate, with little change in pH. A cost of acidification is that the carbonate in solution is, in turn, in equilibrium with solid calcium carbonate: when dissolved carbonate is converted to bicarbonate, more calcium carbonate goes into solution.

    There are more reliable measures of this than pH. Commonly two quantities are measured – total dissolved CO2 (as CO2, bicarbonate and carbonate) and “alkalinity”, which is the bicarbonate concentration plus twice the carbonate concentration. These are easier to measure, because the species are much more abundant and stable during the collection process. The other variables in the various equilibria, including pH, can then be worked out from known equilibrium constants.

    A huge effort has gone into measuring these quantities, including the Glodap project.
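
    To make that back-calculation concrete, here is a minimal sketch in R. The dissociation constants and input concentrations are illustrative round numbers for warm surface seawater, not authoritative values, and borate and the other minor contributions to alkalinity are ignored:

      # Solve the carbonate system for pH, given measured total CO2 (DIC)
      # and carbonate alkalinity, from the two dissociation equilibria.
      K1  <- 10^-5.86   # H2CO3* <-> H+ + HCO3-  (illustrative value)
      K2  <- 10^-8.92   # HCO3-  <-> H+ + CO3--  (illustrative value)
      DIC <- 2000e-6    # total dissolved CO2, mol/kg
      ALK <- 2250e-6    # carbonate alkalinity, mol/kg (HCO3- + 2*CO3--)

      # Carbonate alkalinity implied by DIC at a trial [H+]
      alk_of_h <- function(h) {
        DIC * (K1 * h + 2 * K1 * K2) / (h^2 + K1 * h + K1 * K2)
      }

      # Find the [H+] at which implied alkalinity matches the measurement
      h <- uniroot(function(x) alk_of_h(x) - ALK, c(1e-10, 1e-6))$root
      -log10(h)   # pH, about 8.1 for these inputs

    Adding CO2 raises DIC while leaving the alkalinity unchanged, so re-running the solver with a higher DIC shows the pH drifting downward – the mechanism behind the calculated 0.1 drop discussed upthread.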

  52. peter_ga
    Posted Jul 10, 2008 at 7:25 AM | Permalink

    Hi. Being half-smart, in that I studied engineering, feedback systems and heat transfer many years ago, I attempt to interpret the data in my own skeptical way, so to speak, and to develop my own personal reasons for being skeptical about the AGW hypothesis.
    For some of my data I was going to use the Wikipedia greenhouse diagram, but its values have changed. It used to have a figure for latent heat/evaporation, but now only gives a surface-atmosphere total power output of 492 W/m2, which implies a temperature of 32°C if it were all radiation. Anyway, there is about 40 W/m2 of radiation direct from the surface to space, with an average surface temperature of 287 K = 14°C.

    How to develop the greenhouse effect in a simple algebraic fashion? This is done by calculating a combined surface-atmospheric radiative emissivity. Normally, lumping the earth’s surface at one point, with no atmospheric absorption and an emissivity of unity, the total black-body radiation from the Stefan-Boltzmann law would be Stefan * T**4 = 384 W/m2 (Stefan = 5.67e-8, the Stefan-Boltzmann constant, and T = 287 K). It is common to multiply this equation by a dimensionless constant to reflect the fact that most surfaces are not in fact black bodies. Since only 40 W/m2 makes it directly to outer space, assign/define a surface-atmospheric emissivity Es = 40/384 = 0.104, the ratio of the radiation that does make it to outer space to the radiation that would make it there from a perfect black body with no atmosphere. So at infrared temperatures, use the expression Es * Stefan * T**4 to calculate radiation from the Earth to outer space, where T is the earth’s surface temperature.

    This formula can be used to calculate the surface temperature if no other loss mechanisms were available. If the net average solar input to the surface is 168 W/m2 (from the Wikipedia link), then the temperature would be calculated from 168 = Es * Stefan * T**4, which gives T = 410 K or 137°C – which indicates the power of the greenhouse effect. Since the surface is actually much cooler than this, obviously other mechanisms are cooling it. Given 168 W/m2 of solar input and 40 W/m2 lost through radiation to outer space, that leaves 128 W/m2 net loss through convection/evaporation. Call this the convective loss.

    Extra CO2 may be regarded as an extra infrared heat input at the surface, rising at 0.28 W/m2/decade (from IPCC AR4 ch. 2, p. 140). The temperature is rising at 0.131 K/decade (from NSSTC-UAH). The ratio of these two values is 2.13 W/m2 per K.

    The emissivity calculated previously can now be used to develop the radiative sensitivity, the ratio of radiated power to surface temperature. Differentiating the Stefan-Boltzmann equation: dW/dT = 4 * Es * Stefan * T**3 = 0.558 W/m2 per K.

    However, most of the heat lost from the earth’s surface is through non-radiative mechanisms, and it is reasonable to assume this heat loss is sensitive to surface temperature. Without an atmosphere, the earth would be at a temperature of 255 K. If the assumption is made that convective loss is linear from this point, then 128 W/m2 over (287 – 255 =) 32 K gives a sensitivity of 4 W/m2 per K. However, I find it impossible to believe that the convective/evaporative loss function is linear from this base temperature. If the surface temperature is cooler than that which produces the lapse rate at which vertical movement of air occurs, then the atmosphere is stable and no convection occurs. But once that temperature is reached, the surface is effectively capped at it, and there is very little further increase – super-adiabatic lapse rates occur only very rarely, above deserts. Looking at balloon radiosonde traces, it is obvious that the tropospheric temperature is everywhere close to the convective lapse rate, and that this mechanism of surface heat loss dominates. A reasonable assumption of a few degrees’ base-temperature range for this ratio gives such a high sensitivity that any increase in radiative input is easily accounted for. For example, if 3 K were used as the temperature difference between no convection and full convection, a sensitivity of more than 40 W/m2 per K would result.

    Given that a convective loss of 128 W/m2 “powers” the weather, and there is an extra input of 0.28 W/m2/decade, claims that this results in wild weather swings, hurricanes, heat waves and so on appear to me to be ridiculous.
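
    The arithmetic above is easy to check numerically. Here is a minimal R sketch reproducing it as stated – the inputs are the figures quoted in this comment, not vetted values:

      # Reproduce the back-of-envelope greenhouse arithmetic above
      sigma <- 5.67e-8              # Stefan-Boltzmann constant, W/m2/K^4
      Ts    <- 287                  # mean surface temperature, K

      # Effective surface-atmospheric emissivity: 40 W/m2 escaping directly
      # to space, divided by black-body emission at the surface temperature
      Es <- 40 / (sigma * Ts^4)     # ~0.104

      # Surface temperature if the 168 W/m2 of solar input were lost only
      # through this radiative channel
      (168 / (Es * sigma))^0.25     # ~410 K, i.e. ~137 C

      # Radiative sensitivity: d/dT of Es*sigma*T^4 at the current Ts
      4 * Es * sigma * Ts^3         # ~0.56 W/m2 per K

      # Empirical ratio of forcing trend to temperature trend quoted above
      0.28 / 0.131                  # ~2.1 W/m2 per K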

  53. Bruce
    Posted Jul 10, 2008 at 7:38 AM | Permalink

    Paul S:

    The huge West Antarctic Ice Sheet may be headed for a complete meltdown in a process that a new study indicates was started thousands of years ago, and not as a result of global warming.

    As scientists have been increasingly able to document melting and the discovery of icebergs breaking off from Antarctica in recent years, concerns have risen that a human-induced change in climate could be damaging the ice sheet.

    But the future of the West Antarctic Ice Sheet ”may have been predetermined when the grounding line retreat was triggered in early Holocene time,” about 10,000 years ago, a team of scientists led by Dr. Howard Conway of the University of Washington reported on Friday in the journal Science.

    The grounding line is the boundary between floating ice and ice thick enough to reach the sea floor. The scientists found that line has receded about 800 miles since the last ice age, withdrawing at an average of about 400 feet per year for the last 7,600 years.

    http://query.nytimes.com/gst/fullpage.html?res=9405EED71630F931A25753C1A96F958260

  54. John Finn
    Posted Jul 10, 2008 at 8:21 AM | Permalink

    A few weeks back (I can’t remember exactly when), David Smith posted a global surface temperature anomaly Hovmoeller plot. It showed that over the past few months, while there were higher temperature anomalies in the tropics, they had fallen in the polar regions. I remember wondering at the time whether this would be reflected in the GISS June anomaly. One of the reasons given for GISS being out of step with the other data sets is that GISS extrapolates over the whole of the Arctic and, because Arctic surface temperatures have been particularly high recently, this has inflated the GISS figure. If the Arctic has cooled (as per the DS graphic), this should bring GISS back in line with Hadley and the satellite records.

    Sure enough – the GISS June anomaly is a relatively low 0.26 deg C. I’m just wondering if people have been a bit too quick in criticising the GISS data. It’s still possible their methods are flawed (none of them are perfect, though), but it’s possible they are, at least, consistent.

  55. Sam Urbinto
    Posted Jul 10, 2008 at 12:07 PM | Permalink

    #2 Steve: They’re correct. Don’t leave Automatic Updates on Automatic. The default is to try to install updates (and reboot “if required”) every morning at 3 AM. If you have a real firewall (DSL or cable with a router and NAT works great), anti-spyware (like Spybot S&D and SpywareBlaster, very handy) and anti-virus (AVG is free and not too intrusive; the commercial products are resource hogs), the updates are basically worthless anyway.

    Others: Regular Notepad doesn’t do temp files. No save? Work is gone forever on restart.

    #28 Buddenbrook: The 0.1 lowering of pH is modeled. There’s a link to the source on the ocean acidification Wikipedia article.

    #37 Paul S: Anyone who seriously says they know how much ice there is covering the entire continent of Antarctica needs to be in a mental institution. :)

  56. cba
    Posted Jul 10, 2008 at 12:26 PM | Permalink

    DeWitt,

    Guess I didn’t notice the #35 close-out soon enough. The overall discrepancy for the stratosphere sums out to under 2 W/m^2 for the whole thing up to 100 km. You’ll note on the chart that there’s a positive deficit of several W/m^2 at around 57 km, but that it dips into the negative both above and below there. The total sum of the deficits from 22 km to 100 km turns out to be under 2 W/m^2, and I don’t recall if that was + or -. It’s again a very small fraction of the total power involved in the process, and supports the notion that MODTRAN and this model are in fairly good agreement.

    Why it would bunch up at 57 km is the curiosity. There is a peak of solar absorption going on there, and it is the bottom of the D-layer ionosphere. Typically, that is supposed to mean ozone, as I recall. It could also represent a problem with the 1976 std atm having about the right amount of ozone and/or other molecules up there while improperly concentrating them too much at one location. It may also represent a fairly slight error in the T of that region, as it is above sounding balloons and below satellites, and is possibly poorly measured in the real world.

    I’m inclined not to blame my model here, because this stuff is done in loops by altitude: if it were screwed up at one altitude, it would certainly be screwed up everywhere else too, and that doesn’t seem to be the case. Also, simple typos in the data would tend to be limited to just one altitude, not spread out over several surrounding ones.

    Actually, the 2 W/m^2 total stratospheric discrepancy is for the CO2 doubling; for current levels it’s well under 1, stopping at 100 km. Each layer around there, though, bounces around by a good fraction of 1 W/m^2, and above 100 km it starts to diverge, showing a deficit of power – which suggests the LTE assumption has started to fail. Around 100 km and lower, the deficit tends to stay small and mostly negative – except around 57 km, where it shoots up dramatically, along with the solar absorption.

    In the troposphere, one sees the deficit strongly positive low in the atmosphere and dropping toward 0 higher up. It too bounces a bit above and below 0 toward the 22 km tropopause. The net sum comes out at around 60 W/m^2, with the vast majority down toward the surface, which places it in the expected ballpark of sensible and latent heat transfer. Considering we’re dealing here only with the clear-sky condition, there are still substantial variables unaccounted for.

  57. Sean Egan
    Posted Jul 10, 2008 at 1:29 PM | Permalink

    Paul S
    The GRACE satellites can measure gravitation and hence mass, but they are not the tool NASA used to estimate mass loss. NASA looked at snowfall and satellite radar to estimate the glacier ice loss, and concluded that more ice melted than snow fell. http://www.jpl.nasa.gov/news/news.cfm?release=2008-010
    Do you have a URL or paper showing Antarctic ice loss in recent years? I thought mass was growing there.

    I have looked, and there are no up-to-date draft figures or mass/volume data for 2006/2007 or 2008 on the Internet. We have extent, area, even a well-hidden estimated age. The big name in ice depth is Rothrock. Rothrock et al. use declassified military submarine records. As the subs are there principally to hide out, they do not go to the same place at the same time of the year. Rothrock does modelling to read the results – and not everyone gets the same result from the same data. Plus, the sub record does not cover before the 1950s or after 2002. So forget 2006/2007/2008.

    There is data from around the edges of the Arctic going back over two centuries; look at the 20th century in Polyakov et al., 2003b. Like the surface temperature record, it looks less record-breaking the longer the record you look at.
    There are helicopter EM data, drift stations and buoys, but they cover limited areas. I have not read it all, but it appears somewhat reserved in its support for reduced thickness.

  58. Posted Jul 10, 2008 at 2:00 PM | Permalink

    Australian Researchers Warn of Global Cooling
    Michael Asher (Blog) – July 1, 2008 11:09 AM

    A 2005 prediction of solar activity. The sunspot number should now stand close to 100; instead it is zero. “Spin-orbit coupling” is to blame; effects could last decades.

    A new paper published by the Astronomical Society of Australia is warning of upcoming global cooling due to lessened solar activity.

    http://www.dailytech.com/Australian+Researchers+Warn+of+Global+Cooling/article12250.htm

  59. DeWitt Payne
    Posted Jul 10, 2008 at 2:24 PM | Permalink

    cba,

    But it is screwed up everywhere else, certainly everywhere above about 20 km. Your deficit trend reverses because the temperature starts to go down again at 50 km in the 1976 standard atmosphere, i.e. the stratopause. It doesn’t matter that the integrated net deficit from 20 to 100 km is fairly small; it’s still quite large in comparison to total emission. The point is still that either emission or absorption or both at any given altitude above 20 km are just too high, by at least an order of magnitude. Emission in your model dominates at high temperatures in the stratosphere and lower mesosphere. Then, as the temperature falls sufficiently, absorption, which isn’t, for the most part, a function of temperature, more than catches up. The situation reverses again at about 100 km, the mesopause, because the temperature begins to increase rapidly there.

    Again, you cannot possibly have an emission excess at 57 km of over 4 W/m2, because total emission at that altitude is only a small fraction of 1 W/m2. By the same logic, you also can’t have excess absorption of solar radiation at 80 km of nearly 2 W/m2. (For anyone just tuning in, the chart in question is here.) Just because your extremely large errors nearly cancel when integrated over altitude doesn’t mean your model is more or less correct. Too much solar absorption at high altitude means there’s likely some problem with ozone and oxygen, or possibly singlet oxygen and hydroxyl radical as well.

    Humor me. If you’re using all the isotopologues, try zeroing out the concentrations of all the molecules containing minor isotopes like 2H, 13C, 17O and 18O and see what happens. If that doesn’t have a significant effect, then zero large blocks of the low-concentration molecules: eliminate half, then restore that half and eliminate the other half. If that doesn’t let you zero in on the problem, then there’s something fundamentally wrong with your computing algorithm.
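
    That bisection is easy to automate. A hypothetical sketch in R follows; run_model() is a made-up wrapper, assumed to run the radiation code with only the listed species and return the 57 km emission excess in W/m2, so substitute a call into the real model:

      # Bisect the species list to isolate which molecule(s) drive the
      # 57 km anomaly. Assumes a single dominant culprit: if reproducing
      # the anomaly needs species from both halves, neither half alone
      # will show it and this search will mislead.
      bisect_species <- function(species, threshold = 1) {
        if (length(species) == 1) return(species)
        left  <- species[seq_len(length(species) %/% 2)]
        right <- setdiff(species, left)
        # Recurse into whichever half still reproduces the anomaly
        if (abs(run_model(left)) > threshold) {
          bisect_species(left, threshold)
        } else {
          bisect_species(right, threshold)
        }
      }
      # e.g. bisect_species(c("H2O", "CO2", "O3", "N2O", "CO", "CH4", "O2"))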

  60. John Galt
    Posted Jul 10, 2008 at 3:31 PM | Permalink

    Steve McIntyre wrote:

    I got stung once again by (another) really annoying habit of Windows. I write scripts in Notepad and don’t always save what I’m working on. Normally I shut down my computer at night, but I left it running last night as I was scraping some data. When I got up, some update process had taken effect; my computer had been automatically shut down by Windows and rebooted. Things in Word are saved, but things in Notepad are lost. So I lost quite a bit of work, for the second time in six weeks.
    ———
    Steve, I am a software developer and I often use Notepad to create little notes. I always save whatever I’m working on if it’s not something I’m going to use immediately. Just save the text file to your desktop, and save again after making any important changes. Delete the file later when you’re done with it. If you make a lot of notes, you might try OneNote instead.

    I use a notebook computer at work and at home, and it’s tempting to assume the system will never shut down unexpectedly because it has a battery. But it’s just a good habit to expect the worst and save frequently.

    I’m sorry about your lost work, but I hope you’ll learn from this and incorporate frequent saves as a work habit.

  61. Bruce
    Posted Jul 10, 2008 at 3:57 PM | Permalink

    GOODBYE air pollution and smoky chimneys, hello brighter days. That’s been the trend in Europe for the past three decades – but unfortunately cleaning up the skies has allowed more of the sun’s rays to pierce the atmosphere, contributing to at least half the warming that has occurred.

    Since 1980, average air temperatures in Europe have risen 1 °C: much more than expected from greenhouse-gas warming alone. Christian Ruckstuhl of the Institute for Atmospheric and Climate Science in Switzerland and colleagues took aerosol concentrations from six locations in northern Europe, measured between 1986 and 2005, and compared them with solar-radiation measurements over the same period. Aerosol concentrations dropped by up to 60 per cent over the 29-year period, while solar radiation rose by around 1 watt per square metre (Geophysical Research Letters, DOI: 10.1029/2008GL034228). “The decrease in aerosols probably accounts for at least half of the warming over Europe in the last 30 years,” says Rolf Philipona, a co-author of the study at MeteoSwiss, Switzerland’s national weather service.

    The latest climate models are built on the assumption that aerosols have their biggest influence by seeding natural clouds, which reflect sunlight. However, the team found that radiation dropped only slightly on cloudy days, suggesting that the main impact of aerosols is to block sunlight directly.

    http://environment.newscientist.com/channel/earth/mg19926634.800-cleaner-skies-explain-surprise-rate-of-warming.html?feedId=online-news_rss20

  62. Raven
    Posted Jul 10, 2008 at 4:36 PM | Permalink

    Bruce says:

    GOODBYE air pollution and smoky chimneys, hello brighter days. That’s been the trend in Europe for the past three decades – but unfortunately cleaning up the skies has allowed more of the sun’s rays to pierce the atmosphere, contributing to at least half the warming that has occurred.

    I believe the proxy record provides compelling evidence that Europe was warmer than today during the MWP and the Roman Warm Period (alarmists argue that the MWP was simply a local phenomenon).

  63. Bruce
    Posted Jul 10, 2008 at 4:45 PM | Permalink

    Raven, the air was undoubtedly cleaner then than it is today.

    It was the robust sun doing what little warming occurred.

  64. Basil
    Posted Jul 10, 2008 at 5:13 PM | Permalink

    What are the testable propositions, or falsifiable hypotheses, of anthropogenic global warming? Two that I’ve seen mentioned are that warming should advance first in the tropics, and that the troposphere should warm at a greater rate than the surface. Are there others like these?

    I realize, of course, that all kinds of things are attributed to AGW, but that’s not quite the same thing, especially when they can be attributed to other phenomena. Thus some propositions are too coarse, or insufficiently specific, to be tested – rising global temperature, for instance. Can we really distinguish how much of the rising temperature of the last half-century was attributable to natural causes and how much to anthropogenic sources? I don’t think so. And that doesn’t even consider the distinction between different possible anthropogenic factors (such as changes in land use/cover vs. greenhouse gas emissions).

    So I’m interested in what testable propositions there are that are sufficiently specific to be truly testable. Are there others besides the two that I’ve mentioned?

  65. Pat Keating
    Posted Jul 10, 2008 at 6:07 PM | Permalink

    64 Basil
    I don’t think that GHG warming of the surface and warming of the upper troposphere are as tightly linked as you suggest. IOW, the failure wrt high-altitude warming is probably due more to inadequacies in the models than to a failure of the surface-warming thesis. Greenhouse gases definitely warm the Earth’s surface, and increases in CO2 will cause some further warming. However, IMO it is a lot less than the AGW alarmists claim.

  66. Philip_B
    Posted Jul 10, 2008 at 6:53 PM | Permalink

    Raven, the air was undoubtedly cleaner then than it is today

    I very much doubt this is true. Naturally occurring and human-caused fires that burned for long periods and introduced large amounts of material into the atmosphere have occurred for a very long time. Hunter-gatherers used to burn the forests to improve grazing. Also, smoky domestic fires and other activities such as charcoal production and crop-residue burning likely go back millennia.

  67. crosspatch
    Posted Jul 10, 2008 at 7:04 PM | Permalink

    I would not be surprised to learn that in times of extreme drought, fires could have raged from the Carolinas to Nova Scotia or across thousands of square miles of the plains anywhere from Texas to Alberta. There would have been little to stop extensive grass and forest fires and if the wind is right, jumping a river isn’t a problem. All it would take would be some lightning strikes and wind or even a domestic fire that got away from someone.

  68. Larry Sheldon
    Posted Jul 10, 2008 at 7:11 PM | Permalink

    On “Ocean Acidification”:

    From: “You’re a scientist who measures levels of aragonite in the oceans – not very sexy, but it’s grant time. Then, a brainwave! You coin the scary words, ‘ocean acidification’.”

    http://wmbriggs.com/blog/2008/07/04/at-least-theyre-admitting-it/

  69. David Jay
    Posted Jul 10, 2008 at 7:11 PM | Permalink

    #64

    Pat is correct at one level; however, the only “verification” of catastrophic AGW is the models. The models that indicate large positive feedbacks (i.e. that forecast a catastrophic future) have that characteristic warming of the tropical troposphere.

    So in a practical sense, the models that forecast a catastrophic future HAVE been falsified by the failure of the tropical troposphere to warm at a faster rate than the surface.

  70. Larry Sheldon
    Posted Jul 10, 2008 at 7:17 PM | Permalink

    At the risk of being tedious, how in the world can a model be verification of anything?

    I mean even theoretically?

    (OK, I understand that if the model correctly “predicts” past and current events from actual observed data, the model can be said to be verified, for some definitions of “verified”. But it is the experimenter that does the verifying, not the model.)

  71. Pat Keating
    Posted Jul 10, 2008 at 7:21 PM | Permalink

    69 David
    I agree.
    The current models have been falsified by the upper-troposphere trends. However, that does not mean that better models would not reproduce the cooling trend without negating the surface warming effects.

    The real issue re the latter is the large climate-sensitivity values which are used in these current models, which are not justified by any objective evidence independent of the models.

  72. John M
    Posted Jul 10, 2008 at 7:26 PM | Permalink

    Pliny #51

    There are more reliable measures of this than pH. Commonly two quantities are measured – total dissolved CO2 (as CO2, bicarbonate and carbonate) and “alkalinity”, which is the bicarbonate concentration plus twice the carbonate concentration. These are easier to measure, because the species are much more abundant and stable during the collection process. The other variables in the various equilibria, including pH, can then be worked out from known equilibrium constants.

    Thanks, that’s helpful.

    Do you have a good reference for a review that clearly indicates the error bars in such measurements? In particular, the equilibria involved seem to be somewhat dependent on the medium (different in “synthetic” seawater than in actual), and some of the minor buffering equilibria (HF, for example) might be a little uncertain.

    Thanks again.

  73. John M
    Posted Jul 10, 2008 at 7:28 PM | Permalink

    Yipe! Really screwed up the formatting on that one.

  74. crosspatch
    Posted Jul 10, 2008 at 8:01 PM | Permalink

    “How serious do you believe the Ocean Acidification issue to be?”

    Wouldn’t most forms of marine life have evolved when atmospheric CO2 levels were much higher than today? In my opinion, it might change sea life, but I don’t think it would “kill” it. The seas were teeming with life when CO2 was many times today’s atmospheric level. I honestly don’t think we are going to wake up one morning to find the ocean covered with a mat of dead animals. Some species might be less successful and others more so, though. Or existing species might adapt and continue. I also don’t see any lack of sea life around ocean volcanic vents belching liquid CO2 and sulphur.

  75. jae
    Posted Jul 10, 2008 at 8:36 PM | Permalink

    71, Pat: LOL. Hang in there.

  76. John Lang
    Posted Jul 10, 2008 at 8:38 PM | Permalink

    The carbonate-based life-forms of the trilobites and ammonites dominated the oceans at times when CO2 levels were 15 to 20 times higher than today (4,000 to 7,000 ppm). The theory that acidification of oceans by CO2 threatens carbonate- and shell-based sea life is just another “doom and gloom alarmist” exaggeration. I personally am getting sick and tired of this garbage, which never seems to end.

  77. kuhnkat
    Posted Jul 10, 2008 at 9:34 PM | Permalink

    Pat Keating,

    they already have models that do not show tropospheric warming. Of course, those models also do not show excessive surface warming in the future. The closer the models are to matching the observations, the less instability they show!!

    You probably won’t be shown many of these by the IPCC any time soon.

    Another requirement for excessive warming is that the oceans warm and act as the heat sinks that carry the warming through surface excursions. Again, we do not have excess heat hiding in the oceans, according to the Argo data.

    Originally they also claimed both poles would warm. Obviously, Antarctica hasn’t.

    Finally, sea level rise is a primary indicator. This is still a hotly debated subject. They just orbited another satellite to do the job of observing the oceans with higher precision. In the meantime, sea level has been dropping since early 2007!!!

    Basically they have a DREAM, and it doesn’t match this earth!!!

  78. Basil
    Posted Jul 10, 2008 at 9:50 PM | Permalink

    #65, Pat

    Models are merely theoretical constructs, i.e. hypotheses. If the models are “failing” wrt high-altitude warming because of inadequacies in the models, then that constitutes falsification of the models. Now, if they get tweaked to the point that they merely agree with observational data, that doesn’t verify the models. I’m sure I’m paraphrasing Karl Popper poorly, but the whole reason for his principle of falsifiability was that if you are searching for confirming evidence, you’ll likely always find it.

    As for the AGW hypothesis faring better in relation to surface warming, I question that too. Most of the warming is in the NH, over land, and is higher in the surface station record than in the satellite record over NH land. That is not global warming. It suggests (to me) either problems with the surface station records or land use/cover changes along with UHI effects, or both.

    But for now, I’m not so much interested in debating these particular points as I am in understanding whether the AGW hypothesis has more specific testable propositions of this nature. When I look at sources like Chapter 3 of AR4 WG1 I don’t see very specifically framed testable hypotheses. What I see is a lot of casual empiricism. Is there really nothing more?

    What would be nice would be a specific list of testable propositions, one that could be presented in bullet format without any reference to the data. Is that not possible? That’s not the IPCC approach. It has it rather backwards. I.e. it goes on and on through observational data casually inferring that rising greenhouse gas emissions are behind it all.

    Really, is this the essence of the AGW hypothesis?

    Major Premise: [Models Predict that] Rising greenhouse gas emissions will cause warming.
    Minor Premise: The earth is warming.
    ——-
    Conclusion: Rising greenhouse gas emissions are causing warming [verifying the models].

    If so, somebody needs a lesson in elementary logic.

  79. Jim Arndt
    Posted Jul 10, 2008 at 9:57 PM | Permalink

    Hi Guys,

    Just a heads up,

    We will see what comes from this:

    http://climatesci.org/2008/07/10/special-guest-seminar-at-cu-by-roy-spencer-july-17-2008global-warming-recent-evidence-for-reduced-climate-sensitivity/

  80. J. Marshall
    Posted Jul 10, 2008 at 11:55 PM | Permalink

    The methods of all the models need to be revamped in light of all we now know about the atmosphere. It was a large task to begin with, and one that has not evolved much. By this I mean the core commitment to making the data agree with AGW. I think we can afford some objectivity. The world will not end in a day, and I don’t see it ending now.

  81. Pat Keating
    Posted Jul 11, 2008 at 7:44 AM | Permalink

    78 Basil

    I don’t disagree with your first paragraph, but no-one here is suggesting that it verifies the models. On the other hand, you were implying that if the models are wrong on the upper troposphere, then they are also wrong on the surface warming. But that does not follow logically, either (though it does indeed introduce some doubt).

    Is there really nothing more? Not much more, I think. The conclusion seems to have preceded the analysis!

    77 kuhnkat

    Agreed. Those lower-sensitivity models seem to me to be closer to the truth than the others. Do you have a link to what you consider the ‘best’ of them?

  82. Bruce
    Posted Jul 11, 2008 at 7:50 AM | Permalink

    I very much doubt this is true. Naturally occurring and human-caused fires that burned for long periods and introduced large amounts of material into the atmosphere have occurred for a very long time. Hunter-gatherers used to burn the forests to improve grazing. Also, smoky domestic fires and other activities such as charcoal production and crop residue burning likely go back millennia.

    Naturally occurring huge fires also occur today (peat fires, for example).

    But the human population was minuscule 1,000 years ago, and industrial pollution and coal fires on a grand scale did not exist.

    For hundreds of years, the mists and fogs of Britain’s major cities were all too often polluted and noxious, with London especially badly affected. The fogs endangered health and also posed a threat to travellers who lost their way and thus became an easy prey to robbers. Around 1807, the smoke-laden fog of the capital came to be known as a ‘London particular’, i.e. a London characteristic. Charles Dickens used the term in Bleak House (published in 1853) and provided graphic descriptions of London’s fogs in this and other novels.

    The smoke-laden fog that shrouded the capital from Friday 5 December to Tuesday 9 December 1952 brought premature death to thousands and inconvenience to millions. An estimated 4,000 people died because of it, and cattle at Smithfield, were, the press reported, asphyxiated. Road, rail and air transport were almost brought to a standstill and a performance at the Sadler’s Wells Theatre had to be suspended when fog in the auditorium made conditions intolerable for the audience and performers.

    The death toll of about 4,000 was not disputed by the medical and other authorities, but exactly how many people perished as a direct result of the fog will never be known. Many who died already suffered from chronic respiratory or cardiovascular complaints. Without the fog, they might not have died when they did. The total number of deaths in Greater London in the week ending 6 December 1952 was 2,062, which was close to normal for the time of year. The following week, the number was 4,703. The death rate peaked at 900 per day on the 8th and 9th and remained above average until just before Christmas. Mortality from bronchitis and pneumonia increased more than sevenfold as a result of the fog.

    http://www.metoffice.gov.uk/education/secondary/students/smog.html

  83. Jon
    Posted Jul 11, 2008 at 8:31 AM | Permalink

    remember how you condemned Hansen?

    For what?

    He’s right, over long enough timescales. Unchecked emissions growth would consign many species to extinction, between climatic changes and ocean acidification.

    Perhaps you meant to reference a different Hansen quote- the one where he suggested that CEOs of fossil fuel companies knowingly engaging in misleading the public should be tried for high crimes against humanity and nature?

  84. pat
    Posted Jul 11, 2008 at 8:48 AM | Permalink

    This is probably off topic, but I’m amazed that the Danish government report that there is nothing unusual in the Greenland glacier melting has received almost no publicity. I found out about it in the NYT (of all places!) about a week ago, albeit buried in the back pages. Since this melting has been cited as sure evidence of apocalyptic AGW disaster, you’d think it would be celebrated in banner headlines and lead all the TV news shows.

  85. DeWitt Payne
    Posted Jul 11, 2008 at 9:29 AM | Permalink

    cba,

    I urge you to look closely at how you deal with ozone and the other GHGs that are not well mixed, like singlet oxygen and hydroxyl radical, in your model. The more I look at your data, the more I think that your excess emission must be related to ozone. However, according to the data tables in MODTRAN, the absolute ozone concentration peaks at 22 km and the relative concentration in ppm peaks at about 35 km, because the ozone concentration is not falling as fast as the pressure. There isn’t very much ozone at all above 50 km (or much of anything else either), so there shouldn’t be much emission or absorption at or above that altitude by anything except CO2, and that will be very small (less than 1 W/m2 total over the entire column above 50 km). This is entirely consistent with the temperature profile. Your data isn’t. You show significant solar absorption above 50 km. If that were so, then the temperature would still be rising, not falling.
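
    A quick way to see why the two peaks differ: the mixing ratio divides the (falling) ozone number density by the much faster-falling air density. The numbers below are rough textbook values for illustration, not MODTRAN output:

        kB <- 1.380649e-23                        # Boltzmann constant, J/K
        ppmv <- function(n_o3, p, T) 1e6 * n_o3 / (p / (kB * T))   # n_air = p/(kB*T)

        ppmv(n_o3 = 5e18,   p = 4000, T = 220)    # ~22 km: peak number density, ~4 ppmv
        ppmv(n_o3 = 1.5e18, p = 570,  T = 240)    # ~35 km: less ozone, far less air, ~9 ppmv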

  86. Craig Loehle
    Posted Jul 11, 2008 at 9:50 AM | Permalink

    Re: Basil and others about testing models. My paper:
    Loehle, C. and G. Ice. 2003. Criteria for Evaluating Watershed Models. Hydrological Science and Technology 19:1-15
    gives an approach for multi-variable models. If you need a copy, email me with your postal address (no pdfs for this one).

  87. Philip_B
    Posted Jul 11, 2008 at 10:01 AM | Permalink

    But the human population was minuscule 1,000 years ago, and industrial pollution and coal fires on a grand scale did not exist.

    It wasn’t minuscule. Wikipedia gives a figure of 310 million people in 1000 AD. Industrial pollution is somewhat of an oxymoron, not least because industrial processes replaced far more polluting domestic and local production. Industrial particulate air pollution (the context of the discussion) is for practical purposes non-existent in the developed world. Coal fires are irrelevant, but I’ll note that coal is cleaner burning than wood and the other fuels it replaced, notably charcoal. It happens that many years ago I lived in Ireland, where peat was still a common and very smoky fuel.

    Otherwise, you just seem upset that I poured cold water on an environmentalist myth.

  88. DeWitt Payne
    Posted Jul 11, 2008 at 10:03 AM | Permalink

    cba,

    Here is the plot from the Archer MODTRAN website. The concentrations are in ppmv so have to be corrected for pressure for absorption and emission calculations.

  89. Basil
    Posted Jul 11, 2008 at 10:06 AM | Permalink

    #81 Pat

    Okay, now I understand better what you were saying. It seems maybe I wasn’t very clear, as I didn’t intend for the two hypotheses to be as closely linked as you took me to mean. They can be completely independent hypotheses as far as I’m concerned. I’m just interested in understanding better what the testable propositions of AGW are, no matter whether linked or not. And as I’ve indicated, I’m looking for hypotheses that are specific enough to actually be tested. A theory that simply predicts “warming” is basically untestable unless it can be distinguished from other factors that could produce “warming.”

    This is why I think it is so important to observe that most of the warming of the past two decades occurred in the extratropical NH, over land. Yes, there has been warming. But is it consistent with the AGW hypothesis? If the theory predicts greater warming in the tropics, even at the surface (i.e. let’s ignore the predictions about greater warming in the troposphere than at the surface), then it seems to me that this is a fundamental problem for the hypothesis.

    I’m just trying to get a sense of just how testable the hypothesis really is, and how well it holds up in the face of specific falsifiable tests.

  90. DeWitt Payne
    Posted Jul 11, 2008 at 10:08 AM | Permalink

    cba,

    Also note that methane isn’t well mixed either. It is oxidized to CO2 by ozone and hydroxyl radical in the stratosphere. Concentration data in atm cm/km versus altitude is in the MODTRAN tables that can be accessed by enabling the save-data option.

  91. DeWitt Payne
    Posted Jul 11, 2008 at 10:56 AM | Permalink

    Philip_B,

    coal is cleaner burning than wood and other fuels it replaced, notably charcoal.

    The problem, at least with respect to the London fogs, with coal was sulfur, but soot from coal burning grates for home heating was still a problem. Ride behind a coal fired steam locomotive and then tell me again how cleanly coal burns. Charcoal burns very cleanly. It’s the process of making charcoal that’s the problem.

  92. crosspatch
    Posted Jul 11, 2008 at 10:58 AM | Permalink

    Dr. Hathaway says there is nothing unusual about this sunspot minimum.

    “The average period of a solar cycle is 131 months with a standard deviation of 14 months. Decaying solar cycle 23 (the one we are experiencing now) has so far lasted 142 months–well within the first standard deviation and thus not at all abnormal. The last available 13-month smoothed sunspot number was 5.70. This is bigger than 12 of the last 23 solar minimum values.”

    Aren’t we due for a sunspot or something?

  93. Sam Urbinto
    Posted Jul 11, 2008 at 10:59 AM | Permalink

    300 million people growing food by hand, mostly for themselves only, leaving all the trees in place, burning wood and peat and dried patties and so on, is not 6,000 million people paving large areas, driving trucks, flying airplanes, making concrete, lighting up hundreds of thousands of square miles of cities, and growing food for everyone else.

  94. Pat Keating
    Posted Jul 11, 2008 at 11:08 AM | Permalink

    88 Basil

    most of the warming of the past two decades occurred in the extratropical NH, over land

    Have you noted the recent paper out of Switzerland that indicates that at least half of the warming in Europe is due to cleaning the air of pollution? The same would apply to the US, so there we have much of the NH land-mass covered.

    I’m just trying to get a sense of just how testable the hypothesis really is, and how well it holds up in the face of specific falsifiable tests.

    Not very much or well, I think. However, perhaps you should ask an AGW enthusiast that question.

  95. Jon
    Posted Jul 11, 2008 at 11:10 AM | Permalink

    would you care to mention some of the species which have become extinct because of GW, let alone AGW? In the only case I’m aware of, a SA tree frog if I recall, the demise of the frog was from deforestation rather than any change in local temperatures.

    You’re referring to the Golden Toad, or Bufo periglenes. Global warming was a contributing cause, not the only, direct cause of its extinction (see here and here).

    (As an aside, I just glanced at the Wiki article and don’t understand the claim that a “Jennifer Neville” published a study showing that ENSO was to blame. I cannot find the study anywhere, the organization to which Ms. Neville allegedly belongs (NOAH) is incorrectly named in the article as “North[sic] Ohio Association of Herpetologists”, and the link is dead. Someone should clean that up and use the credible references available.)

    In any event, as with many such situations, we aren’t talking about the increased planetary energy balance due to higher GHG concentrations outright killing species. We’re talking about the disruption of climatic norms changing their habitat in such a way that they cannot survive.

    OTOH, if this species extinction is only future, I think it’s a bit presumptuous to assume species can’t handle a degree or two of temperature rise.

    This is getting way off topic, but a 2C warming would cause widespread alterations in habitat. Many species will not be able to adjust quickly enough, and others simply will have nowhere else to go (e.g. plants and animals already at the northernmost latitudes/highest elevations). This is compounded by the fact that urbanization and land use have in effect created “island” habitats that constrain species that could otherwise move with the changing climate. The effects of acidification on the ocean, particularly the potential to disrupt food chains from the bottom up (e.g. species of plankton) and organisms that function as ecosystems (e.g. corals), will be large even before the effects of warming are considered.

  96. John Lang
    Posted Jul 11, 2008 at 11:15 AM | Permalink

    Re 91: Solar cycle 23 started in May 2006 so we are now more than 145 months into it (which would be outside of the standard deviation.) Total Solar Irradiance (TSI) continues to decline so Cycle 23 is not over yet.

    Here is a plot of TSI over the past year from the SORCE instrument (note: the cycles in the plot relate to the rotation of the Sun, which is interesting nonetheless).

    http://lasp.colorado.edu/cgi-bin/ion-p?ION__E1=PLOT%3Aplot_tsi_data.ion&ION__E2=PRINT%3Aprint_tsi_data.ion&ION__E3=BOTH%3Aplot_and_print_tsi_data.ion&START_DATE=1640&STOP_DATE=2050&TIME_SPAN=6&PLOT=Plot+Data

  97. jae
    Posted Jul 11, 2008 at 11:22 AM | Permalink

    Oh, brother! I wonder how all the species survived the MWP. And the frogs and polar bears can relax for now, because there has been no increase in temperature for 10 years.

  98. crosspatch
    Posted Jul 11, 2008 at 11:29 AM | Permalink

    “300 million people growing food by hand for mostly themselves only, leaving all the trees in place, burning wood and peat and dried patties and so on is not 6000 million people paving large areas, driving trucks, flying airplanes, making concrete, lighting up hundreds of square thousands of miles of cities, and growing food for everyone else.”

    No kidding. But we are healthier now than we ever were during that time. In the first millennium AD 85% of the people born never made it to 35 years of age and life expectancy didn’t really rise much until late into the second millennium (middle 19th century or so). Greek and Roman civilization often suffered severe epidemics of diseases such as cholera. The average living person in the NH today breathes less particulates than the average living person did as we no longer have peat and wood and candles and whale oil burning in the home. We drink cleaner water and breathe cleaner air, overall, than people ever did in the past.

    And we have the technology already at hand and already proven to produce trillions of watts of power with nuclear power AND we could recycle the spent fuel rather than bury it and REDUCE the emission of radiation into the atmosphere. Coal power plants release levels of radiation that would get a nuclear plant shut down.

    It is all part of some weird psychosis to think that somehow things are going to hell in a handbasket. I can testify under oath that the environment now is much cleaner than it was when I was a child. The air is cleaner, the water is cleaner, and there is a greater variety of foods available year-round than there ever was when I was growing up. A person today would probably retch if they were transported back in time to stand on the banks of the Chesapeake Bay or in downtown Pittsburgh in the mid 1960s.

    Why is it not major news that sturgeon are now abundant in the Hudson and salmon have returned to the Thames?

    Yes we have more people and they don’t grow their own food and yes, we make a lot of energy. But that isn’t a bad thing when overall things are now much better environmentally. I think the biggest change one would notice in recent times would be the change happening and the cleanup going on in Eastern Europe. Things are not spiraling into a vortex of environmental disaster.

  99. crosspatch
    Posted Jul 11, 2008 at 11:35 AM | Permalink

    Have you noted the recent paper out of Switzerland that indicates that at least half of the warming in Europe is due to cleaning the air of pollution?

    There is a much older paper out of Switzerland that finds wood being uncovered by receding glaciers in the Alps is 5000 C14 years old. So 5000 years ago, those valleys were not only ice free, they had been ice free long enough to grow forests. Whose SUV caused the melting 5000 years ago?

  100. Tony Edwards
    Posted Jul 11, 2008 at 11:39 AM | Permalink

    The effects of acidification on the ocean, particularly the potential to disrupt food chains from the bottom up (e.g. species of plankton) and organisms that function as ecosystems (e.g. corals) will be large even before the effects of warming are considered.

    Not this one again! Two points. One, the ocean is not going acidic; if anything, it is becoming slightly less alkaline, going from approximately pH 8.1 to pH 7.9. Acidic is anything below pH 7. Given the wide variation of CO2 levels documented for the past, during the time when many species evolved, the levels have doubtless varied much more over time than we are experiencing now.
    Two, plankton are largely chlorophyll-using creatures, which, surprise, surprise, use CO2 for growth.
    As to corals, a point was recently made on another blog: over the past 10,000 years or so, the oceans have risen by some 400 ft. But corals only grow in relatively shallow water, so where all the reefs are today was dry hills only 10,000 years past, and where they were growing then is now far too deep for them. They obviously are very good at adapting to changing circumstances.
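
    For scale, whatever one calls the trend, pH is a logarithmic measure, so even a 0.2-unit drop is a material change in hydrogen-ion activity:

        10^(8.1 - 7.9)   # ~1.58: a 0.2 pH drop means ~58% more hydrogen-ion activity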

  101. Jon
    Posted Jul 11, 2008 at 12:04 PM | Permalink

    One, the ocean is not going acidic, if anything, it is becoming slightly less alkaline, going from approx 8.1pH to 7.9pH. Acidic is anything below 7pH.

    The term used is acidification. I don’t need to be lectured on what the pH levels of acids and bases are. You can pretend that marine chemists and biologists missed Chem 101 and that means no one has to worry about anything, but it won’t change reality.

    Given the wide variation of CO2 levels documented for the past, during the time when many species evoloved, the levels have doubtless varied much more over time that we are experiencing now.

    CO2 levels haven’t increased by the amount, or at the rate, that they are currently increasing for at least the last 800 kyr, likely the last million years. The pace is simply faster than the timescales evolutionary development is concerned with.

    Two, plankton are largely chlorophyll using creatures, which, surprise, surprise, use CO2 for growth.

    Calcification in forming their shells is the central concern. Hopefully, this will increase ahead of the acidification process – that would be outstanding. And there is some hope that this might be the case – a recent study (Iglesias-Rodriguez et al. 2008) supports this. However, the bulk of prior research points in the other direction. I am actively hoping that the new study is the better indicator of what will happen, because, unlike the caricatures tossed around, nothing could make me happier than finding problems with the case for anthropogenic climate change.

    As to corals, a point was recently made on another blog, commenting that, over the past 10,000 years or so, the oceans have risen by some 400 ft or so. But corals only grow in relatively shallow water, so where all the reefs are today was dry hills only 10,000 years past and where they were growing then is now far too deep for them. They obviously are very good at adapting to changing circumstances

    Different issue, different timescales.

  102. Sam Urbinto
    Posted Jul 11, 2008 at 12:04 PM | Permalink

    crosspatch #95:

    I made much the same point in another thread. I remember being in Eastern Europe with coal smoke in a valley so thick you could feel it, or in Los Angeles where the eyes burned in the ~100 F heat. This notion that things are worse now is bogus, even just considering the last 20 years, much less the last 50 or 100.

    And yes, modern medicine and safety features make life much safer and give us much greater longevity. I think maybe it’s a distrust of technology more than anything else. An idea that what animals do to their environment is good or bad on a moral level.

  103. Jon
    Posted Jul 11, 2008 at 12:06 PM | Permalink

    smog in Los Angeles so bad your eyes burned. And now? I’d say cleaner. What about the Great Lakes in the 60’s and 70’s compared to now?

    You’re making the case for governmental regulation. I don’t disagree.

  104. Sam Urbinto
    Posted Jul 11, 2008 at 12:09 PM | Permalink

    Carbon dioxide? You mean like 600 ppmv? Yeah, I’m sure that would be a disaster.

    Same thing with corals. Dump a few thousand gallons of bleach into the water near a few reefs. Dip in something like a giant coffee warmer and make the water boil. I’m sure dropping the pH by 0.2 and getting 0.5 C warmer would have equal results.

    Acidification is a PR term, much like denier. Going towards acid, becoming less basic, lowering in pH, they just don’t have that op/ed ring to them, you know?

  105. Basil
    Posted Jul 11, 2008 at 12:10 PM | Permalink

    #94 John Lang

    Solar cycle 23 started in May 2006 so we are now more than 145 months into it (which would be outside of the standard deviation.) Total Solar Irradiance (TSI) continues to decline so Cycle 23 is not over yet.

    I’m sure you meant 1996. So at the end of June 2008, we were at 12 years (144 months) and counting. So we may now be more than 145 months into it. But why is the focus on a single standard deviation? Usually two standard deviations are used to determine whether something is “out of the ordinary.”

    So SC23 may just be long, if not abnormally long, but I think the evidence may be mounting that SC24 is taking its good old time getting started. And that increases the likelihood that it will be a period of lower-than-usual solar activity. But even if the R turns out to be only 75, I bet someone can tell us that that is within a single deviation of the average.
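
    A quick check against Hathaway’s stated distribution (mean 131 months, sd 14 months; cycle start taken as May 1996 per the correction above), in R:

        start <- as.Date("1996-05-01")
        months_in <- length(seq(start, as.Date("2008-07-01"), by = "month")) - 1   # 146
        (months_in - 131) / 14   # ~1.07 sd above the mean: long, but inside 2 sd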

  106. Basil
    Posted Jul 11, 2008 at 12:16 PM | Permalink

    #93 Pat,

    I was aware of the paper, but hadn’t really considered its significance for explaining why most warming has been over NH extratropics land mass.

    That’s an intriguing notion. Would you (or anyone) know offhand (so I don’t have to google) if this improvement in air quality is noticeably greater in the NH extratropics, compared to the tropics? Hypothetically, I could imagine that air quality standards in the industrial nations have improved air quality relative to third world countries where slash and burn still takes place widely. Is that the case?

  107. crosspatch
    Posted Jul 11, 2008 at 12:29 PM | Permalink

    “most warming has been over NH extratropics land mass.”

    It might explain some things in Europe, but I don’t think it explains much “globally”. Most of the land showing the greatest increase in temperatures these days is in areas of Asia where pollution is increasing. So at the moment I would say that pollution and “warming” are tracking together in those areas. I would say that NH ground-based temperature records are now tracking land development around the stations more than any change in Earth’s climate. The US is currently showing a cooling trend of -0.63 degrees per decade since 1998 according to NCDC, even though pollution from China blowing across the Pacific is increasing.
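
    For reference, a per-decade trend figure like that is just the slope of an ordinary least-squares fit to the monthly series, scaled to ten years. A sketch in R with synthetic stand-in data (substitute the actual NCDC series):

        set.seed(42)
        yr   <- seq(1998, 2008, by = 1/12)                          # decimal years
        anom <- -0.063 * (yr - 1998) + rnorm(length(yr), sd = 0.2)  # stand-in series
        fit  <- lm(anom ~ yr)
        coef(fit)[["yr"]] * 10                                      # degrees per decade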

  108. Pliny
    Posted Jul 11, 2008 at 12:38 PM | Permalink

    JohnM #72
    The best I’ve seen online is the GLODAP Pacific analysis, where they have written a lot about accuracy issues. There are pages here and here. You have to look around the site a bit.

  109. Bruce
    Posted Jul 11, 2008 at 12:44 PM | Permalink

    Philip B

    Industrial particulate air pollution (the context of the discussion) is for practical purposes non-existent in the developed world.

    Did you read the link to the London Smog page I posted?

    Huge quantities of impurities were released into the atmosphere during the period in question. On each day during the foggy period, the following amounts of pollutants were emitted: 1,000 tonnes of smoke particles, 2,000 tonnes of carbon dioxide, 140 tonnes of hydrochloric acid and 14 tonnes of fluorine compounds. In addition, and perhaps most dangerously, 370 tonnes of sulphur dioxide were converted into 800 tonnes of sulphuric acid. At London’s County Hall, the concentration of smoke in the air increased from 0.49 milligrams per cubic metre on 4 December to 4.46 on the 7th and 8th.

    That was 1952. In the UK. Do you not consider that to be part of the developed world?

  110. Stan Palmer
    Posted Jul 11, 2008 at 1:19 PM | Permalink

    CO2 levels haven’t increased in the amount, at the rate that they are currently for at least the last 800 kyr, likely the last million. The pace is simply faster than the timescales evolutionary development is concerned with.

    This is a real question and not sarcasm, but what happened 800,000 years ago?

  111. crosspatch
    Posted Jul 11, 2008 at 1:47 PM | Permalink

    “Thats was 1952. In the UK. Do you not consider that to be part of the developed world?”

    My impression was that the discussion was about conditions in the developed world at the current time. In 1952 there was a lot more coal being burned and much of that was high sulphur coal. People heated homes with coal then. A similar weather pattern as in 1952 would not produce anything like the event we saw back then. The air is much cleaner now than it was in 1952.

  112. crosspatch
    Posted Jul 11, 2008 at 1:52 PM | Permalink

    Re:coral and CO2 …

    “Corals first appeared in the Cambrian period (570mya). Reef-building corals appeared during the middle of the Triassic period (251mya – 200mya).”

    So corals lived just fine with atmospheric CO2 levels between 4,000 and 6,000 ppm; 600 ppm isn’t likely to harm them in the least.

  113. jae
    Posted Jul 11, 2008 at 2:00 PM | Permalink

    “most warming has been over NH extratropics land mass.”

    LOL. But don’t the climate models all say it should be occurring over the tropical areas? The physical basis of the models seems to be in question.

  114. bernie
    Posted Jul 11, 2008 at 2:07 PM | Permalink

    #111
    Crosspatch
    I don’t know about 1952, but I remember walking home over Shooters Hill in SE London in 1960-2 because the smog was so thick you could walk faster than the #89 bus. Sometimes at the top of the hill you could look down on a veritable sea of fog/smog. The smog could block out the Sun for days. We got central heating soon thereafter, thanks to North Sea oil and gas.

  115. Pat Keating
    Posted Jul 11, 2008 at 2:10 PM | Permalink

    101 Jon, 104 Sam

    Acidification is a loaded term, and an objective chemist would call it “Neutralization”.

    But that isn’t doom-laden enough for Jon.

  116. Pat Keating
    Posted Jul 11, 2008 at 2:21 PM | Permalink

    106 Basil

    I have no link for you, but I think there is little doubt that the improvement is greater in the extra-tropical regions.

    In the first place, there was little industrialization in the tropics, so not so much to clean up (note the horror stories about fogs in Europe). Secondly, it was in the temperate zones where most of the clean-up occurred, and its effect on elimination of fog and smog (the main issue) was most pronounced.

  117. Jeremy
    Posted Jul 11, 2008 at 2:45 PM | Permalink

    Re: #101

    CO2 levels haven’t increased in the amount, at the rate that they are currently for at least the last 800 kyr, likely the last million. The pace is simply faster than the timescales evolutionary development is concerned with.

    I don’t understand how anyone can make such a claim. The reasons I don’t are somewhat simple.

    #1) Proxy data used to discover levels of CO2 in the past do not have the resolution of today’s modern measurements. It’s a bit like having a greased-up magnifying glass to look at things in the past, and a scanning electron microscope to look at the present. Is CO2 going up? Sure. Is what we’re seeing within the noise? We don’t know, because we did not have mass spectrometers planted on each continent for the past million years. It’s possible that over short periods of time the CO2 on this planet varied much more, and because of the short timescale it barely nudged the proxy measurements we use to discover those levels today. You can perhaps make handwaving arguments that this is unlikely, but in the lab it doesn’t matter where your hands go; all that matters is what data you have.

    #2) We don’t have a great grasp of evolutionary timescales. For all we know, major changes can happen to species very quickly, and we just haven’t seen it happen yet.

  118. Posted Jul 11, 2008 at 2:51 PM | Permalink

    All my postings were eaten up by spam karma.

    Just testing if it is accepting now.

  119. Hans Erren
    Posted Jul 11, 2008 at 2:53 PM | Permalink

    I recently was not able to post here.

    Just checking.

  120. Bruce
    Posted Jul 11, 2008 at 3:08 PM | Permalink

    crosspatch

    The air is much cleaner now than it was in 1952.

    That was the point of my original posting which says sunshine totals are way up in Europe. The air has gotten a lot cleaner in the last 30 years which coincides with GW (which has warmed most in the regions that have cleaned up the air).

    As the Met article says:

    As recently as the early 1960s, winter sunshine totals were thirty per cent lower in the smokier districts of London than in the rural areas around the capital. Today, there is little difference.

    If the weather stations in the UK (and the rest of Europe) are now getting 30% more sunshine, that would explain most or all of GW in Europe.

  121. crosspatch
    Posted Jul 11, 2008 at 3:39 PM | Permalink

    “That was the point of my original posting which says sunshine totals are way up in Europe. The air has gotten a lot cleaner in the last 30 years which coincides with GW”

    Right, but the clearing coinciding with warming might be coincidence and not cause/effect. China’s land record is currently showing considerable warming and their air is getting dirtier. My point was that the land record probably more accurately reflects economic development than air clarity or global climate. And in the US, the air has continued to improve since 1998 yet we have seen a significant drop in temperatures since then according to NOAA, at a rate of -0.63 degrees/decade.

    Also, most of the warming in the US was a rather pronounced “step change” in 1976 and a peak in 1998 when we had a very powerful el nino event and has been declining ever since with possibly a new “step change” back down in 2007. Did pollution suddenly stop in 1976 and return in 2007? Somehow I don’t think so.

    I will buy the notion that atmospheric clarity has some impact, but we aren’t seeing any similar correlation anywhere else on the planet. If that were the case, we should see China cooling rather dramatically, yet the land record shows the opposite.

  122. Bruce
    Posted Jul 11, 2008 at 4:13 PM | Permalink

    Maybe you could point me to a reliable land record for China that shows considerable warming.

    Some people disagree.

    http://www.informath.org/WCWF07a.pdf

    http://www.informath.org/apprise/a5620/b17.htm

    If you go to this page and pick 1988 1988 as the base years, and pick 250 km smoothing, I see Mongolia and some of southern Russia warmer. I see a cooler coast. I see huge chunks of missing data.

    http://data.giss.nasa.gov/gistemp/maps/

    http://data.giss.nasa.gov/cgi-bin/gistemp/do_nmap.py?year_last=2008&month_last=6&sat=4&sst=0&type=anoms&mean_gen=06&year1=2008&year2=2008&base1=1988&base2=1988&radius=250&pol=reg

    (And I don’t trust the GISS to get it right anyway.)

  123. Bruce
    Posted Jul 11, 2008 at 4:18 PM | Permalink

    crosspatch,

    May 2008 temp anomaly from UAH

    http://climate.uah.edu/may2008.htm

    It seems the industrial center of China shows zero trend or cooling since 1979.

  124. Steve McIntyre
    Posted Jul 11, 2008 at 4:23 PM | Permalink

    I’m away for a few days, so everyone please behave. If you feel tempted to be impolite or discuss policy or be angry, please don’t do it here.

  125. DeWitt Payne
    Posted Jul 11, 2008 at 4:26 PM | Permalink

    If corals could survive the Paleocene-Eocene Thermal Maximum about 55 million years ago, when two trillion metric tons (2 teratonnes) of carbon were released into the atmosphere, probably in the form of methane, over a period likely less than 1,000 years, with CO2 levels reaching about 2,000 ppmv in the atmosphere, then it doesn’t seem likely that current anthropogenic carbon release will do them in either. Yes, there was massive decalcification of sediment, but living organisms clearly survived. The global average temperature was a lot higher then too. The polar ocean temperature was about 12 C higher than today and spiked an additional 2 C, based on 18O ratios. After recovery from the PETM, temperatures rose again, reaching about 14 C higher than today before beginning the long decline leading to the current epoch of rapid glacial/interglacial shifts. The oceans didn’t boil and the tropics didn’t become uninhabitable. The carbon release rate, assuming 1,000 years, is in the ballpark of the current anthropogenic release rate, but it could have been much higher or much lower. That rate isn’t going to continue for 1,000 years. IMO, it’s not likely to continue for 100 years.
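
    Back-of-envelope on the “ballpark” claim, with the figures as stated above (the current rate is a rough 2008 number):

        petm_total <- 2e12               # tonnes C released, per the estimate above
        petm_rate  <- petm_total / 1000  # ~2e9 tonnes C/yr if spread over 1,000 yr
        now_rate   <- 8e9                # tonnes C/yr, rough current anthropogenic rate
        now_rate / petm_rate             # ~4x: same order of magnitude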

  126. crosspatch
    Posted Jul 11, 2008 at 4:32 PM | Permalink

    #123 Bruce

    Right … I was pretty sure I said “land record”. UAH is a satellite measurement. The change in measurements on the ground is what corresponded to atmospheric clearing in Europe. UAH also shows Western Europe colder since 1979 while the land record shows Hottest.Year.Evah! or some such. UAH is actually more accurate in my opinion because it reflects actual lower-troposphere temperature and not the temperature of the bottom 6 feet of air the way the surface record does. The surface record reflects changes in the temperature of the earth’s surface in the area around the measuring station, not changes in the temperature of the atmosphere.

  127. crosspatch
    Posted Jul 11, 2008 at 4:37 PM | Permalink

    Ooops … Western Europe colder since 1979 …

    Sorry, I misread that map. Still, the warming showing in the UAH graphic is nowhere close to the warming showing in the ground station record.

  128. crosspatch
    Posted Jul 11, 2008 at 4:48 PM | Permalink

    This one is hard to read and I will see if I can find a better one that shows grid cell temperature anomaly but I haven’t bookmarked any.

  129. Bruce
    Posted Jul 11, 2008 at 5:08 PM | Permalink

    crosspatch, the UAH satellite record infers the surface temperature.

    However, even if you don’t like UAH, the GISS map shows no warming since 1988 in China. The big red/brown splotch is not in China. It’s Mongolia and Russia. China is cool or unchanged since 1988.

  130. Posted Jul 11, 2008 at 5:23 PM | Permalink

    Re #125

    If corals could survive the Paleocene-Eocene Thermal Maximum about 55 million years ago. when two trillion metric tons (2 teratonnes) of carbon were released into the atmosphere, probably in the form of methane, over a period likely less than 1,000 years with CO2 levels reaching about 2,000 ppmv in the atmosphere, then it doesn’t seem likely that current anthropogenic carbon release will do them in either. Yes, there was massive decalcification of sediment, but living organisms clearly survived.

    Well, some of them survived, but about 95% of the ocean’s life forms became extinct.

  131. crosspatch
    Posted Jul 11, 2008 at 5:26 PM | Permalink

    Well, take this one for example. It shows the 2006 anomaly from the “base” period, with warming pretty much everywhere on the planet, including places where pollution is increasing and where it is decreasing.

    And I guess that sort of illustrates a main frustration of mine. The “warmists” use the surface observations to validate their hypothesis because that is pretty much the ONLY data they can find that does. The whole world is getting hotter! At least according to adjusted surface data. Satellite data show otherwise, ocean data show otherwise, but they cling to the adjusted surface measurements because those are the last remaining data that validate their view. And even that is going to go away soon in the US with the introduction of an unadjusted network … but never fear, there are still the ROW stations to save the day.

  132. John M
    Posted Jul 11, 2008 at 5:49 PM | Permalink

    Pliny 108

    Thanks. That gives me something to chew on. Now I need to find papers that report changes in the values due to increases in CO2, although I still didn’t see error estimates for carbonate/bicarbonate (just DIC, TALK, TCO2, etc.).

    Is it just my software, or does that first link give some funky formatting and hidden text and figures? BTW, the second link is empty.

  133. Hans Erren
    Posted Jul 11, 2008 at 6:27 PM | Permalink

    I suffered from spam karma banning, testing testing…

  134. Hans Erren
    Posted Jul 11, 2008 at 6:28 PM | Permalink

    I’m baaaack.

  135. Real Richard Sharpe
    Posted Jul 11, 2008 at 6:49 PM | Permalink

    Phil says:

    Well some of them survived but about 95% of ocean’s life forms became extinct.

    Not according to Wikipedia, which says:

    The PETM is accompanied by a mass extinction of 35-50% of benthic foraminifera (especially in deeper waters) over the course of ~1,000 years – the group suffering more than during the dinosaur-slaying K-T extinction. Contrarily, planktonic foraminifera diversified, and dinoflagellates bloomed. Success was also enjoyed by the mammals, who radiated profusely around this time.

    The deep sea extinctions are difficult to explain, as many were regional in extent (mainly affecting the north Atlantic): this means that we cannot appeal to general hypotheses such as a temperature-related reduction in oxygen availability, or increased corrosiveness due to carbonate-undersaturated deep waters. The only factor which was global in extent was an increase in temperature, and it appears that the majority of the blame must rest upon its shoulders. Regional extinctions in the North Atlantic can be attributed to increased deep-sea anoxia, which could be due to the slowdown of overturning ocean currents,[8] or the release and rapid oxidation of large amounts of methane.[21][verification needed]

    It goes on to make other interesting observations …

    While other extinction events caused 95% of sea life to go extinct, it seems that the PETM did not. Can you provide a link to support your claim?

  136. Posted Jul 11, 2008 at 8:15 PM | Permalink

    There’s an interesting article in today’s Guardian about plausibility and belief in science:

    http://www.guardian.co.uk/commentisfree/2008/jul/12/medicalresearch

    I particularly liked the complaint that a report was not backed up by data.

    JF

  137. Posted Jul 11, 2008 at 8:19 PM | Permalink

    Sorry, forgot the quote:

    quote a research study demonstrating that the Bridgend suicide cases all lived closer to a mobile phone mast than average. When I contacted Coghill it turned out he wasn’t really a government adviser, he said the Express had made a mistake in calling him a doctor, he had lost the data, and he couldn’t even explain what he meant by average.

    You will be very pleased to hear that Coghill has now found the data. This is a matter of great public health significance, as suicide is the second most common cause of death in men aged 15 to 44, and mobile phone use is extremely prevalent. Sadly Coghill still does not wish to tell me what figures he collected, what analysis he did on them, what “average” he compared them with, what the results were, and what interpretation he makes from these results. This baffles me. He claims online that he has offered to let me inspect his data but that I declined. This baffles me too, because he also explains – in a complaint to the Press Complaints Commission about me harassing him – that he will not give me his data, as he considers it “sensitive”. unquote

    JF

  138. kuhnkat
    Posted Jul 11, 2008 at 9:53 PM | Permalink

    John Lang,

    I looked at your link, and it was the typical one showing TSI at 1 AU. The earth’s orbit has a MEAN of 1 AU. I don’t know where to find graphs, but here is a link to their data file, which also shows a figure for EARTH DISTANCE. This figure varies from just over 1400 down to almost 1300.

    http://lasp.colorado.edu/sorce/tsi_data/six_hourly/SORCE_L3_TSI_6M_V0008_20030225_20080704.txt

    Compare the numbers in column 5 (1 AU) to column 10 (earth distance). It shows that minimum TSI occurs in July, NH summer, and maximum TSI occurs in December, SH summer!!! As the ocean absorbs more heat than land, I would be interested in some smart person plotting this variation in orbit, and therefore TSI at earth distance, over the last couple hundred years!!!

    As the year-to-year variance is small, it should mostly average out, EXCEPT for the issue of where we get more heat absorption, ocean or land, and possible albedo changes!!!

    Anyone know of a paper or other presentation that covers this??
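
    The ~100 W/m2 annual swing in the earth-distance column follows from the inverse-square law alone. A sketch in R with standard orbital values (not read from the file):

        tsi_1au <- 1361                                  # W/m^2, representative value at 1 AU
        r <- c(perihelion = 0.9833, aphelion = 1.0167)   # earth-sun distance, AU
        round(tsi_1au / r^2)   # ~1408 W/m^2 near perihelion (early Jan), ~1317 near aphelion (early Jul)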

  139. Pliny
    Posted Jul 11, 2008 at 10:47 PM | Permalink

    #132 John M
    Sorry about the missing link. It’s on the same site, so you probably found it. I use Firefox and didn’t notice any difficulties.

  140. Bernie
    Posted Jul 11, 2008 at 11:36 PM | Permalink

    Julian:
    You forgot to mention the section in the article that involved a study where the guy spoke gibberish and was repeatedly rated favorably by what passed for an educated audience.

  141. WA
    Posted Jul 12, 2008 at 12:44 AM | Permalink

    Basil #78:

    You wrote:
    “Really, is this the essence of the AGW hypothesis?
    “Major Premise: [Models Predict that] Rising greenhouse gas emissions will cause warming. (P)
    “Minor Premise: The earth is warming. (Q)
    “Conclusion: Rising greenhouse gas emissions are causing warming [verifying the models].
    “If so, somebody needs a lesson in elementary logic.”

    Agreed about the need for a lesson, particularly in light of the political assumption that CO2 is the only cause worth attention. How is this for starters? I think the AGW argument commits a fallacy, perhaps the fallacy of “Affirming the Consequent”. Its pattern is:
    If P then Q.
    Q.
    Therefore, P.

    http://www.fallacyfiles.org/afthecon.html
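
    In standard notation, the invalid schema is $((P \to Q) \land Q) \nvdash P$; contrast the valid modus ponens, $((P \to Q) \land P) \vdash Q$.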

    Perhaps, Basil, we all need the lesson you mention, because we could then identify alarmist arguments by their true nature: fallacies.

  142. John M
    Posted Jul 12, 2008 at 10:40 AM | Permalink

    Pliny,

    Thanks again. I see now I was unclear about the document that looks strange. It was linked from the site you linked to (here).

    Maybe I just have the wrong version of Adobe Reader.

    Cheers.

  143. Basil
    Posted Jul 12, 2008 at 10:41 AM | Permalink

    #141 WA

    Or,

    cum hoc ergo propter hoc

  144. Andrey Levin
    Posted Jul 13, 2008 at 3:57 AM | Permalink

    Re#121, Crosspatch:

    China’s land record is currently showing considerable warming and their air is getting dirtier.

    Different kinds of air pollution produce different kinds of climate response. Particulate emissions produce heating of the surface and lower troposphere. Smog- and aerosol-forming substances like SOx and NOx produce cooling due to fog, smog, and cloud formation.

    Historically, particulate control (like electrostatic precipitation, removing about 95% of particles) was introduced on coal power plants, cement kilns, and basic iron furnaces two decades before effective SOx and NOx control. China is only now beginning to introduce particulate control measures.

    So it was and will be warming over China until PM is mostly controlled; then cooling due to aerosols; and then warming again when aerosols are controlled. That being the anthropogenic signal; how it measures against natural forcing is a big question.

  145. cba
    Posted Jul 13, 2008 at 4:56 AM | Permalink

    DeWitt,

    I’m not sure what’s going on here, but it looks like my posts are not making it in now. There are at least two responses that are not here.

    I’ve been going through the input data and linewidth coding. I’ve had a couple of times where I thought I had found a problem, only to trace through to the point where the code handles it. I’m currently on the third, which might be it. For some reason, as we go higher into the atmosphere and the pressure drops, the peak of the line is not really decreasing, even though the pressure broadening is narrowing nicely and goes away. It is to be expected that the peak rises, as (for emitting) the total power over a bandwidth should remain the same as the bandwidth narrows. There appears to be a problem with the 1 nm wide bin containing the central peak, which is supposed to get a simple linear average on each side, where the peak is averaged with the profile value at each edge of the bin and summed for the total.

    Hopefully later today I can verify what is going on with the code, to determine if and why the peak in the bin is not being properly averaged, yielding too much absorption at lower pressures. The result of the problem is that, despite partial pressures of molecules being treated correctly (including isotope fractions) and atmospheric pressure being treated correctly for layers, the central peaks of lines are not reducing in a linear fashion as expected once the broadening has diminished to under the width of the wavelength bin, causing an overstating of the power absorption/emission.

  146. John F. Pittman
    Posted Jul 13, 2008 at 6:29 AM | Permalink

    Jon, 18K B.P. the Holocene Transgression occurred, with rapid sea level rise of 10 meters per 1,000 years (Curray 1965). The AR4 worst case was 5.9 meters (http://www.realclimate.org/index.php?p=427). Although modern reefs can grow about 1 meter per 1,000 years (Easton and Olson 1976), the depth limit of living coral is about 135 meters, set by radiant energy (Stoddart 1969). The real issue is the Ca++ and CO3-- ions in the oceans. Warmth is not a factor for decline, but will increase the area of the world where coral can grow, unless the water gets above 28C (Stoddart 1969).

    Since large amounts of calcium carbonate have been sequestered since the Holocene Transgression, the question is about the fate of dissolved ions and the export of calcium from suspension and leaching (Stearn 1977). Alkalinity and total alkalinity are the proper measurements, not pH (Stearn 1977). Given the mass of coral from the Pleistocene, where is the mass balance showing that this is a problem? Unfortunately many of the current articles are behind the paywall. But as pointed out above, CO2 in the Palaeocene was about 3,000 ppm (Pearson and Palmer 2000); any worry that coral reefs would not survive or even thrive is not supported in the geologic record. Corals appear in the geologic record from the beginning of the Paleozoic (Newell 1972). They have seen much change in depth, temperature, CO2, and other environmental factors. Current literature appears to be somewhat alarmist, and typically ignores the geologic record, IMO.

  147. Philip_B
    Posted Jul 13, 2008 at 6:39 AM | Permalink

    My post about trains and smoke went into an internet black hole. Coal steam trains aren’t smoky; it’s one of the Gore myths. Show a chimney pumping out steam and tell the ignorant it’s smoke pollution. Coal-driven trains put out a lot of steam but little smoke. Did you know the London underground originally had coal/steam driven trains? Substantial smoke from trains travelling through 10-mile or longer tunnels would have made the underground system impossible.

  148. kim
    Posted Jul 13, 2008 at 8:40 AM | Permalink

    145 (Floor Anthoni) Thanks for the link, though I admit I skipped to the conclusion. I’ve little doubt that the biosphere, in concert with the sun, will resequester the newly released fossil carbon ‘in due course’.
    ==================================================

  149. DeWitt Payne
    Posted Jul 13, 2008 at 9:05 AM | Permalink

    cba,

    That does sound like it could be the problem. The math changes when the line width becomes less than the resolution. That’s when you start getting square-root dependence on concentration for strong lines. But even a strong line should become effectively weak as the pressure drops and the mass path gets small enough. At that point, the emission/absorption shows a linear dependence on concentration. In the limit of a Dirac delta function line, only a fraction of the bin can actually ever be absorbed, because the rest of the bin is transparent. It sounds like the effective width for some lines in your program never drops below 1 nm and stays saturated regardless of concentration. But you should never see a Dirac line. There is always Doppler broadening at the observed temperatures, and even at absolute zero the Uncertainty Principle would give a finite line width.
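
    A toy illustration of that failure mode in R, using a unit-area Lorentzian and a 1 nm bin: integrating the profile analytically over the bin (the arctan antiderivative) stays bounded by 1, while the “average the peak with the edge values” scheme described above blows up as the half-width shrinks below the bin width:

        # Fraction of a unit-area Lorentzian falling in [lo, hi] (exact, via arctan)
        lorentz_bin <- function(lo, hi, x0, hwhm)
          (atan((hi - x0) / hwhm) - atan((lo - x0) / hwhm)) / pi

        for (hwhm in c(0.5, 0.05, 0.005)) {        # half-widths in nm
          exact <- lorentz_bin(-0.5, 0.5, x0 = 0, hwhm = hwhm)
          peak  <- 1 / (pi * hwhm)                 # line-centre value
          edge  <- (hwhm / pi) / (0.25 + hwhm^2)   # profile value at the bin edges
          naive <- mean(c(peak, edge))             # times the 1 nm bin width
          cat(sprintf("hwhm=%6.3f  exact=%5.3f  naive=%7.2f\n", hwhm, exact, naive))
        }

    Since the arctan form is closed-form, it costs no more per line per bin than the averaging scheme.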

  150. DeWitt Payne
    Posted Jul 13, 2008 at 9:18 AM | Permalink

    Philip_B,

    It depends on how you define smoke. The real problem with ice melting is carbon black, not condensed organics. Even if you burn coal in a modern power plant at high temperature, the fly ash still contains some carbon. And there is fly ash. That’s why they now have the big filters and electrostatic precipitators, not to mention sulfur scrubbers, that reduce efficiency by a significant amount. Now take an old coal fired locomotive where the coal is shoveled into the fire in chunks rather than ground and fed as a powder and you get clinkers and soot galore, and the plume from the chimney isn’t white either.

  151. Pat Keating
    Posted Jul 13, 2008 at 10:03 AM | Permalink

    145 Floor
    The oceans are ‘neutralizing’ a little, not ‘acidifying’, which is a bogeyman word. They are alkaline.

    snip

  152. crosspatch
    Posted Jul 13, 2008 at 1:11 PM | Permalink

    #147 Andrey Levin

    “So it was and will be warming over China until PM is mostly controlled; then cooling due to aerosols; and then warming again when aerosols are controlled. That being the anthropogenic signal; how it measures against natural forcing is a big question.”

    That sort of misses the point that we don’t really have any measurements of high enough quality to say whether there is any warming or cooling on the ground in China. The UAH map posted earlier showed some very slight warming, nothing like the adjusted GISS numbers. We have no ground station network in China of sufficient quality to take these kinds of measurements. The ground station networks we have in the rest of the world rely on dubious “adjustments” that are often as large as the warming “signal” they are showing. In other words, we don’t have data of sufficient quality to say whether temperatures really have warmed or cooled, or to what degree, in response to atmospheric changes.

    I remember reading in this blog a comment by someone in the UK who noticed the measurement station near them was reporting record high temperatures while surrounding stations were not; on investigation, it turned out the thermometer was in a metal (not wooden) Stevenson screen painted white on the outside and black on the inside. I am surprised it wasn’t erected in an asphalt parking lot with heat lamps pointed at it.

    We just don’t have data of high enough quality to reach any conclusions. The satellite data are probably the best we have at the moment along with drift buoys in the ocean. What we are doing with the surface network is basically like announcing that time is drifting by taking a global average of people’s wristwatches and attempting to apply “corrections” that are based on how far off the watch was in the past and what the neighbors’ watches read.

  153. Philip_B
    Posted Jul 13, 2008 at 5:29 PM | Permalink

    DeWitt Payne, I was just rebutting the point above. By modern standards coal/steam trains are dirty and polluting. And you are right that unburned carbon and ash are produced as smoke. When I was a child in the 1950s in Britain, chimney sweeps were still common. I remember watching them sweep our chimney. Then ‘smokeless’ coal was mandated and chimneys needed to be swept much less often. At least that’s my recollection. My comment about steam trains was also based on my recollection from the UK, and I assume they would have been using the smokeless coal.

  154. Gerald Machnee
    Posted Jul 13, 2008 at 7:29 PM | Permalink

    Re #50 – Well at least you got a candid remark about the polar bears. On our trip last September I recall global warming being named re the bears.

  155. Real Richard Sharpe
    Posted Jul 13, 2008 at 8:14 PM | Permalink

    MPaul says:

    She then said, “I know that the fact that the Polar bear population is increasing dramatically is disappointing to some of you — but I’m just relating the facts”.

    Are you sure about that? The WWF has been exhorting us to donate money to help save the polar bears.

  156. Ferdinand Engelbeen
    Posted Jul 13, 2008 at 8:31 PM | Permalink

    Having a daughter living in Alaska and working as a helicopter pilot on the North Slope (we are now visiting here in Anchorage), I can report that she confirms the general trend: the polar bear population is growing, and ice melting was late this year. They have more and more work to prevent human-bear encounters…

  157. Andrew
    Posted Jul 13, 2008 at 8:43 PM | Permalink

    Steve – Um, I really don’t think that post was too policy-related; I was just warning Jon that he does himself no favors by boldly announcing his pro-regulatory stance for all the world to hear.

  158. BarryW
    Posted Jul 13, 2008 at 8:44 PM | Permalink

    As far as polar bears go they are playing the projection game. “In the future if this goes on polar bears will become endangered”. So present values are irrelevant. Also, I’m sure someone has made the case that the southern migration is because of a lack of ice to hunt on.

  159. Andrew
    Posted Jul 13, 2008 at 9:07 PM | Permalink

    And, I might add, Jon, there’s a school of thought that economic development creates wealthy societies that clean up their acts, and that Western countries’ success with pollution is due to prosperity, not regulation. Not endorsing, just pointing out.

  160. WA
    Posted Jul 14, 2008 at 12:24 AM | Permalink

    Basil # 143

Good work. That was another of Al Gore’s fallacies. To achieve it, he had to work hard mashing an 800-1000 year lag into buckets of just the right width, so that “with CO2” became “because of CO2.”

  161. John Finn
    Posted Jul 14, 2008 at 4:26 AM | Permalink

    Re: #54 (my post)

    Hadley’s June anomaly is 0.31 which is higher than GISS. As I suspected GISS/Hadley/MSU none of them are necessarily right nor wrong. They’re just different.

  162. cba
    Posted Jul 14, 2008 at 2:11 PM | Permalink

    DeWitt,

It looks like the problem is the narrowness of the broadening at the upper altitudes in the model. I’ve got a simple, crude solution to fix it but have not implemented it as of yet. I’m looking for a slightly better crude, simple solution for dealing with the actual line shape (Gaussian, Lorentzian or Voigt) curve, to get a better integrated output value while maintaining the necessary crudeness so as not to exceed the computing capacity (it already takes almost an hour to process through the line database to build up the spectral values for the 50 layer model). I think my approach works reasonably well for the lower atmosphere; however, when the line widths become less than around 1/4 of the wavelength bin width, the values start becoming overstated.
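A toy R sketch of that failure mode (illustrative numbers only, not the actual model code): a normalized Lorentzian line is integrated exactly over one spectral bin and compared with the midpoint-rule shortcut of taking the line-center value times the bin width. Once the half-width falls well below the bin width, the shortcut badly overstates the binned absorption.

    lorentz <- function(nu, nu0, gamma) (gamma / pi) / ((nu - nu0)^2 + gamma^2)
    exact_bin <- function(a, b, nu0, gamma)   # analytic integral of the line over [a, b]
      (atan((b - nu0) / gamma) - atan((a - nu0) / gamma)) / pi

    bin_width <- 1                            # say, 1 cm^-1 bins (assumed)
    nu0 <- 0.5                                # line centered in the bin
    for (gamma in c(0.5, 0.25, 0.1, 0.01)) {  # half-width shrinking with altitude
      exact  <- exact_bin(0, bin_width, nu0, gamma)
      approx <- lorentz(nu0, nu0, gamma) * bin_width
      cat(sprintf("gamma/width = %4.2f  exact = %5.3f  midpoint = %6.2f\n",
                  gamma / bin_width, exact, approx))
    }

At gamma = 1/4 of the bin width the midpoint value is already about 1.8 times the exact integral, and at gamma = 0.01 it is more than 30 times too large, consistent with the overstatement described above.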

  163. James Erlandson
    Posted Jul 14, 2008 at 2:38 PM | Permalink

    For serious R practitioners:
    UseR! The R user conference 2008.

    A major goal of the useR! conference is to bring users from various fields together and provide a platform for discussion and exchange of ideas: both in the formal framework of presentations as well as in the informal part of the conference in Dortmund’s famous beer pubs and restaurants.

  164. Posted Jul 14, 2008 at 3:26 PM | Permalink

    #155

    Can I point you to the following article;

“…the nature of the endangered bears was wryly observed by Senator Ted Stevens (R-Alaska) in response to the Department of the Interior’s decision to list polar bears as “threatened” under the Endangered Species Act.
    “I am disappointed and disturbed by the U.S. Fish and Wildlife Service’s decision to weaken the Endangered Species Act by listing the polar bear as threatened despite the steady increase in the species’ population. Scientists have observed that there are now three times as many polar bears in the Arctic than there were in the 1970s.
    “Never before has a species been listed as endangered or threatened while occupying its entire geographic range.

    http://community.adn.com/adn/node/123321

    WASHINGTON, D.C. –
    “This decision was made without any research demonstrating dangerously low population levels in polar bears, but rather on speculation regarding how ice levels will affect Arctic wildlife. Worse yet, today’s decision cannot and will not do anything to reverse sea ice decline.
    “Instead, this action by the Fish and Wildlife Service sets a dangerous precedent with far-reaching social and economic ramifications. It opens the door for many other Arctic species to be listed, which would severely hamper Alaska’s ability to tap its vast natural resources. Reinterpreting the Endangered Species Act in this way is an unequivocal victory for extreme environmentalists who want to block all development in our state.
    “The manipulation of the Endangered Species Act was highlighted by Kassie Siegel, the lawyer who wrote the legal petition for the Center for Biological Diversity. Ms. Siegel made no attempt to disguise her group’s intent when she said that the effort to list the polar bears was to `try to make the point that global warming is not some future threat’. This statement confirms that these fringe environmentalists are simply using the polar bears to advance their extreme agenda.”

I guess that, like many things, it comes down to politics/activism on one side and the desire to ‘exploit’ natural resources on the other. I make no comment as to which side is the more desirable, just that the situation with polar bears is not as portrayed in the media.

    Tony Brown

  165. george h.
    Posted Jul 14, 2008 at 4:21 PM | Permalink

    You just can’t make this stuff up:

    If global warming trends continue as projected by the UN Intergovernmental Panel on Climate Change in 2007, the United States can expect as much as a 30 percent growth in kidney stone disease in some of its driest areas, said the findings published in Monday’s Proceedings of the National Academy of Sciences.

    http://www.breitbart.com/article.php?id=080714210823.ys7a8mb8&show_article=1

  166. Sam Urbinto
    Posted Jul 14, 2008 at 4:39 PM | Permalink

    121 crosspatch:

I agree. Yes, everything is cleaner; we just have tighter standards and better measurement devices. (And bad memories at times, perhaps!) Still, some of everyone’s output makes it to the poles sooner or later, it seems. We’ll see as China and India et al get to the point where the pollution becomes unaffordable (or is otherwise no longer tolerated), I’d think. Who else will be joining the technology-population cycle in the next few decades might be something to wonder about.

  167. DeWitt Payne
    Posted Jul 14, 2008 at 4:41 PM | Permalink

    cba,

    I’ve been looking through chapter 10 (Broadband Fluxes and Heating Rates in the Cloud-Free Atmosphere) of Grant Petty’s A First Course in Atmospheric Radiation. I think he has all the math there you would need. It’s in usual textbook form, though, where much is left to the reader and it’s a little beyond me when I’m just skimming. Towards the end of the chapter, though, when he’s writing about longwave cooling, he points out that the contribution to longwave cooling in the lower troposphere from the water vapor continuum is substantial. It peaks at over 5 C/day at about 3 km. That’s a fair number of watts/m2. It’s also about 50% more than from the water vapor line spectrum alone. I don’t think you’re going to be able to do radiative balance without including continuum absorption/emission.
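As a rough back-of-the-envelope check (a sketch with assumed round numbers, not Petty’s calculation), a cooling rate converts to a flux divergence via dF/dz = rho * cp * dT/dt. In R:

    rho <- 0.9                    # air density near 3 km, kg/m^3 (assumed)
    cp  <- 1004                   # specific heat of air, J/(kg K)
    cooling <- 5 / 86400          # 5 C/day expressed in K/s
    dF_dz <- rho * cp * cooling   # flux divergence in W/m^3
    dF_dz * 1000                  # ~52 W/m^2 per km of atmosphere

So a 5 C/day cooling rate near 3 km corresponds to very roughly 50 W/m2 of net flux divergence per kilometre of atmosphere, a fair number of watts indeed.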

    I’ve been thinking about MODTRAN and why the emission is too high with the 1976 standard atmosphere, especially when you correct the emission to account for the cutoff on the shortwave end of the spectrum. I think the reason it’s too high is that the temperature of the surface is too high. 288 is the temperature of an isothermal gray body. That’s higher than the average temperature of a non-isothermal body. I’m going to take the measured (I think) longwave emission curve by latitude from Petty’s book and try to calculate surface temperatures in MODTRAN that will produce that emission and then average them. Then I’ll do the same for the shortwave insolation curve, i.e. a body with no meridional heat transfer.
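The isothermal-versus-average point follows from the convexity of T^4: a body with a spread of temperatures emits more than an isothermal body at the same mean temperature, so to match a given total emission the non-isothermal mean must be lower. A minimal R illustration (the two-zone split is invented for the example):

    sigma <- 5.67e-8                # Stefan-Boltzmann constant, W m^-2 K^-4
    sigma * 288^4                   # isothermal body at 288 K: ~390 W/m^2
    sigma * mean(c(258, 318)^4)     # same 288 K mean, +/- 30 K spread: ~416 W/m^2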

  168. DeWitt Payne
    Posted Jul 14, 2008 at 5:07 PM | Permalink

    Philip_B,

    I’m assuming that ‘smokeless’ coal is either anthracite (high carbon) coal or lower carbon coal that has undergone some pretreatment similar to the coking process to remove most of the volatiles. IIRC, the German chemical industry, and synthetic organic chemistry in general, got its start by trying to figure out things to do with coal tar from the coking process. It’s the unburned volatiles that form the tar that sticks to chimneys, IIRC. Peat, or buffalo chips for that matter, is low carbon and high volatile so the fire is very smoky. Peat smoke is good for flavoring whiskey, buffalo chip smoke has little going for it.

  169. Pat Keating
    Posted Jul 14, 2008 at 5:34 PM | Permalink

    168 DeWitt

    buffalo chip smoke has little going for it.

    That’s a very mono-cultural, non-PC assertion. My native American friends insist that it improves buffalo chops (not chips) enormously.

  170. DeWitt Payne
    Posted Jul 14, 2008 at 5:46 PM | Permalink

    Pat,

    …it improves buffalo chops (not chips) enormously.

Over raw, perhaps. But it’s all a matter of taste. I have my limit for peat smoke in scotch. Laphroaig is a bit too much for me.

  171. Pat Keating
    Posted Jul 14, 2008 at 7:44 PM | Permalink

Yes, some of them are very strong (were there buffalo up there in the islands?). However, I once got very drunk on Scotch at a party while in college, and since then I can hardly ever bring myself to drink it.

  172. Raven
    Posted Jul 14, 2008 at 8:27 PM | Permalink

    It appears the advocate/scientist disease is spreading into other fields:

    http://www.theglobeandmail.com/servlet/story/RTGAM.20080714.wcowent15/BNStory/specialComment/home

    Despite this fatal weakness, “there are very few other substantive areas of research where the debate is this one-sided,” Prof. Davies said in an interview. The same small set of people peer-review each other’s work, and even the drug journals are politicized. “It’s very difficult to get a contrarian opinion published. If you’re not on the side of supervised injection, you get marginalized.”

    In any event, researchers who want to analyze the data for themselves are out of luck, because the researchers refuse to share it. Steven Lehrer, who studies health economics at Queen’s University, says it’s frustrating. “They were reporting unbelievably large effects,” he says. And that makes it especially important for independent researchers to validate the results. “They don’t seem interested,” he says.

  173. Bob Koss
    Posted Jul 14, 2008 at 9:56 PM | Permalink

Horror of horrors! Now kidney stones are linked to global warming.

    http://www.breitbart.com/article.php?id=080714210823.ys7a8mb8&show_article=1

    If global warming trends continue as projected by the UN Intergovernmental Panel on Climate Change in 2007, the United States can expect as much as a 30 percent growth in kidney stone disease in some of its driest areas, said the findings published in Monday’s Proceedings of the National Academy of Sciences.

  174. crosspatch
    Posted Jul 15, 2008 at 1:13 AM | Permalink

    “Now kidneys stones are linked to global warming.”

    Once you come to grips with the fact that 50% of the population is below the median intelligence level, it all starts to make sense.

  175. STAFFAN LINDSTROEM
    Posted Jul 15, 2008 at 2:41 AM | Permalink

173 and 174, Koss and Cross… (You could have a show – 10% for me if you take me ad notam…) Is it also GW, or at least the notion of it, that makes 600 people, mostly men I gather, run from a village in Austria at roughly 800 m ASL, in rain and +13C, to the top of the Zugspitze at 2,967 or so m ASL, wearing only very short pants – the marathon type whose last grams of fat went decades ago??!! Well, ONLY two died, both Germans, aged 41 and 45, from hypothermia and lack of oxygen… I would say, from the wind and heavy wet snow! You can find pictures on the Bayerischer Rundfunk site.
And then, OTOH, we have the story of the Russian scientists on their ice island west of Novaya Zemlya last September… What a surprise that the floe didn’t make its way, like a good old Soviet icebreaker, to some melt-proof place north of Greenland…

  176. BarryW
    Posted Jul 15, 2008 at 5:48 AM | Permalink

    This just in. A new study shows that at the present rate everything will be blamed on AGW by 2012.

  177. Philip_B
    Posted Jul 15, 2008 at 6:26 AM | Permalink

DeWitt Payne, the smokeless coal came as uniform pellets, presumably made from powdered, treated coal.

  178. cba
    Posted Jul 15, 2008 at 6:49 AM | Permalink

    DeWitt,

I started looking at what to do in the code to come up with a correction. Basically, at the higher altitudes, approaching 50 km, the original code is improperly handling the power spectrum, giving lines as much as a few hundred times their proper width along with a substantial fraction of their peak height. I think it’s working properly at the lower levels, just not at the higher ones.
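The scale of the problem is easy to see from the pressure scaling of the Lorentz half-width, which falls roughly in proportion to pressure while the Doppler width does not. A small R sketch (all widths assumed for illustration; real values are line-specific):

    H <- 7                            # pressure scale height, km (assumed)
    z <- c(0, 10, 20, 30, 40, 50)     # altitude, km
    gamma_L <- 0.07 * exp(-z / H)     # Lorentz half-width, ~0.07 cm^-1 at the surface (assumed)
    gamma_D <- 0.001                  # representative Doppler half-width, cm^-1 (assumed)
    data.frame(z, gamma_L, crude_width = pmax(gamma_L, gamma_D))  # crude Voigt-width proxy

By 50 km the pressure-broadened width has shrunk by a factor of a thousand or so, so carrying anything like a sea-level width up there enormously overstates the line wings.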

I did see where someone was supposed to create an h2o continuum dataset for HITRAN and was to make it available when done – five years ago. I’ve not had a chance to search for it and so have not seen it yet. The only other alternative is to try to cobble some other existing thing together and incorporate it. There doesn’t seem to be a true first-principles treatment to work from, since the first principles themselves don’t seem to be fully known.

    There’s still the matter of all that shorter wavelength scattering that must be dealt with as well. I doubt it’s much but still – all that blue sky has to have some effect.

  179. DeWitt Payne
    Posted Jul 15, 2008 at 7:49 AM | Permalink

    cba,

    I think there’s a polynomial fit for the water vapor continuum somewhere. The absorption is related approximately to the square of the water vapor pressure, IIRC. The units would be something like cm atm^2/km.

A significant fraction of scattered light still makes it to the ground as diffuse light spread over the whole visible sky. Scattering is a continuum process as well. The problem with atmospheric scattering is that by the time you get to 0.4 μm, the simple Rayleigh approximation doesn’t hold because the optical depth is no longer much less than one. Transmittance of the whole atmosphere is about 0.7 at 0.4 μm, and the scattering optical depth is approximately proportional to wavelength^-4, according to the graph in Petty.
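A quick R sketch of those two numbers taken together (anchoring the λ^-4 scaling to an assumed τ = 0.36 at 0.4 μm, which reproduces the 0.7 transmittance quoted above, for a vertical path):

    lambda <- seq(0.3, 0.8, by = 0.1)    # wavelength in micrometres
    tau <- 0.36 * (0.4 / lambda)^4       # Rayleigh optical depth, tau(0.4 um) = 0.36 assumed
    data.frame(lambda, tau, transmittance = round(exp(-tau), 3))

exp(-0.36) is 0.70 at 0.4 μm; by 0.7 μm the optical depth is only about 0.04 and the direct-beam transmittance about 0.96, i.e. the scattering is heavily weighted toward the blue end.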

  180. David Holland
    Posted Jul 15, 2008 at 9:11 AM | Permalink

    I don’t know if this has been reported before but it puts things into perspective. The ink is barely dry on AR4 and they are already getting stuck into planning AR5. I hope Steve gets invited back.

  181. Posted Jul 15, 2008 at 9:37 AM | Permalink

    Re #167

    You might want to have a look at Clough & Iacono, JGR, vol. 100, 1995 and subsequent publications, it’s very readable.

  182. george h.
    Posted Jul 15, 2008 at 11:47 AM | Permalink

Not sure where to post this, but here is an excellent critique from APS Physics and Society in which Monckton methodically demolishes the IPCC’s notions of climate sensitivity. Might I suggest a new thread.

    Climate Sensitivity Reconsidered

    By Christopher Monckton of Brenchley

    Abstract

    The Intergovernmental Panel on Climate Change (IPCC, 2007) concluded that anthropogenic CO2 emissions probably caused more than half of the “global warming” of the past 50 years and would cause further rapid warming. However, global mean surface temperature has not risen since 1998 and may have fallen since late 2001. The present analysis suggests that the failure of the IPCC’s models to predict this and many other climatic phenomena arises from defects in its evaluation of the three factors whose product is climate sensitivity:

    1. Radiative forcing ΔF;
    2. The no-feedbacks climate sensitivity parameter κ; and
    3. The feedback multiplier ƒ.

    Some reasons why the IPCC’s estimates may be excessive and unsafe are explained. More importantly, the conclusion is that, perhaps, there is no “climate crisis”, and that currently-fashionable efforts by governments to reduce anthropogenic CO2 emissions are pointless, may be ill-conceived, and could even be harmful.

    http://www.aps.org/units/fps/newsletters/200807/monckton.cfm

  183. Follow the Money
    Posted Jul 15, 2008 at 1:30 PM | Permalink

    4 W/m^2 For Doubling, “Business as Usual”

    I found the source of the mysterious 4 W/m^2 for doubling of CO2, and the science behind it is nearly non-existent. Last year I wrote at CA about the source of the 2.5C increase in global warming with doubling of CO2 “science.” The source was a comment in the 1990 First Assessment Report in the volume titled, “Scientific Assessment of Climate Change – Report of Working Group I.” The text expressly chose 2.5C as a control value:

    Most scientists declined to give a single number, but for the purpose of illustrating the IPCC Scenarios, a value of 2.5C is considered the “best guess” in the light of current knowledge. [orig. bold]
    http://www.climateaudit.org/?p=2123 Start with post 192

    In subsequent IPCC Assessment Reports this 2.5C figure selected for model comparison purposes in AR1 transformed into a fact of “science” until IPCC AR4 [3?] which bumped it up to 3.0C.

    A few days ago Pat Frank wrote a thoughtful post about the lack of science for IPCC’s mysterious beliefs which spurred me to go back to AR1’s Scientific Assessment volume to find the source of 4 W/m^2. My previous experience told me this would be the likely place to find the root of this mystery of “post-normal science.” Section 3 of the Scientific Assessment is titled “Processes and Modeling.” Sub-section 3.3, p. 77 is “Radiative Feedback Mechanisms.” We find the genesis of the 4 W/m^2 belief in these words,

As discussed in Section 2, the radiative forcing of the surface-atmosphere system [delta]Q, is evaluated by holding all other climate parameters fixed, with G = 4 Wm^2 for an instantaneous doubling of atmospheric CO2.

    What immediately struck me is that I did not see any such thing earlier in Section 2. A little later at p. 78,

    The definition of radiative forcing requires some clarification. Strictly speaking, it is defined as the change in net downward radiative flux at the tropopause, so that for an instantaneous doubling of CO2 this is approximately 4 Wm^2 and constitutes the radiative heating of the surface-tropopause system.

The word “approximately” rings a warning bell. Figures in the context of CO2 modeling calculations too often seem to be nice round numbers or .5’s. I returned to Section 2 and was stumped. Nowhere in the text is 4 W/m^2, or a discussion of the same, to be found. But then I recalled how the Executive Summaries preceding each section of an IPCC report can be a wealth of obfuscation, and looked there. Bingo:

    4. Using the scenario A (“business-as-usual” case) of future emissions derived by IPCC WG3, calculations show the following forcing from pre-industrial values (and percentage contribution to total) by year 2025:

    CO2: 2.9 Wm^2 (63%); CH4: 0.7 Wm^2 (15%); N2O : 0.2 Wm^2 (4%); CFCs and HCFCs : 0.5 Wm^2 (11%); stratospheric H2O: 0.2 Wm^2 (5%)

    The total, 4.6 Wm^2, corresponds to an effective CO2 amount of more than double the pre-industrial value.

I’m not a scientist, but shouldn’t the amount attributed to CO2 be 2.9 W/m^2, not the 4.6 for total greenhouse gasses? Anyway, if you ask me why 4.6 could be rounded down to 4.0 in Section 3, I would reply “don’t be surprised.” I have encountered various odd roundings and guesstimates in IPCC reports, blatant to my non-expert eyes. I re-read the main text of Section 2 and again found no explanatory text to support Point 4 in the Section 2 Executive Summary. It is as if the text was once there but was excised for some reason, perhaps because it did not make the grade for the Scientific Assessment volume of AR1, while ghosts of its former presence remained in the Executive Summary and in the 4 W/m^2 assertion of Section 3. The figures in Point 4 (rounded only a little) are found in a chart at page 57, Table 2.7, which explains,

    Changes in radiative forcing in Wm^2 for the 4 policy scenarios. The change due to stratospheric water vapour is an indirect effect of changes in methane concentration (see text). All values are changes in forcing from 1765 concentrations.(orig. italics)

    Four scenarios are presented: “Business-as-Usual;” “Low Emissions;” “Control Policies;” and “Accelerated Policies.” At “SCENARIO A (Business-as-Usual)” on the line for “1765-2025” the numbers for the various gasses, without percentage comparisons found at the Executive Summary, are similar to those found in the Executive Summary. Their stated sum is not 4.6 but 4.59.

    So the genesis of 4 W/m^2 is a bald statement in Section 3 of AR1’s Scientific Assessment volume referring to draft explanatory text in Section 2 which did not make the final cut, but whose prior existence we can infer from Section 2’s Executive Summary.

Based on my reading of Sections 2 and 3, I aver the following. The “science” of 4 W/m^2 is not based on physics, laboratory tests of CO2, or the like. It is based on modeling assumptions and a selected forecast under a selected scenario. That is, according to the IPCC’s favored modelings as of 1990, if the world develops without the CO2 abatements imagined by the IPCC – the so-called “Business-as-Usual” scenario – CO2 concentration in the atmosphere by the year 2025 will be double that of 1765. Temperature increases since 1765 are wholly attributed to greenhouse gasses in these models; solar variability is discounted as a negligible influence in the Assessment’s text. The sum of radiative forcing in W/m^2 for greenhouse gasses is 4.59 (4.6 in the E.S.); for CO2 alone it is 2.88 (2.9 in the E.S.). The reason Section 3 says 4 W/m^2 is simply because it says so. Such is the genesis of 4 W/m^2.

    So the Scientific Assessment volume of AR1 (1990) is the source of both 2xCO2=2.5C and 4 W/m^2. AR1 is in three volumes, none of which is accessible online. I only have access to volume 1, the Scientific Assessment. Volume 3 titled “The IPCC Response Strategies – Report of Working Group III” would be where figures for the “future emissions derived by IPCC WG3” were derived. It is a rare volume, and the only one printed not by a government authority or major university press. See the bottom of the IPCC site:

    http://www.ipcc.ch/ipccreports/assessments-reports.htm

I have visited Climate Audit for a few years now. An amazing site, with amazing scientific minds commenting herein. I have noticed scientists here taking a certain “show me the science” stance towards the IPCC, expecting a response in accord with scientific ethical standards and customs.

The powers that be will not show you the science. They are engaged in what a warmer, without irony, named “post-normal science.” I would call it “gravy train science.” If you wish to debunk the IPCC, or improve the science, I suggest you start with the source, AR1. Write about it. Surely an explication of the source of 4 W/m^2 is worth a note in a History of Science-type publication? Show me I’m completely wrong here at CA, or improve my analysis of the 4 W/m^2 genesis; the latter, I feel, would be easy to do. I sense you will find complex internal contradictions within AR1 that are beyond my ability to express or adequately comprehend.

And if a reader wholly discounts my analysis of the genesis of 4 W/m^2 because it seems unbelievable that this tenet of AGW climate science could have such a flimsy basis, I reply, “Hey, look at the origin of 2xCO2=2.5C – it’s Business-as-Usual.”

Steve: There’s a little more to it than that. Ramanathan’s papers in the 1970s are what I believe to be the provenance of the 4 wm-2 rule of thumb, but in any other area, when you probe something, they say that they’ve “moved on”. IF Ramanathan’s is the only such derivation, then that is what I’d consider. But IPCC or someone needs to say so.

  184. Pat Keating
    Posted Jul 15, 2008 at 2:47 PM | Permalink

    178 cba
    Do you use SpectralCalc.com?

  185. Steve McIntyre
    Posted Jul 15, 2008 at 3:07 PM | Permalink

Ross McKitrick will be on the Michael Coren show tonight at 8 pm; it only broadcasts in Ontario and Alberta. Try http://www.ctstv.com/programs/index.asp?id=26&pt=1, but note that they don’t post the shows.

  186. Follow the Money
    Posted Jul 15, 2008 at 3:52 PM | Permalink

    Steve, re: 183

    The References for Sections 2 and 3 in the Scientific Assessment do not list any Ramanathan paper published earlier than 1985.

    Ramanathan’s 1979 “Zonal and seasonal radiative effects of a doubling of atmospheric carbon dioxide concentration” is inaccessible to me. Is that the one to which you refer?

  187. DeWitt Payne
    Posted Jul 15, 2008 at 4:42 PM | Permalink

    Phil,

Thanks for the reference. That’s actually the second paper in the series. For a paywalled journal, the $9.00 article price is fairly reasonable. So 265 W/m2 turns out to be the measured value for clear sky conditions.

    The abstract for part 1, Line-by-Line Calculations of Atmospheric Fluxes and Cooling Rates: Application to Water Vapor, is here and part 2, Line-by-line calculation of atmospheric fluxes and cooling rates 2. Application to carbon dioxide, ozone, methane, nitrous oxide and the halocarbons, is here. This paper, Application of infrared interferometer spectrometer clear sky spectral radiance to investigations of climate variability, looks interesting too. That’s enough for a start.

  188. DeWitt Payne
    Posted Jul 15, 2008 at 5:27 PM | Permalink

    cba,

Clough, one of the authors of the papers cited above, is the C in the CKD model of the water vapor continuum. There are interesting comments on validating GCMs by looking at calculated regional IR spectra in the conclusion section of the third paper above. I’m curious to find out if it’s been done. Also, we would know a whole lot more about what’s going on if they could ever develop a flyable spectrometer that can measure from 700 to 100 cm-1. That covers the really important part of the water vapor spectrum.

  189. Follow the Money
    Posted Jul 15, 2008 at 6:27 PM | Permalink

    The more I delve….

    Raval & Ramanathan, Nature, 1989, v. 342, p.759,

    “The greenhouse effect of doubling of CO2 is 4 W m^2 and that of human activities over the last century [footnote to other paper] is ~ 2 W m^2.”

    Thus, the cited paper on human activities is given a ” ~ ” but within the same sentence 4 W/m^2 is written without qualification.

Steve: Look at his articles online at AMS (in Journal of Climate’s predecessor journals) from the 1970s. That’s where the number comes from. It then gets passed on to IPCC 1990 (which is not a primary source). I’m a little sorry to have to explain this, since “climate scientists” don’t seem to know. It’s possible to show continuity through the 1980s in some of the 1980s reports. Is the derivation valid? No idea. Maybe they’ve “moved on”, maybe they haven’t. But it’s much better than Houghton’s absurdities.

  190. Pat Keating
    Posted Jul 15, 2008 at 7:08 PM | Permalink

    188 De Witt

    if they can ever develop a flyable spectrometer that can measure from 700 to 100 cm-1.

    See Palchetti et al, at

    http://www.atmos-chem-phys-discuss.netdiscuss.net/6/4061/2006/acpd-6-4061-2006-print.pdf

  191. kuhnkat
    Posted Jul 15, 2008 at 7:25 PM | Permalink

    cba,

    these do you any good??

    http://www.ace.uwaterloo.ca/publications/Gordon-waterHITRAN2007.pdf

    http://wejump.ifac.cnr.it/pages/papers/FIR-FTS/Evaluation_of_foreign-broadened_water_vapour_continuum_coefficients_from_emitted_spectral_radiance.pdf

  192. DeWitt Payne
    Posted Jul 15, 2008 at 9:01 PM | Permalink

    Pat,

    Your link is broken. Remove the .netdiscuss and it works (fixed link). They cite the two modeling papers by Iacono and Clough from 1992 and 1995 too. It looks like they might have a suitable prototype instrument, but the key sentence in the opening paragraph, IMO, is:

    No space mission that exploits the FIR for Earth’s observation has been made or selected for future operations

    You almost have to wonder if the people who control the budget really want to know. It’s so much easier to speculate on either side of the debate when you don’t have data.

  193. Pat Keating
    Posted Jul 15, 2008 at 9:27 PM | Permalink

    DeWitt
Just a typo; it works fine in my bookmarks. Clough and Iacono 1995 is pretty much a classic paper.

    Maybe they can fly their Euro spectrometer on a US mission. The only other data I’ve seen is the FTIR data shown in LeMarshal et al 2003, and elsewhere, but that only goes down to 600cm-1, not much good for the FIR water bands.

  194. DeWitt Payne
    Posted Jul 15, 2008 at 9:57 PM | Permalink

    So I’m looking at the Ramanathan and Coakley 1978 review article Climate Modeling Through Radiative-Convective Models and I get to Table 4 on page 15 and there it is. The first number in the table, radiative forcing at the bottom of the stratosphere for doubled CO2, -3.9 W/m2. So now I look at the notes and find the citation:

The flux changes and the radiative-convective model surface temperature changes were calculated by using the model developed by J. A. Coakley (unpublished manuscript, 1978).

    That helps a lot. Then at the end there are caveats about clouds and lapse rates which are apparently still unresolved. Even if you used a modern line-by-line radiative transfer model, you still have to make assumptions about water vapor mixing ratios, lapse rates and clouds to come up with a forcing. Then you have to make even more assumptions to convert the forcing to a temperature. By contrast, Iacono and Clough used radiosonde data to calculate spectra to compare to observed spectra.

  195. Posted Jul 15, 2008 at 11:30 PM | Permalink

    Re #192

    They cite the two modeling papers by Iacono and Clough from 1992 and 1995 too. It looks like they might have a suitable prototype instrument, but the key sentence in the opening paragraph, IMO, is:

    No space mission that exploits the FIR for Earth’s observation has been made or selected for future operations

    You almost have to wonder if the people who control the budget really want to know. It’s so much easier to speculate on either side of the debate when you don’t have data.

Indeed, which makes the cancellation and mothballing of the DSCOVR satellite project, which would have given us a few years of measurements out to 100 microns by now, all the more puzzling!

  196. Filippo Turturici
    Posted Jul 16, 2008 at 3:08 AM | Permalink

Folks, bad news for climate warmers: forecasts expect the weak-to-null Nino to end by this autumn, and a new Nina could develop before next spring.
That is to say, just 6 months after a Nina that was unexpectedly strong, another Nina could develop within another 6 months, without a real Nino phase in between.
Meanwhile, the PDO remains negative and the AMO is turning downward too. Coupled with a new negative ENSO phase, I will be very curious to see 2009 global temperatures. And remember, solar activity is still very low; whether or not it affects global temperatures, its influence today cannot be a warming one.

  197. Niels A Nielsen
    Posted Jul 16, 2008 at 4:18 AM | Permalink

Filippo: How have those short term forecasts done in the past?

  198. Filippo Turturici
    Posted Jul 16, 2008 at 5:05 AM | Permalink

Niels: in the recent past, well enough, I would say – they just underestimated the last Nina’s strength, having to correct it downward throughout 2007 ;)

  199. Geoff Sherrington
    Posted Jul 16, 2008 at 5:59 AM | Permalink

Occasionally I rudely interrupt the stream of logic and conversation with a piece like this. My apologies.

http://www.independent.co.uk/environment/green-living/no-words-necessary-the-cartoonists-tackle-climate-change-859017.html

  200. cba
    Posted Jul 16, 2008 at 6:17 AM | Permalink

    184 (Pat):

    no, I received permission last year to download the freebie version.

    188 (DeWitt):

I did download a paper or two a month back about CKD – probably one by Clough. I briefly went over it but have not yet had time to really read it. It sounded like there was some sort of dataset one had to incorporate to deal with it. I also came across a 2003 reference to an adaptation for HITRAN that was supposedly forthcoming, but I have not had the chance yet to sit down and search for more info on it.

The likely problems for a spectrometer like that are whether it could see much from inside the atmosphere, and how far into the atmosphere it could see.

    189 (ftm):

the 4 W/m^2 doubling is probably just a clear-sky calculation of a CO2 doubling from a zero- or one-dimensional model. It’s as crude a number as it gets. Even that isn’t of much consequence, as there are quite a few W/m^2 involved in maintaining our atmosphere 33 K above what it would otherwise average. It is a bit high, though, from what I’ve seen.

    199 (Kuhnkat):

    Tnx, I’ll take a look at them

  201. jae
    Posted Jul 16, 2008 at 10:39 AM | Permalink

    FTM, 183

    A few days ago Pat Frank wrote a thoughtful post about the lack of science for IPCC’s mysterious beliefs which spurred me to go back to AR1’s Scientific Assessment volume to find the source of 4 W/m^2. My previous experience told me this would be the likely place to find the root of this mystery of “post-normal science.” Section 3 of the Scientific Assessment is titled “Processes and Modeling.” Sub-section 3.3, p. 77 is “Radiative Feedback Mechanisms.” We find the genesis of the 4 W/m^2 belief in these words,

    There’s a summary of the derivation here.

1. Radiative forcing ΔF_CO2, where (C/C0) is the proportionate increase in CO2 concentration, is given by several formulae in IPCC (2001, 2007). The simplest, following Myhre (1998), is Eqn. (3):

ΔF_CO2 ≈ 5.35 ln(C/C0)  ⇒  ΔF_2xCO2 ≈ 5.35 ln 2 ≈ 3.708 W m^-2. (3)
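As a quick check of Eqn. (3) in R, and of what it would imply for the AR1 numbers discussed in #183 above (a sketch only – applying the 1998 formula to the 1990 figures is anachronistic, but instructive):

    dF <- function(ratio) 5.35 * log(ratio)  # forcing in W/m^2 for a concentration ratio C/C0
    dF(2)             # 3.708 W/m^2 for a doubling
    exp(2.9 / 5.35)   # CO2-only 2.9 W/m^2 implies C/C0 of about 1.7
    exp(4.6 / 5.35)   # all-GHG 4.6 W/m^2 implies an “effective” ratio of about 2.4

which is consistent with AR1’s statement that 4.6 W/m^2 “corresponds to an effective CO2 amount of more than double the pre-industrial value.”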

Steve: These “simplified expressions” have been discussed in detail in a past post. I suggest that you read those posts before spinning your wheels on this.

  202. DeWitt Payne
    Posted Jul 16, 2008 at 6:29 PM | Permalink

    Has there been a discussion of Hansen, et. al., Efficacy of climate forcings, 2005 w.r.t. radiative forcing? It uses Model E to calculate forcing using a 4 x 5 degree grid size rather than a 1D model. Was that paper cited in AR4? It seems to be the most up-to-date statement of forcing and how it’s calculated. It’s still somewhat far from an engineering study, but with the code for Model E being available and a lot of detail on the setup of the calculations, it seems like it might be a place to start than Ramanathan.

    I’ve posted these graphs from data in the paper above on another thread in rebuttal to the commonly held belief that radiative forcing is greater at the poles than the equator, but it’s easier to post them again than to try to search out the original.

[Graph: 2X CO2, instantaneous forcing at the tropopause]

[Graph: 2X CO2 average by latitude]

  203. DeWitt Payne
    Posted Jul 16, 2008 at 6:36 PM | Permalink

    cba,

    The problem likely to occur for a spectrometer like that is could they see much with it inside the atmosphere and could they see much into the atmosphere.

    Because the scale height of water vapor is 2 km compared to the scale height of 8 km for CO2, oxygen and nitrogen, the radiation observed in the 100 to 500 cm-1 range comes from quite deep in the atmosphere in clear sky conditions. The emission from water vapor peaks at 3km, so above that level absorption drops drastically. If there is cloud cover, all it will see is emission from the cloud top, but that’s interesting too.
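For an exponential profile, the fraction of a column remaining above altitude z is just exp(-z/H). A one-line R illustration using the scale heights above:

    frac_above <- function(z, H) exp(-z / H)
    frac_above(3, 2)   # water vapor, H ~ 2 km: ~22% of the column lies above 3 km
    frac_above(3, 8)   # CO2 etc., H ~ 8 km:   ~69% of the column lies above 3 km

which is why, looking down from above in the water vapor bands, one sees quite deep: relatively little of the absorber is left overhead.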

  204. DeWitt Payne
    Posted Jul 16, 2008 at 6:38 PM | Permalink

    …a better place to start than Ramanathan.

    proofread, proofread, proofread.

  205. Pat Keating
    Posted Jul 16, 2008 at 6:46 PM | Permalink

    202 DeWitt
    Why the trough along the equator?

  206. DeWitt Payne
    Posted Jul 16, 2008 at 7:43 PM | Permalink

    Pat,

    Good question. I’m betting it has something to do with the circulation patterns that cause the trade winds and affect humidity profiles, but I don’t know for sure.

  207. Philip_B
    Posted Jul 16, 2008 at 8:58 PM | Permalink

    Why the trough along the equator?

    At a guess, more cloud, more rain, more CO2 dissolved into water.

  208. Posted Jul 16, 2008 at 9:03 PM | Permalink

Note to Al Gore disciples: a major speech on climate change and the energy crisis will be given tomorrow at Constitution Hall in Washington DC. Some are questioning the timing, with some politicians running for cover at the thought that Al Gore will praise $4 gasoline. Remember, $4 is not a bad thing, just how fast the price rose… Al Gore Speech particulars

  209. Pat Keating
    Posted Jul 16, 2008 at 9:06 PM | Permalink

    207 Phil B
    I wouldn’t think that Model E includes rain removing CO2. I’m fairly sure that its predecessor Model II didn’t, and this is pretty secondary to other issues not covered properly in the models.

  210. MJW
    Posted Jul 17, 2008 at 2:31 AM | Permalink

    Though comments are closed on the thread dealing with the supposed link between AGW and kidney disease, I wanted to comment on the following:

    Eric Berger quotes Dr. Paul Epstein of Harvard Medical School as saying the new study is “an elegant piece of work.”

    Quoting what I said last December on the AccuWeather blog:

Just in case the name of his organization, the “Center for Health and the Global Environment at Harvard Medical School,” didn’t give it away, Dr. Paul R. Epstein is hardly some random disease expert commenting on the possible effects of climate on health. He’s pretty much the go-to guy when it comes to AGW health concerns (some might say alarmism). He’s served as an editor of climate change papers for the Union of Concerned Scientists and is a member of the Physicians for Social Responsibility — both advocacy groups with strong stands on AGW. Dr. Epstein doesn’t confine himself to blaming global warming for causing illness; he also blames it for lots of other ills. For example, in January 2004 he wrote, “It seemed incongruous when former Vice President Al Gore gave a speech on global warming on a bitterly cold day in New York City this month. But in fact it was an appropriate topic: New Yorkers may be able to blame the city’s current cold spell — the most severe in nearly a decade — on global warming.”

  211. PaulM
    Posted Jul 17, 2008 at 11:26 AM | Permalink

    Watts-up-with-that has a link to an
    APS forum editor article acknowledging the considerable presence of skeptics. This could be really significant as the APS is a very important organization.

  212. Pat Keating
    Posted Jul 17, 2008 at 11:54 AM | Permalink

    211
Physicists are generally pretty good at distinguishing between good science and junk science. Just think of all the perpetual motion machines they have had to review over the years.

  213. jae
    Posted Jul 17, 2008 at 1:13 PM | Permalink

    Corev notes on the BB:

    The American Physical Society, an organization representing nearly 50,000 physicists, has reversed its stance on climate change and is now proclaiming that many of its members disbelieve in human-induced global warming.

  214. jae
    Posted Jul 17, 2008 at 1:14 PM | Permalink

Oops, sorry, I see PaulM already linked to the APS stuff.

  215. Follow the Money
    Posted Jul 17, 2008 at 4:14 PM | Permalink

I like the editor’s note; it adds that, whether or not significant AGW is true, there are plenty of other reasons to worry about fossil fuels.

    Following is a link to the pro-AGW article in the same issue.

    http://www.aps.org/units/fps/newsletters/200807/hafemeister.cfm

    It is our belief that “theory leads experiment” on climate change because all well-accepted atmospheric models predict a temperature rise.

    Interesting.

  216. Follow the Money
    Posted Jul 17, 2008 at 4:30 PM | Permalink

    jae, others, help me with this one.

    From the APS pro-AGW paper:

Solar Variations. We might expect solar variations of 0.2% are possible since that is twice the present 11-year solar variation. The 0.2% variation gives a surface temperature variation of

This begins the part that discounts solar influences on climate change. Is the inference to be taken that historical solar variation can reasonably be bounded by comparison with the current cycle?

  217. Dishman
    Posted Jul 17, 2008 at 5:04 PM | Permalink

Solar Variations. We might expect solar variations of 0.2% are possible since that is twice the present 11-year solar variation. The 0.2% variation gives a surface temperature variation of

    That’s a rather cursory dismissal of solar variation compared to other factors. I’m not sure where, but I’ve seen recent papers saying that up to half the GISS reported anomaly correlates to solar cycles.

    It’s worth noting also that while TSI doesn’t vary all that much, the UV component varies tremendously, roughly +/- 10% at 145nm during SC 22 and 23. It may be higher during other cycles.

  218. DeWitt Payne
    Posted Jul 17, 2008 at 5:41 PM | Permalink

    The solar variation thing has been discussed nearly to death on the various Svalgaard threads (up to #8 at this writing). The flux density of the solar spectrum at 145 nm is quite low so even a large variation doesn’t amount to much. It’s also absorbed very high up in the atmosphere (oxygen absorbs very strongly at wavelengths shorter than 200 nm). This amounts to pure speculation with little to no data. With so many real gaping holes in the current AGW dogma it makes little sense to trumpet this kind of thing. It makes it very easy to dismiss all AGW skepticism as unscientific.

  219. jae
    Posted Jul 17, 2008 at 5:53 PM | Permalink

    I agree with DeWitt. Read the Svalgaard threads; IMO he has demonstrated quite well that there are no known solar variations that consistently explain significant temperature variations (yet).

  220. Philip_B
    Posted Jul 17, 2008 at 6:30 PM | Permalink

    I noticed that there is a very deep low pressure system in the Southern Ocean. I am unable to read the central pressure from the BOM Chart, nor could I find a better chart, but counting isobars (zoom up the image a lot), it looks like the central pressure is at super cyclone levels – 920mb. BTW, I have no idea if this is unusual or not, but I’d be interested in hearing from someone who does.

  221. Orson
    Posted Jul 17, 2008 at 11:01 PM | Permalink

    I managed to make Roy Spencer’s presentation today at CIRES, University of Colorado at Boulder. (cf #79, Jim Arnt) He elaborated on his work since last year on climate sensitivity, and it is now in press at Journal of Climate.

He comes to a 0.25 C per century delta-T estimate. Mind you, this is derived from only six years of data; clearly, more and better data will yield more and better representations of reality. What stands out starkly is that in Spencer’s work we have the first realistic modeling of the global climate system. As I commented during Q&A, “there is lots left to explore.” Dr Spencer agreed.

    He pitted his empirically derived model against selected IPCC models. None of the latter stood up well.

Two points of humor: he named (something important – I’ll check notes later) “the Hansen effect.” The “smoking gun” in his title was a tweak at Hansen for his infamously titled paper wherein he projected catastrophe.

He asked whether anyone from (nearby) NCAR’s climate modeling program was in attendance. No voices or hands went up. He quipped something about some people having a problem with reality….

His line of thought conveyed how a meteorologist thinks practically about climate and its subtleties, as opposed to most modelers, typically trained in math (Connolley) or physics (astrophysics, in Hansen’s case) or other fields, who think more categorically, unable to see the tenuous connections in brute nature that meteorologists find only natural.

For instance, the modelers assume the climate system is naturally in equilibrium unless disturbed from the outside, as by mankind. His satellite evidence shows how climate trends tend to persist, on up to decadal timescales, because of internal forcings. But climate never actually equilibrates as these modelers presuppose.

In just the last days or weeks, he has integrated insights on ENSO and other known oceanic circulatory phenomena as climatic drivers – material on natural climate drivers he has been pulling together since January (cf. Anthony Watts’ web site).

The BIG point is that internal climate variability has heretofore been misidentified. His model was matched against Forster (et al., 2006) and improved on those results, indicating that his own are veridical.

    His new insight is that ocean PDO-neg and PDO-pos effects are likely driving a more dominant natural climate effect, and his results are unlikely to be coincidental. Forster agreed.

It was an almost full house, maybe 70 (out of 100) seats filled. He said his PowerPoint presentation should be up at Roger Pielke’s (Sr?) web site now.

(Too busy to check details right now – this is all from memory after work – but THIS was full of interest for people here.) I’ll try to fill in gaps later, and see the presentation myself to refresh my notes and memory.

    There were also two interesting working elements on display here. First, the fact that he’s doing this all on Excel spreadsheets, not the multi-million dollar super-computers with teams of well-paid researchers like the modelers. Second, when someone asked if he had communicated with Bill Gray, Spencer said he had not seen Bill in ten years time, but bumped into him while registering at a conference awhile ago. Dr Gray got down on his knees and reportedly said “I love you!” “Kind of embarrassing,” said Spencer smirkingly, admitting that Gray is very passionate about observationally driven climate science. NOT what’s become of it these days.

Again, see the presentation at Pielke’s web site for Spencer’s compelling conclusions and their implications for climate modeling. And again, this work opens up a lot more work to do to confirm his teasing inferences.

    Spencer admitted to having engaged in the kind of “handwaving” often criticized at CA. But now there are increasing amounts of hard empirical evidence to back up such previously idle claims. Quite fascinating. Could this be a turning point in the theory versus observational debate that has characterized the field since the demise of the Hockey Stick?

  222. crosspatch
    Posted Jul 17, 2008 at 11:39 PM | Permalink

    “I am unable to read the central pressure from the BOM Chart”

    If you select “Printable B&W” under the map it will load a PDF. The PDF reader will allow you to zoom. Looks like a central pressure of 928 millibars. Katrina was 902. Still, a healthy storm.

  223. Philip_B
    Posted Jul 18, 2008 at 12:24 AM | Permalink

Thanks, crosspatch. It appeared even more intense 24 hours ago. Unfortunately, the archived image is even less readable and there is no pdf option on the archive. It looks like it was 91?. Note that this storm is just off the coast of Antarctica.

  224. Dishman
    Posted Jul 18, 2008 at 5:41 AM | Permalink

    In my previous post here, I attempted to address concerns relating to model level of detail. Where the models include feedback terms for CO2, they do not appear to contain feedback terms for TSI.

    I’ll take another run at the Hafemeister and Schwartz paper.

I’ll start by writing the function for temperature in the following form:
T = k1 * Fco2 + k2 * Fsv + k3 * Flu + Fnoise
where
T is the measured temperature
k1, k2, k3 are feedback coefficients
Fco2 is the forcing function from CO2, from IPCC ~1 K/doubling
Fsv is the forcing function from solar variation by direct heating
Flu is the forcing function from land usage, which I won’t address
Fnoise is the sum of weather variability, oscillators and functions not otherwise covered

    The variation of Fsv is not necessarily the same as the variation of TSI, though it might be. I don’t have enough evidence to make a definitive statement.

    The IPCC posits that the value of k1 is approximately 3.

    Hafemeister and Schwartz implicitly assert that k2 = 1. Allowing for 0.2% variation in TSI, they get a T variation of 0.14 K.

    I have a problem with the implicit assumption that k1 k2.

    It’s my understanding that the largest component of k1 relates to water vapor. The water vapor equations do not contain any terms for the source of the temperature change, only the amount. Therefore, k1wv = k2wv. The same logic may apply to other components of k1 and k2.

    The dendroclimatology data appears to set an upper bound on k2

  225. Dishman
    Posted Jul 18, 2008 at 6:04 AM | Permalink

    ahhh.. some of my text got taken as html…

    I have a problem with the implicit assumption that k1 != k2.

    It’s my understanding that the largest component of k1 relates to water vapor. The water vapor equations do not contain any terms for the source of the temperature change, only the amount. Therefore, k1(water vapor) = k2(water vapor). The same logic may apply to other components of k1 and k2.

    The dendroclimatology data appears to set an upper bound on k2 of 1. Unless there are some other terms in k1 which are separate from k2, that sets an upper bound on k1 of 1.
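A minimal R sketch of this toy attribution model (all numbers invented for illustration, nothing fitted):

    temp_anom <- function(Fco2, Fsv, Flu, Fnoise, k1, k2, k3)
      k1 * Fco2 + k2 * Fsv + k3 * Flu + Fnoise

    # assumed no-feedback contributions, in K
    Fco2 <- 0.5; Fsv <- 0.2; Flu <- 0.1; Fnoise <- 0.05
    temp_anom(Fco2, Fsv, Flu, Fnoise, k1 = 3, k2 = 1, k3 = 1)  # IPCC-style k1 >> k2: 1.85 K
    temp_anom(Fco2, Fsv, Flu, Fnoise, k1 = 1, k2 = 1, k3 = 1)  # k1 = k2 = 1: 0.85 K

The point of the upper-bound argument is visible immediately: if k1 must equal k2 (because the water vapor feedback doesn’t care where the warming came from), then a bound on k2 is automatically a bound on k1.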

  226. bernie
    Posted Jul 18, 2008 at 8:46 AM | Permalink

Apparently the validity of the IPCC-endorsed CO2 sensitivity parameter is getting serious and skeptical attention at the American Physical Society. Perhaps Steve will finally get an answer to his request for a definitive derivation of this key number.

  227. bender
    Posted Jul 18, 2008 at 9:08 AM | Permalink

    #226 careful, bernie. Did you read the discussion below the article?

  228. henry
    Posted Jul 18, 2008 at 10:05 AM | Permalink

    The following just in:

Starbucks is detailing all stores slated for closure. California will now lose 88 stores with two each in Los Angeles and San Francisco, and 10 in San Diego.

Florida will lose 59 stores, including three each in Tampa and Palm Beach Gardens. Louisiana will lose 13 stores, nine of them in Baton Rouge.

http://i.usatoday.net/money/industries/food/starbucks-list.pdf

    This is going to make updating those proxies even harder…

  229. Pat Keating
    Posted Jul 18, 2008 at 10:07 AM | Permalink

    226 Bernie
    I wouldn’t hold your breath.

The ‘bare’ sensitivity (without feedbacks) can be calculated fairly well (about 1.0-1.2 C per doubling), but no one knows whether the feedback is even positive or negative, never mind how large it is.

    My bet is that the feedback is negative for modern levels of humidity, but may be positive for very low humidity, such as presently in the polar regions and historically in lower latitudes during the Ice Ages.
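The ‘bare’ number follows from differentiating the Stefan-Boltzmann law, F = σT^4, so dT = dF / (4σT^3). A one-liner in R (using the canonical values as assumptions):

    sigma <- 5.67e-8           # Stefan-Boltzmann constant, W m^-2 K^-4
    Te <- 255                  # effective emitting temperature, K (assumed)
    dF <- 3.7                  # canonical 2xCO2 forcing, W/m^2 (assumed)
    dF / (4 * sigma * Te^3)    # ~1.0 K per doubling, before any feedbacks

This simple estimate lands at the bottom of the 1.0-1.2 C range quoted above; more detailed radiative calculations give slightly larger no-feedback values.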

  230. Pat Keating
    Posted Jul 18, 2008 at 10:12 AM | Permalink

    227 bender

    A good point, but I wouldn’t put much weight on it. The APS is just trying to maintain its objectivity, and the Physics and Society forum is really the only part of APS that cares about global warming, I think.

  231. Bob Meyer
    Posted Jul 18, 2008 at 10:22 AM | Permalink

    The American Physical Society’s publication “Physics and Society” published an article by Monckton but prefaced it with

    “The following article has not undergone any scientific peer review. Its conclusions are in disagreement with the overwhelming opinion of the world scientific community. The Council of the American Physical Society disagrees with this article’s conclusions.”

    Monckton claimed in a radio interview that the APS went over his submission with a fine tooth comb before they published it. I suspect that there will be a future debate on what constitutes “peer review”.

    The opposing view was entitled “A Tutorial on the Basic Physics of Climate Change” suggesting that Monckton’s article was devoid of even the “basic physics”.

    I don’t think that things have changed as much as many of us would like to believe. This “debate” could easily end with an OJ type verdict. On the bright side, at least I’ll get to read some interesting articles.

  232. Posted Jul 18, 2008 at 11:24 AM | Permalink

Noel Sheppard at Newsbusters.org unloads on RealClimate, and on Gavin Schmidt in particular: Sheppard unloads on RealClimate.

    Last Saturday, one of the nation’s leading climate alarmists — a government employee with a history of attacking people that don’t agree with his views on anthropogenic global warming — wrote rather disparagingly about a somewhat satirical NewsBusters piece.

    Despite claiming he typically doesn’t comment on things “written about climate change in the more excitable parts of [sic] web,” NASA’s Gavin Schmidt took time out of his busy Saturday schedule to respond to something he described as “probably the most boneheaded article that I have seen in ages.”

  233. bernie
    Posted Jul 18, 2008 at 12:16 PM | Permalink

    My point in pointing to the Physics and Society piece was simply that the editor emphasized the point that the net impact of CO2 emissions was still an issue of debate for a large number of members:

    With this issue of Physics & Society, we kick off a debate concerning one of the main conclusions of the International Panel on Climate Change (IPCC), the UN body which, together
    with Al Gore, recently won the Nobel Prize for its work concerning climate change research. There is a considerable presence within the scientific community of people who do not agree with the IPCC conclusion that anthropogenic CO2 emissions are very probably likely to be primarily responsible for the global warming that has occurred since the Industrial SchroederRevolution. Since the correctness or fallacy of that conclusion
    has immense implications for public policy and for the future of the biosphere, we thought it appropriate to present a debate within the pages of P&S concerning that conclusion.

    The editor closed with a request for the discussion to begin. My conjecture is that the first principle derivation that Steve has been looking for, or a reference thereto, should emerge from this debate.

    I have not read these articles in depth but I was somewhat surprised by this statement in the opening paragraphs of David Hafemeister and Peter Schwartz’s article:

The naturally occurring greenhouse gases (present before industrialization) cause the earth to be 33 °C warmer than if there was no infrared trapping by the atmosphere. One can attribute 21 °C of that warming to the IR trapping of water vapor, 7 °C to CO2 and 5 °C to other gases.

    Is this correct?

  234. Follow the Money
    Posted Jul 18, 2008 at 12:37 PM | Permalink

    Monckton claimed in a radio interview that the APS went over his submission with a fine tooth comb before they published it. I suspect that there will be a future debate on what constitutes “peer review”.

APS could use some IT review, with any comb. The dodgy word and line spacing, among other things, make the article look like an amateurish conversion from Word to pdf to ?

  235. jae
    Posted Jul 18, 2008 at 2:45 PM | Permalink

    Steve Mc: the link to the Idsos on the blogroll appears not to be working.

  236. Jedwards
    Posted Jul 18, 2008 at 2:48 PM | Permalink

    So now the question: Did the Hafemeister & Schwartz paper undergo “peer review”? What percentage of the “world scientific community” agrees or disagrees with either the Monckton or Hafemeister & Schwartz papers? And how was this consensus polled? And what particular sections of the Monckton paper does the “Council of the APS” specifically disagree with? Actually I think in a kneejerk fashion APS may have just stepped in some deep “you know what”.

  237. bernie
    Posted Jul 18, 2008 at 3:13 PM | Permalink

Somehow I think Hafemeister and Schwartz may have oversimplified the situation, given what appears to happen on the moon.

  238. jae
    Posted Jul 18, 2008 at 4:55 PM | Permalink

    236, Jedwards:

    So now the question: Did the Hafemeister & Schwartz paper undergo “peer review”?

    It doesn’t look like it to me, since it has this absurd statement:

    Put a blanket over a light bulb, and you will have a fire. For the full power of the light bulb to pass through the blanket, the inner temperature must rise considerably. The atmosphere is not a mere thermal resistor, but the analogy is illuminating.

  239. bernie
    Posted Jul 18, 2008 at 7:07 PM | Permalink

Perhaps the example says more about predispositions than about the science.

  240. Boris
    Posted Jul 19, 2008 at 5:58 AM | Permalink

    232:

    I must remember the next time I say something stupid to claim it was satire.

  241. cheeky
    Posted Jul 19, 2008 at 6:01 AM | Permalink

    “Open Mind”
    “Real Climate”
    “Fair and Balanced”

    Am I discerning a theme here?

  242. KevinUK
    Posted Jul 19, 2008 at 9:40 AM | Permalink

Boris, have you read the paper?

It is a re-examination of the factors used by the IPCC to derive the net effect of doubled CO2. What is patently clear from the content of the paper is that, but for the claimed (nearly all positive) feedbacks, by far the largest of which is water vapour, there is no possibility of a climate crisis, and in particular no need to ‘act now’.

Nature continues to demonstrate that she, and not man, is in control of the climate. The only people who seem to think that man is determining our current climate are those who fail to look at what has actually happened since Hansen’s ‘we must act now to save the planet’ 1988 testimony, and who prefer instead to believe the predictions of computer models ‘tuned’ to show that there is a climate crisis (when there isn’t).

    KevinUK

    Steve: As I’ve said on many occasions, my views are not the same as Kevin’s. I don’t think that the issue has been laid to rest nor do I discount the fact that serious scientists believe that there is an issue.

  243. DeWitt Payne
    Posted Jul 19, 2008 at 12:46 PM | Permalink

    Here’s some depressing reading (EPA draft ANPRM on Regulating Greenhouse Gas Emissions under the Clean Air Act) that probably more than borders on policy rather than science, but there is useful data on total emissions and what it will take to make a dent in them. Note, this is the leaked draft before modifications. This isn’t the place to discuss the merits of this proposal, but everyone on both sides should be aware of the stakes at issue.

  244. Raven
    Posted Jul 19, 2008 at 12:50 PM | Permalink

    An interesting article on the web and science:

    http://www.economist.com/science/displayStory.cfm?source=hptextfeature&story_id=11745514

    But this comment on the story caught my eye:

    Online indexing isn’t a new phenomenon. Medline has been around for over 20 years and has been searchable free on the internet since 1995 (I think?). So any change in citation use isn’t directly related to the use of the internet.

    However, a change could be explained by the medium in which the internet is searched. Google uses proprietary means to determine relevancy, but one of the criteria is how popular a citation is. By using Google, instead of a traditional citation database, the popular stuff floats to the top. The narrow definition of popular continues as more and more people choose to use Google, rather than traditional databases, clicking on the links and reinforcing the small number of popular articles.

    http://www.economist.com/science/displayStory.cfm?source=hptextfeature&story_id=11745514&mode=comment&intent=readBottom

  245. Posted Jul 19, 2008 at 11:55 PM | Permalink

    Steve and all colleagues,

    Have you read, at APS, Lord Monckton’s article and the notice the Editors added two days later? You can read it here.

  246. Jon
    Posted Jul 20, 2008 at 12:08 AM | Permalink

    And remember kids, #42 and the link are credible. Lonnie Thompson, Michael Mann, James Hansen, et al. are not.

    It’s a big looking glass, but quite easy to fall through.

  247. kim
    Posted Jul 20, 2008 at 1:04 AM | Permalink

    37 (Steve) I gotta go with Pat #36 on this one. Something even more durable than tenure is reputation.
    ================================================

  248. D. Patterson
    Posted Jul 20, 2008 at 1:40 AM | Permalink

    45 kim says:

    July 20th, 2008 at 1:04 am
    37 (Steve) I gotta go with Pat #36 on this one. Something even more durable than tenure is reputation.
    ================================================

    I can think of more than a few whose tenure lasted longer than their good reputations….(s)

  249. Hans Erren
    Posted Jul 20, 2008 at 6:21 AM | Permalink

    Here is a link to the Monckton response, including a list of all the reviewers’ criticisms and subsequent edits:

    http://www.webcommentary.com/aps.htm

  250. RomanM
    Posted Jul 20, 2008 at 6:38 AM | Permalink

    #35 Pat Keating
    I am not sure what you were reading. At http://www.aps.org/policy/statements/07_1.cfm , the APS National Policy Statement on Climate Change states:

    Emissions of greenhouse gases from human activities are changing the atmosphere in ways that affect the Earth’s climate. Greenhouse gases include carbon dioxide as well as methane, nitrous oxide and other gases. They are emitted from fossil fuel combustion and a range of industrial and agricultural processes.

    The evidence is incontrovertible: Global warming is occurring. If no mitigating actions are taken, significant disruptions in the Earth’s physical and ecological systems, social systems, security and human health are likely to occur. We must reduce emissions of greenhouse gases beginning now.

    Because the complexity of the climate makes accurate prediction difficult, the APS urges an enhanced effort to understand the effects of human activity on the Earth’s climate, and to provide the technological options for meeting the climate challenge in the near and longer terms. The APS also urges governments, universities, national laboratories and its membership to support policies and actions that will reduce the emission of greenhouse gases.

    Does this constitute a “mild” position? The middle paragraph strikes me as stating that the science is settled and the debate is over and the last paragraph says “get on the bandwagon”!

  251. Pat Keating
    Posted Jul 20, 2008 at 7:37 AM | Permalink

    48 Roman
    I was referring to the piece you first quoted. I agree that this one is most definitely in the alarmist camp. I like to believe that more evidence against this alarmist viewpoint has accumulated in the last year, since that was written (probably by Robert Park).

  252. Philip_B
    Posted Jul 20, 2008 at 8:44 AM | Permalink

    Does this constitute a “mild” position? The middle paragraph strikes me as stating that the science is settled

    What struck me about this statement is that it doesn’t attribute GW to increased GHG emissions. You could argue it’s assumed, but it is still a curious omission. In fact, you could interpret the statement to mean that, while we cannot attribute the warming to date to GHG emissions, the risks to future climate from increasing GHG levels mean we must reduce emissions.

  253. RomanM
    Posted Jul 20, 2008 at 10:14 AM | Permalink

    What about the first sentence of the quote? It appears to me that the implied meaning of the text is pretty clear. It was probably written by a committee. ;)

  254. Philip_B
    Posted Jul 20, 2008 at 5:36 PM | Permalink

    RomanM, it looks to me as though a statement saying GW is due to AGHG has been removed. The gap in the logic struck me immediately, as I suspect it would anyone with a science background.

    I’d speculate that enough people objected to the direct linkage and a statement was removed. The remaining statements are weaker, and typical of what happens when some people on a committee want a key statement removed and negotiate the secondary issues away in order to achieve their primary objective (I speak from experience).

    And concerning the first sentence: it refers to one thing – ‘affects the climate’ – while the second sentence refers to another – ‘global warming’ – with nothing to connect them. Scientists don’t write like this. By omission, they are saying we cannot say GW is due to emissions of greenhouse gases from human activities.

    This statement reminds me of the IPCC reports, deliberately crafted to avoid statements scientists would object to, while giving the impression of conclusiveness to a non-scientific audience.

  255. DaleC
    Posted Jul 21, 2008 at 4:18 AM | Permalink

    An incarnation of MBH is alive and well (Chucky…please go away) at

    http://www.climatechange.gov.au/science/faq/question2.html

    The page says last modified 11Sep2007.

  256. RomanM
    Posted Jul 21, 2008 at 5:15 AM | Permalink

    #254 Philip_B

    Point taken. You are correct in pointing out the disconnect between the first and second paragraph. It definitely sounds to me like the work of a committee.

  257. Bob S
    Posted Jul 21, 2008 at 7:17 PM | Permalink

    Orson (221):

    Any idea why in the beginning of Spencer’s presentation he uses an ocean mixed layer depth of 50 m and that at the end he uses a mixed-layer depth of 1000 m? Was there any discussion of this at the presentation?

  258. henry
    Posted Jul 21, 2008 at 8:37 PM | Permalink

    Meanwhile, over at “Open Mind”, Tamino’s doing what he calls “jury duty” – he’s reviewing a paper.

    I liked his answer to a poster, as follows:

    Question: Your response brings me to an interesting question…. what actually delineates a “scientist” from a “non-scientist”?

    I’ve had the pleasure of reviewing papers for submission in my field before, but I really wouldn’t qualify myself as a scientist per se.

    I do statistics for a living, but wouldn’t call myself a statistician either.

    Is there a hard-and-fast rule for it? Can’t be a PhD or anything like that, or else Faraday would *still* be getting shafted :-) Although I guess he could be termed an “experimentalist”.

    Thoughts?

    Response: There’s no hard-and-fast rule, and no “perfect” answer. I’d say a scientist is someone who *does science* — so there are a lot of Ph.D.s who don’t make the cut! And there are many who aren’t degreed, don’t work in an institution, but in my opinion are definitely scientists because they’re doing it. In fact, in astronomy there’s a vast number of amateurs who do it only for the love of it (they don’t get paid) but make very important contributions to science.

    If someone *claims* to be a scientist but I have reason to doubt, I’ll check the peer-reviewed literature for publications as a primary indicator. But it’s not infallible.

    Sound familiar?

  259. gb
    Posted Jul 22, 2008 at 12:58 AM | Permalink

    # 257:

    Spencer is using a mixing layer depth of 1000 m? That’s complete nonsense. If he uses that then you know he is trying to mislead the public.

  260. Orson
    Posted Jul 22, 2008 at 3:27 AM | Permalink

    Bob S (#257)-

    I believe Spencer said he was limited to available data in answering different, respective questions. (Or else he misspoke.)

    In #221 I utterly missed the fact that Spencer was connecting satellite data (±60 degrees) to cloud formation and alternating oceanic regimes. A trifecta of empirical evidence.

    (SORRY. I have been too busy moving to re-read my notes.)

    -Orson
    PS I noticed that the “advertised PPP” was not up at Pielke’s blog, but it is NOW.

  261. Orson
    Posted Jul 22, 2008 at 3:33 AM | Permalink

    Damn

    PS I noticed that the “advertised PPP” was not up at Pielke’s blog, but it is NOW. No it is not. Anyone know better where it might be?
    (I have queried the webmaster.)

  262. Posted Jul 22, 2008 at 12:12 PM | Permalink

    Criminy. This Bruno Latour? Hard bait for the likes of me to resist. You suggest:

    “you will realize that the proponents of AGW have mostly succeeded in garnering the support of scientists of all fields by “translating” the question. So it’s not only if AGW is right or wrong, it’s whether scientists should be trusted or not. Any questioning of AGW is “translated” into a questioning of science’s legitimacy in the public eye. Doing that, you ensure that the entire scientific body has joined, and reinforced, your “network”.

    This is true only to the extent that either the science in question is illegitimate, or to the extent that science itself is illegitimate, in its claim to be an instrument for revealing objectively true statements about the world. I really doubt that most of such readers of this site as understand the deconstructionist roots of your argument would have much sympathy for it. You have translated the question from being about the legitimacy of science to being about the legitimacy of scientists. One of the actually interesting things about science is that increasingly effective results emerge from consistently flawed imperfect people working under conditions that promote certain habits of mind.

    The question asked by CA is primarily whether or not climate science falls under this category. That most scientific bodies support the small climate science community in such a claim does constitute legitimate evidence in its favor.

    If you believe that science is a social construct you have a lot of explaining to do about how my ideas are glowing on phosphors in front of your nose. Did all those intervening electric impulses sign a social contract? If faith in science declines, will the internet go away?

  263. Stan Palmer
    Posted Jul 22, 2008 at 12:55 PM | Permalink

    re 76

    This is true only to the extent that either the science in question is illegitimate

    You should really read Latour before you make comments on his ideas. The question of the “science” being “illegitimate” will have nothing to do with this.

  264. Francois Ouellette
    Posted Jul 22, 2008 at 1:16 PM | Permalink

    #76 Yours is a typical, and rather tired, response to the work of sociologists of science. The science wars have been over for ten years, with, of course, no real winner. I know of Latour’s position on the issue of climate change, and it is a bit absurd because, as he explicitly says, it goes against everything he has taught in the past. Guess he was himself “enrolled” in the network, which would be the ultimate irony.

    The question here is not whether AGW is true or not, and in Latour’s work, the question was never whether scientific theories are “true” or not. Latour set about describing, in a general framework, how scientists act, much like an anthropologist would describe some tribe in Amazonia. If you knew him better (that is, if you were not so afraid of what he has to say, because, of course, it might give a poor “image” of science), you would know that in his “networks”, the “things” also have to be “enrolled”. So contrary evidence will weaken a network. For example, the fact that temperatures have been declining for the past 7 years is weakening the network built around AGW, and may ultimately (if the decline continues) disrupt the network entirely.

    For example, is the “fact” that continents are drifting true or not? In 1930 it was not. In 1975 it was. In 1930, you could have told me, talking about fixed continents:

    That most scientific bodies support the small (geological) community in such a claim does constitute legitimate evidence in its favor.

    What Latour says is that in 1930, continents were NOT drifting, and that in 1975, they WERE. The “network” around fixed continents was very strong in 1930. Fixed continents were a “fact”. That network was only weakened, and eventually disrupted, when some “things” stopped cooperating, but also when some of the members of the network got older or died, and younger scientists were not themselves “enrolled”.

    Personally, I find Latour’s framework very useful, once you go past his most provocative statements. Again, in my comments, I’m not saying that TGGWS is “true”. I’m just trying to understand the behavior of the protagonists. I find that if you adopt that framework (actor-network theory), most of the pieces fall together rather nicely.

  265. MrPete
    Posted Jul 22, 2008 at 2:00 PM | Permalink

    gb, re: clouds and humidity in GCM’s. I was being loose in my language**, partially because my main point was with respect to whether there are solid requirements specifications for GCM systems.

    In the software industry (and I submit that’s an appropriate parallel since AFAIK all GCM’s are custom software), important software is built to a specification of some kind rather than ad hoc. It might be as informal as a set of 3×5 cards (in the Extreme Programming/XP model), or as integrated as a documentation block in each module (in the Literate Programming model), but one way or another, important software is well-specified.

    **Yes, clouds and humidity are mentioned in GCM’s. Are they modeled in a physically accurate way? The honest answer appears to be somewhere between “no” and “that’s controversial at best.” I could easily be out of date on this but last I recall, there’s generally a static assumption for humidity, and very unphysical cloud modeling. Hardly a specification that brings confidence.

    Search in CA for the words GCM and clouds or humidity for extensive discussion and links.

  266. harold
    Posted Jul 22, 2008 at 3:16 PM | Permalink

    An update on the Monckton controversy. Recently the disclaimer has changed from

    “The following article has not undergone any scientific peer review. Its conclusions are in disagreement with the overwhelming opinion of the world scientific community. The Council of the American Physical Society disagrees with this article’s conclusions.” (in red letters) to:

    “The following article has not undergone any scientific peer review, since that is not normal procedure for American Physical Society newsletters. The American Physical Society reaffirms the following position on climate change, adopted by its governing body, the APS Council, on November 18, 2007: “Emissions of greenhouse gases from human activities are changing the atmosphere in ways that affect the Earth’s climate.” (black letters)

    I read that the whole text of the Nov 18 2007 was:

    “Emissions of greenhouse gases from human activities are changing the atmosphere in ways that affect the Earth’s climate. Greenhouse gases include carbon dioxide as well as methane, nitrous oxide and other gases. They are emitted from fossil fuel combustion and a range of industrial and agricultural processes.

    The evidence is incontrovertible: Global warming is occurring. If no mitigating actions are taken, significant disruptions in the Earth’s physical and ecological systems, social systems, security and human health are likely to occur. We must reduce emissions of greenhouse gases beginning now.

    Because the complexity of the climate makes accurate prediction difficult, the APS urges an enhanced effort to understand the effects of human activity on the Earth’s climate, and to provide the technological options for meeting the climate challenge in the near and longer terms. The APS also urges governments, universities, national laboratories and its membership to support policies and actions that will reduce the emission of greenhouse gases.”

    jeez.

  267. Orson
    Posted Jul 22, 2008 at 3:35 PM | Permalink

    I’m told that Spencer’s PPP is “at:

    http://cires.colorado.edu/science/groups/pielke/

    under the What’s New Section at the right”

  268. bernie
    Posted Jul 22, 2008 at 3:45 PM | Permalink

    David Stockwell has provided a link to a stunningly succinct and compelling indictment of catastrophic AGW by a leading Australian scientist who spearheaded Australia’s carbon accounting program and was previously a strong proponent of AGW.

  269. Follow the Money
    Posted Jul 22, 2008 at 4:16 PM | Permalink

    #263, therefore the following was removed from the Monckton disclaimer

    Its conclusions are in disagreement with the overwhelming opinion of the world scientific community.

    That is a big retreat.

  270. Bob S
    Posted Jul 22, 2008 at 4:30 PM | Permalink

    Orson (260):

    Thanks for the response. It doesn’t sound like there was much discussion of the difference.

    (By the way, the link on here to Spencer’s presentation works for me)

    gb (259):

    Was that post meant to be sarcastic? Sounds like someone has been reading a little too much realclimate.

    harold (266):

    The same disclaimer is now at the top of the Schwartz and Hafemeister paper.

  271. Pliny
    Posted Jul 22, 2008 at 4:45 PM | Permalink

    #268 These accounts of David Evans are getting silly. He is not a leading Australian scientist. His bio, which I think he wrote himself, is here.

  272. jae
    Posted Jul 22, 2008 at 4:51 PM | Permalink

    271, pliny:

    #268 These accounts of David Evans are getting silly. He is not a leading Australian scientist. His bio, which I think he wrote himself, is here

    Just why are they “silly?” And what does his bio have to do with it, anyway?

  273. harold
    Posted Jul 22, 2008 at 4:51 PM | Permalink

    That is funny, Bob; I heard this was not the case a while ago.

  274. Sam Urbinto
    Posted Jul 22, 2008 at 5:04 PM | Permalink

    Wake up and smell the cat food in your bank account.

  275. Pat Keating
    Posted Jul 22, 2008 at 5:06 PM | Permalink

    264 Francis
    Actually, the fixed-continent consensus was alive and well in 1956. I lived with a bunch of geology students who laughed sneeringly at my outlandish belief in continental drift.

  276. DeWitt Payne
    Posted Jul 22, 2008 at 5:31 PM | Permalink

    Continental drift was not accepted in academia until the early 1960’s when the theory of plate tectonics was developed to support the by then overwhelming geological evidence that the continents had been connected. The major oil companies had known this for decades, based on their sea floor surveys, but their evidence was proprietary so not published.

  277. Pliny
    Posted Jul 22, 2008 at 5:36 PM | Permalink

    #272 jae, the silly item was “a leading Australian scientist who spearheaded Australia’s carbon accounting program and was previously a strong proponent of AGW”. The relevance of his bio is that it supports none of those statements.

  278. D. Patterson
    Posted Jul 22, 2008 at 5:46 PM | Permalink

    275 Pat Keating says:

    July 22nd, 2008 at 5:06 pm
    264 Francis
    Actually, the fixed-continent consensus was alive and well in 1956. I lived with a bunch of geology students who laughed sneeringly at my outlandish belief in continental drift.

    276 DeWitt Payne says:

    July 22nd, 2008 at 5:31 pm
    Continental drift was not accepted in academia until the early 1960’s when the theory of plate tectonics was developed to support the by then overwhelming geological evidence that the continents had been connected. The major oil companies had known this for decades, based on their sea floor surveys, but their evidence was proprietary so not published.

    As late as 1967, I was given a failing grade on a paper I wrote about the relationship between continental drift, plate tectonics, and the New Madrid earthquake. The professor asserted that continental drift was a fantasy, and that the consensus of the scientific community that continental drift did not exist could not be wrong. Nothing quite like being a rogue and a maverick among a scientific consensus, eh?

  279. Dishman
    Posted Jul 22, 2008 at 6:10 PM | Permalink

    MrPete
    The NASA standard Software Assurance process is rather more than just requirements. On my cursory reading, it appears much like the DO-178B process the FAA requires (which is the process I know and use). My reading is that GISTEMP and GISS GCM should both be at Full/High because the impact of a bug would exceed 200 man-years of work lost.

    Here’s some of what’s in DO-178B Level A (equivalent to NASA Full/High):
    * Process planning documents
    * Version control on all code and documents
    * High and Low level requirements documents
    * All requirements must be testable
    * Test plans, procedures and reports
    * MCDC testing (every path through the code, not just every line)
    * Full requirements testing
    * Traceability on all requirements from high level through low level to code and testing
    * Independent verification of key steps
    * Independent review of all documentation

    That’s approximately the level of verification that should be applied to GISTEMP and GISS GCM, based on how many man-years of work depend on their correctness. Perhaps Dr. Hansen does not assign such a high value to the work of others. Researchers who are relying on GISTEMP and GISS GCM should be aware of the risks they are being exposed to. They have no way to be sure that their scholarship will not be invalidated by some undiscovered flaw in critical data.

    I believe that GISTEMP and FILNET should be treated as a single package for Assurance purposes. The reason for this is demonstrated by (as I understand events) the Jan. 2000 discontinuity when NOAA changed its procedures, but GISTEMP was not adjusted accordingly. If both adjustments were under a common Assurance process, the process would have caught and flagged the problem before it was released. I believe the appropriate concept is “chain of custody” for evidence.
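
    For readers unfamiliar with MC/DC, here is a minimal sketch of the idea in Python; the adjustment rule is hypothetical, a stand-in rather than actual GISTEMP logic:

        # Why MC/DC is stricter than line coverage (hypothetical rule).
        def should_flag(delta, station_moved, sensor_changed):
            # One compound decision with three conditions.
            return delta > 0.5 and (station_moved or sensor_changed)

        # A single call such as should_flag(1.0, True, False) executes every
        # line, so line coverage is already 100%. MC/DC additionally requires,
        # for each condition, a pair of tests in which flipping only that
        # condition flips the outcome:
        mcdc_tests = [
            (1.0, True,  False),  # True:  baseline
            (0.4, True,  False),  # False: 'delta > 0.5' alone flips the result
            (1.0, False, False),  # False: 'station_moved' alone flips the result
            (1.0, False, True),   # True:  'sensor_changed' alone flips the result
        ]
        for args in mcdc_tests:
            print(args, should_flag(*args))

    Four tests for three conditions is the minimum MC/DC allows here; plain line coverage would have been satisfied with one.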

  280. DeWitt Payne
    Posted Jul 22, 2008 at 6:58 PM | Permalink

    D. Patterson,
    snip – let’s not go there

    Here’s another apt quote from the Wikipedia article cited above:

    One of the main problems with Wegener’s theory was that he believed that the continents “plowed” through the rocks of the ocean basins. Most geologists did not believe that this could be possible. In fact, the biggest objection to Wegener was that he did not have an acceptable theory of the forces that caused the continents to drift. He also ignored counter-arguments and evidence contrary to his theory and seemed too willing to interpret ambiguous evidence as being favorable to his theory.[14] For their part, the geologists ignored Wegener’s copious body of evidence, allowing their adherence to a theory to override the actual data, when the scientific method would seem to demand the reverse approach.

    [emphasis added]

  281. Bernie
    Posted Jul 22, 2008 at 7:14 PM | Permalink

    Pliny:
    I stand corrected; I clearly overstated who he was and what he had contributed. I will be more diligent in the future in checking out the actual background. Thanks for the links.

  282. Geoff Larsen
    Posted Jul 22, 2008 at 7:15 PM | Permalink

    Bob S, Orson.

    Bob S -Any idea why in the beginning of Spencer’s presentation he uses an ocean mixed layer depth of 50 m and that at the end he uses a mixed-layer depth of 1000 m? Was there any discussion of this at the presentation?

    I wasn’t at the presentation but in his “simplified version of a paper entitled ‘Chaotic Radiative Forcing, Feedback Stripes, and the Overestimation of Climate Sensitivity’ he submitted on June 25, 2008 for publication in the Bulletin of the American Meteorological Society”, in Section 5, he says (see link):

    http://www.weatherquestions.com/Climate-Sensitivity-Holy-Grail.htm

    Significantly, note that the feedback parameter line fitted to these data is virtually horizontal, with almost zero slope. Strictly speaking that would represent a borderline-unstable climate system. The same results were found no matter how deep the model ocean was assumed to be, or how frequently or infrequently the radiative forcing (cloud changes) occurred, or what the specified feedback was. What this means is that cloud variability in the climate system always causes temperature changes that “look like” a sensitive climate system, no matter what the true sensitivity is. This is a very significant result…it isn’t entirely new, since at least one previously published paper suggested it, but the authors of that study did not appreciate its importance.

    Reading this simplified version in conjunction with his power point presentation makes interesting reading.

  283. Bob S
    Posted Jul 22, 2008 at 9:53 PM | Permalink

    Geoff Larsen (282):

    Exactly what I was looking for.

    Thanks.

  284. Mark T
    Posted Jul 22, 2008 at 9:56 PM | Permalink

    #272 jae, the silly item was “a leading Australian scientist who spearheaded Australia’s carbon accounting program and was previously a strong proponent of AGW”. The relevance of his bio is that it supports none of those statements.

    Um, the only debatable portion is “a leading Australian scientist.” It reads pretty clearly to me that he “started working as a consultant for the Australian Greenhouse Office where he wrote the software which the Australian Government uses to calculate its land-use carbon accounts for the Kyoto Protocol,” which sounds pretty much like someone who spearheaded their carbon accounting program (note that software is generally considered a program, or part of one).

    Mark

  285. Steve McIntyre
    Posted Jul 22, 2008 at 10:28 PM | Permalink

    We’re having crash problems again. Three tonight.

  286. bender
    Posted Jul 22, 2008 at 10:28 PM | Permalink

    sounds pretty much like someone who spearheaded their carbon accounting program

    Sounds more like technical support from someone with some programming expertise. Not to discredit the guy; just to say that I agree with bernie’s summary: his influence, though not insignificant, was overstated.

    Someone may want to start a “Skeptic Watch” however. If temperatures continue on the current plateau, there will be increased skepticism, and it would be useful to track the various comings out.

  287. Mark T
    Posted Jul 23, 2008 at 12:53 AM | Permalink

    Nonsense. If he’s the one who wrote it, he spearheaded it. I wrote all of the code used in my latest radar system, and hence spearheaded it. Mincing words like this is a ridiculous game whose sole purpose is to discredit the guy, i.e., it is nothing but an ad hominem. Calling him “a leading scientist” is, as I mentioned, the only overstatement (albeit debatable, given the current crop of “leading scientists” in this particular field).

    Mark

  288. nevket240
    Posted Jul 23, 2008 at 2:45 AM | Permalink

    http://www.udonmap.com/udonthaniforum/viewtopic.php?p=111794#111794

    please read, take your false teeth out first. In market terms this is now a mania and we will possibly see a few sell-out performances soon.

    regards

  289. nevket240
    Posted Jul 23, 2008 at 2:59 AM | Permalink

    http://blogs.abc.net.au/sa/2008/07/interview-with.html?program=broken_hill_breakfast

    if you open this link you can listen to the Evans interview on the ABC.
    (if you have trouble understanding Orstraylian, watch Crocodile Dundee again)

    regards

  290. Filippo Turturici
    Posted Jul 23, 2008 at 5:08 AM | Permalink

    I have a question for anyone who could answer it.

    First of all, tell me if I am wrong: the current Earth surface mean temperature is measured with an uncertainty of 1°C (±0.5°C), while the anomalies of this mean temperature are currently measured with an uncertainty of 0.2°C (±0.1°C), though in the past the errors were larger (±0.3°C to ±0.2°C).

    Thus, if I am right: how can GISS or Hadley measure temperature anomalies more precisely than the temperature itself, without using instruments other than the ones that already measure the mean temperature (more precisely, that make the measurements from which a mean temperature is calculated)?
    What are the mathematical tools they use? Where can I find them? Has anyone who is not a member of GISS, NOAA, Hadley, the IPCC or similar bodies ever reviewed these calculations, or even “invented” such methods?

  291. Posted Jul 23, 2008 at 5:38 AM | Permalink

    Here are the latest hurricane path predictions from the webcomic xkcd.com

  292. Pat Keating
    Posted Jul 23, 2008 at 10:38 AM | Permalink

    290 Fillippo

    The argument is the following: If you have N measurements of the same given quantity with independent random errors and determine the average value of the measurements, the % error will be reduced by a factor equal to the square root of N.

    That is the mathematical tool. It depends on “independent” and “random” (and perhaps “the same”). That is where the debate regarding the validity of using this tool on the global temperature data lies.
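
    The rule is easy to check numerically. A minimal sketch, assuming the idealized case Pat describes (independent Gaussian errors on the same quantity); the counts and error size are invented for illustration:

        # Monte Carlo check of the sqrt(N) rule for the error of a mean.
        import numpy as np

        rng = np.random.default_rng(0)
        true_value, sigma, N, trials = 15.0, 0.5, 100, 10_000

        # Each trial: N noisy measurements of one quantity, then their mean.
        means = true_value + sigma * rng.standard_normal((trials, N)).mean(axis=1)

        print("std of one measurement      :", sigma)                # 0.5
        print("std of the mean (simulated) :", round(means.std(), 3))
        print("sigma / sqrt(N) (predicted) :", sigma / np.sqrt(N))   # 0.05

    The simulated spread of the mean comes out at about 0.05, a factor of sqrt(100) = 10 below the single-measurement error, exactly as the rule predicts, and only because the “independent” and “random” assumptions hold.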

  293. TerryBixler
    Posted Jul 23, 2008 at 11:00 AM | Permalink

    Filippo Turturici, Pat Keating
    To say it very clearly: the global temperature is not random, nor are the individual site temperatures used to make up that average. Further, the errors are not random: poor siting is not random, and poor sampling is not random. One cannot improve poor experimental data with inappropriate mathematical tools.
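
    Terry’s point, that averaging cannot remove a shared systematic error, also takes only a few lines to demonstrate; the 0.3C siting bias below is a made-up number, purely for illustration:

        # Averaging shrinks random error but leaves a common bias untouched.
        import numpy as np

        rng = np.random.default_rng(1)
        N, sigma, bias, truth = 10_000, 0.5, 0.3, 15.0

        readings = truth + bias + sigma * rng.standard_normal(N)
        print("error of the mean:", round(readings.mean() - truth, 3))  # ~0.3

    With 10,000 readings the random part of the error has shrunk to about 0.005, yet the mean still misses the true value by the full 0.3 bias.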

  294. Posted Jul 23, 2008 at 11:32 AM | Permalink

    “There is an old joke that goes something like this: The easiest way to starve a government bureaucrat is to hide his lunch money under his work boots. From the files of the National Climate Data Center and the Kristen Byrnes Science Foundation comes evidence that this may not be just a joke.

    Climate Scientists at the National Climate Data Center noticed an anomaly in the temperature record of the Durham, New Hampshire temperature station (USHCN # 272174). When compared to the local Climate Reference Network station, scientists observed that during the summer, Durham was over 1 degree warmer. During the winter, it was slightly cooler.”

    http://ncwatch.typepad.com/media/2008/07/the-climate-bur.html

    h/t: smalldeadanimals.com

  295. jae
    Posted Jul 23, 2008 at 12:08 PM | Permalink

    LOL. Fighting fire with fire!

  296. Nick
    Posted Jul 23, 2008 at 3:56 PM | Permalink

    Two fascinating news stories from 1963 and 1948, respectively.

    Very pertinent to discussions today, I feel.

  297. Pat Keating
    Posted Jul 23, 2008 at 4:03 PM | Permalink

    293 Terry
    I tried to explain the situation as objectively as possible, leaving others more familiar with the data to address its randomness and independence, which I agree is suspect.

  298. bernie
    Posted Jul 23, 2008 at 4:05 PM | Permalink

    Does anyone have references to research on the use of intraday temperature patterns to identify the relative size of different feedback mechanisms?

  299. Steve McIntyre
    Posted Jul 23, 2008 at 4:15 PM | Permalink

    #298. Bernie, I noticed that you’ve defended my Ofcom posts elsewhere. I try to make these posts accurate and quote at length from original sources – far more than anyone else. The usual culprits all seem to hate the posts, but I haven’t seen anyone specify what, if anything, I said that was incorrect. It’s quite amazing, isn’t it?

  300. Russ S
    Posted Jul 23, 2008 at 5:05 PM | Permalink

    Steve,

    I hope this is not old news. I see that Mann has a new book out, Dire Predictions: Understanding Global Warming. I found part of the Penn State PR interview hilarious, given the number of global warming myths promoted by Mann.

    “The most fun for me was collecting the misinformation out there and debunking the myths surrounding global warming,” says Kump. Mann also thought finding and debunking the myths was fun.

    These myths include the idea that carbon dioxide is causing the holes in the ozone, that the increase in carbon dioxide is the result of natural cycles, and the possibility that our atmosphere is not warming at all. The authors consider each myth or misunderstanding and explain any kernel of truth within and why the myths are untrue.

    It might be interesting to start a thread to examine the myths of Mann and Kump. Book details here: http://live.psu.edu/story/33704

  301. bender
    Posted Jul 23, 2008 at 5:14 PM | Permalink

    Here’s another myth – propagated in the individual responses to the Ofcom decision: response to CO2 is monotonic increasing, such that past GHG warming will continue into the future, as negative convective & cloud feedbacks fail to kick in to mitigate the coming disaster. That is why they must avoid at all costs the topics of Spencer and Lindzen. Their work, and life experience, could serve as the source of another documentary. In fact, several of the individuals responding to Ofcom – Santer and Wunsch among them – hinted at the richness of the material already there, ready for presenting to the world.

  302. Philip_B
    Posted Jul 23, 2008 at 6:05 PM | Permalink

    Bernie, Jonathan Lowe at a Gust of Hot Air has many analyses of intraday temperatures, with some really interesting results. Unfortunately, the awful blogger/blogspot software makes searching the site difficult. I suggest accessing via google.

  303. Gerald Machnee
    Posted Jul 23, 2008 at 7:02 PM | Permalink

    Re #296 **Two fascinating news stories from 1963 and 1948, respectively.**
    Yes, temperatures had increased by 10 degrees since 1912 between Norway, Spitzbergen, and northeast Greenland. That could be unprecedennntttted.

  304. bernie
    Posted Jul 23, 2008 at 8:07 PM | Permalink

    Steve:
    I hesitate to say anything in support of someone who is so adept at defending himself – it also seems a bit presumptuous and sycophantic. However, it also seems inappropriate to let things that Michael or anyone else says that are patently inaccurate go unchecked. I am still enormously puzzled that so many sophisticated and undeniably smart people argue by assuming the worst about their opponents when there is no evidence to support that presumption. Their misreading of what you actually said, and of the report itself, has to be driven by assumptions about what they think they should be seeing or reading. IMHO your analysis of the Ofcom report is a QC-quality brief!

  305. Allan MacRae
    Posted Jul 24, 2008 at 4:51 AM | Permalink

    Some historical perspective, excerpted from E&E ~2005:

    Any scientist who dares challenge the Kyoto Protocol faces a vicious assault, a turf war launched by the pro-Kyoto gang.

    These pro-Kyoto attacks are not merely unprofessional – often of little scientific merit, they are intended to intimidate and silence real academic debate on the Kyoto Protocol, a global treaty to limit the production of greenhouse gases like CO2 that allegedly cause catastrophic global warming.

    Witness the attack on Bjorn Lomborg, author of “The Skeptical Environmentalist”. While Lomborg did not challenge the flawed science of Kyoto, he said that Kyoto was a huge misallocation of funds that should be dedicated to more important uses – such as cleaning up contaminated drinking water that kills millions of children every year in the developing world.

    In January 2003, the Danish Committees on Scientific Dishonesty (DCSD) declared that Lomborg’s book fell within the concept of “objective scientific dishonesty”. The DCSD made the ruling public at a press conference and published it on the internet, without giving Lomborg the opportunity to respond prior to publication.

    In December 2003, The Danish Ministry of Science, Technology and Innovation repudiated the DCSD’s findings. The Ministry characterized the treatment of the Lomborg case as “dissatisfactory”, “deserving criticism” and “emotional”, a scathing rebuttal of the DCSD.

    But such bullying is not unique, as other researchers who challenged the scientific basis of Kyoto have learned.

    Of particular sensitivity to the pro-Kyoto gang is the “hockey stick” temperature curve of 1000 to 2000 AD, as proposed by Michael Mann of University of Virginia and co-authors in Nature.

    Mann’s hockey stick indicates that temperatures fell only slightly from 1000 to 1900 AD, after which temperatures increased sharply as a result of manmade increases in atmospheric CO2. Mann concluded: “Our results suggest that the latter 20th century is anomalous in the context of at least the past millennium. The 1990s was the warmest decade, and 1998 the warmest year, at moderately high levels of confidence.”

    Mann’s conclusion is the cornerstone of the scientific case supporting Kyoto. However, Mann is incorrect.

    In 2003, Willie Soon and Sallie Baliunas of the Harvard-Smithsonian Center for Astrophysics and co-authors wrote a review of over 250 research papers that concluded that the Medieval Warm Period and Little Ice Age were true climatic anomalies with world-wide imprints – contradicting Mann’s hockey stick and undermining the basis of Kyoto. Soon et al were then attacked in EOS, the journal of the American Geophysical Union.

    Also in 2003, University of Ottawa geology professor Jan Veizer and Israeli astrophysicist Nir Shaviv concluded that temperatures over the past 500 million years correlate with changes in cosmic ray intensity as Earth moves in and out of the spiral arms of the Milky Way. The geologic record showed no correlation between atmospheric CO2 concentrations and temperatures, even though prehistoric CO2 levels were often many times today’s levels. Veizer and Shaviv also received “special attention” from EOS.

    In both cases, the attacks were unprofessional – first, these critiques should have been launched in the journals that published the original papers, not in EOS. Also, the victims of these attacks were not given advance notice, nor were they given the opportunity to respond in the same issue. In both cases the victims had to wait months for their rebuttals to be published, while the specious attacks were circulated by the pro-Kyoto camp.

    [end of quote]

    Fast forward:

    McIntyre and McKitrick and the Wegman inquiry thoroughly discredited Mann’s hockey stick. Mann’s data was not made readily available to Steve. Then there was Mann’s “censored” file, and other tidbits of “Mann-made global warming”.

    The Channel 4 controversy is the latest chapter in the attempt to stifle debate on the science of global warming, and it too has failed.

    James Hansen of NASA now wants to jail those who speak out against his global warming hysteria.

    And no wonder – the Earth is getting colder, much colder, about 0.7 degrees C since January 2007. This cooling reverses all the global warming since ~1940. With the recent shift in the Pacific Decadal Oscillation to its cold phase, global cooling should last another 30 to 40 years.

    If the warming hysterics do not succeed very soon, they have absolutely no chance in a cooling world.

    Regards, Allan

  306. Filippo Turturici
    Posted Jul 24, 2008 at 6:18 AM | Permalink

    #292, #293: Pat, thank you; I know what you mean, but you did not answer my question. The question is not simply how the uncertainty is determined, but why, if I have the same instruments, the same measurements and the same sampling, I get two very different uncertainties for what should be the same thing.

    And I agree with Terry: from what I have seen so far, in both contemporary and historical temperature anomaly reconstructions, the uncertainty calculations almost completely miss “Type B” uncertainties and systematic effects, reducing everything to the statistically calculated “Type A” uncertainty (which, we have to remember, is calculated on a statistical basis and must not simply be assumed to follow e.g. a Gaussian distribution). That is completely outside every technical and scientific standard in both Europe and America (I hope I have used the right words; I studied these things in Italian, not in English).
    Which would mean, in my opinion, that we are underestimating the uncertainty, maybe by a lot: it does not mean that global warming is fake, it means that maybe we cannot measure such warming (or cooling) over the entire Earth surface, nor have any real reconstruction of past temperatures.

    But the main question remains: why, if I can measure the mean temperature only to ±0.5°C, can I measure temperature anomalies to just ±0.1°C?
    And even if it is possible, does it have any meaning? If my uncertainty band is as wide as 1°C, an anomaly of 0.5°C could mean absolutely nothing.
    These are questions, not statements.
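
    In the standard (GUM) notation Filippo is reaching for, the worry fits in one line. A minimal illustration, with purely invented numbers: the combined standard uncertainty is

        u_c = \sqrt{u_A^2 + u_B^2}

    so even if averaging drives the Type A (statistical) term down to u_A = \sigma/\sqrt{N} = 0.5/\sqrt{2500} = 0.01 °C, a neglected Type B (systematic) term of u_B = 0.2 °C still dominates: u_c = \sqrt{0.01^2 + 0.2^2} \approx 0.2 °C. Averaging cannot shrink the systematic part, which is exactly the term he says is missing from the published error bars.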

  307. L Nettles
    Posted Jul 24, 2008 at 8:44 AM | Permalink

    Blogger Bill Hennessey publicly calls out Hansen as a hoaxer.

    Dr. Hansen purposely and with malice of forethought manipulates actual temperature observations in order to perpetuate a global warming hoax.

    If I’m wrong, he can sue me. But he won’t, because he’s a fraud

    That’s extremely intemperate language of a kind not tolerated by this blog, but Hennessey seems determined to instigate a confrontation.

  308. harold
    Posted Jul 24, 2008 at 8:52 AM | Permalink

    Bill’s logic is that if Hansen does not sue, Hansen admits being a liar and a fraud? Bill Hennessey is acting like a big baby.

  309. Barney Frank
    Posted Jul 24, 2008 at 9:11 AM | Permalink

    As I read somewhere else, perhaps it’s Hennessey’s attempt at gaining access to Hansen’s code and data through discovery.

  310. John Lang
    Posted Jul 24, 2008 at 9:51 AM | Permalink

    RealClimate posted a review of Monckton’s paper this morning. Again, I cannot make heads or tails of the review in between all the strawman arguments RealClimate always uses, but I thought one of the posts was interesting.

    What is the accurate figure for the change in C per W/m2 increase?

    I have now heard figures from 0.26C / W/m2 from Monckton to as high as 1.0C / W/m2.

    [Response: You are confusing different numbers with the same units. What are you looking for? Overall climate sensitivity is around 0.7 C/W/m2 (range from 0.5 to 1); the no-feedback change (which is just a theoretical estimate) is around 0.3 C/W/m2. Different things. - gavin]
    [Continued] Is the forcing then

    – 3.7-4.0 W/m2 for CO2 alone; and then,

    – roughly 9.0 W/m2 including all feedbacks?

    [Response: No. The forcing is what drives the temperature changes, which drive the feedbacks. The multipliers generally apply to the temperature change. - gavin]

    I always thought a Watt per metre squared is a W/m2 is a W/m2 and it is each W/m2 (of forcings and feedbacks – I hate this terminology, they use it to confuse people mainly) which drives the temperature change.

    Even the IPCC charts it the way the poster was trying to, opposite to Gavin’s response.

    http://www.grida.no/climate/ipcc_tar/wg1/fig9-13.htm

    Any thoughts on this? Is Gavin having a bad day, or is there really some kind of math error perpetuated by the climate models (not unlike what Monckton states)?
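
    For what it’s worth, the distinction Gavin is drawing reduces to one line of arithmetic, taking his own numbers at face value:

        \Delta T = \lambda \, \Delta F

    With \Delta F_{2 \times CO_2} \approx 3.7–4.0 W/m2, the no-feedback sensitivity \lambda_0 \approx 0.3 C/W/m2 gives \Delta T \approx 1.2 C, while the overall sensitivity \lambda \approx 0.7 gives \Delta T \approx 2.6 C. On that reading there is no “9.0 W/m2 including all feedbacks”: the feedbacks multiply the temperature response, not the forcing.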

  311. Real Richard Sharpe
    Posted Jul 24, 2008 at 1:48 PM | Permalink

    Climate Change in Kansas City: A Guest Weblog By Dr. Lynwood Yarbrough

    But, but, says the IPCC, we’re no longer in Kansas!

  312. Steve McIntyre
    Posted Jul 24, 2008 at 9:51 PM | Permalink

    Problems at the server end affecting multiple sites.

  313. DeWitt Payne
    Posted Jul 25, 2008 at 12:58 AM | Permalink

    John Lang,

    Gavin is correct in terms of the IPCC definition of forcing. Forcing is the Outgoing Longwave Radiation deficit at the tropopause from an instantaneous increase in a greenhouse gas. For the purposes of calculation, the stratosphere is allowed to equilibrate to the change before the forcing is calculated. The climate sensitivity refers to what happens at the surface and through the troposphere to make up that deficit. The value of the climate sensitivity is independent of the forcing.

  314. Follow the Money
    Posted Jul 25, 2008 at 5:01 PM | Permalink

    Lang, #310

    The fellow just wanted clear answers. He won’t get any. That site is full of obfuscation and of epithets to the effect that persons enamoured of the scientific method are under the thrall of “Exxon-Mobil” or “Rush Limbaugh.” Eek! Also, “peer-reviewed” is repetitively invoked as a synonym for “true.”

    But Gavin is correct about Monckton in one matter, and it is interesting.

    [Monckton's] derivations and discussions of the no-feedback sensitivity and feedbacks is extremely opaque (a much better description is given on the first couple of pages of Hansen et al, 1984)). His discussion of the forcings in that paper are wrong (it’s 4.0 W/m2 for 2xCO2 (p135), not 4.8 W/m2), and the no-feedback temperature change is 1.2 (Hansen et al, 1988, p9360),

    I think it ironic to charge AGW dissenters with opacity, but he is right, sort of, about 4.8 vs. 4.0 in Hansen, et al. 1984. I see how Monckton could make the “mistake”, because both numbers, 4.8 and “about 4.0″, are in consecutive sentences.

    The 2 percent So [solar irradiance] change corresponds to a forcing of 4.8 W m-2. The initial radiative imbalance at the top of the atmosphere due to doubling CO2 is only ~2.5 W m-2, but after the CO2 cools the stratosphere (within a few months) the global mean radiative forcing is about 4 W m-2 (Fig. 4, Hansen et al. 1981).

    http://pubs.giss.nasa.gov/docs/1984/1984_Hansen_etal_1.pdf

    There’s that “about 4 W m-2″ again. Here, in Hansen, it talks about doubling and a change after the vague “within a few months.” But in AR1 the “approximately 4 W m-2″ (which I mention in a post above) expressly refers to an “instantaneous doubling of CO2″. AR2 emphasizes that the numbers for doubling mentioned in AR1 were for “instantaneous” doubling.

    I think Monckton might have mistakenly blurred the propinquitous “4.8” and “about 4.0.” Perhaps with the undue unconscious expectation that vagueness such as “about 4.0″ could not be the basis of a seminal paper relied upon by so much later climate science.

    Above I posted how I “found” the beginnings of “about 4 W m-2″ in the AR1 (1990) scientific assessment. Steve corrected me that the “about 4.0″ is traceable to a Ramanathan article from the 1970’s which DeWitt suggests is likely the “3.9” footnoted to an “unpublished manuscript.”

    Gavin cites “4.0” against Monckton as the correct figure from Hansen 1984. Does it come from Ramanathan? Hansen 1984 says “about 4 W m-2″ but provides no science to back it. Instead he cites it to another article, Hansen, et al. 1981. But another problem. Hansen 1981 doesn’t explain or give a source for the figure either. At page 959 in Science, 28 Aug. 1981, there is a discussion of oceans, horizontal heat fluxes between oceans and continents, then the appearance of “4”

    The net flux into ocean surface is therefore larger than it would be for a 100 percent ocean-covered planet by the ratio of global area to ocean area, totaling ~5.7 W m-2 for doubled CO2 rather than ~4 W m-2.

    I suppose the ~5.7 is inferred from the ~4 and a planet ~70% covered with ocean, but there is no explanation for the ~4 in the text, and no citation that would allegedly support it. Nada. Yet Hansen 1984 is treated as one of the Holy of Holies of AGW science. Oh, it was “peer reviewed!”

    Steve called the “about 4″ “a rule of thumb” above. I suppose it is, but to my way of thinking it is more. The “about 4 W m-2″ is not only a repetition derived from a text in the relatively ancient years of this science, but a holy figure into which all approaches must be fitted. Instantaneous doubling, doubling considering the atmosphere plus time, vague time spans, mixed-level models, etc. are all, remarkably and without explication, forced to apply or unconvincingly affirm “about 4.” Now that’s some forcing.

    And in what I’ve read there is nothing to dissuade me from believing there is no straight-up physics, no “engineering quality” paper, and that all the relied-upon papers, directly or indirectly, posit assumptions about doubling that derive in part from assumptions about alleged global warming since the 1700s, and inferences that such warming must have been mostly caused by increasing CO2 levels.

    All this won’t be new, but I found it interesting given Real Climate’s current jumping on Monckton referenced above. Monckton himself quietly noted that the reported science is opaque, that his “4.8” is in any case “about” 4.0, and that the peer-reviewed Hansen articles cited against him are extremely opaque on this matter too.

    Steve: Ramanathan contains a derivation of 4 wm-2, which, as far as I can tell, is the origin of the tradition. It is cited in some CDIAC articles in the early and mid 1980s and thus becomes part of the tradition at the time of IPCC 1. If I could make another clone of myself, it would be very worthwhile discussing Ramanathan’s articles from the 1970s. Ramanathan is non-trivial.
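
    For reference, the figure most often quoted today comes from the simplified expression of Myhre et al. (1998), which the IPCC TAR adopted:

        \Delta F = 5.35 \ln(C/C_0) \ \ W m^{-2}

    so that a doubling gives \Delta F = 5.35 \ln 2 \approx 3.7 W m-2, slightly below the “about 4″ of the Ramanathan-to-Hansen lineage discussed above.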

  315. Follow the Money
    Posted Jul 25, 2008 at 5:19 PM | Permalink

    BTW, Monckton wrote

    based on Hansen (1984), who had estimated a range 1.2-1.3

    Which is correct. That is what Hansen 1984 says.

    Yet RC thinks this deserves specific opprobrium

    and the no-feedback temperature change is 1.2 (Hansen et al, 1988, p9360),

    RC cites Hansen’s 1988 paper to refute Monckton’s correct quotation from Hansen’s 1984 paper. Interesting technique. What does Hansen 1988 say? At p9360 a “1.2x” appears in an equation. In the text at fig. 2 the number “1.25C” is stated.

    Can’t a Viscount get a break?

  316. Pat Keating
    Posted Jul 25, 2008 at 6:06 PM | Permalink

    Follow

    For what it’s worth, I did a line-by-line calculation and came up with the same number, around 1.25C, for the ‘bare’ climate sensitivity as Hansen gave. The ‘bare’ means it is not ‘dressed’ by any feedback, either positive or negative.

    I’d like to see a straightforward GCM modeling which forgets about any feedback and uses the 1.2C number, instead of the inflated numbers usually employed.

  317. DeWitt Payne
    Posted Jul 25, 2008 at 6:15 PM | Permalink

    I’m surprised that Gavin didn’t reference his and Hansen’s more recent work. According to Hansen et al., 2005, Efficacy of Climate Forcings, JGR (available online; see link in post 202 above), the IPCC definition of forcing is the forcing at the top of the atmosphere from an instantaneous change but allowing the stratosphere to adjust, called Fa. The global value for that, calculated using GISS model III, the current (2005) version of Model E, for 2X CO2 (291 to 382 ppmv) in the above paper, is 4.12 W/m2. The charts in #202 are for Fi, not Fa. Fi is the forcing at the tropopause with no adjustment.

  318. Raven
    Posted Jul 25, 2008 at 6:45 PM | Permalink

    Pat Keating says:

    I’d like to see a straightforward GCM modeling which forgets about any feedback and uses the 1.2C number, instead of the inflated numbers usually employed.

    My understanding is that the higher sensitivity is an output of the models rather than an input, and the real issue is whether the models properly model all of the things that affect the sensitivity, such as the effect of clouds. I find it hard to believe that a SWAG made 30 years ago happens to match the output of these multi-million-dollar models. That leads me to believe that the various parameters affecting sensitivity were tweaked until the models produced the expected sensitivity.

  319. Posted Jul 25, 2008 at 8:06 PM | Permalink

    URL Window Icon question:

    When was the multicolored spaghetti/hockey stick graph added? It’s hilarious. I first noticed it yesterday, or was it earlier? But I didn’t take a closer look (face to screen) until today.

    I’ll use the information to model a prediction of when I should see an eye doc in the future.

  320. Pat Keating
    Posted Jul 25, 2008 at 9:34 PM | Permalink

    318 Raven

    I am not an expert on the GCMs but I believe that the answer is both ‘yes’ and ‘no’.

    The sensitivity is in a sense an output of the models, but there are many adjustable-parameter inputs which determine that value. The crude parameterizations of clouds and moist convection, which have a strong impact on the resulting climate sensitivity, are such as to give high values, e.g. 3C.

  321. DeWitt Payne
    Posted Jul 25, 2008 at 10:07 PM | Permalink

    Pat Keating,

    There’s also an interaction with energy storage system time constants. A long ocean heat sink time constant, Hansen’s so-called pipeline, leads to high climate sensitivity and conversely. You still get the same current temperature trend but the future behavior is different.
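
    That interaction is easy to see in a one-box energy balance model, C dT/dt = F - T/lambda. A minimal sketch; the heat capacities and sensitivities below are invented so that the two cases produce similar 30-year trends:

        # One-box model: two (C, lambda) pairs giving similar 30-yr warming
        # but very different equilibria. All numbers are illustrative only.
        import numpy as np

        F = 3.7                  # W/m2, constant forcing (2x CO2)
        years = np.arange(0, 201)

        def response(C, lam):
            # Analytic solution of C*dT/dt = F - T/lam with T(0) = 0:
            # T(t) = lam*F*(1 - exp(-t/tau)), time constant tau = C*lam.
            return lam * F * (1 - np.exp(-years / (C * lam)))

        shallow = response(C=5.0, lam=0.5)   # short tau, low sensitivity
        deep = response(C=40.0, lam=0.8)     # long tau ("pipeline"), high

        print("after 30 yr :", shallow[30].round(2), "vs", deep[30].round(2))
        print("equilibrium :", 0.5 * F, "vs", 0.8 * F)

    Both runs warm by about 1.8C in the first 30 years, but the long-time-constant case is still headed for nearly 3C at equilibrium; the recent trend alone cannot tell the two apart.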

  322. David Archibald
    Posted Jul 25, 2008 at 10:09 PM | Permalink

    For Sun worshipers, there is a solar update at Icecap and Lubos Motl

    snip

    Steve- please discuss these matters at those other sites.

  323. Liselle
    Posted Jul 26, 2008 at 5:30 AM | Permalink

    Gerry Morrow:

    Hear, hear! Einstein was famous for saying, when informed that many did not believe his theory of relativity, that it would only take one of them to prove it wrong. That attitude is sorely missing among members of the Team, from whom we get snarky comments like “Why should I give you the data when your only purpose is to find something wrong with it?” (Forgive me if this isn’t the exact quote; I’m going from memory.)

    Sorry, guys, but science is all about falsification. If there is no method of falsification provided for a hypothesis, it isn’t science. It’s that simple.

  324. Pat Keating
    Posted Jul 26, 2008 at 7:48 AM | Permalink

    321 DeWitt
    Very good point.

  325. John Lang
    Posted Jul 26, 2008 at 8:19 AM | Permalink

    I recently went back and looked at Hansen’s 1988 GCM predictions using 1958 as the base year, which is what Hansen says he was trying to do.

    http://pubs.giss.nasa.gov/docs/1988/1988_Hansen_etal.pdf

    Temps in the last 12 months are up 0.3C since 1958.

    Scenario C predicted an increase of 0.65C

    Scenario B predicted an increase of 0.85C

    Scenario A predicted an increase of 1.0C

    GHG increases ended up somewhere between B and C, probably closer to B.

    Clearly, the 1988 model over-estimates the temperature response, and the predicted feedbacks have not appeared as yet. To date, Monckton is much closer to the answer than Hansen.

    But this is the problem with the climate sensitivity estimates currently. The warmers say there is still “warming in the pipeline” – that oceans are storing some of the excess energy/temperature – that aerosols masked some of the increase until the late 1980s.

    In other words, we have to wait several more decades before we can say whether the climate sensitivity estimates of Monckton or Hansen are right.

    I don’t know whether we will ever be able to answer the question because the warmers will always be able to say there is “more warming in the pipeline” to come (although Hansen’s Scenario C results seem to indicate that warming in the pipeline only raises temps by a further 0.1C before stabilizing in a few years.)

    But we can also say that Hansen’s 1988 GCM model was very poor at modeling the climate. We can also say he has not changed his basic theoretical model very much from 1988, and the climate sensitivity figures he produces from the current “much better models” are nearly the same as well.

  326. kim
    Posted Jul 26, 2008 at 8:54 AM | Permalink

    324 (John Lang) With the very recent drop in sea level as measured by Aqua, the argument that pipeline heat is in the ocean is weakened since thermal contraction seems the best explanation for the drop. This conundrum provoked Kevin Trenberth’s famous comment to NPR to the effect that maybe the extra heat has been radiated back out to space. He’s right, you know.
    ============================================================

  327. Steve McIntyre
    Posted Jul 26, 2008 at 9:18 AM | Permalink

    I’m away for a couple of days. Everyone please behave.

  328. Gerald Machnee
    Posted Jul 26, 2008 at 9:44 AM | Permalink

    Re #324 – **But this is the problem with the climate sensitivity estimates currently. The warmers say there is still “warming in the pipeline” – that oceans are storing some of the excess energy/temperature – that aerosols masked some of the increase until the late 1980s.**
    I do not see how they can be storing any excess. The ocean temps have been either steady or down since 2003. Maybe we could get a release of heat as in 1998, which could show some “warming” but would cool the oceans.

  329. BarryW
    Posted Jul 26, 2008 at 10:33 AM | Permalink

    Re 327

    Sea level rise also appears to have stopped (and may even be falling) since 2006. If the oceans were storing heat you would expect a rise due to expansion, and a falling sea level would be a sign of release, no? Is the present dip in temps being moderated by ocean heat release?

  330. David Smith
    Posted Jul 26, 2008 at 10:43 AM | Permalink

    I’m intrigued by a cold nose of water in the tropical east-central Pacific. Below is a comparison of the current subsurface temperature anomaly as of yesterday, versus a month ago. The plots are subsurface cross-sections at 140W, from 20N to 20S.

    The cold nose has risen upwards over the last 30 days and is currently only 35 meters (100 feet) below the surface.

    I wonder what drives this – I assume it is wind-related and perhaps it is a part of the negative-phase PDO behavior. Or, it may simply be some random behavior which never affects the surface. If it does ultimately affect the surface then it may serve to reinforce the trade winds, which may affect ENSO. That’s simply conjecture on my part.

    At this point it’s something to occasionally check.

  331. Dave Clarke
    Posted Jul 26, 2008 at 12:06 PM | Permalink

    #53
    Sure, I can help with Peter O’Neill’s “problem” with George Monbiot. For those who want to follow along, here’s Monbiot’s original article:

    http://www.monbiot.com/archives/2005/05/10/junk-science/#more-930

    Three years ago, with his characteristic rigorous research and references, Monbiot showed how botanist David Bellamy’s wildly mistaken claim that *since 1980* “555 of 625 glaciers” were advancing was derived from a mistaken (not to mention 16 years out of date) claim from Fred Singer. Now Peter O’Neill has shown that Singer (or someone else before him) may have misunderstood a valid published study that stated that *55% of 446* glaciers in the European Alps advanced *between 1960-1980*. So Bellamy changed 55% into 555, and of course got both the time period and number of glaciers completely wrong too (as did Singer).

    O’Neill, and presumably Steve, believe that Monbiot should issue an “acknowledgment”, or better yet, a “correction” that Bellamy’s utterly wrong figures “are not fraudulent figures based on a non-existent data set.”

    You can’t be serious. Or, as it says on top, “You can’t make this stuff up.”

    And, by the way, there is still no correction from either Singer or Bellamy after all this time. Although Singer has removed his page on glaciers that Monbiot referenced, the mistaken claim remains here in a comment on another story:

    http://www.sepp.org/Archive/controv/controversies/afp.html

    Nor is there any correction from Junkscience.com, which still has the Singer howler in at least four places (yes, Monbiot provided this reference too, in footnote 17, despite O’Neill’s claim to the contrary):

    http://www.junkscience.com/nov98/moore.htm

  332. Gerald Machnee
    Posted Jul 26, 2008 at 12:23 PM | Permalink

    Re #328 – Oceans are not my field. My first thought is that if the oceans are cooling then there is a net heat loss. I expect that to be mostly radiation, as that is how they also receive most of the heating (not from the atmosphere). The atmosphere may capture some of the radiant energy, but most of it goes to space. Now if there is a net loss of energy in the oceans, is it due to less energy received or more lost (due to clearer skies)? That I cannot answer, but it is one for study.

  333. Jonathan Schafer
    Posted Jul 26, 2008 at 2:03 PM | Permalink

    #329

    Kelvin Wave? I believe those are wind driven.

  334. DeWitt Payne
    Posted Jul 26, 2008 at 2:32 PM | Permalink

    kim et al.,

    Roger Pielke, Sr. has always recommended ocean heat content as the best measure of overall climate forcing. It’s a direct measure of radiative balance. The problem right now is that only recently has a system been deployed that can give a reasonable measure and it may still have teething problems. It took quite a while to work the bugs and systematic errors out of the satellite temperature calculations. We have already seen one correction in the ocean heat content calculation. Still, an independent confirmation by sea level measurement gives some confidence that there is in fact no significant pipeline to worry about. That would likely make a climate sensitivity of 3 C for doubling CO2 the upper end of the range rather than the middle. It also makes near term cooling much more probable.
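
    The conversion from an ocean heat content trend to an implied radiative imbalance is simple arithmetic. A back-of-envelope sketch, with the heat content change chosen only for illustration:

        SECONDS_PER_YEAR = 3.156e7
        EARTH_AREA_M2 = 5.1e14

        def implied_imbalance_wm2(delta_ohc_joules, years):
            # Global-mean flux needed to supply the stored heat, spread over the whole Earth
            return delta_ohc_joules / (years * SECONDS_PER_YEAR * EARTH_AREA_M2)

        # An assumed gain of 8e22 J over a decade, roughly the size of published OHC trends
        print(implied_imbalance_wm2(8e22, 10))  # ~0.5 W/m2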

  335. Peter O'Neill
    Posted Jul 26, 2008 at 3:34 PM | Permalink

    #69
    As I said, “while not impressed by the care taken by either Bellamy or Singer in citing these figures”, Monbiot’s unnecessary sneer “You must, if you are David Bellamy, embrace instead the claims of an eccentric former architect, which are based on what appears to be a non-existent data set” is based on a suggestion that these figures come from a non-existent data set, which, if true, would mean that these were fraudulent figures. As Dave Clarke acknowledges, they do in fact come from a valid study, Monbiot’s suggestion of a non-existent data set is clearly and demonstrably false, and in my view, for that reason, should be withdrawn. I provided the abstract from Wood’s paper, and detailed how it had been misquoted by Singer, further compounded by Bellamy’s typing error (as Monbiot explained “on the standard English keyboard, 5 and % occupy the same key. If you try to hit %, but fail to press shift, you get 555, instead of 55%”). But these errors do not alter the fact that Monbiot’s suggestion above is untrue.

    Dave Clarke is right, Monbiot did provide a junkscience.com reference in footnote 14 (not 17). I can only assume that in scrolling I failed to notice it and saw only the other two urls in the remaining two lines of the footnote. I need not have bothered searching for (another) link on junkscience to confirm that Monbiot was correct in claiming that Singer’s figures were carried on junkscience.com.

    I remain unimpressed by Monbiot’s “characteristic rigorous research and references” if, as his article would indicate, he failed to ask Singer for a complete reference for the mysterious paper, and overlooked the obvious possibility that the unsatisfactory reference was only a hazy recollection ten years on. I would certainly have tried a Google search first, rather than going “through every edition of Science published in 1989, both manually and electronically”. And references 3, 4, 17 and 18 cannot all be 5th May 2005 unless the chronology of events in his article is inaccurate. Characteristic rigorous research and references?

    And a final niggle for Dave (totally irrelevant to the substance of this post, but nevertheless a response should accurately quote the original): the abstract I quoted indicated a change in the percentage of advancing glaciers:

    Between 1960 and 1980, on the basis of data for about 400 to 450 glaciers observed each year, advancing glaciers are shown to have increased from about 6% of observed glaciers to 55%.

    not

    a valid published study that stated that *55% of 446* glaciers in the European Alps advanced *between 1960-1980*

  336. Posted Jul 26, 2008 at 4:53 PM | Permalink

    Re #70

    It appears that it’s Friends of Science who are making this claim; perhaps you should ask them to correct it?

    http://www.friendsofscience.org/index.php?id=116

  337. Dave Clarke
    Posted Jul 26, 2008 at 5:12 PM | Permalink

    #70

    You can, evidently, make this stuff up.

    Ross,
    As I’ve already stated, it was FOS, who apparently can and did “make this stuff up” and claimed that Joe Barton sent his letter to Michael Mann “in response” to testimony from you and Tim Ball and the viewing of the film “Climate Catastrophe Cancelled”. You should take the matter up with the Friends of Science, not me.

    See: http://www.friendsofscience.org/index.php?id=116

    I’m sure FOS will correct the error as soon as possible. But certainly I’ll be even more vigilant in examining their claims in future. It’s clear to me now that nothing they say can be believed. For example, it has come to my attention that FOS claimed in 2006 that they do not “employ lobbyists”, yet the federal Canadian registry shows that Morten Paulsen and Bryan Thomas of Fleishman-Hillard were registered to lobby on behalf of the Friends of Science from March through mid-August, 2006.

    I do stand by my assertion that the production and dissemination of the FOS CCC film, in which you appeared, was orchestrated by APCO Worldwide, which received in excess of $100,000 from Barry Cooper’s “Climate Change” research fund at the University of Calgary to “provide strategic communications services” for Barry Cooper’s project. These services included “advice regarding video production, promotion of the video, distribution of the video, media relations services and other services”, according to a U of C investigation. The report can be viewed here:

    http://sourcewatch.org/images/4/4b/U_of_C_Auditor%27s_Report_April_14_2008.pdf

    You were listed on the 2002 FOS website as one of four “professional contacts”, the precursors of FOS Scientific Advisors, along with Tim Ball. But your name may well have been removed by 2005, and given what we now know about FOS, perhaps you were unaware that you were listed on their website in the first place.

    Good luck with getting FOS to set the record straight.

  338. Dave Clarke
    Posted Jul 26, 2008 at 5:29 PM | Permalink

    #69
    Peter,
    I believe you have demonstrated the ultimate genesis of Bellamy’s and Singer’s flagrant errors. This points to gross incompetence and extreme bias, rather than deliberate malice, as the main drivers in this matter, as with much misguided “skeptic” material.

    But Monbiot was quite correct – there is no data set that demonstrates advancing glaciers since 1980. Thus it’s exceedingly bizarre that you keep calling for Monbiot to issue a correction, and not Bellamy, Singer, Milloy et al. Uh, Steve, you can jump in any time – surely you’re not on board with this.

  339. joy
    Posted Jul 26, 2008 at 6:40 PM | Permalink

    DaveC: (tip)
    Ask politely. A demand is simply a dare to do the opposite.

  340. Ross McKitrick
    Posted Jul 26, 2008 at 11:11 PM | Permalink

    I have let FOS know that the web page in question has erroneous material on it, and I am sure it will be rectified promptly. I have met some of the people who started FOS, and I’ve always found them to be decent, sincere people. They arranged for me to do a talk in Calgary last year on the T3 tax, and they were good hosts.

  341. Real Richard Sharpe
    Posted Jul 26, 2008 at 11:20 PM | Permalink

    OK, now I know we are doomed. Global warming will cause an explosion in the kitten population.

  342. Mike N
    Posted Jul 27, 2008 at 12:27 AM | Permalink

    Hi Dave, re: #74

    “But Monbiot was quite correct – there is no data set that demonstrates advancing glaciers since 1980”

    Here are a couple of advancing glaciers in the Alps to start you out with: (from the NSIDC database)
    A4J131IS009
    A4J131IS031
    A4J131IS033
    A4J131IS034
    A4J131IS043
    A4J131IS046
    A4J131IS102
    A4J131IS107
    A4J131IS110
    A4J131IS112
    A4J131IS121
    A4J131LI014
    A4J131LI022
    A4J131MO023
    A4J131MO026
    A4J131MO043
    A4J131MO045
    A4J143FA006
    A4J143FA014
    A4J143FA015
    A4J143FA020
    A4J143OE015
    A4J143OE018
    A4J143OE019
    A4J143OE023
    A4J143OE034
    A4J143OE058
    A4J143OE059
    A4J143OE064
    A4J143OE068
    A4J143OE069
    A4J143OE070
    A4J143OE075
    A4J143OE076
    A4J143OE090
    A4J143OE094
    A4J143OE096
    A4J143OE099
    A4J143OE101
    A4J143OE106
    A4J143OE108
    A4J143OE113
    A4J143OE117
    A4J143OE126
    A4J143OE127
    A4J143OE128
    A4J143OE144
    A4J143PI015
    A4J143PI018
    A4J143PI019
    A4J143SA050
    A4J143SA051
    A4J143SA053
    A4J143SA054
    A4J143SA056
    A4J143SA058
    A4J143SA063
    A4J143SA066
    A4J143SA068
    A4J143SA069
    A4J143SA071
    A4J143SA073
    A4J143SA092
    A4J143SA105
    A4J143SA117
    A4J143SA124
    A4J143SA125
    A4J143SA130
    A4J143SA131
    A4J143SA132
    A4J143SA152
    A4J143SA154
    A4J143SI002
    A4J143SI039
    A4J143SI044
    A4J143SI046
    A4J143SI056
    A4J143SI058
    A4J143SI060
    A4J143SN014
    A4J143SN020
    A4J143SN045
    A4J143ZI003
    A4J143ZI008
    A4J143ZI015
    A4J143ZI016

  343. kim
    Posted Jul 27, 2008 at 4:57 AM | Permalink

    332 (DeW P.) I wish I knew who said ‘The climate is the continuation of the ocean by other means’.
    ========================================================

  344. Bob Koss
    Posted Jul 27, 2008 at 6:22 AM | Permalink

    Steve,

    The URL to Roger Pielke Sr. in the left-side frame no longer goes to the proper page. The correct URL is now http://climatesci.org/

  345. Posted Jul 27, 2008 at 6:42 AM | Permalink

    Re #329

    That’s the normal process of upwelling which happens in the eastern Pacific. It is suppressed by El Nino, but when the Boy’s not in Town, it brings cold water and nutrients to the fish and hence the fishermen of Peru.

    It means that the domination of El Nino since 1998 is over, and La Nina or neutral conditions should dominate, especially with the PDO in a negative (ie cold) phase.

    In Alaska, they’ve had a rotten summer, setting new records for the fewest days above 65F, and it’s the same phenomenon of the PDO being in a negative phase.

  346. Real Richard Sharpe
    Posted Jul 27, 2008 at 8:36 AM | Permalink

    Kim, that was Santa Claus!

  347. kim
    Posted Jul 27, 2008 at 9:22 AM | Permalink

    Ah, a Claus wit, eh?
    =============

  348. harold
    Posted Jul 27, 2008 at 10:18 AM | Permalink

    A pensive Monbiot on howtoboilafrog which also has interviews with Oreskes and Hansen. Two quotes:

    {Runaway climate change} is the moral issue of our times. This is an emergency similar to that that confronted the world with the rise of the Axis Powers.

    Have we become so soft and so selfish that we can’t make those tiny sacrifices in order to potentially save the lives of hundreds of millions of people.

    http://nl.youtube.com/watch?v=mas-8Vf301k

  349. Dave Clarke
    Posted Jul 27, 2008 at 12:29 PM | Permalink

    #73 (Dave C)

    It’s clear to me now that nothing they say can be believed.

    #78 (Ross M)

    I have met some of the people who started FOS, and I’ve always found them to be decent, sincere people.

    I certainly don’t want to leave the impression that FOS leaders were deliberately misleading in their press release of August 10, 2005. But it seems they may not be fully aware of what is, or is not, being done in their name. For example, FOS President Douglas Leahey stated that FOS-sponsored radio ads, broadcast in targeted Ontario markets during the last federal election campaign, did not run “in Toronto or Ottawa because the cost was too high in the larger cities.”

    http://tinyurl.com/5t7aj4

    Yet the University of Calgary report stated that Ottawa was one of the five markets selected by Morten Paulsen’s consulting firm, as reported by CanWest (in the Calgary Herald): “The ads ran only in vote-rich Ontario during the election in the regions of Kitchener-Waterloo, London, Ottawa, Peterborough and Thunder Bay.”

    http://tinyurl.com/58z98q

  350. STAFFAN LINDSTROEM
    Posted Jul 27, 2008 at 3:27 PM | Permalink

    #337, 338 What a coincidence: the Swedish evening
    paper “Aftonbladet” has for web TV (flash needed)
    RP11 can download it… SNOW IN SYDNEY FOR THE FIRST TIME
    IN ALMOST 200 YEARS!! (1836 actually) …Yes, I mean Sydney
    NSW, Aussieland…Checking SMH took 10 seconds to debunk
    this claim of snow: 8-10C and winter hail, this happens now
    and then even up in Brisbane…When we have real snow
    settling in Sydney…Thanks for the coffee!!=Nice being here
    but time to leave now..Aftonbladet you should know has
    had some two years of “Klimathotet” THE CLIMATE THREAT campaign
    so I phoned the paper’s editor responsible for the evening, asking
    him if they had zero fact checking, thank you, I’ve got
    the message…Doubt it…A BOM spokesperson would have explained
    that snowflakes never settle at 10C…

  351. Philip_B
    Posted Jul 27, 2008 at 4:12 PM | Permalink

    Staffan, there was a comment by a BoM person that the last recorded instance of snow in Sydney, in the 1830s, was probably the same phenomenon as what was seen yesterday.

  352. Real Richard Sharpe
    Posted Jul 27, 2008 at 6:51 PM | Permalink

    Shame on you Staffan, technically it was just soft hail!

  353. STAFFAN LINDSTROEM
    Posted Jul 27, 2008 at 8:43 PM | Permalink

    340,341 …Awake again…that is not sleeping…Just
    got informed by my “boss” and newspaper (certainly not
    Aftonbladet…there are limits…!!) distributor colleague
    that it would be good if I worked…on my first
    day of summer holidays…He,Rolf, is also one of the persons
    who do the observations on the Stockholm Observatory Hill
    some 5-6 days a month…We have a stretch run here too…
    Will Stockholm Observatory experience its first “noche
    tropical” since July 12 2005??? The excitement is almost
    unbearable…untropicalbearable perhaps LOL FYI we
    don’t use the Spanish words here in Sweden, I just thought
    it sounded sexier like being in Santo Domingo Dominican
    Republic…Latest reading (automatic) 03:00 Swedish time
    was … 20.3C It’s a pretty cloudy night, but yesterday
    was not as “hot” as saturday 31C or so …29C perhaps
    so Here is my question for the Climate Audit community:
    Should not a “tropical night” have three requisites:
    1. Temperature during night 19-07 or 20-08 never lower
    than 20.0C …
    2. During the Tmin reading, skies MUST be clear or almost
    clear…
    3. NO FOEHN effect that is temperature doping,
    Sorry Norway and Tafjord!! You never have REAL tropical
    nights in late October or early November…!! SYDNEY precipitation:
    Soft hail, like Israeli “Sabras”: hard cover, soft core…
    If you want to see a soft hail storm, go to YouTube
    and search for Bogota hailstorm…Nobody killed!!??
    Providence rules… The Germans would have called it “Graupelhölle” (graupel hell)??
    …Stockholm Observatory … thundery showers coming in to
    cool off … but 04:00 20.2C …MORE LATER after work,
    a real cliffhanger that you may follow better than I …
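
    The three requisites above amount to a simple classification rule. A minimal sketch (the hourly temperatures and the sky and foehn flags are hypothetical inputs):

        def is_tropical_night(night_temps_c, clear_sky_at_tmin, foehn_effect):
            # 1. never below 20.0 C during the night window (19-07 or 20-08)
            # 2. clear or almost clear skies at the Tmin reading
            # 3. no foehn 'temperature doping'
            return min(night_temps_c) >= 20.0 and clear_sky_at_tmin and not foehn_effect

        # The night described above: Tmin just above 20 C, but a pretty cloudy night
        print(is_tropical_night([20.3, 20.2, 20.5], clear_sky_at_tmin=False, foehn_effect=False))  # False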

  354. Posted Jul 28, 2008 at 6:09 AM | Permalink

    SteveMc.
    The link to PielkeSr’s blog points to his old domain, which is now screwed up. Could you fix to:
    climatesci.org?
    Lucia

  355. Kenneth Fritsch
    Posted Jul 28, 2008 at 8:36 AM | Permalink

    Should not a “tropical night” have three requisites:

    Staffan, your excitement may not allow you to interpret my comment, but OK anyway. We must leave “tropical night” as a feeling of romance even for the northerly disposed. Think blithe and beautiful women in white clingy dresses with gentlemen attending in white sport coats, all swaying in the tropical breeze with glistening foreheads.

    Even a log cabin somewhere in Swedish winter bedecked with palms and fireplace and fan turned to high with sufficient Absolut becomes my meaning of tropical night.

  356. steven mosher
    Posted Jul 28, 2008 at 9:45 AM | Permalink

    RE 80.

    When Monbiot got to the point in his interview where he was arguing that people should
    give up watching TV, I complied and turned it off.

  357. Posted Jul 28, 2008 at 10:54 AM | Permalink

    #82 steve mosher

    I didn’t even get that far. As soon as he started doing the crazy-staring-eyes thing I had to switch off…

  358. STAFFAN LINDSTROEM
    Posted Jul 28, 2008 at 11:51 AM | Permalink

    #Kenneth, I think you hit the nail on the head there,
    …In the tropics all nights are tropical at around sea level
    but as you may be aware of…in the outer parts of tropical
    Pacific NE of New Caledonia Ile Lifou Ouanaham AP, some winter
    months JJA report 7C on some nights…28 m ASL!! Noumea has
    seldom below 16C at the same time…Ounaham is just a little more
    than 16 degrees from the equator…Bring a sweater or two to South
    Pacific, outgoing radiation under clear and very clean skies is
    enormous… Noumea is about 500 km SW of Ouanaham. I was going
    to call it a water desert…Yes and yours truly was right[SEE ESA] due
    to lack of phytoplankton these waters are very reflective…
    And Mr Lindström, was the night so called “tropical” at Stockholm
    Observatory …Heck NO…Latest time was July 12, 2005, to the
    best of my research…1112 days and counting…The thundery
    showers came in, double rainbow, appropriate as Stockholm is
    Europride this week…5 mm of rain or so 07:00 17.7C …Bad
    luck again warmers…

  359. See - owe to Rich
    Posted Jul 28, 2008 at 11:58 AM | Permalink

    Re #324 and #327 etc. on the ocean pipeline.

    I think we should view the oceans as a continuum of pipelines, with approximately reversible characteristics. Thus, if it takes n years for heat to filter down to stratum n, you expect it to take n years to filter back up again. I know it’s more complicated than that, but this allows us to think that if CO2 has been warming the oceans mostly since its sharp upturn 50 years ago, it won’t take more than 50 years to come out. And in fact, the biggest strata will be the most recent and shallow strata.

    Of the warming caused by CO2, some will hit the ocean as long wave radiation, and some will hit it as conduction with the surface layer. What proportion of a 0.4 degrees warming over 30 years will actually penetrate into the oceans? If it’s a large proportion then the pipelines will be large, but if, as I suspect, it’s a small proportion then the pipelines will be small.

    Presumably some real climatologists will be trying to theorize and measure that.

    Rich.

  360. Sam Urbinto
    Posted Jul 28, 2008 at 1:19 PM | Permalink

    Dave Clarke said:

    I do stand by my assertion that the production and dissemination of the FOS CCC film, in which you appeared, was orchestrated by APCO Worldwide, which received in excess of $100,000 from Barry Cooper’s “Climate Change” research fund at the University of Calgary to “provide strategic communications services” for Barry Cooper’s project

    So Cooper’s fund at the UoC donated $100,000 to APCO. Big deal. Non sequitur.

    ExxonMobil created Save the Tiger Fund and donated $700,000 to Leuser International Foundation.

    Starbucks, JPMorganChase, Duke Energy, DOW, BT and BP help fund The Climate Group.

    So?

  361. DeWitt Payne
    Posted Jul 28, 2008 at 1:38 PM | Permalink

    Rich,

    I think you are referring to a sort of transmission line model. But for calculation purposes, you have to use a small number of elements, especially for a GCM where there are rather drastic limits on detailed computations. So the question is still, how many elements and what are the individual time constants. The first one has to be relatively short as demonstrated by how rapidly volcanic and ENSO events come and go. The deep ocean time constant is almost certainly on the order of kiloyears and can probably be neglected for estimates on the order of a century or two.

    If you look at sea level rise, it started around the turn of the twentieth century and was fairly constant for the rest of the century. There was no acceleration in the last 50 years of the century. If you look at sea level as a lagging by some unknown but fairly small factor proxy for global average temperature, it calls into question the accuracy of the surface instrumental record. There was no slowing from the 50’s to the 70’s and no acceleration in the 80’s and 90’s. OTOH, if the sea level acts as a low pass filter with a time constant on the order of 50 years, we wouldn’t really be able to see short time scale phenomena like that.
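
    The low pass filter idea is easy to make concrete. A minimal sketch of a first-order filter with a 50 year time constant, applied to a step input (illustrative only):

        def low_pass(series, tau_years):
            # First-order exponential filter: response of a reservoir with time constant tau
            out = [0.0]
            alpha = 1.0 / tau_years
            for x in series[1:]:
                out.append(out[-1] + alpha * (x - out[-1]))
            return out

        step = [0.0] * 20 + [1.0] * 100
        filtered = low_pass(step, 50)
        # ~63% of a step shows up only after ~50 years, so decadal wiggles are mostly invisible
        print(round(filtered[40], 2), round(filtered[70], 2), round(filtered[119], 2))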

  362. Gunnar
    Posted Jul 28, 2008 at 2:22 PM | Permalink

    #346, the situation you describe is like trying to heat an Olympic-sized swimming pool with a hair dryer.

    >> if CO2 has been warming the oceans mostly since its sharp upturn 50 years ago, it won’t take more than 50 years to come out. And in fact, the biggest strata will be the most recent and shallow strata.

    That’s not just simplified, it’s incorrect. Ocean currents are driven by the sun and earth’s orbit. Water is cooled at the poles, and sinks to the ocean floor. Water rises in the equatorial regions. Water is a much better conductor of heat than air. The bottom line is that the ocean is not like a stack, ie last in, first out.

    >> What proportion of a 0.4 degrees warming over 30 years will actually penetrate into the oceans? If it’s a large proportion then the pipelines will be large, but if as I suspect it’s a small proportion then the pipelines will be small.

    Air temp is driven by water temp, not the other way around. However, if the air is heated by some external means, so that the average air temp is greater than the ocean surface, then heat transfer takes place. Since the ocean has over 1000 times the energy of the atmosphere, the extra heat is quickly absorbed into the ocean.

    For example, suppose both were at equilibrium at 293 K, and the air temperature was then raised to 294 K. There would be 5.1 E 21 Joules of extra energy in the air. The delta T would cause heat transfer to take place. The water temperature would be increased, and the air temp would drop. The resulting equilibrium temperature of both air and water would be 293.00087.

    Conclusions:

    1) the pipeline is potentially very long.
    2) the proposed mechanism for heating is very inefficient.
    3) if it was occurring, it would take a very long time to be reflected in air temperatures
    4) if it was occurring, it would be very hard to detect in water temperatures
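
    The figures above check out against standard values for the two reservoirs. A quick verification sketch (the masses and heat capacities are rough textbook approximations, assumed rather than taken from the comment):

        M_ATM = 5.14e18     # mass of atmosphere, kg
        CP_AIR = 1004.0     # J/(kg K)
        M_OCEAN = 1.4e21    # mass of ocean, kg
        CP_WATER = 4186.0   # J/(kg K)

        extra_energy = M_ATM * CP_AIR * 1.0          # air warmed from 293 K to 294 K
        print(f"{extra_energy:.2e} J")               # ~5.2e21 J, close to the 5.1e21 quoted

        dT = extra_energy / (M_ATM * CP_AIR + M_OCEAN * CP_WATER)
        print(f"{293 + dT:.5f} K")                   # ~293.00088, matching the comment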

  363. cba
    Posted Jul 28, 2008 at 5:33 PM | Permalink

    co2rich 359:

    Rich,

    I think there’s been mention around here that the PDO is known to shift shake rattle and roll the ocean around to uncover lower strata in large areas – and the like.

    That tosses in even more fun – where the pipeline has major leaks and even length changes.

    DeWitt:

    I’ve completed the mod. correcting line widths at lower pressures. There are some differences but it is still not quite the same. Evidently modtran differs by as much as a percent or more from my model for radiated power. It is still quite close considering the difference between approaches and the like. I still have some higher absorption and emission going on at fairly high altitudes than modtran puts out, but up there their data is getting rather sparse, comparatively speaking.

    I should have some graphs soon concerning comparisons with modtran.

  364. DeWitt Payne
    Posted Jul 28, 2008 at 6:30 PM | Permalink

    cba,

    The MODTRAN on the web at the Archer site is version 3. I think the latest version available for purchase is at least version 4. So I’m not at all surprised you get slightly different results. At very high altitudes the nitrogen (N2*N2) and oxygen (O2*O2 and N2*O2) continua might start to be significant as well. But then you cut off at 64 microns and those continua peak at about 100 microns. Any progress with H2O self and foreign continua?

  365. Joel Black
    Posted Jul 28, 2008 at 6:42 PM | Permalink

    Hey Steve,
    Sorry to interject if it’s OT, but I’ve asked this question over at RC and Gavin won’t answer. What areas of AGW are currently unknown to climatology and the subject of current research? I want to be able to pay attention to current issues, rather than read over and over about “settled” science.

  366. Pat Keating
    Posted Jul 28, 2008 at 7:14 PM | Permalink

    365 Joel Black
    Here are some unknowns or problem areas to start with:
    – Climate sensitivity (the feedback multiplier)
    – Clouds
    – Precipitation
    – Moist Convection
    – Ocean ‘pipeline’

    As you can see, there’s quite a lot of important unsettled science……..

  367. Kenneth Fritsch
    Posted Jul 28, 2008 at 7:46 PM | Permalink

    Re: Staffan, I did not hit your nail on its head because I was going to change the subject on you. Your Northern Euro directness might not allow me to talk from the heart when we have reasonable subjects to discuss. I spent the afternoon in the Andersonville neighborhood in Chicago, IL eating lunch, and I know all Swedish subjects are not always reasonable, as I saw at the Swedish bakery and delicatessens in Andersonville.

    Anyway you will make me audit further what you say about tropical conditions… in the tropics and in Sweden.

  368. trevor
    Posted Jul 28, 2008 at 7:57 PM | Permalink

    365 Joel Black

    1. How the CO2 cycle really works, lags etc.
    2. Processes of desertification resulting in droughts. Land-use issues such as clear felling of forests, impacts of modern monoculture farming practices and damage to soil, impacts of irrigation disturbing natural processes, and many more.

    plus those mentioned by Pat Keating.

  369. Dave Clarke
    Posted Jul 28, 2008 at 8:07 PM | Permalink

    #68 (was 70)

    You can, evidently, make this stuff up.

    Ross,
    As I’ve already stated, it was FOS, who apparently can and did “make this stuff up” and claimed that Joe Barton sent his letter to Michael Mann “in response” to testimony from you and Tim Ball and the viewing of the film “Climate Catastrophe Cancelled”. You should take the matter up with the Friends of Science, not me.

    See: http://www.friendsofscience.org/index.php?id=116

    I’m sure FOS will correct the error as soon as possible. But certainly I’ll be even more vigilant in examining their claims in future. It’s clear to me now that nothing they say can be believed. For example, it has come to my attention that FOS claimed in 2006 that they do not “employ lobbyists”, yet the federal Canadian registry shows that Morten Paulsen and Bryan Thomas of Fleishman-Hillard were registered to lobby on behalf of the Friends of Science from March through mid-August, 2006.

    I do stand by my assertion that the production and dissemination of the FOS CCC film, in which you appeared, was orchestrated by APCO Worldwide, which received in excess of $100,000 from Barry Cooper’s “Climate Change” research fund at the University of Calgary to “provide strategic communications services” for Barry Cooper’s project. These services included “advice regarding video production, promotion of the video, distribution of the video, media relations services and other services”, according to a U of C investigation. The report can be viewed here:

    http://sourcewatch.org/images/4/4b/U_of_C_Auditor%27s_Report_April_14_2008.pdf

    You were listed on the 2002 FOS website as one of four “professional contacts”, the precursors of FOS Scientific Advisors, along with Tim Ball. But your name may well have been removed by 2005, and given what we now know about FOS, perhaps you were unaware that you were listed on their website in the first place.

    Good luck with getting FOS to set the record straight.

    Steve: Dave, you say: “You should take the matter up with the Friends of Science, not me.” Dave, I’m taking it up with both parties. FOS has been asked to correct their information. But now that you are aware that this information is incorrect, you should not yourself disseminate false information. You no longer have an excuse for spreading false information; that’s why we’ve taken it up with you. It would be easy enough for you to simply say – I misunderstood the situation for a valid reason and thanks for taking the trouble to clarify it for me. I also try to restrict discussion to scientific matters and have repeatedly not allowed people to discuss Soros, Gore’s investments etc. They’ve been mentioned from time to time, but I try to discourage it as I am also doing here. What’s sauce for the goose is sauce for the gander. I’ve allowed you to make your point, but in the future please discuss scientific points as I don’t want people to use these posts by you as precedents for demanding that we discuss Gore finances.

  370. Dave Clarke
    Posted Jul 28, 2008 at 8:11 PM | Permalink

    Steve,
    I’m sure it was a mistake, but if you are going to move OT posts, you should have moved Ross’s completely unjustified attack when you moved the two responses (mine and Phil’s) setting the record straight.
    – Dave

    Steve- I left a couple of posts but snipped snark from Ross’ post and moved the conversation to UT.

  371. Joel Black
    Posted Jul 28, 2008 at 8:11 PM | Permalink

    Thanks to the responders, but are any of you “climatologists”? If not, are there any out there willing to speak about what is not “settled” in the field? Not to be disrespectful of those in other disciplines, but from what I’ve read, it appears that climatologists do not consider the opinions of those outside of their ranks to be of much value.

  372. gens
    Posted Jul 28, 2008 at 8:28 PM | Permalink

    Dave Clarke, Do you not see the irony in your posts here? In #331, you fault Bellamy for relying on Singer. In #369, you say you are blameless because you relied on FOS. Pretty funny.

  373. cba
    Posted Jul 28, 2008 at 8:33 PM | Permalink

    DeWitt:

    I may have found some tidbit here on the modtran 3. It’s not showing variations around 50km, where there is a slow decay of density/pressure and swings of 50 Kelvins in temperature up and down, which should make noticeable differences in emissions. This is where my model varies most.

    Here is where the differences are in a doubling.

    Note that the output is the radiative transfer at the top of whatever altitude is on the graph, so it’s also cumulative in effect.

    There are no added factors yet, such as the continua, particulates and aerosols. The modtran looks as if it might be a more idealized approach as well. The only other thing I can imagine that might be in error is a failure in the implementation of the line calculations, in adapting them to other temperatures and pressures. These have been checked once already and seemed to be OK.

    Hopefully, I’ll have a couple of hours soon to try implementing the energy balance again. It’s probably going to be fairly close to the first attempt as there hasn’t been a tremendous difference in results.

    I have also implemented a much better data reduction for graphing, bringing the data down by a factor of 50 or so in order to actually be able to graph the spectrum images.

    You’ll note that at the 70km altitude (seems to be max. for modtran as there’s no variations in OLR above that to 1/1000 of a w/m^2) the added absorption or lack of emission between now and a doubling is around 2.1W/m^2 as compared to modtran’s result of 2.8W/m^2 and the hitran model increases to around 2.4W/m^2 at slightly higher altitudes. In both cases, they’re substantially less than the 3.8 or more W/m^2 of the established AGW oriented literature.

  374. Dave Clarke
    Posted Jul 28, 2008 at 10:05 PM | Permalink

    #70
    Steve,
    That’s a lot better, but I still think it’s fair to correct the impression that the original statements of the OP (me) were baseless. So with your indulgence … Not that anyone will ever make it this far.

    #68
    For the record:
    The account of the testimony of Tim Ball and Ross McKitrick before a US congressional committee, and the subsequent Barton letter sent to Michael Mann “in response”, came from a Friends of Science press release, dated August 10, 2005, not from the original poster. It does appear that the press release was in error.

    Ross McKitrick was one of four names on the FOS list of “professional contacts” or “academic contacts” (the precursor to the Science Advisory Committee) from 2002 through at least early 2005, according to archived versions of the FOS website.

    Steve: Your statements were baseless. However, you had at the time a valid excuse because of the error on the FOS website and thus, although your statement was baseless, you were not blameworthy. However that doesn’t mean that the statements themselves were not baseless. Capisce?

  375. bender
    Posted Jul 28, 2008 at 10:14 PM | Permalink

    Author: Rind D
    Source: BULLETIN OF THE AMERICAN METEOROLOGICAL SOCIETY Volume: 89 Issue: 6 Pages: 855-864 Published: JUN 2008

    Abstract: We still can’t predict future climate responses at low and high latitudes, which constrains our ability to forecast changes in atmospheric dynamics and regional climate.

    But the “settled” science … what happened?

  376. DeWitt Payne
    Posted Jul 28, 2008 at 10:15 PM | Permalink

    cba,

    MODTRAN only calculates Fi, or the instantaneous forcing. Look at the OLR deficit, Fi, at the tropopause, ~20 km looking down. That’s the number that is approximately equivalent to the IPCC Fa, which is calculated at the top of the atmosphere after allowing the temperature profile in the stratosphere to equilibrate to the new ghg concentration. Fi is higher for a tropical atmosphere and lower for subarctic. In MODTRAN the temperature profile is fixed, so all you can calculate is Fi. Whatever altitude you look at, though, if you adjust the surface temperature to eliminate the OLR deficit, the surface temperature offset ends up being the same. Hansen et al., Efficacy of Climate Forcings, 2005 goes into this in some detail. I’ve linked to it elsewhere, but it’s easily found in a search and the full article is available online.

    You haven’t fixed the problem with your program. There isn’t enough stuff or energy at 60km to emit or absorb as much as you calculate. If there were, somebody would have seen it. You need to run some cross checks like heating and cooling rates and see if they are as far out of balance as I think they are. For example, take a 10 km slice of the atmosphere centered on 60 km altitude. What’s the total mass/m2? What’s the average temperature? Now use the dry air heat capacity of 1004 J kg-1 K-1 and your calculated emission rate to calculate a cooling rate in degrees/day. Now do the same thing for absorption. Looking at mass extinction coefficients wouldn’t be a bad idea either because in order to get that sort of emission, the atmosphere would have to be opaque at that altitude over a fairly wide wavelength range. Petty’s charts stop at 30 km and the heating and cooling rates increase rapidly with altitude. But at 30 km the rates are only about 3 degrees/day.
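
    That cross check is a few lines of arithmetic. A sketch for a 10 km slice centered on 60 km, using approximate US Standard Atmosphere pressures (the numbers are illustrative, not taken from the model under discussion):

        P_55KM_PA = 42.5    # ~pressure at 55 km altitude
        P_65KM_PA = 10.9    # ~pressure at 65 km altitude
        G = 9.81            # m/s2
        CP = 1004.0         # dry air heat capacity, J/(kg K)

        mass_per_m2 = (P_55KM_PA - P_65KM_PA) / G    # hydrostatic mass of the slice, ~3.2 kg/m2

        def cooling_rate_k_per_day(net_emission_wm2):
            # Degrees/day implied by a given net radiative loss from the slice
            return net_emission_wm2 * 86400.0 / (mass_per_m2 * CP)

        print(cooling_rate_k_per_day(1.0))  # even 1 W/m2 net loss implies ~27 K/day up there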

  377. kim
    Posted Jul 29, 2008 at 6:43 AM | Permalink

    26 (John Lang) The trouble is where in the pipeline is the heat? Originally it was claimed to be in the ocean. The Argo buoys showed oceanic cooling to 2,000 meters, so the suggestion became that the ‘extra heat’ was in the deep oceans. Now that the Aqua satellite is showing sea level dropping, presumably by thermal contraction, that idea went out the window. Kevin Trenberth, in a famous quote for NPR, speculated that maybe the ‘extra heat’ was radiated back out into space. I suspect he’s right, there.
    ============================================

  378. cba
    Posted Jul 29, 2008 at 7:03 AM | Permalink

    DeWitt,

    I may have finally fixed the real problem. Late last night I went through the software calculations and discovered an inversion of a ratio of reference vs current T energy states (QT/QTref). It seems the comment line immediately above the code line was correct and the code line had that part of the eqn inverted. After running the OLR again, it was matching Modtran to about +/- 3% up to around 70km. The variations for the lower troposphere below 16km are mostly within 1%. All of the larger errors seem to be associated with the modtran asymptotically approaching a slightly higher W/m^2 output than my code is doing, plus there is a bump at 50km due to increased T that reverses the sign of the % diff., and there seems to be no fluctuation for modtran olr there despite substantial variations in T.

    Trying to do the inbound calcs and the energy balance will have to wait until at least tonight. I’m pretty sure that the energy balance ought to be much closer than the original from last month. The values are slightly less at 70km (and most other altitudes other than 50km) than the modtran olr, whereas before there was a bit more – indicating less absorption and more emission up there. The results initially indicate too that it was working quite well differentially for relative comparisons despite the error causing absolute problems of significance like stratos. energy balance.

    Here is the modtran / 1d model comparison again after the corrections to the software.
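
    For reference, the standard HITRAN-style temperature scaling of a line intensity, with the partition function ratio oriented the way the comment line evidently intended (whether this matches the code in question exactly is an assumption):

        import math

        C2 = 1.4387769  # second radiation constant, cm K

        def line_strength(s_ref, t, t_ref, e_lower, nu, q_tref_over_qt):
            # q_tref_over_qt must be Q(T_ref)/Q(T); passing Q(T)/Q(T_ref) instead is
            # exactly the kind of inversion bug described above
            # e_lower: lower-state energy (cm-1), nu: transition wavenumber (cm-1)
            boltzmann = math.exp(-C2 * e_lower / t) / math.exp(-C2 * e_lower / t_ref)
            stimulated = (1 - math.exp(-C2 * nu / t)) / (1 - math.exp(-C2 * nu / t_ref))
            return s_ref * q_tref_over_qt * boltzmann * stimulated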

  379. Steve Geiger
    Posted Jul 29, 2008 at 7:08 AM | Permalink

    Does Hansen himself still stand by the model whose results are presented here (originally from the late 1980’s)? Has he provided a refined model since that time that might have more realistic forcing assumptions and/or different formulations? I agree it’s important to go back and verify models of the past, but wouldn’t *they* be inclined to just argue that they have a better understanding now (which, perhaps again, can’t be verified for many more years to come)? As with most of these posts, what I’m immediately most interested in is “what would Hansen say” (WWHS), or for that matter Tamino, Gavin, etc. BTW, Lucia does a very nice job of trying to anticipate the primary arguments ‘against’ her blog analyses… I for one really appreciate that approach.

    Thanks

  380. Posted Jul 29, 2008 at 7:19 AM | Permalink

    Seems like everyone in climatology is coming down with a bad case of divergence. I wonder if they’ll next prove that it’s an infectious agent, via computer model?

  381. Carl Gullans
    Posted Jul 29, 2008 at 7:28 AM | Permalink

    #30: You can play that game all you want, but if (I am not saying that they have) model after model fails verification, you tend to question whether you truly have a grasp on the underlying process.

    Are any of these models cross-validated? For example, one would build a temperature model in 2008 using data from 1950-2000 and then check the accuracy of the projections against actual temperatures in 2001-2008. This is the only sensible way to check a model’s predictive power without waiting 10 years.
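
    A minimal sketch of that holdout idea with synthetic data (illustrative only; real model validation is far more involved than a linear trend):

        import numpy as np

        def holdout_rmse(years, temps, split_year):
            # Fit a linear trend up to split_year, score it on the held-out later years
            train = years <= split_year
            slope, intercept = np.polyfit(years[train], temps[train], 1)
            predicted = slope * years[~train] + intercept
            return float(np.sqrt(np.mean((predicted - temps[~train]) ** 2)))

        rng = np.random.default_rng(0)
        yrs = np.arange(1950, 2009)
        anomalies = 0.01 * (yrs - 1950) + rng.normal(0.0, 0.1, yrs.size)
        print(holdout_rmse(yrs, anomalies, 2000))  # skill on the 2001-2008 holdout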

  382. Posted Jul 29, 2008 at 8:47 AM | Permalink

    (duplicate)

  383. bender
    Posted Jul 29, 2008 at 8:49 AM | Permalink

    #39 uncertainties in the physical sciences? biases? preposterous.

  384. joy
    Posted Jul 29, 2008 at 8:59 AM | Permalink

    Dave Clarke:
    Is your record stuck? Allow me to jog your needle. Your amazing revelation is amazing only to you because it confirms a preconceived idea about FOS, and that is a matter of politics, not science. So, “now that we all know what we know…etc” is simply an attempt to muddy the waters of the FOS. Mistakes happen. You will read into it if it suits you, but allow others to make their own minds up. A mistake does not cancel out their entire organisation. If you focussed on the scientific content rather than the funding you would reach beyond politics.

  385. Kenneth Fritsch
    Posted Jul 29, 2008 at 10:04 AM | Permalink

    A recurring theme in some climate model predictions seems to be presenting the predictions with little or no attention to the uncertainty or error bands involved, and then pointing to that information to influence climate policy (all aided by a show of hands by climate experts from time to time per IPCC methods which, while giving the appearance of some objective measure of uncertainty, are nevertheless in the end subjective).

    When those predictions are questioned, the predictions then take on the wide error bars that were initially suppressed or at least under-emphasized. As examples I give (1) Hansen’s Scenarios A, B and C, with not only the uncertainties connected to the model output given the actual inputs, but also the uncertainty involving the effects of the scenario inputs relative to the actual inputs, and (2) the recent discussion we had here at CA of the Douglass paper and the huge uncertainties that climate scientists want to push onto, not only the ensemble of climate model outputs for predicting the differential rates of temperature increase between the tropical surface and troposphere, but also the uncertainty of the chaotic content from the singly realized real world output – an uncertainty that many apparently did not even want to estimate in the Douglass case.

    This process seems to have been a major part of getting from climate science to climate policy (recommendations, at least) and in turn covering for climate science when that is required. With the apparent success of this process, I do not see its demise any time soon.

  386. Richard deSousa
    Posted Jul 29, 2008 at 10:20 AM | Permalink

    I can just imagine the warmers’ response to the latest downturn in temperatures, and it goes like this: “Well, maybe the PDO and sunspots do have an influence on the climate, but when these natural forcings cycle back to normal we will see the actual results – the temperatures will jump back up.”

  387. Richard deSousa
    Posted Jul 29, 2008 at 10:22 AM | Permalink

    Should have added “… the temperatures will jump back up and continue to climb.”

  388. bender
    Posted Jul 29, 2008 at 10:38 AM | Permalink

    #49
    Good post, Ken.

    With the apparent success of this process, I do not see its demise any time soon.

    But this is the one thing that must change. The “success” that is “apparent” now can not be sustained.

  389. jryan
    Posted Jul 29, 2008 at 10:46 AM | Permalink

    I have a silly question. I know that the major players in the AGW world have been arguing that there will be a 10 year cooling but that will have no effect in the long run on the actual AGW crisis.

    If that is the case, how is it that none of the models have replicated or predicted this cooling?

    The fact that they didn’t see it coming and refuse to alter their models to accommodate this event is the real nail in the coffin. How can you trust a group of scientists who refuse to accept or incorporate such a huge divergence from their hypothesis?

  390. George Tobin
    Posted Jul 29, 2008 at 11:13 AM | Permalink

    1) Does the warming bias identified by Pielke et al account for the difference between GISS numbers and those done by the radiosondes, satellites and the Brits? (the links to the pdfs failed, BTW)

    2) I admire your restraint and professionalism with respect to not overinterpreting the graphs in which Hansen A and B are currently failing the eyeball test. But how far out does the current rate of divergence have to extend before definitive statements can be made?

    I have on occasion waded into lucia’s marvelous number-crunching festivals that found significance with respect to IPCC AR1 projections within a range of 10 years, but Gavin Schmidt and Tamino keep asserting a figure of 30-32 years, which I don’t quite get. How can a climate model focused on an expressly short-term/immediate forcing factor like CO2 still be good even if it’s off for a couple of consecutive decades?

    For example, if John Lang’s numbers in #35 are correct, how far out do we have to go before the projected alarmist model of warming rate is declared a failure?

    Lukewarmists Unite! Our time is almost at hand!

  391. Craig Loehle
    Posted Jul 29, 2008 at 11:49 AM | Permalink

    George Tobin: “Lukewarmists”–I love it!

  392. Jedwards
    Posted Jul 29, 2008 at 12:14 PM | Permalink

    Re #35
    Nice synopsis John. I did a similar look at trending from xx date to present, and found that the “worst” trend for warming appeared if you cherrypick Apr/May 1992 as your start date. Everything else seems to suffer by comparison.

  393. R John
    Posted Jul 29, 2008 at 12:45 PM | Permalink

    Almost everyday I look at the forecast weather models and read my local NWS forecast discussion. Those models, as noted by the trained meteorologists, almost always diverge after 48 hours. In hindcast, each model has its good days and bad days in forecasting. Beyond 72 hours, all models tend to be just a guess. Yet, we have been told that these models are constantly being improved. This begs the obvious question – if models for forecasting our daily weather are only a guess beyond this time frame, then how can a longer range projection be any more accurate? Also, I would love to see a study of those three-month climate predictions that are put out each and every month in terms of an accuracy rate. Not one of them predicted our winter patterns (eg. lots of snow in upper Midwest) even after the first month of winter (Dec) to any degree of accuracy.

  394. jeez
    Posted Jul 29, 2008 at 12:58 PM | Permalink

    Re: anonymous

    Bingo.

  395. dearieme
    Posted Jul 29, 2008 at 1:12 PM | Permalink

    “As a rule (and there are few exceptions) geneticists generously make their raw data available in the public domain
    on publication of their analyses.”

    from the Acknowledgements of “The Origins of the British” by Stephen Oppenheimer, paperback edition, 2007.

  396. Pierre Gosselin
    Posted Jul 29, 2008 at 1:20 PM | Permalink

    anonymous
    You are correct. But Big Jim and Big Al seem to think they have it down pat – science is settled.
    Again, the models didn’t predict the temp drop we experienced over the last year. Clearly they forgot to, or refused to, factor in something that was awfully important.

  397. DeWitt Payne
    Posted Jul 29, 2008 at 1:20 PM | Permalink

    R John,

    nitpick: begging the question doesn’t mean what you think it means. Look it up under logical fallacies.

    On models: this has been discussed at length on other threads here. I think the expert view is that short term weather prediction models can never successfully forecast beyond a few days. It’s an initial value problem not a lack of computer power. Chaos theory, after all, was invented by a weather modeler who also coined the phrase the ‘butterfly effect’. Weather forecasting fails because you are trying to calculate one trajectory out of an infinite number of possibilities which diverge exponentially for infinitesimal changes in initial conditions. This is not to mention the paucity of data to initialize the model.

    Climate prediction may be a different matter. At least some people, lucia for one IIRC, think a solution to the problem is at least theoretically possible. Climate forecasting attempts to calculate the distribution of trajectories. As near as I can tell, though, current climate models all contain fudge factors that eliminate the chaotic behavior, like viscosities that are orders of magnitude higher than actual viscosities. This makes their forecasting skill, in my opinion, questionable no matter how well tuned they are at hindcasting.
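
    The initial-value sensitivity is easy to demonstrate with the Lorenz-63 system, the toy model behind the ‘butterfly effect’ mentioned above. A sketch (crude forward Euler, but enough to show two nearly identical starting points diverging):

        def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            # One Euler step of the Lorenz-63 equations
            return (x + dt * sigma * (y - x),
                    y + dt * (x * (rho - z) - y),
                    z + dt * (x * y - beta * z))

        a = (1.0, 1.0, 1.0)
        b = (1.0 + 1e-8, 1.0, 1.0)   # initial difference of one part in 10^8
        for step in range(3001):
            if step % 1000 == 0:
                gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
                print(step, gap)     # the separation grows by orders of magnitude
            a, b = lorenz_step(*a), lorenz_step(*b)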

  398. Pierre Gosselin
    Posted Jul 29, 2008 at 1:27 PM | Permalink

    Smokey
    The page you refer to is a little outdated. This one is from this year:

    http://ratemyprofessors.com/ShowRatings.jsp?tid=543236&page=1

  399. DeWitt Payne
    Posted Jul 29, 2008 at 2:54 PM | Permalink

    cba,

    You still have significant absorption and emission above the tropopause, including substantial absorption of OLR above 50 km. Ignore the emission bump for now. The optical density cannot be that high. MODTRAN OLR flattens out at about 20 km for a reason. There just isn’t enough absorbing mass at those altitudes. Your net total difference in absorption above about 20 km is about 14 W/m2. So above 20 km you have about 2 degrees/day warming from OLR. Then there’s going to be absorption of shortwave radiation that will increase the warming even more. Sorry, can’t and doesn’t happen.

  400. Steve W.
    Posted Jul 29, 2008 at 3:01 PM | Permalink

    #60

    “As a rule (and there are few exceptions) geneticists generously make their raw data available in the public domain on publication of their analyses.”

    My field is bioinformatics, so I regularly have to deal with this “raw” data. See my site at harvest.ucr.edu

    A little background:
    The gene sequencer produces a chromatogram – often called a trace file. This is the true raw data. This trace file is run through software called a basecaller (Phred is a common one). The basecaller reads the wave forms, and produces a sequence file (often called a flat file), and a quality value file. So, for each letter of genetic code you have a corresponding quality value called a Phred score.

    Often the first 10-30 bases of the sequence are bad. The sequence in the trace file could be 1200 bases long, but usually only about the first 800 bases are of good quality, and this varies greatly. You can also have bad patches in the middle of the sequence. You often can’t tell by looking at the sequence which parts are good or bad.

    The scientists will often upload their flat files to Genbank, but not their trace files (the actual raw data). The flat files lack quality score information, and are often vector/linker contaminated. THEY OFTEN DO NOT TRIM OFF THE LOW QUALITY BEGINNINGS AND ENDINGS, OR LIST WHERE THE GOOD QUALITY PART BEGINS AND ENDS. This so-called raw data has very little, and perhaps even negative, value. Many scientists will refuse to release the trace files to you (sound familiar?).

    This is a very common practice. The interesting thing is that some communities are very generous with their raw data (the barley community is amazing) and others are very stingy.
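
    A minimal sketch of the kind of quality trimming the flat files are missing (real trimmers use windowed error-probability methods; this simply keeps the span between the first and last base meeting a Phred threshold):

        def trim_by_quality(seq, phred_scores, threshold=20):
            # Drop the low-quality beginnings and endings described above
            good = [i for i, q in enumerate(phred_scores) if q >= threshold]
            if not good:
                return ""
            return seq[good[0]:good[-1] + 1]

        read = "NNACGTACGTACGTNN"
        quals = [2, 5, 30, 35, 38, 40, 39, 37, 36, 38, 35, 33, 30, 28, 8, 3]
        print(trim_by_quality(read, quals))  # ACGTACGTACGT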

  401. Peter M.
    Posted Jul 29, 2008 at 4:24 PM | Permalink

    Sorry for double posting.
    This editor seems a little bit crazy ;-)

  402. Jerry
    Posted Jul 29, 2008 at 4:28 PM | Permalink

    Climate prediction may be a different matter. At least some people, lucia for one IIRC, think a solution to the problem is at least theoretically possible. Climate forecasting, attempts to calculate the distribution of trajectories. As near as I can tell, though, current climate models all contain fudge factors that eliminate the chaotic behavior, like viscosities that are orders of magnitude higher than actual viscosities. This makes their forecasting skill, in my opinion, questionable no matter how well tuned they are at hindcasting. – Dewitt Payne

    This is the problem I have with model scenarios of future climate. In a closed system (a pipe network) we can model actual behavior in the pipes; I just can’t accept that averaging the climate of a chaotic system over time will provide the accuracy that most AGW theorists describe.

    When I hear this type of absolute, my Engineering B.S. meter goes through the roof.

  403. Pat Frank
    Posted Jul 29, 2008 at 4:52 PM | Permalink

    #33 and #37 — The science I know is not like that. Most researchers in my experience enjoy fame if it comes (as do most people), but typically do the work because they love it. The work is too bloody hard to do unless you love it. Some scientists are ego-driven, but they’re a distinct minority, and very few of those will jigger their data to aggrandize their fame or their funding.

    People who criticize the archiving lacunae of academic scientists don’t understand the ad hoc society of most academic research groups. It’s grad students killing themselves to find a way through some open-ended project; one which no one has done before, and for which there are no guiding sign-posts. One really must find one’s way. It’s post-docs looking for that important experiment. It’s faculty wondering where the money will come from next year, attending horrid departmental meetings, and meeting with students. Data get collected and stored, but no one wants to spend the time to make a systematic archive. Doing so is desirable, but well down the list of things that must be done. Typically there is no data archive at all, anywhere, for a given field. In our group, we have partial archives and could probably provide a fair part of the data we have collected over the years. But some of it is almost not recoverable.

    In climatology, especially dendroclimatology, there is a fine international archive for data. This is a nearly unique situation for any field of science. Given the way paleoclimatologists have encouraged the politicization of their field, they have put themselves in the position of having to justify their claims to a far larger audience than the usual scientist, namely to the full range of public scrutiny. These scientists have therefore a self-selected ethical onus to faithfully archive their data, so that the policy decisions they have consciously enjoined can be evaluated on the bases they themselves have engendered.

    It’s clear from Steve M.’s experience that not only have many important climatologists not met their self-elected ethical obligations, but they have actively obstructed the realization of that ethic. This is a double crime, and a shameful one. But this is a crime of climatologists, abetted, it’s true, by journal editors and overseeing institutions.

    But it’s very wrong to tar all scientists with that brush, and wrong to make unqualified pejorative comments about the institutions of science. I know lots of these people, and they’re generally a fine group. They don’t deserve people constructing an atmosphere of discredit.

  404. cba
    Posted Jul 29, 2008 at 5:16 PM | Permalink

    DeWitt,

    I’m not sure how you’re reading the graph. There is a 14 W/m^2 difference at 100 km, but at 70 km, which seems to be the maximum limit of modtran, it’s well under 1% – I think about a 0.13% difference. Modtran shows variations of a few thousandths of a watt up to about 70 km, and there it stops changing at all.

    Up there the pressure is dropping by around half every 5km or so – as is the density. The modtran calcs are done to 1/1000 W/m^2 resolution. The OLR is basically changing by around 1/10 W/m^2 per layer until around 70km, where it ceases to change at all. That’s very suggestive that no additional calculations are being done above there, since the number of molecules is only halved per layer yet the output flatlines rather than continuing the gradual decrease – especially considering that at 50km there is suddenly a huge T increase that produces no increase in radiative output, even though a gradual decline is visible there that seems not to vary along with the T in the higher-temperature region.

    What is happening there in the hitran 1-d model is that the concentrations are declining but the bandwidths of the lines are decreasing, and the line peaks are increasing up there – or actually, are not decreasing at the peak – because they aren’t being smeared out as much due to the lower pressures and hence a smaller broadening factor. Hence, while the pressure is dropping and there are fewer molecules involved, which lowers the peaks, the pressure drop is also permitting the peaks to be higher due to less broadening. Above 70km, we’ve pretty well decided LTE is on the way out and it is starting to create problems, especially higher up where modtran doesn’t function at all (anything over 100km).

    At any point on the curve below 70km, the maximum difference is less than 3.5% and under 9w/m^2. Remember this chart is the total OLR at any particular altitude, and it’s the difference in values from one altitude to another that gives the activity in a layer.

    Anyway, I’ll continue to keep poking at it to try to ascertain what is going on – if anything. Stratos. energy balance might provide something of a clue as there isn’t supposed to be convection/conduction involvement.

  405. Real Richard Sharpe
    Posted Jul 29, 2008 at 5:24 PM | Permalink

    It has now been identified which group of people will be the most impacted by climate change … so everyone causing climate change will have to pay.

    I guess we could have predicted who they are.

  406. bender
    Posted Jul 29, 2008 at 6:11 PM | Permalink

    #78 I know those types. I like them. Yes, they’re out there in numbers. But your description does not characterize the bulk of the people that I know who seek to influence global environmental policy – the ones who couldn’t care less about precise measurements, accurate models, and disclosure of model uncertainty.

    But the topic here is “Hansen’s update” – which is more about Andrew Bolt’s use of data and Tim Lambert’s accusations of selective use of data, and Steve’s point that Lambert – as usual – is wasting bandwidth trying to discredit people without just cause. Was Bolt right or wrong, or does it even matter – that is the question.

  407. Posted Jul 29, 2008 at 9:59 PM | Permalink

    My reaction to this is that it’s a lot of hoohoo about nothing substantial. Our instruments aren’t good enough to do these hair-splitting analyses. What good are four different projections based on questionable data? In my humble opinion, none. Call me when something substantial and unusual happens. So far nothing has.

  408. DeWitt Payne
    Posted Jul 29, 2008 at 10:34 PM | Permalink

    cba,

    All I can say is what I’ve said before: if there were really that much radiation, someone would have seen it. The peaks in the center of the CO2 and ozone dips would be much larger, for example. Also, satellite measurement of clear sky OLR shows that your TOA emission is too low. Your percentages at high altitude are off by orders of magnitude too because you are comparing to total emission from the surface and entire lower atmosphere, not the emission at a given altitude. That 8 W/m2 peak at about 50 km is more like 3,000% high compared to MODTRAN, not 3%.

    As far as absorption, look at your graph. At 75 km the emission graph is at 250 W/m2. At 100 km it’s at least 5 W/m2 lower. The pressure at 70 km is 0.05 mbar. How can that little mass possibly absorb 5 W/m2? That implies long wavelength absorption vastly exceeds emission above 75 km which is absolutely impossible. At altitudes above 30 km radiative energy balance is required at constant temperature. The real atmosphere at all wavelengths in the thermal IR is optically thin above 30 km. Longwave radiation at those altitudes always results in net cooling by emission not warming by absorption and is balanced by short wavelength absorption. Your model appears to have the atmosphere at very high altitude having net absorption of both long and short wavelength radiation.

    To paraphrase the movie line: Show me the spectra!

  409. bender
    Posted Jul 29, 2008 at 10:40 PM | Permalink

    Is there enough cloud cover data yet to say whether it is a Hurst-like, 1/f-noise process in space-time?

  410. EJ
    Posted Jul 29, 2008 at 11:11 PM | Permalink

    Kenneth F.

    Ditto Dude.

  411. EJ
    Posted Jul 29, 2008 at 11:16 PM | Permalink

    Regarding where the CO2 goes.

    We do know where the missing carbon (CO2) goes. The so called missing carbon sink.

    The oceans are our sink for everything.

    I heard the head of the Sierra Club say that trees (the rain forests) are our source of oxygen.

    The oceans are our oxygen source.

    The oceans inhale more than 2 times man’s annual emissions of CO2 in one breath.

    If you do a carbon budget, the oceans hold many orders of magnitude more CO2 than the atmosphere and land surface combined.

    We do know where the CO2 goes.

  412. Reference
    Posted Jul 30, 2008 at 5:39 AM | Permalink

    From the concluding remarks:

    The current scientific scene is dominated by the hypothesis that climate is deterministically predictable, combined with the belief that GCMs suitably implement this hypothesis and produce credible projections of future climate.

    Finally, a succinct statement about these unspoken assumptions behind GCMs!

  413. PaulM
    Posted Jul 30, 2008 at 7:39 AM | Permalink

    IPCC Drafts and comments have moved

    Steve and others: have you noticed that the location of the IPCC WGI reviewer comments has changed? This means that your links for example at

    http://www.climateaudit.org/?p=1790

    http://www.climateaudit.org/?p=2960

    no longer work, so you might want to put an update on those.
    The new location is

    http://hcl.harvard.edu/collections/ipcc/

    You no longer have to agree to anything to get the information. Also, the reports are searchable. However, you get things painfully slowly on a page-by-page basis. There is a convert to pdf option but this is even slower.

    Steve: I noticed the change. It’s amazing how their changes never seem to improve things for users. I had enough foresight (and have been burned enough times) that I saved all the pdfs in case they withdrew them later. If there are chapters that you want, I’ll arrange for them to be posted up.

  414. DeWitt Payne
    Posted Jul 30, 2008 at 8:54 AM | Permalink

    cba,

    If you want to see how much your model actually differs from MODTRAN, calculate the power looking up instead of down and compare. As I pointed out before, the emission looking up is the same as the absorption from that point to the TOA so it shouldn’t be difficult to do. MODTRAN looking up at 40 km with 384 ppm CO2 and 1976 atmosphere: 1.962 W/m2. My estimate is that your model would show at least a factor of 10 higher.

  415. Paddy
    Posted Jul 30, 2008 at 9:48 AM | Permalink

    This research paper should be included in comments to the EPA regarding whether and how to regulate CO2 as a pollutant under the Clean Air Act. The EPA is obligated to apply the “best available science” in its decision making. It takes someone with established scientific credentials to do it.

  416. Phillip Bratby
    Posted Jul 30, 2008 at 10:24 AM | Permalink

    #34 Scott-in WA

    I’m with you there. As another ex-nuclear industry worker and user of system codes, I agree with your comments and would add that it’s not just the GCM, its documentation, validation and verification, configuration control, input guides and output control; it’s also the ability and experience of the user that is important. Finally, archiving and the ability to retrieve everything is essential.

  417. Kenneth Fritsch
    Posted Jul 30, 2008 at 10:36 AM | Permalink

    Re: Pat Frank @ #403

    People who criticize the archiving lacunae of academic scientists don’t understand the ad hoc society of most academic research groups.

    It has been a long time since I was exposed to the scientific paper publishing environment, but what I do not “get” from your general assessment of that environment currently is this:

    With the electronic capabilities for storing data, and even scanning laboratory notes into electronic form, that have been available for many years now, it is difficult to see why that could not be done with relative ease and a minimum of effort.

    Papers that I recall from my graduate days (inorganic chemistry) normally contained within their contents sufficient information to replicate the efforts outlined in the paper. In those cases the source data, while available in laboratory notebooks, were less important than someone using a database – whether first, second or third hand – to make an analysis and then failing to make those records available.

    When anyone writes a scientific paper, I would assume that that person(s) must assemble all the data together in order to ensure that the writings can be confirmed by these data, or at least not contradicted by them. In today’s world that would seem to mean that all the data had to be available to the writer(s) in electronic form at that point in time.

    With the data together at this time, I would assume that the author(s) and/or the author(s)’ sponsoring organization are going to have a scientific motivation to retain these data and records in electronic form. After decades of storage with the data left unconsulted (which would be more likely in the case of non-controversial findings), I could see a situation where the data may be in forms that make retrieval with modern software difficult, but probably not impossible.

    Pat, could you indicate what in this admittedly simple-minded scenario I do not “get”.

  418. Mark T.
    Posted Jul 30, 2008 at 10:43 AM | Permalink

    Design me a circuit that can EXACTLY reproduce an impulse (Dirac delta function) that is EVERYWHERE zero except at a single point x.

    Boy, wouldn’t that be nice. :)

    Mark

  419. Kenneth Fritsch
    Posted Jul 30, 2008 at 11:18 AM | Permalink

    Bender @ #406

    But the topic here is “Hansen’s update” – which is more about Andrew Bolt’s use of data and Tim Lambert’s accusations of selective use of data, and Steve’s point that Lambert – as usual – is wasting bandwidth trying to discredit people without just cause. Was Bolt right or wrong, or does it even matter – that is the question.

    Bender, your admonishments are well taken — and by a member of the guilty party.

    Steve M’s point is also well taken, as it shows that if Bolt had used the later data (with the corrected earlier data) his point would have been better made than by using the shorter time period with the uncorrected data. Bolt references Steve M’s update on his blog, but does not show the graph. That should be the end of it, leaving little to discuss (on topic), unless we have to discuss Tim Lambert’s chronic silliness in matters such as these.

    Having said that I can personally never get enough of discussing all the issues and data involved in the evolution of Hansen’s scenarios A, B and C.

    As an aside that I hope I can bring up here on the unthreaded thread is the autocorrelation in the global mean temperature anomalies. As I recall, when looking at those anomalies on a monthly basis the DW statistic of the residuals indicated a strong AR1, but the same statistic showed no autocorrelation problem when using annual data. I think Lucia found the same relationship from her calculations at her blog.

    I do think it would be great to summarize all the Hansen scenario data and GHG trends leading up to his producing his scenarios all in one place.

    Steve: I’ve done some experiments using maximum likelihood methods on the trend data and got wider confidence intervals than Lucia has with Cochrane-Orcutt; and it’s very sensitive to specification e.g. LTP.
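
    For illustration, here is a toy sketch of the monthly-versus-annual effect described above (synthetic AR(1) series, not actual anomaly data; the Durbin-Watson statistic is roughly 2*(1-rho) for AR(1) residuals and near 2 when the autocorrelation is gone):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 360                                   # 30 years of monthly data
    t = np.arange(n)

    # synthetic "anomalies": linear trend plus AR(1) noise with rho = 0.6
    rho = 0.6
    eps = rng.normal(0, 0.1, n)
    noise = np.zeros(n)
    for i in range(1, n):
        noise[i] = rho * noise[i - 1] + eps[i]
    y = 0.0015 * t + noise                    # ~0.18 C/decade trend

    def durbin_watson(x, y):
        """DW statistic of OLS trend residuals."""
        X = np.column_stack([np.ones(len(x)), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return np.sum(np.diff(r) ** 2) / np.sum(r ** 2)

    print("monthly DW:", durbin_watson(t, y))               # well below 2
    ya = y.reshape(-1, 12).mean(axis=1)                     # annual means
    print("annual  DW:", durbin_watson(np.arange(30), ya))  # much closer to 2
    ```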

  420. DH
    Posted Jul 30, 2008 at 11:32 AM | Permalink

    Has anyone compared model climate predictions to those of the Farmer’s Almanac which are based upon historical records?

  421. steven mosher
    Posted Jul 30, 2008 at 1:28 PM | Permalink

    RE 60. Yup when you drop the bomb what happens

    http://jmem.northropgrumman.com/Brochure.htm

    Fun work. hehe.

  422. Larry T
    Posted Jul 30, 2008 at 2:27 PM | Permalink

    re 26

    That said, the smart money would bet on warming over the next century at the same rate that we have seen since 1950 (i.e. 0.7-1.0 degC/century).

    I am a firm believer that the primary climate driver is solar activity and that the current lack of activity will cause a global cooling of at least 20-30 yrs, but I would still agree with that statement simply because we are in an inter-glacial period heading upward, and would be even if a virus eliminated the entire human population tomorrow.

  423. Buddenbrook
    Posted Jul 30, 2008 at 3:39 PM | Permalink

    Anyone read Mann’s & Kump’s new book “Dire Predictions” that was published this month? It seems to be getting good reviews and is selling very well. In amazon.com sales ranks it’s currently 3rd in books on climate.

    I found this reviewer comment hilarious:
    “Here’s a powerful, straight-forward guide to how scientists, economists, and engineers really understand the problem of global warming. It makes 20 years of research and consensus-building completely accessible to anyone who cares to know the truth–and to do something about it.”

  424. Mark T.
    Posted Jul 30, 2008 at 4:37 PM | Permalink

    I don’t know many engineers who agree with the concept of scientific “consensus”, let alone global warming.

    Mark

  425. STAFFAN LINDSTROEM
    Posted Jul 30, 2008 at 4:53 PM | Permalink

    #416 What a coincidence, Buddenbrook, that you – almost a book by Thomas Mann, the writer who was born 80 years to the day before I was, and died the day I SHOULD have been born, give or take some days [1875 June 6 – 1955 August 12] – write about another book by one of the black sheep of the family… NOT… I hope I can read the German pocket edition of “Buddenbrooks: Verfall einer Familie” I bought some 25 years ago in Sweden at a sale… If AGW turns out to be completely false [even my jury is still out!], will Michael Mann write a book with the German title “Globale Erwärmung – Verfall einer Theorie-Familie”…?? IDTS… However many billion dollars are invested in AGW, the biggest investments are made in personal prestige, or?? [There is also a present-day Dr Thomas Mann with a connection to a Kump family in WV...]

  426. D. Patterson
    Posted Jul 30, 2008 at 5:44 PM | Permalink

    60 lucia says:

    July 30th, 2008 at 1:19 pm
    [....]
    Agreed. Even if GCMs are totally horrible (whatever horrible might be), there is still plenty of reason to expect that doubled CO2 raises temperature.
    [....]

    Does it, Lucia? How is that possible given past experience? The following is an approximate description of the relationship between atmospheric CO2 concentrations and temperatures. If atmospheric CO2 has a physical property capable of forcing air temperatures to increase for each doubling of CO2 concentration, then why didn’t this same physical property always cause that to happen in the past, or even happen on more than the seldom occasion? Why did temperature increase while CO2 decreased? Why did temperature remain relatively unchanged as CO2 increased and decreased? Why did temperature decrease while CO2 increased? Why did temperature increase 10C and 11C while CO2 increased 100 percent at one time and 1000 percent at another, contrary to anything remotely close to a 1 degree per CO2 doubling?

    Geological time
    —————
    Approximate earlier atmospheric CO2 ppm,
    Approximate later atmospheric CO2 ppm,
    Percentage difference earlier versus later atmospheric CO2 ppm,
    Percentage increase or decrease earlier versus later atmospheric CO2 ppm,
    Approximate temperature change C degrees earlier versus later time period
    ————————————————————-
    Cambrian
    7000, 4500, 156, +56, 0

    Ordovician
    4200, 4500, 107, +7, -10

    Silurian
    4500, 3000, 67, -33, +10

    Early Devonian
    3000, 4000, 133, +33, +10

    Late Devonian
    4000, 1500, 38, -62, -2.5

    Carboniferous
    1200, 380, 32, -68, -8

    Permian
    250, 2850, 1140, +1040, +11

    Triassic
    1900, 1300, 68, -32, -1.5

    Jurassic
    1300, 2600, 200, +100, 0

    Cretaceous
    2100, 750, 36, -64, +5

    Tertiary
    900, 300, 33, -67, -10

    C.R. Scotese http://www.scotese.com/climate.htm Temperature
    R.A. Berner, 2001 (GEOCARB III) CO2

  427. D. Patterson
    Posted Jul 30, 2008 at 6:10 PM | Permalink

    Correction for the Cambrian: sorry, it was supposed to be 7000, 4500, 64, -36, 0

    Geological time
    —————
    Approximate earlier atmospheric CO2 ppm,
    Approximate later atmospheric CO2 ppm,
    Percentage difference earlier versus later atmospheric CO2 ppm,
    Percentage increase or decrease earlier versus later atmospheric CO2 ppm,
    Approximate temperature change C degrees earlier versus later time period
    ————————————————————-
    Cambrian
    7000, 4500, 64, -36, 0

    Ordovician
    4200, 4500, 107, +7, -10

    Silurian
    4500, 3000, 67, -33, +10

    Early Devonian
    3000, 4000, 133, +33, +10

    Late Devonian
    4000, 1500, 38, -62, -2.5

    Carboniferous
    1200, 380, 32, -68, -8

    Permian
    250, 2850, 1140, +1040, +11

    Triassic
    1900, 1300, 68, -32, -1.5

    Jurassic
    1300, 2600, 200, +100, 0

    Cretaceous
    2100, 750, 36, -64, +5

    Tertiary
    900, 300, 33, -67, -10

    C.R. Scotese http://www.scotese.com/climate.htm Temperature
    R.A. Berner, 2001 (GEOCARB III) CO2

  428. Posted Jul 30, 2008 at 8:03 PM | Permalink

    Steve Mc et al.,

    It is my understanding that the GCMs do not differ significantly from simple models in the effect of CO2 on climate; i.e., a number of people have come out with simple CO2 models (Nir Shaviv among others) showing about a 1 deg C increase in temps from CO2 doubling based on first principles. The GCMs show similar results.

    http://www.coyoteblog.com/coyote_blog/2008/04/the-keystone-is.html

    Where it gets tricky is in the assignment of a multiplication coefficient due to other factors (water vapor being the prime suspect).

    So the lukewarmers agree (broad generalization) with at least one central aspect of the GCMs.

  429. Gerald Machnee
    Posted Jul 30, 2008 at 8:06 PM | Permalink

    Re #416 **Anyone read Mann’s & Kump’s new book “Dire Predictions” that was published this month? It seems to be getting good reviews and is selling very well.**
    You mean they are PAYING for the book, but get to read CA for FREE??
    Something is wrong with the picture.

  430. Dave Dardinger
    Posted Jul 30, 2008 at 10:06 PM | Permalink

    As others here have remarked, this is well over my head but pretty nevertheless. What I want to see is one or more of the team make a definitive statement concerning your work here. Some possibilities might be:

    1. Unfortunately Steve has made the following mistakes….
    2. This post is totally unreadable and amounts to an attempt to pull the wool over the eyes of untrained laymen.
    3. Generally speaking this article is correct, but here are some problems with attempting to apply it to paleo-climate proxies….
    4. It’s over my head too. Steve, want to join the team as our new statistics guru?

    Any of these responses could be addressed and might move the ball down the field.

    Unfortunately, experience shows the following will be the sort of responses given.

    1. Total silence.
    2. Ad hom attack claiming Steve is in the clutches of big business.
    3. Claims of error which somehow are never actually presented in a form which can be checked by anyone.
    4. A combo of “we’ve all moved on” and “all this sort of thing is covered in Mann et al. 2009, now being furiously written”, which will not actually appear until after the next IPCC issue, after which they’ll move on again.

  431. Geoff Sherrington
    Posted Jul 30, 2008 at 10:14 PM | Permalink

    It’s over my head too and I hope that you are not tempted one day to write a similar-looking spoof and see how many people get caught. Beware April 1, readers.

    If we take a site and its daily max temp record, we can do classical geostatistical work over a 100 year period (or whatever) to determine, at first pass, when two days are so far apart that one cannot usefully be used to predict the other. We run into autocorrelation problems. We can try the same approach on averaged monthly data, then on averaged annual data, and simply probe the predictive utility of one measurement on another from a different date.

    Then we can go to minimum temperatures, mean temperatures, daily rainfall readings, daily evaporimeter readings and so on. The point of the exercise is to examine predictive ability.

    I see this as a prelude to moving to proxies. If, for example, a direct thermometer method has a certain predictive power, then a proxy for temperature can do no better. Thus, the error limits on thermometer temperature (a mathematical construct related philosophically to real life, which is not lived in reference to sigmas) set the best obtainable error limits on proxies or models of temperature.

    It might be the case that a better method than geostatistics is what you are writing about, but I have not kept up with the computational methods, only old guidelines about acceptability of error bounds.

    Please snip this if useless.

  432. D. Patterson
    Posted Jul 30, 2008 at 11:03 PM | Permalink

    Between the tornado watch, severe thunderstorm, power outage, software reversing edits unnoticed, server glitches, WordPress formatting munges, and dropped posts, the task of getting a simple and error-free table has turned into a non-trivial one. Nonetheless, here is a revised version of the previous table. I apologize in advance in the event I’ve still failed to catch an error in this one, and I encourage everyone to use their own interpretation of the graph/s to produce their own. Note how the temperature change was as much as 10C to 11C. Ask yourself what the CO2 amount was supposed to be if there is any truth in the claim that each degree of the 10C change in temperature required a doubling of CO2 – in other words, ten to eleven degrees requiring ten to eleven doublings (see the sketch after the table).

    (columns: CO2 earlier ppm, CO2 later ppm, ratio, percent, T earlier C, T later C, T change C)
    Cambrian
    4500 7000 1.56 156% 22 22 0
    7000 4500 0.64 64% 22 22 0
    Ordovician
    4200 4500 1.07 107% 22 12 -10
    Silurian
    4500 3000 0.67 67% 12 22 +10
    Devonian
    3000 4000 1.33 133% 22 22 0
    4000 1400 0.35 35% 22 16 -6
    Carboniferous
    1200 380 0.32 32% 20 12 -8
    Permian
    250 1900 7.60 760% 12 23 11
    Triassic
    1900 1300 0.68 68% 23 21.5 -1.5
    Jurassic
    1100 2500 2.27 227% 22 22 0
    2500 2000 0.80 80% 21 16 -5
    2000 2300 1.15 115% 16 15.5 -0.5
    Cretaceous
    2300 700 0.30 30% 15.5 22 6.5
    Tertiary
    900 300 0.33 33% 22 12 -10
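
    To make the doubling arithmetic explicit, here is a small sketch (illustration only – it ignores solar brightening, continental configuration, and the large error bars in both reconstructions) of the per-doubling sensitivity each row would imply if dT = S * log2(CO2 later / CO2 earlier):

    ```python
    import math

    # (period, CO2 earlier ppm, CO2 later ppm, temperature change C) from above
    rows = [
        ("Silurian",   4500, 3000, +10),
        ("Permian",     250, 1900, +11),
        ("Cretaceous", 2300,  700, +6.5),
        ("Tertiary",    900,  300, -10),
    ]
    for name, c0, c1, dT in rows:
        doublings = math.log2(c1 / c0)        # signed number of CO2 doublings
        S = dT / doublings                    # implied C per doubling
        print(f"{name:10s} {doublings:+5.2f} doublings -> implied S = {S:+6.1f} C/doubling")
    ```

    The implied values range from about +3.8 down to -17 C per doubling, which is the inconsistency the table is pointing at.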

  433. Vincent Guerrini Jr
    Posted Jul 31, 2008 at 12:32 AM | Permalink

    These three sites seem to monitor weather / (7-day climate?) quite well:
    There has been some normal heating in the NH for the last 2 weeks, as expected since 9/10 of the land mass is in the NH at mid-summer, but now the “real” cold seems to be spreading in both the NH and SH

    http://wxmaps.org/pix/clim.html

    click on each continent temperatures
    The trend at wxmaps seems to be highly correlated with satellite temperatures if you look closely (unfortunately you have to follow it, probably on a weekly basis).

    http://discover.itsc.uah.edu/amsutemps/

    900, 600 and 400 mb level.
    From this we could expect major cooling in the SH and then the NH. Another good pointer is this 7-day animation of land/sea temp anomalies.

    http://www.cdc.noaa.gov/map/images/fnl/sfctmpmer_01a.fnl.anim.html

    So these three measurements seem to monitor day-to-day and 7-day trend data quite well. Comments welcomed.

  434. Vincent Guerrini Jr
    Posted Jul 31, 2008 at 1:01 AM | Permalink

    Re 420 previous

    What I find interesting is the cold pools forming (and disappearing intermittently, of course) in the NH (the Arctic and northern Russia, North America and Europe) for the past 6-7 months, as distinct from last year, when warm pools predominated there. Also the SH warm pools around Antarctica are quite interesting. Seems to be a flip-flop from last year. Must be the sun… whether geomagnetic, TSI or some other factor, methinks…, because earth’s atmosphere hasn’t changed in the meantime, has it?

    http://www.cdc.noaa.gov/map/images/fnl/sfctmpmer_01a.fnl.anim.html

  435. D. Patterson
    Posted Jul 31, 2008 at 1:12 AM | Permalink

    Climate Experts Tussle Over Details. Public Gets Whiplash. Andrew C. Revkin. New York Times, July 29, 2008. Revkin quotes Pielke, mentions RealClimate.org, and others; but climateaudit.org is not mentioned. Revkin’s article describes a confusion and uncertainty amongst the public as to whether or not Global Warming is a reality while scientists struggle to communicate the urgency of a climate emergency.

    Evidently, Revkin must be in disagreement with OFCOM’s conclusion that there is no longer a valid controversy about the existence or non-existence of the anthropogenic Global Warming promoted by the IPCC and others.

  436. STAFFAN LINDSTROEM
    Posted Jul 31, 2008 at 4:09 AM | Permalink

    #420 Vincent Guerrini, from the wxmaps maps, I browsed through all of them: most mountain ranges will have below-average temps for the next 180 hours, tropical Africa real “cool”, as well as the subcontinent, i.e. India, and most of China. S Mongolia very hot, 8-10 degrees above normal; South America cool except the Pampas and Amazonas. As Mexico and the rest of Central America are mountainous, mostly cool… N America quite hot except ALASKA and YUKON, except SW AK… Home to Europe… the Alps and Massif Central a little cool and most of Russia cool or very cool, the rest a little warmer/hotter than normal… SO does cold climate come from above?? – meaning the freezing level sinking worldwide; I’ve seen this on Snow Forecast com too: Kilimanjaro freezing level down to 4050m… and no precipitation!! Que pasa, compañeros?

  437. Posted Jul 31, 2008 at 6:30 AM | Permalink

    Hopefully, this effort will lead to a rigorous standard in time series analysis, yielding robust results heretofore unprecedented in paleoclimatology.

    A good paper would apply the proposed standard CI method to the author-calibrated proxies used by Loehle in his paper.

  438. Posted Jul 31, 2008 at 6:54 AM | Permalink

    This may be off topic even for unthreaded, but has anyone determined how significant a regional event like the MWP would have to be to produce, say, a 0.25 degree C global temperature impact? Steve mentioned the basic math in the Tsonis thread previously, but I haven’t seen it picked up by anyone.

    In the 1980’s I believe, there was a roughly 3 degree C jump in temperatures in northern Europe apparently due to natural climate change. It would be interesting to determine what impact that may have/should have had on the global average temperature, mathematically.

    Just a thought. Ignore this if it is irrelevant.

  439. kim
    Posted Jul 31, 2008 at 7:06 AM | Permalink

    422 (D. Patterson) Andy Revkin is a believer, but he also is fair-minded. His DotEarth blog is moderated such that skeptics have a voice, and I’m pretty sure he is aware of the strong and growing skeptic position. I’d encourage experts here to weigh in over there; it’s pitiful having someone like me explicating the alternative view.
    =====================================================

  440. cba
    Posted Jul 31, 2008 at 7:17 AM | Permalink

    DeWitt,

    I’m back in crunch mode, so I may not be able to do anything for over a week that takes more than a few minutes here or there. Looking up was the last thing I succeeded in setting up this week after the programming change. At 40km – looking up to 100km – emissions are showing 36w/m^2 vs your 1.9 from modtran. At the surface, mine is 251 versus 258.7 w/m^2 for modtran. Note too, I’m at a slightly narrower bandwidth. Compared with K*T 97 (the big cartoon), after corrections for surface albedo according to their value of .08, I’m about 3 w/m^2 off of their SW incoming estimate.

    There is 0.0522 mb at 70 km and a total mass estimate (weight) of about half that over the 5 km shell thickness I’m using for the model. Note at these altitudes, the emission/absorption is occurring over a 5 km length versus 1 km for the bottom atm.

    In synopsis, the model follows modtran for OLR for the first 15km to within the linewidth of the graph. Above there, it approaches a slightly lower plateau until T increases and it bulges around 50km, then returns to its line while modtran is just a flat line. For inbound, around 40km, the model shows 36w/m^2 of downward emission versus 1.9 for modtran. By ground level it is 251 vs 258.7 for modtran. For SW, transmission to ground level is very close to the Kiehl and Trenberth value. The same software is used for all layers and wavelengths. The same input data is used (1976 std atm). The material in the optical path is determined the same way for each altitude. T corrections to the line calcs have recently been fixed (apparently or supposedly) to be in accord with the Hitran documentation.

    Now what do these factors suggest as to the nature of the anomaly?

    Also note that I’m still dealing with two halves of the whole. The OLR and incoming are separate and have not been combined yet for a total. What is radiated outward from a shell must be radiated downward from a shell as well. IE, the energy balance comparison in the model has not yet occurred. Perhaps tonight as that may be the last chance for several days.

  441. jae
    Posted Jul 31, 2008 at 7:42 AM | Permalink

    I’m with John A. If you have to cherry-pick samples to get a relationship, then it ain’t science, anyway.

  442. Timo van Druten
    Posted Jul 31, 2008 at 8:17 AM | Permalink

    Kim (425),

    It seems that Chris Colose at RealClimate doesn’t think that a lot of people take you seriously:
    “For the most part, I don’t think a lot of people (or at least those that matter) take people like “kim” seriously”

    http://www.realclimate.org/?comments_popup=584

    However, he has noticed you and considers that you’re making compelling comments on the thread of Andy Revkin.

    Another interesting comment is comment 69 in that particular thread:
    “I am disheartened to admit that, although a big fan of Real Climate and Gavin, et.al, nearly done slogging thru AR4, and originally finally convinced by Spencer Weart’s “The Discovery of Global Warming”, that I can no longer comfortably say “the science is settled.” But only that it seems “probable” (more likely true than not) that AGW is occuring, and primarily as a function of GHG’S. I suspect very strongly that within 20 years the case for “certainty” will be much stronger. And also much closer to being “too late.”

    On the other hand, I am not a scientist, struggle to follow many of the arguments and threads, and am very comfortable totally dismissing those who cry “hoax.”

    I will continue to watch and study and attempt to be at least minimally conversant. But my confidence of what I used to feel more certain about is waning.

    Not specifically relevant to this or any other thread; just my $0.02; and wondering if any other of you, particularly those more capable than I, might also be feeling the same?
    Comment by Shelama — 30 July 2008 @ 5:31 PM

  443. jae
    Posted Jul 31, 2008 at 10:20 AM | Permalink

    422, D Patterson:

    Revkin’s article describes a confusion and uncertainty amongst the public as to whether or not Global Warming is a reality while scientists struggle to communicate the urgency of a climate emergency.

    Confusion and uncertainty, indeed. Ironically, it is caused in large part by outlandish claims of catastrophe that only the most naive could believe. And now with gasoline above $4.00, the elitists are about to meet Joe Sixpack.

  444. Kenneth Fritsch
    Posted Jul 31, 2008 at 10:25 AM | Permalink

    From #415:

    Steve: I’ve done some experiments using maximum likelihood methods on the trend data and got wider confidence intervals than Lucia has with Cochrane-Orcutt; and it’s very sensitive to specification e.g. LTP.

    I do not follow RankExploits closely but I evidently and mistakenly recalled that Lucia was using annual temperature anomalies to avoid the Cochrane-Orcutt correction with monthly data. I thought the monthly plus correction for autocorrelation gave approximately the same CI as the annual (with less data points, obviously) without corrections required for AR1. Question: should one avoid using monthly data that needs autocorrelation corrections when the annual data shows little or no autocorrelation?

  445. DeWitt Payne
    Posted Jul 31, 2008 at 11:25 AM | Permalink

    cba,

    If you can plot a spectrum of the 36 W/m2 downwelling radiation (1900% error) at 40 km, there might be a clue to what is wrong. You should only be seeing emission from CO2 and ozone, and that emission should be far from saturated. Somehow your line strengths at high altitude are going off the chart. I was looking at the spectral calculator, and the line strength for CO2 at 667 cm-1 at zero pressure was about 3E-19 cm-1/(molecule cm-2). Lines do not narrow without limit: there is a ‘natural’ line width even at absolute zero due to the Uncertainty Principle (or something like that), and above absolute zero there’s always additional Doppler broadening.
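
    A rough numerical sketch of that limit (my assumptions: the CO2 band center at 667 cm-1, T = 220 K, and a typical air-broadening coefficient of about 0.07 cm-1/atm; the actual per-line values live in HITRAN):

    ```python
    import math

    nu0 = 667.0                # cm^-1, CO2 band center (assumed line position)
    T   = 220.0                # K, rough upper-stratosphere temperature
    m   = 44.0 * 1.66054e-27   # kg, CO2 molecular mass
    kB  = 1.380649e-23         # J/K
    c   = 2.998e8              # m/s

    # Doppler HWHM: set by temperature, independent of pressure (~5e-4 cm^-1)
    doppler = nu0 / c * math.sqrt(2.0 * math.log(2.0) * kB * T / m)

    # Lorentz (pressure) HWHM: proportional to pressure
    gamma0 = 0.07              # cm^-1 per atm, a typical air-broadening value
    for p in (1.0, 0.25, 5e-5):   # roughly surface, ~10 km, ~70 km
        print(f"p = {p:g} atm: Lorentz = {gamma0 * p:.1e}, Doppler = {doppler:.1e} cm^-1")
    ```

    Once the pressure width falls below the Doppler width, further pressure drops cannot sharpen the peaks any more, which is why line peaks at very high altitude can’t keep growing the way the model seems to assume.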

  446. Jeffrey Mushens
    Posted Jul 31, 2008 at 11:58 AM | Permalink

    I read your blog with interest as a layman who will be affected by the decisions policy makers take with respect to climate change. So I have some questions. I should say that, purely as an interested observer, there does appear to be recent rapid increases in average global temperature over the twentieth century and the idea that there should be a man made element seems pretty plausible.

    But,

    1. How much of the increase is a result of coming out of the Little Age Age?
    2. Are the temperatures seen comparable to the past?
    3. Is it really true that the IPCC backcasting for temperatures through to Roman times is derived from hypotheses about the links between bristlecones and temperature?
    4. Are there any other ways of using records from the past to generate temperature estimates? I remember from my history studies that in Roman Britain it was possible to estimate temperatures relative to modern times depending on pollen levels from plants at various latitudes, and this, together with similar work in mediaeval England (around 1300 AD was the end of the MWP), gave an indication that modern temperatures are not uniquely elevated. It was this – the declaration that there was no MWP, in flat contradiction to the historical record – that aroused my doubts about some of the estimates about the past.
    5. How accurate have the various IPCC forecasts been compared to the actuals? Anybody involved in modelling knows that it is (relatively) easy to fix your models to reflect the past – but that does not make them much better at seeing into the future.
    6. How accurate are the climate models compared to say econometric models in looking one or two years out?

    Thank you very much in advance


    Steve:
    These are all large questions that have been the topics of many threads here. The Categories on the left classify past posts and I hope that you’ll find some that are responsive to your interests.

  447. harold
    Posted Jul 31, 2008 at 3:47 PM | Permalink

    I agree with Sam. Using a thermometer one time, to find out if I have a fever is ok, but using a quantitative model for the earth’s temperature is asking for problems. But I guess that is what they were looking for.

  448. D. Patterson
    Posted Jul 31, 2008 at 4:34 PM | Permalink

    harold says:

    July 31st, 2008 at 3:47 pm
    I agree with Sam. Using a thermometer one time, to find out if I have a fever is ok, but using a quantitative model for the earth’s temperature is asking for problems. But I guess that is what they were looking for.

    It all depends on where you put the thermometer, for how long, and just what kind of temperature you really need to derive the solution to a particular problem. Looking at the planet from afar can tell you something about the amount of energy it is radiating, but the ability to infer how much energy, thermal and otherwise, it harbors and contributes to an atmosphere in a given time period becomes problematical when other factors besides radiative properties come into play. Likewise for measurements of limited proxy samples of the air temperature in the planetary boundary layer, upper air layers, and surface sea layers. They tell you far too little about the mass of the hydrosphere, which provides the overwhelming mass for varying patterns and distributions of energies. IPCC measurements of cloud conditions are virtually absent from the energy equations, despite their significant effects upon the modulation and distribution of energies.

    Getting a general idea of the prevailing mean temperatures of the planet’s atmosphere is helpful to some disciplines, such as paleontology and geochemistry, insofar as they can crudely categorize gross differences in the planetary environment; but attempting to characterize the planetary environment’s future changes in fine detail with unrepresentative sampling risks even greater unrepresentative extrapolations and results in false modeling.

    For example, ask what the measurements are for the masses of the hydrosphere, cryosphere, and atmosphere; and ask for the measurements of how their energy potentials vary in their ability to contribute to atmospheric temperatures over long time periods. Does anyone really know, to the required degree of accuracy, just how much energy is stored in those masses and just how much of those energies is translated into atmospheric surface air temperatures at given points in time? If not, then how can the GCMs properly calculate such unknown values to reach a deterministic conclusion about a future climate condition? Failure to correctly provide a model with initial conditions or valid parameterizations can result only in what?

  449. Pat Frank
    Posted Jul 31, 2008 at 5:28 PM | Permalink

    #414 — Kenneth, your understanding is right on. But I was referring to central data archives, not experimental data stored on an author’s computer. I have all the data I’ve gathered in my current research position, even including UV/vis absorption spectra and EPR spectra, plus my notebooks, and can provide information on all of that. Scanning my bound notebooks would be a total pain, and I can tell you now I’ll never, ever do it. :-)

    But to the larger question, the group I work in, for example, collects x-ray absorption spectra of biologically interesting transition metals, and sometimes of their ligands (mostly halides and chalcogenides). We have a central server that has all the data we’ve ever collected dating back to 1995. But it’s organized by date and x-ray beamline, and not by element or project. We also have data on DAT tapes and TK50 and TK70 tapes dating back to about 1980 or so. The data tapes are labeled with dates, but not with data sets. We have binders with back-up list files describing the files on the data tapes. No one has looked at any of the old data for many years. We keep hard copy log files of all of our experimental runs dating back to 1986, plus the associated note books. We have shelves and shelves of hard copy data in binders, and grad student notebooks. No one looks at them.

    Previous students have been pretty good about electronically archiving their own data, and we can pretty readily find old x-ray spectra in the folders they left on our server, but we know that some data are missing. We probably have thousands of x-ray spectra in digitized formats, lots of them in binary.

    And that’s just us. Other groups probably have similar stores of data, and none of that is organized in any discipline-wide sense. Our graduate students don’t want to spend the time making systematic archives for all of this stuff. It would take months. They are all pretty good about tracking their own data, but that’s about it. When they leave, we ask for some sort of archive, but when they’re gone it’s pretty much up to us to maintain that. There is no central archive where x-ray absorption spectra can be deposited. This disorganized situation is probably typical for most academic research groups and disciplines.

    In my papers, I consciously give all the information necessary to reproduce my data and results. But I’ve found, in trying to understand other’s work, that this standard is not always upheld. Sometimes I wonder if some people are coy in a studied way about their methods.

    Maybe it would be a good thing if there was a national effort to institute a set of central clearing houses for data that are worth archiving. But until that happens, academic research groups are likely going to be very spotty about archives. The culture just doesn’t pay much attention to old data, because it’s all about what’s happening now, new results, and how to make that bloody experiment work. You probably remember that last part. :-) What’s past is published and mostly, that’s that.

  450. cba
    Posted Jul 31, 2008 at 7:55 PM | Permalink

    DeWitt,

    Here is the downward IR at 40km. Ignore the negative values in the visible for the moment, as there was a subtraction needed to get rid of the visible solar contribution and I haven’t had any time to determine why there are some negative values present there. It may just be too much dynamic range resolution for numbers in double precision floating point. This shouldn’t have any consequence for the IR spectrum. If there are problems, they are much greater than that small amount.

    Note this is for 384ppm co2 and all 38 molecules for the 1976 std/atm and hitran.

  451. cba
    Posted Jul 31, 2008 at 8:00 PM | Permalink

    DeWitt,

    PS, that graph is all downward radiation starting at 100km in radiative flow down through the layers – at least for that power making it through more than one layer.

  452. DeWitt Payne
    Posted Aug 1, 2008 at 12:20 AM | Permalink

    cba,

    I see way too much CO2 at 15000nm and even more excess ozone at 99600nm. I’m not sure what the stuff at about 8000nm and shorter is. I think I did the units conversion correctly for the MODTRAN data. Here it is plotted in the same energy units as yours. I’m putting in a link to the plot rather than the plot itself to minimize load on the server. As you can see, the CO2 emission is far from saturation, unlike yours. And the ozone is barely evident.

  453. DeWitt Payne
    Posted Aug 1, 2008 at 12:21 AM | Permalink

    One too many zeros on the ozone wavelength; it should be 9960.

  454. Posted Aug 1, 2008 at 12:28 AM | Permalink

    The UK Met Office has answered a FoI request about their Climate Myths pages on their website, showing the scientific evidence to support their claims. The requester has clearly been reading CA, because he asks for evidence to support the Met Office’s claim that the link between rising CO2 and rising T is “well-quantified”. Surprise, surprise, the answer is “IPCC 4AR”.

  455. gb
    Posted Aug 1, 2008 at 1:17 AM | Permalink

    I see a lot of nonsense/misunderstanding about chaos v. predictability, viscosity in (climate) models and the implications of Lorenz’ work. Perhaps it is good to remind people that engineers/scientists have already developed many models for chaotic processes such as turbulent flows, with quite some success. In this case too it is impossible to develop a model for the instantaneous turbulent velocity, but models predict the mean and other statistics of turbulent flows quite well. And these models use subgrid viscosity … Do a google search and you will find many papers about this topic, or get a textbook on turbulence.

  456. Posted Aug 1, 2008 at 1:39 AM | Permalink

    Ask for the specific part of AR4 which clearly demonstrates the link

  457. Pierre Gosselin
    Posted Aug 1, 2008 at 2:40 AM | Permalink

    From Science Daily 31 July 2008:
    “Climate Change Science Program Issues Report On Climate Models”
    “Change Science Program (CCSP) has released a new report “Climate Models: An Assessment of Strengths and Limitations,” the 10th in a series of 21 Synthesis and Assessment Products (SAPs) managed by U.S. federal agencies.”
    “‘The authors find that the “models have important strengths and limitations.'”

    http://www.sciencedaily.com/releases/2008/07/080731173127.htm

  458. MarkR
    Posted Aug 1, 2008 at 4:06 AM | Permalink

    Meanwhile, the rest of the world has a reality check:

    “As western nations step up pressure on India and China to curb the emission of greenhouse gases, Russian scientists reject the very idea that carbon dioxide may be responsible for global warming….

    http://www.hindu.com/2008/07/10/stories/2008071055521000.htm

  459. Posted Aug 1, 2008 at 5:16 AM | Permalink

    UC, Steve, Ross, and everyone else:

    Whilst I’m not going to pretend to understand the nuts and bolts of the Brown statistical analysis, I can understand calibration. If a key assumption of dendros is that tree rings contain some Y resulting from temperature X, and they produce lots and lots of “evidence” of statistical correlation between X and Y, then people tend to build careers reproducing past climate. If the basis for the assumption of a link between X and Y turns out to be false, then the statistical testing has failed to detect a spurious result.

    Does the methodology of Brown help spot this? I don’t think it does.

    Sure, one should not attempt calibration when one is not confident that \beta \neq 0 (Brown82). And our ideas on causation must come from outside statistics (any stat book).

    I’ve no idea what this confidence is, if it means something like “faith”, the substance of things not seen. Before the Brown statistical filter there needs to be a physical mechanism which links X and Y in a meaningful way, or if you like things negative, a BS filter.

    You’d think that before $50 billion or whatever of US taxpayers’ money was spent on climate science, someone would have physically checked that trees carry temperature change from leaf to trunk, but noooooo… too much like hard work.

    I also find myself wondering about the assumption of linearity in systems which clearly show non-linear behavior on every scale.
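
    For what a minimal pre-calibration screen of the kind UC describes might look like (synthetic numbers, not any dendro series; a sketch, not anyone’s actual method): regress the proxy on temperature over the calibration period, and refuse to invert the relation unless beta is distinguishable from zero.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(0.0, 1.0, 50)             # calibration-period temperatures
    y = 0.0 * x + rng.normal(0.0, 1.0, 50)   # a "proxy" with no temperature signal

    X = np.column_stack([np.ones_like(x), x])
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    s2 = rss[0] / (len(y) - 2)                       # residual variance
    se = np.sqrt(s2 / np.sum((x - x.mean()) ** 2))   # standard error of beta
    t = beta[1] / se
    print(f"beta = {beta[1]:.3f}, t = {t:.2f}")      # |t| below ~2: no license to
                                                     # "reconstruct" X from Y
    ```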

  460. Scott-in-WA
    Posted Aug 1, 2008 at 6:07 AM | Permalink

    gb: Perhaps it is good to remind the people that engineers/scientist have already developed many models for a chaotic process such as turbulent flows with quite some success. Also in this case it is impossible to develop a model for the instantaneous turbulent velocity, but models quite well predict the mean and other statistics of turbulent flows. And these models use subgrid viscosity … Do a google search and you will find many papers about this topic or get a textbook on turbulence.

    And this means precisely what in the context of attempting to reach some conclusion as to why the specific GCMs being evaluated do poorly at predicting climate on decadal timeframes at the eight chosen stations?

    Gb, can you — and/or anyone among the CA readership out there in Cyberspace who wishes to step forward — speak to the specific internal design of each GCM being evaluated in this paper and tell us what specific features about its design (e.g. physical processes modeled, parameter initialization or lack thereof etc. etc.) should enable it to perform the climate prediction functions that are being expected of it?

  461. Jordan
    Posted Aug 1, 2008 at 6:13 AM | Permalink

    In support of KevinUK and PaulM: have a look and you might find an RS publication, “Climate Change Controversies: a simple guide” (http://royalsociety.org/page.asp?id=6229)

    Looking forward to a round-up of scientific controversies in lay terms … think again:

    This is not intended to provide exhaustive answers to every contentious argument that has been put forward by those who seek to distort and undermine the science of climate change and deny the seriousness of the potential consequences of global warming. Instead, the Society – as the UK’s national academy of science – responds here to eight key arguments that are currently in circulation by setting out, in simple terms, where the weight of scientific evidence lies.

    Those who seek to distort and deny. Does that sound like “us and them”?

    Then into the real meat of the climate controversy:

    Misleading argument 1 : The Earth’s climate is always changing and this is nothing to do with humans.
    Misleading argument 2 : Carbon dioxide only makes up a small part of the atmosphere and so cannot be responsible for global warming.
    Misleading argument 3 : Rises in the levels of carbon dioxide in the atmosphere are the result of increased temperatures, not the other way round.
    Misleading argument 4 : Observations of temperatures taken by weather balloons and satellites do not support the theory of global warming.
    Misleading argument 5 : Computer models which predict the future climate are unreliable and based on a series of assumptions.
    Misleading argument 6 : It’s all to do with the Sun – for example, there is a strong link between increased temperatures on Earth with the number of sunspots on the Sun.
    Misleading argument 7 : The climate is actually affected by cosmic rays.
    Misleading argument 8 : The scale of the negative effects of climate change is often overstated and there is no need for urgent action.

    How about climate sensitivity? Is that controversial? I dunno; maybe the problem was how to phrase it in terms of a misleading argument without the whole article backfiring.

  462. Craig Loehle
    Posted Aug 1, 2008 at 6:27 AM | Permalink

    It was nice of the RS to compile all the contentious issues, but to call them “misleading arguments” is very, very naughty, and the RS should be sent to bed without dinner. These are serious issues that deserve serious consideration. Cosmic rays are a misleading argument? Then why can correlations between temperature and cosmogenic isotopes like Be-10 be found going back tens of thousands of years? And so on.

  463. Gunnar
    Posted Aug 1, 2008 at 6:39 AM | Permalink

    John A, very well said in #444. Math is a tool that serves science and engineering, not the other way around. One must first start with a descriptive scientific hypothesis of causality, and then attempt to invalidate it. Statistics can perhaps only serve to help invalidate the hypothesis, not provide the only support for it. It’s not dispositive.

    math hubris

  464. cba
    Posted Aug 1, 2008 at 7:41 AM | Permalink

    DeWitt,

    The initial data should be the same for the qty of o3 there unless it’s been turned into a variable by modtran. I also did a 40km emission-only chart, but it’s only slightly different from the original I posted, so I didn’t put it up. The peak on it was 0.0091 for 15 microns and 0.011 for 9.6 microns. Considering the code in my model is the same for all lines and wavelengths, I would expect that any error in line height would be somewhat consistent between molecules, and that if the o3 shows stronger than the co2, then there is a pretty good likelihood it really is stronger. There’s about 47 times the co2 versus o3, but I thought o3 was far more potent than co2 per unit as well.

    I did co2 only at 40km – and that’s for 2.5km (the 5km shell thickness starts at 50km) – and it’s still high. Actually, it is about right for a 1 meter path length, with the peak at around 0.002 W/m^2. So far, searching for a duplicated pathlength adjustment has shown nothing.

  465. DeWitt Payne
    Posted Aug 1, 2008 at 8:37 AM | Permalink

    cba,

    Are you sure your emission units aren’t W m-2 nm-1 steradian-1? If you calculate a Planck curve for a given temperature, is there a pi in the formula? I say this because I’m trying to duplicate your results by raising the CO2 concentration in MODTRAN and my peak intensity is lower than yours by a factor of pretty close to three. I’m going to assume that it actually is steradian-1. In that case, the results in the posted graph should be a factor of 3.14 higher. I’m at 30,000 ppm CO2 and the results are starting to look a lot like yours for CO2 for 40 km looking up.

  466. Hoi Polloi
    Posted Aug 1, 2008 at 8:43 AM | Permalink

    The End Is Near!

    Only 100 months left…

    Because in just 100 months’ time, if we are lucky, and based on a quite conservative estimate, we could reach a tipping point for the beginnings of runaway climate change. That said, among people working on global warming, there are countless models, scenarios, and different iterations of all those models and scenarios. So, let us be clear from the outset about exactly what we mean.

    The concentration of carbon dioxide (CO2) in the atmosphere today, the most prevalent greenhouse gas, is the highest it has been for the past 650,000 years. In the space of just 250 years, as a result of the coal-fired Industrial Revolution, and changes to land use such as the growth of cities and the felling of forests, we have released, cumulatively, more than 1,800bn tonnes of CO2 into the atmosphere. Currently, approximately 1,000 tonnes of CO2 are released into the Earth’s atmosphere every second, due to human activity. Greenhouse gases trap incoming solar radiation, warming the atmosphere. When these gases accumulate beyond a certain level – often termed a “tipping point” – global warming will accelerate, potentially beyond control.

    According to Andrew Simms in The Graudian: http://www.guardian.co.uk/environment/2008/aug/01/climatechange.carbonemissions

    On which planet has he been living lately???

  467. DeWitt Payne
    Posted Aug 1, 2008 at 8:51 AM | Permalink

    cba,

    I had it backwards. Mine is steradian-1. But it still means I have to multiply the MODTRAN results by pi as well as 10 to convert from W cm-2 micrometer-1 steradian-1 to W m-2 nm-1. At 30,000 ppmv CO2 I’m getting 9.4 W/m2 total, which looks like it should be pretty close to your results. Now for ozone.
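
    For anyone checking the arithmetic, the conversion is just two unit factors and a pi (assuming an isotropic radiance field over the hemisphere, so flux = pi times radiance):

    ```python
    import math

    def modtran_to_flux(radiance_w_cm2_um_sr):
        """Convert W cm^-2 um^-1 sr^-1 to W m^-2 nm^-1 of hemispheric flux."""
        per_m2_um = radiance_w_cm2_um_sr * 1.0e4   # cm^-2 -> m^-2
        per_m2_nm = per_m2_um * 1.0e-3             # um^-1 -> nm^-1
        return math.pi * per_m2_nm                 # sr^-1 -> hemisphere (Lambertian)

    print(modtran_to_flux(1.0))   # net factor is 10 * pi, about 31.4
    ```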

  468. bender
    Posted Aug 1, 2008 at 8:57 AM | Permalink

    On which planet has he been living lately?

    Sounds like Venus.

  469. John Lang
    Posted Aug 1, 2008 at 9:03 AM | Permalink

    The concentration of carbon dioxide (CO2) in the atmosphere today, the most prevalent greenhouse gas, is the highest it has been for the past 650,000 years.

    And for 600,000 of those 650,000 years, there was a mile of ice over New York.

    And 121,000 years ago, temperatures were up to 3.0C warmer than today, when CO2 was only 277 ppm, or 30% lower than today. According to global warming theory, the CO2 level should have been at least double what it was to cause such a warm period.

  470. Posted Aug 1, 2008 at 9:22 AM | Permalink

    Re #451

    And 121,000 years ago, temperatures were up to 3.0C warmer than today, when CO2 was only 277 ppm, or 30% lower than today. According to global warming theory, the CO2 level should have been at least double what it was to cause such a warm period.

    Not true; global warming theory also takes account of orbital and solar changes. I would suggest that you look at the following for ~120 kya:

    Milankovitch cycles

  471. DeWitt Payne
    Posted Aug 1, 2008 at 9:48 AM | Permalink

    cba,

    I have produced this graph that looks a lot like yours. The integrated power is 34.5 W/m2. The conditions used: 1976 standard atmosphere, 40 km looking up, CO2 38,400 ppmv, Strat. Ozone scale 10,000, everything else default. That’s a factor of 100 increase for CO2 and 10,000 for ozone.

    The TOA looking down and ground level looking up using these conditions are way off, so your calculations at low altitude seem to be better than those at high altitude. That would imply that the problem is still with your line width and height vs. pressure calculation. Your low pressure line strengths must be orders of magnitude too high. What do you calculate for the peak line strength for CO2 at 15 micrometers and ozone at 9.6 micrometers at zero pressure?
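
    To illustrate the suspicion here: a pure Lorentz profile has peak absorption S/(pi*gamma), and gamma shrinks linearly with pressure, so a code without a Doppler floor will let line peaks blow up at altitude. A sketch with made-up line parameters, not real HITRAN values:

    ```python
    import numpy as np

    KB = 1.381e-23    # J/K
    C = 2.998e8       # m/s
    AMU = 1.6605e-27  # kg

    def lorentz_hwhm(gamma0, p_atm, T, n=0.7, T0=296.0):
        """Pressure-broadened half-width (cm^-1): gamma0 * (p/p0) * (T0/T)^n."""
        return gamma0 * p_atm * (T0 / T) ** n

    def doppler_hwhm(nu0, T, mass_amu):
        """Doppler half-width (cm^-1): nu0/c * sqrt(2 ln2 kT/m)."""
        v = np.sqrt(2.0 * np.log(2.0) * KB * T / (mass_amu * AMU))
        return nu0 * v / C

    def peak_absorption(S, nu0, p_atm, T, gamma0=0.07, mass_amu=44.0):
        """Rough line-peak k: Lorentz peak, but never narrower than Doppler."""
        gl = lorentz_hwhm(gamma0, p_atm, T)
        gd = doppler_hwhm(nu0, T, mass_amu)
        if gl > gd:                                    # pressure-broadened
            return S / (np.pi * gl)
        return S * np.sqrt(np.log(2.0) / np.pi) / gd   # Doppler-limited

    # a pure-Lorentz code lets the peak grow as 1/p; the Doppler floor caps it
    for p in (1.0, 0.1, 0.01, 0.001):
        print(p, peak_absorption(S=1e-19, nu0=667.4, p_atm=p, T=220.0))
    ```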

  472. Posted Aug 1, 2008 at 10:11 AM | Permalink

    #443

    Bit confused about this Russian report, because I thought the time lag was well known; even New Scientist agrees it happens on their climate myths site, but then uses semantics to explain it away.
    As far as I can see, the CO2 levels are reacting to the MWP temperatures, not to today’s.

    TonyB

  473. DeWitt Payne
    Posted Aug 1, 2008 at 10:14 AM | Permalink

    Yes engineers can calculate useful information in turbulent flow regimes. The McLaren and Ferrari Formula One teams wouldn’t be spending megabucks on computational fluid dynamics to improve the design of their racing cars if they couldn’t. The difference is that the F1 teams have wind tunnels where they can test and refine their mathematical models at full scale. Climate modelers don’t have this luxury.

  474. John Lang
    Posted Aug 1, 2008 at 10:46 AM | Permalink

    Not true; global warming theory also takes account of orbital and solar changes. I would suggest that you look at the following for ~120 kya:
    Milankovitch cycles

    They say they do, but what “orbital forcing” is included in the models right now? How much has orbital forcing changed in the last 250 years? How much of the temperature change over the last 150 years is from orbital forcing? If it can cause ice ages and temperature changes of +/- 5.0C in just a few thousand years, there must be some small impact on periods of a few hundred years as well.

    They have no idea so they just ignore it.

    Solar forcing included in the models is essentially non-existent.

  475. Kenneth Fritsch
    Posted Aug 1, 2008 at 10:51 AM | Permalink

    The discussion of the topic of this thread makes me think about the motivations for climate science to follow the avenues of study that it does. I am nearly convinced that the motivation originates in the answers sought for climate policy questions and in pushing mitigation, and that this explains climate science seemingly creeping away from what in other areas of science would be doing the science in its purer form and letting the chips (any policy-related questions) fall where they may.

    Why do we see climate modeling pushing into spatial resolutions and feedbacks where it admittedly has major limitations, only to back off, when push comes to shove, to a global position that modelers feel more comfortable with? Why do we not see a more concerted effort to present the exposition that Steve M has called for here and the use of simple models to explain 2XCO2? Add to that the question of whether climate models are taking, or can take, into account some of the natural climate variations.

    I think in answering these questions one has to look at what the effects would be on policy, and perhaps on funding for climate science, in light of that science putting all its efforts into determining/predicting a global mean temperature change and attaching some CI limits to it. I think such an effort would lead, regardless of the magnitude predicted, to a rather complacent reaction from John Q Public and his political representatives. It gets hot/cold in the winter/summer or vice versa, so what is the big deal?

    Statements/predictions like droughts, flooding, hurricanes and the entire arsenal of extreme weather/climate events and tipping points are a much more effective array to motivate and interest the public. An averaged global temperature or climate models with a purported capability to predict it will not. An exposition with a simple model to explain 2XCO2 will only be capable of giving a mean global temperature change and again without features to interest and motivate the public. And of course any hints of significant natural variations in climate (beyond man’s ability to control) would seemingly give the public an opportunity to get off the hook.

  476. Mark T.
    Posted Aug 1, 2008 at 10:54 AM | Permalink

    If the basis for the assumption of a link between X and Y turns out to be false

    Such as… divergence? Ahem…

    Mark

  477. bender
    Posted Aug 1, 2008 at 11:04 AM | Permalink

    Statements/predictions like droughts, flooding, hurricanes and the entire arsenal of extreme weather/climate events and tipping points are a much more effective array to motivate and interest the public.

    People experience the natural world as event sequences. Whereas scientists study continuous processes. The urge to repackage science in an accessible form has to be curbed, lest a modest repackaging become a major distortion. Things like “critical thresholds” and “tipping points” that used to be useful scientific terms become meaningless rhetorical devices. All of a sudden, danger is everywhere.

  478. Mark T.
    Posted Aug 1, 2008 at 11:37 AM | Permalink

    Uh, there was supposed to be another blocked out quote in that one. Showed up nicely in the preview pane. Obviously, my comment starts with “Oof.” :)

    Mark

  479. tty
    Posted Aug 1, 2008 at 12:12 PM | Permalink

    Re 108

    Yes, computational fluid dynamics is a very active field. For aircraft the theory is rock solid (well, almost), but unfortunately not even the fastest computers in the world can handle the computations in a reasonable time for a complex shape. So we have to simplify, and approximate, and (bad word) parameterize. And so we must still test-fly, and very carefully open up the envelope, and even so we get the occasional very nasty surprise. Now, if we can’t even fully model the airflow around an aircraft, how likely is it that we can model the airflow around the whole planet, for decades ahead, complete with interactions with ocean, precipitation and everything else?

  480. Posted Aug 1, 2008 at 12:27 PM | Permalink

    Steve;

    Are you aware of NASA-STD-7009 (July 2008)?
    (Models and Simulations Standards)

  481. Sam Urbinto
    Posted Aug 1, 2008 at 12:37 PM | Permalink

    bender: Next on “Survivor Paleo”, the red team throws a dissenter off the island!

    Henry: Yep. Take the second one too:

    “Carbon dioxide only makes up a small part of the atmosphere…”
    “…and so cannot be responsible for global warming.”

    Reverse it:

    “Carbon dioxide absorbs strongly in the infrared…”
    “…and so must be responsible for global warming.”

    In both cases, maybe it is, maybe it’s not. Then we can get more obvious with similar stuff:

    “Carbon dioxide levels have risen since the 1950s…”
    “…and so are responsible for the rise in crime since then.”

    “There are far fewer pirates around today than in the 1800s…”
    “…and so a lack of pirates causes global warming.”

    Just a bunch of types of logical fallacies together. Say, are you still wanting to know who the King of France is? People say it would be beneficial for you to know.

  482. jeez
    Posted Aug 1, 2008 at 1:19 PM | Permalink

    Da Debil made him do it.

  483. D. Patterson
    Posted Aug 1, 2008 at 1:23 PM | Permalink

    452 Phil. says:

    August 1st, 2008 at 9:22 am
    Re #451

    And 121,000 years ago, temperatures were up to 3.0C warmer than today, when CO2 was only 277 ppm, or 30% lower than today. According to global warming theory, the CO2 level should have been at least double what it was to cause such a warm period.

    Not true; global warming theory also takes account of orbital and solar changes. I would suggest that you look at the following for ~120 kya:

    Milankovitch cycles

    Does not compute, Will Robinson! Are you saying temperature does not increase 1C for each doubling of CO2?

  484. LawsofNature
    Posted Aug 1, 2008 at 2:24 PM | Permalink

    RE:#43 (reposted after correction of LF . .)
    Dear Jordan, Steve at al.,
    I have seen these arguments about CO2, as they are presented there, a couple of times and I still don’t seem to get it. I hope this is the right place to ask some questions about it; if not, could you please direct me to one!?
    These are some “facts” as far as I understand them:
    – A CO2 molecule has a lifetime of about 5 years in the atmosphere (estimated with C14 after the nuclear bomb tests in the 1960s); therefore, for longer trends it is unclear where the trend is coming from. An average CO2 molecule which is in the atmosphere now was most likely in the sea 5 years ago.

    – The CO2 concentration in the near-surface sea water (NSSW) is very close to equilibrium with the atmospheric CO2 concentration (again there are only quite short time scales to reach equilibrium, about 10 years AFAIK)

    – this should lead* (with a Revelle-factor of 10) to a roughly 3% higher CO2 concentration in the NSSW compared to the deep sea water (DSW) and therefore a change in the exchange rates between NSSW/DSW
    *assuming a purely anthropogenic cause of the CO2 increase

    – CO2 from fossil sources has a depleted C13/C12 ratio, which allows tracking it (but that doesn’t prove anything about being the reason for any CO2 increase)

    – sources and sinks of CO2 in the oceans are quite independent of each other in space and time (meaning that the transport processes of CO2 within the oceans operate on long timescales and are not very well measured)

    These seem to be some known facts about the CO2 concentrations. But how is it possible, from there, to surely exclude the possibility that the increase is non-manmade? Especially since the exchange rates between NSSW and DSW don’t seem to be shifted by 3% (3% less uptake AND 3% more dumping over the last 150 years would mean that all the anthropogenic CO2 is in the DSW by now – or am I not doing my numbers correctly?)
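
    To make the 3% figure concrete, here is the back-of-envelope version, assuming a Revelle factor of 10 and round numbers (~280 ppm preindustrial, ~385 ppm now):

    ```python
    def dic_change(pco2_now, pco2_then, revelle=10.0):
        """Fractional change in surface-ocean DIC implied by a pCO2 change."""
        d_pco2 = (pco2_now - pco2_then) / pco2_then
        return d_pco2 / revelle

    # ~280 ppm preindustrial to ~385 ppm: a ~37% rise in pCO2
    print(dic_change(385.0, 280.0))   # ~0.037, i.e. roughly the 3% above
    ```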

    Thanks for any help with that puzzle and have a nice weekend,
    lon

  485. KevinUK
    Posted Aug 1, 2008 at 2:29 PM | Permalink

    #448 Hoi polloi

    Thanks for the link to the Guardian article.

    “The concentration of carbon dioxide (CO2) in the atmosphere today, the most prevalent greenhouse gas”.

    Oh dear, yet another typical example of journalistic scientific ignorance. Andrew Simms clearly failed his chemistry GCSE, as he obviously doesn’t know that the most prevalent greenhouse gas is hydrohydroxic acid in its gaseous form, not CO2. In its liquid form this most prevalent of the greenhouse gases covers approx. 70% of our planet (see here for more info) and is responsible for over 90% of the GHG effect.

    Regards

    KevinUK

  486. bender
    Posted Aug 1, 2008 at 2:45 PM | Permalink

    “hydrohydroxic acid”? Egad – sounds dangerous!!

  487. Pat Keating
    Posted Aug 1, 2008 at 2:49 PM | Permalink

    459 and 460
    Sounds almost as bad as the dihydrogen monoxide discovered by Penn and Teller.

  488. Sam Urbinto
    Posted Aug 1, 2008 at 2:52 PM | Permalink

    DHMO

  489. bender
    Posted Aug 1, 2008 at 2:53 PM | Permalink

    monoxides are planet Earth’s stealth killers, aren’t they?

  490. bender
    Posted Aug 1, 2008 at 3:09 PM | Permalink

    P.S. The audience in this case is unreliable. They feign skill, but impartial tests of reliability show them to be charlatans.

  491. KevinUK
    Posted Aug 1, 2008 at 3:12 PM | Permalink

    #461 and 462

    As this web site shows, DHMO is clearly far more dangerous to mankind than CO2.

    Regards

    KevinUK

  492. bender
    Posted Aug 1, 2008 at 3:18 PM | Permalink

    #464 Will global warming result in increasingly toxic levels of DHMO?

  493. KevinUK
    Posted Aug 1, 2008 at 3:45 PM | Permalink

    bender

    Since, according to that web site, it is a major component of acid rain, I would say with 99% certainty that you are right: GW will result in increasingly toxic levels of DHMO. Here are some other reasons why this hazardous substance should be banned:

    1. It is the major component of acid rain.
    2. It contributes to the “greenhouse effect”.
    3. It may cause severe burns.
    4. It contributes to the erosion of our natural landscape.
    5. It accelerates corrosion and rusting of many metals.
    6. It may cause electrical failures and decreased effectiveness of automobile brakes.
    7. It has been found in excised tumors of terminal cancer patients.

    Despite these dangers, dihydrogen monoxide continues to be widely used:

    1. As an industrial solvent and coolant.
    2. In nuclear power plants.
    3. In the production of styrofoam.
    4. As a fire retardant.
    5. In many forms of cruel animal research.
    6. In the distribution of pesticides. Even after washing, produce remains contaminated by this chemical.
    7. As an additive in certain “junk-foods” and other food products.

    Regards

    KevinUK

  494. cba
    Posted Aug 1, 2008 at 4:03 PM | Permalink

    DeWitt,

    There is a pi involved to get all angles in a hemisphere. There’s an inverse relation between wavelength and wavenumber: it’s 10^7 divided by one to get the other. I’m wiped out from a hard day, it’s not over yet, and tomorrow is travel, so I’ve got very little time here through most of the weekend. My calcs are done for 1 cm^3 to get the tau per unit length. I then multiply by the shell thicknesses (in cm) to get the total in the exponent. Using 100 (1 m), my values are quite similar to your earlier one, at least for CO2. Are you using something like 10^7/nm to get /cm, and 10^7 divided by the /cm value to get nm? A 1/cm interval changes its width in nm as the wavelength varies, and that may be why the O3 is off by much more than the CO2. While there are corrections for T & p, it’s all the same code in the software.
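
    A sketch of that bookkeeping, since the Jacobian is easy to drop: nu[cm^-1] = 1e7/lambda[nm], and a fixed 1 cm^-1 interval spans a wavelength width that grows as lambda^2, which hits the 15 micron CO2 band harder than the 9.6 micron O3 band. Illustrative values only:

    ```python
    def nm_to_percm(lam_nm):
        """Wavelength in nm -> wavenumber in cm^-1."""
        return 1.0e7 / lam_nm

    def interval_width_nm(lam_nm, dnu_percm=1.0):
        """Wavelength width (nm) spanned by a dnu (cm^-1) interval at lam (nm)."""
        return dnu_percm * lam_nm**2 / 1.0e7

    print(nm_to_percm(15000.0))        # 15 um  -> ~667 cm^-1
    print(interval_width_nm(15000.0))  # 1 cm^-1 at 15 um  -> 22.5 nm
    print(interval_width_nm(9600.0))   # 1 cm^-1 at 9.6 um -> ~9.2 nm
    ```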

    TOA, BOA etc. are not tremendously far off. The bottom 15 km OLR line is indistinguishable from the MODTRAN one.

  495. Kenneth Fritsch
    Posted Aug 1, 2008 at 5:15 PM | Permalink

    KevinUK @ #472:

    Any serious environmentalist would quickly put the lie to what you have presented in this post concerning the ill effects of HHO. We should not (and need not) worry about those chemicals that occur naturally. In the case of HHO this becomes an issue of tracing the part of this material that occurs naturally versus the part that occurs by man’s hand, e.g. CH4 + 2O2 = CO2 + 2H2O.

    I know for a fact that when I fertilize my lawn with natural fertilizers that contain K and P and N, I am doing the right thing compared to using a fertilizer that is manmade and contains the same KPN compounds.

    I suppose next you will point to all those known naturally occurring carcinogens and attempt to compare them to those made by man. If man is affected by naturally occurring chemicals then it was meant to be. Think mosquitoes, malaria, DDT and breaking bird eggs. Kevin, in UK or anywhere in the world, please remember, a simple way of keeping straight what is good and what is bad: Natural is good and manmade is bad.

  496. bender
    Posted Aug 1, 2008 at 5:35 PM | Permalink

    #123

    modelers are not so much trying to model climate as they are trying to model an alarming impact from increased CO2 levels

    Speculation on motive or intent. Safer to assume they are trying to do exactly what they say they are trying to do. That they just aren’t achieving the robustness that they think they are.

    #124 But averaging does give you a more reliable answer if there are many semi-skilled individuals in the audience. That was Nathan Kurz’s point. [I'm always amazed how much better the audience is than me at pop culture questions.]

    Contrast the “phone-a-friend” option, where you have a sample poll of n=1. Here your inference of the informant’s credibility hinges not on the consistency of popular opinion and the CLT, but on (1) your background insight about them, and (2) the way they answer the question. When they are “just winging it” you lose confidence. [At least I do.]
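
    A toy simulation of the contrast (all probabilities invented for illustration): even if only a minority of the audience is semi-skilled and everyone else guesses among four options, the plurality is right far more often than a lone friend who is right, say, 65% of the time.

    ```python
    import random

    def audience(n=100, skilled_frac=0.3, p_skilled=0.5, trials=5000):
        """P(the plurality of a 4-option audience poll hits the right answer)."""
        wins = 0
        for _ in range(trials):
            votes = [0, 0, 0, 0]                 # option 0 is "correct"
            for _ in range(n):
                if random.random() < skilled_frac and random.random() < p_skilled:
                    votes[0] += 1                # a semi-skilled voter got it right
                else:
                    votes[random.randrange(4)] += 1   # a pure guess
            wins += votes[0] == max(votes)       # ties counted as wins
        return wins / trials

    print(audience())   # ~0.98-0.99, versus 0.65 for the lone friend
    ```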

  497. bender
    Posted Aug 1, 2008 at 5:41 PM | Permalink

    P.S. At the risk of hyper-extending the “Millionaire” analogy, DrK is the friend that was phoned. And I liked his reply.

  498. jae
    Posted Aug 1, 2008 at 5:48 PM | Permalink

    NCPA is skeptical about GCM skill. However they fail to address the question of the source of the current warming trend. Being skeptical is fine, but a scientist is obliged to go further and speculate on mechanisms, to generate a working hypothesis. When it comes to decision time, precaution may dictate accepting the working hypothesis as tentatively correct. NCPA conveniently avoids the whole issue of (1) alternative explanations and (2) the wisdom of taking precaution against extreme risks

    By “current,” I suppose you mean over the last 70 years? And a scientist can also falsify a hypothesis, without generating another one, no?

  499. bender
    Posted Aug 1, 2008 at 6:45 PM | Permalink

    bender: NCPA conveniently avoids the whole issue of alternative explanations of the current warming trend.

    jae:

    By “current,” I suppose you mean over the last 70 years? And a scientist can also falsify a hypothesis, without generating another one, no?

    Has something been falsified without my knowledge? I rather doubt it.

    Please, jae, don’t do this to yourself. I know it’s only natural for you, but try to resist.

  500. Geoff Sherrington
    Posted Aug 1, 2008 at 7:38 PM | Permalink

    We have an 84-year-old lady friend who has dinner with us about fortnightly. Very sharp mind. She brings a collection of clippings with her. This week (ref Melbourne’s Andrew Bolt editorial, and his seven climate graphs) there was a response from Professor Barry Brook, head of the Adelaide University research institute for climate change, with this:

    “But the real issue actually boils down to this: climate scientists don’t use temperature charts to “prove” global warming. They use the scientific principles of physics, chemistry, geology, biology and so on, to form ideas and make predictions.”

    Next, there was a booklet “The Mini Rough Guide to Energy and our Planet” from http://www.roughguides.com with the Shell logo plastered over it. Shell Oil? Yep.

    Nuclear Power, p 39. “In one memorable episode of the Simpsons, Lisa and Bart go fishing near the Springfield nuclear plant and catch a three-eyed fish they name Blinky. Radioactive waste from the nuclear plant has created a whole new mutant sub-species. The episode reflects widespread anxieties about nuclear power….”

    (My bold).

    Mr Shell Oil, you are promoting activism by sponsoring a minority view and using incorrect science. You should wash your mouth out with a litre of car shampoo. What a disgusting sponsorship from what should be a responsible major.

    Someone might accuse you of poisoning science by using unclean industry funds to support AGW.

  501. D. Patterson
    Posted Aug 1, 2008 at 8:24 PM | Permalink

    Raven says:

    August 1st, 2008 at 5:31 pm
    #122 bender
    “Being skeptical is fine, but a scientist is obliged to go further and speculate on mechanisms, to generate a working hypothesis”

    The trouble is the most plausible alternative hypothesis at this time is that CO2 causes warming just not enough to justify the panic. The only way to test this alternative hypothesis is to wait another 10 years and see if the current flat temperature trend is the calm before the storm or evidence of low CO2 sensitivity.

    Is it plausible? Is it perhaps not “a plausible alternative hypothesis at this time that CO2 causes warming” until such time as someone cares to explain exactly how atmospheric CO2 decreased 1500 ppm, from 4500 ppm to 3000 ppm, at the same time as the global temperature increased 10C? Why wait for a future event when the past already poses the test: explain how the world’s temperature increased while its CO2 decreased by a third, despite all other forces?

  502. Follow the Money
    Posted Aug 1, 2008 at 8:46 PM | Permalink

    #129

    I recently read a similar interesting contortion of the Scientific Method, predictably in the climate science context. From the Hafemeister article printed in the APS newsletter along with Monckton’s article, here are the final sentences.

    Conclusion: Earth is getting warmer. Basic atmospheric models clearly predict that additional greenhouse gasses will raise the temperature of Earth. To argue otherwise, one must prove a physical mechanism that gives a reasonable alternative cause of warming. This has not been done. Sunspot and temperature correlations do not prove causality.

  503. bender
    Posted Aug 1, 2008 at 8:59 PM | Permalink

    #129-#130
    People seem compelled to choose sides. But if neither A nor B can explain the data owing to the uncertainty surrounding either proposition, one alternative that people keep forgetting is to remain undecided. Unask the question. Rephrase it. Let the scientists obtain more data.

    If you choose not to decide, you still have made a choice.

  504. bender
    Posted Aug 1, 2008 at 9:58 PM | Permalink

    #504 Zamboni killed the joke. But while you’re here, why not address the line above the one you chose to address? You talk like something’s been “falsified”. Did I miss a major breakthrough publication?

  505. jae
    Posted Aug 1, 2008 at 10:14 PM | Permalink

    Dear bender, 505:

    #504 Zamboni killed the joke. But while you’re here, why not address the line above the one you chose to address? You talk like something’s been “falsified”. Did I miss a major breakthrough publication?

    Well, where do I start? Douglass and Spencer have falsified the models, IMHO. And Lucia may have, also. The models do not comport with the actual temperatures measured in the troposphere, surface, SST, etc. And they have not predicted the 8-10-year “flat spot” in the warming. And looking beyond the models, there is absolutely nothing that suggests to me that there is a valid hypothesis. No empirical evidence, except a possibly spurious correlation between CO2 levels and temperature rise. No sensible physical explanation, despite Steve Mc’s constant plea for one. No demonstration that the modern warming is not just a natural cycle. It’s ho-hum science, and pure arm-waving. In short, the emperor has no clothes!

    Now, please tell me why you are straddling the fence so tightly.

    Steve: please stay away from attempts to argue complicated issues in a few sentences. This blog works better when people deal with specifics.

  506. bender
    Posted Aug 1, 2008 at 10:26 PM | Permalink

    Lucia may have, also

    What’s with the indecisive language? And it sounds like you may have missed my refutation of her “refutation”.

    please tell me why you are straddling the fence so tightly

    Certainly. There is too much uncertainty for me to decide one way or the other. I have no idea what the true uncertainty is around the CO2 sensitivity coefficient. Whereas you seem to think it’s almost certainly close to zero. Where you get that level of certainty is beyond me. Divine insight?

    Your other parrottings I will leave for the birds.

  507. bender
    Posted Aug 1, 2008 at 10:30 PM | Permalink

    please stay away from attempts to argue complicated issues in a few sentences

    jae will never learn to do this. He is only too willing to resort to religious assertions. Incantations on topics he does not understand. He doesn’t see any danger in it.

  508. bender
    Posted Aug 1, 2008 at 10:58 PM | Permalink

    I am sensing among CA commenters a growing confidence in the absence of a CO2 AGW effect. Willis, jae, Allan MacRae for example, all getting bolder in their assertions. If so, this makes no sense to me. There are little to no new data to go by. I’ve always been skeptical of GCM skill; Koutsoyiannis changes nothing for me. I’ve always said temperature could deviate up or down from the true GHG-caused trend due to internal variability that we do not understand; lucia’s take on the UAH flatline does nothing for me. I’ve always maintained current temps and rate of warming are not unprecedented; Loehle resurrecting the MWP does nothing for me. I’ve explained before the difficulty of inferring causality in a multivariate feedback system by looking at lag vs. lead relationships. Christy’s stuff on tropical troposphere trends is more than a year old, and absence of evidence is still not evidence of absence. We always knew cloud feedbacks & moist convection were the major uncertainty, and the IPCC consensus admits as much. None of this is new. None of it.

    jae asks why I sit on the fence. Ans: I haven’t any new information that could serve to update my priors. If you’re not sitting on the fence, maybe your priors are based on worthless religious baggage?

  509. TheDude
    Posted Aug 1, 2008 at 11:04 PM | Permalink

    I think it has more to do with skeptics starting to smell blood. For me it’s less the developments and more the denials of freedom of information requests. That reeks of cover-up and of potentially embarrassing information they are hiding.

  510. Raven
    Posted Aug 1, 2008 at 11:52 PM | Permalink

    #508 bender says: ‘There are little to no new data to go by’

    There is a lot of new data, each piece of which would be irrelevant individually but which, taken together, supports the argument that the IPCC has overestimated CO2 sensitivity. I know that none of these examples is conclusive and the timeframes are short, but they do give skeptics a reason to be optimistic:

    – seven years with a declining temperature trend during a period with no volcanoes – a rare event in the historical record.
    – declining or no increase in OHC (i.e. no evidence of warming in the pipe)
    – tropospheric temperature trends that are much lower than predicted.
    – recovery of the arctic ice after the massive melt last year
    – a long solar cycle that has the potential to validate a strong sun-climate link.
    – credible analyses that demonstrate the problems with the models

    Personally, I am waiting for SC24 to ramp up. If the current cooling is nothing but a La Nina/solar-minimum blip like in 1988-1989, then we should see temperatures recover rapidly and exceed the levels of 2005. If that recovery is weak or does not happen at all, then it will be clear that we have had a climate regime change similar to what happened in the 50s, and a lot of people are going to have to rethink their assumptions about CO2.

    That said, I realize this may all be wishful thinking, but from what I can tell the current state of climate science is so muddled that I don’t think the alarmists have any business claiming the certainty they do, and even if they are proven right I would attribute much of the success to luck rather than good science.

  511. Pat Frank
    Posted Aug 2, 2008 at 1:00 AM | Permalink

    #508 & 510 The unspoken point also at issue here is one of scientific integrity. If the climate sensitivity to CO2 is so clearly ambiguous, then for at least 20 years certain scientists have been engaged in a conscious and continuing betrayal of their professional integrity. They have claimed certainty and pronounced climate apocalypse where none can be scientifically supported.

    Likewise, in the SPMs the IPCC has been systematically and consciously misconstruing the state of certainty regarding an effect of human-generated CO2 on climate.

    Given the above ambiguity regarding the effect of CO2 on climate, one cannot believe that the competent physicists associated with the IPCC position did not know that climate models are incapable of supporting the publicly stated claims.

    The unspoken point, given the above climate ambiguity, and one which remains serious whether or not future advances in climate science show that human-produced CO2 could in fact cause dangerous warming, is that for the past 20 years scientists will have engaged in a systematic betrayal of science for political ends. They will have cannibalized public trust to nourish a public lie.

    If that turns out to be the case, the standing of science and scientists will be darkened for a generation. It would take something like South Africa’s post-Apartheid Truth Commission to restore credibility.

    Sincere motives notwithstanding, it is the worst scandal, ever, in the history of science. Purity of motive is no exoneration. The Inquisition demonstrated that.

  512. Posted Aug 2, 2008 at 1:38 AM | Permalink

    You forgot to mention that not only does it cause global warming on earth, DHMO has also been detected in reservoirs and aquifers and has even been found on Mars (obviously Phoenix brought it).

  513. bender
    Posted Aug 2, 2008 at 1:54 AM | Permalink

    #510

    - declining or no increase in OHC (i.e. no evidence of warming in the pipe)
    – tropospheric temperature trends that are much lower than predicted.
    – recovery of the arctic ice after the massive melt last year
    – a long solar cycle that has the potential to validate a strong sun-climate link.

    Raven, what happened to your skepticism?

    – absence of evidence of warming in the pipe is not evidence of absence
    – absence of evidence of tropospheric warming could be caused by overriding effects; atmospheric circulation is poorly understood
    – one year positive anomaly is not a trend toward “recovery”
    – have you been reading the Svalgaard threads?

    Euphoric skeptics are prone to falling into the same trap as the alarmists: uncertainty denial. Our ignorance is deep. Certainty on either side is unwarranted. Balanced precaution is sensible.

  514. D. Patterson
    Posted Aug 2, 2008 at 2:28 AM | Permalink

    513 bender says:

    August 2nd, 2008 at 1:54 am
    #510

    – declining or no increase in OHC (i.e. no evidence of warming in the pipe)
    – tropospheric temperature trends that are much lower than predicted.
    – recovery of the arctic ice after the massive melt last year
    – a long solar cycle that has the potential to validate a strong sun-climate link.

    Raven, what happened to your skepticism?

    – absence of evidence of warming in the pipe is not evidence of absence
    – absence of evidence of tropospheric warming could be caused by overriding effects; atmospheric circulation is poorly understood
    – one year positive anomaly is not a trend toward “recovery”
    – have you been reading the Svalgaard threads?

    Euphoric skeptics are prone to falling into the same trap as the alarmists: uncertainty denial. Our ignorance is deep. Certainty on either side is unwarranted. Balanced precaution is sensible.

    True as far as it goes, but it still ignores a glaring and contradictory certainty. The planetary climate underwent major changes in global temperature without the predicted changes in atmospheric CO2 concentrations. Even the sign of the change, plus or minus, has typically been contrary to the prediction of CO2 influence as a GHG. How is this supposed to be scientifically possible? Where are the mathematics to explain such a contradiction?

  515. Raven
    Posted Aug 2, 2008 at 2:31 AM | Permalink

    #513 (bender)

    “Raven, what happened to your skepticism?”

    It is possible to be skeptical without being cynical.

    I made it clear that all of those things are inconclusive and prove nothing; they only provide some reason for optimism for those of us who would like to see this entire CO2 thing go the way of the population bomb. That said, I am still realistic and realize that we may never really know how much influence CO2 has, because its influence will likely be impossible to separate from the chaotic internal gyrations of the climate.

    And yes, I do read the Svalgaard thread and realize that a sun-climate link is nothing but idle speculation given our current set of data, but SC24 may give us new data that would tell us more (Leif has said as much). Mind you, that new data may simply confirm that the sun has no measurable effect on climate, but we won’t know till we get the data.

    The PDO/PDV is the other point where we have the opportunity to collect more data over the next few years. If it is in fact an oscillation then we should see evidence of the cold phase over the next 10 years. Of course, nothing will be proven no matter what happens but it will add to our knowledge.

    I am also looking forward to the data from the AQUA satellite, which has apparently been kept under wraps pending verification. Roy Spencer has hinted that the water vapour data contradicts the models, and Anthony Watts has been told that the CO2 is not as well mixed as the models assume (what this means is anyone’s guess).

    Bottom line: the only way out of our pit of ignorance is new data and the testing of hypotheses against those data.

  516. MarkR
    Posted Aug 2, 2008 at 2:33 AM | Permalink

    I just got my IP address blocked on RealClumsy and RabettRan. Seems like they are acting in concert to stifle debate. Anyone else?

    Steve: The exchange of IP addresses by climate scientists is something that I find disquieting. For example, the U of Arizona Tree Ring Lab blocked my IP address (now lifted), which had been previously blocked at Mann’s University of Virginia website and Rutherford’s RWU website. It sure says something about the mentality of these authors.

  517. D. Patterson
    Posted Aug 2, 2008 at 2:40 AM | Permalink

    Raven says:

    August 2nd, 2008 at 2:31 am
    [....]
    Bottom line: the only way out of our pit of ignorance is new data and the testing of hypotheses against those data.

    Only way? How is testing against forecasts of the future the only way “out of our pit of ignorance” when you have at least 600 million years of past experience being mostly ignored?

  518. Raven
    Posted Aug 2, 2008 at 2:52 AM | Permalink

    D. Patterson says:
    “Only way? How is testing against forecasts of the future the only way “out of our pit of ignorance” when you have at least 600 million years of past experience being mostly ignored”

    I think studies of the paleo data are useful for developing hypotheses but are not conclusive evidence of anything, since proxies tell us very little about what was going on at the time. If we want to know which hypotheses best represent reality, we must use the hypotheses to make predictions of the future and then compare the actual outcomes to the predicted ones. The climate models represent one set of hypotheses which are supported by the modellers’ interpretation of the paleo data. They are not going to listen to any alternate interpretations of the paleo data unless new data collected in the future forces them to do so.

  519. Posted Aug 2, 2008 at 3:12 AM | Permalink

    Bender:

    - absence of evidence of warming in the pipe is not evidence of absence
    – absence of evidence of tropospheric warming could be caused by overriding effects; atmospheric circulation is poorly understood
    – one year positive anomaly is not a trend toward “recovery”
    – have you been reading the Svalgaard threads?

    In order:

    1. No, but that could be said about almost any irrational fear – try substituting “alien invasion” into the sentence above, for example.

    2. Atmospheric circulation is poorly understood. However, when a hypothesis makes a claim about tropospheric warming that is not seen in the real atmosphere, it’s not the real atmosphere that’s got it wrong.

    3. I’ve no idea what that means unless somehow, somewhere, the earth’s climate has some sort of natural sweet spot to which it must return.

    4. I’ve not read the Svalgaard threads because they’re impossible to follow. I tried to get Steve to open a Svalgaard forum so that each piece of the argument could be individually analyzed but nooooo….

  520. D. Patterson
    Posted Aug 2, 2008 at 3:18 AM | Permalink

    518 Raven says:

    August 2nd, 2008 at 2:52 am
    D. Patterson says:
    “Only way? How is testing against forecasts of the future the only way “out of our pit of ignorance” when you have at least 600 million years of past experience being mostly ignored”

    I think studies of the paleo data are useful for developing hypotheses but are not conclusive evidence of anything, since proxies tell us very little about what was going on at the time. If we want to know which hypotheses best represent reality, we must use the hypotheses to make predictions of the future and then compare the actual outcomes to the predicted ones. The climate models represent one set of hypotheses which are supported by the modellers’ interpretation of the paleo data. They are not going to listen to any alternate interpretations of the paleo data unless new data collected in the future forces them to do so.

    Whoever gave you the notion that paleo data is not conclusive evidence of anything, or that paleo data tells “us very little about what was going on at the time”? That sounds very much like the old false argument that there are no facts, only opinions. It is a conclusive fact that the Earth experienced atmospheric CO2 concentrations five to twenty times greater than present throughout more than two-thirds of the past 600 million years, and far longer into the past 4 billion years. The comparatively huge presence of CO2 in the Earth’s atmosphere is not a hypothesis which can be disregarded and trivialized as an unproven interpretation. The massive deposits of coal, oil, and natural gas in the Carboniferous are direct geochemical consequences of the massive decreases in CO2. The changes in CO2 left their geochemical fingerprints directly in the stratigraphy. They cannot simply be ignored and handwaved away.

  521. UK John
    Posted Aug 2, 2008 at 3:25 AM | Permalink

    Is Lord May, the guy heading up the Royal Society, the same man who, as the UK Government’s scientific advisor, advised that government, commerce, industry and our civilisation were threatened by the non-existent Y2K millennium computer bug?

    Any machine-code junkie knew that it was all over-hyped rubbish, but we still spent hundreds of billions; some countries spent nothing and nothing went wrong. Amazing!

    They actually believed that. The Royal Society then elected him as president.

    Is an expert still an expert after he gets it so badly wrong?

    I don’t hold out much hope of common sense intervening, or Steve getting his data.

  522. MrPete
    Posted Aug 2, 2008 at 3:40 AM | Permalink

    UK John, same Lord May AFAIK. However, I’d be cautious about tarring him with Y2k. Parsing your expressed opinion a bit: yes, Y2K was over-hyped. However, it was most certainly not all rubbish. Some people needed to spend billions. Others did not.

  523. crosspatch
    Posted Aug 2, 2008 at 3:45 AM | Permalink

    Any guesses yet on the July satellite temperature data?

  524. Geoff Sherrington
    Posted Aug 2, 2008 at 3:48 AM | Permalink

    Now and then I rabbit on about using geostatistics in climate analysis. The following abstract from earlier this year is an example of why it interests me. The other interest arises because of the neat logical foundation of geostatistics. (Small parts of abstract are deleted by me for brevity …..).

    Tetsuya Shoji, School of Frontier Sciences, The University of Tokyo, Kashiwa 277-8583, Japan

    Received: 23 April 2008 Accepted: 30 April 2008 Published online: 17 May 2008

    Abstract A series of rainfalls observed in central Japan from noon on the 13th to midnight on the 14th, August 1999 (36 h), has been analyzed by spatiotemporal variograms in order to reveal the continuity of rain precipitation in a 3-D space defined by geographic coordinates and time. All instances of zero precipitation are considered, but have been treated as four different cases:…… Hourly precipitation has a statistical distribution best approximated by a Weibull model, and somewhat less well by a normal distribution, in all four cases……. In contrast, temporally stacked rectangular variograms of hourly precipitation shows that the best continuity direction is W-E in all cases (the ranges in case A are 50 and 100 km along the N-S and W-E directions, respectively). A spatial variogram gives a spatial range independently of time, whereas a temporal variogram gives a temporal range. When geographic coordinates are normalized by the spatial range ….. and time is normalized by the temporal range ….. geographic coordinates and time can be treated as equivalent variables. Consequently, a spatiotemporal variogram can be calculated along a given direction in 3-D space using the normalized coordinates. ….. A rectangular variogram in the normalized space, in which the horizontal and vertical axes represent N-S direction and time, respectively, suggests that the series of heavy rainfalls examined here had a continuity pattern that was elongated from west to east (the range values are 20–30 km and 100 km along N-S and W-E, respectively), and that migrated from south to north with a speed of 30 km/h.

    I have added the bold because it supports (in this short-term example) my contention that the connectivity between one weather site and another adopted for interpolation does not extend to the 1000 km used by some global interpolators.

    There is a wrong assumption in the 1000 km case. Annual or monthly time averages of variables like ground temperature cannot be used for interpolation over these distances because a part of the connectivity can be attributed to the averaging, rather than to the climate. I believe that this is noted in the abstract, indirectly.
    http://www.springerlink.com/index/k47v39l3t8838663.pdf
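
    For readers unfamiliar with the tool: the empirical semivariogram at the heart of such an analysis is simple to compute. Gamma(h) is half the mean squared difference between observations separated by lag h, and the lag at which it levels off (the “range”) is the distance beyond which interpolation has little statistical support. A minimal sketch on synthetic data, not the paper’s method or data:

    ```python
    import numpy as np

    def semivariogram(coords, values, bins):
        """Empirical isotropic semivariogram: coords (n,2), values (n,), bin edges."""
        n = len(values)
        dist, halfsq = [], []
        for i in range(n):
            for j in range(i + 1, n):
                dist.append(np.hypot(*(coords[i] - coords[j])))
                halfsq.append(0.5 * (values[i] - values[j]) ** 2)
        dist, halfsq = np.array(dist), np.array(halfsq)
        idx = np.digitize(dist, bins)
        return np.array([halfsq[idx == k].mean() if (idx == k).any() else np.nan
                         for k in range(1, len(bins))])

    # synthetic demo: 200 sites with a spatially correlated signal plus noise
    rng = np.random.default_rng(0)
    xy = rng.uniform(0.0, 100.0, (200, 2))
    z = np.sin(xy[:, 0] / 20.0) + 0.3 * rng.standard_normal(200)
    print(semivariogram(xy, z, bins=np.arange(0.0, 110.0, 10.0)))
    ```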

  525. Stan Palmer
    Posted Aug 2, 2008 at 6:04 AM | Permalink

    With regard to the Y2K issue, it was real and had significant effects. It was a significant factor in creating and popping the high-tech bubble.

    Vendors did analyze their products and found significant issues with the Y2K problem. Customers had a few choices: they could pay significant amounts to try to fix the bug in their existing equipment, or pay significant amounts to replace it. Since high-tech equipment has a useful life of about 5 years, the first option would mean that a great deal of money would be spent on repairing equipment that would soon be replaced anyway. The most economical choice was to replace the equipment and essentially scrap the Y2K bug along with it. This is what was done.

    This caused the market for new equipment to expand greatly in the years prior to 2000 and to shrink dramatically after. This was in part the cause of the bubble and the reason why it burst when it did.

  526. Craig Loehle
    Posted Aug 2, 2008 at 7:21 AM | Permalink

    An assertion is often made that those attempting a refutation of a hypothesis must provide and prove an alternative hypothesis or mechanism. This is not true either in theory or practice. If someone proposes that x happened because of state y and you demonstrate that state y did not or can not exist, you do not need to explain x. You have done something scientific. For example, a theory for human migrations to the Americas was an ice free corridor through Alaska and Canada into the Great Plains. Various studies have made this theory implausible (not impossible) but it is not incumbent on those addressing this question to prove how man did arrive here. Likewise, when people address poor predictive ability for models or bad station locations or cherry picking tree ring series, it is not incumbent on them to explain temperature trends. They (and CA) are performing one of the roles of a scientist—checking claims and results.

    Second, plausibility does not prove magnitude of effect. That is, CO2 etc are plausibly the cause of 20th Century warming, but this neither proves that they caused all of the warming nor that further increases will incinerate us. This is bad logic.

  527. MarkR
    Posted Aug 2, 2008 at 7:56 AM | Permalink

    #516 SteveM. I think in this case they are both running SiteMeter, and it’s causing a lot of sites to crash in Internet Explorer. Just started happening, I think.

    http://www.blogherald.com/2008/08/02/sitemeter-crashes-blogs/

    SiteMeter Crashes Blogs
    Filed as Features on August 2, 2008 5:52 am
    by Thord Daniel Hedengren

  528. Gerry Morrow
    Posted Aug 2, 2008 at 8:05 AM | Permalink

    Didn’t a certain I. Newton have a paper which claimed that white light was composed of multiple colours turned down by the Royal Society because the consensus was that “white” was the purest colour and could not be made up from other colours?

    It was a crime then and it is a crime now to shut down scientific argument and research.

  529. bender
    Posted Aug 2, 2008 at 8:15 AM | Permalink

    #519
    John A, these points pair up with Raven’s. So #3 is about ice “recovery”. I accept your protests as reasonable. A protest is not a refutation, however.

    #523

    An assertion is often made that those attempting a refutation of a hypothesis must provide and prove an alternative hypothesis or mechanism.

    Italics mine.

    This is true. In my case I did not make this assertion, however. I distinguish between the scientist’s responsibility to consider alternative hypotheses versus his right to pursue a single hypothesis. Although no one is obliged to vigorously pursue multiple alternatives, the science works most efficiently when the community at large keeps an open mind about the possibility of alternatives. Campy science, where opposing schools of thought dig their heels in against all reason, is inefficient science.

    So I agree that scientists are not obliged to provide and prove alternatives. But society appreciates it when they do so.

    Many scientists would agree with me that closed-mindedness in (or out of) science is suspicious.

  530. jae
    Posted Aug 2, 2008 at 8:33 AM | Permalink

    Can we all agree that there is far too little certainty to warrant invoking the “precautionary principle” and causing severe disruptions of economic systems for “mitigation”?

  531. MrPete
    Posted Aug 2, 2008 at 9:17 AM | Permalink

    In particular, what does it mean to be “cautious” in our action, if part of the uncertainty is whether the actions being considered will have a beneficial impact? People are notoriously bad at predicting the impact of their actions.

    Seems to me that precautions taken should be known to help. If a forest fire is approaching your home, pouring cool liquid on the roof can help protect it… but perhaps not so much if the liquid is gasoline. ;)

  532. Raven
    Posted Aug 2, 2008 at 9:57 AM | Permalink

    D. Patterson says:
    “It is a conclusive fact that the Earth experienced atmospheric CO2 concentrations five to twenty times greater than present throughout all of the more than two-thirds of the past 600 million years and far longer into the past 4 billion years.”

    Climate modellers have convinced themselves that this does not contradict their models. They would only be able to do this if the data was pliable enough to allow them to add factors that explain away the inconsistencies – hence my comment that the paleo data is inconclusive.

  533. jimdk
    Posted Aug 2, 2008 at 10:25 AM | Permalink

    I agree with Bender but the authorities think otherwise

    Abstract: In this paper, we have used several basic atmospheric–physics models to show that additional carbon dioxide will warm the surface of Earth. We also show that observed solar variations cannot account for observed global temperature increase. Conclusion: Earth is getting warmer. Basic atmospheric models clearly predict that additional greenhouse gasses will raise the temperature of Earth. To argue otherwise, one must prove a physical mechanism that gives a reasonable alternative cause of warming. This has not been done. Sunspot and temperature correlations do not prove causality.
    David Hafemeister and Peter Schwartz
    Physics Department
    Cal Poly University, San Luis Obispo, CA 93405
    dhafemei@calpoly.edu

    http://www.aps.org/units/fps/newsletters/200807/hafemeister.cfm

  534. crosspatch
    Posted Aug 2, 2008 at 10:25 AM | Permalink

    #524

    It’s IE 7 that is crashing with sitemeter. Navigate to their site and if your browser throws an “Operation Aborted” error, then you are likely to have problems at any site that uses sitemeter until they get it fixed.

  535. bender
    Posted Aug 2, 2008 at 10:32 AM | Permalink

    bulldog in retreat:

    There are already plenty of periods in the paleo-climate record that unambiguously exceed global present day temperatures (the Pliocene, Eocene, PETM etc.), so one more is not an issue. – gavin

    But I thought temperatures were “unprecedented” in a millll-yun years?

  536. Kenneth Fritsch
    Posted Aug 2, 2008 at 10:48 AM | Permalink

    I think Bender here captures my skeptical view of all sides of the AGW issue, while Raven captures my optimism about the future of climate science and MrPete definitely captures my skepticism about mitigation with:

    In particular, what does it mean to be “cautious” in our action, if part of the uncertainty is whether the actions being considered will have a beneficial impact? People are notoriously bad at predicting the impact of their actions.

  537. Posted Aug 2, 2008 at 10:52 AM | Permalink

    Any guesses yet on the July satellite temperature data?

    I finally got a tomato today! Based on my garden in Illinois, I will bet they were cool. (Based on my former sister-in-law’s report from California, I’d vote hot.)

  538. D. Patterson
    Posted Aug 2, 2008 at 11:00 AM | Permalink

    528 Raven says:

    August 2nd, 2008 at 9:57 am
    D. Patterson says:
    “It is a conclusive fact that the Earth experienced atmospheric CO2 concentrations five to twenty times greater than present throughout all of the more than two-thirds of the past 600 million years and far longer into the past 4 billion years.”

    Climate modellers have convinced themselves that this does not contradict their models. They would only be able to do this if the data was pliable enough to allow them to add factors that explain away the inconsistencies – hence my comment that the paleo data is inconclusive.

    It’s not the data which is so pliable. It’s the climate scientists and others who refuse to face the reality of the evidence. It is more than strange to watch so many people beat to death ad infinitum some fictitious models which attempt to divine a fictitious future, all the while so carefully ignoring the realities of the past. The existence of the high CO2 concentrations is indicated and confirmed by at least two geochemical methods and one botanical method. The carbonate stratigraphy of the time periods in question simply could not have come into existence without the high atmospheric CO2 concentrations from which the carbonates were obtained and deposited into the stratigraphy. Again, handwaving by climate scientists and others is religion and not science. If you want to assert the fantastic claim that the existence of this high concentration of atmospheric CO2 does not contradict their models and is actually “pliable enough to allow them to explain away the inconsistencies … the paleo data is inconclusive,” you need to present the scientific evidence in support of such an extraordinary claim. You can start by presenting evidence which refutes the works of Berner, Driese, Algeo, and others.

    Atmospheric CO2 decreased by a third, while atmospheric temperatures increased. How did that happen, if temperature must increase 1C for each doubling of the CO2 and must decrease 1C for each halving of CO2?
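
    For what it’s worth, the arithmetic behind that question, assuming the usual logarithmic relation dT = S * log2(C/C0) with S in degrees C per doubling:

    ```python
    import math

    def delta_t(c_now, c_then, sensitivity=1.0):
        """Temperature change (C) from a CO2 change, at S degrees per doubling."""
        return sensitivity * math.log2(c_now / c_then)

    # CO2 falling by a third (4500 -> 3000 ppm) is -0.58 of a doubling, so at
    # 1 C per doubling the CO2 term alone is about -0.6 C; any warming over
    # that interval then has to come entirely from other forcings.
    print(delta_t(3000.0, 4500.0))   # ~ -0.585
    ```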

  539. Posted Aug 2, 2008 at 11:12 AM | Permalink

    519 (JohnA):

    I’ve not read the Svalgaard threads because they’re impossible to follow. I tried to get Steve to open a Svalgaard forum so that each piece of the argument could be individually analyzed but nooooo…

    John, part of that problem is that many people use the Svalgaard forum to peddle their own pet ideas or just to spout generalities [like "is it not possible that ..."] rather than addressing the specific issue at hand, which is that to claim that solar activity [in some form] is an important driver of climate, one must ascribe a high sensitivity to the climate system to the minute solar changes, and we do not know where that sensitivity comes from. I don’t know how to improve that situation, but y’all could make a serious contribution to the discussion by addressing the central problem with the same zeal as that which is applied to bashing each other…

  540. bender
    Posted Aug 2, 2008 at 11:15 AM | Permalink

    #535

    y’all could make a serious contribution to the discussion by addressing the central problem

    Could you be more specific?

  541. Posted Aug 2, 2008 at 11:34 AM | Permalink

    Re #534

    Atmospheric CO2 decreased by a third, while atmospheric temperatures increased. How did that happen, if temperature must increase 1C for each doubling of the CO2 and must decrease 1C for each halving of CO2?

    You’ve already been told why (#483); that you choose to ignore it doesn’t change the fact.

  542. Posted Aug 2, 2008 at 11:39 AM | Permalink

    536 (bender):

    Could you be more specific?

    In the very same post I stated what the problem was:
    the specific issue at hand, which is that to claim that solar activity [in some form] is an important driver of climate, one must ascribe a high sensitivity to the climate system to the minute solar changes, and we do not know where that sensitivity comes from.

    Steve Mc said in the introduction of Svalgaard #8:
    One topic that would interest me (and which I perceive Leif as hoping to get some discussion on) is the impact of new concepts of “small” changes in solar irradiance on “traditional” explanations of the association between the Maunder Minimum and the Little Ice Age.

  543. bender
    Posted Aug 2, 2008 at 11:49 AM | Permalink

    #538
    DrS, I know what the central issue is. What I meant to ask is: how do you want us “to make a serious contribution”?

  544. Posted Aug 2, 2008 at 11:56 AM | Permalink

    539 (bender):

    how do you want us “to make a serious contribution”

    Normally you do not need to be told what and how to comment on posts, and I was counting on the combined expertise of the very erudite readership to come up with something, rather than me telling them what to do and what to comment on and how to address issues. It is, of course, possible that no one [including you] has anything meaningful to say, in which case my quest for help from this corner in understanding this issue is somewhat in vain. :-(

  545. bender
    Posted Aug 2, 2008 at 12:18 PM | Permalink

    #540 My apologies. I thought you were suggesting that people should try to be more serious in their contributions. I thought you had a specific complaint that could easily be addressed. I am afraid there is no easy solution to the problem caused by irrational sun worship. A Svalgaard “unthreaded”, perhaps, where you deposit submissions you consider distracting “pet theories”?

  546. Dave Andrews
    Posted Aug 2, 2008 at 12:43 PM | Permalink

I hope the Royal Society editor can stick to his/her guns, but the auspices are not great if one considers a letter in today’s Guardian from five people who are going to break their bail conditions to attend a ‘Camp for Climate Action’ in protest against plans to build a new, less polluting coal-fired power plant at Kingsnorth in Kent.

    Their letter specifically notes that this is an event “at which Royal Society professors mix with families”

  547. Posted Aug 2, 2008 at 1:11 PM | Permalink

    Raven, what happened to your skepticism?

    – absence of evidence of warming in the pipe is not evidence of absence
    – absence of evidence of tropospheric warming could be caused by overriding effects; atmospheric circulation is poorly understood
    – one year positive anomaly is not a trend toward “recovery”
    – have you been reading the Svalgaard threads?

Regarding the first point: why not? If no attempt had been made to measure OHC, then of course this would be true; but efforts have been made, and so far the OHC record shows an absence of pipeline heat, evidence apparently fortified by the recent reduction in sea levels.

Is it not reasonable to take recent OHC measurements, sea level readings, troposphere temperature charts, the lack of warming in the SH, the poor performance of models, the many problems with surface temperature readings, the poor understanding of many aspects of climate, the historic records which indicate nothing new and unprecedented is currently occurring, the recent strong cooling without a strong volcanic cause, etc., and conclude that catastrophic AGW is unlikely?

  548. UK John
    Posted Aug 2, 2008 at 1:54 PM | Permalink

    #64 I think we’ve debated this before Mr Pete.

    Last time I asked for an example of a Y2K bug software change that you knew of, but memory is fading.

I only know I was paid loads of money for checking software and found absolutely nothing. I never even saw a bug that someone else had found. Did anyone?

#65 You are wrong; it was a complete myth, and you were misled and still are. Which piece of software did you alter? Countries other than the UK and North America did very little, yet nothing went wrong there either. I went to Italy to try to get them to do something, and they thought I was mad; in fact I was!

  549. Posted Aug 2, 2008 at 2:02 PM | Permalink

Here’s a link to Lord May’s thoughts on a variety of subjects.

    http://www.theyworkforyou.com/peer/lord_may_of_oxford

The interview on the BBC with the protestors was depressing, as again AGW was accepted as completely factual and no one queried the figures. Someone spoke in favour of new power plants if appropriate technology can be applied. The idea of so many professors joining the camp is equally depressing. As usual, people won’t wake up until the lights start to go out; that will be in about 5-10 years or so as our nuclear plants age. We have lots of coal, and it makes sense to use it (as effectively as possible), as otherwise we are at the mercy of foreign suppliers who don’t like us very much.

    TonyB

  550. RichardB
    Posted Aug 2, 2008 at 2:18 PM | Permalink

I didn’t personally have any problems with y2k bugs, but I did run some software that quit working Jan. 1, 1999. I did talk to one programmer who told me that he had to fix a lot of dating problems for y2k, i.e. the programs wouldn’t accept any date after the last day of 1999. But nothing that would have caused great problems if it weren’t fixed.

  551. Pete
    Posted Aug 2, 2008 at 2:24 PM | Permalink

    UK John,

It seems to me that not finding “any” Y2K problems, or not having to change code, doesn’t mean there was no need to investigate. I was on the sidelines of some investigations, and I recall that a major issue was that much software was so poorly documented that folks could not even “skim” through the documentation to make an assessment.

Even that may not have been sufficient for safety-critical applications, but I recall that the software folks I knew could not answer the impact question without a lot of work to review the programs. For safety-critical applications you have to be certain.

Perhaps you are referring to non-critical applications, and to situations where software folks knew there was no impact but milked the issue for all the budget they could get. As long as there are people and money, there will always be a tendency for some of that.

  552. Timo van Druten
    Posted Aug 2, 2008 at 2:33 PM | Permalink

For those interested in possible abrupt climate change in Western Europe 12,700 years ago.

“In an article in the scientific magazine “Nature Geoscience”, the geoscientists Achim Brauer, Peter Dulski and Jörg Negendank (emeritus Professor) from the German Research Centre for Geosciences (GFZ), Gerald Haug from the DFG-Leibniz Center for Surface Processes and Climate Studies at the University of Potsdam and the ETH in Zurich, and Daniel Sigman from Princeton University demonstrate, for the first time, an extremely fast climate change in Western Europe. This took place long before man-made changes in the atmosphere, and is causally associated with a sudden change in the wind systems.

The proof of an extreme cooling within a few years 12,700 years ago was obtained from sediments of the volcanic lake “Meerfelder Maar” in the Eifel, Germany. The seasonally layered deposits allow the rate of climate change to be determined precisely. With a novel combination of microscopic studies and modern geochemical scanner procedures, the scientists were able to reconstruct the climatic conditions even for individual seasons. It was above all the changes in wind force and direction during the winter half-year which caused the climate to topple over into a completely different mode within one year, after a short unstable phase of a few decades.
Until now it was assumed that attenuation of the Gulf Stream alone was responsible for the strong cooling in Western Europe. The lake deposits examined show, however, that the atmospheric circulation, probably in connection with the spread of sea ice, played a very important role. At the same time, these new results also show that the climate system is still far from understood, and that the mechanisms of short-term change in particular, and the timing of their occurrence, still hold many puzzles. Micro-layered lake deposits are particularly suitable geological archives with which scientists hope to track down climate change.

    Etc.”

    Source:

    http://www.gfz-potsdam.de/portal/-?$part=GFZextern&locale=en

  553. D. Patterson
    Posted Aug 2, 2008 at 2:50 PM | Permalink

    537 Phil. says:

    August 2nd, 2008 at 11:34 am
    Re #534

    Atmospheric CO2 decreased by a third, while atmospheric temperatures increased. How did that happen, if temperature must increase 1C for each doubling of the CO2 and must decrease 1C for each halving of CO2?

You’ve already been told why (#483); that you choose to ignore it doesn’t change the fact.

Phil, thousands of “Milankovitch cycles” occurred during the tens of millions of years spanned by each of the major long-term changes, plateaus, and valleys in planetary CO2 concentrations and temperature. Your use of the Milankovitch cycles, a relatively very short series of cycles, therefore amounts to nothing more than an invalid and quite absurd excuse for handwaving to avoid answering the question. Otherwise, the planet would be swinging between 10C temperature extremes on cycles of roughly 26,000 years, 100,000 years, or 400,000 years. In fact, there hasn’t been a 22C extreme for tens of millions of years, nor a 12C minimum for many tens of millions of years before that. If you expect to be taken seriously, you are going to have to come up with a much less absurd excuse for handwaving than the Milankovitch cycles.

The IPCC is predicting an imminent and catastrophic runaway climate because CO2 is supposedly a greenhouse gas more powerful than any alternative forcing of the climate. Thousands of Milankovitch cycles did not keep planetary CO2 and temperature from changing in opposite directions. The IPCC and its defenders need to supply the mathematics demonstrating how it is physically possible for CO2 and temperature to have made such glaringly divergent changes in the past, yet can only make a 1C-per-doubling-of-CO2 change in the present and future. Show us how and why it is not possible for CO2 to increase to 4500 ppm while the temperature drops 2C and the planet slides into another glaciation.
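For concreteness, here is the arithmetic behind the 1C-per-doubling rule being invoked (a minimal illustrative sketch of my own; the function name and the 1C figure are assumptions taken from the comment, not anyone’s published code):

    import math

    def delta_t(c_new, c_old, sensitivity_per_doubling=1.0):
        # Logarithmic CO2 rule: a fixed temperature change (deg C)
        # per doubling (or halving) of the CO2 concentration.
        return sensitivity_per_doubling * math.log2(c_new / c_old)

    print(delta_t(2.0 / 3.0, 1.0))   # a one-third drop in CO2 -> about -0.58 C
    print(delta_t(300.0, 4500.0))    # 4500 ppm down to 300 ppm -> about -3.9 C

On this rule even a fifteen-fold fall in CO2 buys only about 3.9 C of cooling at 1 C per doubling, which is the scale of the mismatch the comment is pointing at.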

  554. Posted Aug 2, 2008 at 3:01 PM | Permalink

    Steve: I’ve done some experiments using maximum likelihood methods on the trend data and got wider confidence intervals than Lucia has with Cochrane-Orcutt; and it’s very sensitive to specification e.g. LTP.

I agree the statistical model selected does affect the result of the test. I’m currently looking through ‘model data’ to see what statistical models the climate model output itself suggests.

The difficulty with using longer stretches of historical data is that volcanoes affect the variability, and are thought to increase it. The current period is relatively unusual in having no recent stratospheric volcanic eruptions. The previous period with no eruptions has similarly low “weather noise”, but it’s also fairly short.

Also, based on that previous short period, the variability of 8-year trends, absent volcanic eruptions, is possibly 20% larger than I’m getting, but really not much more. The variability Gavin says models suggest is larger than the variability in the whole thermometer record.

Basically, the situation is: it’s entirely legitimate for people not to believe the Cochrane-Orcutt results, on the basis that weather noise may not be AR(1) and/or a few other reasons. However, it’s a concrete test. Given the results, I think those defending models should be suggesting reasons why we should have confidence in their predictive ability. The defenses advanced by modelers have been fairly weak.
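For readers who want to see concretely what the Cochrane-Orcutt test does, here is a minimal sketch (my own illustration, not lucia’s code; it hard-wires the AR(1) assumption that is exactly the point under dispute):

    import numpy as np

    def cochrane_orcutt(y, t, n_iter=20):
        # Estimate a linear trend in y(t) allowing AR(1) residuals.
        # Returns (intercept, slope, rho).
        X = np.column_stack([np.ones_like(t, dtype=float), t])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS starting values
        rho = 0.0
        for _ in range(n_iter):
            resid = y - X @ beta
            rho = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)
            # quasi-difference to whiten the AR(1) structure, then re-fit
            beta, *_ = np.linalg.lstsq(X[1:] - rho * X[:-1],
                                       y[1:] - rho * y[:-1], rcond=None)
        return beta[0], beta[1], rho

The slope’s confidence interval comes from the transformed regression, so if the true noise is long-term persistent (LTP) rather than AR(1), that interval will be too narrow, which is the substance of the objections below.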

  555. Posted Aug 2, 2008 at 3:04 PM | Permalink

    Bender– where is your refutation of what I did? I like to read them all so I can extend the stuff I do test!

  556. bender
    Posted Aug 2, 2008 at 3:34 PM | Permalink

#555 I did not see Steve’s inline reply to #419, so thank you for highlighting it. This is what I was referring to. My hunch, which I outlined at your blog, is that your confidence intervals are far too narrow. I cannot prove it, so you could say I was being coy with the term “refutation”. (Just seeing if you were paying attention.) But Steve’s work may go some way to proving my case. Certainly Koutsoyiannis’ analysis suggests there may be some internal dynamics going on that Gavin et al. like to sweep under the rug. (Ocean circulation?) These would correspondingly increase the width of your confidence intervals.

You estimated the atmospheric time constant as what, some 8 years, was it? Thus I expect no less than an AR(8) process. The problem is that the time series available for estimating such a high-order process are far, far too short. Still, that’s no excuse for postulating an AR(1).

  557. bender
    Posted Aug 2, 2008 at 3:43 PM | Permalink

    Whoops, 5 years, not 8: http://rankexploits.com/musings/2007/time-constant-for-climate-greater-than-schwartz-suggests/
    I can’t find the quote at the moment.

  558. Posted Aug 2, 2008 at 3:44 PM | Permalink

    Lucia wrote

quote The current period is relatively unusual in having no recent stratospheric volcanic eruptions. The previous period with no eruptions has similarly low “weather noise”, but it’s also fairly short. unquote

    I bet the volcano gods are getting a lot of prayers at the moment — a couple of big eruptions and the ‘cooling’ will have an obvious cause.

    JF

  559. bender
    Posted Aug 2, 2008 at 3:57 PM | Permalink

    I searched and can’t find it, lucia. But in searching … I wonder how an AR(5) fits to EchoG.

  560. maksimovich
    Posted Aug 2, 2008 at 4:00 PM | Permalink

    542(LS)

    In the very same post I stated what the problem was:
    the specific issue at hand, which is that to claim that solar activity [in some form] is an important driver of climate, one must ascribe a high sensitivity to the climate system to the minute solar changes, and we do not know where that sensitivity comes from.

    Steve Mc said in the introduction of Svalgaard #8:
    One topic that would interest me (and which I perceive Leif as hoping to get some discussion on) is the impact of new concepts of “small” changes in solar irradiance on “traditional” explanations of the association between the Maunder Minimum and the Little Ice Age.

    We also have longer time series problems(with the solar question) as Rial 2000 states

1. The ~100 kyr variations in insolation forcing due to eccentricity change are too small (less than 1%) to be the direct cause of the great ice ages (e.g. Imbrie et al., 1993; Berger and Loutre, 1991).
2. The notable absence in the d18O data of significant spectral amplitude at 413 kyr in the last 1.2 million years (e.g. Imbrie et al., 1993; Mix et al., 1995; Clemens and Tiedemann, 1997), in spite of being the largest component of eccentricity forcing (Berger, 1978).
3. The switch from predominant 41 kyr glaciation cycles to predominant 100 kyr glaciation cycles around 0.9 Ma BP (the mid-Pleistocene transition, or MPT) occurred without a corresponding change in orbital forcing (Pisias and Moore, 1981).
4. The well-documented variation in glacial cycle duration from about 80–120 kyr (Raymo, 1997; Petit et al., 1999; Winograd et al., 1992) in the last ~500 kyr does not correlate with insolation.
5. The presence of spectral peaks at frequencies other than those in the insolation forcing (e.g., Ruddiman et al., 1989; Nobes et al., 1991; Bolton et al., 1995; Mix et al., 1995; Clemens and Tiedemann, 1997; von Dobeneck and Schmieder, 1998; Rial, 1999b) indicates nonlinear response of the climate system, but the responsible nonlinear mechanisms are yet to be found.

Then the changes in insolation are not the primary driver. If not, the climate sensitivity must be driven (coupled) by small changes in photochemistry, as seen on small time scales such as the solar cycle, GCR (radiative blocking), and the response to UV in the biogeochemical complex.

  561. bender
    Posted Aug 2, 2008 at 4:08 PM | Permalink

    #559
    I distinctly remember that this is the post that prompted me to comment:

    http://rankexploits.com/musings/2008/ipcc-projections-overpredict-recent-warming/

but I cannot find the comment. Oh well, you understand better than I do what the issues are. I doubt I’ve uncovered anything you haven’t already thought about.

  562. bender
    Posted Aug 2, 2008 at 4:23 PM | Permalink

Though it is worth noting that the issue of how to statistically represent high-order internal climate variability is similar to Steve’s question about how you specify the error structure of a non-stationary bristlecone pine subject to episodic and inexplicable growth spurts that are sustained for unpredictable lengths of time. It’s not a single AR model. It’s several, pieced together (see the toy simulation below). Assuming it is the result of a single high-order process will lead to a poor approximation. Too much meandering, not enough jumping.
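To illustrate “several, pieced together”, here is a toy simulation (entirely my own construction, not a model of any actual proxy):

    import numpy as np

    rng = np.random.default_rng(0)

    def piecewise_ar1(n_segments=5, seg_len=200):
        # Each segment gets its own level and persistence, mimicking
        # episodic growth spurts of unpredictable duration.
        x, out = 0.0, []
        for _ in range(n_segments):
            mu = rng.normal(0.0, 1.0)        # segment level shift
            phi = rng.uniform(0.2, 0.95)     # segment persistence
            for _ in range(seg_len):
                x = mu + phi * (x - mu) + rng.normal(0.0, 0.3)
                out.append(x)
        return np.array(out)

Fitting one global AR(p) to such a series mistakes the regime shifts for very high-order, very persistent structure: too much meandering, not enough jumping.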

  563. MrPete
    Posted Aug 2, 2008 at 7:45 PM | Permalink

    UK John #548, this will be my final comment on Y2k. There’s no need to hijack even “unthreaded” for this topic. If the data was still Googleable, I’d point you there, but it’s all water under the bridge to the world now. (I do have some material I could put back online, but fail to see the value other than chest-beating…)

    I did a major Y2k investigation in 1998, and never had to apologize for what I wrote.

    First, a caveat: I discovered as best I could what was really going to happen on 1/1/2000. The result: I cautioned everyone that would listen to ignore the hype, but also to take seriously the need to check on certain specific issues.

    Here are some specific examples of real Y2K issues. The reference links are probably useless unless you search archive.org.

    Medical System (www.baxter.com): of 339 products, 23 were involved in patient or medical reporting, and 12 of those (52%) had a Y2k problem. The other 317 products directly delivered medical services (blood therapy, etc). Of those 317, six (2%) were kidney dialysis machines that needed an upgrade or they would no longer automatically schedule dialysis.

    Phone System (www.executone.com/yr2000.htm): Call processing continued OK, but dates/days no longer displayed correctly, reports had wrong dates and sorted improperly (00 before 99). Day-of-week call routing broke (i.e. it would use weekend routing instead of weekday routing, thus breaking customer service lines, etc.)

    International calling (www.fcc.gov/year2000/faq.html): The ITU closely monitored international compliance and was prepared to cut off any nation whose phone system was not Y2k compliant. As I recall, everyone made it under the wire, but it was close.

    Credit cards (www.techweb.com/se/directlink.cgi?IWK19980112S0065): Major rework was required (remember the short expiration dates?), however all the work was completed in time, except that 0.1% of card terminals couldn’t handle Y2k dates and had to be replaced.

    PC BIOS (www.award.com/tech/y2k.htm): Any PC with “Award” BIOS dated 26 April 1994 to 31 May 1995 required an upgrade.

    PC Software (www.cnet.com/Content/Reports/Special/Y2k/chart.html): Quite a lot of software required updates. Some of the vendors included Microsoft, Intuit, Lotus, Borland, Claris, Corel, Dbxl. I still use an app (Ecco Pro) with a Y2k bug: repeating dates (e.g. birthdays, etc) do not automatically extend into the future; their automatic “final” date is 2000!

    I could go on, and get details on the embedded oil platform systems that had to be replaced (that was the headache of a friend of mine), etc etc etc. I trust this is sufficient to lay to rest any question whether Y2k was a “real” problem. It was.

    Again (I’ve commented on this before), there are many parallels to AGW that drew me to Steve M’s site in the first place. Some parallels include:
    * Hype, not reason, particularly among some who stand to gain
    * Spiral of fear (A expects 0, prepares for 1; B sees A, assumes A expects 1, so B prepares for 10; C sees that and prepares for 100, etc)
    * Those who decide too quickly that the answer is “known”, and particularly those who take early action on such decisions, cannot be convinced they are wrong. They are too invested.

    Enough from me on Y2k.

  564. Posted Aug 2, 2008 at 8:17 PM | Permalink

    Bender:

A protest is not a refutation, however.

A statement made without proof does not become a truth because there is no refutation. I think it’s time to embrace the “null hypothesis” in climate change.

  565. bender
    Posted Aug 2, 2008 at 8:46 PM | Permalink

I think it’s time to embrace the “null hypothesis” in climate change.

    So you think it’s time to deny the facts. That’s fine. Many disagree.

I think it’s time to *link* the facts in a coherent demonstration of what they imply, i.e. a GCM. The only thing wrong with the GCM approach that I can see is overfitting when you haven’t properly defined the noise structure. And there I will agree with you: the “null hypothesis” must not be underestimated, especially if it is a high-order non-stationary structure with a lot more variability, something somewhat like EchoG. Lucia’s getting there rapidly.

    But what we *think* doesn’t matter if our job here is to audit the facts. So let’s just do that, shall we? Rather than advocate what to do about the facts.

  566. bender
    Posted Aug 2, 2008 at 8:50 PM | Permalink

    Go ahead and take the last word.

  567. DeWitt Payne
    Posted Aug 2, 2008 at 11:36 PM | Permalink

    cba,

    I’m confident of my units conversion because I can take the data from the table in MODTRAN and convert them to the same numbers as are plotted. And my plot using ridiculously high concentrations of ozone and CO2 not only has the same peak intensity as your plot, the shape of the spectral peaks is quite similar. That should bother you.

    You’re getting the same results in the lower 15 km because your line broadening calculation works there. Your calculation fails as pressure broadening decreases to zero. Somehow instead of maintaining the area under the line constant so the emission integrated over the whole line is constant at a given concentration, you are increasing the area as the line narrows so integrated emission increases, by orders of magnitude, as the line narrows. The null hypothesis is that MODTRAN is doing the calculation correctly. If you get emission intensity more than an order of magnitude higher than MODTRAN, you clearly aren’t doing it right. Saying your results are different because you use higher altitude resolution than MODTRAN is a red herring. MODTRAN uses low altitude resolution at high altitudes because you don’t have to. Absorption and emission don’t change much with altitude at high altitude.

  568. DeWitt Payne
    Posted Aug 2, 2008 at 11:39 PM | Permalink

    change next to last sentence to: MODTRAN doesn’t use high altitude resolution at high altitudes because you don’t have to.

  569. Posted Aug 3, 2008 at 6:38 AM | Permalink

    Bender–

You estimated the atmospheric time constant as what, some 8 years, was it? Thus I expect no less than an AR(8) process. The problem is that the time series available for estimating such a high-order process are far, far too short. Still, that’s no excuse for postulating an AR(1)

I got about 12 years using the Schwartz model, which could be wrong. The Schwartz model assumes an AR(1) process based on some simplified physics. The assumption has nothing to do with inspecting properties of the data.

If you want to see how EchoG fits an AR(N) process, I can send you data. I want to know how the periods with no stratospheric volcanic eruptions fit an AR(N) process. But I don’t know how to get the best-fit AR process.
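One standard way to pick a best-fit AR order is conditional least squares plus an information criterion; a minimal sketch of my own (with the caveat, per bender, that short series make high orders unreliable):

    import numpy as np

    def best_ar_order(x, max_p=10):
        # Fit AR(p) by least squares on lagged values for p = 1..max_p
        # and return the order and coefficients minimizing AIC.
        x = np.asarray(x, dtype=float) - np.mean(x)
        best = None
        for p in range(1, max_p + 1):
            y = x[p:]
            X = np.column_stack([x[p - k : len(x) - k] for k in range(1, p + 1)])
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            aic = len(y) * np.log(np.var(y - X @ coef)) + 2 * (p + 1)
            if best is None or aic < best[0]:
                best = (aic, p, coef)
        return best[1], best[2]

For an AR(1) fit to annual data, the lag-1 coefficient phi maps to a Schwartz-style time constant via tau = -1/ln(phi), so a 12-year tau corresponds to phi of roughly 0.92.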

  570. Kenneth Fritsch
    Posted Aug 3, 2008 at 12:18 PM | Permalink

    Lucia, why do you not use annual temperature anomaly data in your analysis?

  571. Sam Urbinto
    Posted Aug 3, 2008 at 12:32 PM | Permalink

    It’s 12 C here in this place. Ten feet away, it’s 12.5 C and another ten feet away, it’s 13.6

    And in this place it’s 11.2

    Coincidence?

    I think not.

  572. Kenneth Fritsch
    Posted Aug 3, 2008 at 12:54 PM | Permalink

    From Leif Svalgaard at #544:

    It is, of course, possible that no one [including you] has anything meaningful to say, in which case my quest for help from this corner in understanding this issue is somewhat in vain.

What I have found most compelling about CA has been the analysis of scientific papers and processes. I judge that a number of participants here are fully capable of rendering alternative conclusions from these papers and processes. However, this rendering does not approach the level of coming up with alternative theories on AGW, and in fact most of the scientific papers analyzed here are not at that level either.

I think “sun worshipping” is a term we can reserve for those of us here who, while probably on the other side of the AGW issue, take too seriously the admonishment of the GHG advocates, who demand an alternative theory before getting too introspective about the consensus one. For the purer-of-heart puzzle solvers, I think that the analysis of papers and processes on all sides of the AGW issue can be an end in itself.

    I think Steve M has to tune his blog finely to avoid some of the more excessive “sun worshipping” and yet allow some alternative interpretations of climate science papers and processes.

  573. Kenneth Fritsch
    Posted Aug 3, 2008 at 1:27 PM | Permalink

    MrPete @ #563:

    I consulted on a Y2K project after retirement and observed that:

    The legal department was very concerned about the company getting sued and required the documentation process to go to great lengths to avoid any legal repercussions, i.e. if something failed at least the company had made the effort to avoid it.

    The IT organization seemed convinced that a problem was going to occur regardless of the measures taken and some were expecting to be fired over the issue.

    Some of the department veterans were convinced that the whole process of documentation was not necessary and potential repercussions were being greatly overblown.

The permanent employees appeared to have mixed feelings about the highly compensated consultants and their worth in the effort, but some let me know that they thought a number of consultants were grossly overpaid.

    After the fact, the process of documentation paid off in any number of ways, not with Y2K problems, but in more efficiently resolving day to day problems.

I think these reactions resulted in part from a mild mass hysteria about Y2K that seemed to be the consensus view at the public and media levels. The documentation was a positive spin-off, though I thought it could have been obtained much less expensively by a more direct and ongoing effort to document software/hardware and processes. Some of the documentation had little value beyond its potential utility in a legal action.

  574. UK John
    Posted Aug 3, 2008 at 1:42 PM | Permalink

    #563 Mr Pete

Agreed, no more Y2K computer bug. It was over-hyped, we all agree.

However, I wish I had found a genuine two-digit date field calculation problem, rather than just getting thoroughly bored. We actually had fewer problems at Y2K than on a clock change or a normal year end.

  575. jeez
    Posted Aug 3, 2008 at 1:53 PM | Permalink

    I was on the beach in Copacabana during the change. Given the hype on the BBC World at the time (Brazil was bright red on their global dangerspot maps) I fully expected the lights to go out at midnight. I had flashlights and water ready in my hotel room. Of course instead I had the best night of my life.

  576. BarryW
    Posted Aug 3, 2008 at 4:35 PM | Permalink

Most of the Y2K problems were with old COBOL programs from early mainframe days, as I understand it. Year 2000 dates were assumed to be 1900. So some poor schmuck got a notice that his driver’s license was 100 years out of date (a pain for him, but nobody died). Some guys were brought back from retirement because no one had touched the programs in years and nobody knew where the bodies were buried, so to speak. The real problem was the Al Gore types who used it for hyping books and such, making it seem like the end of the world.

  577. trevor
    Posted Aug 4, 2008 at 1:54 AM | Permalink

    Over at RC, M Mann and Gavin differ on paleo-climate issues: Journalistic Whiplash, post 192:

    Joe Hunkins Says:
1 August 2008 at 2:09 PM
the models wouldn’t have to worry about that, or rather it’s already sort of baked in, just like El Nino events and the like. That’s a big oversimplification, but it’s the current state of the science in a nutshell, as I understand it.

    Whoa – too much simplification leads to obfuscation, which some think is happening with respect to the MWP.

    As Gavin noted above, Paleoclimate reconstructions, rather than models, are the best evidence to support AGW. Unfortunately some of these paleo reconstructions (esp. tree rings) are not as mathematically robust as one would like to see to definitively counter those who challenge them.

    Resolving the existence and/or significance of the MWP issues is important because

    1) If it was a clear global phenomenon it is another challenge to the alarmists and the pending doom hypothesis.

    2) If it was a clear global phenomenon it raises some potentially serious questions about the methodology of studies that don’t show MWP and the many generalizations about data that suggest MWP is not of interest or significant as a sign that natural CO2 variability is much greater than generally suggested.

[Response: Neither of these conclusions follow. There are already plenty of periods in the paleo-climate record that unambiguously exceed global present day temperatures (the Pliocene, Eocene, PETM etc.), so one more is not an issue. The actual issue is whether it can be understood, but to do this you need to have unambiguous records of solar and volcanic forcing from which you could deduce the residual intrinsic variability. Given these records are very uncertain, combined with the uncertainty in the reconstructions, and the uncertainty in climate sensitivity, it is very unlikely that the medieval period is ever going to be a significant constraint on anything relevant. The paleo-climate that is important for overall climate sensitivity is the LGM, or maybe the Pliocene - much bigger signal-to-noise ratios in both cases. - gavin]

[Response: Actually, you would find a fair number of climate scientists who don't necessarily agree w/ some of what Gavin has stated above. Indeed, Hegerl et al (2006) in Nature argues that you can indeed further constrain climate sensitivity based on precisely this information (well, using the past 7 centuries of paleoclimate reconstructions, anyway). That having been said, the idea of global mean warmth during the Medieval era that rivals current warmth is inconsistent with every paleoclimate reconstruction of the past decade published in the scientific literature. It is also inconsistent with every model simulation study that has been done using best estimate climate forcing. So you're really out on a limb. And perhaps even more to the point, an MWP as warm as today (while, again, inconsistent with all of the best available observational and modeling-based evidence) would most likely indicate a climate sensitivity that is much greater than nearly all available estimates, and would portend even greater future climate change in response to anthropogenic forcing. It would certainly not be a cause for comfort. -mike]

  578. Raven
    Posted Aug 4, 2008 at 2:26 AM | Permalink

    trevor says:

And perhaps even more to the point, an MWP as warm as today (while, again, inconsistent with all of the best available observational and modeling-based evidence) would most likely indicate a climate sensitivity that is much greater than nearly all available estimates.

This is an interesting admission because it indicates that the climate models have problems modeling any variation in climate unless CO2 is included as an amplifying factor. I also find it ironic that Mann assumes that the models must be correct and that the only way to reconcile the models with a warm MWP is to assume that the small increase in CO2 during the period caused the entire warming. More evidence of tunnel vision on the part of the modellers.

  579. PHE
    Posted Aug 4, 2008 at 5:40 AM | Permalink

Here is an exciting new form of climate proxy: the weather records of ships dating from the 1600s.

    Story in UK’s Daily Telegraph newspaper.

    http://www.telegraph.co.uk/news/2496902/Lord-Nelson-and-Captain-Cooks-shiplogs-question-climate-change-theories.html

  580. EW
    Posted Aug 4, 2008 at 6:20 AM | Permalink

Interesting. I looked up the study author G. Wheeler. He has apparently been sitting over the logbooks for some time. His paper from 2005 is about logbook “reproducibility”:

    Logbooks have survived in large numbers and contain notable quantities of climatological information. This paper examines the degree to which these data are reliable and consistently recorded. This is done by comparing the daily observations entered in the logbooks of vessels sailing in convoy, at which times the respective ships’ officers would independently estimate and record the prevailing wind force and wind direction. The results are described using a variety of descriptive summary statistics. In general, wind force records are highly correlated and wind direction differences are relatively small compared with the natural variability of this phenomenon. Wind directions were studied and found to have a bias towards 4-, 8- and 16-point compass readings at the expense of 32-point readings. Corrections were needed to convert the recorded directions, which were made by reference to magnetic north, to their true north equivalents.

  581. kim
    Posted Aug 4, 2008 at 6:53 AM | Permalink

    579 (PHE) Excellent link, there. Dennis Weaver has done beautiful work. I like the line about how it is wrong to connect CO2 to specific climate events.
    ===========================

  582. kim
    Posted Aug 4, 2008 at 6:55 AM | Permalink

    Good Heavens, it’s Dennis Wheeler. Oh well, I first wrote Earl Hepp.
    ==========================================

  583. JamesG
    Posted Aug 4, 2008 at 8:08 AM | Permalink

    “And perhaps even more to the point, an MWP as warm as today … would most likely indicate a climate sensitivity that is much greater that nearly all available estimates, and would portend even greater future climate change in response to anthropogenic forcing.”

I’ve read that several times from these guys and I don’t understand how any scientist can be so illogical. Mann’s obviously circular argument depends on the initial assumption that CO2 sensitivity is high. Mann’s reconstruction was designed to match the Siple curve and so force the conclusion that CO2 controls the climate. If, though, the MWP was global, then it’s clear that CO2 doesn’t control the climate at all, so CO2 sensitivity must be low. Mind you, given the state of a science which relies only on proxies and measurements that support the hypothesis and rejects those that don’t, it’s highly likely that the Siple curve is nonsense too (ref Beck) and that there should indeed be a one-to-one relationship, but with CO2 as effect.

  584. Vincent Guerrini Jr
    Posted Aug 4, 2008 at 10:09 AM | Permalink

    http://www.timesonline.co.uk/tol/news/environment/article4449527.ece

Did this appear in Mann’s Hockey Stick?

  585. theduke
    Posted Aug 4, 2008 at 11:04 AM | Permalink

    Interesting column by a respected historical writer:

    http://www.theaustralian.news.com.au/story/0,25197,24122117-7583,00.html

    in which he compares AGW theory to another “settled” scientific theory of the early 20th century.

  586. Kenneth Fritsch
    Posted Aug 4, 2008 at 11:18 AM | Permalink

    Raven @ #578:

This is an interesting admission because it indicates that the climate models have problems modeling any variation in climate unless CO2 is included as an amplifying factor. I also find it ironic that Mann assumes that the models must be correct and that the only way to reconcile the models with a warm MWP is to assume that the small increase in CO2 during the period caused the entire warming. More evidence of tunnel vision on the part of the modellers.

In my view, Mann’s remark makes sense vis-à-vis CO2 and GMT only if his “climate sensitivity” alludes to the combined effects of a natural event such as the MWP (where we have no indication whatever of higher CO2 or other GHG levels in the atmosphere) and the effects of AGW (caused, in his view, by higher levels of GHGs). Lower GHG levels in a MedWP with temperatures at the level of the ModWP, where GHGs are significantly higher, could not be explained by a larger temperature sensitivity to, for example, a 2XCO2.

I also get a kick out of Mann pointing to all the reconstructions that are mainly incestuous progeny of the original MBH as evidence that the MedWP was a local (and minimal) effect. Part of the original assertion was that maybe those reconstructions and climate models have a large uncertainty.

  587. Sam Urbinto
    Posted Aug 4, 2008 at 12:11 PM | Permalink

The main problems with the year 2000 rollover that needed to be investigated were simple: money, liability, safety, etc. What happens to your life insurance policy, mutual funds, bank accounts and the like if you are no longer 30, you’re -70? What happens if the automatic train switching equipment goes haywire, trains crash, and you get sued out of business and/or people die? Traffic lights? Air traffic control systems?

  588. Andre Tahon
    Posted Aug 4, 2008 at 12:22 PM | Permalink

    As a long-time reader of the various discussion threads on the site, I thought I’d draw your attention to the following recent article:

    Evidence for a solar signature in 20th-century temperature data from the USA and Europe

    which was published in the French publication Comptes Rendus Geoscience.

    To link to the abstract, copy the doi “10.1016/j.crte.2008.06.001″ (without quotation marks) into the text box on http://dx.doi.org

  589. Pat Keating
    Posted Aug 4, 2008 at 12:49 PM | Permalink

    587 Sam

    What happens if the automatic train switching equipment goes haywire, trains crash, and you get sued out of business and/or people die ? Traffic lights? Air traffic control systems?

    That’s where all the BS/hype was. It was conceivable that there might be a few issues with some financial programs, but the idea that real-time controller programs were vulnerable was total alarmist b*cr*

  590. Sam Urbinto
    Posted Aug 4, 2008 at 12:58 PM | Permalink

    589 Pat Keating says: “…That’s where all the BS/hype was….”

Yeah, but you still have to do the due diligence, even if the odds are nothing is going to happen. Good time to upgrade the computers! But sure, does your microwave care what year it is? I think it was mostly a money thing, but yes, way (99.9%?) overblown. AGW reminds me of it. :)

  591. EW
    Posted Aug 4, 2008 at 12:58 PM | Permalink

I downloaded the article. The solar signature isn’t statistically proven, just a curve comparison, but the paper shows nicely that various global regions behave quite differently. The authors used selected stations with a long history.

  592. UK John
    Posted Aug 4, 2008 at 3:04 PM | Permalink

    #590 Y2K +AGW

The scientific method is to propose a hypothesis and then try to demonstrate whether the hypothesis is true.

With Y2K, very early on, no bugs were found, but we all continued anyway. Luckily there was an end date, and nothing happened, anywhere, even with systems that were not checked. This was a UK/North America myth.

With AGW there is no end date.

  593. kim
    Posted Aug 4, 2008 at 9:23 PM | Permalink

    ias.ac.in/currsci/jun252006/1607.pdf

    I especially like reference #4.

    H/t David B. Benson.
    ==============

  594. kim
    Posted Aug 4, 2008 at 9:25 PM | Permalink

    http://www.ias.ac.in/currsci/jun252006/1607.pdf

    That should work, now.
    =============================

  595. Raven
    Posted Aug 4, 2008 at 9:40 PM | Permalink

    Kim says: “I especially like reference #4.”

    I don’t understand what your point is.

  596. DeWitt Payne
    Posted Aug 4, 2008 at 10:15 PM | Permalink

    kim,

    Comparing CO2 to arsenic, how droll. Yet another “We’re all going to die!” piece of tripe.

  597. MrPete
    Posted Aug 4, 2008 at 10:26 PM | Permalink

    ok, I’ll give one more y2k comment. Pat sez “the idea that real-time controller programs were vulnerable was total alarmist b*cr*”

    Note my links above. Two examples of real-time controller vulnerability were given. Phone system weekend/weekday functions, and kidney dialysis scheduling. The former an inconvenience, the latter potentially a matter of life and death if ignored. Controllers on oil platforms also were impacted, and many others.

    Yes, the issues were rare. Yes, the trouble was hyped. Yes, some people profited immensely and unethically. Yet… there were also real issues to be seriously addressed. [Some NON-issues: traffic lights, embedded timer chips. No issues in either one that I could find. Surprisingly, nobody had even asked the question of those who were responsible for timer chip design!]

  598. Posted Aug 4, 2008 at 11:20 PM | Permalink

    BBC Radio 4 had a programme about the inadequacies of the peer review process which might interest readers here. It was mainly about medical sciences but the issues there are identical to those in climatology. “Lack of statisticians among peer reviewers” was one issue cited.

    It’s available online here. I don’t know if this works outside the UK or not.

  599. DeWitt Payne
    Posted Aug 4, 2008 at 11:33 PM | Permalink

    Raven,

    I suspect it’s the name of the journal.

Look at the graph of blood pH vs CO2 concentration. Blood is highly buffered. It seems more than a little suspect that the graph shows pH decreasing faster than it would for CO2 in a solution with just enough bicarbonate to give a pH of 7.1 at current CO2 pressure. Blood pH has been measured for some time. Data showing that pH had decreased over the last 30 years or so would have been a lot more convincing. The paper seemed mostly speculation with little or no hard data.

  600. D. Patterson
    Posted Aug 4, 2008 at 11:35 PM | Permalink

    593 kim says:
    August 4th, 2008 at 9:23 pm
    ias.ac.in/currsci/jun252006/1607.pdf

    Note:

This work demonstrates that the level of carbon dioxide in the atmosphere at which humans can survive indefinitely, is much lower than expected. The estimated toxic level of carbon dioxide in the atmosphere under lifetime exposure is 426 ppm (Figure 1) [4]. At the present rate of increase of carbon dioxide in the atmosphere, the toxic limit will be attained in AD 2050 based on extrapolation of the measured results from Mauna Loa [5]. The effects of carbon dioxide are a reduction in the pH value of blood serum leading to acidosis [4]. The minimum effects of acidosis are restlessness and mild hypertension. As the degree of acidosis increases, somnolence and confusion follow. One of the effects of these changes is a reduced desire to indulge in physical activity [including but not limited to sex].

    Not to worry. Recent advances in gene splicing will permit humans to add the chlorophyll infused epidermal layer necessary to metabolize CO2 and give real meaning to the Green Party and a Greener economy and lifestyle.

  601. DeWitt Payne
    Posted Aug 4, 2008 at 11:57 PM | Permalink

    Apparently the authors of that paper didn’t even bother to google the subject. I found a reference to a NASA experiment on habituation to high CO2 levels in an article about the effects of varying CO2 levels on the sleep disturbances of submariners. Submariners are routinely exposed to levels of CO2 up to 20 times ambient (9 mm Hg compared to 0.4 mm Hg for 385 ppmv CO2) for weeks on end. The NASA volunteers were exposed to 10 and 20 times ambient for 24 days. While there was some transient acidosis, habituation in the NASA experiment occurred by about 7 days exposure.

  602. D. Patterson
    Posted Aug 5, 2008 at 1:18 AM | Permalink

    Low CO2 also produces a variety of medical problems.

  603. kim
    Posted Aug 5, 2008 at 7:25 AM | Permalink

    595 (Raven) Yes, DeW. P. has it about right in #599. For the groundbreaking physiology, necessitating a revision in the standard medical explanation of CO2 in blood, he references himself. Hair raises on the back of my neck thinking that either of these journals is peer-reviewed.
    =============================================================

  604. kim
    Posted Aug 5, 2008 at 7:32 AM | Permalink

    Better yet, Raven, he references himself in a journal that ‘takes a deliberately different approach to review’. The editor even has written down a justification for publishing untrue articles, that is, that they may stimulate debate.

    Fine, stimulate away; just don’t make my policy decisions depend upon your untrue, but stimulating, articles.
    ===============================================================

  605. cba
    Posted Aug 5, 2008 at 9:07 AM | Permalink

    DeWitt,

    I think the next constructive thing to try is to provide the calculations used for line width at the various altitudes along with the numbers for some simple line example. Also, due to the complexity and detail involved, I expect Steve M would prefer simply a link to where this is done.

My schedule becomes more civilized again after the weekend, but it gets even worse before that, so I doubt anything will happen until then.

    So far looking at line results in detail, I’ve not found a problem. By 50km pressure is around 1/1000 to 1/10000 of surface. HWHM line width declines and that pushes line peak values upward for contribution per molecule while the number of molecules/vol is dropping.

  606. DeWitt Payne
    Posted Aug 5, 2008 at 11:35 AM | Permalink

    cba,

You seem to be saying that as line width approaches zero, emission becomes infinite, or at least very large. That’s wrong. Consider the logical limit of line width: the Dirac delta function. It is everywhere zero except at one point, where its value is undefined. But it doesn’t have infinite area; its area is a finite constant, so emission would also be finite. The emission per molecule integrated over the whole line also has a constant and finite value. It does not increase rapidly as the line narrows. Looking at it another way, once the line is entirely inside your 1 nm resolution box, the shape of the line no longer matters, and the emission at a given concentration for that box is constant and independent of pressure.

  607. DeWitt Payne
    Posted Aug 5, 2008 at 11:59 AM | Permalink

    cba,

    According to Petty, the half width for the CO2 15 micrometer band is constant above about 40 km with a value of approximately 3E07 Hz. The 4.3 micrometer band has a constant half width of about 1E08 Hz above 40 km.

  608. DeWitt Payne
    Posted Aug 5, 2008 at 12:10 PM | Permalink

    On further thought, the line strength must decrease as the line narrows. Consider absorption. A line with zero width would absorb zero energy no matter what the value of the peak cross section. By the same logic the emission flux would also be zero. It’s the peak absorption cross section that approaches a constant value as the line narrows, not the line strength.

  609. SidVisocus
    Posted Aug 5, 2008 at 12:28 PM | Permalink

    The most interesting thing to me in this story is that I didn’t see a temperature monitoring station nearby.

    I still think Anthony Watts should give the area a looksie.

  610. DeWitt Payne
    Posted Aug 5, 2008 at 2:28 PM | Permalink

    cba,

    Ignore my #606, it’s wrong. Let’s look at an example in terms of effective line width, where effective line width is the width of a rectangular line with the same area as the real line. Assume that the line is opaque at the peak over the path length and concentration of interest. Say we have a 1 micrometer resolution and the effective line width is 10 nm to ensure that wings extending outside the resolution range will not be an issue. That would mean that I/Io would be 0.99 over the path length. Now let’s reduce the actual line width by an order of magnitude. That should mean that I/Io would be 0.999. My guess is that your method would have I/Io for the narrow line still equal to 0.99, i.e. the effective width would remain the same. That doesn’t happen in the real world. The peak cross section remains more or less constant so the area under the line, the line strength, decreases as the line width decreases. This seems to me to be a likely explanation of your excess absorption and emission at high altitudes.
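A quick numerical check of this point, in arbitrary units (a toy calculation of mine, not MODTRAN and not cba’s code): hold the line strength fixed, shrink the Lorentz half width, and the band-averaged transmission rises even though the line center stays opaque.

    import numpy as np

    def band_transmission(gamma, s=1000.0, half_band=500.0, n=200001):
        # Band-averaged I/Io for a Lorentzian line of strength s and
        # HWHM gamma, over a band of width 2*half_band (uniform grid).
        nu = np.linspace(-half_band, half_band, n)
        k = s * gamma / (np.pi * (nu ** 2 + gamma ** 2))   # absorption coeff.
        return np.exp(-k).mean()

    for g in (10.0, 1.0, 0.1):
        print(g, band_transmission(g))   # transmission rises as gamma shrinks

For a saturated line the equivalent width scales roughly as 2*sqrt(s*gamma), so a narrower line of the same strength absorbs (and emits) less over the band; it does not keep absorbing at the wide-line rate.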

  611. jae
    Posted Aug 5, 2008 at 2:34 PM | Permalink

    For Mosh:

  612. Sam Urbinto
    Posted Aug 5, 2008 at 2:59 PM | Permalink

    So what’s the chances that Steve is going pay $50 CA to go to the Rockstar Energy Drink Mayhem Festival on the 8th at Downsview Park?
    :)

  613. cba
    Posted Aug 5, 2008 at 5:23 PM | Permalink

    DeWitt,

The Lorentzian profile is used here, and there is a peak value given that is divided by the HWHM (gamma) value squared. It is a per-molecule intensity. The line falls in intensity with the distance from the line-center frequency. The approach is described in the HITRAN 1996 paper, J. Quant. Spectrosc. Radiat. Transfer, Vol. 60, No. 5, pp. 665–710, 1998.

I’ll try to cut and paste a few comment and code lines from my program describing the basics.

gamma(T,p) = (Tref/T)^nair * ( gair(T)*(p-ps) + gself*ps )   // T in K, p in Atm, ps partial

nair, gair, gself are line-by-line parameters from the database.
T is in kelvins, p is pressure in atmospheres, and ps is the partial pressure of the isotope of the molecule.
Tref is usually 296K.

    Tintensity = intensity *(QTref/QT)*(exp(-c2*E00/T)/exp(-c2*E00/Tref))*((1-exp(-c2*nu/T))/(1-exp(-c2*nu/Tref)));

Tintensity is the line intensity peak; intensity is the database value. QTref & QT are partition function values at particular values of T.
    c2 is a constant = hc/k = 1.4388 cm K
    E00 is the bottom energy state from the line database as I recall
nu is the line center frequency prior to any shift, and it is in /cm, not Hz.

    pigamma = invpi*gamma*Tintensity*tmpdensity;

an intermediate value called pigamma, which is 1/pi * gamma * Tintensity * tmpdensity
gamma is calculated above, invpi = 1/pi, Tintensity is calculated per molecule, and tmpdensity is the number of molecules per cm^3

    The wavenumbers for the two edges of the 1nm wide bin containing the peak are calculated.

    peak height is given as variable tau = Tintensity/gamma^2
for wide peaks, a simple average of the intensities at the two bin edges is used, except for the central bin, which takes a weighted average from the peak to each bin edge. For narrow peaks, a more time-consuming and accurate approach is used.

Narrow peaks take the integral of the Lorentz curve (area under the curve) from the peak to each side of the central bin, divided by pi. Subsequent bins away from the central peak get the area under the curve from outer edge to inner edge (newval - oldval) for that bin. Again these are divided by pi, as the integral ultimately runs from -pi/2 to +pi/2 times the peak, tau.

tau (the original times 1/pi) times the integral function, taking the difference between the near and far side of the bin, gives us the power in the bin.

    That line bin data is summed in the spectral bin for that wavelength. Note the calculations are converted to nu for calculation and then converted back to wavelength.

The result is a per-unit-length (per cm) absorption value for the material in the shell. Multiplying by the relevant length (the number of cm in the particular shell) then gives a dimensionless total, so that exp(-value) is the fraction of power transmitted through the shell and 1-exp(-value) is the fractional absorption in the shell. These are values for each nanometer of bandwidth in the spectrum.
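For comparison, the bin-integration step described above (area under the Lorentzian between bin edges via the arctan antiderivative) looks roughly like this as a self-contained sketch; the variable names are mine and the units are simplified:

    import numpy as np

    def lorentz_bin_fractions(nu0, gamma, edges):
        # Fraction of the line's total area in each bin, using the
        # antiderivative of the Lorentzian: (1/pi)*arctan((nu - nu0)/gamma).
        F = np.arctan((np.asarray(edges, dtype=float) - nu0) / gamma) / np.pi
        return np.diff(F)   # one value per bin

    # Example: a line at 667 /cm with HWHM 0.07 /cm spread over 1 /cm bins
    edges = np.arange(660.0, 676.0, 1.0)
    fractions = lorentz_bin_fractions(667.0, 0.07, edges)
    # multiply by line strength and number density, then by the shell path
    # length, to get the dimensionless optical depth contribution per bin

Note the fractions sum to 1 only over an infinite interval; truncating the wings leaves a remainder, which matters most when gamma is large.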

  614. cba
    Posted Aug 5, 2008 at 5:35 PM | Permalink

    DeWitt,

Sorry for an omission,
    the intensity at another frequency is
    tautmplw = pigamma /(gamma2 + (temp)*(temp));
where temp is the difference in frequency between the line center and some other point, such as a wavelength bin edge. This also includes any frequency shift of the line center. There is one of these for the upper line wing and one for the lower. Moving out along the wing, the intensity is determined for the near side and far side of each bin, and the intensities times the integrals are differenced to determine the power.

  615. DeWitt Payne
    Posted Aug 5, 2008 at 6:30 PM | Permalink

    cba,

    What are the units of Tintensity? It appears to be an intensity multiplied by the fraction of molecules in the ground state. Why use wavelength instead of wavenumber and add the complication of changing units back and forth? Where do you use pigamma? As you wrote the equation, tau has minimal dependence on concentration. Did you mean Tintensity/pigamma^2? In that case tau is inversely proportional to concentration which would mean that transmission decreases with increasing concentration. I’ll look for the paper you referenced.

  616. cba
    Posted Aug 5, 2008 at 8:51 PM | Permalink

    DeWitt,

    I’m more at home with wavelength than with wavenumber. Probably the same goes for Kiehl and Trenberth as they tend to use it too.

Tintensity is the same as intensity, which is /cm /(cm^2 molecule); it’s the spectral line intensity.

pigamma is just a variable combined from several others to cut down on multiplying out the equation multiple times.

    If you find the reference, you’ll find it is the numerator of the dimensionless optical depth tau as shown in eqn A16 of the reference (which also has parts from A15 and A14).

    The denominator has frequency dependent information – which creates the magnitudes for off line center frequencies. The denominator consists of gamma^2 and the Temp^2 value where temp is the frequency offset from the line. The denominator changes for every new offset frequency as we go out the wings through each bin. pigamma is a constant for a line at an altitude and T. For the central peak, temp^2 becomes 0 and the tau becomes pigamma divided by gamma^2.

    This stuff comes directly from the 1996 hitran documentation paper Appendix.

  617. jae
    Posted Aug 5, 2008 at 9:35 PM | Permalink

    What ever happened to the ban on “pet theories” here?

  618. DeWitt Payne
    Posted Aug 5, 2008 at 10:53 PM | Permalink

    cba,

    I think this may be the final answer. The problem is that according to the equations in Petty, Doppler broadening is about equal to pressure broadening for CO2 at 15 micrometers at a pressure of about 10 mbar or between 30 and 35 km altitude. Above this altitude the line width becomes approximately constant because the Lorentz width is less than the Doppler width, which is not a function of pressure, only temperature. Ozone at 9.6 micrometers wavelength starts to become Doppler broadening limited at about 16 mbar or between 25 and 30 km. Not surprisingly, this seems to be the altitude where you start diverging rapidly from MODTRAN because your line width continues to decrease.

The Doppler width constant (alpha sub D) = line center frequency (Hz) * sqrt(2*k*T/(m*c^2)), where k is the Boltzmann constant, T is in kelvins, m is the molecular mass and c is the speed of light. The line half width is alpha sub D * sqrt(ln2). That makes the half width of CO2 at 15 micrometers approximately equal to 0.02 nm, if I did my sums correctly, not to mention that the wings fall off much more rapidly for a Doppler broadened line than for a Lorentz line.
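Plugging numbers into that formula (my arithmetic, using the expressions from Petty as quoted above):

    import math

    K_B, C, AMU = 1.380649e-23, 2.99792458e8, 1.66053907e-27

    def doppler_hwhm_hz(wavelength_m, mass_amu, temp_k):
        # alpha_D = nu0 * sqrt(2kT/(m*c^2)); HWHM = alpha_D * sqrt(ln 2)
        nu0 = C / wavelength_m
        alpha_d = nu0 * math.sqrt(2 * K_B * temp_k / (mass_amu * AMU * C ** 2))
        return alpha_d * math.sqrt(math.log(2))

    hwhm = doppler_hwhm_hz(15e-6, 44.0, 250.0)    # CO2, 15 um band, ~250 K
    print(hwhm)                                    # roughly 1.7E07 Hz
    print((15e-6) ** 2 / C * hwhm * 1e9)           # roughly 0.013 nm at 15 um

which is the same order as the quoted ~3E07 Hz constant width above 40 km, and of order 0.01-0.02 nm in wavelength.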

  619. DeWitt Payne
    Posted Aug 5, 2008 at 10:57 PM | Permalink

    Line by line radiative transfer calculation using the HITRAN database is not a ‘pet’ theory. It is the gold standard.

  620. DeWitt Payne
    Posted Aug 6, 2008 at 12:09 AM | Permalink

    cba,

    Here’s a page with a couple of references to articles on how to calculate the Voigt line shape, which is what you’ll need to use in the region where Doppler and pressure broadening are about equal. It looks messy to say the least. Petty references Goody, R.M. and Y.L. Yung, Atmospheric Radiation: Theoretical Basis (2nd ed., paperback) Oxford University Press, New York, 544 pp., 1995 (ISBN 0-19-510291-6). Amazon has it new for a mere $165 US.

  621. DeWitt Payne
    Posted Aug 6, 2008 at 12:27 AM | Permalink

    cba,

    You could try this page as well. It has the code for calculating the Voigt function. You need a Postscript viewer like Ghostview to read the article.
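
    (For reference: the Voigt profile is the convolution of a Gaussian (Doppler) shape with a Lorentzian (pressure) shape, and is commonly evaluated via the complex Faddeeva function. A minimal sketch using SciPy’s wofz, with illustrative parameters only:)

    import numpy as np
    from scipy.special import wofz

    def voigt(x, sigma, gamma):
        # Voigt profile: Gaussian of standard deviation sigma convolved
        # with a Lorentzian of HWHM gamma, via the Faddeeva function w(z).
        z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
        return np.real(wofz(z)) / (sigma * np.sqrt(2.0 * np.pi))

    x = np.linspace(-0.5, 0.5, 5)   # frequency offsets, arbitrary units
    print(voigt(x, sigma=0.05, gamma=0.05))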

  622. Hoi Polloi
    Posted Aug 6, 2008 at 6:49 AM | Permalink

    The article below, by Ed Blick, Ph.D., has been presented to the APS. I doubt whether it will be accepted after the Monckton/Smith soap…

    http://icecap.us/images/uploads/Theunsoundsciencebehindglobalwarming.pdf


    Steve:
    this site is about examining mainstream articles and I don’t want this site to be used for promoting non-mainstream literature.

  623. Posted Aug 6, 2008 at 8:10 AM | Permalink

    Re #621
    I doubt whether it will be accepted because it’s a rambling load of nonsense!

  624. Boris
    Posted Aug 6, 2008 at 8:25 AM | Permalink

    621:

    I don’t think serious skeptics will support that paper.

  625. DeWitt Payne
    Posted Aug 6, 2008 at 8:34 AM | Permalink

    Re: #621,

    At the risk of being snipped: When I saw Beck mentioned favorably in the abstract, I stopped reading.

  626. Sam Urbinto
    Posted Aug 6, 2008 at 11:35 AM | Permalink

    While it’s certainly not mainstream (and could never be) I fail to see why it’s a “load of nonsense”.

    Certainly, this is wrong: “During the 20th century the Earth warmed ~0.7 C.” – it’s the anomaly trend that rose ~0.7 C between the late 19th century and the start of the 21st century.

    And it certainly isn’t “Beck cataloged 90,000 chemical measurements of CO2 in the 1800s, some as high as 470 ppm (greater than the current Mauna Loa value of 385 ppm).” It’s a summary of CO2 measurements in the air since 1812. http://www.biokurs.de/treibhaus/180CO2_supp.htm

    Yes, yes, attack the graph, bring up the names Meijer and Keeling versus Schulte, and toss in Mashey.

    All I know, is if they complain about something at Deltoid or Rabett Run, there’s probably some steak in it somewhere that needs to be doused in so much sauce you can’t see it any more.

  627. KevinUK
    Posted Aug 6, 2008 at 11:59 AM | Permalink

    #48

    “there are other lines of evidence and concern regarding CO2 besides proxies.”

    Steve, please provide us with examples of the ‘other lines of evidence’ that you are referring to in your addition to Chris W’s post, as I am intrigued to find out what other evidence exists that supports the claim (by Mann et al) that recent temperatures are ‘unprecedented in the last xxxx years’. Please substitute whatever number you feel appropriate for xxxx. Other than the ‘chucky charts’, just exactly what evidence is there that shows a statistically significant (non-spurious) correlation between past temperature and CO2 concentration in the atmosphere (over the last xxxx years)? If it can be shown, whether through proxies or via the volumes of documented historical evidence, that the Medieval Warm Period occurred – e.g. the farming of Greenland during the MWP etc – then the claim that the recent warming period at the end of the 20th century was ‘unprecedented’ does not stack up.

    KevinUK

  628. Not sure
    Posted Aug 6, 2008 at 12:25 PM | Permalink

    The Stoat and his meat puppet stalk that article on Wikipedia assiduously. The Meat Puppet wanted to “…(restrict) the list to scientists in relevant diciplines(sic), or at least with a relevant publication in the physical sciences…” Let me translate that for you “restrict the list to Team members.”

    Check out also this lovely, unbiased, “encyclopedic” article. Its history page is also a good read.

  629. Jason W. Solinsky
    Posted Aug 6, 2008 at 12:48 PM | Permalink

    #51, it would be well within appropriate Wikipedia guidelines (and indeed genuinely useful) to create a Climate Change Alarmism page as near mirror image of this page.

  630. Posted Aug 6, 2008 at 12:51 PM | Permalink

    Re #50

    #48

    “there are other lines of evidence and concern regarding CO2 besides proxies.”

    Steve, please provide us with examples of the ‘other lines of evidence’ that you are referring to in your addition to Chris W’s post, as I am intrigued to find out what other evidence exists that supports the claim (by Mann et al) that recent temperatures are ‘unprecedented in the last xxxx years’.

    Can’t you read what Steve wrote? How is your question related to his statement?

  631. Mark T.
    Posted Aug 6, 2008 at 12:54 PM | Permalink

    Can’t you read what Steve wrote? How is your question related to his statement?

    It is clearly related: “and concern.” This last bit is exactly what KevinUK was referring to, i.e., that there should be concern about that offending stuff since IT is what is creating “unprecedented” temperatures (the ultimate “concern” as it were).

    Mark

  632. KevinUK
    Posted Aug 6, 2008 at 2:14 PM | Permalink

    #53 Phil,

    Normally I don’t feed trolls but in your case I’ll make an exception.

    In #48 CW posted “The message of this piece is very clear: when you replace bad science with good science then the Medieval Warm Period comes roaring back. This happens in countless studies, for example when a small selection of questionable tree ring proxies are replaced with a wide range of other proxies.”

    To which Steve added “I don’t argue that one squiggle is “right” and one squiggle is “wrong”. I’ve not argued that the proxies prove the existence of an MWP. I’m merely arguing that these studies do not prove what they claim. Also, as always, remember that there are other lines of evidence and concern regarding CO2 besides proxies.”

    So Steve is saying that he doesn’t claim that any given proxy is ‘right’ or ‘wrong’, and that in his opinion the proxies therefore cannot serve as evidence for the existence of an MWP, or for the claim that the MWP was warmer than the recent warming period. Now that comes from a man who has done more analysis of the contribution of each of the proxies to the overall millennial temperature reconstruction than any other man on the planet, IMO. Steve has shown on countless occasions on this blog that the conclusions reached in the seminal temperature proxy reconstruction reports are not robust to substitution of the proxy data they have used by other equally valid available proxy data series. Michael Mann knew this, which is why he tucked away evidence for lack of robustness in his ‘hockey stick’ in a folder called CENSORED on his FTP server.

    Steve has said that, despite this lack of robustness in proxy temperature reconstructions, “there are other lines of evidence and concern regarding CO2 besides proxies”. I have therefore now asked Steve, legitimately in my opinion (and in Mark T’s), if he could list these ‘other lines of evidence’.

    IMO it was the existence of the ‘flat shaft’ of the hockey stick, i.e. the removal of the previously widely acknowledged MWP and LIA, that aroused Steve and Ross’s suspicions when the IPCC TAR was published and that ultimately led to the creation of this blog; hence why I have also mentioned the MWP in my post and its significance to the claim of ‘unprecedented in the last xxxx years’ (and hence concerning) temperatures in the latter part of the 20th century.

    OK I’m done with explaining to the troll why I’ve asked Steve a legitimate question now.

    KevinUK


    Steve:
    The other arguments are, of course, based on the physics of the greenhouse effect and the impact of water vapor feedbacks. Please don’t ask me for a comprehensive exposition of these arguments – I’d like to see one. However, serious people take these arguments seriously and I give them enough credit that I don’t dismiss their concerns out of hand, not that I endorse arguments that I haven’t examined closely myself.

  633. DeWitt Payne
    Posted Aug 6, 2008 at 2:46 PM | Permalink

    cba,

    I have a copy of the paper. In paragraph A.2.4 is the following sentence:

    In the lower atmosphere, pressure broadening of spectral lines dominates and if a Lorentz profile is assumed…

    Emphasis added.

    I think this is one of those situations where the reader is expected to know that Doppler broadening of spectral lines dominates everywhere else. Eyeballing the graph in Petty, it looks like at 60 km the Lorentz line width is about 2 orders of magnitude smaller than the Doppler width for CO2 at the same altitude.

  634. Ian
    Posted Aug 6, 2008 at 3:13 PM | Permalink

    Steve @ 43 correcting my mis-characterisation of his inclination to trust tranzi’s with an agenda.

    I’ve wasted a lot of people’s time and distracted the discussion. Phil claims to have answered the CO2 question, which of course he hasn’t, having relied on the Team defence. I’m surprised I even bother to read his posts as he’s trying to club the baby ice as we speak. My only recourse is of course to visit the tip jar before all the Starbucks close and Steve & MrPete are unable to update the poxy tree-rings.

  635. Pat Keating
    Posted Aug 6, 2008 at 3:31 PM | Permalink

    56 Steve

    The other arguments are, of course, based on the physics of the greenhouse effect and the impact of water vapor feedbacks.

    The first part of this sentence is OK (as long as we recognize that that physics only gives about 1.2C sensitivity), but the second half of the sentence is, I believe, invalid.
    Theory is inadequate to tell us anything about even the sign of water-vapor feedback. Empirically, one study in the polar regions suggests that it is positive, another study in the tropics suggests that it is negative. Assuming (for the argument) that these are correct, the negative-feedback result is more important since the IR radiative effects are stronger in the tropics.

  636. Sam Urbinto
    Posted Aug 6, 2008 at 4:00 PM | Permalink

    Pat: I think it’s clear water is a negative feedback, not really from its actions in the atmosphere as a greenhouse gas per se, but from its actions in the hydrosphere with phase changes and how water, ice, snow and clouds act (combined with sunlight and wind and the Earth’s tilt et al) to drive the weather/climate. It may not be empirical, but it certainly is observable.

    Otherwise, if not a negative feedback, how does the climate stay pretty much the same except for the cycles of ice ages? Now all that needs to be done is figure out what exactly triggers the glacials, and how.

    That’s perhaps a bit trickier.

  637. KevinUK
    Posted Aug 6, 2008 at 4:13 PM | Permalink

    #56 Steve

    In your reply to my question you have now posted

    “The other arguments are, of course, based on the physics of the greenhouse effect and the impact of water vapor feedbacks.” By ‘other arguments … based on the physics…’ do you mean the computer models? If so then their predictions (or projections, to use Kevin Trenberth’s parlance) can NEVER be considered as evidence. The GCMs are ‘tuned’ to produce greater-than-1 deg. C predictions for doubled CO2. They rely heavily on the assumption that water vapour will (note my use of the future tense) be a strong positive feedback, when in fact this is far from certain and in the final analysis comes down to the opinions of two people, namely Brian Soden and Isaac Held. Since there is no evidence within the paleoclimate record of any runaway warming as a result of positive water vapour feedback, the balance of the evidence favours the opinion that water vapour has been a net negative feedback when global temperature has risen in the geological past.

    KevinUK

    Steve: I don’t entirely agree with your characterization. Also I’m not arguing the point on behalf of Soden and Held, nor do I think that their arguments are above criticism. I’m simply observing that serious people hold the view, however irrational it may seem to you. And that there is a substantial literature supporting the ideas. Obviously, I don’t think that that is the same as proof, or else the HS would be validated as well. But I’m not in a position to form my own viewpoint without a great deal of analysis and there’s only so much that I can do. I’m not interested in debating snippets here and there. The questions are big ones and each point needs to be documented and analysed. This is one reason why I support the idea of a truly “independent” assessment by parties with no dog in the race – i.e. a large scale engineering quality analysis. So please don’t harangue me about this.

  638. Sam Urbinto
    Posted Aug 6, 2008 at 4:16 PM | Permalink

    One would think observational evidence trumps tweaked models.

  639. TerryB
    Posted Aug 6, 2008 at 4:36 PM | Permalink

    I think Roy Spencer’s paper last spring showed negative water feedback, in the tropics at least. (but I might be mistaken).

    Lubos Motl also supports (and shows how to get to) Pat Keating’s “1.2C” before feedbacks. He’s also suggested that if there were positive feedbacks, there’d be an unstable climate. Seems a plausible viewpoint to me, but I’m no scientist.

    I know Steve McIntyre often makes the point that there’s a lot of really clever folk who see CO2 as a real problem. That’s certainly true and they may well be right of course. But it’s great that there’s a lot of other clever folk like you lot that are pushing them hard for the evidence, despite many apparent obstacles, lack of co-operation, and censorship.

  640. Pat Keating
    Posted Aug 6, 2008 at 5:28 PM | Permalink

    59 Sam
    The AGW supporters just as confidently say the feedback is positive.

    My position is that we don’t know. The slight warming from the 1.2C sensitivity probably puts more water vapor into the atmosphere, but does the resulting increase in GHG effect outweigh the resulting increase in cloud formation?

    The work quoted by TerryB above suggests that the feedback is negative, but that needs to be confirmed.

  641. cba
    Posted Aug 6, 2008 at 6:16 PM | Permalink

    Actually, the IF should have had the emphasis added. It’s quite disappointing that their documentation fails to deal with anything but the most limited of conditions in their explanations. This is supposed to be useful not just for Earth’s atmosphere but also for other situations, as high in T as 3000 K and significantly cooler as well. One might expect a lot of elbow grease to be needed to adapt HITRAN to Mars or a stellar atmosphere or perhaps a cold molecular cloud, but to find its basic documentation not even covering most of Earth’s lower atmosphere into the stratosphere seems a bit poor. It also suggests the learning curve and work necessary to adapt it to different, more exotic conditions could be substantially more than one might expect.

  642. D. Patterson
    Posted Aug 6, 2008 at 7:20 PM | Permalink

    61 Sam Urbinto says:
    August 6th, 2008 at 4:16 pm
    One would think observational evidence trumps tweaked models.

    More firsthand evidence that it does not do so in a political venue.

  643. Not sure
    Posted Aug 6, 2008 at 9:19 PM | Permalink

    #52

    IIRC a page on climate alarmism did exist, but was deleted. It’s hard to find deleted pages so I don’t recall the grounds, but I think it was determined to not be “encyclopedic”.

  644. jeez
    Posted Aug 6, 2008 at 9:52 PM | Permalink

    We are overdue for an old favorite.

  645. D. Patterson
    Posted Aug 6, 2008 at 10:26 PM | Permalink

    65 Not sure says:

    August 6th, 2008 at 9:19 pm
    #52

    IIRC a page on climate alarmism did exist, but was deleted. It’s hard to find deleted pages so I don’t recall the grounds, but I think it was determined to not be “encyclopedic”.

    The struggle to have such a page is ongoing. See:

    Talk:Climate change denial
    From Wikipedia, the free encyclopedia

    http://en.wikipedia.org/wiki/Talk:Climate_change_denial

    User talk:Jaimaster
    From Wikipedia, the free encyclopedia

    http://en.wikipedia.org/wiki/User_talk:Jaimaster

  646. Posted Aug 6, 2008 at 10:36 PM | Permalink

    http://www.ecostudies.org/press/Schlesinger_Science_13_June_2008.pdf

  647. DeWitt Payne
    Posted Aug 6, 2008 at 10:40 PM | Permalink

    cba,

    My guess would be that very few people who do this sort of thing write their own code from scratch. There are a number of commercial and public codes out there like GENLN2. I’m betting that the error wouldn’t be all that large if you didn’t try to do the Voigt profile in the intermediate range but just switched from Lorentz to Doppler when the Lorentz width became smaller than the Doppler width.
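
    (A sketch of that switching heuristic, assuming one simply uses the pure shape whose width is larger at a given pressure; the widths below are illustrative.)

    def line_shape_choice(lorentz_hwhm, doppler_hwhm):
        # Crude approximation: pure Lorentz where pressure broadening
        # dominates, pure Doppler where the Doppler width is larger.
        return "lorentz" if lorentz_hwhm >= doppler_hwhm else "doppler"

    doppler_hwhm = 0.014                        # nm, ~constant with altitude
    for p_mbar in (1000.0, 100.0, 10.0, 1.0):
        lorentz_hwhm = 0.1 * (p_mbar / 1000.0)  # illustrative 0.1 nm at 1000 mbar
        print(p_mbar, "mbar ->", line_shape_choice(lorentz_hwhm, doppler_hwhm))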

  648. oakgeo
    Posted Aug 6, 2008 at 10:52 PM | Permalink

    Thanks to jeez!

    Love the graph. Now I finally understand everything I wanted to know about global warming – sorry, I mean climate change (wouldn’t want to lock anyone into a specific projection!) – but was afraid to ask. But was it rigorously peer reviewed?

    I’ve been a lurker here for a while and appreciate the insights from SM and the CA readership’s stable of minds. I’m a geologist and for years have rolled my eyes at the “sky is falling” rhetoric permeating the airwaves, but I can’t help but ask… does any of this good work have a chance of reaching the general population? And would most people appreciate its importance? Probably not, but then we can probably rely on apathy to hold the Algore Army at bay until reason prevails. Politicians, however…

    Anyway, sorry for the tangent. Thanks to Steve McIntyre for his diligence and stick-to-it attitude.

  649. Posted Aug 7, 2008 at 12:12 AM | Permalink

    Re #632
    Actually I thought you were the one trolling!

    “there are other lines of evidence and concern regarding CO2 besides proxies.”

    Steve, please provide us with examples of the ‘other lines of evidence’ that you are referring to in your addition to Chris W’s post, as I am intrigued to find out what other evidence exists that supports the claim (by Mann et al) that recent temperatures are ‘unprecedented in the last xxxx years’.

    So I’ll ask you again why the non sequitur?

  650. D. Patterson
    Posted Aug 7, 2008 at 12:29 AM | Permalink

    645 Leif Svalgaard says:
    August 6th, 2008 at 10:36 pm

    C. K. Keller, B. D. Wood. Possibility of chemical weathering before the advent of vascular land plants. Nature 364, 223 – 225 (15 July 1993); doi:10.1038/364223a0

    http://www.nature.com/nature/journal/v364/n6434/abs/364223a0.html

    THE TERRESTRIAL PLANT AND HERBIVORE ARMS RACE – A MAJOR CONTROL OF PHANEROZOIC ATMOSPHERIC CO2?
    OLSEN, Paul E., Lamont-Doherty Geological Observatory of Columbia University, Palisades, NY 10964.

    Much recent work points to chemical weathering of continental silicates as the principal control of atmospheric CO2. Presently, chemical weathering is mediated by plants. Vascular plants increase chemical weathering by drastically increasing acid leaching through respiration, decay, and microbial symbionts. Through the Phanerozoic the continuing evolution of terrestrial plant communities must have had a major effect on weathering rates. However, the efficacy of plant-induced-weathering is decreased by herbivory, which in turn decreases the invasion of soil by roots and leads to increased physical weathering. There is a dramatic correlation between the appearance of major plant trophic, reproductive, and defensive innovations, increased terrestrial organic carbon burial, and the onset of “ice house” conditions (presumably caused by decreased CO2). Conversely, there is also a correlation between the appearance of major herbivore inroads on terrestrial plants and global “hot house” conditions presumably resulting from higher CO2. I propose that the major “ice house”–“hot house” cycles of the Devonian-Quaternary were caused by the lag between plant innovations and complete compensation by herbivore-detritivore response. In this way, it seems possible that: 1) the Carboniferous coals are a consequence of limited herbivory and soil litter decomposition and the Permo-Carboniferous glaciations were caused by dramatically increased chemical weathering caused by the previous global spread of vascular plants; 2) the Mesozoic “hot house” was brought on by massive increases in megaherbivores and litter decomposers; and 3) Cenozoic cooling and Quaternary glaciations resulted from the spread of herbaceous angiosperms and most recently grasslands. Our own superherbivory, if continued for tens of millions of years, will bring us back to mid-Mesozoic “hot house” conditions, not by the burning of fossil fuels, but rather by a global increase in physical over chemical weathering.
    Geological Society of America, Abstracts with Programs, v. 25, no. 3, p 71

    http://www.ldeo.columbia.edu/~polsen/nbcp/gsa.nc.1993.abs.html

  651. cba
    Posted Aug 7, 2008 at 5:37 AM | Permalink

    DeWitt,

    I was just thinking a bit this morning. My resolution is 1 nm. While the narrow halfwidths (in this neighborhood) are treated as area under the curves, supposedly much wider ones are taken as averages at the bin edges (left and right), starting at the middle of the line for the bin containing the line. Somehow, it’s making no sense at the moment that there should be real problems with either, especially considering the wide lines are predominant lower down, where the model seems to match well. Further up, I don’t see this causing much problem either, because it’s area under the curve, and the narrow line is converted to a bin average and is not resolved beyond 1 nm anyway. The area under the curve should remain constant even though the peak height and shape may vary. Adjacent bins might receive slightly different amounts for particular lines, but averaged over a million lines even that should become negligible, and finally, integrated (summed) over the bandwidth of interest, I would expect such effects to be invisible. Certainly, if things are operating properly, there shouldn’t be orders-of-magnitude differences.
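
    (That intuition can be tested directly: integrate a unit-area Lorentz line into 1 nm bins analytically – the cumulative Lorentz integral is arctan(x/gamma)/pi – and check that the summed bin contents stay near 1 whether the line is wide or very narrow. A minimal sketch:)

    import numpy as np

    def binned_lorentz_area(gamma, bin_edges):
        # Area of a unit-area Lorentz line in each bin, done analytically
        # so a narrow line is not lost between sample points.
        cdf = np.arctan(bin_edges / gamma) / np.pi
        return np.diff(cdf)

    edges = np.arange(-50.0, 51.0, 1.0)  # 1 nm bins centered on the line
    for gamma in (0.1, 1.0e-4):          # wide vs very narrow half-width, nm
        print(gamma, binned_lorentz_area(gamma, edges).sum())  # both ~1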

  652. Hoi Polloi
    Posted Aug 7, 2008 at 6:42 AM | Permalink

    “Climate change: Prepare for global temperature rise of 4C, warns top scientist”

    The UK should take active steps to prepare for dangerous climate change of perhaps 4C according to one of the government’s chief scientific advisers.

    In policy areas such as flood protection, agriculture and coastal erosion Professor Bob Watson said the country should plan for the effects of a 4C global average rise on pre-industrial levels. The EU is committed to limiting emissions globally so that temperatures do not rise more than 2C.

    “There is no doubt that we should aim to limit changes in the global mean surface temperature to 2C above pre-industrial,” Watson, the chief scientific adviser to the Department for the Environment, Food and Rural Affairs, told the Guardian. “But given this is an ambitious target, and we don’t know in detail how to limit greenhouse gas emissions to realise a 2 degree target, we should be prepared to adapt to 4C.”

    Globally, a 4C temperature rise would have a catastrophic impact.

    According to the government’s 2006 Stern review on the economics of climate change, between 7 million and 300 million more people would be affected by coastal flooding each year, there would be a 30-50% reduction in water availability in Southern Africa and the Mediterranean, agricultural yields would decline 15 to 35% in Africa and 20 to 50% of animal and plant species would face extinction.

    In the UK, the most significant impact would be rising sea levels and inland flooding. Climate modellers also predict there would be an increase in heavy rainfall events in winter and drier summers.

    http://www.guardian.co.uk/environment/2008/aug/06/climatechange.scienceofclimatechange

  653. Jaye Bass
    Posted Aug 7, 2008 at 11:01 AM | Permalink

    Is Gavin out of his mind? He just posted the following in a response to one of my posts:

    [Response: If independent groups with independent methodologies with models that range from the 1-layer energy balance, to line-by-line radiative transfer to full blown GCMs all agree on something, then yes, that counts as scientific replication. There is zero chance that this result is not a consequence of the physics being built into the models, therefore all the IV&V documents in the world aren't going to make the blindest bit of difference except for those people who want to delay recognition of the situation. - gavin]

    What am I missing? N models where validation doesn’t matter all agree so whatever they agree on must be real? Does anybody here believe this?

    Steve: I don’t think that radiation code is a particular issue. However, this is not an excuse for abysmal documentation of what people are doing. That should be done anyway. I also think that it’s important not to link policy issues to this sort of documentation. I categorically and frequently say that policy makers have to make decisions based on available information, and if the only available information has crappy documentation then they have to proceed on that basis. But when they make the decision with crappy documentation, they should put the people in question on notice, tell them that they don’t like making decisions on poorly documented stuff and instruct them to remedy their crappy documentation as a priority. Gavin shouldn’t be permitted to use the “big picture” as an excuse for poor hygiene in his own operations.

  654. DeWitt Payne
    Posted Aug 7, 2008 at 11:15 AM | Permalink

    cba,

    The area under the curve should remain constant even though the peak height and shape may vary.

    Yes, it should. But your results do not appear to be consistent with a constant area. If the area under the line before multiplying by the concentration and path length were constant, you would get exponential decrease in emission with altitude rather than the large peak you see. The increase in emission caused by increased temperature in the stratosphere would be dominated by the lower concentration except for the strongest lines at the very center of the CO2 and ozone bands. And the ozone emission would be much less intense than the CO2 emission. There must be some flaw in the Lorentz line shape calculation at very small line widths. The evidence that the bug affects ozone more strongly than CO2 appears to support this conclusion. Ozone at 9.6 micrometers has a wider Doppler width than CO2 at 15 micrometers so the difference between Doppler and Lorentz width is larger for ozone at high altitude.

  655. DeWitt Payne
    Posted Aug 7, 2008 at 11:28 AM | Permalink

    cba,

    The other thing about a Doppler line shape is that the wings fall off much more rapidly leading to far less overlap. I don’t know if this is significant or not. Of course, you could work at picometer resolution (yeah, right) and not have to worry about binning.

  656. Jaye Bass
    Posted Aug 7, 2008 at 11:31 AM | Permalink

    IMO, it’s not just a documentation issue. Implicit in his comments was the assertion that if untested/unvalidated models all agree, then they must be correct, provided they are developed independently and model, to varying degrees of fidelity, the same underlying physics. To me, this is a preposterous claim.

  657. jae
    Posted Aug 7, 2008 at 12:02 PM | Permalink

    jaye, 651:

    What am I missing? N models where validation doesn’t matter all agree so whatever they agree on must be real? Does anybody here believe this?

    FWIW, I don’t believe it.

  658. Mark T.
    Posted Aug 7, 2008 at 12:22 PM | Permalink

    Re #632
    Actually I thought you were the one trolling!

    You’re running in circles, Phil. “The concern” regarding CO2 IS unprecedented temperature rise. No trolling necessary. Steve states in a follow-up that he believes an exposition would indeed reveal the physical basis of the concern, yet here we are, still waiting for such an exposition.

    If independent groups with independent methodologies with models that range from the 1-layer energy balance, to line-by-line radiative transfer to full blown GCMs all agree on something, then yes, that counts as scientific replication.

    What Gavin doesn’t realize, willingly or not, is that “scientific replication” means nothing when they all suffer the same failure. Yes, they are all replicating the same nonsense.

    Mark

    Steve:
    Except that the CO2 radiation code is a very poor issue to worry about. Worry about clouds instead.

  659. cba
    Posted Aug 7, 2008 at 12:35 PM | Permalink

    DeWitt,

    What line width HWHM arena did you state was the point where Doppler starts to predominate? I thought it was somewhat below 1 nm. If that’s the case, we’re talking very small effects on average. I should have some time starting this weekend and plan on pulling the data for a CO2 line at 3 altitudes, including intermediate calculations and some relevant bins for it. I started to do this the other day only to run totally out of available time.

  660. DeWitt Payne
    Posted Aug 7, 2008 at 12:49 PM | Permalink

    cba,

    Area under the line is not the only controlling variable. Peak intensity is important too. It has to do with equivalent width. In the strong line limit, where the peak of the line is saturated, absorption is proportional to the square root of the concentration. I haven’t tried to do the calculation yet, but I strongly suspect that, considering the Lorentz line width goes down exponentially with altitude, the peak intensity must increase exponentially with altitude, so the concentration dependence is always square root. Doppler width and intensity, OTOH, are independent of altitude, so eventually even the strongest line is no longer saturated and absorption becomes linear with concentration. In both cases the product of the peak intensity and the line width remains constant, I think, but the concentration dependence is very different. An exponentially decreasing concentration can then easily generate orders of magnitude difference in absorption for Lorentz vs. Doppler line shapes.
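
    (The square-root behavior in the strong-line limit is easy to demonstrate numerically: compute the equivalent width W = integral of (1 - exp(-tau)) for a saturated Lorentz line and watch how W scales as the mass path u is doubled. A sketch in arbitrary units:)

    import numpy as np

    def equivalent_width(u, gamma, S=1.0, x_max=500.0, dx=0.002):
        # Equivalent width of a Lorentz line of integrated strength S,
        # half-width gamma, for dimensionless mass path u.
        x = np.arange(-x_max, x_max, dx)
        tau = u * S * (gamma / np.pi) / (x**2 + gamma**2)
        return np.sum(1.0 - np.exp(-tau)) * dx

    for u in (100.0, 200.0, 400.0):  # strong-line (saturated) regime
        print(u, round(equivalent_width(u, gamma=0.1), 2))  # grows ~sqrt(u)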

  661. DeWitt Payne
    Posted Aug 7, 2008 at 1:08 PM | Permalink

    cba,

    What line width HWHM arena did you state was the point where Doppler starts to predominate?

    It’s way less than 1 nm, but see above. At 300 K, the Doppler HWHM for CO2 is 0.014 nm.

  662. DeWitt Payne
    Posted Aug 7, 2008 at 1:44 PM | Permalink

    RSS has posted their July, 2008 data. Global anomaly 0.147 up from 0.035 in June. The smoothed Arctic TLT anomaly continues to go down to 0.73 from 0.78 last month.

  663. BarryW
    Posted Aug 7, 2008 at 2:19 PM | Permalink

    Re 651

    Ballistic trajectory equations can be very accurate in their physics. However, I can run a hundred models of a model rocket’s flight and tell you it will reach 500 ft, but if I fire it off in my living room all the model outputs will be crap, because they are missing a real-world component (i.e., the ceiling). The modelers will say “Of course, that’s stupid!” What they’re missing is that if one (or more) of their assumptions is wrong, then the model doesn’t match the real world no matter how complex the equations and how accurate the physics. And if they’re all starting out with the same assumptions….

  664. DeWitt Payne
    Posted Aug 7, 2008 at 3:20 PM | Permalink

    cba,

    Doing a little more algebra: The Lorentz width, a, is directly proportional to the pressure, a = a’*p. The peak intensity S is inversely proportional to the Lorentz width, S = So/a. The mass path for a well mixed gas, u, is proportional to pressure, u = u’*p. So the dimensionless mass path U = (S*u)/(2*pi*a) converts to So*u’/(2*pi*a’^2*p), i.e. U is inversely proportional to p only. For any moderately strong line U will always be much greater than 10 and the concentration dependence of the absorption (emission) will always be proportional to the square root of u. You can check this by calculating the dimensionless mass path for a strong ozone or CO2 line at different altitudes using a Lorentz line shape.

    With a Doppler line, neither S nor a are proportional to pressure so U is now directly proportional to p rather than inversely proportional. Should make a big difference.
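
    (That algebra checks out numerically. A minimal sketch, writing the primed per-unit-pressure constants as _p, and – as an assumption for the comparison – reusing a_p as the fixed Doppler width:)

    import math

    a_p, So, u_p = 1.0, 1.0, 1.0   # arbitrary per-unit-pressure constants

    for p in (1.0, 0.1, 0.01):     # pressure, arbitrary units
        a = a_p * p                # Lorentz width scales with pressure
        S = So / a                 # peak intensity ~ 1/width
        u = u_p * p                # mass path of a well-mixed gas ~ pressure
        U_lorentz = S * u / (2.0 * math.pi * a)             # ~ 1/p
        U_doppler = (So / a_p) * u / (2.0 * math.pi * a_p)  # ~ p
        print(p, U_lorentz, U_doppler)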

  665. Mark T.
    Posted Aug 7, 2008 at 3:40 PM | Permalink

    Steve: Except that the CO2 radiation code is very poor issue to worry about. Worry about clouds instead,

    I agree, I just think that Gavin is making extreme assumptions regarding the concept of replication (“zero chance,” was that scientifically calculated?). Clouds are what I was getting at, btw, though in retrospect I think Gavin was speaking directly to the radiative transfer models (which I’m still not sold on), not GCMs as a whole.

    Mark

  666. Raven
    Posted Aug 7, 2008 at 3:40 PM | Permalink

    DeWitt Payne says:
    “Extreme AGW stands and falls on the GCM’s. Paleoclimate has not played a major role in validating or invalidating those. The paleoclimate proxies don’t have enough resolution, precision or geographic coverage for that.”

    I disagree. The paleoclimate is essential to the alarmists’ arguments, because the GCMs are junk if anyone can show that they don’t take into account phenomena that have caused significant climate changes in the past. A warm MWP that cannot be explained by the GCMs would demonstrate that such phenomena exist, and the modellers could not then credibly claim that they have correctly represented all things that can affect the climate.

  667. Steve McIntyre
    Posted Aug 7, 2008 at 4:11 PM | Permalink

    Extreme AGW stands and falls on the GCM’s.

    As I’ve said many times, I don’t entirely agree with that. Let’s picture a situation where we had no computers ergo no GCMs, but were increasing CO2 levels. Could there be a basis for honest concern about increased CO2 levels? Or would you have to say – we haven’t invented computers so we can’t think about the problem. I think that someone could express concern based on the physics. In a way, I’m suggesting that advocates forego their computers for a few minutes and try to write down the physics. The problem with the HS – as I’ve also said on many occasions – is that advocates have used this sort of thing as a crutch to avoid trying to explain the physics to the public. I think that they under-estimate people. But how often do we see advocates come here and simply fail to make any argument that rises above arm-waving?

  668. Jaye Bass
    Posted Aug 7, 2008 at 4:17 PM | Permalink

    I think Gavin was speaking directly to the radiative transfer models (which I’m still not sold on), not GCMs as a whole.

    The context of my post, to which Gavin responded, was reliance on agreement in N models in the abstract. I think the guy believes that if you have 10 models that agree then they must be correct if they are independent and are based on the same physics, incomplete or not.

    Steve: I guess we see the same sort of thinking with the 1000-year reconstructions. The Gavins of the world presume that these are “independent”, when they stand or fall collectively on one or two data sets that can easily be seen to be problematic.

  669. Jaye Bass
    Posted Aug 7, 2008 at 4:18 PM | Permalink

    RE: 661…exactly.

  670. Mark T.
    Posted Aug 7, 2008 at 4:24 PM | Permalink

    The context of my post, to which Gavin responded, was reliance on agreement in N models in the abstract. I think the guy believes that if you have 10 models that agree then they must be correct if they are independent and are based on the same physics, incomplete or not.

    So GCMs as a whole, not just the part that has a possible physical basis? Then yes, not just an extreme assumption, but a nonsense position. They all suffer from the same failure, as I originally surmised.

    Mark

  671. Richard Patton
    Posted Aug 7, 2008 at 4:32 PM | Permalink

    For me, the hockey stick makes all the difference in the world. If I were convinced that this was true, then along with what looks like very solid evidence for +1.2C with 2X CO2 I would be mostly convinced of the strong (alarmist) AGW hypothesis.

    As soon as there is a reasonable probability for a long and warm MWP and a long and cold LIA with no clear explanation of how these might have occurred it casts significant doubt on the whole large climate sensitivity issue. In fact, for me, it even casts doubt on how solid the +1.2C with 2X cO2 is in the actual real world. This is because suddenly the earth’s climate looks a lot more like a very complex / chaotic open system with highly nuanced feedbacks that perhaps change based on the particular state of the current climate rather than a more linear / mechanical system (in its average behavior of course – obviously it IS a very complex open system in the sense of K.L. von Bertalanffy, “General System Theory”).

  672. DeWitt Payne
    Posted Aug 7, 2008 at 4:46 PM | Permalink

    Steve Mcintyre,

    Arrhenius did it a century ago with pen and paper.

    I’ve done a lot of work trying to understand the nuts and bolts of radiative transfer. I have little doubt that someone, not me, could come up with a very detailed but accessible explanation of exactly how that works for clear sky conditions with varying surface temperature and humidity. Clouds add a major complication because scattering is a lot messier than absorption, but it should still be possible. But that only gets you about 1 C for doubling CO2.
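
    (For reference, the ~1 C no-feedback figure can be recovered on the back of an envelope: differentiate the Stefan-Boltzmann law at the effective emission temperature and divide the canonical 3.7 W/m^2 forcing by the result.)

    sigma = 5.670374e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    T_e = 255.0           # effective emission temperature, K
    dF = 3.7              # canonical forcing for doubled CO2, W m^-2

    dT = dF / (4.0 * sigma * T_e**3)   # dF/dT = 4*sigma*T^3
    print(f"no-feedback sensitivity ~ {dT:.2f} C")  # ~1 C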

    Extreme AGW, which I will define as a climate sensitivity of greater than 3 C to a radiative forcing equivalent to doubling CO2 from the pre-industrial level, requires additional feedbacks. I remain unconvinced that you can adequately represent feedbacks with anything less than a full 3D coupled climate model. Of course, I also remain unconvinced that climate models are now, or possibly may ever be, adequate for the task. The problem with turbulent mixing caused by surface heating, which occurs in the real world on scales much smaller than the model grid size, is fixed with a classic kludge (and it is kludge, not kluge), for example.

    Clouds. Deep convection. The list goes on. But with anything less than a coupled 3D model you must always assume things that are not in evidence like constant relative humidity and also constant albedo, which seems on the surface to be contradictory, but what do I know? I’m reasonably certain that higher specific humidity increases forcing, but I’m not at all certain that higher temperature necessarily leads to higher humidity averaged over the planet.

    Of course, those same assumptions may well be buried in the parameterizations in the current models rather than following naturally from basic physics. Who knows? I don’t and I doubt the modelers do either in spite of their protests to the contrary.

  673. DeWitt Payne
    Posted Aug 7, 2008 at 4:52 PM | Permalink

    Mark T.,

    “Unprecedented” temperatures is also useless noise to me. One can just as easily argue that a large natural variation makes AGW more rather than less dangerous unless you can prove that temperature would be going down otherwise.

  674. Robert S
    Posted Aug 7, 2008 at 4:58 PM | Permalink

    RE #95, I think from a physics viewpoint, computer simulations are of course vital anyway, because of the inherent complexity.

    There are a huge number of parameters, and the magnitude of effects are really quite small (in terms of delta kelvin).

    But yeah, great care must therefore be taken in making sure the inputted parameters, the assumptions and all the climatically important processes are accounted for accurately and equate well with reality. And that would require a lot of work and impartiality.

  675. Raven
    Posted Aug 7, 2008 at 5:06 PM | Permalink

    Steve McIntyre says:
    “Let’s picture a situation where we had no computers ergo no GCMs, but were increasing CO2 levels. Could there be a basis for honest concern about increased CO2 levels?”

    I agree that anti-CO2 policies can be justified even without the GCMs. The trouble is the GCMs are being used to create a sense of urgency that will likely lead to some extremely bad policy decisions.

  676. Pat Frank
    Posted Aug 7, 2008 at 5:44 PM | Permalink

    #92 — Very nice summary, Ross. And this …

    While I understand Steve’s pro forma deference to accepting advice from the institutional consensus for the purpose of policy formation, at a certain point the institutions become discredited as regards their advisory function by their failure to maintain the procedures that were set for them, as well as failing on a prima facie basis to produce quality work.

    … captured my thoughts about the IPCC and its stalwarts exactly. One might add that by all evidence, they “[failed] on a prima facie basis to produce quality work” in a studied manner.

  677. Pat Keating
    Posted Aug 7, 2008 at 6:08 PM | Permalink

    92 Ross
    Thank you, that is very helpful to a physicist, not a statistician.

    99 DeWitt
    I agree with much of what you say (though I think it’s about 1.2C, but that’s close enough to your 1.0).

    I do disagree with you on the feedback issue, though. I think that it is likely that there is some increase in water-vapor mixing-ratio with temperature, but that the forcing would only be increased if the humidity is very low (poles or ice age), so that no extra clouds are formed. In warmer climes, I suspect negative feedback.

    There’s a lot more going on than radiation physics.

  678. Richard Patton
    Posted Aug 7, 2008 at 6:23 PM | Permalink

    DeWitt Payne

    =============
    Extreme AGW, which I will define as a climate sensitivity of greater than 3 C to a radiative forcing equivalent to doubling CO2 from the pre-industrial level, requires additional feedbacks. I remain unconvinced that you can adequately represent feedbacks with anything less than a full 3D coupled climate model.
    =============

    This depends on whether the aggregate or emergent behavior of the climate over time follows some regular laws / patterns.

    I think the only way we could possibly know this is from observation. We certainly cannot know this from first principles. I think most people would agree that you cannot explain biochemistry from sub-atomic physics. Each level of physical phenomena tends to have its own set of “laws” that have to be discovered from observation – not from detailing out how the “lower-level” of physical relationships lead to those laws. This is because most behavior at the higher level is emergent behavior – it comes about from the non-intuitive systemic interactions.

    Gavin seems convinced that we can know the general range of climate sensitivity based on paleo reconstructions of the climate. He seems to think that the climate shift around the LGM is an especially good case for this.

    To have any confidence that this is true would require first being able to confirm the following:

    Paleo reconstructions of climate:
    1. Are “accurate enough” – enough detail resolution compared to level of effect being explained.
    2. Are “comprehensive enough” – not missing any key variables.
    3. We can take them all and using known physical relationships come up with a set of “laws” that explain the evolution over time and not have any outliers that cannot be explained adequately.

    This is the only thing I can think of that would get at Steve’s issue of an exposition from physical principles related specifically to the sensitivity issue.

    Personally, I don’t think climate is like this. It looks to me much more like a high-order open system (more organic than mechanical). One that is quite good at taking perturbations and coming back to a semi-equilibrium state. Clearly the concern is that there seem to be a set of equilibrium states, some of which we would really like to avoid. A mile of ice on top of Minnesota is definitely one that comes to my mind.

    So, the key question is how much perturbation can the system take before going into some less-than-desired new equilibrium state? It seems to me that gaining solid knowledge of how far the current state has been perturbed is crucial information. If the hockey stick were absolutely true then we would have more cause for concern than if there were large fluctuations that did not cause us to jump to some other state.

    I agree that our best path forward is through full 3D coupled climate models. However, it seems pretty clear that we need to learn a whole lot more about the key systemic interactions (the things you mentioned) before they will tell us very much. I would guess that in another 20-40 years we may start getting there. Sooner if this process becomes less polarized and more like actual science.

  679. Mark T
    Posted Aug 7, 2008 at 7:28 PM | Permalink

    “Unprecedented” temperatures is also useless noise to me. One can just as easily argue that a large natural variation makes AGW more rather than less dangerous unless you can prove that temperature would be going down otherwise.

    That’s a strawman overall, however, since neither I nor Raven was making qualitative statements regarding whether or not anything is or isn’t unprecedented, just that the existence of the proxy studies is used to make assumptions about it. Without the proxy studies, it is one less piece of noise for alarmists to use.

    Quite frankly, I totally agree that claims of precedence are useless noise. Making such claims reveals the claimant’s advocacy, not validity.

    Mark

  680. Mark T
    Posted Aug 7, 2008 at 7:35 PM | Permalink

    For me, the hockey stick makes all the difference in the world. If I were convinced that this was true, then along with what looks like very solid evidence for +1.2C with 2X CO2 I would be mostly convinced of the strong (alarmist) AGW hypothesis.

    That pretty much sums up how I think the general population sees the issue, though the number that gets reported is often much higher, depending upon which journalist or which shoddy report is used.

    As soon as there is a reasonable probability for a long and warm MWP and a long and cold LIA with no clear explanation of how these might have occurred it casts significant doubt on the whole large climate sensitivity issue. In fact, for me, it even casts doubt on how solid the +1.2C with 2X cO2 is in the actual real world. This is because suddenly the earth’s climate looks a lot more like a very complex / chaotic open system with highly nuanced feedbacks that perhaps change based on the particular state of the current climate rather than a more linear / mechanical system (in its average behavior of course – obviously it IS a very complex open system in the sense of K.L. von Bertalanffy, “General System Theory”).

    This, too, is what I think the population sees. The problem I see with this, which leads to my lack of concern towards precedence, is that nobody has ever explained to my satisfaction that doubling CO2 and even 2-3 degrees of a temperature rise is a net negative in terms of cost/benefit to the planet. The negatives are often arm-waving conjecture, and rarely do we hear anyone mention any of the positives, many of which can easily be proven without arm-waving.

    Mark

  681. DeWitt Payne
    Posted Aug 7, 2008 at 7:52 PM | Permalink

    Richard Patton,

    When I’m feeling really pessimistic about climate models I say things like this. However, I do think they’re a useful learning tool. IMO, however, it’s highly questionable to report the results of model runs or multi-model ensembles of runs based on extremely rough estimates of future emissions as robust estimates of future global climate. I know there are caveats about this in the guts of the various IPCC AR’s. But from the summary for policy makers to the press releases to the prophets of doom who shall remain nameless, the impression that these are firm predictions only gets stronger.

    Again IMO, the models haven’t demonstrated any more skill than linear extrapolation of current trends. Use of the models for estimating future regional climate effects is completely useless as they don’t even get current regional behavior correctly. See tropical atmospheric temperature profiles, e.g. Statements about the frequency of droughts, hurricanes and other extreme weather events being altered by AGW are no more than speculation at present.

  682. Gunnar
    Posted Aug 7, 2008 at 8:22 PM | Permalink

    >> has only minor significance in the AGW arena as a whole.

    Hogwash. The only case that has been made is:

    premise: Gosh, it’s hotter than ever
    premise: AGW is the only cause
    conclusion: let’s violate human rights

    Therefore, this revelation completely invalidates the above logic. Al Gore made his best case, and the hockey stick was fundamental to it.

    You folks may talk amongst yourselves about GCMs, but that has NEVER been part of the case that’s presented to the public. There’s a reason for that. It’s an extremely weak argument which doesn’t pass the laugh test.

    The elephant in the room is that no case has ever been made on scientific grounds for exactly how CO2 can increase the world’s temperature, in violation of numerous scientific laws. If the phenomenon were real, one should be able to perform an experiment that shows it. One should be able to explain why experiments that have been tried have failed to show any effect.

    >> In fact, for me, it even casts doubt on how solid the +1.2C with 2X cO2 is in the actual real world.

    This is flabbergasting. Arrhenius proposed this over 100 years ago. It was debunked shortly after proposal, and has NEVER been confirmed by any experimental evidence. And you are just starting to doubt it at this point?

    >> I agree that anti-CO2 policies can be justified even without the GCMs.

    Carbon is the foundation of life. CO2 is food for plants. Every bit of CO2 we emit helps the plant kingdom. CO2 is no more a pollutant than oxygen or nitrogen. There is absolutely no rational basis for any anti-CO2 policy.

  683. Richard Patton
    Posted Aug 7, 2008 at 8:44 PM | Permalink

    >> In fact, for me, it even casts doubt on how solid the +1.2C with 2X cO2 is in the actual real world.

    This is flabbergasting. Arrhenius proposed this over 100 years ago. It was debunked shortly after proposal, and has NEVER been confirmed by any experimental evidence. And you are just starting to doubt it at this point?

    The “greenhouse” effect makes sense to me and I get that CO2 absorbs radiation in certain bands that get saturated.
    Supposedly a doubling results in 3.7 W/m^2, which is supposed to translate into +0.8–1.2 C – or something like that.
    What part of this are you saying there is little or no good evidence for?
    Are you saying that this may be accurate theoretically but there is no evidence for this operating this way in the actual atmosphere?
    I would love to see whatever credible evidence you have along these lines.

  684. Steve McIntyre
    Posted Aug 7, 2008 at 10:01 PM | Permalink

    Please don’t try to debate the entire theory in 1-paragraph snippets. It’s a waste of time.

  685. Geoff
    Posted Aug 7, 2008 at 11:12 PM | Permalink

    Could be time to update Bender’s suggested poll on what’s causing the indicated temperature increases.

    My take:

    Solar – 40% (+/- 20)
    Mis-measurement and land use – 45% (+/- 20)
    CO2 – 15% (+/- 15)

    I’m not sure where to put the percentage for “stuff happens”. I was amused to notice a recent article in Nature (co-authored by a TAR Co-ordinating Lead Author) which included the sentence “The Antarctic ice sheet is thought to be still responding to changes since the last glacial maximum”. Since the last glacial maximum (or Würm glaciation) was 16–20,000 years ago, I guess the cause of the changes referred to would not be SUVs or coal-fired power plants.

  686. Allen
    Posted Aug 8, 2008 at 6:03 AM | Permalink

    20 Leif –

    It seems that most just takes solar forcing as a given, but there is no or little evidence for any significant solar forcing.

    Thanks for responding.

    I’m a newcomer, so my perceptions are “high level” as I do not know all the details. With that understanding of my significant limitations:

    I used FFT to show myself that the sunspot data and temperature data had similar frequency signatures. I did not do any statistical significance calculation. Nonetheless, it did not seem this could be random chance.
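
    (A minimal sketch of that kind of frequency-signature comparison; the two series below are toy stand-ins, not the actual sunspot or HadCRUT3 data, and no statistical significance is implied.)

    import numpy as np

    def amplitude_spectrum(series):
        # One-sided FFT amplitude spectrum of an annual series, mean
        # removed so the zero-frequency bin does not dominate.
        x = np.asarray(series, dtype=float)
        x = x - x.mean()
        return np.fft.rfftfreq(len(x), d=1.0), np.abs(np.fft.rfft(x))

    years = np.arange(1750, 2008)
    sunspots = 50 + 40 * np.sin(2 * np.pi * years / 11.0)  # toy stand-in
    temps = 0.01 * np.sin(2 * np.pi * years / 11.0)        # toy stand-in

    f1, a1 = amplitude_spectrum(sunspots)
    f2, a2 = amplitude_spectrum(temps)
    print("peaks (cycles/yr):", f1[a1.argmax()], f2[a2.argmax()])  # both ~1/11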

    I made a “solar model” for Global Temperature. It was a simple heat balance for the earth — e.g. T^4, heat capacity, etc. The adjustable parameters were limited to those with physical basis (I have an extensive background in modeling physical systems, so I know some of the pitfalls of “Curve Fitting”). The model assumed that “somehow” (I don’t know the mechanisms) high sunspot activity was associated with heating and low activity with cooling. I used all the continuous sunspot numbers from the mid 1700s to now. I used HadCRUT3 global temperature. I let my custom coded “genetic algorithm” adjust the Earth heat capacity and other physically based constants to provide a best fit of HadCRUT3 to Sunspots.

    Using this very simple “solar forcing” model, the magnitude, amplitude, and frequency of major changes in the HadCRUT3 data were fit quite closely, both in the long term and in shorter cycles (e.g. 11-year cycles) – but not in the very short term (individual years), because I had no factors in my simple model to do that. The long-term rise in global temperature was fitted. The recent leveling of global temperature was reproduced. All in all, I could account for all but 0.2 C of the temperature rise over the extent of the HadCRUT3 data with my physically based “curve fit” of sunspot data.

    I noted that where the global temperature did not follow my model fit, it was low. In those places, major volcanic eruptions had taken place. I read that eruptions can suppress temperature for a few years.

    I made other simple heat balance calculations that indicate a very long (over a century) potential lag between major changes in solar energy (or other unknown solar factors that may affect temperature) and global temperature (due to the heat capacity of the oceans).

    This was all “physics 101”. I am no expert. Still, it was easy to develop a possible correspondence between the magnitude of long and shorter term global temperature changes and sunspot activity. Moreover, our heating energy comes from the sun. So, I do not rule out solar activity as being responsible for much of the global temperature changes – even though more learned Scientists have not yet found a convincing relationship.

    I think maybe there is something we have yet to figure out.

  687. Gunnar
    Posted Aug 8, 2008 at 6:40 AM | Permalink

    >> This was all “physics 101”. I am no expert. Still, it was easy to develop a possible correspondence between the magnitude of long and shorter term global temperature changes and sunspot activity. Moreover, our heating energy comes from the sun.

    Allen, great post, well stated. AGW proponents insult our intelligence by claiming that physics 801 fundamentally contradicts physics 101.

  688. bender
    Posted Aug 8, 2008 at 6:58 AM | Permalink

    I remain unconvinced that you can adequately represent feedbacks with anything less than a full 3D coupled climate model.

    I happen to agree with DWP. However, our opinions are irrelevant. Rather than cite personal opinion, which is prone to bias, I think it would be preferable to point to documents already known to IPCC. That is why I cite Wunsch (on deep circulation), Christy (on negative cloud feedbacks), and Koutsoyiannis (on long-term persistence). Their opinions are infinitely more informed than mine or DWP’s.

  689. cba
    Posted Aug 8, 2008 at 7:25 AM | Permalink

    Well, we basically have 1 real data point = here and now (now being a climatological time frame). We have a calculable point – a simplistic sphere with energy flow balance – and can infer in a variety of ways that climate cannot be sensitive or unstable by comparing the two and perhaps creating points in between. We can know how much absorption and emission occur in the IR and something of how H2O vapor forms clouds, as well as how reflective they tend to be. Whoops – we’re getting into serious complexity here and we haven’t even scratched the surface, so to speak.

    Perhaps analyzing the claims and the consequences of those claims, and then falsifying them, might be plausible. If CO2 doublings with similar effects on the power balance cause a 5 K rise per doubling – and there are 5 of those, plus 6 more that are fairly close in power absorption – what happened to all the other contributors to warming? That’s really scientific rather than engineering.

    Another possibility is just to develop our own multi-million-line GCM code, so it’s a contest of us versus them as to who has the right (more accurate) model.

    Clouds and water are the key element of the puzzle. It would seem, though, that they are both a principal input and a potential output of the analysis. Considering that cloud cover isn’t known beyond 20 years back, and that it has been known to vary by 10% over twenty years, it would seem to be a tool to falsify any CO2 sensitivity that was derived by those assuming it constant. That’s another falsification, not an engineering solution though.

    I think we can falsify the radical assumptions and claims and any physics based upon those assumptions now but I don’t think we can create a solution with what is known at present.

  690. Basil
    Posted Aug 8, 2008 at 7:49 AM | Permalink

    Well, it complicates matters to bring this up, but the magnitude of the difference between NXT and SXT temperature changes is not nearly as great when looking specifically at the continental US. UAH has a time series specifically for the continental USA (“USA48″). I did this last night, before the July data was available, so it is only through June:

    There’s a positive OLS trend through this, but it is not nearly as dramatic as the N-S difference plot Steve showed.

    Someone earlier attributed the NXT/SXT rate of change differential to Arctic warming. That explanation works only for trends since ~1992:

    Yesterday I posted some Hurst exponent calculations on latitudinal variations in temperature trends over at WUWT. I’ll update them here later today. There’s clearly greater warming taking place over portions of Northern extratropical land surfaces, warming that is not taking place uniformly elsewhere. So why do they call it global warming?
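
    Basil’s actual method isn’t shown, but for readers who want to reproduce this kind of calculation, a crude rescaled-range (R/S) estimate of the Hurst exponent can be sketched as follows; the window sizes and the white-noise test series are placeholders:

        import numpy as np

        def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128)):
            """Crude rescaled-range (R/S) estimate of the Hurst exponent of x."""
            x = np.asarray(x, dtype=float)
            log_n, log_rs = [], []
            for n in window_sizes:
                rs_vals = []
                for start in range(0, len(x) - n + 1, n):
                    w = x[start:start + n]
                    dev = np.cumsum(w - w.mean())   # cumulative deviations
                    r = dev.max() - dev.min()       # range of the deviations
                    s = w.std(ddof=1)               # sample standard deviation
                    if s > 0:
                        rs_vals.append(r / s)
                log_n.append(np.log(n))
                log_rs.append(np.log(np.mean(rs_vals)))
            # Slope of log(R/S) against log(n) estimates H
            return np.polyfit(log_n, log_rs, 1)[0]

        rng = np.random.default_rng(0)
        print(hurst_rs(rng.normal(size=1024)))   # ~0.5 for white noise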

  691. DeWitt Payne
    Posted Aug 8, 2008 at 10:47 AM | Permalink

    bender,

    Point taken. Cite more, opine less, pontificate not at all.

  692. Sam Urbinto
    Posted Aug 8, 2008 at 12:04 PM | Permalink

    Richard #683:

    Are you saying that this may be accurate theoretically but there is no evidence for this operating this way in the actual atmosphere?

    We know water vapor, carbon dioxide, methane, nitrous oxide and CFCs absorb thermal radiation from the ground. If you remove carbon dioxide in Model E, 9% of the greenhouse effect goes away. If you accept the model as evidence, that would be about 3 C of the 33 C the GE is estimated to provide (if you accept the estimate). So in theory, doubling it would give another 3 C.

    There is no proof of this, but if you accept that the anomaly trend reflects a rise in energy levels, the +.7 C is close to the difference between .000275 of the atmosphere (2.1 C) and .000385 (3 C), so it rather matches. All conjecture. It could very well be that other things caused a rise in energy levels, causing more outgassing from the oceans (the opposite relationship). Or it could be, as is my current opinion, that the anomaly is simply a by-product of measurement and processing, or if not, well within a fluctuation range, being less than +/- 1 C over the entire time period.

    I pretty much agree with Geoff though.

    Solar – 40% (+/- 20)
    Mis-measurement and land use – 45% (+/- 20)
    CO2 – 15% (+/- 15)

    Would that we could derive any of this scientifically. Oh, well.

    I also agree with Steve; I would base my policy decisions on the currently accepted institutional advice — after taking into account the political matters involved, both from the advisors and from the repercussions of any actions — or in other words, cost/benefit and risk/reward types of factors. I would also, like Steve, demand more accountability.

    But then, I also said this:

    Since 1850 carbon dioxide levels have gone from about .000275 parts to .000385 parts of the atmosphere by volume, or a yawn-inducing .000007 parts per decade, with no directly established causal connection between it and the equally yawn-inducing proxy, the global mean temperature anomaly, supposedly showing a rise in the planetary energy budget, which shows a rise of about .05 per decade in the linear trend. This is of course against the significant backdrop of a 600% rise in the planet’s population, and the extensive urbanization and industrialization of the developed world and its deep impact upon the weather, and therefore, over time, the climate.

  693. D. Patterson
    Posted Aug 8, 2008 at 9:15 PM | Permalink

    Raven says:
    August 7th, 2008 at 5:06 pm
    [....]
    I agree that anti-CO2 policies can be justified even without the GCMs.

    How can “anti-CO2 policies…be justified even without the GCMs,” by blind faith? The planet experienced an episode of ice age glaciation with 7000ppm of CO2, so how can you rationally assume that a failure to increase CO2 emissions won’t cause the planet to plunge into another minor or major ice age glaciation? How are we supposed to know that the biosphere, with the assistance of anthropogenic life, isn’t naturally adapting the planetary environment to mitigate a return to another ice age glaciation? How are we supposed to know that a doubling of CO2 won’t coincide with a major decrease in planetary temperature, as it did before?

  694. cba
    Posted Aug 9, 2008 at 3:29 PM | Permalink

    DeWitt,

    I’ve been going through stuff much of the day here. I used what I think is a relatively obscure co2 line at the surface, 25km and 50km. It’s at 25624nm (+ a fraction). At the surface, it is quite broad and has only declined by a factor of a hundred or so by the limit of my wing calculations. At 25km and 50km it is totally down by the time I reach my wing limits (+/-200nm). Pressure and T vary according to the Std Atm. Intensity adjusts to T and the Q values.

    I also compared the value on the chart for the CO2 peak on modtran and, adjusting for the fact that 22nm is a unit frequency width in /cm, I came out fairly close to the same value on the peak for co2 as on the modtran scale. I believe that was looking up at 25km and using 330ppm for co2.

    Looking at incoming radiation, LW+SW, it appears that 3.3 W/m^2 more reaches the surface going from 330ppm to 768ppm.

  695. Raven
    Posted Aug 9, 2008 at 4:12 PM | Permalink

    D. Patterson says:

    How can “anti-CO2 policies…be justified even without the GCMs,” by blind faith?

    The precautionary principle, provided the cost of such policies is manageable. The environment we live in today is a lot cleaner and healthier because regulations were imposed even though the science demonstrating harm was dubious at best. You can call it an insurance policy, provided one never forgets that the cost of the insurance policy matters.

  696. DeWitt Payne
    Posted Aug 9, 2008 at 6:04 PM | Permalink

    cba,

    You need to look at a strong line, not a weak line. If it’s really weak, then narrowing may never make it strong enough to be saturated, i.e. zero transmittance at the peak. That’s what causes the square-root dependence on concentration. It’s the strong lines for ozone at 9960 nm and CO2 at 15000 nm that are the problem. A line with a tau of 10 will exhibit an effective width significantly broader than the HWHM: because transmittance can’t go below zero, you get a flat bottom and an effectively broader line.
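
    A quick numerical illustration of that flat-bottom effect (the Lorentz shape and the peak optical depths below are arbitrary, chosen only to show the behaviour):

        import numpy as np

        def lorentz_transmission(tau_peak, gamma=1.0, half_width=20.0, npts=4001):
            """Transmittance across a Lorentz line with the given peak optical depth."""
            x = np.linspace(-half_width, half_width, npts)   # offset in units of gamma
            tau = tau_peak * gamma**2 / (x**2 + gamma**2)    # Lorentz optical depth
            return x, np.exp(-tau)

        for tau_peak in (0.1, 1.0, 10.0, 100.0):
            x, t = lorentz_transmission(tau_peak)
            # equivalent width: area of the absorption feature, in units of gamma
            eq_width = np.trapz(1.0 - t, x)
            print(f"peak tau {tau_peak:6.1f}  min transmittance {t.min():.3f}  "
                  f"equivalent width {eq_width:.2f}")

    Once the peak saturates, further growth in absorber amount only widens the flat bottom, which is why the equivalent width then grows roughly as the square root of the concentration.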

  697. Posted Aug 9, 2008 at 6:33 PM | Permalink

    692 (Sam):

    I pretty much agree with Geoff though.
    Solar – 40% (+/- 20)
    Mis-measurement and land use – 45% (+/- 20)
    CO2 – 15% (+/- 15)
    Would that we could derive any of this scientifically. Oh, well.

    How do you justify the 40% Solar? It seems to me that the ‘logic’ behind that is: 15% CO2, land-use/errors 45%, the rest must be solar. That is not good science.

  698. DeWitt Payne
    Posted Aug 9, 2008 at 7:44 PM | Permalink

    cba,

    In addition, the Doppler width is inversely proportional to wavelength, so a long-wavelength line won’t be Doppler-limited until higher altitude: a pressure of 6 millibar, or about 30 km, for a 25 micrometer line, for example, compared to 10 mbar for CO2 at 15 micrometers and 16 mbar for the ozone 9.6 micrometer band. Look at a line from the CO2 4.3 micrometer band if you really want to see something. There, Doppler and Lorentz widths are equal at 35 mbar.
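
    These numbers are easy to sanity-check. The sketch below uses the textbook Doppler half-width formula and an assumed air-broadening coefficient of 0.07 cm-1/atm at 220 K (round numbers, not HITRAN parameters for any particular line), so it reproduces the magnitudes above only approximately:

        import math

        K_B = 1.380649e-23      # Boltzmann constant, J/K
        AMU = 1.66053907e-27    # atomic mass unit, kg
        C = 2.99792458e8        # speed of light, m/s
        GAMMA_AIR = 0.07        # assumed Lorentz HWHM, cm^-1 per atm

        def doppler_hwhm(nu0_cm, mass_amu, temp_k):
            """Doppler HWHM in cm^-1 for a line centred at nu0 (cm^-1)."""
            return nu0_cm * math.sqrt(2 * math.log(2) * K_B * temp_k /
                                      (mass_amu * AMU * C**2))

        for label, wavelength_um, mass in (("CO2 15 um", 15.0, 44),
                                           ("O3 9.6 um", 9.6, 48),
                                           ("CO2 4.3 um", 4.3, 44),
                                           ("25 um line", 25.0, 44)):
            nu0 = 1.0e4 / wavelength_um              # line centre, cm^-1
            dop = doppler_hwhm(nu0, mass, 220.0)     # ~stratospheric temperature
            p_equal = dop / GAMMA_AIR                # pressure (atm) where widths match
            print(f"{label}: Doppler HWHM {dop:.2e} cm-1, "
                  f"Lorentz = Doppler near {p_equal * 1013:.0f} mbar")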

  699. Gerald Machnee
    Posted Aug 9, 2008 at 8:26 PM | Permalink

    RE #692 – **If you remove carbon dioxide in Model E, 9% of the greenhouse effect goes away. If you accept the model as evidence, that would be about 3 C of the 33 C the GE is estimated to provide (if you accept the estimate). So in theory, doubling it would give another 3 C.**
    Even that theory is not good. The effect of increasing CO2 is not linear, so doubling will not give a 3 C increase.
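
    To make the non-linearity concrete: the commonly used simplified expression for CO2 forcing is logarithmic, delta-F = 5.35 * ln(C/C0) W/m^2 (Myhre et al. 1998). The sensitivity parameter below is an assumed illustrative value, not a derived one:

        import math

        def co2_forcing(c_ppm, c0_ppm=275.0):
            """Simplified logarithmic CO2 radiative forcing, W/m^2 (Myhre et al. form)."""
            return 5.35 * math.log(c_ppm / c0_ppm)

        LAMBDA = 0.8   # assumed climate sensitivity parameter, K per W/m^2

        for c in (385, 550, 1100):
            f = co2_forcing(c)
            print(f"{c} ppm: forcing {f:.2f} W/m^2, equilibrium dT ~ {LAMBDA * f:.2f} K")

    Each doubling adds the same forcing increment, which is exactly why the response cannot be linear in concentration.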

  700. D. Patterson
    Posted Aug 9, 2008 at 8:48 PM | Permalink

    Gerald Machnee says:

    August 9th, 2008 at 8:26 pm
    [....]
    Even that theory is not good. The effect of increasing CO2 is not linear, so doubling will not give a 3 C increase.

    The past global mean temperatures of about 22-23C, coincident with a variety of CO2 concentrations ranging between 1400 and 7000ppm, are also contrary to a 3C increase.

  701. cba
    Posted Aug 9, 2008 at 9:28 PM | Permalink

    DeWitt,

    Sorry but the line was 22624nm rather than the previous number.

    I’m not sure what you mean by 0 transmittance there. Such things have to be per pathlength. Below are some values for the 22624 nm wavelength, given per cm of pathlength. Strong lines have peaks around 10^-2 to 10^-3, which means that they have strong attenuation for paths of only a few cm or meters. Still, it’s not a matter of no transmittance for that length.

    bin lambda (nm)   tau/cm, surface   tau/cm, 25 km   tau/cm, 50 km
    22325-22425       2.87e-14          2.08e-18        8.73e-19
    22425-22525       1.18e-13          8.45e-18        3.54e-18
    22521-22621       7.01e-11          1.21e-14        5.08e-15
    22522-22622       8.56e-11          3.31e-14        1.39e-14
    22523-22623       9.81e-11          4.18e-13        1.98e-13
    22524-22624       1.01e-10          2.23e-12        4.82e-11
    22525-22625       8.83e-11          5.20e-14        2.19e-14
    22526-22626       7.41e-11          1.56e-14        6.54e-15
    22527-22627       5.92e-11          7.49e-15        3.14e-15
    22528-22628       4.62e-11          4.40e-15        1.85e-15
    22529-22629       3.58e-11          2.90e-15        1.21e-15
    22625-22725       1.14e-13          8.00e-18        3.35e-18
    22723-22825       2.95e-14          2.07e-18        8.67e-19

  702. Geoff Sherrington
    Posted Aug 9, 2008 at 9:29 PM | Permalink

    Temperature in Melbourne Australia today.

    This is a social comment, not a scientific one.

    As I write at noon, the max temp for the day, 10th Aug 2008, is 9 degrees C. Maybe it will reach ten. Since the 1860s, when records started at Melbourne central, there have been only 40 or so August days when the max temp was below 10 degrees C.

    The years when any August day was below 9 degrees are –

    (Year, Month, Day, Max temp deg C)
    1872 8 9 6.7
    1970 8 16 6.8
    1886 8 19 7.3
    1874 8 2 7.7
    1932 8 14 7.7
    1951 8 9 7.7
    1899 8 7 7.8
    1871 8 18 8.1
    1909 8 3 8.1
    1862 8 11 8.3
    1856 8 8 8.4
    1951 8 13 8.4
    1921 8 3 8.5
    1968 8 20 8.8
    1863 8 3 8.9
    1897 8 4 8.9

    On this day, we notice the omnipresent, unrelenting, radiative transfer heating beloved by the AGW folk.

  703. DeWitt Payne
    Posted Aug 9, 2008 at 9:53 PM | Permalink

    cba,

    Of course the transmission is never identically zero, but after 0.001, who cares. Plot the transmission of an individual line on a scale 0 to 1 for plus or minus 20 line widths for peak dimensionless mass paths ranging from 1 to 1000 if you still don’t see what I mean.

  704. Geoff
    Posted Aug 9, 2008 at 10:34 PM | Permalink

    697 Leif and 692 Sam,

    Sorry guys, I was just having a bit of fun on a Friday afternoon. I used “solar” in just the way Leif says not to, as “a ‘dumping ground’ for what we don’t otherwise understand”.

    So by “solar” I meant TSI plus Milankovitch plus GCR plus spin orbit coupling plus alpha.

    I’ll try to use “natural” or “non-“A” in future (or skip Friday postings).

  705. DeWitt Payne
    Posted Aug 10, 2008 at 11:11 AM | Permalink

    cba,

    Probably a stupid question, and no offense intended, but when you integrate a line that’s entirely inside a bin, are you integrating the line intensity or the absorption? In other words, do you multiply the intensity by the mass path to get the optical depth tau as a function of wavelength or wavenumber and then integrate 1-exp(-tau) over the line, or do you integrate over the line first and then calculate the absorption?

  706. cba
    Posted Aug 10, 2008 at 11:14 AM | Permalink

    DeWitt,

    I think I found some of the problem you’ve been having with my model. It’s pretty much the simplicity of the model and its coarseness, and probably not something to do with the line calculations. Each layer is assumed to be of uniform temperature, and all emitted energy leaves the layer just as all absorbed energy is absorbed in that layer. It then depends upon the next layer to start absorbing the emissions. For example, the model emits 18 W/m^2 at 40km over a 2.5km path. With the T of the next upward shell 7 deg warmer in this case, only 5 W/m^2 is transmitted through that shell; the rest is attenuated out. The reason any gets through at all is the shorter optical path due to lower density.

    The recourse is to create some simple approximation mechanism to subtract out self-reabsorption within a layer, to break it into far finer detail, or to ignore the problem totally. It’s not as much of a problem with absorption as it is with emission, along with the T gradient. If there were no T gradient in a shell, then there would be no net absorption/emission factor within the shell.

  707. DeWitt Payne
    Posted Aug 10, 2008 at 11:53 AM | Permalink

    cba,

    That’s not it at all. I’m not the one with a problem. If you were doing the calculation correctly, there would be no significant re-absorption because the atmosphere would be optically thin (tau less than 0.1) at all wavelengths at that altitude. The total emission would be on the order of 0.4 W/m2 from a 2.5 km thick layer centered at 40 km, not 18 W/m2.

    There are no saturated lines at that altitude and mass path. There are only a few lines where emission is even measurable. There is no absorption that amounts to anything. The fact that your program does not behave this way is overwhelming evidence that there is a fundamental flaw in your calculations. I think it has to do with line width, but there may be other contributing factors like how you calculate absorption/emission for lines that have width much less than your bin width.

  708. DeWitt Payne
    Posted Aug 10, 2008 at 6:31 PM | Permalink

    cba,

    I calculated the Doppler broadened absorption for the largest CO2 line at 667.65 cm-1. The peak absorption at the center of the line was 0.006 and the absorption integrated over the whole line was 0.1. The broad band absorption measured in a 1 nm bin would be even less than that because the line width is much smaller than the bin width. At an optical depth this small, absorption and emission are directly proportional to concentration. I’m beginning to think the Doppler vs. Lorentz thing is a red herring and the true problem involves not doing all the calculations in the same units. For purposes of calculating line broadening, everything has to be in frequency. That way, the intensity at any point at any reasonable line width will always be much less than the line strength in the table, as it should be. A line width of 1 Hz, which would have the peak intensity equal to the table intensity, implies a line width in wavelength for the 15 micrometer CO2 line of 7.5E-10 nanometers (from dlambda = lambda^2 * dnu / c).

  709. DeWitt Payne
    Posted Aug 10, 2008 at 6:32 PM | Permalink

    The calculation above is done at 40 km with a path length of 2.5 km.

  710. DeWitt Payne
    Posted Aug 10, 2008 at 11:53 PM | Permalink

    cba,

    I calculated the integrated flux density in W/m2 cm-1 for the most intense CO2 line at 667.65 cm-1. Then I summed the cross sections for all 19 intense CO2 lines in the 667 to 669 cm-1 range, divided by the cross section for the most intense line, and multiplied the flux density for the 667.65 cm-1 line by that ratio. The result was 0.019 W/m2 cm-1. Then I went to MODTRAN and, using the 1976 atmosphere and 384 ppm CO2, calculated the intensity at 668 cm-1 for 38.75 and 41.25 km looking up. Subtracting the 41.25 km value from the 38.75 km value and multiplying the difference by pi*10,000 (to convert from cm-2 to m-2 and from per steradian to total), the result was 0.013. That’s good enough for me. Well, not quite: I think I may have overestimated the number density of CO2 at 40 km. Still checking.

    Are those numbers in your #701 absorption cross sections times molecules/cm3? I found only one CO2 line at 442 cm-1 or 22624nm. However, it has a zero pressure cross section of 3.7e-27. The number density of CO2 at sea level is about 1E14, I think. That means the intensity cannot ever be greater than 3.7 e-13/cm. Broadening will lower that a lot. Then there is the problem that the intensity only drops by a factor of 2 going from 25 to 50 km. That may be where the Doppler vs. Lorentz thing causes a problem. In the real world, the intensity would drop by more than an order of magnitude. I also don’t see how you get any intensity at all in anything but the 22529 to 22629 bin at 50 km. The line should be completely inside that bin at 50 km and probably at 25 km. This all assumes you are looking at just one line and not contributions from the entire HITRAN database for those wavelengths.
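
    A hedged note on the unit conversion above, for anyone checking it: MODTRAN’s looking-up intensities are radiances per unit wavenumber in W cm-2 sr-1; assuming isotropic emission, hemispheric flux is pi times radiance, and converting cm-2 to m-2 contributes the factor of 10^4, hence pi*10,000:

        import math

        # Radiance difference chosen to reproduce the 0.013 W/m^2 quoted above;
        # it is a back-derived illustration, not an actual MODTRAN output.
        radiance_diff = 4.1e-7                   # W cm^-2 sr^-1 (cm^-1)^-1
        flux = radiance_diff * math.pi * 1.0e4   # -> W m^-2 (cm^-1)^-1, assuming isotropy
        print(f"{flux:.4f}")                     # ~0.0129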

  711. bender
    Posted Aug 11, 2008 at 2:56 PM | Permalink

    OT prediction:
    The same AGW alarmists who refused to attribute the late 1990s warming spike to “natural internal climate variability” will now invoke this very device to explain the 1998-2008 flatlining temperature trend. If I’m right, it goes into the alarmist double-standards database.

  712. Scott-in-WA
    Posted Aug 11, 2008 at 6:28 PM | Permalink

    Bender: OT prediction: The same AGW alarmists who refused to attribute the late 1990s warming spike to “natural internal climate variability” will now invoke this very device to explain the 1998-2008 flatlining temperature trend. If I’m right, it goes into the alarmist double-standards database.

    Bender, not to be overly persnickety, but why should any Professional AGW Alarmist feel any motivation to respond to this issue as long as the US Government continues to publish the following graphic — which indicates a continuing sharp upward trend through 2008 — as demonstrated fact:

    from http://www.climatesci.org/wp-content/uploads/fig15.jpg

  713. cba
    Posted Aug 11, 2008 at 10:39 PM | Permalink

    DeWitt,

    I’ve been finishing up a lot of stuff for the summer the last day or two, but I’ve been going over the line width / height stuff here trying to determine where the problem may still be. As you can tell from the post time, I just got back here at a late hour, too late to go into detail on your calculations. I have a mod that should be more realistic for the small gamma (HWHM) stuff. I just finished the runs for CO2-only and for the all-CO2 set I’m currently using. It probably works somewhat better, but it’s not very clean. I don’t even have any good comparisons yet, other than that there’s not much changing going on high up. There may still be too much going on in emitting / absorbing. The original code wasn’t handling the center bin properly and probably still isn’t, completely. Hopefully, I’ll be able to go through your recent posts in the morning.

  714. Geoff Sherrington
    Posted Aug 11, 2008 at 10:53 PM | Permalink

    Apologies again for coming from left field and interrupting the threads.

    Problems in Europe.

    First, an interesting summary of a book soon to appear, written by Prof David MacKay, Dept of Physics, Cambridge.

    http://www.inference.phy.cam.ac.uk/sustainable/book/tex/synopsis.pdf

    Second, an extract from a private email by Prof Dick Thoenes (retired) of Holland, whom I first had the pleasure to meet in the mid 1970s. Dick was one of the authors of the New York Heartland Institute paper of March 2008. He writes

    “I believe the situation in Europe is rather frightening and quite complex. There seems to be a huge and powerful coalition. All politicians of all parties in all countries agree that global warming is a serious problem that requires huge funds, for them to spend. Then there is a powerful group of scientists who proclaim the IPCC gospel, since their research funds depend completely on it. Our national weather bureau belongs to this group. There are also huge commercial interests involved, especially the wind turbine industry. The media only cite the believers. There is no newspaper or TV station in Europe that I know of that openly contradicts them. Sceptics are ignored completely. They are considered village idiots, particularly in England and the Netherlands.

    The worst is that democracy appears to have been lost in the battle. There is in fact no free press any more, and it is only Big Brother that you hear.

    To me it is unbelievable that this situation has prevailed in Europe for more than twenty years. Indeed, the politicians and the media ignore what happens elsewhere in the world.

    The small group of sceptics in this country that occasionally publish their papers on websites and in obscure journals appear to have no influence at all. I have more confidence in the American NIPCC and the Canadian ISPM.”

    Suggestions welcomed.

  715. Geoff Sherrington
    Posted Aug 12, 2008 at 4:54 AM | Permalink

    Another interruption annoyance, but these are FANTASTIC.


    http://www.ted.com/index.php/talks/hans_rosling_reveals_new_insights_on_poverty.html

    The presenter shows that it is not a given that a credits system must be used to take from rich countries to give to poor, underdeveloped ones.

    Along the way, there are fantastic examples of data mining and presentation and a strong call for people like the UN to make data available.

    Especially recommended to all CA regulars. Entertaining, also.

  716. cba
    Posted Aug 12, 2008 at 7:27 AM | Permalink

    705 (DeWitt):

    Integration was done on the equation, and the line intensity was then calculated with the tau averaged for the bin. The initial database numbers are in attenuation per molecule in the path. The ideal gas law is used to determine the # of molecules in 1 cm^3 for the isotope fraction of the molecule fraction of the average pressure for the slice and the base T of the slice. The result for my spectral bin is supposed to be the average value over the 1nm-wide bin for a tau that is per cm of path length. The surface cross section drops out, as the # of molecules in the cross section is a function of the volume and, for a fixed thickness, is a function of cross-sectional area. The bin result is done in custom programming. The bin array file is then loaded into Excel, where the path length is multiplied in and the exponential is applied, so that transmission is exp(-tau*distance) and absorption is (1-exp(-tau*distance)).

    710

    At 667.65 ( 14977.9 nm), my final co2 only bin number provides a peak of 0.00073. Looking at my 40km ‘looking up’ chart (includes all downward IR to 100km) the co2 peak around 14977 is showing 0.0109 W/m^2 per nm for the latest sw modification (ozone is showing .0059 at 10um).

    The 22624 line is about 4.8E-27 (Tref = 296) raw number but that varies with T. That line calculates out at 2.7E-27 at the surface, 7.0E-29 at 25km, and 1.3E-27 at 50km.

    The numbers in 701 are per cm thickness as they already have the # of molecules included. Note they are no longer the current values as of the latest change.

    Again, at the moment I have a kludge for dealing with the narrow peaks, and I’ve got to go through the derivation again and verify what has been done, to make sure there are no constants like 2/pi missing and that the implementation is correct, and applied at the correct point, for widths relative to bin width.

    Note: I’m not sure why you had to deal with converting cm^2 to m^2 and multiplying by pi (unless you were doing volume in m^3 for density).

  717. DeWitt Payne
    Posted Aug 12, 2008 at 9:16 AM | Permalink

    cba,

    I’m no longer confident that I did the calculations I reported above correctly. There’s something wrong in my integration to determine equivalent width. I did some experimenting on the spectralcalc.com site. You can use the line browser to find line intensities, but when you use the spectral calculator to calculate transmission, the intensity calculated from ln(peak transmission)/(mol.cm-3 * path length) is more than 2 orders of magnitude higher than the intensity from the line browser. The difference also varies with wavelength and is larger at longer wavelength. I reduce the pressure and path length for a strong line so the peak transmission is greater than 0.99. Then the calculated intensity is constant with small changes of pressure or path length (weak line limit).

    I still think you need to integrate the absorption [1 - exp(-tau(nu)), where tau(nu) = sigma(nu) * mass path and sigma(nu) is the cross section as a function of frequency], not the line intensity, for lines entirely inside a bin. It won’t matter for a weak line, but I think it does for a strong line. My problem seems to be calculating sigma(nu) in the first place.

  718. bender
    Posted Aug 12, 2008 at 9:21 AM | Permalink

    #712 Scott-in-WA
    Are you disputing whether the data presented here are factual?
    Are you disputing the inference that one (black curve) is connected to the other (bars)?
    Are you disputing the suggestion that the past trend will continue into the future?

  719. Gunnar
    Posted Aug 12, 2008 at 9:26 AM | Permalink

    Bender, you saying so doesn’t make it so. GCMs are not an argument that passes the laugh test. The precautionary stuff is a political logic fallacy, not a scientific hypothesis.

  720. TerryBixler
    Posted Aug 12, 2008 at 9:28 AM | Permalink

    At this moment in time Boxer, Pelosi, Obama, McCain would all back the IPCC as the authoritative source of information with regards to global warming. The HS is alive and well. It will take much effort to turn the world to rational thinking on the Global Warming hysteria. This effort has started here but needs to expand to a wider audience.

  721. TerryBixler
    Posted Aug 12, 2008 at 9:53 AM | Permalink

    On close inspection the land-based surface records are in tatters. Anthony Watts has highlighted the lack of effort in the siting of the US stations. We have no idea about the worldwide siting of stations. The formulas used to fill in missing record entries have been shown by Steve to have potential statistical problems. The picture of CO2 is murky at best. Maybe with the updates of the satellite pictures of gas concentrations in the atmosphere, some useful thought can be given to how CO2 actually performs as a ‘GHG’.
    No wonder neat graphs can be presented without any rebuttal.


    Steve: I think that you’re over-reaching a LOT here.

  722. wmanny
    Posted Aug 12, 2008 at 10:01 AM | Permalink

    I think Scott is being ironic.

  723. bender
    Posted Aug 12, 2008 at 10:03 AM | Permalink

    On close inspection the land-based surface records are in tatters.

    That’s a strong statement. Although there are many isolated problems, I don’t think the consequences of all these have been synthesized. Put it in perspective. Suppose you have hundreds of errors. Sounds bad … until you realize there are millions of records. Audit is good. But synthesis is better.

  724. bender
    Posted Aug 12, 2008 at 10:05 AM | Permalink

    I don’t.

  725. TerryBixler
    Posted Aug 12, 2008 at 10:17 AM | Permalink

    #28 approx 5% of the stations are well sited. Call it 10%. That is not good for instrumentation. No known audit of record keeping.

  726. Gunnar
    Posted Aug 12, 2008 at 10:19 AM | Permalink

    >> You kill the paleo HS and a guy like Scott-in-WA comes back with this. Now what?

    First of all, it’s the same logical argument (premise: it’s really hot; conclusion: let’s violate rights). Second of all, that graphic is no problem, since it’s false.

    >> GCMs are the tool used in AGW “fingerprint” detection and attribution. This is a fact.

    You just proved me correct. GCMs are only an analytical tool, not an argument, like you claimed. The hypothesis has to be stated and then tested against physical reality. So far, numerous AGW fingerprints have been claimed, and they have all been found absent.

    If someone has a hypothesis, let them state the detailed hypothesis and attempt to falsify it with experimental tests. Otherwise, you’re just playing mathematical games. Why don’t you play SimCity instead?

    I can’t come out and say “I’ve written a computer program that predicts that the earth will break apart in the near future”, without a coherent physical hypothesis that can be checked against reality with empirical measurement. It doesn’t pass the laugh test.

  727. bender
    Posted Aug 12, 2008 at 10:34 AM | Permalink

    #725

    that graphic is no problem, since it’s false

    Please explain.

  728. Gunnar
    Posted Aug 12, 2008 at 10:56 AM | Permalink

    >> “that graphic is no problem, since it’s false” Please explain.

    Bender my friend, sorry, I thought it was really obvious that the graphic shows the temperature going up since 1998, when in fact, the temperature has been coming down since 1998.

    The second thing about the graphic that is laughable is the CO2 hockey stick. An ice core proxy, created by people who don’t acknowledge that CO2 is soluble, with data points every millennium or so, contradicted by the accurate CO2 measurements, is grafted willy-nilly onto a high-resolution IR measurement data set, measured at one questionable location different from the ice cores.

    The third thing is that it implies that there is a correlation between CO2 and global temperatures. Empirical measurement says otherwise.

  729. bender
    Posted Aug 12, 2008 at 11:23 AM | Permalink

    #727 You saying so doesn’t make it so. Show us the correct graph.

  730. Basil
    Posted Aug 12, 2008 at 11:25 AM | Permalink

    Gunnar,

    Go ahead.

    Basil

  731. Gunnar
    Posted Aug 12, 2008 at 11:28 AM | Permalink

    >> but the heat generated by UHI’s has to go somewhere.

    Because the heat associated with UHI is completely insignificant compared to the energy in the system. Your question reveals the core premise behind AGW and the explanation for why people think it’s plausible. The premise is the hubris that humans are so significant that they can change the earth. This premise and the lack of intuition about physics makes it plausible.

    The heating effect of cities is insignificant compared to the atmosphere. However, when one considers that the energy of the entire atmosphere is only 2% of the ocean and crust, it’s like thinking that crying is going to raise sea levels.

    US human energy usage seems like a really big deal, until you consider that the solar power on the US is 10,000 times greater.

  732. John Lang
    Posted Aug 12, 2008 at 11:33 AM | Permalink

    Here is a better temperature chart

    Here is a better CO2 chart

    http://www.globalwarmingart.com/wiki/Image:Phanerozoic_Carbon_Dioxide_png

  733. Gunnar
    Posted Aug 12, 2008 at 11:33 AM | Permalink

    >> Show us the correct graph.

    You need me to give you a link to the satellite data? I’ve got 4 kids, I don’t need any more baby sitting time.

  734. bender
    Posted Aug 12, 2008 at 11:37 AM | Permalink

    #731
    Baby sitting fee is in the CA tip jar. Thanks.

  735. Gunnar
    Posted Aug 12, 2008 at 11:49 AM | Permalink

    #733, :)

  736. bender
    Posted Aug 12, 2008 at 12:06 PM | Permalink

    #32
    I asked Gunnar if he could do this and he told me he was too busy baby-sitting.

  737. Old Dad
    Posted Aug 12, 2008 at 12:10 PM | Permalink

    Of course, it’s naive to think that politics can be completely divorced from science, or anything else for that matter, but I think we’ve got a perfect object lesson here demonstrating what happens when politics co-opts science. The Team may have won a political victory by preserving Chucky for the IPCC’s latest, but think of all the time that was lost, and all the potentially good science that could have been developed with the same effort and resources.

  738. Pierre Gosselin
    Posted Aug 12, 2008 at 12:12 PM | Permalink

    The HS is a product of politics, and not science.
    Without it, the entire AGW hypothesis crumbles, as there is no other data to support it. The CO2/temp correlation turns into a myth.
    That’s why the polar bear has been rushed out as the latest icon.

    Steve: It is untrue that AGW “crumbles” without the stick – a point that I’ve argued elsewhere,

  739. Gunnar
    Posted Aug 12, 2008 at 12:22 PM | Permalink

    Pierre,

    Exactly, well said.

  740. Gunnar
    Posted Aug 12, 2008 at 12:25 PM | Permalink

    >> I asked Gunnar if he could do this and he told me he was too busy baby-sitting

    Bender, you’ve spent about two dozen comments saying that a dead HS is no big deal, since there are other arguments. I’ve completely countered that argument, and I’m still waiting for you to elucidate the alternate logical argument for AGW. Specifically, what is the unique AGW signature, ie empirical evidence of AGW heating, that could not be caused by any other means?

    (note: I have you reduced to asking me for links to satellite data, and retreating into places where this argument is off topic)

    This just in: a poll just released shows plummeting support for AGW.

  741. ccody1
    Posted Aug 12, 2008 at 12:41 PM | Permalink

    Where are we now? As a layman trying to get my hands around this issue, can someone give a summary of the state of GHG influence on climate? The global warming advocates have gotten past the hockey stick fiasco and are still driving 100 mph telling the world that the end is nigh. They believe they have science, hockey stick aside, on their side. So, which is it? I go round and round on this trying to educate myself, and it’s a flippin’ maze.

  742. bender
    Posted Aug 12, 2008 at 12:43 PM | Permalink

    #735 more propaganda

  743. Gunnar
    Posted Aug 12, 2008 at 12:47 PM | Permalink

    >> But why has no one pointed out that no amount of good statistics can redeem the method?

    Good point Tom, but even more so: no amount of good statistics can redeem the lack of logical argument or scientific hypothesis. IOW, even if it’s hotter now than it ever has been, it doesn’t indicate that AGW is the cause.

  744. bender
    Posted Aug 12, 2008 at 12:55 PM | Permalink

    #35 is not “well said”. Very poorly said, in fact.
    #36 is propaganda.

    #39 Your observation underlines Steve M’s neutrality and his integrity re: the larger issue. His assessment is balanced. The HS is scientifically dead, but is on political life-support. Scientifically, we do not know if current temps are unprecedented in 1000+ years. As NAS ruled, current temps are likely unprecedented in 400 years. Which surprises few. Even if current temps are not unprecedented in 1000 years they could become so. Hence the over-riding importance of the physics embodied in the GCMs. The GCM fits to the surface temp record are not dead. The GCM projections are not dead. Statements to the contrary are pure propaganda.

  745. bender
    Posted Aug 12, 2008 at 12:57 PM | Permalink

    even if it’s hotter now than it ever has been, it doesn’t indicate that AGW is the cause

    Yes. Hence the over-riding importance of the physical models which are the basis of GHG attribution.

  746. george h.
    Posted Aug 12, 2008 at 12:58 PM | Permalink

    IMHO the hockey stick (unprecedented warming) is just one of a number of global warming tenets which crumble under close examination. What else is on the list? The supposed fidelity of the surface record, the underlying climate sensitivity to 2xCO2, net positive feedbacks, GCM predictive validity, contrived measures of tropospheric warming, ocean heat content (“it’s in the pipeline”), aerosol masking; the list goes on. How the team and their friends in the media keep the myth alive is a mystery to me.

  747. DeWitt Payne
    Posted Aug 12, 2008 at 1:08 PM | Permalink

    bender,

    Back away from the cage please. Poking things that are reputed to live under bridges with sharp sticks only encourages them.

  748. Gunnar
    Posted Aug 12, 2008 at 1:21 PM | Permalink

    And the AGW proponents quickly retreat to name calling and non substantive contradiction to mask their growing unease.

    M: Oh look, this isn’t an argument.
    A: Yes it is.
    M: No it isn’t. It’s just contradiction.
    A: No it isn’t.
    M: It is!
    A: It is not.

  749. bender
    Posted Aug 12, 2008 at 1:52 PM | Permalink

    IMHO the hockey stick (unprecedented warming) is just one of a number of global warming tenets which crumble under close examination.

    If you have proof of the alleged “crumbling under close examination” then this should not be a matter of opinion, but of fact. So what facts can you provide to support your confident assertion? Due diligence means providing, when asked, supporting facts for all your statements.

    Seems we may be having an outbreak of denialist propaganda today.

  750. bender
    Posted Aug 12, 2008 at 1:58 PM | Permalink

    Trolls make me laugh. Persistent propaganda artists make me uneasy.

  751. Mark T.
    Posted Aug 12, 2008 at 2:02 PM | Permalink

    I should point out that GCMs are not “data,” which is specifically what Pierre was referring to. I agree, however, that the actual argument rests on GCM outputs, for better or worse. The HS is the data tool that links the past with the present in the media along with phrases like “unprecedented.” Without the HS, convincing the public of any urgency is difficult irrespective of GCM validity.

    Mark

  752. Posted Aug 12, 2008 at 2:08 PM | Permalink

    Bender-

    OT prediction:
    The same AGW alarmists who refused to attribute the late 1990s warming spike to “natural internal climate variability” will now invoke this very device to explain the 1998-2008 flatlining temperature trend. If I’m right, it goes into the alarmist double-standards database.

    You are supposed to make predictions before they happen. Gavin estimates the standard deviation of 8-year trends at 2.2 C/century, putting a 0 trend near the 1 sigma region. Even -2.2 C/century over 8 years is supposedly consistent with IPCC projections of +2 C/century.

    The graph Gunnar says is wrong appears to be GISS Land/Ocean data. It’s annually averaged, so 2008 doesn’t appear. Using monthly data, depending on the start date picked for this century, GISS is either the only agency with a slight positive trend or the agency with the least negative trend.

    So, the graph is correct for what it is: a plot of GISS Land/Ocean data.
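
    For concreteness, trend numbers like these are just OLS slopes on monthly anomalies. A minimal sketch (synthetic data standing in for any of the agency series; note the naive standard error here ignores the serial correlation that widens Gavin’s figure):

        import numpy as np

        def trend_per_century(anomalies_monthly):
            """OLS trend of a monthly anomaly series, in C/century, with std error."""
            y = np.asarray(anomalies_monthly, dtype=float)
            t = np.arange(len(y)) / 12.0                 # time in years
            slope, intercept = np.polyfit(t, y, 1)
            resid = y - (slope * t + intercept)
            # naive OLS standard error; serial correlation would inflate this
            se = np.sqrt(resid.var(ddof=2) / ((t - t.mean()) ** 2).sum())
            return slope * 100.0, se * 100.0

        rng = np.random.default_rng(1)
        fake = 0.0002 * np.arange(96) + rng.normal(0, 0.1, 96)   # 8 years of months
        trend, se = trend_per_century(fake)
        print(f"trend {trend:.2f} +/- {se:.2f} C/century (1 sigma)")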

  753. DeWitt Payne
    Posted Aug 12, 2008 at 2:26 PM | Permalink

    lucia,

    Gavin estimates the standard deviation of 8 year trends is 2.2 C/century, putting a 0 trend near the 1 sigma region. Even -2.2 C/century over 8 years is supposedly consistent with IPCC projections of +2 C/century.

    Assuming that he didn’t just pull that number out from where the sun don’t shine, I’d like to know how many years of zero trend it takes until 2 sigma for 2 C/century is exceeded: 15, 20, 30, 50? Do confidence limits on the projected temperature increase with time? If there is LTP in the climate, does that mean that the temperature doesn’t have to ‘catch up’ with the prediction from an earlier year?

  754. Gunnar
    Posted Aug 12, 2008 at 2:43 PM | Permalink

    Hi Lucia, nice to talk to you again.

    >> the graphs is correct for what it is: A plot of GISS Land/Ocean data.

    ok, I agree that the graph may be a correct representation of a certain data set. I’m saying that it shows temperatures going up since 1998, when that is not what has happened, according to the most comprehensive source we have.

    Secondly, it shows that the 1930s were significantly cooler than everything after 1980. That is simply incorrect, and I think Steve M has shown that conclusively.

  755. Posted Aug 12, 2008 at 3:19 PM | Permalink

    Re #753

    ok, I agree that the graph may be a correct representation of a certain data set. I’m saying that it shows temperatures going up since 1998, when that is not what has happened, according to the most comprehensive source we have.

    Which is that ‘most comprehensive source’, does it include the polar regions?

    Secondly, it shows that the 1930s were significantly cooler than everything after 1980. That is simply incorrect, and I think Steve M has shown that conclusively.

    No, it’s correct: the graph is of ‘global land/ocean’ surface temperature, for which the 30’s were not so hot. That 1934 and 1998 were equal (to within measurement uncertainty) is true for the US, not the globe, as shown by Hansen.

  756. bender
    Posted Aug 12, 2008 at 3:34 PM | Permalink

    #754 Pop goes the weasel’s denialist fantasy world.

  757. Mark T.
    Posted Aug 12, 2008 at 3:52 PM | Permalink

    Which is that ‘most comprehensive source’, does it include the polar regions?

    Satellites, if I know Gunnar well enough (his posts, at least). Yes, as far as I know, they cover the poles.

    Mark

  758. Sam Urbinto
    Posted Aug 12, 2008 at 4:08 PM | Permalink

    bender, I think you should re-read that. I take it as Scott saying Pro AGW Alarmists won’t respond to this issue; they just fall back on that graph being a demonstrated fact of carbon-dioxide-induced human-caused global warming (when in fact all it proves is that there’s more carbon dioxide and the anomaly is where it is). Sounds rather like the ‘Why should I answer you, it would just be a waste of time’ defense. I think it’s sarcasm. As in:

    bender: “They will now use this to explain XYZ.”
    scott: “Why will they even bother discussing it at all, they have this to keep them warm at night.”

    My issues with the graph:

    1. It’s titled Global Temperature, when it is no such thing. It’s a chart of carbon dioxide levels and the global mean temperature anomaly over the last 130 years.
    2. The method of presentation is such that it’s meant (my opinion) and used as proof that carbon dioxide levels cause warming of the planet, and the time frame covers the time since the industrial era began (give or take). Draw your conclusions on what’s being implied.
    3. The graph is square to compress the timeline I’d guess. And fairly large for a graph. It looks weird.
    4. One side charts the anomaly on a +/- 1 scale compared to a +/- .000070 scale
    5. One side shows a -.62 to +.75 change; +1.37, the other a .000290 to .000380 change: +.000090
    6. There’s no proof one causes the other, much less which direction or to what percentage. It’s meaningless.

    It’s not, however, factually incorrect as to the numbers. But that doesn’t mean the way it was done was the correct way to make it, nor that the numbers are meaningful and accurate. Or have any correlation. It’s a tool.

  759. DeWitt Payne
    Posted Aug 12, 2008 at 4:12 PM | Permalink

    cba,

    After playing some more on Spectracalc, I’ve decided that I both don’t know enough and don’t have enough data to try to do even single-line emission/absorption at this point. One problem is correcting for temperature. It looks like the sensitivities of the 442 cm-1 CO2 line and the 668 cm-1 lines move in different directions with changing temperature, and the effect can be rather large. Going from 250 to 296 K, the sensitivity of the 668 cm-1 line decreased by a factor of 0.7 while the sensitivity of the 442 cm-1 line increased by a factor of 7. And I still have to figure out what I’m doing wrong when I’m integrating the absorption over the line.

  760. Posted Aug 12, 2008 at 5:01 PM | Permalink

    Re #756

    Which is that ‘most comprehensive source’, does it include the polar regions?

    Satellites, if I know Gunnar well enough (his posts, at least). Yes, as far as I know, they cover the poles.

    Mark

    If by satellites you mean MSU, they don’t.

  761. Posted Aug 12, 2008 at 5:07 PM | Permalink

    Re #757

    It’s not, however, factually incorrect as to the numbers. But that doesn’t mean the way it was done was the correct way to make it, nor that the numbers are meaningful and accurate. Or have any correlation. It’s a tool.

    I agree with Sam that it could be better presented, if I were drawing such a graph I’d plot Log(CO2) wrt the 20th century average.

  762. Smokey
    Posted Aug 12, 2008 at 5:10 PM | Permalink

    Hey, look. I got a graph, too:

  763. bender
    Posted Aug 12, 2008 at 5:12 PM | Permalink

    #757
    You are helping me make my argument (and in the process, dismantling the troll’s belief system).
    1. The data in that graph are solid.
    2. The link between the two series is through physical theory as encapsulated in the GCMs.
    3. The GCMs are the tool used for extrapolating from the past into the future.

    i.e. It’s not the instrumental data that prop up the GHG-AGW consensus, it’s the GCMs. Why anyone would want to argue this is beyond me.

  764. bender
    Posted Aug 12, 2008 at 5:14 PM | Permalink

    #761 How pleasant – another cherrypickin sharpshooter. Wants only to look at 1998 onward. Hey smokey, whatcha smokin?

  765. Smokey
    Posted Aug 12, 2008 at 5:27 PM | Permalink

    Oh, please, oh, please, don’t accuse me of cherrypickin’! So instead of ten years, let’s go back a few hundred million years. That oughta cover the climate between then & now, without any possible cherrypickin':

  766. Smokey
    Posted Aug 12, 2008 at 5:32 PM | Permalink

    Perspective:

  767. bender
    Posted Aug 12, 2008 at 5:44 PM | Permalink

    #764-5 Nope, no cherrypickin of source data there. What a perfectly balanced analysis. More. More.

  768. cba
    Posted Aug 12, 2008 at 5:45 PM | Permalink

    DeWitt,

    gamma = pow((Tref/T), nair) * (gair*(p - parpress) + gself*parpress); // (Tref/T)^nair * (gair*(p - ps) + gself*ps)

    Here is the HITRAN approach to the gamma determination, where nair is an exponent for each line and gair and gself are broadening factors from the database; p is the atmospheric pressure in atm and parpress is the gas partial pressure.

    Tintensity = intensity *(QTref/QT)*(exp(-c2*E00/T)/exp(-c2*E00/Tref))*((1-exp(-c2*nu/T))/(1-exp(-c2*nu/Tref)));

    is the correction factor for line intensity. QT is the partition function data from HITRAN, c2 is hc/k – the second radiation constant, = 1.4388 cm·K – and E00 is (as I recall) the lower-state energy for this line transition. nu is the frequency in /cm, and T and Tref are the temperature and reference temperature in K.

    My integration occurs prior to any exponential activity and is based upon the integration of what is called the absorption coefficient, which is the Lorentz profile. This multiplies the supplied intensity function S, which is per molecule. That forms the monochromatic absorption coefficient K, which is then multiplied by the number of molecules /cm^2 (1 cm thickness) to give tau, the dimensionless optical depth, which must be multiplied by the path length for the total absorption effect in exp(-tau*pathl), which gives the fraction of power transmitted through the path.

    Integrated from line center to a point on the wing, the profile gives (Io/gamma) * atan(w/gamma), where w is the offset frequency nu in /cm units. Going to discrete bin averages makes it messy, especially for the central bin at small gamma, because each wing has different nu values for each bin edge. Off-center bins aren’t as bad as the central bin. The difference between the areas under the curve at the two edges of a wing bin gives the area in that bin. Of course for my setup, each point must be converted from wavelength to frequency (freq in /cm = 1e+7 / wavelength in nm), as these are defined by frequency, not wavelength.

    At the moment, I’ve not double-checked the derivation for the integral or thought through more of the details for getting the average I over each bin. For the central bin peak, I multiply I (= Io/gamma) by gamma/binwidth so it becomes Io/binwidth when dealing with gamma smaller than binwidth/2. Larger gamma values are done by straight-line averaging on each bin.

    I’ll try to look at spectracalc and do some more comparisons on the current code’s calculations. There are differences to be expected from references like modtran. That has made it somewhat difficult as I don’t know how much difference there really should be.
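
    To make the bin-averaging step explicit, here is a small sketch using the atan antiderivative described above; gamma, S, and the bin edges are placeholders, and all offsets are measured from line centre in the same frequency units as gamma:

        import math

        def lorentz_bin_average(a, b, gamma, s):
            """Average Lorentz absorption coefficient over a bin [a, b].

            a, b are bin edges as offsets from line centre (same units as gamma);
            s is the integrated line intensity. Uses the exact antiderivative
            (s/pi)*atan(x/gamma), so a line much narrower than the bin is
            handled without having to resolve the peak on a grid.
            """
            integral = (s / math.pi) * (math.atan(b / gamma) - math.atan(a / gamma))
            return integral / (b - a)

        gamma = 0.01   # HWHM much smaller than the bin
        s = 1.0
        print(lorentz_bin_average(-0.5, 0.5, gamma, s))   # central bin: ~ s / binwidth
        print(lorentz_bin_average(0.5, 1.5, gamma, s))    # first wing bin: much smaller

    Note this averages the absorption coefficient before any exponentiation; as DeWitt argues below, for strong lines the transmission should be computed before averaging.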

  769. Scott-in-WA
    Posted Aug 12, 2008 at 6:06 PM | Permalink

    wmanny: I think Scott is being ironic.

    Your thinking is correct. It was also my purpose to illustrate just how much relevant factual information, on both sides of an important question, can be pushed out of a debate — deliberately or otherwise — with an appropriate choice of a single very effective picture and just a very few words.

    ccody1: So, which is it? ….. I go round and round on this trying to educate myself and its a flippin maze.

    After almost two years of reading CA, RC, the Pielke blogs, government reports, Freeman Dyson’s musings (etc. etc.), I am now a confirmed lukewarmer — temperatures are gradually rising, but no one on either side of the AGW debate really has a set of adequately defensible arguments as to why it’s occurring.

    When the whole thing is said and done a hundred years from now, the true casualties of AGW will be the credibility of science and the reputations of those who call themselves “scientists.” But it will take a hundred years before the question is brought to any kind of firm public closure.

    Sam Urbinto: bender, I think you should re-read that again. I take it as Scott saying Pro AGW Alarmists won’t respond to this issue…..

    Sam, your interpretation of what I wrote is pretty much on target. In addition to the Hockey Stick, the genre of AGW graphic that I posted has been tossed at me by a fair number of people who call themselves “scientists” and who refuse to go any deeper into the details than saying simply, “It’s all been peer-reviewed, that’s all the credibility it needs to have as far as I’m concerned.”

  770. bender
    Posted Aug 12, 2008 at 6:08 PM | Permalink

    The question, smokey, is whether the ceiling on global temperatures imagined or implied by your #765 actually exists. You have your beliefs and that’s just wonderful. The rest of us are interested in the science behind the calculations.

    [Ever wonder why Pielke Sr doesn't make these kinds of bone-headed, overly simplistic arguments? It's because his angle is the most skeptical tenable position that a solid understanding of the data will allow.]

  771. Mark T.
    Posted Aug 12, 2008 at 6:37 PM | Permalink

    If by satellites you mean MSU, they don’t.

    The MSU satellites are in polar orbits, Phil.

    Mark

  772. Smokey
    Posted Aug 12, 2008 at 6:54 PM | Permalink

    bender:

    You have your beliefs and that’s just wonderful. The rest of us are interested in the science behind the calculations.

    Hey, me too. So, can you show me the science behind the calculations that explains how CO2 causes global temp increases? Or, the science that explains why the planet isn’t doing as expected?

    I have an open mind; either explanation will suffice.

  773. Smokey
    Posted Aug 12, 2008 at 7:04 PM | Permalink

    Re #760: That chart was deliberately constructed to overlay CO2 on top of temp in such a way that it shows perfect correlation. Figures don’t lie, but liars figure. Had they used a CO2 chart starting at zero, the correlation would have been weak tea:

  774. bender
    Posted Aug 12, 2008 at 7:11 PM | Permalink

    can you show me the science behind the calculations that explains how CO2 causes global temp increases?

    Can I show you the science behind a non-zero greenhouse effect? Yes. It’s called a GCM. Can I lay out the calculation of the magnitude of the effect? Of course not. Like you, I would love to see an “engineering quality” exposition of the matter. Its absence from IPCC ARs is disturbing.

    snip – The only reasonable position is that the burden of proof is shared by both sides. You show me your disproof.

  775. bender
    Posted Aug 12, 2008 at 7:16 PM | Permalink

    #772 Hey Smokey, a change of scale does not change the magnitude of a correlation. Go back to school.

    Steve M has (too) kindly asked that folks not post 1-paragraph AGW “refutations”. When you do that it makes you look stupid, and it makes CA look silly.

  776. John Lang
    Posted Aug 12, 2008 at 7:22 PM | Permalink

    I calculated the temperature increase per decade using the Hadley dataset, which goes back to 1850 (approximately the time when CO2 began increasing; also, the IPCC uses Hadley rather than GISS).

    The increase varies over time depending on which period you start with.

    For comparison purposes, the climate models and global warming theory predict 0.2C increase per decade.

    Starting in 1850, temps increase by 0.04C per decade (less than one-quarter of the theory).

    The trend per decade gradually increases to a point where if you start measuring at 1940, temps increase by 0.08C per decade.

    This trend per decade continues increasing to 0.21C per decade if you start measuring in 1992.

    Afterward the trend per decade falls considerably, at a rapid pace. If you start measuring in 1997, there is a 0.0C increase per decade (indicating global warming stopped after the 1997-98 El Nino).

    If you start measuring the trend 5 years ago, the trend per decade is a scary -0.4C per decade.

    The average over the entire period is only 0.07C per decade.

    (Keep in mind that Hadley and GISS have played around with the raw data so much over the period that one could not really say temps have increased much at all. The latest USHCNV2 adjustments to the raw data have added a total of +0.65C to the trend of +0.65C since 1900 in the US, which says a lot to me.)

  777. bender
    Posted Aug 12, 2008 at 7:47 PM | Permalink

    #764 Those CO2 values are reconstructed estimates culled from a range of different studies. I am skeptical of the uncertainty levels reported there. AFAICT IPCC AR4 does not rely on Krauss (1999). Is this correct? Anyways, I would prefer to focus on the instrumental period, where uncertainties are much lower, as John Lang does in #775.

  778. DeWitt Payne
    Posted Aug 12, 2008 at 8:21 PM | Permalink

    cba,

    My integration occurs prior to any exponential activity and is based upon the integration of what is called the absorption coefficient which is the lorentz profile.

    And that’s one of your problems. You do know that the Lorentz profile is defined in such a way that its integral is identically one. The equivalent width, OTOH, which is used in all bandpass models, integrates the absorption over some frequency interval that includes most of the line, Integral(1-exp(-tau)); it is the width of a rectangular line with a transmission of zero. If you integrate the line function before calculating the absorption, the entire bin can appear to be opaque even for weak lines. For example, a Lorentz line with a peak tau of 111 gives an absorption of 0.995 when the line function is integrated first, compared to 0.47 when the absorption is calculated first and then integrated.

    This multplies by the supplied intensity function S which is per molecule. That forms the monochromatic absorption coefficient K which is then multiplied by the number of molecules /cm^2 (1 cm thickness) to give tau, the dimensionless optical depth which must be multiplied by the path length for total absorption effect in exp(-tau*pathl) which gives the fraction of power transmitted through the path.

    This is correct, but your units are wrong. S has dimension cm2/molecule (which is why it is sometimes referred to as a cross section: it has units of area/molecule) and is then multiplied by the Lorentz function to give sigma(nu), which the HITRAN paper refers to as k for reasons not at all clear to me. u, the number density of molecules in the path, is the number of molecules/cm3 from the ideal gas law (there could be a small problem with this at low altitude) times the path length in cm, to give molecules/cm2, which when multiplied by k is indeed dimensionless, as it must be. The problem I found was that both the Lorentz and the Doppler functions have dimension 1/Hz, so that I couldn’t get anywhere near the right answer until I multiplied S*f(nu) by the line width in Hz. Or maybe it’s S in the tables that has dimension Hz, because you have to divide it by the linewidth to get a dimensionless mass path.

    Of course for my setup, each point must be converted from wavelength to frequency (freq in /cm = 1e+7 / wavelength in nm) as these are defined by freq. not wavelength.

    I don’t feel your pain here at all. It is so much easier to convert units after you are all done that I think the effort you go through to avoid this is pointless.
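
    DeWitt’s numerical point is easy to reproduce. The sketch below integrates over plus or minus 20 line widths (the window he suggested earlier); the exact values depend on the window and bin width chosen, so it shows the same qualitative gap rather than his precise 0.995-vs-0.47 pair:

        import numpy as np

        gamma = 1.0
        tau_peak = 111.0
        halfwidth = 20.0 * gamma                       # +/- 20 line widths
        x = np.linspace(-halfwidth, halfwidth, 20001)
        tau = tau_peak * gamma**2 / (x**2 + gamma**2)  # Lorentz optical depth profile

        # Correct order: compute the absorption first, then average over the bin
        absorb_then_avg = np.trapz(1.0 - np.exp(-tau), x) / (2 * halfwidth)

        # Wrong order: average tau over the bin, then compute the absorption
        avg_then_absorb = 1.0 - np.exp(-np.trapz(tau, x) / (2 * halfwidth))

        print(f"absorption, correct order: {absorb_then_avg:.3f}")
        print(f"absorption, averaged tau:  {avg_then_absorb:.3f}")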

  779. Scott-in-WA
    Posted Aug 12, 2008 at 8:42 PM | Permalink

    What caused all the “tempest in a teapot” excitement between bender and me over on Unthreaded #36 was the following exchange, which I believe has direct relevance here within this topic.

    Unthreaded #36, bender: OT prediction: The same AGW alarmists who refused to attribute the late 1990s warming spike to “natural internal climate variability” will now invoke this very device to explain the 1998-2008 flatlining temperature trend. If I’m right, it goes into the alarmist double-standards database.

    Scott-in-WA on Unthreaded #36: Bender, not to be overly persnickety, but why should any Professional AGW Alarmist feel any motivation to respond to this issue as long as the US Government continues to publish the following graphic — which indicates a continuing sharp upward trend through 2008 — as demonstrated fact:

    The impact of this graphic — along with some of my very carefully chosen words — had a mix of expected and unexpected effects upon bender. This was his immediate response:

    #712 Scott-in-WA

    Are you disputing whether the data presented here are factual?

    Are you disputing the inference that one (black curve) is connected to the other (bars)?

    Are you disputing the suggestion that the past trend will continue into the future?

    OK … What is this discussion all about, from my own personal perspective?

    First off, I’ll cross-post the following response about the NOAA graphic, which includes a short observation about it that I posted in Unthreaded #36:

    Unthreaded #36, wmanny: I think Scott is being ironic.

    Scott-in-WA: Your thinking is correct. It was also my purpose to illust