Hansen Then and Now

In the “good old days” (August 25, 2007, after they had corrected their Y2K error), I downloaded Hansen’s “combined” version (his dset=1).

Jerry Brennan observed today that Hansen appeared to have already “moved on”, noticing apparent changes in Detroit Lakes and a couple of other sites. Here is the Detroit Lakes (combined) record as downloaded today, compared to the version downloaded less than three weeks ago. As you can see, Detroit Lakes became about 0.5 deg C colder in the first part of the 20th century, as compared to the data from a couple of weeks ago.

[Figure 1 (detroi10.gif): Detroit Lakes MN (combined), current GISS version versus the version downloaded August 25, 2007.]

In a few more weeks, maybe Hansen will have 1998 – and perhaps even 2006 – on top again.

Despite these large changes, Hansen, as with the Y2K corrections, did not provide any notice to readers that major changes (not arising through ordinary operations) had been inserted in his records.

It looks like this is the reason for the conundrum observed in my last post. I never thought of checking whether Hansen had altered early 20th century values for Detroit Lakes MN between August 25 and September 10. It’s hard to keep up with NASA adjusters. As noted previously, no wonder Hansen can’t joust with jesters when he’s so busy adjusting his adjustments.

Here is the same comparison for Boulder CO, another site mentioned by Jerry, showing major changes in Boulder temperature estimates for the 1980s!

[Figure 2 (detroi11.gif): Boulder CO (combined), current GISS version versus the version downloaded August 25, 2007.]

As a result of revisions made within the last two weeks, NASA now believes that the temperature increase in Boulder since the 1980s is about 0.5 deg more than they believed only a couple of weeks ago. Boulder is the home of IPCC Working Group 1, the site of UCAR’s world headquarters and of NCAR, and home to hundreds, if not thousands, of climate scientists. You’d think that they’d have known the temperature in Boulder in the early 1980s to within 0.5 degree. I guess not.
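
For anyone who wants to replicate this kind of then-versus-now check, here is a minimal sketch (in Python, not the script actually used for the figures above). It assumes an earlier download of a station page was saved to a local text file; the filename below is hypothetical, and the URL uses GISS’s gistemp_station.py query format with the Detroit Lakes station id and data_set=1 (“combined”), which may need adjusting. It simply reports the lines that differ between the archived copy and a fresh download.

    # Minimal sketch: diff a previously archived GISS station download against a fresh one.
    # The saved filename is hypothetical; the URL may need adjusting if GISS changes its
    # query format.
    import difflib
    import urllib.request

    URL = ("http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py"
           "?id=425727530040&data_set=1&num_neighbors=1")
    OLD_FILE = "detroit_lakes_2007-08-25.txt"   # archived copy saved earlier

    with open(OLD_FILE) as f:
        old_lines = f.read().splitlines()

    new_lines = urllib.request.urlopen(URL).read().decode("latin-1").splitlines()

    # Show only the lines that changed between the archived and current versions.
    for line in difflib.unified_diff(old_lines, new_lines,
                                     fromfile=OLD_FILE, tofile="current download",
                                     lineterm=""):
        print(line)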

82 Comments

  1. Gunnar
    Posted Sep 12, 2007 at 9:44 AM | Permalink

    I’m starting to get the idea that the surface record is controlled and manipulated by one person.

  2. Jon
    Posted Sep 12, 2007 at 9:52 AM | Permalink

    This puts his release of code in an interesting context. If he keeps fiddling with things, it isn’t as meaningful to have a code release.

    It’s also rather concerning that he might be driven by a “fit the model” mentality, i.e., the risk that he keeps tweaking the calculations to yield a better fit to his conception of what the aggregate warming should be.

  3. jae
    Posted Sep 12, 2007 at 10:06 AM | Permalink

    3: Wow! Somebody’s flirting with some real bad publicity.

  4. crosspatch
    Posted Sep 12, 2007 at 10:07 AM | Permalink

    If he keeps fiddling with things, it isn’t as meaningful to have a code release

    I was thinking the same thing. If he distributes code and then changes it practically immediately after release AND silently replaces the data online, you have no possible way to test the code that was provided (because you can’t compare any output from that code with data that is available online). So it seems like a great shell game. Use input data that nobody else seems to have access to, processed through software that differs from what was provided, to generate output data that can never be reproduced by the code that was actually released. So yes, he has released code, but in the end it might accomplish nothing. He can simply state “that code is obsolete, it isn’t what we are using now, the problems you have found ‘don’t matter’.”

    It seems all so very frustrating and the sad part is that it doesn’t have to be that way *and* I can’t see what he gains from it in the long run.

  5. Anthony Watts
    Posted Sep 12, 2007 at 10:10 AM | Permalink

    Just for reference, I have the Boulder USHCN site online, and I just added it two days ago. Getting it was a challenge because the NIST/NOAA site is a secure government facility. I had to go through checkpoints and a metal detector, have my vehicle inspected, and be photographed to gain entry.

    In the “ground zero of standards” for much of the world, I’m pleased they still use a CRS with max/min thermometers, but not so pleased that it is on the edge of a huge parking lot.

    See it here:

    http://gallery.surfacestations.org/main.php?g2_itemId=1699

  6. Andy
    Posted Sep 12, 2007 at 10:12 AM | Permalink

    Makes it even harder to believe claims that the complex science is “settled” when we can’t even settle on the simple bookkeeping.

  7. JerryB
    Posted Sep 12, 2007 at 10:36 AM | Permalink

    In a memo discussing release of the code, Hansen wrote:

    “Because the programs include a variety of languages and computer unique
    functions, Reto would have preferred to have a week or two to combine
    these into a simpler more transparent structure, but because of a recent
    flood of demands for the programs, they are being made available as is.
    People interested in science may want to wait a week or two for a
    simplified version.”

    So, one possibility that comes to mind is whether in the process of developing
    a “simplified version” some coding errors were made which produced the new, and
    different, results. Is that what happened? I have no way of knowing.

    Looking around, I noticed that at the page at which the code could be
    downloaded
    there is a comment that “It was last updated Sep. 10, 2007, to clarify
    the procedures of some steps”, but no mention of new programming.

    The slight change of format of the reports unmistakably indicates that
    some program revisions have occurred, but does not indicate their scope.

    Also, we must keep in mind that someday, USHCN version 2 will be released,
    and that will surely change lots of numbers.

  8. SteveSadlov
    Posted Sep 12, 2007 at 10:41 AM | Permalink

    All political orientations and positions in AGW debates aside, if someone who worked for me did what Hansen has done, he or she’d be on the street within the hour.

  9. Posted Sep 12, 2007 at 10:42 AM | Permalink

    #4
    crosspatch writes:

    He can simply state “that code is obsolete, it isn’t what we are using now, the problems you have found ‘don’t matter’.”

    Well it wouldn’t be the first time that’s happened to Steve:

    “Mann stated that we had used the wrong data and somehow we failed to notice errors in the data. This was outrageous, as we had downloaded the data from his own FTP site from the location provided by his own colleague, Scott Rutherford; we had described countless errors in great detail and had re-collated over 300 series to avoid these problems. Now, according to Mann, we should have taken the data off a different address at his FTP site, but this new address had never been mentioned in any publication or even on his own Web site.”

  10. Craig Loehle
    Posted Sep 12, 2007 at 10:44 AM | Permalink

    When was the last time you met a scientist who took accounting? Explains a lot.

  11. Reed
    Posted Sep 12, 2007 at 10:59 AM | Permalink

    “He who controls the present, controls the past. He who controls the past, controls the future.” — George Orwell

  12. Mike B
    Posted Sep 12, 2007 at 11:05 AM | Permalink

    Re#8

    Imagine Hansen trying to get his “product” through even the most rudimentary supplier qualification process. He’d be laughed out of any Supplier Quality Manager’s office.

  13. aurbo
    Posted Sep 12, 2007 at 11:10 AM | Permalink

    Bafflegab is bafflegab no matter how you slice it. When is someone going to call a spade a spade?
    Is there any organization in the meteorological community empowered to investigate malfeasance? Can you name any other enterprise with literally multi-billions of dollars riding on the outcome of its activity that has no formal auditing procedure in operation? Since the whole verification of AGW depends upon accurate surface temperature measurements, what agency is empowered to authenticate the data? Does anyone else detect a certain aroma arising from this whole data controversy?

  14. Frank H. Scammell
    Posted Sep 12, 2007 at 11:11 AM | Permalink

    Would it be possible (perhaps via a Freedom of Information request) to get Hansen (GISS) to provide a set of data (both input and output) that is consistent with the code that has been provided? If you have a semi-opaque black box, but you don’t know what the goes-inta is and have no idea of what the goes-outa is supposed to be, it is more than frustrating, it is useless (and at best deceptive). It is a shell game where they can change input, code, and output without notification, and then claim you all are just amateurs. Obviously, you don’t know what you are doing, and should just leave it to the “experts”. The “experts” have been more than forthcoming. They did provide the code, didn’t they? Even though there may be perfectly valid reasons to make changes to the output, those changes should be documented, along with clear notification to users of the data that changes have been made, the reason and date of the changes, and an archived version of the data prior to the change. A large dose of transparency should be demanded. The public should be informed.

  15. DocMartyn
    Posted Sep 12, 2007 at 11:22 AM | Permalink

    Can someone explain to me why the error bars have such largely different step changes? It is mad to present data where different points have different amounts of information. It is quite clear that the data set in figure 1 is “fiddled”; how can this get past referees?

  16. george h.
    Posted Sep 12, 2007 at 11:30 AM | Permalink

    How about a phone call to Hansen’s boss from someone in the media like John Stossel? NASA can’t be happy with these shenanigans.

  17. James Erlandson
    Posted Sep 12, 2007 at 11:31 AM | Permalink

    This is the climate equivalent of restated earnings — something no public company would (or could) do without a press release.

  18. bernie
    Posted Sep 12, 2007 at 11:39 AM | Permalink

    George #16
    Perhaps the first call should be to Senator Inhofe’s office? He is the Minority Head of the Environment Committee; though he may be viewed as partisan, he certainly will have some clout with NASA.

  19. Larry
    Posted Sep 12, 2007 at 11:48 AM | Permalink

    What exactly are we looking at here? I don’t see a “before” and an “after”.

  20. David
    Posted Sep 12, 2007 at 11:50 AM | Permalink

    #17: Even sillier. It is the climate equivalent of rewriting history even though the people who lived through that time period are still alive and can tell you what really happened.

  21. Andy
    Posted Sep 12, 2007 at 11:52 AM | Permalink

    I have put a file showing the differences between the original code release and the 9/10/07 update referred to in #7 in the downloads section at code.google.com/p/open-gistemp. It’s pretty ugly (output of the “diff -ur” command on the entire source tree) but will give you an idea what changed.

  22. Boris
    Posted Sep 12, 2007 at 11:57 AM | Permalink

    You’d think that they’d have known the temperature in Boulder in the early 1980s to within 0.5 degree. I guess not.

    Why would you think that? Climate scientists are very, very stupid as everyone here knows. Geez.

  23. Dave B
    Posted Sep 12, 2007 at 11:59 AM | Permalink

    #11 Reed…another apt Orwell quote:

    “The great enemy of clear language is insincerity. When there is a gap between one’s real and one’s declared aims, one turns as it were instinctively to long words and exhausted idioms, like a cuttlefish spurting out ink.” —George Orwell

    so i guess we should wait for the “simplified” version, just as we would if we were “interested in science”.

  24. Not sure
    Posted Sep 12, 2007 at 12:04 PM | Permalink

    Looks like they added some recent data to the Antarctic data files. A few changes to the shell scripts, mostly to compile FORTRAN things. They did fix a bug at the end of do_comb_step1.sh that I had worked around by renaming PYTHON_README.txt.

    Sad to see they didn’t change much in step 4, ’cause I still can’t get that to work.

  25. Reid
    Posted Sep 12, 2007 at 12:14 PM | Permalink

    Re #18, bernie says “Perhaps the first call should be to Senator Inhofe’s office?”

    I think the person to call is NASA Chief Michael Griffin.

    My instinct tells me Griffin would love to see Hansen in the hot seat and knocked down a couple of pegs. Griffin expressed skepticism at Hansen’s catastrophic predictions and was forced to retract his words of wisdom. If Hansen is shown to be incompetent or worse it would be sweet vindication (revenge) for Griffin.

  26. John A
    Posted Sep 12, 2007 at 12:26 PM | Permalink

    “All history was a palimpsest, scraped clean and reinscribed exactly as often as was necessary. In no case would it have been possible, once the deed was done, to prove that any falsification had taken place. The largest section of the Records Department, far larger than the one on which Winston worked, consisted simply of persons whose duty it was to track down and collect all copies of books, newspapers, and other documents which had been superseded and were due for destruction. A number of The Times which might, because of changes in political alignment, or mistaken prophecies uttered by Big Brother, have been rewritten a dozen times still stood on the files bearing its original date, and no other copy existed to contradict it. Books, also, were recalled and rewritten again and again, and were invariably reissued without any admission that any alteration had been made. Even the written instructions which Winston received, and which he invariably got rid of as soon as he had dealt with them, never stated or implied that an act of forgery was to be committed: always the reference was to slips, errors, misprints, or misquotations which it was necessary to put right in the interests of accuracy.”

    Sorry. Couldn’t resist.

  27. bernie
    Posted Sep 12, 2007 at 12:40 PM | Permalink

    That, coupled with “some are more equal than others”, indicates just how prescient GO was with regard to the climate change debate!!

  28. Demesure
    Posted Sep 12, 2007 at 12:51 PM | Permalink

    #21 Andy,
    Google Code rocks!
    It shows Hansen has added a lot of temperatures for August. Why August?

  29. Steve McIntyre
    Posted Sep 12, 2007 at 1:24 PM | Permalink

    I scraped a dset=1 version on Aug 25 and dset=0 on Sep 6-7, saving the file at 2:44 pm Eastern on Sep 7. The Sep 7 version of Detroit Lakes was the same as the Aug 25 version and different from the current version. So the changes have been introduced in the last couple of days.

  30. Fred
    Posted Sep 12, 2007 at 1:24 PM | Permalink

    If these were restated earnings, the SEC would be laying charges.

  31. SteveSadlov
    Posted Sep 12, 2007 at 1:36 PM | Permalink

    RE: #26 – “For the Love of Big Brother……”

    That, of course, also brings to mind another great Sci Fi classic, from 1976. Kudos to a son of Toronto, who penned this one:

    “We’ve taken care of everything
    The words you hear the songs you sing
    The pictures that give pleasure to your eye
    It’s one for all and all for one
    We work together common sons
    Never need to wonder how or why

    “We are the Priests of the Temples of Syrinx
    Our great computers fill the hallowed halls
    We are the Priests of the Temples of Syrinx
    All the gifts of life are held within our walls

    “Look around this world we’ve made
    Equality our stock in trade
    Come and join the Brotherhood of Man
    Oh what a nice contented world
    Let the banners be unfurled
    Hold the red star proudly high in hand

    “We are the priests of the temples of syrinx
    Our great computers fill the hallowed halls.
    We are the priests of the temples of syrinx
    All the gifts of life are held within our walls.”

    Just think, the year 2112 is only 105 years away. Some small fraction of those born this year will still be alive then.

  32. Posted Sep 12, 2007 at 1:42 PM | Permalink

    re 28:

    It shows Hansen has added a lot of temperatures for August. Why August ?

    Because it’s September now, duh.

  33. MattN
    Posted Sep 12, 2007 at 2:02 PM | Permalink

    Wait, so he’s STILL changing data at sites?

    Do we need *another* US temperature history recalculation??

  34. Posted Sep 12, 2007 at 2:08 PM | Permalink

    This whole episode has descended into an utter farce. You couldn’t make it up even if you tried.

    After all this there surely can’t be any credibility left in the derived global temperature anomaly record?

    The MSM have surely got to catch onto this shortly (I hope).

    It’s now patently obvious that the latter 20th century warming trend is an artefact of these now constantly changing adjustments. They are clearly attempting to perpetuate the deception by reducing past temperatures so that the claimed recent trend is exaggerated.

    There can’t be any other explanation for this other than that they are doing it for politically motivated reasons to perpetuate their funding. No significant global-wide warming trend (no Waldo except in North Tacoma) means no more reason for UN involvement, means no more need for the IPCC, and a significant reduction in the value of properties in the Boulder, Colorado area.

    KevinUK

  35. Jeff Wood
    Posted Sep 12, 2007 at 2:27 PM | Permalink

    My fellow countryman KevinUK has obviously formed a view, and he isn’t the only one here to do so.

    Is it too soon for you guys to put the present position in plain English for the benefit of us non-boffins? One always smelled a rat on account of the previous non-disclosure of code and data, but how many tails does the beastie have?

  36. Demesure
    Posted Sep 12, 2007 at 2:29 PM | Permalink

    #32 good point Hans. It seems strange, though, that the code changes along with the data, but my understanding of the diff format on code.google is close to nil. Maybe Andy could explain.

  37. Bob Koss
    Posted Sep 12, 2007 at 2:47 PM | Permalink

    I used the wayback machine to get an old copy of the GISS station_list file from 2005 and compared it to the current one. I found 207 changes to the station ID versions. 129 of the 2005 IDs have been replaced by 78 different ones. Of the 129 IDs, 54 are no longer accessible at all. Most of those that still exist but aren’t listed seem to have been supplanted by different versions from the same station. I know at least some are evidently still used in making the combined file. A few stations have also been added.

    I suppose many of the changes in the 2007 file could be due to another version having a more complete record. But I am curious about what data might have been in the 54 versions that no longer exist. I doubt they were empty or they wouldn’t have been in the 2005 file at all.
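
    For reference, here is a rough sketch of this kind of ID comparison (not the exact script used), assuming each line of the station_list file begins with the station ID as its first whitespace-delimited field; the two filenames are placeholders for the wayback and current copies.

    # Sketch: compare station IDs between two saved copies of the GISS station_list file.
    # Layout assumption: the ID is the first whitespace-delimited field on each line.
    def station_ids(path):
        ids = set()
        with open(path) as f:
            for line in f:
                fields = line.split()
                if fields:
                    ids.add(fields[0])
        return ids

    old_ids = station_ids("station_list_2005.txt")   # wayback copy (placeholder name)
    new_ids = station_ids("station_list_2007.txt")   # current copy (placeholder name)

    print(len(old_ids - new_ids), "IDs from the 2005 list are gone")
    print(len(new_ids - old_ids), "IDs are new in the current list")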

  38. Hoi Polloi
    Posted Sep 12, 2007 at 3:28 PM | Permalink

    “Essentially, all models are wrong, but some are useful. ” (George E.P. Box)

  39. Andy
    Posted Sep 12, 2007 at 3:31 PM | Permalink

    Re #36, Not Sure’s assessment of the changes to the code in #24 matched my quick read of the differences. I don’t think the data difference Steve Mc has noted is related to differences between these 2 code releases. That being said, I’m going to start an “Adjuster” branch of the code in Google code so we can track any GISS changes up to and through the “simplified” version.

    The “Jester” branch is unfolding in the Hansen Code thread :-)

  40. KDT
    Posted Sep 12, 2007 at 3:40 PM | Permalink

    Wait, dset=0 has changed recently at Detroit Lakes? How is that possible? Isn’t that input data to the code we have? Raw temperature records? I’m confused.

  41. Armand MacMurray
    Posted Sep 12, 2007 at 4:22 PM | Permalink

    Re:#39 etc.
    Along with the August temp additions, the diff shows the http://www.antarctica.ac.uk/met/READER/surface/Grytviken.All.temperature.txt file had its July 2006 value changed from +1.6 to -1.7

  42. JerryB
    Posted Sep 12, 2007 at 4:40 PM | Permalink

    Re #40

    KDT,

    For USHCN stations GISS “raw” actually includes some USHCN adjustments, but
    not quite the same USHCN adjustments as in a regular USHCN file.

  43. bernie
    Posted Sep 12, 2007 at 4:45 PM | Permalink

    #34 & #35
    Not so fast. Hansen’s data may be all screwed up, but that does not mean that some level of anthropogenic warming is not taking place. We really should let the data guide us – whether that is from the satellite record or rigorously screened land-based records. The CO2 effect is plausible; its scope needs empirical verification.

  44. JerryB
    Posted Sep 12, 2007 at 5:07 PM | Permalink

    FWIW, I am leaning toward the guess that while implementing a “simplified
    version” at GISS, someone accidentally followed the directions and used a
    “normal” USHCN mean data file instead of the “vintage” USHCN special from 2000.

    This guess cannot explain how Steve got the results that he indicated in the
    crossword puzzle # 4 thread.

    Since it is time for me to feather the props for the evening, I will elaborate
    on my reasons for this guess on the morrow, but they have to do with both sets
    of Not Sure’s results, and how they compare with GISS results of different dates.
    My reasons also are influenced by years of observing data processing problems,
    and kinds of mistakes that led to them.

  45. JerryB
    Posted Sep 12, 2007 at 5:23 PM | Permalink

    In my previous comment, the “both sets of Not Sure’s results” refers to his
    STEP0 results, not to any other of his results.

    The last sentence of that comment was mainly to suggest to Steve why I lean
    toward the idea that what we have seen today at GISS is an unintentional result
    of a mistake.

  46. KDT
    Posted Sep 12, 2007 at 5:45 PM | Permalink

    #42 Thanks, that makes sense.

    But where do the numbers on this page come from? (download monthly data from there)

    http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=425727530040&data_set=0&num_neighbors=1

    I expected them to match my v2.mean values and they don’t. I’m not sure what’s going on. Sorry if I’m just being dense here.

  47. JerryB
    Posted Sep 12, 2007 at 6:02 PM | Permalink

    KDT,

    Why those numbers don’t match yours has not yet been resolved. At this time I
    am perhaps half way to snoozeland, but perhaps what I post tomorrow may help to
    answer your question.

  48. bmcburney
    Posted Sep 12, 2007 at 7:10 PM | Permalink

    I don’t know why anyone is surprised by the new data.
    If the present fails to get warmer the past MUST
    become colder.

  49. qwer
    Posted Sep 12, 2007 at 7:43 PM | Permalink

    Something to consider is a nightly download-and-diff of the source code and data files. If a difference occurs, save-and-label the download and the diff for future inspection. When someone is manipulating the code/data (for good or ill reasons) it is handy to have an audit trail even if you don’t know why it’s being done.
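
    A minimal sketch of what such a nightly job could look like (meant to run from cron; the URLs to watch, the mirror directory name, and the file naming are all placeholders, not GISS’s actual layout):

    # Sketch of a nightly download-and-diff watcher. Everything here is a placeholder:
    # list the URLs you want to track and point MIRROR at a local archive directory.
    import datetime
    import difflib
    import pathlib
    import urllib.request

    WATCH = [
        "http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=425727530040&data_set=0&num_neighbors=1",
    ]
    MIRROR = pathlib.Path("gistemp_mirror")
    MIRROR.mkdir(exist_ok=True)

    stamp = datetime.date.today().isoformat()
    for url in WATCH:
        name = url.split("id=")[-1].split("&")[0]          # crude tag for filenames
        latest = MIRROR / f"{name}_latest.txt"
        new_text = urllib.request.urlopen(url).read().decode("latin-1")
        old_text = latest.read_text() if latest.exists() else ""
        if new_text != old_text:
            # Save a dated copy plus a diff, so there is an audit trail of the change.
            (MIRROR / f"{name}_{stamp}.txt").write_text(new_text)
            diff = "\n".join(difflib.unified_diff(old_text.splitlines(),
                                                  new_text.splitlines(),
                                                  "previous", stamp, lineterm=""))
            (MIRROR / f"{name}_{stamp}.diff").write_text(diff)
            latest.write_text(new_text)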

  50. windansea
    Posted Sep 12, 2007 at 7:52 PM | Permalink

    I don’t know why anyone is surprised by the new data.
    If the present fails to get warmer the past MUST
    become colder.

    you win the thread

  51. Andy
    Posted Sep 12, 2007 at 8:01 PM | Permalink

    Re #49, this is exactly what I’m going to do on the source code.

  52. K
    Posted Sep 12, 2007 at 9:42 PM | Permalink

    Bernie at #43 is correct in pointing at data. And he is correct when he says Hansen may be right or partly right about AGW even if this audit should show he made many mistakes.

    If untouched data is available then no tricks in methods or programs will defeat independent analysis. Data first. That is why any adjustment which replaces original observations is so troubling.

    It is very hard to retroactively cleanse any deceptions from computer coding and still produce exactly what you have published long ago. That is almost certain to fail. But if data can be changed anything can be proved.

    The trump card in all this is time. Virtually nothing will be achieved in cutting CO2 over the next five years. By then new observations and work will reinforce the AGW view. Or not.

  53. tetris
    Posted Sep 12, 2007 at 10:04 PM | Permalink

    Re: 48 and 50
    At first glance elegant. That conclusion however, runs counter to various long term climate records which tell us that the earth has periodically been warmer [considerably so at times] than is the case today. Pls correct me if I have missed something subtle.

  54. Posted Sep 12, 2007 at 10:22 PM | Permalink

    I think Hansen is moving forward while this forum is stuck in the past.

    While our happy crew is auditing what Hansen did in the past, Hansen believes that the GCM is accurate, that the physics in the GCM is correct and sufficient, and that the regional predictions for the future AND the past are correct. It’s a short step then for Hansen to use the GCMs to calculate the biases for past data, and it’s a lot easier than doing so empirically for 1000’s of sites. So while this happy crew picks apart the past, Hansen doesn’t care anymore. Maybe that’s one reason he released the code: just to keep the jesters distracted.

    Essentially what Hansen is doing is calibrating historical regional data to match the GCM regional past predictions. Besides making bias calculations trivial for the past, and confusing and distracting the jesters, Hansen is only interested now in tracking future data (better and indisputable) against the GCM regional predictions. If the benchmarked-to-today future track of regional data doesn’t deviate significantly from the GCM regional predictions, then he’s won; the GCMs are right. If data does deviate, then that flags where tweaks need to be made to the GCM.

    Hansen has ceded the argument about the past to the jesters; they can now go bicker among themselves and rattle their jingle-bell hats.

    Hansen and the princes of GCM have moved on.

  55. Posted Sep 12, 2007 at 10:24 PM | Permalink

    re 54

    gotta proofread better next time.

  56. K
    Posted Sep 12, 2007 at 11:18 PM | Permalink

    #54. I miss your point. Are we to look at what Hansen or others have not yet published or asserted?

    You are probably right about Hansen moving on. He should. He published what he published and said what he said and acted as he acted. About all an audit can do is review. That is why the jesters – his label – are working on the past. Audits work on what has been done.

    “Hansen is only interested now in tracking future data (better and undisputable) against the GCM regional predictions”

    Well and good! If the GCMs do a good job forward we all benefit. No one wants humanity bewildered by failure. I really don’t grasp that Hansen has somehow tricked or made fools of those now looking at what he so sullenly made public.

    Does Hansen win if GCMs are successful in the future? No, Hansen wins if the Earth keeps heating up and it is AGW as he has already said.
    If that does not happen then he won’t be praised – in say 2012 or 2020 – because constantly evolving models keep working better.

  57. Posted Sep 12, 2007 at 11:55 PM | Permalink

    re 54

    You got my point. Hansen is betting that the GCMs as they are today are correct, and that the earth will keep heating up. If the climate doesn’t follow the prediction, well, Hansen has firm faith that future events will prove him right. It’s an interesting example of hubris.

    By releasing the code and moving on, Hansen may also be conceding that the auditors are correct: the old data and old code aren’t good enough to measure global warming, so why try to defend them?

    In this thread there’s been a lot of effort put into figuring out what’s happened when there’s an unannounced change to a dataset. Was it just to stir the pot? Might the change have been made for no other reason than to keep the auditors busy?

    Perhaps that’s just too devious.

  58. James Lane
    Posted Sep 13, 2007 at 1:16 AM | Permalink

    I think it’s pointless to ascribe motives to Hansen’s adjustments. Nobody knows what Hansen is thinking, except himself and maybe some of his colleagues.

    What I find bewildering is that Hansen is constantly adjusting data from decades past, and he’s been at it for years. This first came up in the debate about the Swindle doco, when the director was blasted for using an “old” Hansen graph that showed a greater global temperature fall circa 1940 – 1970 than a more recent Hansen graph. Historical data don’t usually get up and walk around. Would this be acceptable in any field other than climate science?

    I’d love to hear an explanation for the Detroit Lakes record adjustment in the top post, although I won’t hold my breath. You’d have to think some of the folk at GISS are dying of embarrassment, if they are not already dead after the “destruction of creation” memo.

  59. K
    Posted Sep 13, 2007 at 1:30 AM | Permalink

    #54. I agree with your first paragraph. The faith is there and apparently the arrogance. The facts will prove to be what they are. He may end up hailed as the Great Man of this century – half of a dynamic duo with Al Gore.

    I suspect any data (and code) changes in the last few days have honestly been made to fix clerical errors: poor records, untidy procedures, and misplaced material. The risk to Hansen in actually changing things now just seems too high. I’ll try to clarify why.

    Suppose Hansen faked something long ago to support his premise. If auditing focuses on that data why would he now defend it? He could just say there must have been a lab error twenty years ago; someone probably inserted the wrong diskette. Try to prove otherwise.

    Now assume he never fiddled. He has been playing fair. Then why introduce tricks now? He would probably get caught and discredited.

    Either way, impeding or misleading now doesn’t look like a winning game for Hansen.

    I think your second paragraph is close. It matches my conclusions (always an agreeable outcome) from following this topic for about a year. The old records just aren’t good enough for the task. They could have been but that wasn’t done. The current work of Anthony Watts shows what a mess some stations are even today. The discussion of sea temperatures, cloth buckets, iron buckets, inlet water measures, etc. doesn’t produce a warm fuzzy feeling. The UHI effect introduces more uncertainty. Tree rings? Proxy this and proxy that?

    So, if the old data can’t do the job why audit? Simply because it would be good to know that the old data can’t do the job. Or that the old code was mistaken. It would help in setting policy.

  60. py
    Posted Sep 13, 2007 at 1:51 AM | Permalink

    #34. Whilst I agree there does seem to be an element of ‘reshuffling the deckchairs on the Titanic’, at least Hansen et al. have provided access to unadjusted and adjusted data for some time now, and the release of code makes the process of deconstruction easier. They should be applauded for that.

    No such statements can be made for data and methodology originating from Hadley/CRU. Maybe I haven’t searched long enough, but I can find no equivalent of the GISS data release on their sites. I cannot find any station lists, or unadjusted/adjusted data. IMHO, maybe Hadley should use some of the £74 million investment the UK has just announced to fund a more open data access policy.

    They have nothing to hide right?

  61. Posted Sep 13, 2007 at 2:35 AM | Permalink

    #60 Has anyone actually done a FoI request for the Hadley data and code?


    Steve:
    Willis Eschenbach has done an FOI request for CRU data. They have refused to even identify the sites in question so far. There are a number of prior posts on this.

  62. Demesure
    Posted Sep 13, 2007 at 3:59 AM | Permalink

    #60 PY, you can search as long as you want, you’ll find no code or data from the CRU. Here is how Jones et al. would reply to you in their standardized niet email: “What I would do, in response to the comment, is to suggest that the skeptics derive their own gridded temperature data.”

  63. Jean S
    Posted Sep 13, 2007 at 4:15 AM | Permalink

    #60/#62: Actually, Jones’ vintage (1991) station data is available here:
    http://cdiac.ornl.gov/ftp/ndp020/
    I’ve been wondering why he released the data back then but is refusing to do it now.

  64. fFreddy
    Posted Sep 13, 2007 at 4:22 AM | Permalink

    He was still a scientist then …

  65. Posted Sep 13, 2007 at 5:35 AM | Permalink

    #62 But he’s not allowed to do that under the Freedom of Information Act AFAIK.

  66. bernie
    Posted Sep 13, 2007 at 6:09 AM | Permalink

    Experiments and data always trump theory — in the end. The GCMs will always be tested against the temperature record. Sooner or later their predictions will have to be shown to be reasonably accurate. The big open questions underpinning the models are what Steve has repeatedly asked about: essentially, it is the sensitivity issue. At the moment, though, because we have data on both CO2 and temperature, we will continue to look at the relationship. My guess is that globally extensive variables such as land use (per Pielke et al) will significantly improve the models.

    Moreover, Hansen cannot simply move on if he is shown to have manipulated the data. I believe he has left enough of a paper trail to be pulled up short, if the data show that there has been a systematic effort to create more of a trend. To do this we need to reconstruct what he has done. As others have pointed out, Hansen is not without bureaucratic and academic opponents.

  67. BarryW
    Posted Sep 13, 2007 at 7:20 AM | Permalink

    Re #66

    Thinking of Hansen as a scientist I agree, but as a politician he will just say: “Yes, mistakes were made but we fixed them; you’re dwelling in the past! The fate of the world is at stake!” If the present data is corrupted by microsite effects, UHI and his adjustments, he can get the models to look like they are supported by that data with enough fiddling. He has already decided what the truth is, so any data that does not conform must be wrong.

  68. JerryB
    Posted Sep 13, 2007 at 9:40 AM | Permalink

    Comparing Not Sure’s Sept 8th STEP0 output with some current GISS dset=0
    output, I see mostly exact matches, or differences of 0.1 C.

    Detroit Lakes GISS
    Detroit Lakes Not Sure
    Boulder GISS
    Boulder Not Sure
    Danevang GISS
    Danevang Not Sure

    Detroit Lakes, and Boulder match exactly, Danevang matches exactly except
    for the months of July and October, which differ by 0.1 C.

    Meanwhile …

    I feel a bit uneasy, as in presumptuous, about how I acquired the
    following information, so without any background, here it is:

    GISS is now (intentionally) using a “normal” USHCN mean data file instead
    of the vintage  USHCN special from 2000. They rely on the ‘M’ flag to
    skip FILNET adjustments for missing observations.

    Look for some statement of changes on their website within a couple of weeks.

    I did not get any information about what other changes might be in the works.

  69. bernie
    Posted Sep 13, 2007 at 11:18 AM | Permalink

    JerryB #68

    Can you or someone else clarify the implications?

  70. JerryB
    Posted Sep 13, 2007 at 11:53 AM | Permalink

    bernie,

    1. It resolves the question of why the numbers at GISS are different this week
    than last week for USHCN stations: a change of some input data, not of programs.
    2. It lets those working on getting the GISS programs running know that, in order
    to check results, they cannot use GISS output from last week, but must get new
    GISS output.
    3. It may (perhaps I should say will probably) lead to another rearrangement of
    the relative rankings of warm USA years.
    4. It may lead to dropping the adjustments of the adjustments that GISS implemented
    early last month.

  71. Sam Urbinto
    Posted Sep 13, 2007 at 1:57 PM | Permalink

    I look at it this way. We don’t know what’s been done to the code or for how long since the decision/order/whatever was made. So all we can do is guess at that.

    Simple to do from published descriptions, ha!

  72. aurbo
    Posted Sep 13, 2007 at 7:50 PM | Permalink

    Re #72 and my earlier #13.

    Congratulations! Finally someone is willing to call a spade a spade.

    What Hansen has been doing is rather capriciously tinkering with the various data bases to create sufficient uncertainty so that no one can accurately contradict the GCMs. In short, there is so much claimable error in the current data bases that one cannot prove the GCMs are wrong. Neither can anyone prove that they are right, but no matter; that’s effectively kicking the can down the road.

  73. Willis Eschenbach
    Posted Sep 14, 2007 at 12:29 AM | Permalink

    Bishop Hill, I filed a FOI request for the Jones data a while back. They have refused to provide it, but offered a list of the stations used. I replied asking for the list about three weeks ago … still waiting …

    w.

  74. py
    Posted Sep 14, 2007 at 12:57 AM | Permalink

    #73, Willis,

    Did they give a reason as to why they wouldn’t release the data?

  75. fFreddy
    Posted Sep 14, 2007 at 1:43 AM | Permalink

    Steve M, would it be possible to have a thread to keep a record of Willis’ FoI with Jones ?

  76. Jeff
    Posted Sep 14, 2007 at 6:52 AM | Permalink

    I guess the mods didn’t like my use of the “f” word, no, the other “f” word.

  77. bmcburney
    Posted Sep 14, 2007 at 7:06 AM | Permalink

    Tetris #53,

    It was just a jest, not a joust. But can you rely on long term climate records when they change so rapidly?

  78. Chris D
    Posted Sep 14, 2007 at 9:20 AM | Permalink

    Instead of night time lights, perhaps we have another sort of “lights” involved here:

    http://en.wikipedia.org/wiki/Hawthorne_studies

  79. Ed I
    Posted Sep 14, 2007 at 11:12 AM | Permalink

    One of the reasons Hansen was screaming that he was being muzzled by this Administration, as he ran around the country giving speeches on AGW, was to avoid being fired or demoted if he was caught defending the indefensible. Hansen understood that the Administration would fight such a political battle while fighting a real war. Yet this is not new for government scientists. I worked “cooperatively” on several projects with long time-series databases where a single federal agency individual made decisions about which data elements would be transferred to a new computer system and which discarded forever, with no electronic or paper back-up. The original data collection and entry had taken thousands of man-days and millions of dollars; then half of it was wiped out with the sweep of a pen, destroying the utility of the data.

    The AGW regulatory decisions are well underway at the state level in many states. These regulations will cost billions and will dramatically affect each of us. Yet when we should be able to depend on objective scientists who readily admit when they have discovered a mistake, we are faced with technocrats who consider real political power from behind the curtain far more important to their own egos than the citizens of the country that has funded their research. Any scientist who cannot readily admit when they have made a mistake is not worth being called a scientist.

  80. Posted Sep 14, 2007 at 12:46 PM | Permalink

    #73 Willis

    Did they give a reason? Have you considered an appeal to the Information Commissioner?

  81. Michael Jankowski
    Posted Sep 14, 2007 at 12:48 PM | Permalink

    The irony of Hansen complaining about being “muzzled” when (1) he blasted the NASA head when he made his feelings on AGW public and helped get him to basically publicly apologize for speaking his views and saying it wouldn’t happen again [seems like muzzling to me] and (2) he’s explicitly stated he doesn’t want to appear in front of panels, gov’t agencies, etc, to talk about AGW unless they’re already on board with him – in other words, muzzling himself.

  82. SteveSadlov
    Posted Sep 14, 2007 at 2:20 PM | Permalink

    RE: #79 – Our regs here in California have resulted in Wonderbread shutting down and relocating. That is only just the beginning.

One Trackback

  1. By Y2K Re-Visited « Climate Audit on Nov 11, 2010 at 10:54 AM

    […] few days in the station history for Detroit Lakes MN which we’d been using as a test case. I posted up the following illustration of the changes with the commentary shown in the caption: Figure 1. […]