The NAS Panel on Data Availability

One of the by-products of data stonewalling by Team climate scientists was the appointment of a NAS panel entitled “Ensuring the Utility and Integrity of Research Data in a Digital Age”, noted up a couple of years ago at CA here. According to its webpage, the panel’s last hearing was in late 2007.

The webpage also states:

This project is sponsored by The National Academies.
The approximate start date for the project is 01/02/2007.
A report will be issued at the end of the project in approximately 12 months.

The study will not address privacy issues and other issues related to human subjects.
Update 2-29-08: The project duration has been extended. The report is expected to be issued by July 1st, 2008
As of 7-1-08, the project duration has been further extended and the report will be issued in fall 2008.
Update 1-9-09: The project duration has been extended. The report is expected to be issued by April 1, 2009.
Update 4-10-09: The project duration has been extended. The report is expected to be issued by June 1, 2009.

Anyone care to venture an over-under on whether they will issue another update announcing a postponement of the report? Maybe they could join forces with the PR Challenge.


52 Comments

  1. EJ
    Posted Apr 20, 2009 at 5:16 PM | Permalink | Reply

    Seems like no one is willing to take this project on. The report would, I would guess, have to summarize in detail previous obsufucations in data and methods.

    The subject is obviously a trivial matter to the NAS. No hurry, no worries.

Another extension for sure.

  2. Arn Riewe
    Posted Apr 20, 2009 at 5:24 PM | Permalink | Reply

    I had a boss once. He was not the best boss I’ve had, but he did have the greatest sayings. One of my favorites was “Delay is the subtlest form of denial”.

  3. Michael Jankowski
    Posted Apr 20, 2009 at 5:28 PM | Permalink | Reply

    It looks like they need a panel to oversee that panel.

  4. jeez
    Posted Apr 20, 2009 at 6:04 PM | Permalink | Reply

    Ok, I’m normally not a spelling Nazi, but “obsufucations” is funny.

    • Geoff Sherrington
      Posted Apr 21, 2009 at 6:32 AM | Permalink | Reply

      Re: jeez (#4),

      Might mean “observation suffocation”. A goodie elsewhere last night was “manufactor”, like some kind of climate scientist statistical method.

  5. Terry
    Posted Apr 20, 2009 at 6:09 PM | Permalink | Reply

    Is anyone going to be held accountable for a now 18 month delay and 4x “this has been further extended” track record on this? At my employer, multiple heads would have rolled long ago for such poor planning, execution, and repeated delays.

  6. Soronel Haetir
    Posted Apr 20, 2009 at 6:25 PM | Permalink | Reply

    At least the extension periods appear to be shrinking. Maybe the report will ship before Duke Nukem Forever.

    • Mark T
      Posted Apr 20, 2009 at 7:51 PM | Permalink | Reply

      Re: Soronel Haetir (#6),

      At least the extension periods appear to be shrinking. Maybe the report will ship before Duke Nukem Forever.

      In statistical treatments, this is akin to a floor to ceiling confidence interval.

      Mark

      • David Jay
        Posted Apr 20, 2009 at 8:26 PM | Permalink | Reply

        Re: Mark T (#12),

        I like the quip: “DC to Light” – THAT is serious bandwidth.

        • Mark T
          Posted Apr 20, 2009 at 10:57 PM | Permalink

          Re: David Jay (#15), Eh, what’s a few terahertz among friends? ;)

          Mark

  7. Posted Apr 20, 2009 at 6:33 PM | Permalink | Reply

    I wonder if the extensions come with extended funding?

  8. Pat Frank
    Posted Apr 20, 2009 at 6:48 PM | Permalink | Reply

    I missed the 2007 post. Reading through that, one is struck by the casual attitude displayed by the NAS about data availability.

    So, with that in mind, I’ll take the extreme over. The NAS will never provide a full and enforceable guideline. The delay will be indefinitely prolonged.
    .
    I liked the point you made in the 2007 post, Steve, that publicity-seeking climate scientists sought, and were well prepared for, favorable attention and the complimentary softball interviews by reporters. They were not prepared for critical demands for due diligence and requests for open data. They seemed instead, then and now, unprepared to rise to the most basic question of science: “How do you know?” It’s as though they didn’t anticipate having to demonstrate their claims, and since then have actively resented the calls to do so.

  9. BarryW
    Posted Apr 20, 2009 at 6:58 PM | Permalink | Reply

    Sometimes people do things based on motives they aren’t even consciously aware of. One of these may be a desire to wait until legislation and rules are instituted that they are afraid might be compromised if the data was actually analyzed by someone who could see that the legislation and rules were based on conclusions that were not actually supported by the data. Since they are convinced of their dogma (or afraid of harming sacred cows of others who may be funding them) they delay so that others, such as our host, might not be able to “muddy the waters” with critical analysis. This is a common bureaucratic practice. Don’t deny, just delay.

    • Kenneth Fritsch
      Posted Apr 20, 2009 at 7:33 PM | Permalink | Reply

      Re: BarryW (#10),

      That revelation deserves a big laugh and acknowledgment that the mission was never taken seriously – no rationalizations please. They must be taking us for idiots and maybe we are.

  10. Peter Pond
    Posted Apr 20, 2009 at 7:59 PM | Permalink | Reply

Sometimes it is as difficult to distinguish among incompetence, indolence, and deliberate deception as it is to distinguish the key factors and their relative influences in climate data.

  11. Gary
    Posted Apr 20, 2009 at 8:00 PM | Permalink | Reply

    They’ll issue another extension. It’s more important to keep records on these sorts of things than to actually do the work.

    Pity this isn’t brought to the floor at the next general meeting of the NAS or that some enterprising journalist doesn’t request an interview of the leadership and publish the stonewalling response.

  12. AnonyMoose
    Posted Apr 20, 2009 at 8:52 PM | Permalink | Reply

    Should we ask them for the data which they’re trying to process? :-)

  13. kuhnkat
    Posted Apr 20, 2009 at 10:07 PM | Permalink | Reply

    “Maybe they could join forces with the PR Challenge.”

    Sounds like they already did!!

  14. AnonyMoose
    Posted Apr 20, 2009 at 10:16 PM | Permalink | Reply

    Too bad the NAS didn’t ask committee member Lane to complete his work before he could get his medal.
    “Neal F. Lane to Receive Public Welfare Medal, Academy’s most prestigious award”

  15. John F. Hultquist
    Posted Apr 20, 2009 at 11:42 PM | Permalink | Reply

I think much of academic-inspired publication, and the training that leads to it, historically has not expected that anyone would want to directly replicate a specific study. Academics get promotions, salary increases and new funding because they are advancing their field. Showing that data collection or analysis is incorrect has little personal reward – so why do that? Many exceptions to this notion, particularly in physics, exist. Because many have not expected the critical review (as found here on CA), they are not prepared for it in either a practical manner (archiving data) or a personal psychological sense, as in “How dare anyone question what I do.” Deans, colleagues, and friends must also decide how to react. Lots more could be said. Anyway, I’m not arguing that this is the proper way for university researchers to conduct their work – only that many have.

    • tetris
      Posted Apr 21, 2009 at 9:12 AM | Permalink | Reply

      Re: John F. Hultquist (#20),

In chemistry, physics, the bio sciences or nanotechnology, for example, a paper submitted without a properly documented “methods and materials” section is as a rule rejected out of hand, because it makes it impossible for the reader to verify the conclusions and how they were derived.

      Having spent a good number of years developing practical applications from cutting edge science in bio and nanotechnology and raising not inconsequential sums of venture capital money for the different companies, I know from experience that the ability for the investor to do proper due diligence and verify the science – in one case having the original scientific work replicated – was the determining factor.

      As has been amply demonstrated here at CA, this basic element of the scientific method is simply absent from “climate science”.

  16. Mark T
    Posted Apr 21, 2009 at 12:12 AM | Permalink | Reply

    True. Most of what gets published never really gets implemented in any practical way, so whether or not it can be replicated is largely immaterial in the long run. The biggies (relativity) get tested forever. That’s probably why we engineers have such a different understanding of “peer review.” It has to work at some point, else your company won’t spend any money on it for you to design it in the first place.

    Mark

  17. Clark
    Posted Apr 21, 2009 at 9:36 AM | Permalink | Reply

    Biological sciences have moved very rapidly to collate data AND experimental resources (genetic lines, DNA, etc). Indeed, if you want a grant from the NIH, you have to explicitly state how you are going to share any data and resources you gather. No sharing, no money.

    Plus, many scientific organizations have put together publicly accessible databases for distributing the large datasets that are now being generated. Anyone who tried to make claims on datasets they did not release would be summarily ignored. Heck, even companies like Monsanto and Syngenta have released huge, proprietary data and resource collections.

  18. James Smyth
    Posted Apr 21, 2009 at 11:17 AM | Permalink | Reply

    You know, w/ the day I’ve been having, who’d a thunk that this CA post would provide the humorous break I needed. Three cheers for the five-minute surf break. Smoke ‘em if you got ‘em

  19. Bob McDonald
    Posted Apr 21, 2009 at 11:21 AM | Permalink | Reply

    The comment period has been extended. My comment is expected to be available May 15, 2009.

  20. AnonyMoose
    Posted Apr 21, 2009 at 12:01 PM | Permalink | Reply

    Maybe there should be widgets on the side of the site which count down/up the time from the PR Challenge and NAS PDA dates? Maybe both “original” and “latest” target dates.

  21. Mike B
    Posted Apr 21, 2009 at 12:46 PM | Permalink | Reply

    A quick glance at the membership of this committee doesn’t fill me with much hope that they will ever accomplish anything.

  22. Mitchel44
    Posted Apr 21, 2009 at 3:01 PM | Permalink | Reply

Don’t worry, the Times has the answer: it’s “the poor communications skills of good scientists.”

    http://www.nytimes.com/2009/04/19/magazine/19Science-t.html?_r=2&ref=magazine&pagewanted=all

    That’s what is keeping all the data from being released.

  23. Mark W
    Posted Apr 21, 2009 at 3:04 PM | Permalink | Reply

    As someone who has spent a great deal of time converting paper records to electronic, I’m willing to give the NAS panel a little leeway here. While at first blush it may appear that 12 months (at first) and now 24 months is more than enough time to come up with a policy covering the format, deposit, storage, and the ability to retrieve electronic records, my experience tells me otherwise.

    A simple project that I ran ended up taking 34 months, and I only had to deal with one department in one company. And I knew, essentially, what I wanted to do. Here we’re talking about establishing a policy that people from many places, collecting data in many different ways, need to meet in order for it to be robust.

While we tend to belittle the “intellectual property rights” that researchers claim to have in the data, the truth is that for commercial enterprises such rights to and ownership of data are real. Of course, most of the researchers in the field covered by CA are funded by one or more government entities, and the government typically claims at least an unrestricted license to such data, if not outright ownership, as part of its funding agreements. In any event, I would think that the panel needs to deal with both perspectives in a fair and equitable way if the policy statement is to have any enforceable teeth. As the policy, I presume, would extend to many other fields such as pharmaceutical research, energy research, etc., the panel would be remiss if it didn’t at least attempt to address these issues.

    It would be easy, and a mistake, to rush out a hard rule that all electronic data needs to be archived and available to all comers. Such a rule could and would easily be ignored, and even if followed, could be done in such a way that it does not clearly identify the phenomenon it purports to measure, leaving us no better off than we are now. If you want it done right, it’s going to take some time.

    • MJ
      Posted Apr 21, 2009 at 5:47 PM | Permalink | Reply

      Re: Mark W (#33),

Personal computers have been around for almost 30 years. It’s hard to find a laboratory, research facility, or academic department that does not have or rely on computers to catalog and process vast amounts of research data. Most computers nowadays have some kind of readily accessible software that an ordinary user can get.

Most software applications can export data files into text files. There are also plenty of common formats (Excel, Word, etc.) for which anyone can download appropriate converters and viewers for free.

Most programming language environments store software code in text format before compiling. While it’s true that developers literate in a programming language may be the only ones who can interpret the code, that should not prevent researchers from posting the source code in a text format, regardless of the language syntax. Researchers are not immune from programming errors of logic, and it wouldn’t hurt to have more than one person review their work.

      Even if you use a proprietary application, then at the very least individuals should say “I used application XYZ, selected these functions, and used these data files.” If an application has customizable processing scripts, more than likely, they are exportable as text, and can be included in the upload package. No one is asking that they have to provide the entire software installation disk, product keys, etc., just be detailed enough to explain the steps performed and if necessary, their rationale for that step, and any files used in the processing.

It would seem to me that such a process would require nothing more than a directory location and some generous space to upload the files. Certainly there are issues of security and accessibility for uploading; however, to imply that such actions are unattainable because they require a great deal of thought is not accurate.

While it may be true that converting past research into the appropriate format for review can be difficult, there is no excuse for those doing research on computers today not to have their data and processing available in electronic form immediately. It seems to me that the excuses I have read for not providing data and code to other researchers (unless you are a team member) are silly.
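The upload package MJ describes – data as plain text, a record of the processing steps, and the analysis itself as readable source – can be sketched in a few lines of Python. Everything here is a hypothetical illustration (the directory name, filenames, and numbers are placeholders, not from any actual study):

```python
import csv
import os

# Hypothetical example: bundle raw data, a plain-text description of the
# processing steps, and the analysis code into one archive directory.
archive = "archive_example"
os.makedirs(archive, exist_ok=True)

# 1. Export the data as plain text (CSV) so any reader can open it.
data = [("year", "anomaly"), (1998, 0.52), (1999, 0.30), (2000, 0.33)]
with open(os.path.join(archive, "data.csv"), "w", newline="") as f:
    csv.writer(f).writerows(data)

# 2. Record the steps performed and the rationale alongside the data.
readme = (
    "Data: data.csv (year, temperature anomaly in deg C)\n"
    "Processing: loaded data.csv, computed the mean anomaly.\n"
    "Rationale: simple average used as a placeholder statistic.\n"
)
with open(os.path.join(archive, "README.txt"), "w") as f:
    f.write(readme)

# 3. The analysis itself, kept as plain-text source in the same directory,
#    reads back the archived data rather than any private copy.
with open(os.path.join(archive, "data.csv")) as f:
    rows = list(csv.reader(f))[1:]  # skip the header row
mean_anomaly = sum(float(r[1]) for r in rows) / len(rows)
print(round(mean_anomaly, 3))
```

The point of the sketch is only that the whole package – data, documentation, and code – is ordinary text that needs nothing more than directory space to host, which is MJ’s argument in a nutshell.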

    • Henry A
      Posted Apr 22, 2009 at 2:39 PM | Permalink | Reply

      Re: Mark W (#33),

      Mark who do you work for, the Post Office?

  24. AnonyMoose
    Posted Apr 21, 2009 at 3:53 PM | Permalink | Reply

    Mitchel: Releasing data should not require many communication skills, and certainly fewer skills than creating the research paper.

    Mark W: At least the NAS data panel has the simpler task of dealing with general policy, as opposed to the PR Challenge which wants to clean up data. But the NAS data panel should have at least published a preliminary report which described what they’re doing or a proposed solution, so they’d have feedback by now. And the PR Challenge should publish the raw data which they are given, so it is available even if it requires more effort than their nicely organized final collection; and the raw data should be part of the PR Challenge’s public collection anyway.

  25. Fred
    Posted Apr 21, 2009 at 4:28 PM | Permalink | Reply

    NAS is a typical government organization . . .

Question: How many people work at the NAS?

    Answer: ’bout half

  26. Reid
    Posted Apr 22, 2009 at 3:36 AM | Permalink | Reply

    The panel is obviously very underfunded and understaffed. They can’t even afford to hold a meeting in the Bahamas let alone Bali.

  27. henry
    Posted Apr 22, 2009 at 6:13 AM | Permalink | Reply

    I think we’re seeing this all wrong.

    Look again at the notice:

    This project is sponsored by The National Academies.
    The approximate start date for the project is 01/02/2007.
    A report will be issued at the end of the project in approximately 12 months.

    Approximate START date is…

    We’re assuming that the process has started and is being delayed. What proof do we have that the project has even started?

  28. Mark W
    Posted Apr 22, 2009 at 8:58 AM | Permalink | Reply

    re: AnonyMoose

I agree wholeheartedly that transparency and preliminary reports in advance of a final report are good things. Perhaps it is the lack of these things in this process that makes everyone (myself included, I admit) suspect some sort of uneven and underhanded approach to this issue.

    re: MJ

Believe it or not, I think we’re not too far apart. We are, however, talking about two separate things. Is it possible, today, for every researcher to post information electronically and make it available? Of course. Do I think that some climate researchers make up excuses why they can’t post such data? Sure.

My point is that, for an independent third party to get access to this data, there needs to be some sort of mechanism to force such behaviour. One obvious way, as Steve McIntyre has used with limited success, has been through FOIA requests. Another would be through some sort of request to journals that publish papers drawing conclusions from such data. Right now there is no way to get a journal to enforce its own, apparently discretionary, rules in regard to data archiving. If, however, the NAS came out unequivocally with a standard, it would be much more difficult to ignore, and we’d all be better off. If, on the other hand, the rule glossed over the interests of certain large segments of the research community, such as, e.g., pharmaceutical companies, it would simply be ignored. That would only embolden the researchers in the climate field to ignore it as well.

Do I think that these problems can be addressed? Yes. Do I think that current researchers in this field can, today, immediately post their research data? Yes. They are not mutually exclusive positions. Do I think it’s possible for the NAS to say, today, what an appropriate policy should be? No.

  29. Craig Loehle
    Posted Apr 22, 2009 at 9:52 AM | Permalink | Reply

    1) This is not about converting paper records to electronic, nor does the NAS actually have to handle any data. It is about policy.
    2) NAS gets $ for these studies. To get paid they must sometime deliver something.
    3) Funders of NAS are often happy with generalities and “winging it” apparently. NAS working groups (or whatever they call them) are usually outside people brought in on minimal $ to discuss, not NAS employees.
    4) NAS reports are not binding on anyone.
    5) I doubt they will have any computer experts on the panel, as CA has in abundance.

    • Scott Lurndal
      Posted Apr 23, 2009 at 11:08 AM | Permalink | Reply

      Re: Craig Loehle (#41),

While they may not have any computer experts on the panel itself (which seems to have an average age approaching 70), at one of the meetings Rob Pike (a distinguished computer scientist, formerly with Bell Labs, now with Google) presented.

  30. Andrew Parker
    Posted Apr 22, 2009 at 3:19 PM | Permalink | Reply

    Having made use of a digitized library and watched them dutifully scanning page after page of archives, I can appreciate that converting paper to byte can be time consuming, the work of decades, but, I don’t think that is the major issue here. Most of the data and documentation produced in the past 20 years likely already exists in digital form. I know that the US Dept. of Defense was requiring digital submissions of reports 20 years ago. Certainly anything pertaining to modeling is digitized.

  31. Joseph
    Posted Apr 23, 2009 at 5:05 PM | Permalink | Reply

    This is a little OT, but posters here seem to have strong opinions about integrity and transparency in science, so I am going to post this here. This is a chance to make your views known, and maybe make a difference.

    The Office of Science and Technology Policy has published a notice requesting public comment to be used to aid the drafting of recommendations for Presidential action to ensure scientific integrity in the executive branch as per Presidential Memorandum. It was published in the Federal Register yesterday.

    http://edocket.access.gpo.gov/2009/pdf/E9-9307.pdf

    The comment period is now open and will be short, only 21 days. The comment period closes at 5 p.m. EDT May 13, 2009.

    Comments can be made here:
    http://blog.ostp.gov/2009/04/22/presidential-memo-on-scientific-integrity-request-for-comment/

    An interesting aspect of this process is that the public can vote upon (or even flag) the comments of others. I dunno how wise that is (I am thinking about the 2007 Weblog awards), but it IS interesting.

    How would YOU design a process that works?

    • John Baltutis
      Posted Apr 23, 2009 at 5:43 PM | Permalink | Reply

      Re: Joseph (#45),

      Safari 6530.3

      Design the process and withhold at least 50% of expected payments until accomplished. Kind of like any contractual system. They’re called deliverables, penalties, and rewards. Then, use prior performance as a measure for future awards. BTW, most of that’s been in place for years, but totally ignored.

  32. Joseph
    Posted Apr 23, 2009 at 6:49 PM | Permalink | Reply

    Okay, I’ll bite. What the heck is Safari 6530.3? A Net search came up blank.

    • John Baltutis
      Posted Apr 23, 2009 at 7:30 PM | Permalink | Reply

      Re: Joseph (#47),

Dumb, inadvertent, subsequent paste when editing, with no way to correct after submitting, since I didn’t check it after posting. Ignore it. It was supposed to be

      How would YOU design a process that works?

      • Geoff Sherrington
        Posted Apr 24, 2009 at 5:49 AM | Permalink | Reply

        Re: John Baltutis (#48),

        How would YOU design a process that works?

        Easy. By rewarding success and penalising failure.

        The world is becoming inhabited by people who are paid to tell other people what to do. Forget about them. We need productivity. We should reward productivity. There used to be a theory of market driven demand for real goods …..

  33. Clark
    Posted Apr 24, 2009 at 9:34 AM | Permalink | Reply

    It’s not that hard to design a process to ensure data releases. It’s done in the biological sciences all the time. Grant proposals to the major agencies must have a detailed plan on distributing data AND materials. Anyone failing to do so would never get renewed funding.

As for the suggestion (#46 above) of withholding some portion of funding until completion, that would not work, as the grant money is used primarily to pay the salaries of the people doing the work. Unless you expect graduate students and research associates to go without pay for the last two years of a proposal period, it’s not feasible. The mechanism already in place that works is that funding for a NEW period is based in large part on the success (including the release of data and materials) of the previous funding period.

    If you want leverage over researchers to get them to act like actual scientists, shaming and public humiliation aren’t nearly so strong as money and publications. Work on the journals and granting agencies (which SM has clearly done for a number of journals). If you can convince them to put in place (and enforce) policies, you will get immediate, nearly full compliance.

  34. Hu McCulloch
    Posted Apr 26, 2009 at 10:27 AM | Permalink | Reply

    James Smyth, #28, writes,

    actually the Wikipedia entry for Eddington claims that:

    The myth that Eddington’s results were fraudulent is a modern invention.

    Wikipedia often is very useful, but always keep in mind that anyone can edit it. Thus, I could change Eddington’s entry to read that he is universally recognized to have been on the Grassy Knoll, and that is what it would say until someone bothered to change it.

  35. Scotch Tape Smell
    Posted May 2, 2009 at 5:43 PM | Permalink | Reply

    This may be off the topic, but I wondered if you are aware that Steven Chu uses the Mann Hockey Stick graph (or something that appears to be the Hockey Stick) in his presentation “The Energy Problem”? Is this MBH98 on page 7 of the pdf?

So what I’m wondering is: is Steven Chu aware of what the NAS has said about the Hockey Stick graph?
    Secondly, I’m wondering if he would make the code that was used to produce that graph in his presentation available to the public? (That question would be somewhat on topic, wouldn’t it? I fear the snippers ;) )

  36. James Smyth
    Posted Apr 21, 2009 at 11:24 AM | Permalink | Reply

Re: stan (#23), actually the Wikipedia entry for Eddington claims that:

    It has been claimed that Eddington’s observations were of poor quality and he had unjustly discounted simultaneous observations at Sobral, Brazil which appeared closer to the Newtonian model[3]. The quality of the 1919 results was indeed poor compared to later observations, but was sufficient to persuade contemporary astronomers. The rejection of the results from the Brazil expedition was due to a defect in the telescopes used which, again, was completely accepted and well-understood by contemporary astronomers.[4]. The myth that Eddington’s results were fraudulent is a modern invention.

    Which, of course, isn’t to say that we shouldn’t all be good little skeptics…

  37. stan
    Posted Apr 21, 2009 at 12:08 PM | Permalink | Reply

    Re: James Smyth (#28),
    I’ll let “Physicist” speak for himself. I have no idea. I do know that Wikipedia isn’t a very good place for anything about science for which there are divergent points of view.

  38. Geoff Sherrington
    Posted Apr 25, 2009 at 10:43 PM | Permalink | Reply

    Re: A story illustrating one of the most important aspects of the climate science debate « Fabius Maximus (#50),

    We had an Australian Prime Minister before 1975 who liked to be called Fabius Maximus. I wondered if this theme could be expanded to include Hillarious Clintonis.

3 Trackbacks

  1. [...] see this today in climate science.  (This post gives a slightly expanded version of material from this article at Climate [...]

  2. [...] more: The NAS Panel on Data Availability [...]

  3. [...] The NAS Panel on Data Availability « Climate Audit – Since they are convinced of their dogma (or afraid of harming sacred cows of others who may be funding them) they delay so that others, such as our host, might not be able to “muddy the waters” with critical analysis. This is a common bureaucratic practice. … Deans, colleagues, and friends must also decide how to react. Lots more could be said. Anyway, I’m not arguing that this is the proper way for university researchers to conduct their work – only that many have. … [...]
