Dunde: Will the Real Slim Shady Please Stand Up?

One of my objectives in looking at both the Dulan tree ring data and Chinese station data is to take a fresh look at the Dunde ice core information, which is near Dulan and Delingha. twq says that he’s been analyzing low-frequency information from Dunde.

A caveat for twq: the Dunde archiving situation is a fiasco. There are multiple inconsistent versions of Dunde (and other Thompson ice cores such as Guliya). Last year we discussed three inconsistent Guliya versions used in peer-reviewed 2006 articles – Dunde is just as bad.

I just noticed that Yao et al 2006 introduces yet another inconsistent Thompson version. I’ve complained to Science, NSF, NAS and gotten nowhere. So twq, before you start using Dunde data, you should ponder which Dunde you’re using.

The figure below shows 5 inconsistent Thompson versions, two from Thompson articles in 2003 and 2006 and the others from grey versions archived or obtained from multiproxy authors. The most detailed information comes from the MBH98 archive (annual); the Thompson 2006 PNAS article (red) archived 5-year averages; the Thompson 2003 Clim Chg article (blue) archived 10 year averages (only after I caused a data policy to be implemented at Climatic Change and complained about Thompson); the low frequency curves are from Yang (email) and Crowley (email). I’ve done a re-scaling estimate for the Crowley version, as he had lost his original data and only had a rescaled and smoothed version.

IPCC SPM told us that “additional data in multiple indicators” is showing “coherent behavior”. We don’t even see “coherent behavior” in different Dunde versions. (Guliya is just as bad.)

Dunde Versions. Black – MBH98 (annual); red – PNAS 2006 (5-year averages); blue – Clim Chg 2003 (10-year averages); purple – Yang et al 2002 (values in 50 -year intervals); green – Crowley and Lowery 2000 (original in standardized format, re-fitted here for display by regression fit to MBH98).
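The binning and re-scaling behind this figure can be sketched numerically. Below is a minimal sketch using hypothetical stand-in arrays (the names, lengths and values are illustrative only, not the archived data): it bins an annual series into 5-year means, as the PNAS 2006 archive does, and re-fits a standardized series onto the annual scale by ordinary least squares, as described above for the Crowley version.

```python
import numpy as np

# Hypothetical stand-ins for the archived series; the names, lengths and
# values are illustrative only, not the actual archive contents.
rng = np.random.default_rng(0)
mbh98_annual = rng.normal(-10.0, 0.5, size=100)   # "annual dO18" values
crowley_std = (mbh98_annual - mbh98_annual.mean()) / mbh98_annual.std()

# Bin the annual series into 5-year means, as in the PNAS 2006 archive.
five_year = mbh98_annual[: len(mbh98_annual) // 5 * 5].reshape(-1, 5).mean(axis=1)

# Re-fit a standardized, unitless version back onto the annual scale by
# an OLS regression against the annual series (the re-scaling estimate
# described above for the Crowley version).
A = np.column_stack([crowley_std, np.ones_like(crowley_std)])
slope, intercept = np.linalg.lstsq(A, mbh98_annual, rcond=None)[0]
crowley_rescaled = slope * crowley_std + intercept
```

With real data the regression would of course be fitted over the overlap period only, and the residual spread is itself a measure of how inconsistent the versions are.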

Will the Real Slim Shady Please Stand Up?
To this unseemly brew, Yao et al Ann Glac 2006 (including Thompson) has added yet another inconsistent Dunde version (thus far not available digitally). Dunde (their Figure 4 top panel) has a peak in the 1930s which is not present in the earlier versions.

“Fig. 4. The 3 year running averaged d18O records for the four ice cores from the TP presented in the paper: (a) Dunde, (b) Guliya, (c) Puruogangri and (d) Dasuopu. The trend lines of Guliya, Puruogangri and Dasuopu show obvious warming since 1900, but the Dunde warming is less obvious because the record ends in 1985.”

Update: Here’s the updated Dunde version incorporating the digitization of Yao et al 2006 (thx, Willis):
Caption: Yao et al 2006 in heavy black (3-year rolling average); others as above. End Update.

Yao et al also have a detailed figure of annual data from 1967 on, shown below. For comparison, the annual data from the grey MBH98 version is shown on the right. The MBH98 version ends (in 1987) on very low values, while the Yao et al 2006 version ends on high values.


Left – version in Yao et al 2006 Figure 6c: thin – annual; solid – 3-year rolling average. Right – corresponding data from the MBH98 version in black; Yao et al digitization in red.

Every Dunde sample datum should be archived immediately. Thompson should be asked to reconcile the various versions. In business, if a company presented rolling inconsistent versions of their financial statements, you know what the outcome would be.
The acquiescence of Science, NSF and Ralph Cicerone of NAS in these shenanigans is a disgrace.

PS: For those of you uninfluenced by teenagers, the title of the post is from Eminem.

Tandong Yao, Zexia Li, Lonnie G. Thompson, Ellen Mosley-Thompson, Youqing Wang, Lide Tian, Ninglian Wang and Keqin Duan, 2006. δ18O records from Tibetan ice cores reveal differences in climatic changes. Annals of Glaciology 43.


  1. tc
    Posted Apr 12, 2007 at 10:08 PM | Permalink

    Steve – You are doing mammoth work! There is no reason that you alone should have to bear so many of these burdens, such as, for example, requesting Science, NSF, and NAS to address this serious issue. It is sad but often true that it is easy to ignore one person. It is harder to ignore a request from several people.

    Suggestion 1 – Dendroclimatologists
    Because this particular issue involves tree rings, I suggest that you invite the dendroclimatologists to join with you in a letter to one or more of these organizations requesting that they address the issue. I realize that you have made a general request that the dendroclimatologists address these kinds of issues themselves.
    I just wonder whether, if you suggested this specific opportunity to clear up this specific issue, they might be willing to help get at least one of those organizations to address it.

    Option 2 – Other scientists
    You could announce and display on your website a draft letter to Science or NSF or NAS, and invite any scientists who agree with the letter to join with you in signing the letter.

    The idea here is not getting high numbers of co-signers. It is simply to show that there is more than one person requesting that the organization address the issue. It could be only a few co-signers, or even just two or three. Just by showing a few different affiliations, the letter can have more impact than one from a single person.

    It just amazes me that you are able to do so much analysis and work. Could there be a few select cases where you might find support from other scientists helpful?

  2. Steve McIntyre
    Posted Apr 12, 2007 at 10:20 PM | Permalink

    Paul Dennis of the University of East Anglia (the home of Phil Jones and Keith Briffa) was upset by similar nonsense with the Guliya core and wrote to Thompson and Science. He said that he’d contact me if he got anywhere so I presume that he didn’t.

    I definitely believe in writing letters. I think that more letters help. With Jones et al 1990, we had three people writing the University of East Anglia – Willis, myself, Doug. Our letters were independent, though Willis’ experience gave a template for the inquiry. You don’t necessarily get anywhere with these institutions the first go, but sometimes you get somewhere eventually. At a certain point, the institution has to wonder what the hell is going on.

    The U.K. FOI gave a bit of leverage at the U of East Anglia so the university was legally obliged to respond. I sent a similar FOI to NOAA and never got an acknowledgement and a Materials Complaint to Nature and never got an acknowledgement either.

  3. Willis Eschenbach
    Posted Apr 13, 2007 at 2:17 AM | Permalink

    Here’s the digitized data from the Dunde graph, to add to your collection of Dundelunacies …

    Year, δ18O
    1900, -11.35
    1901, -11.23
    1902, -11.04
    1903, -10.87
    1904, -10.87
    1905, -10.49
    1906, -10.36
    1907, -10.15
    1908, -9.82
    1909, -9.50
    1910, -9.50
    1911, -9.30
    1912, -9.39
    1913, -9.46
    1914, -9.66
    1915, -9.75
    1916, -9.94
    1917, -10.12
    1918, -10.21
    1919, -10.28
    1920, -10.24
    1921, -10.05
    1922, -9.74
    1923, -9.57
    1924, -9.49
    1925, -9.49
    1926, -9.81
    1927, -10.01
    1928, -10.29
    1929, -10.29
    1930, -10.14
    1931, -9.58
    1932, -9.31
    1933, -8.81
    1934, -8.19
    1935, -8.19
    1936, -8.44
    1937, -8.44
    1938, -9.07
    1939, -9.60
    1940, -9.54
    1941, -9.26
    1942, -9.51
    1943, -9.45
    1944, -9.57
    1945, -9.74
    1946, -9.93
    1947, -9.70
    1948, -9.42
    1949, -9.14
    1950, -9.30
    1951, -9.40
    1952, -9.85
    1953, -10.38
    1954, -10.81
    1955, -10.68
    1956, -11.05
    1957, -11.17
    1958, -11.03
    1959, -10.65
    1960, -10.65
    1961, -10.21
    1962, -10.07
    1963, -9.89
    1964, -10.51
    1965, -10.81
    1966, -10.81
    1967, -10.44
    1968, -10.27
    1969, -10.05
    1970, -10.01
    1971, -10.05
    1972, -10.20
    1973, -10.09
    1974, -10.23
    1975, -10.51
    1976, -10.31
    1977, -10.34
    1978, -10.24
    1979, -9.86
    1980, -9.49
    1981, -9.63
    1982, -10.20
    1983, -9.64
    1984, -9.70
    1985, -9.83
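    A minimal parsing sketch for the table above (the listing inside the code is truncated to six rows only to keep it short; the full table is contiguous from 1900 to 1985, and the 3-year running average is only meaningful on the contiguous series):

```python
import numpy as np

# First and last rows of the digitized Dunde table above; the full
# table is contiguous from 1900 to 1985 (the elision here is only to
# keep this listing short).
raw = """1900, -11.35
1901, -11.23
1902, -11.04
1983, -9.64
1984, -9.70
1985, -9.83"""

rows = [line.split(",") for line in raw.strip().splitlines()]
years = np.array([int(y) for y, _ in rows])
d18o = np.array([float(v) for _, v in rows])

# Centered 3-year running average, as in Yao et al 2006 Figure 4
# (only meaningful on the full contiguous table, not this elided one).
roll3 = np.convolve(d18o, np.ones(3) / 3, mode="valid")
```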

    Thanks for all the good work, Steve … and everyone else.


  4. Steve McIntyre
    Posted Apr 13, 2007 at 9:41 AM | Permalink

    I’ve updated my collation graphic to include the digitization of the Yao data (thx Willis). It is one of the most disgusting graphics in climate science. Needless to say, Al Gore relies heavily on Lonnie Thompson.

    Thompson has had dictatorial power over this data for over 20 years. Yeah, yeah, Castro’s been in power longer. But the Thompson data has been in prison far too long. Let the Thompson data out of prison.

  5. Steve McIntyre
    Posted Apr 13, 2007 at 9:50 AM | Permalink

    Dendroclimatologists (memo to twq) should be lining up to demand that Thompson let the Dunde data out of prison. Think of all the work and effort and ingenuity being applied to understand the complicated integration of temperature and precipitation in the Dulan junipers. Well, at nearby Dunde, there is another integral of temperature and precipitation in the Dunde ice core – BUT it’s a different integral. Dendroclimatologists should WANT to see a different integral and see if they can use it to winkle out some precious information.

    It’s no good using Dunde low-frequency information. Get the real data before it’s lost or destroyed.

  6. Dave Dardinger
    Posted Apr 13, 2007 at 10:27 AM | Permalink


    I have to admit I’d never heard of “Slim Shady” before, having given up on pop music a couple of decades ago (if you call rap “music” in the first place). But while I was googling I was surprised that upon searching this site there’s no hit on “Crocodile Dunde”. It would actually be a good chapter title for a bio of you: The story of how a Canadian ore seeker came to the capital of “High Science” and knocked it on its ear.

  7. Roger Dueck
    Posted Apr 13, 2007 at 12:42 PM | Permalink

    Steve #4
    As a professional geologist I am amazed at how possessive the climatic/dendro crowd is with “their” data. In my industry the regulating authorities of the provinces long ago established that data IS the property of the people. Privately funded or public, the data acquired through the drilling of oil and gas wells is BY LAW directed to the provincial authority, and after a specified time (three months to a year in most cases, seldom up to a maximum of five years) access is made available to the public. Anyone with an interest can and does access this data for their own academic or economic purposes. The Alberta Energy and Utilities Board has set a world-wide standard for the practice of accumulating data and samples and providing this access. It galls me to see Peter Brown refer to “his data” and assert that he is publicly funded for his expertise in interpretation, not for the data. How arrogant, to assume no other scientist or layperson would have the right to examine what he has discarded as worthless! Science would be in the dark ages if that type of attitude were common. Public funding should be conditional on all data being archived and made easily available.

  8. Steve McIntyre
    Posted Apr 13, 2007 at 1:43 PM | Permalink

    #8. Be calm, folks. Rather than thinking about ill-considered new laws, I’d be content if NSF simply administered the policies that are presently in effect. NSF has the authority and even obligation to require fundees to archive data, but doesn’t. That’s the first point of criticism.

  9. Roger Dueck
    Posted Apr 13, 2007 at 2:09 PM | Permalink

    Steve, just as in an industrial audit, policies are usually sufficient but legal recourse ensures compliance. The oil industry has accepted and lives within the bounds of the data-sharing mandate, with the possible legal recourse of loss of operating status (can’t do anything, can’t pump wells). Most of the US does not have the data-sharing regulation, and data records (especially older archives) in many states are in shambles, if they exist at all.

  10. twq
    Posted Apr 13, 2007 at 2:40 PM | Permalink

    RE: #8,

    Does anyone in the field agree with it?

  11. David Ashley
    Posted Apr 13, 2007 at 3:44 PM | Permalink

    Re #7-10. What exactly does NSF require regarding disclosure? I’m shocked to keep seeing this issue of data/code hoarding popping up. In my recently completed master’s thesis, I specifically mentioned contacting me for the code to a model that could not be included because of its sheer size (a few hundred text pages of code). I can’t imagine not releasing such things to anyone who asked. Is this really the common practice in this field? I’m surprised the departments or institutions of these individuals don’t threaten them with dismissal on ethical grounds alone, regardless of whether there’s any law or regulation. Where has scientific ethics gone?

  12. Willis Eschenbach
    Posted Apr 13, 2007 at 5:08 PM | Permalink

    [Treasure of the Sierra Madre] Ethics? ETHICS? We doan’ gotta show you no steenkin’ ethics … [/Treasure of the Sierra Madre]


  13. Robert Wood
    Posted Apr 13, 2007 at 6:27 PM | Permalink

    I have never understood why climatologists do yearly, 5-yearly or any-yearly averages.

    I am an electronics engineer and one is always looking for short-cuts when performing signal processing in hardware.

    One of the first things one learns is that an averaging circuit is cheaper, and consumes less power, than a filter circuit.

    The next thing one learns is that the two functions are not the same.

    Averaging brings in all sorts of problems. The only correct way to seek a signal is with filtering.

    Why do climatologists not know this?

  14. tom s
    Posted Apr 13, 2007 at 6:28 PM | Permalink


  15. Pat Frank
    Posted Apr 13, 2007 at 6:32 PM | Permalink

    #14 — but filtering puts truncation ripples into your data. We generally average our data scans as the best way to reduce white noise without any processing artifacts. When adequate or better S/N is achieved, we can do things like Fourier filter the data to isolate various parts of the signal for more specific analysis.

  16. Robert Wood
    Posted Apr 13, 2007 at 6:32 PM | Permalink

    And no, before anyone chimes in, a rolling average is NOT a filter.

  17. Rod Montgomery
    Posted Apr 13, 2007 at 8:50 PM | Permalink

    #12: From page
    download the current NSF “Grant General Conditions, March 15, 2006”.

    Earlier versions are available from page

    On pages 27-28 of the current version, find

    [begin quote]
    38. Sharing of Findings, Data, and Other Research Products

    a. NSF expects significant findings from research and education activities it supports to be promptly submitted for publication, with authorship that accurately reflects the contributions of those involved. It expects investigators to share with other researchers, at no more than incremental cost and within a reasonable time, the data, samples, physical collections and other supporting materials created or gathered in the course of the work. It also encourages grantees to share software and inventions or otherwise act to make the innovations they embody widely useful and usable.

    b. Adjustments and, where essential, exceptions may be allowed to safeguard the rights of individuals and subjects, the validity of results, or the integrity of collections or to accommodate legitimate interests of investigators.
    [end quote]

  18. Rod Montgomery
    Posted Apr 13, 2007 at 9:26 PM | Permalink

    #18: But there is also, on pages 13-14 of the current version,

    [begin quote]
    18. Copyrightable Material

    a. Definition. Subject writing means any material that:

    1. is or may be copyrightable under Title 17 of the U.S.C.; and

    2. is produced by the grantee or its employees in the performance of work under this award.

    Subject writings include such items as reports, books, journal articles, software, databases, sound recordings, videotapes, and videodiscs.

    b. Copyright Ownership, Government License. Except as otherwise specified in the award or by this paragraph, the grantee may own or permit others to own copyright in all subject writings. The grantee agrees that if it or anyone else does own copyright in a subject writing, the Federal government will have a nonexclusive, nontransferable, irrevocable, royalty-free license to exercise or have exercised for or on behalf of the U.S. throughout the world all the exclusive rights provided by copyright. Such license, however, will not include the right to sell copies or phonorecords of the copyrighted works to the public.
    [end quote]

  19. tc
    Posted Apr 13, 2007 at 9:58 PM | Permalink

    Time for hardball. Don’t take no for an answer from the NSF or any other federal agency. You have administrative rights and legal rights. You have to know your rights to get these agencies to act responsibly.

    If a federal agency turns down your request that the agency abide by its own rules, you have the right to appeal the decision. Appeal the NSF decision. If the appeal is unsuccessful, sue the agency.

    In regard to U.S. government agencies, like NSF, you have administrative rights and then legal rights.

    Step 1 Registered mail/return receipt request:
    First, one requests the agency to enforce their rules, regulations, or requirements, etc. in a specific case.
    Cost: first class stamp + registered mail cost/return receipt:

  20. tc
    Posted Apr 13, 2007 at 10:32 PM | Permalink

    #20 continued.
    Step 2 Appeal the denial decision: If a bureaucrat denies your request, then appeal the decision to a higher authority in the agency (or follow whatever is the agency appeal procedure). Exhaust your administrative (appeal) remedies with the agency.
    Cost: first class stamp+registered mail/return receipt

  21. tc
    Posted Apr 13, 2007 at 10:44 PM | Permalink

    #20 & 21 continued. There seem to be technical problems with the CA posting system. The last part of my post gets cut off. I’ll try here to finish my post.

    Step 3 Sue the agency: If the agency’s final decision denies your appeal, sue the agency in court to enforce the agency’s compliance with its own rule. However, under federal rules, in some types of cases, successful plaintiffs are automatically awarded payment of their legal costs. Cost: lawyers $$$$.
    Tip 1: Step 1 can often succeed if you show in your letter that you know, and intend to exercise, your administrative (appeal) rights and legal rights. Bureaucrats hate dealing with appeals of their decisions. After all, an appeal puts the bureaucrat in the spotlight. Bureaucrats do not like having their decisions reviewed by their boss or a higher authority in the agency. So, your first letter can produce results if you know how to reason with the bureaucrat.
    Tip 2: Similar to Tip 1, except the Step 1 first letter is signed by a lawyer on the law office stationery. Cost: a few hundred $.

  22. Al
    Posted Apr 13, 2007 at 10:52 PM | Permalink

    Steve, you seem to run into ‘analog/printed only’ data a fair amount.

    I used to use a Mac program of some sort to auto-digitize other people’s data when they weren’t timely in responding. I can’t recall the name. But Google turned up something that _looks_ similar, and might be useful for you: plotdigitizer

  23. fFreddy
    Posted Apr 14, 2007 at 12:50 AM | Permalink

    tc, I think you are getting caught by the dreaded ‘less than’ sign.
    The software interprets this as the start of an HTML tag; if it cannot interpret the tag, it throws away all subsequent text.
    Solutions: either say ‘less than’, or use the four characters & l t ; (without any spaces between).

    John A/Steve: this seems to be happening rather frequently – it might be worth adding a warning to the “Due to caching …” text below the preview area.

  24. John Baltutis
    Posted Apr 14, 2007 at 1:13 AM | Permalink

    According to Wikipedia:

    The National Science Foundation (NSF) is an independent United States government agency

    NSF’s director, its deputy director, and the 24 members of the National Science Board (NSB) are appointed by the President of the United States, and confirmed by the United States Senate. The director and deputy director are responsible for administration, planning, budgeting and day-to-day operations of the foundation, while the NSB meets six times a year to establish its overall policies.

    The current NSF director is Dr. Arden L. Bement, Jr. and the current deputy director is Dr. Kathie L. Olsen.

    If I, a U.S. citizen, filed the request for them to follow their own rules and provide specific data and coding, I’d send the request directly to the two directors. If it met with a denial, then I’d file the appropriate appeal, with copies to their boss, POTUS, my two senators, and my congressman.

  25. Al
    Posted Apr 14, 2007 at 10:14 AM | Permalink

    I made an earlier post, it must have been eaten.

    Regardless, I was just making the observation that there are free programs that take a scan of a plot and digitize it. (Or help digitize it). I think it was the link that got the last message flagged. I just googled ‘plot digitize free software’ or so and came up with something that looked reasonable.

  26. Steve McIntyre
    Posted Apr 15, 2007 at 9:57 AM | Permalink

    Notes on a similar fiasco at Guliya are at http://www.climateaudit.org/?p=958 http://www.climateaudit.org/?p=959 http://www.climateaudit.org/?p=947

  27. Posted Apr 16, 2007 at 12:48 AM | Permalink

    14, 17

    C’mon, an averaging filter is a filter 😉 Not a very impressive filter, especially if you have equiripple or something like that in your design specs. I think it is OK to use an N-year moving average as a simple visualization aid, but those averaged results shouldn’t be used in data analysis. Correlation coefficients after N-year MA filtering are quite useless, see example here.


    These kinds of problems can be avoided by using filtering theory (example), but in that case signal and noise have to be modeled somehow (before looking at the data). That seems to be very difficult in climatology. But the alternative, using data variance adjustment and ad hoc smoothers, is not a very good way either.
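    Here is a minimal sketch of that effect (all numbers synthetic): smooth two independent white-noise series with a moving average and the sample correlation typically balloons, because the smoothing destroys the effective degrees of freedom:

```python
import numpy as np

rng = np.random.default_rng(1)
n, window = 200, 21   # series length and MA window (both arbitrary)

def moving_average(x, w):
    """Centered moving average; shortens the series by w - 1 points."""
    return np.convolve(x, np.ones(w) / w, mode="valid")

# Two independent white-noise series: the true correlation is zero.
x = rng.normal(size=n)
y = rng.normal(size=n)

r_raw = np.corrcoef(x, y)[0, 1]
r_smooth = np.corrcoef(moving_average(x, window),
                       moving_average(y, window))[0, 1]
# Smoothing destroys degrees of freedom, so |r_smooth| is typically far
# larger than |r_raw|, even though the series share no signal at all.
```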

  28. Mark T.
    Posted Apr 16, 2007 at 9:38 AM | Permalink

    I agree with UC. A “rolling average” is also known as a moving average filter. It has a low-pass response, though very wide, and all the taps are equal.
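    A quick sketch of that frequency response (the Dirichlet kernel; the 5-point window and the frequency grid are just illustrative):

```python
import numpy as np

N = 5  # a 5-point moving average: all taps equal to 1/N

# Frequency response of an N-point moving average (the Dirichlet kernel):
#   H(w) = sin(N*w/2) / (N * sin(w/2))
w = np.linspace(1e-6, np.pi, 1000)   # radian frequency, DC to Nyquist
H = np.abs(np.sin(N * w / 2) / (N * np.sin(w / 2)))

dc_gain = H[0]                        # ~1: the mean passes untouched
# Beyond the first null at w = 2*pi/N the response does not stay down:
# the first sidelobe of a 5-point average is only about 12 dB below DC.
worst_sidelobe = H[w > 2 * np.pi / N].max()
```

    The wide main lobe and slowly decaying sidelobes are exactly why a moving average makes such a poor low-pass filter.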


  29. MarkW
    Posted Apr 16, 2007 at 10:51 AM | Permalink

    From an electronics standpoint, a filter shunts the energy of a signal to ground, and hence it’s gone. When you average a signal, the energy/information still exists; it’s just spread out over a longer time period. For periodic signals, this spreading allows negative portions to cancel out positive portions. If a periodic signal is an even multiple of the time over which you are doing your averaging, it’s possible to cancel out all of the signal (assuming the signal is symmetric about the mean). If the signal is not an even multiple, then most of the signal will cancel out, and the remaining partial cycle will have its energy/information spread out over the averaging period.

    Back to my original point. In the electronics world, there is a difference between filtering and averaging. Filtering eliminates data, averaging spreads out data.

  30. Posted Apr 16, 2007 at 1:49 PM | Permalink

    NSF grant conditions are specified here:

    There are different grant conditions depending on the year in which the award was granted. In fact, though, all grants made since 1995 (possibly earlier—conditions prior to 1995 are not listed) include a condition entitled “Sharing of Findings, Data, and Other Research Products”. My interpretation of the condition is that researchers must share their data.

    So, I telephoned the NSF and spoke with someone in the Policy Office about this. (The person was Beth Strausser.) I was informed that my interpretation was correct, although there could be some dispute about how raw the shared data had to be. Apparently, partially-processed data might suffice; it seemed unclear as to how that would be judged. I then asked what should be done if a researcher refused to share data. Strausser said that in principle the program manager might get involved, but that in reality the sharing policy was “self policed” by scientists in the field. She briefly elaborated: the NSF relied on other scientists in the field to put (social) pressure on a researcher who refused to share data.

    I can imagine rare cases where that approach might not be fully effective. Perhaps the FOI Act would be of use in such cases? I telephoned the Justice Dept. and spoke with an attorney in the Office of Information and Privacy (which deals with FOIA matters). (The attorney was Anne Work.) I was informed that because the NSF does not have the data, there is very little chance that a request made under the FOIA would achieve anything.

  31. DaleC
    Posted Apr 16, 2007 at 5:52 PM | Permalink

    re #14, 16, 17, 28, 29, 30 on moving average smoothing/filtering

    Most of my work in time series is with longitudinal social studies such as voting intentions leading up to a major election. The data comes from surveys, and thus any one time series is connected to any other because a case is not just a single value, but a set of interrelated values which together describe a respondent. In this context the term ‘filter’ is never used as a synonym for smoothing. To filter a series means to remove cases on the basis of some Boolean or relational criteria, such as ‘Intend to vote Conservative filtered to Males with income over 100k’. The time series for this would comprise only those cases where Gender=Male is true, and, within that set, where Income greater than 100k is also true. The filter in this context is some such expression as ‘Gender=Male AND Income.gt.100000’.

    In my casual investigations of climate data I often use this sort of approach – for example, TMAX where TMAX is greater than 30C as a way of determining if heat waves are more or less prominent, or (TMAX+TMIN)/2 filtered to days where PRECIP is greater than 3mm, and so on. This is an excellent way to try to ferret out underlying patterns which would otherwise be totally obscured by a simple aggregate or annual average.

    The standard smoothing algorithm for survey data is a moving average, and it seems to me to be a very useful way to get a good overview of underlying low frequency movements. I take UC’s point in #28 that a moving average on random data can be jiggered to produce just about anything you like (and I have witnessed this in practice many times) but if there is really a trend (and for climate that is nearly always true) then with clean and consistent data a long moving average should show it. Analytically, the usual practice is to use a linear regression on unsmoothed data to confirm the direction of the smoothed series, and then adjust the width of the moving average window according to the level of detail required.

    However, temperature data sets are too often far from clean and consistent. One interesting application of the moving average algorithm is to diagnose chronologically displaced temperature data which has a seasonal signal. For example, if a block of winter temperatures are dated as being in a different season, then a moving average creates a visually obvious oscillation. Oscillation can also be caused by missing data, with the amplitude on rolled daily data being an indicator for how much an annual or monthly average will be distorted. See the top series in this chart between 1990 and 1996 for an example of oscillation induced by the patches of missing data from 1980 onwards.
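    As a sketch of what I mean, using made-up daily data (the variable names and thresholds are illustrative, not a real station record):

```python
import numpy as np

rng = np.random.default_rng(2)
ndays = 3650
doy = np.arange(ndays) % 365          # crude day-of-year index

# Made-up daily station data: a seasonal cycle plus noise. The names
# (TMAX/TMIN/PRECIP) mirror the text; none of this is real station data.
tmax = 22 + 12 * np.sin(2 * np.pi * doy / 365) + rng.normal(0, 3, ndays)
tmin = tmax - 10
precip = rng.exponential(2.0, ndays)  # mm/day

# "Filter" in the survey sense: keep only cases meeting a criterion.
hot = tmax[tmax > 30]                 # candidate heat-wave days
wet_mask = precip > 3.0
wet_mean = ((tmax + tmin) / 2)[wet_mask].mean()  # mean temp, wet days only
```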

  32. Steve McIntyre
    Posted Apr 16, 2007 at 6:02 PM | Permalink

    Dale, this looks interesting, but it would help if you spelled this out a little more. I can’t figure out the chart without more info.

  33. DaleC
    Posted Apr 16, 2007 at 7:16 PM | Permalink

    Re #33,

    Apologies for being unduly terse.

    The series under discussion is Indio Fire Station. Here is an annotated version of just that series. The large gap from 1982 to 1986 and the few smaller ones up to 1990 conspire to cause a seasonal displacement which takes 10 years to wash out.

    The point of the original chart in #32 is to get a broad overview. In the case of California there are many regional climates, so the overall impression is a bit messy. Utah here is a bit more consistent.

  34. tc
    Posted Apr 17, 2007 at 12:41 AM | Permalink

    Thanks Rod #18 and Doug #31 for the NSF information. Doug, thanks for showing how the NSF responded to your request.

    Here’s one way to beat this NSF diversion and shirking of responsibility. Regardless of any hopes that NSF has for outside forces to self-police, the NSF has its own responsibilities to enforce compliance of NSF grant conditions for sharing data.

    1. If someone wants data from a particular NSF-funded study, first find the name of the NSF project manager for the study.

    2. Then, by certified registered mail/return receipt requested, send a request for data to the NSF-funded researcher with carbon copy to the NSF project manager.

    3. If the NSF-funded researcher refuses to share the data, then, by certified registered mail/return receipt requested, send a letter to the NSF project manager providing documentation of the researcher’s refusal to share data and requesting that the project manager enforce compliance with NSF Grant General Condition on Sharing of Findings, Data, and Other Research Products.

    4. If the NSF project manager does not enforce compliance, then ask the NSF project manager for the name of the NSF official that you can appeal this decision not to enforce compliance.

    5. If notice that you intend to appeal does not cause the NSF project manager to reconsider the decision, then by certified registered mail/return receipt requested, send a letter to the NSF official appealing the decision of the project manager.

    By the way, a similar case of shirking responsibility (passing the buck) is described by Willis at:

    Climate Dynamics Passes the Buck

    See especially Willis #4 response.

  35. Willis Eschenbach
    Posted Apr 17, 2007 at 1:02 AM | Permalink

    tc, sounds good … but how do you find out the name of the NSF project manager for the study?


  36. DaleC
    Posted Apr 17, 2007 at 6:46 AM | Permalink

    re #33,

    Here is a much better example of oscillation induced by seasonal displacement. The chart shows Sydney Australia daily TMAX for the 20 years to 2000, with a moving average of 365 days in black, which looks completely reasonable because all the seasons line up, and then a moving average of 365+91=456 days, which oscillates like crazy. This is deliberately induced by using a roll which is one year plus one season. My point about diagnostics is that if such an oscillation is apparent when using a roll bounded by an annual unit (52 if by week or 365 if by day for annual; 520 if by week or 3652 if by day for decades, etc.) then that is a sure sign that the data is screwed up somewhere, whether by gaps, improper interpolation, or block displacement.

    This leads me to the opinion that working with daily data on annually bounded rolls is a lot safer than working with annual or monthly averages. It also has the side benefit of making it very easy to spot aberrant outliers by simply turning the roll off.
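    The oscillation is easy to reproduce on purely synthetic data (a clean annual sinusoid, no noise or trend):

```python
import numpy as np

def roll(x, w):
    """Centered moving average of window w; shortens the series by w - 1."""
    return np.convolve(x, np.ones(w) / w, mode="valid")

# Twenty years of synthetic daily TMAX: a clean annual cycle, nothing else.
t = np.arange(20 * 365)
tmax = 22 + 12 * np.sin(2 * np.pi * t / 365)

flat = roll(tmax, 365)    # whole seasons cancel: essentially constant
wobble = roll(tmax, 456)  # one year + one season: a residual oscillation

amp_flat = flat.max() - flat.min()        # ~0
amp_wobble = wobble.max() - wobble.min()  # several degrees peak-to-peak
```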

  37. tc
    Posted Apr 17, 2007 at 11:42 PM | Permalink

    Willis #36 – Here is one way to find the NSF Program Manager for a NSF study.

    STEP 1 – Go to NSF Award Search Introduction webpage where 4 types of search are described. Two types of searches of immediate interest are: 1) the Program Information search (Figure 3) that includes Program Officer, and 2) the Awardee information search where you can query by Principal Investigator and limit search to Active Awards Only, Expired Awards Only, or Historical Awards.


    STEP 2 – Click on “Search for Awards” near top of webpage, and go to NSF webpage that lists the steps to Search for Awards.

    STEP 3 – Click on “URL” under “1.”, and go to Award Search webpage.

    STEP 4 – To test the search function on the Award Search webpage, under the default Awardee Information tab, under Principal Investigator I entered Michael Mann. I accepted the default “Active Awards Only” checked box at the bottom of the page. Then I clicked “Search”.

    STEP 5 – The search results are then displayed at the bottom of the screen. For Michael Mann there are two awards listed, one for $100,000 and another for $299,761. The larger award is for “Analysis and Testing of Proxy-Based Climate Reconstructions” Award Number 0542356.

    STEP 6 – In the left column, click on the Award Number 0542356. Voilà! The Award Abstract page appears with the name of the Program Manager.

    STEP 7 – Go to NSF Staff Directory search page. Type in name of Program Manager to get contact information.

  38. Willis Eschenbach
    Posted Apr 18, 2007 at 1:54 AM | Permalink

    Thanks, tc, I appreciate it.


  39. tc
    Posted Apr 18, 2007 at 5:11 AM | Permalink

    Willis #39 – Regarding #38, I need to add, as the last line of Step 1:

    “In the left side table of contents, click on ‘Award Search and Proposal Deadline’.”

One Trackback

  1. […] ice core data, see also “Juckes, Yang, Thompson and PNAS: Guliya” (CA 12/3/06), “Dunde: Will the real Slim Shady please stand up?” (CA 4/12/07), “More Evasion by Thompson”. Possibly related posts: (automatically […]
