“Full, True and Plain Disclosure” and Falsification

"Full, true and plain disclosure" is a fundamental obligation in the offering of public securities. As someone with experience in this field, I’ve been reflecting for some time about the following questions:

Is there a duty of “full, true and plain disclosure” or its equivalent in science? If so, how is it expressed in journal policies and science codes of conduct? If not, should there be such a duty?

In our EE article earlier this year, we touched on this question, but we did not connect the question to the language of institutional codes of conduct. I do so here.

In offering securities to the public, stock promoters have an obligation of “full, true and plain disclosure” — with each word being important. For a public corporation, this is an ongoing obligation: a corporation must disclose material adverse results when they occur, not just when it is making a securities offering. The existence of a law doesn’t mean that people don’t break it, but at least there’s a standard to which people can be held accountable.

The hardest part is “full” disclosure. This means prompt disclosure of all relevant adverse information, and it is different from “don’t ask, don’t tell” disclosure. It’s not enough that everything in a prospectus is “true”; if something important is left out, the obligation of full, true and plain disclosure has not been met.

Not many corporate executives will participate in concocting overtly false information, but there are obviously many temptations even for essentially honest people not to disclose bad news or to cut corners. The usual rationalization is materiality: faced with bad results, there’s a real temptation to convince yourself that the adverse results didn’t “matter”, or alternatively to delay in the hope of getting some good results. But it’s a good idea to chin up and take your medicine. I’m very unsympathetic to the behavior of UCAR with respect to the Wahl and Ammann articles. UCAR issued a press release announcing the submission of two Wahl and Ammann articles (on the same day that Ross and I were making a presentation in Washington). GRL has since rejected the Wahl and Ammann submission, but UCAR has not announced this and has left the submission on the UCAR website without any comment. Academics seem unfazed by this, but, as someone with experience with speculative mining stocks, I am dumbfounded by it.

When you do a prospectus, you have to swear an affidavit that the prospectus contains full, true and plain disclosure. Your lawyer pauses over each word. For corporate officers and directors, the duty is very serious and exposes them to personal liability.

Academic Journals

One of the first things that I noticed when we submitted an article to Nature last year was that you had to sign an affidavit disclosing conflicts of interest, but you did not have to sign an affidavit warranting full, true and plain disclosure or its equivalent. (This was an anthropological observation about a different culture, if you will.)

I’ve recently examined the publication policies of Nature and Science (which are typical of science journals) and been unable to locate any explicit policy of full, true and plain disclosure or its equivalent. As far as I can tell, neither do the instructions to reviewers for science journals state that reviewers have any obligation to ensure full disclosure.

This is not to say that the journals encourage the opposite — merely that the issue is not front and center. Further, if scientists have a duty of full disclosure, it does not arise under policies of science journals for authors, but must arise elsewhere. There’s an interesting discussion of the role of journals in science misconduct here.

Institutional Policies

If journals do not have explicit policies of full, true and plain disclosure, such a duty, if it exists, must arise elsewhere. The U.S. federal government has a research code of conduct, as do most universities and professional societies. I’ve examined quite a few of these codes and most of them are surprisingly similar. I’ll look first at the research code of conduct of the U.S. federal government.

It defines three classes of scientific misconduct: fabrication, falsification and plagiarism, and states that misconduct does not include honest error or honest differences in interpretations or judgments of data. The definition of “falsification” is broader than one might think and covers omissions as well as overt invention of data (which is covered under “fabrication”). Here’s the U.S. federal definition:

Falsification is manipulating research materials, equipment, or processes, or changing or omitting data or results such that the research is not accurately represented in the research record [my bold]

It’s not hard to interpret this definition of “falsification”, which includes material omission, as placing a full disclosure obligation on scientists fairly comparable to the “full, true and plain disclosure” obligation on stock promoters. I find the term “falsification” pretty unhelpful relative to the term “full disclosure” in securities offerings. It seems to me that a scientist could easily read a prohibition against falsification as limited to positive acts (which are really dealt with under “fabrication”) and might not realize that selective omissions are a form of falsification. The term “full, true and plain disclosure” seems less likely to be misconstrued.

A definition of misconduct including the term “falsification” is very widespread among codes of conduct for other institutions and universities. Usually the above or a very similar definition of falsification is included, but not always: neither Stanford University, nor the National Science Foundation, nor a joint statement by the Director General of the Research Councils and the Chief Executives of the UK Research Councils defines falsification, while the University of Massachusetts and most universities do. The University of Massachusetts also has a clause prohibiting “misrepresentations in publication”, which appears to be broader than most university codes of conduct.

Case Studies

There is a considerable amount of U.S. case law involving the term “falsification”, much of it relating to application forms. Omission of material information in application forms is usually held to be “falsification”.

It’s hard to find much discussion of case studies involving omission of data by scientists, as opposed to cases involving plagiarism or making up data. This, in itself, is interestingly different from business situations, where omission of information is usually the issue and deserves an explanation. To someone with a business background, the apparent preoccupation of academics with plagiarism, relative to full disclosure, appears rather precious. To some extent, the battles are about vanity and personal “property” interests, rather than about protection of the public.

Here is one of the few cases that I could locate, considered under a somewhat different code of conduct:

Engineer A is performing graduate research at a major university. As part of the requirement for Engineer A to complete his graduate research and obtain his advanced degree, Engineer A is required to develop a research report. In line with developing the report, Engineer A compiles a vast amount of data pertaining to the subject of his report. The vast majority of the data strongly supports Engineer A’s conclusion as well as prior conclusions developed by others. However, a few aspects of the data are at variance and not fully consistent with the conclusions contained in Engineer A’s report. Convinced of the soundness of his report and concerned that inclusion of the ambiguous data will detract from and distort the essential thrust of the report, Engineer A decides to omit references to the ambiguous data in the report. Question: Was it unethical for Engineer A to fail to include reference to the unsubstantiative data in his report?

The case was considered under a code of ethics that is a little more detailed than most university codes of conduct, including the following provisions, but I don’t think that the differences are material to the conclusion:

Section II.3.a.:"Engineers shall be objective and truthful in professional reports, statements, or testimony. They shall include all relevant and pertinent information in such reports, statements, or testimony."
Section III.3.a.:"Engineers shall avoid the use of statements containing a material misrepresentation of fact or omitting a material fact necessary to keep statements from being misleading; statements intended or likely to create an unjustified expectation.
Section III.11.:"Engineers shall cooperate in extending the effectiveness of the profession by interchanging information and experience with other engineers and students, and will endeavor to provide opportunity for the professional development and advancement of engineers under their supervision."

The adjudicators concluded under section II.3.a that:

The engineer must be objective and truthful in his professional reports and must include all relevant and pertinent information in such reports. In the instant case, that would suggest that Engineer A had an ethical duty to include the unsubstantiative data in his report because such data were relevant and pertinent to the subject of his report. His failure to include them indicates that Engineer A may have exercised subjective judgment in order to reinforce the thrust of his report.

Here I think that the emphasis on “subjective judgement” is interesting. Under section III.3.a, they concluded:

In a sense, Engineer A’s failure to include the unsubstantiative data in his report caused his report to be somewhat misleading. An individual performing research at some future date, who relies upon the contents of Engineer A’s report, may assume that his results are unqualified, uncontradicted, and fully supportable. That may cause such future research to be equally tainted and may cause future researchers to reach erroneous conclusions.

Under section III.11, they concluded:

We do not see how Engineer A could be acting consistently with that provision by failing to include the unsubstantiative data in his report. By misrepresenting his findings, Engineer A distorts a field of knowledge upon which others are bound to rely and also undermines the exercise of engineering research. Although Engineer A may have been convinced of the soundness of his report based upon his overall finding and concerned that inclusion of the data would detract from the thrust of his report, such was not enough of a justification to omit reference to the unsubstantiative data. The challenge of academic research is not to develop accurate, consistent, or precise findings which one can identify and categorize neatly, nor is it to identify results that are in accord with one’s basic premise. The real challenge of such research is to wrestle head-on with the difficult and sometimes irresolvable issues that surface, and try to gain some understanding of why they are at variance with other results.

The Board concluded that it was unethical to fail to include the unsubstantiative data.

The Office of Research Integrity has several cases.

Some Questions to Readers

In our EE article, we drew attention to some areas in MBH98 where the disclosure practices seemed highly questionable, particularly the following:

1) Withholding the R2 and other verification statistics for the early steps. It seems to me that omission of the 15th century R2 (~0.0) was selective.
2) Withholding the results without the bristlecones (the CENSORED file).
3) Related to but distinct from 2, the claim that the MBH98 reconstruction was “robust” to the presence/absence of all dendroclimatic indicators, when the author knew that it was not robust even to the presence/absence of the bristlecones.
4) Withholding the information that the Gaspé series had been edited and that the editing of this series was material to the early 15th century result. Separate but related is the misrepresentation of the start date of this series.
5) The misrepresentation of the principal components method. In our EE article, we allowed that it was possible that this was simply a computer programming error, but realclimate argued in early January that it was not a computer programming error, but a misrepresentation.

I’ve tried to look at the issue of “full, true and plain disclosure” as it applies to scientists from first principles, because I’ve been puzzled by the anthropological differences in culture. As someone used to a “full, true and plain disclosure” standard, I find the apparent withholding of adverse results by MBH98 very disquieting, but this issue has occasioned almost no interest in academic communities (which are mostly interested in principal components methodologies, as far as I can judge from comments submitted to GRL). On the other hand, disclosure issues are instantly understandable to civilians. I think that academic practices on disclosure need upgrading, especially when it comes to scientific prospectuses such as IPCC assessment reports. Needless to say, archiving source code and data as used is a helpful audit trail in any event and easy to implement at a journal level.

Despite the lack of traction with academics, these disclosure issues have obviously attracted the interest of the House Energy and Commerce Committee, which is very familiar with disclosure and due diligence issues (after all, it held the Enron hearings). It will be interesting to see what happens. In the meantime, I would be interested in feedback on the more general issues pertaining to codes of conduct and journal practices.

Notes: see http://www.onlineethics.org/cms/7730.aspx


  1. Louis Hissink
    Posted Jun 28, 2005 at 5:38 PM | Permalink


    I wonder whether this whole issue is actually a case of pathological science, as defined by Langmuir some years back, where data are close to the limits of detection (here we have thermal changes on the order of tenths of a degree Celsius).

    “Langmuir described typical cases as involving such things as barely detectable causal agents observed near the threshold of sensation which are nevertheless asserted to have been detected with great accuracy. The supporters offer fantastic theories that are contrary to experience and meet criticisms with ad hoc excuses. And, most telling, only supporters can reproduce the results. Critics can’t duplicate the experiments.”


  2. Michael Jankowski
    Posted Jun 28, 2005 at 6:33 PM | Permalink

    As I’ve said before, it amazes me that the statistical validity of Mann’s reconstructions has been questioned far less than that of the relatively inconsequential graduate-school MS research of myself and my peers. Had I presented my results, however preliminary, as part of an informal lunchtime seminar series, the first questions I would’ve been asked would’ve had to do with verification statistics.

  3. Posted Jun 28, 2005 at 6:46 PM | Permalink

    Steve M.,

    Here is an interesting link,


    Near the bottom one of the posters has a list of all the code files at Mann’s FTP site. The root directory is


    and the following subdirectories have all the code,


    Just curious if this is all the source code or not?

  4. Steve McIntyre
    Posted Jun 28, 2005 at 7:13 PM | Permalink

    Steve V, I’ve been through all of these programs a long time ago. The TREE/ITRDB and TREE/STAHLE programs are just principal component calculations for the different networks. None of these calculations pertain to any of the matters in question. It’s pretty weird that a different Fortran program is shown for each network. You’d think that this stuff would be parameterized. I can do all these calculations in about 8 lines of R code.
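    To illustrate the point about parameterization (a sketch in Python rather than R, with random stand-in data in place of the real proxy matrices), a single routine can compute the PCs for every network rather than one program per network:

```python
import numpy as np

def network_pcs(X, n_pcs=2):
    """Centered principal components of a proxy network.

    X: (years x series) data matrix. Returns the leading
    n_pcs principal component time series, computed via SVD
    of the column-centered matrix.
    """
    Xc = X - X.mean(axis=0)              # center each series on its full-period mean
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :n_pcs] * s[:n_pcs]      # PC time series

# one parameterized routine handles every network; the network
# names come from the FTP listing, the matrices here are synthetic
rng = np.random.default_rng(0)
for name in ["ITRDB", "STAHLE"]:
    X = rng.normal(size=(581, 20))       # stand-in for a real proxy matrix
    pcs = network_pcs(X)
    print(name, pcs.shape)
```

    The same function runs for each network with only the input matrix changing, which is what "parameterized" means here.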

    The TREE/COMPARE code pertains to MBH99 and, in particular, to the coercion of the NOAMER PC1 to the Jacoby treeline series. This is his supposed adjustment for CO2 fertilization in MBH99, but it is a nonsense adjustment and isn’t applied to MBH98 anyway. I have no idea why it’s in the MBH98 archive.

    He didn’t use the GISP series in either study, so this doesn’t affect anything. I haven’t checked this program to see what it does, but I will do so sometime. The INSTR/PREC is another PC program, but he doesn’t use precipitation instrumental PCs; I’ll check it for standardization methods though.

    None of this is the requisitioned source code. When Mann said that he was not going to be “intimidated” into disclosing his source code – it was because he hadn’t disclosed his source code.

    Cheers, Steve Mc.

  5. Posted Jun 28, 2005 at 10:44 PM | Permalink

    Steve M.,

    First, let me say I wasn’t calling into doubt your claims, but just verifying that all those programs were something you had seen before.

    I also agree that it is weird to have separate programs for all the different series. It is either sloppy code writing or an attempt to backfill, or something. Since the same procedure is to be applied to each series, writing a single piece of code that grabs a different data set on each run makes the most sense.

  6. Paul Gosling
    Posted Jun 29, 2005 at 4:31 AM | Permalink


    1) It is quite reasonable to exclude anomalous data if you are reasonably confident of the cause of the anomaly.

    2) Even if you are not, if an anomalous piece of data is likely to result in the failure of a piece of work to get published because journals don’t generally publish negative results, then people will leave it out. No papers, no funding, no job.

    3) I am puzzled by your constant reference to business as an example of full and fair disclosure to which science should aspire. The news is constantly full of revelations that this or that business has falsified data to make its situation look better than it is. As it’s your field, I am sure you are aware of the recent downgrading of petroleum reserves by Shell, twice, in a year.

  7. Steve McIntyre
    Posted Jun 29, 2005 at 5:50 AM | Permalink

    Re #6 –
    Paul, you’re misconstruing my point. I’m pointing out that there are many temptations not to give full, true and plain disclosure, and that society long ago adopted a policy that, if you are a public corporation, this is a legal obligation. My point is that this provides a standard. Take the Shell downgrading (about which I don’t know anything): it would be entirely appropriate for a securities commission to investigate this downgrading and determine whether there were breaches of full, true and plain disclosure obligations. If there were, the chips should fall where they may. If you didn’t have a full, true and plain disclosure obligation, you would have no recourse. I made a similar point in my February National Post op-ed – see http://www.climateaudit.org/index.php?p=66.

    Apply your second point – "no papers, no funding, no job" to raising money for speculative mineral exploration and pose the following question to a mining promoter: "if an anomalous piece of data is likely to result in the failure of a prospectus to get cleared because securities commissions don’t generally clear offerings with problems, then people will leave it out. No prospectus, no funding, no job". What’s the difference? Well, when the stock offering goes south and people find out that you’ve left out relevant information, you can get in big trouble. You put the adverse information in. In fact, securities commissions understand that there are risks in speculative exploration and do not require that there be no risk – they require that the risks be disclosed. If journals have created an environment where scientists are failing to provide full disclosure, then maybe that’s the problem. What you’ve just said sure says that there is a problem. In passing, it’s struck me that a reviewer’s first obligation should be to ensure that there is full, true and plain disclosure, but that reflects my background. Cheers, Steve

  8. Steve McIntyre
    Posted Jun 29, 2005 at 6:11 AM | Permalink

    Steve V, I didn’t mean to seem chippy. It’s a fair enough question about these programs, but in our articles we have actually cited specific url’s for the code for the North American tree ring principal components calculation (we’ve tried to be quite exact in these references). So the internet chatter which ignores these prior references and pretends that the requested programs have always been in plain sight is a little irritating. I appreciate that you’ve clarified this point at debunkers, but it’s rampant elsewhere. BTW contra your debunkers post, I had been through all the principal components and COMPARE programs a long time ago. The only one that I haven’t been through is the GISP program.

    I entirely agree with the code criticism. In our EE article, I showed R code to do the Mann PC calculations; it took about 4 lines. Also, as we point out in our GRL article, when you write Fortran code, it’s easy to make mistakes. The short-segment standardization turns up deep in a do-loop in the program – it could easily have been an error.
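    For readers who haven’t followed the technical details, here is a minimal sketch (Python rather than R, with a random stand-in matrix; the year indices are illustrative) of the difference between conventional full-period centering and the short-segment variant, which centers only on a short calibration segment before computing PCs:

```python
import numpy as np

def leading_pc(X, center_rows=None):
    """First principal component of X (years x series).

    center_rows: row slice used to compute the column means.
    None -> conventional full-period centering; a short
    calibration-period slice mimics the short-segment variant.
    """
    rows = slice(None) if center_rows is None else center_rows
    Xc = X - X[rows].mean(axis=0)        # center on the chosen sub-period
    U, s, _ = np.linalg.svd(Xc, full_matrices=False)
    return U[:, 0] * s[0]                # leading PC time series

rng = np.random.default_rng(1)
X = rng.normal(size=(581, 50))           # synthetic proxies, nominally 1400-1980
conventional = leading_pc(X)             # mean taken over all 581 years
short_segment = leading_pc(X, slice(502, 581))  # mean over 1902-1980 only
```

    The two choices of centering period generally yield different leading PCs, which is why the location of that subtraction inside a do-loop matters.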

    I’m old enough that I took a course in Fortran programming at university (in 1967) and did some summer jobs involving Fortran programming. When I started back doing this research in 2003 (not having done any computer programming for over 30 years), I stumbled across R as a language, which was a very lucky turn of events. It is a remarkable language. I marvel at how much you can do. But when I confronted Mann’s Fortran code, because I’d learned Fortran when I was young, I could work through it pretty easily. One of the few advantages of being of a certain vintage.
    Cheers, Steve Mc

  9. Alex Avery
    Posted Jun 29, 2005 at 8:25 AM | Permalink

    Steve, I couldn’t agree more, having been involved in trying to rectify a couple of cases of critical “omissions” in environmental science (agriculture) in both Science and Nature. Albert Einstein has a perfect quote on this that I’ve posted on the wall above my desk — I first saw it this spring while entering the National Academy of Sciences new building in Washington DC to give a symposium lecture on the fallacies and misrepresentations of organic farming and food:
    “The right to search for truth implies also a duty; one must not conceal any part of what one has recognized to be true.”
    He was an incredibly wise man, on top of being ridiculously intelligent.
    Alex Avery
    Hudson Institute

  10. Posted Jun 29, 2005 at 11:58 AM | Permalink

    2) Even if you are not, if an anomalous piece of data is likely to result in the failure of a piece of work to get published because journals don’t generally publish negative results, then people will leave it out. No papers, no funding, no job.

    3) I am puzzled by your constant reference to business as an example of full and fair disclosure to which science should aspire. The news is constantly full of revelations that this or that business has falsified data to make their situation look better than it is. As its your field I am sure you are aware of the recent downgrading of petroleum reserves by Shell, twice, in a year.

    Funny, 2 & 3 look inconsistent to me. In 3, businesses falsifying data is bad, hence the disclosure requirements, but in 2 such requirements are apparently too stringent for science. Which is it? Either science should be held to high standards of disclosing all data, or scientists could, at least theoretically, get away with all kinds of fraud.

    Further, considering that the science in question could impact the daily lives of billions of people worldwide, isn’t it at least reasonable to see their work and not just their results? How much will global warming policies cost? Hundreds of billions? Maybe even trillions? The U.S. economy alone has $10 to $11 trillion GDP. Even an impact as little as 0.1% for 50 years is half a trillion dollars (not including economic growth). So what is actually more curious, Paul, is your willingness to take this stuff on little more than faith.

  11. John Hekman
    Posted Jun 29, 2005 at 12:18 PM | Permalink

    Steve McIntyre:
    The other thing that you have made quite clear, but which is always misrepresented by your attackers, is that you are not proposing an alternative PC model of temperature. PC is a terrible tool for doing what MBH did. PC has been used, in my experience, to sort out the contribution of a number of disparate variables. MBH use it to pick from a lot of data series, all measuring the same possible proxy for temperature, to find one that produces some correlation. This is flat wrong. In statistics, they say that a 95% confidence interval means that there is a 5% chance that the rejection of the null hypothesis is wrong. MBH hunt through more than 20 data series for the one that can produce significance. They find the anomaly!! (As you have shown so thoroughly.) It seems to me that what the dendrochronologists should do is to construct a model that can explain tree ring size with variables such as temperature, rainfall, sunspot cycle, regional dummies, altitude dummies and species dummies. The parameters of this model could be used (if significant) to back-cast temperature. Has this been done?

  12. Dave Dardinger
    Posted Jun 29, 2005 at 12:47 PM | Permalink

    Poor PC is getting way overloaded; Personal Computer, Political Correctness, Principal Component and probably more. We need to convert them all to TLAs to ease the mental burden.

  13. Posted Jun 29, 2005 at 3:41 PM | Permalink

    In statistics, they say that a 95% confidence interval means that there is a 5% chance that the rejection of the null hypothesis is wrong. MBH hunt through more than 20 data series for the one that can produce significance. They find the anomaly!!

    Uhhhmmmm, no this is a common misconception. Confidence intervals are a Frequentist concept, and in Frequentist statistics when you calculate a confidence interval (CI) there is no more randomness left. The data is not random. The parameter is not random. The critical values are not random. Hence, there is nothing in a CI that is random. Hence probabilistic statements about a CI are incoherent from a Frequentist perspective. The 95% confidence means just that, you are confident at the 95% level. That is, given a large number of such intervals with different data 95% of such intervals will contain the parameter of interest. Given any single CI the parameter of interest is either in the interval or not.

    What I think you are referring to is the concept of a P-value. A P-value of 5% means that you’d expect to get the observed result by chance 5% of the time. Thus if you had 101 series, designated one the dependent variable, and then ran 100 simple regressions of that dependent variable on the remaining 100, you’d expect about 5 of the regressions to come out statistically significant by chance alone.
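    That back-of-the-envelope claim is easy to check with a simulation (a Python sketch on synthetic data; 1.96 is the usual large-sample two-sided 5% critical value):

```python
import numpy as np

rng = np.random.default_rng(42)
n, trials = 200, 2000
y = rng.normal(size=n)                      # the "dependent" series

hits = 0
for _ in range(trials):
    x = rng.normal(size=n)                  # an unrelated candidate predictor
    r = np.corrcoef(x, y)[0, 1]
    t = r * np.sqrt((n - 2) / (1 - r**2))   # t-statistic for the regression slope
    if abs(t) > 1.96:                       # two-sided 5% threshold
        hits += 1

rate = hits / trials
print(f"false-positive rate: {rate:.3f}")   # should land near 0.05
```

    Even though every predictor is pure noise, roughly one in twenty comes out "significant", which is exactly the hazard of screening many series against one target.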

  14. DrKewp
    Posted Jun 29, 2005 at 10:50 PM | Permalink

    I’m the poster who published the list of source code over at the JREF. I posted the list of code as well as its location in response to a claim that Mann had not been forthcoming in providing his methodology, code or data. I demonstrated that this was not the case, as far as I was aware. The claim now is that some portion of the code is being withheld, but I had been unable to get a straight answer as to what it is exactly. I’ve since found some more details of the specifics of the challenge on this site.

    Steve M., I think (hope, actually) that much of your complaint stems from a basic ignorance of how modern scientific research operates. Mann et al. are under absolutely no obligation to provide their source code, notes and other ancillary data to you or anyone else, regardless of their funding sources. They are obligated, however, to release their data and methodology so that others may reproduce their work. Which they have clearly done, as the Wahl-Ammann reconstruction via original code has proven. This would simply not be possible if MBH were withholding elements of their methodology or experimental data. That others, including yourself, have been unable to reproduce MBH98’s results is more indicative of a general failure to adhere to the stated procedure.

    Now if you have evidence that MBH and Wahl-Ammann have committed fraud, either via deliberate falsification of data and/or collusion, please present it. Otherwise I would suggest a re-evaluation of your current methodology.

    Thank you for dropping by. I would urge you to read our 2005 GRL and EE papers (see links in right frame). In them, you will see a detailed discussion of the code at Mann’s FTP site, including specific references to the pca-noamer.f program. In fact, it was only by parsing through this program that we were able to see why Mann’s PC results were so out of line with our attempts to replicate his PC calculations: there was what appears to be a subscripting error, certainly not beyond the realms of possibility in something programmed in as ugly a fashion as these calculations. Mann has since tried to argue that the subscripting was intentional, even if the calculation is then not “standard”. This of course flies in the face of the representation in MBH98 that “conventional” PC methods were used. So in this area, Mann is on the horns of a dilemma: either he made an error or a misrepresentation.

    It has consistently been my position, that, whatever the fine points of academic protocol on disclosing source code, once Mann’s reconstruction is applied in large-scale social policy, it should be subject to engineering-level scrutiny. Mann himself was an IPCC author for the very section that highlighted his journal work. As far as I’m concerned, once he instigated/agreed to/acquiesced in the highlighting of his work by IPCC either at a chapter level or summary level, he waived whatever privilege he might have had for his code not to be examined.

    Further, important discrepancies between Mann’s prior accounts of methodology and his actual methodology have been proven, including both the discrepancies acknowledged in the Corrigendum, the unacknowledged discrepancies which caused Nature to require the issuance of completely new SI (I am unaware of prior precedents for this) and still further in the discrepancies observed in our GRL and EE articles. In business, people would long ago have lost patience with him and forced him to produce every line of code. Once you see one problem, you have to see if there are others. I am dumbfounded at the passive acquiescence of academics and academic journals in his continuing refusal to produce the balance of code so that it can be examined as well.

    It’s important to understand that most of the code for MBH98 is still missing.

    I examined the code in the MBH98/COMPARE directory at the FTP site a long time ago and determined that it does not pertain to MBH98. These programs pertain to a supposed “adjustment” to the NOAMER PC1 for CO2 fertilization that was done in MBH99 for the PC1 used in the 1000-1399 step. They do not pertain to MBH98 (there are problems with these calculations, but that’s another story).

    The programs at the FTP site do not contain code for any other steps of MBH98, e.g. the calculation of reconstructed temperature PC series from proxies, the calculation of the NH temperature index from reconstructed temperature PCs, the calculation of the number of tree ring PCs to retain under Preisendorfer’s Rule N (now said to have been used in MBH98), the selection of tree ring site chronologies according to the criteria said to have been used, etc., as listed here, here and here.

    As to Wahl and Ammann, I know exactly what they’ve done and not done, as I have replicated in detail exactly what they’ve done — see here. Their submission to GRL has been rejected. If they were subject to the standards applicable to mining promoters, having issued a press release that they had submitted articles to GRL and CC, they would have to announce that GRL had rejected their submission. At this point, their failure to announce the GRL rejection reflects very poorly on their ethical standards.

    Secondly, Wahl and Ammann’s replication did something very strange. Rather than using Mann’s temperature PCs (about which there was no active controversy — although von Storch et al 2004 pointed out a gross error in their failure to take the square root of cosine latitude prior to PC calculations), they used temperature PCs calculated using annual data. I re-did the calculations applying their algorithm to the original Mann temperature PCs, and the reconstructed RPCs in the 15th century were identical to 10 decimal places, as I’ve reported. There was a very slight difference in the re-scaling of the RPCs; on this point, Wahl and Ammann modified their algorithm in April 2005. I’ve made a slight modification to my algorithm and can completely replicate the WA results (but not the MBH results).
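    For reference, the square-root-of-cosine-latitude weighting that von Storch et al flagged is the standard area weighting for PC analysis of gridded fields: each gridcell series is scaled by the square root of the cosine of its latitude before the SVD, so that equal areas contribute equally. A minimal sketch (function and variable names are mine, for illustration only, not taken from any of the codes discussed here):

    ```python
    import numpy as np

    def area_weighted_pcs(field, lats_deg, n_pcs=3):
        """Leading PCs of a (time x gridcell) field, with sqrt(cos(latitude))
        area weighting applied to each gridcell series before the SVD."""
        x = field - field.mean(axis=0)              # center each gridcell series
        w = np.sqrt(np.cos(np.deg2rad(lats_deg)))   # one weight per gridcell
        u, s, vt = np.linalg.svd(x * w, full_matrices=False)
        return u[:, :n_pcs] * s[:n_pcs]             # time series of leading PCs
    ```

    A 10-decimal-place agreement check between two sets of RPCs, of the kind described above, is then simply np.allclose(rpc_a, rpc_b, atol=1e-10).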

    Further, the available record has changed substantially during this long drawn-out procedure. Wahl and Ammann use an extensive amount of information from the Corrigendum Supplementary Information, which did not exist until July 2004 and which was only created in response to our work. The MBH98 directory now publicly accessible at Mann’s FTP site was almost certainly not publicly indexed or available prior to our work. There were no links or public references to this directory, although Mann linked to many other directories at his webpage; both Mann and Rutherford were unaware of this directory in April 2003 and again in September 2003; Mann directed us to a different directory on his FTP site (now deleted). My belief is that this directory was only made publicly indexed in late Oct/early Nov 2003.

    You argue that Mann et al are only obliged to provide sufficient information about their methodology to permit replication, and that the replication by Wahl and Ammann is proof. But Wahl and Ammann make no attempt to replicate many steps in MBH98. I’ve listed a number of examples where application of the reported methodology does not yield the reported results, e.g. selection of gridcells, selection of tree ring chronologies, selection of tree ring PCs, reconstructed RPCs, etc. Because these matters cannot be replicated using the verbal descriptions, I have sought to see the source code that supposedly generated these results.

    I had long ago emulated MBH results to the degree that WA have. I am intrigued by the remaining differences. Why do they exist? I’m a big believer in examining details as sometimes they direct you to more substantial issues. There is ample precedent for this in scientific history. No self-respecting scientist can look at two squiggly lines and argue that the details “do not matter”. There are a host of unresolved details — what accounts for these differences? Who knows — but there’s no reason not to find out. I think that we’ve already identified the biggest defects of MBH98, but the reluctance of Mann et al to disclose the bulk of their code makes me wonder whether there’s something still in the cupboard.
    WA also do not address the issues that we identified as among the most important: the catastrophic failure of the 15th century MBH98 reconstruction on the R2 and other verification statistics, and its non-robustness to the presence/absence of bristlecone pines.
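    The verification statistics at issue are simple to state. For a verification period with observations y and reconstruction y-hat, R2 is the squared Pearson correlation, while the RE (reduction of error) score is 1 minus the ratio of the reconstruction’s sum of squared errors to that of the calibration-period mean used as a constant predictor; RE > 0 means the reconstruction beats climatology. A minimal sketch (my own function, stated only to fix the definitions):

    ```python
    import numpy as np

    def verification_stats(obs, recon, calib_mean):
        """R2 (squared correlation) and RE (reduction of error) over a
        verification period; calib_mean is the calibration-period mean."""
        obs = np.asarray(obs, dtype=float)
        recon = np.asarray(recon, dtype=float)
        r2 = np.corrcoef(obs, recon)[0, 1] ** 2
        sse = np.sum((obs - recon) ** 2)
        sse_clim = np.sum((obs - calib_mean) ** 2)
        re = 1.0 - sse / sse_clim
        return r2, re
    ```

    The point of reporting both is that a reconstruction can post a respectable RE while having a verification R2 near zero, which is precisely the failure mode in dispute here.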

    As I mentioned above, one should be able to do an engineering-level replication of MBH given its application in social policy. This is the reason for requiring close examination of all details — not just because of concerns about fraud. There are plenty of other types of defect aside from fraud, e.g. mistakes, unintentional misdescriptions, etc.

    I’ve posted extensively on these matters and you can see comments about Wahl and Ammann in the May archives. I hope that this is helpful to you and to your chatline. Steve Mc.

  15. Paul Gosling
    Posted Jun 30, 2005 at 5:02 AM | Permalink

    Steve V Re 10

    I was merely pointing out the way science publishing and funding works. In most cases it does not matter, as the falsified or poor work will be forgotten once proved wrong. Climate science may be different because of what lies at stake, and I have suggested elsewhere on this blog that maybe it should undergo a different review process. This would of course cost a lot more.

    As for costs, what are our priorities? The US government has spent over $200 billion in Iraq and looks like being there for another 10 years.

  16. Stephan Harrison
    Posted Jun 30, 2005 at 8:17 AM | Permalink

    Am I right in thinking that it doesn’t matter one bit if MBH98 is flawed, since it’s clear that AGW is happening, and similar results have been found using a number of proxy indicators?

  17. Posted Jun 30, 2005 at 11:49 AM | Permalink

    I was merely pointing out the way science publishing and funding works. In most cases it does not matter, as the falsified or poor work will be forgotten once proved wrong. Climate science may be different because of what lies at stake, and I have suggested elsewhere on this blog that maybe it should undergo a different review process. This would of course cost a lot more.

    I think the current review process is probably fine, but it should also include the release of all code, data, etc. I agree that for most scientific research this is probably overkill, since much of the research will have little or no meaningful impact. GW/CC research, on the other hand, could conceivably come with a very high price tag, is often funded with taxpayers’ dollars, and probably should face a higher level of scrutiny.

    Clearly, based on McIntyre and McKitrick’s research, there are some issues with this research, and releasing the relevant information (if it still exists) would go a long way towards resolving them.

    As for costs, what are our priorities? The US government has spent over $200 billion in Iraq and looks like being there for another 10 years.

    Quite true, but even if the Iraq policy is completely foolish and unsupportable, does that mean we should run full steam ahead into other completely foolish and unsupportable policies? This sounds like 3rd grade playground logic. “But he did it fiiiiiiirst!”

    Am I right in thinking that it doesn’t matter one bit if MBH98 is flawed, since it’s clear that AGW is happening, and similar results have been found using a number of proxy indicators?

    Granted, there is a lot that goes into the IPCC reports, but the Hockey Stick is one of the most important pieces of evidence used in presentations on GW/CC. Further, it was a main component of the TAR (the IPCC’s Third Assessment Report), and Mann himself was one of the lead authors on the TAR. So, if the Hockey Stick turns out to be unsupported, then it does have bad implications for at least part of the TAR. It also means an important piece of evidence just fell apart. It doesn’t mean there is no GW/CC, but the evidence in favor of that hypothesis would be weakened. This latter conclusion is undeniable. So while it might not “prove there is no GW/CC”, it sure would change the nature of the debate, IMO.


    Did you finally read MBH98? Did you find the section that mentions Preisendorfer?

  18. Michael Jankowski
    Posted Jun 30, 2005 at 11:53 AM | Permalink


    “Am I right in thinking that it doesn’t matter one bit if MBH98 is flawed…”

    That’s a cry of retreat being heard more and more often, as the flaws in MBH98 become clearer and clearer.

    The hockey stick is often used to show that the 20th century is the warmest of the past 1000 yrs, with the 1990s likely the warmest decade. Because of those conclusions (and the otherwise flat shape of the hockey stick), MBH98 is often used to justify the claim that a significant portion of 20th century warming was anthropogenic.

    “…since it’s clear that AGW is happening…”

    If it’s that clear without MBH98, why is the hockey stick so widely used to promote the idea AGW is happening? Can you tell us how much of the 20th century warming was due to AGW?

    “…and similar results have been found using a number of proxy indicators?”

    Many of the reconstructions coming to similar conclusions to MBH were far from independent, with either Mann, Bradley, or Hughes as contributing authors. And at least one of the “proxy indicators” itself raises issues. For example, the authors of MBH98 did a reconstruction which excluded the bristlecone pine proxies. Without the bristlecone pine data, MBH98 didn’t get the hockey stick (a finding MBH98 apparently suppressed). You can find these discussed in detail elsewhere on this site.

  19. Peter Hartley
    Posted Jun 30, 2005 at 1:25 PM | Permalink

    “Am I right in thinking that it doesn’t matter one bit if MBH98 is flawed…” If you look at Figure 12 in the Veizer Geoscience Canada, March 2005 paper (mentioned on this site in another thread and available here), you can see why the MBH view of climate history over the past 1000 years is so important to the IPCC thesis that CO2 is the principal driver of average global temperature. The path of CO2 change over that period surely follows a “hockey stick” shape. If temperature does not, the IPCC thesis is in trouble. Hence, even if the mechanics in MBH are wrong and the paper is invalid, the IPCC needs the conclusions MBH reached to be right.

  20. Jo Calder
    Posted Jun 30, 2005 at 3:29 PM | Permalink

    On “Institutional Policies” — it seems that codes of conduct for academics are often the purview of their professional bodies. I guess a career as a mathematician (AMS code of conduct) or whatever is typically longer than employment at a particular institution.

  21. Greg F
    Posted Jun 30, 2005 at 3:30 PM | Permalink


    Am I right in thinking that it doesn’t matter one bit if MBH98 is flawed, since it’s clear that AGW is happening, and similar results have been found using a number of proxy indicators?

    I don’t think so.

    Seeing the Wood from the Trees
    Keith R. Briffa and Timothy J. Osborn

    An uninformed reader would be forgiven for interpreting the similarity between the 1000-year temperature curve of Mann et al. and a variety of others also representing either temperature change over the NH as a whole or a large part of it (see the figure) as strong corroboration of their general validity, and, to some extent, this may well be so. Unfortunately, very few of the series are truly independent: There is a degree of common input to virtually every one, because there are still only a small number of long, well-dated, high-resolution proxy records.

  22. stephan harrison
    Posted Jul 1, 2005 at 2:37 AM | Permalink

    Thanks for your comments re my post number 16. #21 says that “very few of the series are truly independent. There is a degree of common input to virtually every one”. However, this doesn’t account for Oerlemans’ curve derived from the retreat of mountain glaciers worldwide. At a much smaller scale, we (or rather one of the research group) have produced work from the Tien Shan mountains in Kazakhstan using tree rings which shows anomalous recent warming, and this is reflected in the behaviour of Tien Shan mountain glaciers. This work has yet to be peer-reviewed. I agree that the latter do not constitute long, well-dated, high-resolution proxy records, but would argue that the weight of evidence points very strongly towards anomalous recent warmth in the context of the late Holocene, and it seems like a remarkable coincidence that this is associated with historically high levels of anthropogenic CO2 emissions.

  23. Ed Snack
    Posted Jul 1, 2005 at 5:04 AM | Permalink

    Stephan, thanks for the information; I am sure most posters would look forward to reviewing your group’s data when it is available. I would point out that glaciers retreat (and advance) at least as much on precipitation changes as on temperature changes. Large-scale precipitation variations are known to have occurred during the Holocene.

    I also think that what is being disputed on this site, with reference to MBH98 and associated reconstructions, is whether your phrase “the weight of evidence points very strongly towards anomalous recent warmth in the context of the late Holocene and it seems like a remarkable coincidence that this is associated with historically high levels of anthropogenic CO2 emissions” is as well supported as you seem to believe. It is not clear that there have not been periods of significantly greater warmth than today in the reasonably recent past (say the last 3-5,000 years). MBH98 (extended in MBH99) attempts to show that 20th century warming is unprecedented over the last 1,000 years, but M&M’s work in particular throws that reconstruction into doubt.

    I cannot speak for all posters, but I suggest that many do not dispute that there has been some warming through the 20th century. In fact most climate “skeptics” truly believe that climate varies on all time scales, which is exactly why many find the sudden attribution of all of the 20th century warming to CO2 to be a problem. Looking at the temperature record (which may or may not be entirely accurate, but accept it for now), there is significant warming from the late 19th century until the late 1930’s, without much increase in CO2. Then there is a period of significant cooling through until the late 1970’s, when CO2 was increasing markedly, then a renewed warming trend from the late 70’s on, apparently peaking around 1998 with little change (on a short timescale) since. It may be a coincidence, it may not, but models of CO2 warming are not evidence. The warming up to the 1930’s is generally accepted as being solar driven (the IPCC, for example, accepts this as the likely explanation), not because of any specific evidence, but because it is plausible. This could also be true for 1970 onward; there is some evidence, not enough but some. Quite likely some of the recent warming is solar, but how much is unclear.

  24. fFreddy
    Posted Jul 1, 2005 at 6:04 AM | Permalink

    On the subject of glacial retreat, have you seen “The Green Alps theory” –

    Short version – as the alpine glaciers retreat, they are finding old trees which are carbon dated to show that, about 10 times since the last ice age, the glaciers were smaller than today. In particular, about 7,000 years ago, the Alps were practically glacier-free.

    Any thoughts ?

  25. fFreddy
    Posted Jul 1, 2005 at 6:23 AM | Permalink

    Stephan, on the subject of glacial retreat, are you familiar with the “Green Alps” theory – see :


    Short version : as the glaciers retreat, the University of Innsbruck is finding that there used to be trees there. Carbon dating indicates that the trees date to 10 separate occasions since the last ice age. In particular, there was a time 7000 years ago when there were practically no glaciers at all.
    Any thoughts ?

  26. stephan harrison
    Posted Jul 1, 2005 at 8:37 AM | Permalink

    Thanks for your comments Ed. I’ll limit my comments mainly to the glacier record (since this is my research area). It seems to me unlikely that the worldwide recession of mountain glaciers can be attributed to precipitation changes, since these would also have to be worldwide, and, as far as I am aware, there is no evidence for this. It seems clear to me that the glaciers are responding to something, and their behaviour would appear to be in line with the majority view of GW. Glacier behaviour also argues against the idea of urban heat islands having contaminated the surface records: in Patagonia and Kazakhstan, where I work, the glaciers are nowhere near urban areas, and the same goes for most other mountain glaciers, I would guess.

    As far as I understand it, the temperature changes throughout the 20th century can be explained by ‘natural’ factors; it’s only the temperature rise of the last 15 or 20 years or so which appears to be anomalous and most likely related to increases in CO2 emissions. I go along with Occam’s razor on this one!

  27. stephan harrison
    Posted Jul 1, 2005 at 9:56 AM | Permalink

    fFreddy, I’m afraid the link didn’t work. However, I think I’ve read some of the material. As far as I can tell, the fact that glaciers were smaller in the Alps 7000 years ago than they are now doesn’t really argue against AGW theory, since it appears the forcings were different and present warming may ‘only’ be anomalous for the last 1 or 2 thousand years. This view would collapse of course if MBH were to reconstruct climate as far back as that and argue that the present was anomalous!

  28. fFreddy
    Posted Jul 1, 2005 at 10:52 AM | Permalink

    Whoops, my apologies for the double post; moderators, please feel free to zap #24 and this.

  29. Michael Jankowski
    Posted Jul 1, 2005 at 11:16 AM | Permalink

    “…Glacier behaviour also argues against the idea of urban heat islands having contaminated the surface records…”

    I don’t think anyone claims 100% of the recorded 20th century warming was due to urban heat island contamination. But is there significant contamination? It’s quite possible. Glacier retreat doesn’t dismiss this idea; it simply shows it’s not the only explanation for the recorded warming.

    “..it’s only the temperature rise of the last 15 or 20 years or so which appears to be anomalous and most likely related to increases in CO2 emissions…”

    This raises an interesting question. The IPCC suggested in 2001 that glacial retreat trails warming on the scale of decades:
    “…Glaciers are generally not in equilibrium with the prevailing climatic conditions and a more refined analysis should deal with the different response times of glaciers which involves modelling (Oerlemans et al., 1998). It will take some time before a large number of glaciers are modelled. Nevertheless, work done so far indicates that the response times of glacier lengths shown in Figure 2.18 are in the 10 to 70 year range…”

    So if you believe the 2001 IPCC conclusion and also believe only the warming of the last 15-20 yrs is tied to anthropogenic warming, then the glacial retreat you have observed should have only been abnormal for the last 5-10 years at the most.

    If you go back 15-20 yrs, here’s a borehole study listed as “Tienshan, Kazakhstan,” nsidc.org/data/docs/fgdc/ggd499_boreholes_kazakh/ which doesn’t really show any warming from 1986-1992. It’s tough to draw conclusions from only two boreholes, of course!

    I also found this abstract, which seems to suggest Kazakhstan had suffered a severe 3-decade drought through 1989 with no observed temperature trend, although 1989 is borderline in your 15-to-20 yr window: http://www.geosc.psu.edu/~lmontand/Documents/Summary_Aral.pdf

    Lastly, I came across this article concerning glaciers in “northern Tien Shan, Kazakhstan” boris.qub.ac.uk/ggg/papers/full/2005/rp012005/rp01.pdf . If you look at Fig 4b, the air temps don’t seem to suggest any warming of significance over more than the past half century. The 10-yr running mean ends right where it started. Granted, it’s generally warmer now than in the 1950s-70s, but no warmer than pre-1940, when anthropogenic GHG emissions were relatively minimal, and temperatures have been relatively stable (based on the running mean) since the late 1970s.

  30. Posted Jul 1, 2005 at 1:25 PM | Permalink

    I would like to point out that four independent research groups, using different methods, have found that more solar radiation reaches Earth’s surface and less is reflected back into space now than at the end of the 1980’s. Three of them published papers recently in the same issue of Science. This could explain most of the recent warming and melting glaciers. No AGW is needed. In fact, the increase in incoming solar radiation is so large that the warming should have been much larger than observed, even if a small climate sensitivity is used.

  31. fFreddy
    Posted Jul 1, 2005 at 6:12 PM | Permalink

    Bother, sorry, let’s try again :
    Green Alps link here.
    The evidence seems to show that there have been significant warmings – to higher temperatures than today – every thousand years or so. The event 7000 years back was the most extreme case, but not the only one.

    The climate alarmists’ argument at its simplest is:
    1 – We observe record high CO2 levels at present
    2 – The hockey stick “shows” record high temperatures at present
    3 – “Therefore” the CO2 is causing the temperature.

    M&M’s excellent work knocks out point 2, and hence returns point 3 to its status of distinctly unproven.

    I am a trifle suspicious of any argument based on “forcings”. (This is a polite phrasing, in deference to the tone of this site.) The climate modellers are still feeling their way towards an underlying theory of what makes climate work; they are nowhere near a solid theoretical basis. To dismiss hard evidence based on such fluff requires a degree of faith that I, for one, do not possess.

    If the Innsbruck work is confirmed, and replicated at other glacier sites world-wide, it will provide very solid, easily comprehensible evidence that current warming is nothing terribly unusual – i.e., it will demonstrate that point 2) above is not just unproven, but distinctly false.

    Ref the Aral Sea, the reason this got drained is that Khrushchev had this dotty idea about making Central Asia into a giant cotton field, so he diverted all the rivers that drained into the Aral. He didn’t get much cotton, and the Aral shrank to something like 20% of what it used to be. It is supposed to be the only man-made ecological disaster that is visible from space, or some such.
    The Aral Sea is about a time-zone away from Almaty and the TienShan. I don’t know if that is too far to affect precipitation patterns.

  32. Posted Jul 1, 2005 at 6:13 PM | Permalink

    Re #22

    The story in Der Spiegel works here

    Moreover, most of the high latitude glaciers (Alaska, Greenland, Norway) had their largest retreat in the 1930-1945 period, when temperatures were as high (or even higher) than today at the places of interest, see e.g. the temperature record of all stations around Greenland.

    Even the graph of the glacier lengths since 1500 by Oerlemans (on RealClimate, upper right of the page) shows that the retreating trends for most of the glaciers were larger in the first half than in the second half of the previous century. Nearly half of them have a slower retreat now (see the Alpine glaciers) or even an advance (Norway/Sweden/New Zealand)… Thus the relation with AGW is far from perfect…

  33. Roger Bell
    Posted Jul 1, 2005 at 8:42 PM | Permalink

    Firstly, I’ve been one of the authors on some papers published in Nature and, unlike Steve, I’ve never been asked to sign an affidavit. This must have been introduced fairly recently and I wonder why.
    Secondly, does everyone realise that there is a cost for publishing in many American scientific journals — something of the order of $100 per page? This can lead to people being included as authors of papers simply because they have the funds to pay the charges while the true authors don’t. I mention this because some journals, quite reasonably, want the author list restricted to those who did the science.

  34. Michael Mayson
    Posted Jul 4, 2005 at 12:10 AM | Permalink

    Slightly off topic:
    July 8, 2005 –
    Joint CGD-ISSE & CIRES Seminar – Panel Discussion “Hockeysticks, the tragedy of the commons and sustainability of climate science.”
    Panelists: Warren Washington, Caspar Ammann and Doug Nychka, NCAR, and Roger Pielke Jr.

    (see http://sciencepolicy.colorado.edu/prometheus/archives/climate_change/000477upcoming_talk_and_pa.html)
    It would be great if someone frequenting this blog could attend and report!

  35. Murray Duffin
    Posted Jul 4, 2005 at 3:12 PM | Permalink

    Just a few personal observations re glaciers.
    The alpine glacier surface will soften and melt under bright sunshine, even when the surface air temperature is some degrees below the freezing point. Conversely, the surface will remain crisp under heavy overcast, even when the surface air temperature only a few inches above the ice is modestly above the melting point. An increase in sunshine due to reduction in cloud cover is more likely to cause glacier retreat than a temperature increase of <1 degree C. An increase in sunshine due to reduced cloud cover is also likely to be coincident with a reduction in precipitation. Thus solar activity is more likely to be the cause of glacier retreat than is AGW.
    Swiss alpine glaciers have been in retreat since ca 1850, and in terms of length, most of the retreat took place well before 1970. However, part of the reason for the slowing of the retreat is that the face keeps moving to higher altitudes, where the temperature change (decrease) is quite significant. Alaskan glaciers had their maximum advance about 1700, with Captain Cook having noted that the ice extended to the sea, i.e. no open fjords. Most of these retreats were at sea level, and in terms of volume (if memory serves) >80% of the retreat took place before 1950. Of course, again the retreat would be expected to slow when the glacier face finally got above sea level.
    As has been noted above, the UHI effect is taken by most skeptics to have exaggerated the reported average temperature increase, not to have accounted for it. There is no obvious cause-and-effect link between AGW and glacier retreat, and UHI has nothing to do with the argument. Murray Duffin

  36. Ed Snack
    Posted Jul 5, 2005 at 2:26 AM | Permalink

    Re #34, Michael, page not found? Something happened, or a problem with the url?

  37. J. Sperry
    Posted Jul 5, 2005 at 2:16 PM | Permalink

    Re #36, it’s the pesky last character, Ed. Delete the end parenthesis in Michael’s link and it works OK.

  38. Ed Snack
    Posted Jul 5, 2005 at 6:54 PM | Permalink

    Re #37, thanks, that is the trick; I should have realised by looking that a “)” doesn’t normally belong after “.html”.

  39. Michael Mayson
    Posted Jul 5, 2005 at 7:45 PM | Permalink

    Re #36 & #37 Sorry about the link – are you going to go, Ed?

  40. Ed Snack
    Posted Jul 6, 2005 at 3:34 PM | Permalink

    Michael, it’s a bit far away for me, though I would certainly like to go, and if no one else did, ask Ammann about his verification statistics and the other questions posed by Steve in the other thread. I would also like to ask if anyone there thought the Bristlecone Pines were responding to temperature increases, and if not, why they were included in the first place, with time for some semi-snide comments on the Gaspe Cedars, Polar Urals, and Tasmanian Huons while about it.

  41. TCO
    Posted Sep 20, 2005 at 12:41 AM | Permalink

    1, 2, and 3 are all shameful. 4 and 5 might be inadvertent.

One Trackback

  1. […] about the failure of academics take withholding adverse results as seriously as they should in a 2005 CA post (long before the present controversy) as follows: It’s hard to find much discussion of case […]
