"Full, true and plain disclosure" is a fundamental obligation in the offering of public securities. As someone with experience in this field, I’ve been reflecting for some time about the following questions:
Is there a duty of “full, true and plain disclosure” or its equivalent in science? If so, how is it expressed in journal policies and science codes of conduct? If not, should there be such a duty?
In our EE article earlier this year, we touched on this question, but we did not connect the question to the language of institutional codes of conduct. I do so here.
In offering securities to the public, stock promoters have an obligation of “full, true and plain disclosure”, with each word being important. For a public corporation, this is an ongoing obligation: a corporation must disclose material adverse results when they occur, not just when it is doing a securities offering. The existence of a law doesn’t mean that people don’t break it, but at least there’s a standard to which people can be held accountable.
The hardest part is “full” disclosure. This means prompt disclosure of all relevant adverse information, which is different from “don’t ask, don’t tell” disclosure. It’s not enough that everything in a prospectus is “true”; if something important is left out, the obligation of full, true and plain disclosure has not been met.
Not many corporate executives will participate in concocting overtly false information, but there are obviously many temptations, even for essentially honest people, not to disclose bad news or to cut corners. The usual rationalization is materiality: faced with bad results, there’s a real temptation to try to convince yourself that the adverse results didn’t “matter”, or alternatively to delay in the hope of getting some good results. But it’s a good idea to chin up and take your medicine. I’m very unsympathetic to the behavior of UCAR with respect to the Wahl and Ammann articles. They issued a press release announcing the submission of two Wahl and Ammann articles (on the same day that Ross and I were making a presentation in Washington). GRL has since rejected the Wahl and Ammann submission, but UCAR has not announced this and has left the submission on the UCAR website without any comment. Academics seem unfazed by this, but, as someone with experience with speculative mining stocks, I am dumbfounded by it.
When you do a prospectus, you have to swear an affidavit that the prospectus contains full, true and plain disclosure. Your lawyer pauses over each word. For corporate officers and directors, the duty is very serious and exposes them to personal liability.
One of the first things that I noticed when we submitted an article to Nature last year was that you had to sign an affidavit disclosing conflicts of interest, but you did not have to sign an affidavit warranting full, true and plain disclosure or its equivalent. (This was an anthropological observation about a different culture, if you will.)
I’ve recently examined the publication policies of Nature and Science (which are typical of science journals) and been unable to locate any explicit policy of full, true and plain disclosure or its equivalent. As far as I can tell, neither do the instructions to reviewers for science journals state that reviewers have any obligation to ensure full disclosure.
This is not to say that the journals encourage the opposite — merely that the issue is not front and center. Further, if scientists have a duty of full disclosure, it does not arise under policies of science journals for authors, but must arise elsewhere. There’s an interesting discussion of the role of journals in science misconduct here.
If journals do not have explicit policies of full, true and plain disclosure, such a duty, if it exists, must arise elsewhere. The U.S. federal government has a research code of conduct, as do most universities and professional societies. I’ve examined quite a few of these codes and most of them are surprisingly similar. I’ll look first at the research code of conduct of the U.S. federal government.
It defines three classes of scientific misconduct: fabrication, falsification and plagiarism, and states that misconduct does not include honest error or honest differences in interpretations or judgments of data. The definition of “falsification” is broader than one might think and covers omissions as well as overt invention of data (which is covered under “fabrication”). Here’s the U.S. federal definition:
Falsification is manipulating research materials, equipment, or processes, or changing or omitting data or results such that the research is not accurately represented in the research record [my bold]
It’s not hard to interpret this definition of “falsification”, which includes material omission, as placing a full disclosure obligation on scientists fairly comparable to the “full, true and plain disclosure” obligation on stock promoters. I find the term “falsification” to be pretty unhelpful relative to the term “full disclosure” in securities offerings. It seems to me that a scientist could easily interpret a prohibition against falsification as being limited to the prohibition of positive acts (which is really dealt with under “fabrication”) and that he might not realize that selective omissions were a form of falsification. The term “full, true and plain disclosure” seems less likely to be misconstrued.
A definition of misconduct including the term “falsification” is very widespread among codes of conduct for other institutions and universities. Usually the above or a very similar definition of falsification is included (but not always). For example, neither Stanford University, nor the National Science Foundation, nor a joint statement by the Director General of the Research Councils and the Chief Executives of the UK Research Councils defines falsification, while the University of Massachusetts and most universities do. The University of Massachusetts also has a clause prohibiting “misrepresentations in publication”, which appears to be broader than most university codes of conduct.
There is a considerable amount of U.S. case law involving the term “falsification”, much of it relating to application forms. Omission of material information in application forms is usually held to be “falsification”.
It’s hard to find much discussion of case studies involving omission of data by scientists, as opposed to cases involving plagiarism or making up data. This, in itself, is interestingly different from business situations, where omission of information is usually the issue, and the difference deserves an explanation. To someone with a business background, the apparent preoccupation of academics with plagiarism, relative to full disclosure, appears rather precious. To some extent, the battles are about vanity and personal “property” interests, rather than about protection of the public.
Here is one of the few cases that I could locate, considered under a somewhat different code of conduct:
Engineer A is performing graduate research at a major university. As part of the requirement for Engineer A to complete his graduate research and obtain his advanced degree, Engineer A is required to develop a research report. In line with developing the report, Engineer A compiles a vast amount of data pertaining to the subject of his report. The vast majority of the data strongly supports Engineer A’s conclusion as well as prior conclusions developed by others. However, a few aspects of the data are at variance and not fully consistent with the conclusions contained in Engineer A’s report. Convinced of the soundness of his report and concerned that inclusion of the ambiguous data will detract from and distort the essential thrust of the report, Engineer A decides to omit references to the ambiguous data in the report. Question: Was it unethical for Engineer A to fail to include reference to the unsubstantiative data in his report?
This case was considered under a code of ethics that was somewhat more detailed than most university codes of conduct, including the following terms, but I don’t think that the differences are material to the conclusion:
Section II.3.a.:"Engineers shall be objective and truthful in professional reports, statements, or testimony. They shall include all relevant and pertinent information in such reports, statements, or testimony."
Section III.3.a.:"Engineers shall avoid the use of statements containing a material misrepresentation of fact or omitting a material fact necessary to keep statements from being misleading; statements intended or likely to create an unjustified expectation."
Section III.11.:"Engineers shall cooperate in extending the effectiveness of the profession by interchanging information and experience with other engineers and students, and will endeavor to provide opportunity for the professional development and advancement of engineers under their supervision."
The adjudicators concluded under Section II.3.a that:
The engineer must be objective and truthful in his professional reports and must include all relevant and pertinent information in such reports. In the instant case, that would suggest that Engineer A had an ethical duty to include the unsubstantiative data in his report because such data were relevant and pertinent to the subject of his report. His failure to include them indicates that Engineer A may have exercised subjective judgment in order to reinforce the thrust of his report.
Here I think that the emphasis on “subjective judgment” is interesting. Under Section III.3.a, they concluded:
In a sense, Engineer A’s failure to include the unsubstantiative data in his report caused his report to be somewhat misleading. An individual performing research at some future date, who relies upon the contents of Engineer A’s report, may assume that his results are unqualified, uncontradicted, and fully supportable. That may cause such future research to be equally tainted and may cause future researchers to reach erroneous conclusions.
Under Section III.11, they concluded:
We do not see how Engineer A could be acting consistently with that provision by failing to include the unsubstantiative data in his report. By misrepresenting his findings, Engineer A distorts a field of knowledge upon which others are bound to rely and also undermines the exercise of engineering research. Although Engineer A may have been convinced of the soundness of his report based upon his overall finding and concerned that inclusion of the data would detract from the thrust of his report, such was not enough of a justification to omit reference to the unsubstantiative data. The challenge of academic research is not to develop accurate, consistent, or precise findings which one can identify and categorize neatly, nor is it to identify results that are in accord with one’s basic premise. The real challenge of such research is to wrestle head-on with the difficult and sometimes irresolvable issues that surface, and try to gain some understanding of why they are at variance with other results.
The Board concluded that it was unethical to fail to include the unsubstantiative data.
The Office of Research Integrity has several cases.
Some Questions to Readers
In our EE article, we drew attention to some areas in MBH98 where the disclosure practices seemed highly questionable, particularly the following:
1) Withholding the R2 and other verification statistics for the early steps. It seems to me that omission of the 15th century R2 (~0.0) was selective.
2) Withholding the results without the bristlecones (the CENSORED file).
3) Related to but distinct from 2, the claim that the MBH98 reconstruction was “robust” to the presence/absence of all dendroclimatic indicators, when the author knew that it was not robust even to the presence/absence of the bristlecones.
4) Withholding the information that the Gaspé series had been edited and that the editing of this series was material to the early 15th century result. Separate but related is the misrepresentation of the start date of this series.
5) The misrepresentation of the principal components method. In our EE article, we allowed that it was possible that this was simply a computer programming error, but realclimate argued in early January that it was not a computer programming error, but a misrepresentation.
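For readers unfamiliar with the statistic at issue in point 1, here is a minimal sketch, illustrative only and not MBH98’s actual calculation, of a verification R2: the squared correlation between a reconstruction and instrumental observations over a verification period. The function name and the sample numbers are my own invention for illustration. A value near zero, like the ~0.0 cited above, means the reconstruction tracks essentially none of the variation in the verification data.

```python
def verification_r2(recon, obs):
    """Squared Pearson correlation between a reconstruction and
    observations over a verification period (illustrative sketch)."""
    n = len(obs)
    mean_r = sum(recon) / n
    mean_o = sum(obs) / n
    # Covariance and variances about the period means
    cov = sum((r - mean_r) * (o - mean_o) for r, o in zip(recon, obs))
    var_r = sum((r - mean_r) ** 2 for r in recon)
    var_o = sum((o - mean_o) ** 2 for o in obs)
    return cov * cov / (var_r * var_o)

# A reconstruction exactly proportional to the observations scores 1.0;
# one unrelated to them scores near 0.
print(verification_r2([1, 2, 3, 4], [2, 4, 6, 8]))   # prints 1.0
print(verification_r2([1, 2, 3, 4], [1, -1, 1, -1]))  # prints 0.2
```

The point of reporting this statistic for every step of a reconstruction is precisely the disclosure issue discussed above: a near-zero value for an early step is an adverse result that a reader would want to know about.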
I’ve tried to look at the issue of “full, true and plain disclosure” as it applies to scientists from first principles, because I’ve been puzzled by the anthropological differences in culture. As someone used to “full, true and plain disclosure”, I find the apparent withholding of adverse results by MBH98 to be very disquieting, but this issue has occasioned almost no interest in academic communities (who are mostly interested in principal components methodologies, as far as I can judge from comments submitted to GRL). On the other hand, disclosure issues are instantly understandable to civilians. I think that academic practices on disclosure need upgrading, especially when it comes to scientific prospectuses such as IPCC assessment reports. Needless to say, archiving source code and data as used is a helpful audit trail in any event and easy to implement at a journal level.
Despite the lack of traction with academics, these disclosure issues have obviously attracted the interest of the House Energy and Commerce Committee, who are very familiar with disclosure and due diligence issues (after all, they did Enron hearings.) It will be interesting to see what happens. In the mean time, I would be interested in feedback on the more general issues pertaining to codes of conduct and journal practices.
Notes: see http://www.onlineethics.org/cms/7730.aspx