There’s a short article, “Model verification and documentation are needed”, in Eos, June 20, 2006, by geologist I. Sasowsky, calling for reviewers to ensure that computer methods are properly documented and archived as part of the review process. Sasowsky notes that prior studies have documented frequent “surprises” and “fundamental errors” in numerical modeling studies, even citing Naomi Oreskes (Oreskes and Belitz 2001) on this:
“Trust, but verify” – this is what editors ask for, and what readers expect, from reviewers of technical articles. As a reviewer, I am growing concerned with the level of trust requested by authors of submitted manuscripts, and the frequent lack of verifiable data and methods. Negative reports in the press [e.g., New York Times, 2005] attest to the worst-case outcomes of such shortcomings ….
Where scientific findings are based on computational analyses, documentation of computer model methods and analyses ought to be a required element of publication. The trust of the public in scientists and our methods depends upon this.
Ian Kraucunas, the secretary for the NAS panel, kindly sent me the following reference to a June 20, 2006 announcement by the Research Councils U.K. (said to be equivalent to NSF):
The Research Councils UK Executive Group, the grouping of the eight chief executives of the UK Research Councils, has today published its updated position statement on access to research outputs (http://www.rcuk.ac.uk/access/). …
The paper reaffirms the Research Councils’ commitment to the guiding principles that publicly funded research must be made available and accessible for public examination as rapidly as practical … and outputs must be preserved and remain accessible for future generations.
I guess no one sent the memo to Jones and Briffa.