Some interesting comments in today’s Washington Post. Thanks to Roger Pielke for the reference. Roger pointed to the following:
Rather, we need to recognize just how arduous and painstaking good science usually is and remind ourselves that data do not become dogma when published, but only when independently validated.
Quite so. The article also pointed to a "market situation" in current stem cell research, leading to rapid and uncritical promotion of "hot" studies:
As the demand for results far outstripped the ability of researchers to supply them, a seller’s market emerged in which goods were overvalued and even low-quality merchandise was snatched up by eager buyers. This is the context in which Hwang’s studies appeared.
While most in the field of stem cell research were shocked by the reports of fraud, the shock was only one of degree; it is common knowledge that the bar for publication in this field often has appeared remarkably low, with even well-respected research journals seeming to fall over one another for the privilege of publishing the next hot paper. The result of this frenzy has been an entire body of literature that is viewed with extreme skepticism by most serious stem cell investigators.
The idea of buyer’s and seller’s markets is obviously familiar to me from stock market experience. I’ve often used stock market analogies to try to explain fads. I would extend the analogy to Hockey Team multiproxy studies. The author of the article says (and I hope that this applies to millennial climate studies as well):
More likely this controversy — and the ensuing scrutiny and self-reflection — will provide exactly what our discipline needs most: the opportunity to modulate the extravagant expectations for this research while we reaffirm our underlying commitment to it…


5 Comments
Unfortunately the boiler room promoters, er, journal editors, appear determined not to learn any lessons. See this article by the BBC for a case of "head-in-the-sand" disease.
By the way, the article mentions an “Office of Research Integrity” setup in 1992 to investigate cases of scientific fraud. Has anyone heard of this agency?
John A,
Given the date 1992, it would seem to be the ORI within the US "Department of Health and Human Services".
See: http://ori.dhhs.gov/
There are many other groups of the same name at many universities.
It amazes me that some journalists go to the person who is principally responsible for the debacle to find out what went wrong. Worse yet, they even ask that person what the plans are for fixing the problem.
If Donald Kennedy had known what had gone wrong and how to fix it, would he not have done it and avoided causing himself and his magazine such embarrassment?
In the aftermath, even if he were introspective enough to really determine the answers to those two questions, he would have had to admit that it was his own mismanagement which led to the problems. It is highly unlikely that he would be so honest as to say that.
Asking Kennedy what went wrong at Science is similar to asking Ken Lay what went wrong at Enron.
See John P. A. Ioannidis: Why Most Published Research Findings Are False
Here’s a good article from the New York Times on the same subject. It talks about the concept of “open-source review,” which I really like.
http://tinyurl.com/9lz5m