Washington Post on Hwang

Some interesting comments in today’s Washington Post. Thanks to Roger Pielke for the reference. Roger pointed to the following:

Rather, we need to recognize just how arduous and painstaking good science usually is and remind ourselves that data do not become dogma when published, but only when independently validated.

Quite so. The article also pointed to a "market situation" in current stem cell research, leading to rapid and uncritical promotion of "hot" studies:

As the demand for results far outstripped the ability of researchers to supply them, a seller’s market emerged in which goods were overvalued and even low-quality merchandise was snatched up by eager buyers. This is the context in which Hwang’s studies appeared.

While most in the field of stem cell research were shocked by the reports of fraud, the shock was only one of degree; it is common knowledge that the bar for publication in this field often has appeared remarkably low, with even well-respected research journals seeming to fall over one another for the privilege of publishing the next hot paper. The result of this frenzy has been an entire body of literature that is viewed with extreme skepticism by most serious stem cell investigators.

The idea of buyer’s and seller’s markets is obviously familiar to me from stock market experience. I’ve often used stock market analogies to try to explain fads. I would extend the analogy to Hockey Team multiproxy studies. The author of the article says (and I hope that this applies to millennial climate studies as well):

More likely this controversy — and the ensuing scrutiny and self-reflection — will provide exactly what our discipline needs most: the opportunity to modulate the extravagant expectations for this research while we reaffirm our underlying commitment to it…


  1. John A
    Posted Jan 15, 2006 at 5:16 PM | Permalink

Unfortunately the boiler room promoters, I mean journal editors, appear determined not to learn any lessons.

See this article by the BBC for a case of “head-in-the-sand” disease.

We all have so many questions. Why did he [Hwang] do it? How did he expect to get away with it? And was there anything that could have, or should have, been done to pick up the great con much earlier?

    The last question, quite naturally, is being directed at Science magazine, which published the 2004 and 2005 manuscripts; and at the process of “peer review” which it, and other leading journals, use to check papers before they publish them.

    That process is supposed to ensure that any study’s methodology is sound and that interpretation of data does not go beyond what can be reasonably justified.

    Science magazine is continuing its own internal review but its Editor in Chief, Dr Donald Kennedy, is doubtful there are any systematic flaws in the peer review process that made the Korean fraud any easier.

    “We’ve had a couple of papers in Science in the last four or five years that plainly involved scientific misconduct, ultimately discovered on investigation and publicised,” he told reporters last month.


    But, as Donald Kennedy suggests, there appear to be few options for fundamental changes to the peer review process that would make it harder for fraudulent papers to enter the scientific literature.

    “Thousands of papers are reviewed every week, and peer review works usually,” says Ms Wager.

    “There aren’t any alternative models to peer review. It’s a bit like democracy: it’s a lousy system but it’s the best one we have.

    “There are always cases that seem to get through, especially in areas where everyone wants the results to be true.”

By the way, the article mentions an “Office of Research Integrity” set up in 1992 to investigate cases of scientific fraud. Has anyone heard of this agency?

  2. JerryB
    Posted Jan 15, 2006 at 5:31 PM | Permalink

    John A,

Given the date 1992, it would seem to be the ORI within the US “Department of Health and Human Services”.

    See: http://ori.dhhs.gov/

    There are many other groups of the same name in many universities.

  3. Brooks Hurd
    Posted Jan 16, 2006 at 8:33 PM | Permalink

It amazes me that some journalists go to the person principally responsible for a debacle to find out what went wrong. Worse yet, they even ask that person what the plans are for fixing the problem.

    If Donald Kennedy had known what had gone wrong and how to fix it, would he not have done it and avoided causing himself and his magazine such embarrassment?

In the aftermath, even if he were introspective enough to really determine the answers to those two questions, he would have had to admit that it was his own mismanagement which led to the problems. It is highly unlikely that he would be so honest as to say that.

    Asking Kennedy what went wrong at Science is similar to asking Ken Lay what went wrong at Enron.

  4. Posted Jan 17, 2006 at 4:01 PM | Permalink

    See John P. A. Ioannidis: Why Most Published Research Findings Are False

  5. pj
    Posted Jan 19, 2006 at 1:00 PM | Permalink

    Here’s a good article from the New York Times on the same subject. It talks about the concept of “open-source review,” which I really like.

