In a series of studies designed to assess two anti-tissue-rejection drugs, former University of Alabama at Birmingham surgeons Judith Thomas and Juan Contreras carefully detailed experiments in which they replaced one kidney in rhesus monkeys with a foreign one and, a month later, removed the remaining native kidney. The new organs took, they reported. The drugs worked.
But according to a July report from the federal Office of Research Integrity, that second kidney was never removed from at least 32 of the 70 animals. The scientists denied intentional wrongdoing, but the promising results have been deemed bogus. The experiments also cost taxpayers: The National Institutes of Health poured $23 million into the work over eight years.
Such misconduct, whether it involves overlooking inconvenient data or outright lying, distorts the scientific record, props up weak theories and ineffective drugs, and wastes funding. Unfortunately, no patterns emerged from Fanelli's work that would indicate the exact cause. Many scientists suspect that the publish-or-perish mentality of academia, and the scientific establishment's tendency to reward productivity over quality, is to blame, in part. Universities and institutions, says Brian Martinson, a misconduct expert at HealthPartners Research Foundation in Minnesota, are increasingly evaluating scientists for promotion or tenure by asking, "How many grant dollars has this person brought in?"
Since the report, the scientific community has begun to consider solutions. Nicholas Steneck, director of the Research Ethics and Integrity Program at the University of Michigan, points out that researchers could better police their own fields. That could be as simple as studying suspect images. In 2005, Korean stem-cell researcher Hwang Woo Suk was caught fabricating data in part because of a single photo of a cell line that was reproduced in two separate journals and labeled as two distinct cell lines. The Office of Research Integrity offers software that helps scientists analyze images in their contemporaries' papers for Photoshop-style doctoring. Turning scientists into fraud detectives isn't such an odd idea, Steneck says, because research is supposed to be self-regulating. "In a fair number of the more serious cases," he notes, "if the colleagues working with those researchers had simply paid more attention, they would have been caught earlier."
Contreras and Thomas resigned from their posts, but less nefarious infractions often go unpunished. Publications sometimes uncover misconduct through the peer-review process, and although the editors reject the article, they do not report the potential fraud, most likely out of fear of getting sued. "That's a violation of self-regulation," Steneck says. "We need to take action if we see something wrong."
Better training in ethics and proper research conduct could certainly help. "I honestly believe that a lot of people cross this boundary without really knowing it," says University of Chicago computational biologist Andrey Rzhetsky. This notion is supported by Martinson's 2007 survey of 3,000 researchers that found that young scientists learn from their mentors how to cut corners to advance professionally.
The next step may be clarifying the nature of the problem and determining if it's even worse than Fanelli suspects. Science has a deservedly favored position in society, but to retain that standing, its practitioners need to weed out the bad behavior. If misconduct is as rampant as these studies imply, scientists must start reporting that behavior regularly, not merely when asked to fill out anonymous questionnaires.
This is really disheartening, and not only on the ethical level within the scientific research community, though of course that's a huge concern. Imagine being an utterly honest scientist but having some of your research questioned because outsiders have at least a faint distrust.
Related to that distrust, I fear, is something of potentially enormous magnitude that could cut right across disciplines and issues. For instance, I can easily imagine GW-climate-change naysayers seizing upon this: "See? See? Those LYING scientists!" Another example is that of those who fear GM crops will turn us all into zombies, Frankensteins, whatever. Yet another is from the field of energy sources: "Drill, baby, drill! Those scientists who claim burning fossil fuels is bad for our health are all LIARS!"
Of course, they will be utterly ignoring the flipside of this research: most scientists are honest.
Within the area of a young researcher under pressure to publish so as to gain tenure or later under similar pressure for a promotion, I think it's not the purview of a tenure/promotion committee to even know, let alone consider, how many research money the candidate has attracted. Leave that to the institution's finance people; perhaps they could form a "finance point of view tenure/promotion committee" and make a recommendation -- but only AFTER the academic committee had cast -- and recorded -- a tentative vote. Even that's far from perfect, and in any case might not be reasonably feasible.
Improve peer review? I don't know how, though maybe increasing the number of required reviewers to sign off on the research might help some. Even that has problems, as professors retire and aren't replaced.
How about both civil and criminal penalties? Well, if someone involved in research with direct implications for human well-being fudges the data, and as a result people are hurt, made ill, or even killed because of the faulty research, maybe -- but on my part, a very reluctant maybe. And where does that leave, say, the art historian who manipulates information about an artist? That isn't going to hurt, let alone kill, anyone, but intellectually it's every bit as dishonest. Are we really looking to lock up those kinds of folks, too?
I should say that while I've taught in universities, I am not a researcher, nor even a professor; I hold only an MA. But I'm a lifelong, avid fan of the sciences, engineering, etc., especially cutting-edge research and development.
The line seems pretty darned clear to me. If the numbers don't add up to that answer, then they don't add up. Period. End of story. If *maybe* they add up -- fine: replicate the work. As for other research suggesting something different, whether a little or entirely: if the researchers could reasonably be expected to know of it, yes, they should acknowledge it -- but if they couldn't reasonably have known of it, or if the competing research post-dated their own, then no. It's not their ethical responsibility to blather, "So-and-so has contradicted my research of a decade ago."
As for deliberate fudgers -- stop it.
Sorry -- meant to write "how many research GRANTS," not "how many research MONEY." Jeez, what a dumb mistake! What I get for not proofreading, huh???
Would it be surprising to find that many of these behaviors begin with conduct observed, or enacted, during graduate study? How many students have been denied procedural due process because of a professor's personal conflict with a student's written or spoken opinions? Or consider the student who always agrees with the professor and gets A's on papers whose data tests were never completed. University-system rules state that a graduate school must follow its student handbook explicitly, not interpret it case by case; yet graduate schools do exactly that, with blind eyes all around unwilling to uphold their own regulations (talk about the pot calling the kettle black). How often has a graduate school been guilty of misconduct but never been held accountable? Yes, we know data results can be skewed, but the behavior begins before educations are ever completed, thanks to our 'highly skilled, highly educated' educators.