
By all appearances, Harvard psychologist Karen Ruggiero, PhD, had been on the path to greatness in the late 1990s. Today, she is remembered only as the psychologist behind one of the highest-profile instances of scientific fraud.

In 2001, Ruggiero wrote to the Personality and Social Psychology Bulletin, copying Harvard and the federal Office of Research Integrity, to retract a 1998 article about status and discrimination “because serious questions exist concerning the validity of the data which relate solely to my own work and which do not implicate my co-author in any way.”

In all, she admitted to fabricating five experiments published in two articles and to doctoring research that appeared in a third.

The temptation to make up or alter scientific data is not new. Even Gregor Mendel probably fudged some of his data to make the connections between inherited traits in his pea plants appear more solid. In 1936, the English statistician Sir Ronald Aylmer Fisher discovered that Mendel’s reported second-generation crossbreeds came out implausibly close to the 3-to-1 ratio Mendel’s theories predicted. Fisher concluded that Mendel likely fell prey to a kind of “confirmation bias”: he knew what the answer was “supposed” to be and selectively ignored contradictory data.
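Fisher’s reasoning can be illustrated with a simple goodness-of-fit check. The sketch below is not his 1936 analysis; it assumes Python with SciPy and uses entirely hypothetical pea-plant counts to ask how often chance alone would produce results that hug the predicted 3-to-1 ratio so tightly. When that probability is small in experiment after experiment, the data are, in a statistical sense, too good to be true.

    # A minimal sketch (not Fisher's actual 1936 analysis) of flagging data
    # that fit a theoretical ratio "too well". The counts are hypothetical.
    from scipy.stats import chi2

    def fit_tightness(observed, expected):
        """Return the chi-square statistic and the probability that chance
        alone would produce a fit at least this close to expectation."""
        chisq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
        return chisq, chi2.cdf(chisq, df=len(observed) - 1)

    # 600 hypothetical offspring expected to split 3:1 (dominant : recessive)
    observed = [452, 148]
    expected = [450.0, 150.0]

    chisq, p_as_close = fit_tightness(observed, expected)
    print(f"chi-square = {chisq:.3f}")
    print(f"chance of a fit this close or closer = {p_as_close:.3f}")

    # One tight fit proves nothing, but results this close to prediction
    # across nearly every experiment are far less likely than honest
    # sampling variation would allow.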

It’s just as easy for graduate students and young scientists to make the same mistakes today, say researchers who study scientific fraud and misconduct. According to bioethicist Raymond De Vries, PhD, of the University of Michigan in Ann Arbor, falling into shady science happens more often than you might think. In a 2006 paper in the Journal of Empirical Research on Human Research Ethics (Vol. 1, No. 1), De Vries reports that there is a good deal of “normal misbehavior” in scientific research, and that cases like Ruggiero’s represent only the extreme end of that spectrum.

Common scientific misdeeds include removing data points that contradict your hypothesis, failing to disclose personal or professional ties to commercial interests, having improper personal relationships with graduate students and research subjects, and using other people’s ideas without giving them credit. In fact, says De Vries, these and other pernicious misbehaviors are far more damaging to the scientific endeavor than the much rarer “big three” of fabrication, falsification and plagiarism, which constitute fraud.

“Just like in any profession, some researchers engage in serious misconduct, but it’s a minority,” says Melissa S. Anderson, PhD, a professor of higher education at the University of Minnesota in Minneapolis. “Most falls under the category of ‘everyday misconduct.’”

Pressure to publish, high levels of competition among colleagues and the difficulty of getting tenured all can tempt researchers into these lesser sins of science.

Why falsify?

Just how widespread is scientific fraud and misconduct? A meta-analysis by Daniele Fanelli, PhD, a behavioral ecologist at the University of Edinburgh in Scotland, suggests it’s commonplace. His study, published in May 2009 in the Public Library of Science ONE (Vol. 4, No. 5), finds that about 2 percent of scientists admit to having fabricated or falsified data at least once, and 14 percent say they’ve witnessed colleagues do the same.

Chances are, the prevalence is even higher than people are willing to admit in a survey, Fanelli says. For the more nuanced types of misconduct De Vries describes, the figure rises to 33 percent, according to a 2005 article in Nature (Vol. 435, No. 9).

Why do scientists behave badly? In part because of increased federal regulations, grant stipulations and institutional review board oversight, says De Vries. To protect animal and human subjects, and prevent the misuse of grant money, scientists may have to navigate a labyrinth of rules that can sometimes seem arbitrary, even silly.

As a result, scientists sometimes lie about their methods to get around bureaucratic barriers. For example, one anonymous researcher quoted in De Vries’s study explains that NIH rules forbid using a chemical purchased with funds from a different grant, even when plenty of that chemical is left over from a previous study. Following the rules to the letter would mean buying a redundant bottle of the same chemical.

The researcher said, “[a]nd of course, you have to sign that, ‘Yes, this came from the funds used for this project.’ But of course I use it for something else.”

And once you’ve done it and gotten away with it, there’s little to stop you from doing it again, Anderson says. Shortcuts like that can harden into routine corner-cutting, essentially creating an institutionalized pattern of misbehavior.

Bureaucracy may engender white lies, but the intense pressure many scientists feel to produce new research can lead to more serious misbehavior, Anderson says. Scientists are well aware that their publication record determines whether they are promoted, earn tenure and receive grants.

Avoiding misconduct

As a protection against scientific dishonesty, federally funded universities are required to adopt a “responsible conduct of research” curriculum for both undergraduate and graduate students. Universities can offer it as individual courses, integrate it into a broader curriculum or do both. These classes teach students how to handle authorship concerns properly, avoid bias in their research and recognize plagiarism, among other skills related to the business and ethics of science.

A key factor in whether those concepts translate into better behavior in the lab is the quality of mentorship students receive, according to a 2007 paper in Academic Medicine (Vol. 82, No. 9). In the study, Anderson and her co-authors connected self-reports of scientific misconduct with information about the scientists’ “responsible conduct of research” courses and mentorship. Scientists who had taken the courses, and whose programs wove principles of responsible research throughout the curriculum, were the least likely to conduct shady science. Anderson and colleagues also found that researchers whose mentors highlighted ethics and research techniques were less likely to engage in misconduct.

Which researchers were more likely to behave unethically? Those whose mentors had emphasized academic survival, including doing whatever it takes to make it, Anderson says. She adds that the Office of Research Integrity has ramped up its focus on “responsible conduct of research” courses since her paper came out, and she hopes that organizations have improved their curricula as a result. She and her colleagues are currently collecting data to see whether that’s the case.

If students don’t feel they are receiving enough guidance on research ethics, they need to learn it for themselves, says Thomas Babor, PhD, a psychologist at the University of Connecticut School of Medicine who has co-authored a book chapter on scientific misconduct. If your mentor isn’t teaching you these things, find someone who will, he advises. You can also approach your department to ask about its protocol for handling grievances over authorship or charges of plagiarism. And students should read up on scientific ethics to acquaint themselves with the issues (some recommendations are listed below).

And once you’ve learned the rules, it’s important to hold yourself and those around you to high standards, even if they’re your superiors. Karen Ruggiero’s data fabrication was discovered when a graduate student asked to see her notes for a related project they were working on and she refused. Even if outright fraud isn’t occurring, Anderson says labs should foster an environment of “collective openness” to protect everyone from committing scientific sins.

If you see a colleague fabricate data or otherwise fudge research protocol, don’t be afraid to report it to your institution’s research ethics office, says Sangeeta Panicker, PhD, APA’s director of research ethics. “There’s a strong commitment to whistleblower protection,” she says. The federal Office of Research Integrity has broad authority to bring lawsuits against institutions that retaliate against whistleblowers.

A strong commitment to ethics isn’t just about avoiding damage to your career, Anderson says; it’s about upholding the integrity of the scientific enterprise. When those rules get broken, whether accidentally or intentionally, it undercuts the foundation of what makes science a powerful tool for investigating the world around us.

“Young scholars should be empowered to raise questions,” Anderson says, “so that things can get caught, be they mistakes or misconduct.”



Further reading

  • Babor, T.F., et al. (2008). Publishing Addiction Science: A Guide for the Perplexed. Essex: Multi-Science.

  • Macrina, F.L. (2005). Scientific Integrity: Text and Cases in Responsible Conduct of Research. Washington, DC: ASM Press.

  • Goodstein, D. (2010). On Fact and Fraud: Cautionary Tales from the Front Lines of Science. Princeton, NJ: Princeton University Press.