Fraud threatens the integrity of social psychology

Michael Cook
4 May 2013
Reproduced with Permission
BioEdge

Even in scientific laboratories, Georg Wilhelm Richmann is not a household name. But he ought to be. Richmann was an 18th-century Russian scientist who died trying to repeat Benjamin Franklin's famous experiment of attracting lightning to a kite. A ball of lightning travelled down the cord and struck him dead. The first martyr for the cause of science died trying to replicate another scientist's results.

The ability to reproduce the results of an experiment has been key to the rapid progress that science has made since then. As any undergraduate in science knows, a true scientist observes, hypothesizes, predicts, and tests to reach a conclusion. We can be confident that the conclusion is unbiased because anyone can replicate the experiment and check it for errors.

But what if no one bothers to replicate it? Is it really science?

This is the question that is shaking the whole field of social psychology after a Dutch professor admitted that most of his stellar career - with papers in the world's best journals - was a gigantic con job.

Diederik Stapel, the dean of the School of Social and Behavioral Sciences at Tilburg University, in the Netherlands, published dozens of articles based on fraudulent data over 15 years at three universities. He even forged data for the post-graduate students he was supervising, tainting their degrees.

Shortly before his fraud was exposed, he published three of the media-friendly studies that had made his work so well known. In the leading journal Science he claimed that people in a rubbish-strewn environment are more likely to be racist. In Psychological Science, he claimed that positions of power in work environments increase rates of infidelity among both men and women. And in another study he claimed that vegetarians are happier and more social than meat-eaters.

But in 2011 whistleblowers alerted authorities at Tilburg University to irregularities in his published papers. His reputation unravelled quickly. Stapel admitted that he had fiddled with his data and fabricated research results, and he has returned his PhD.

Now remorseful, Stapel opened up to the New York Times recently. "People think of scientists as monks in a monastery looking out for the truth," he said. "People have lost faith in the church, but they haven't lost faith in science. My behavior shows that science is not holy."

Amazingly, Stapel taught research ethics at Tilburg University. When he asked students to review their own work, grievous errors came to light: "They got back with terrible lapses," he told the NYT. "No informed consent, no debriefing of subjects, then of course in data analysis, looking only at some data and not all the data." Stapel, of course, had made up his data, so none of these issues affected him.

Is Diederik Stapel just one rotten apple?

No one is suggesting that all social psychologists are frauds, but something does seem to be amiss; Stapel is not the only scientist whose experiments could not be replicated. Last year Dirk Smeesters, of Erasmus University in the Netherlands, had to resign after admitting that he had "massaged" data in his studies of consumer psychology. A psychologist at the University of Michigan, Lawrence Sanna, resigned last year after questions were raised about his work.

In fact, the situation in social psychology is so bad that Dutch academics who reviewed the Stapel case were horrified. In a sombre assessment, three panels chaired by Willem Levelt, a former president of the Royal Netherlands Academy of Arts and Sciences, found fundamental flaws in the scientific process both in the Netherlands and internationally.

"Virtually nothing of all the impossibilities, peculiarities and sloppiness mentioned in this report was observed by all these local, national and international members of the field, and no suspicion of fraud whatsoever arose... from the bottom to the top there was a general neglect of fundamental scientific standards and methodological requirements."

They also criticised the editors and reviewers of leading international journals.

"Not infrequently reviews were strongly in favour of telling an interesting, elegant, concise and compelling story, possibly at the expense of the necessary scientific diligence."

For social psychologists, the conclusion of the report is damning, almost apocalyptic:

"A 'byproduct' of the Committees' inquiries is the conclusion that, far more than was originally assumed, there are certain aspects of the discipline itself that should be deemed undesirable or even incorrect from the perspective of academic standards and scientific integrity."

(Predictably, the Executive Committee of the European Association of Social Psychology attacked the report's conclusions as "slanderous".)

This was not just a local disaster; its effects have rippled internationally. "I see a train wreck looming," wrote Nobel laureate Daniel Kahneman, a psychologist, in an open email to colleagues who work in social priming, one of Stapel's areas: "your field is now the poster child for doubts about the integrity of psychological research".

Last year the journal Perspectives on Psychological Science had a special issue on the field's crisis of confidence. Its focus was the key issue of replicability. John P.A. Ioannidis, of Stanford University, points out that the authority of science depends upon its ability to self-correct errors.

But as the Levelt report revealed, reproducing the results of other researchers is uncommon. Researchers are far more interested in startling new results because these will attract more funding. It is the curse of "neophilia". "The self-correcting paradigm… seems to be very uncommon," Ioannidis writes.

Nor is the peer review process a guarantee against the publication of fraudulent results. Professor Colin MacLeod, a leading psychologist at the University of Western Australia, told MercatorNet in an email:

"this peer review process is not designed to police against data fraud. Rather, peer reviewers begin with the assumption that the reported study was conducted, and their responsibility is to appraise whether its design is appropriate to answer the question addressed, whether the data reported as resulting from this study is analysed in an appropriate manner, and whether the outcomes permit the conclusions drawn. If the study was not actually conducted, (or if the reported data are not actually those obtained from the study), then the peer review process would not be expected to routinely pick this up."

Does this weakness in the scientific method matter? In medical research, it clearly does. Fraudulent studies can lead to ineffective or harmful treatments. British surgeon Andrew Wakefield published research in The Lancet in 1998 which "demonstrated" that there was a link between the measles-mumps-rubella vaccine and autism. This terrified parents and vaccination rates fell significantly. But he had faked his results. Many more children in Britain have contracted measles, a potentially fatal disease, as a result.

But social psychology? Studies of whether people who like barbecued tofu are nicer than people who like barbecued steak? Do they matter?

Possibly. The hottest topic in public policy at the moment is same-sex marriage. The case for legalisation is supported by scores of small social psychology studies on the impact of stigma, the strength of same-sex relationships, the effect of same-sex parenting, the effects of marriage on well-being and so on.

These have had an enormous influence upon the courts. For instance, one of the reasons why US District Court Judge Vaughn Walker struck down Proposition 8, California's referendum banning same-sex marriage, was his belief that "The research supporting this conclusion is accepted beyond serious debate in the field of developmental psychology."

His ruling showed no appreciation of the uncertainty and the questions hanging over psychology. Yet replication is clearly a problem in the hotly disputed field of same-sex parenting.

Stapel's disgrace has nothing whatsoever to do with same-sex marriage. But the systemic failures which allowed him to publish fraudulent results for years should make us all a bit more sceptical about whether claims by social psychologists are ever "beyond serious debate". And this is the case whether they involve the influence of rubbish on racism or the success of same-sex parenting.

As Chris Chambers, a psychologist at Cardiff University, told The Guardian: "In many respects, psychology is at a crossroads - the decisions we take now will determine whether or not it remains a serious, credible, scientific discipline along with the harder sciences."
