Misconduct is part of human nature, laments Tim Birkhead. Even scientists are at it
It's not been a good week: a lot of exam marking, a wet field course and three separate instances of scientific misconduct brought to my attention. The first involved an undergraduate who had written his own reference. Luckily, the perfidy was betrayed by poor grammar. The second involved the chair of a medical study committee who, on finding that members had a lot of great ideas, decided to capitalise on the opportunity and transform the study into a PhD for himself. The third involved final-year students at another university who admitted that fudging project data was widespread.
Part of me despairs, but part of me acknowledges that misconduct is human nature. The part of me that despairs does so because scientists, almost more than anyone else, ought to be immune to corruption.
The other part of me recognises that no one is. Fabrication, falsification and plagiarism all constitute scientific misconduct, but not all cases are clear-cut. Of those above, the first is not scientific misconduct in the strict sense; it is simply dishonest. The second is less obvious but, in my mind, still dishonest: you simply do not glean knowledge given in good faith and later use it for self-promotion. The third case is interesting because, on closer questioning, the undergraduates did not consider fudging data to be misconduct because their project work was only for their degrees - it wasn't going to be published. Apparently, they felt that if they were doing real scientific research, they would be honest. Really? A slippery slope, I fear.
To discover what researchers consider to be scientific misconduct, a colleague - Bob Montgomerie of Queen's University in Canada - and I recently devised an anonymous questionnaire. You can find it at http://biology.queensu.ca/montgome/sm/.
Is scientific misconduct on the increase? Competition for resources and the cynicism it engenders certainly are, so misconduct might be. Horace Judson in The Great Betrayal (2004) states that organisations such as large companies, the Catholic Church and universities are especially vulnerable because they are allowed to police themselves. One solution is to increase regulation, but more bureaucracy isn't the answer. I cannot help feeling that some of the cynicism expressed by the undergraduates I talked to reflects distrust of a system that generates ever-increasing scores for A levels and undergraduate degrees, and fails to dismiss researchers for misconduct ("US war on plagiarism takes first scalp", The Times Higher, April 15). Indeed, the deterrents are few: as one postdoc put it, the maximum penalty for getting caught committing fraud is no worse than the routine penalty for not publishing (enough or in the right places), so there's little disincentive (Nature 384: 6, 1996).
What do we do about it? Clearer disincentives might work, as would more instruction. How many undergraduate courses discuss the role of ethics and integrity in research? Perhaps all undergraduates should get a course on "doing science" - how to, how not to and the implications of scientific dishonesty. At the very least, discussion of the issue might raise awareness, rather than pretending that science is self-policing.
Tragically, the undergraduates I spoke to about this all felt that the habit of fudging data in scientific experiments began at school, so if we are going to use training to stamp out scientific misconduct, we may have to start young.
Tim Birkhead is professor of behavioural ecology at Sheffield University.