Pursue the truth, don't massage it

June 29, 2007

A system that puts more emphasis on outcome than on process can put pressure on science students to falsify results, says Tim Birkhead

How do we know the difference between right and wrong? Is it something we inherit, as some evolutionary psychologists think, or does it have to be taught? And what about making progress, getting on, beating the next person? That's a trait that seems much more likely to be inherited.

I recently ran a workshop on "doing science" for final-year undergraduates at another university. During my free-ranging discussions with the students, I discovered something that both surprised and horrified me. They admitted that fudging data for their final-year projects was widespread. As far as I could tell, about 80 per cent of this particular group of undergraduates probably massaged or made up their data in some way.

They saw the look of horror on my face and, very sweetly, they tried to reassure me, telling me not to worry, they wouldn't do it if this were "real" science. However, I didn't feel reassured.

On probing a little deeper, I uncovered several issues. First, the students felt that fudging data was often essential to obtaining a good result and a good mark - a point reinforced by our assessment-ridden education system and culture.

This is probably true: our assessment methods place far too much emphasis on the end result rather than on the process - presumably because with large numbers of students this is the easiest form of assessment.

Second, in seeking the root of fudging, we traced the problem back to the students' earliest school experiences, when they were first introduced to the idea of a "scientific experiment". Several commented on how often such experiments had "failed" and how teachers had then either fudged the data in front of them or told them what they should have obtained, but rarely, if ever, went over why the experiment had failed. My own experiences were similar - so not much has changed in 50 years.

At first sight, fudging a result because a primary-school experiment didn't produce the expected one might seem trivial, but repeated exposure to this kind of behaviour can have an insidious effect, breeding a culture of cynicism and a flagrant disregard for the truth.

My knee-jerk response was to blame the teachers - many of whom probably have little training in even the most basic philosophy of science and little idea that, by fudging their results, however innocently, they are condoning a culture of dishonesty.

But my surprisingly open (I won't say honest) undergraduates were quick to point out another factor: time. They were all too aware that, with an overly bureaucratic curriculum, there simply is not enough time to conduct a postmortem of a failed experiment, or even to repeat it. Our box-ticking culture that poses as education does little to encourage teachers or pupils to stop, reflect and ask why something went wrong.

The experiments themselves hardly matter; what does matter is that many of those entering a science degree do not know the difference between right and wrong - or, if they do, they don't care: getting on is all that matters.

Tim Birkhead is professor of behavioural ecology at Sheffield University.
