Discovering how much students knew about the world helped Roger Ottewill learn new ways to teach politics
Lecturers frequently express surprise at how little students know about the world around them. But are 18 to 24-year-olds the "dumb generation" that one national newspaper labelled them? A group of us at Sheffield Hallam University Business School decided to investigate. To that end, we designed a multiple-choice quiz.
Students coming to study business, public policy and accountancy in autumn 2001 were bemused to be asked during induction 25 questions on politics, economics, finance, society and demography. The average score was under 50 per cent. But, with no data with which to compare the results, we did not know if this was good or bad.
We decided to repeat and extend the exercise the following year using a mini-grant from C-Sap, the subject centre for sociology, anthropology and politics. As I was moving to Southampton University, we were in a position to compare two groups of students. Adopting the acronym Plato (politics learning and teaching online), we set about developing an e-questionnaire.
As before, this included 25 multiple-choice questions. Most were about national and international politics. Students found some questions easier than others - for example, the minimum voting age and the meaning of the term "pressure group". They were less sure about the Third Way, the Doha Round and the number of female MPs. To encourage and intrigue, we included five questions on sport, Big Brother and popular culture.
We wanted to capture students' views on issues such as racism, the inclusion of "citizenship" in the national curriculum, tuition fees and public spending on the elderly. We also wanted to know about students' backgrounds and lifestyles to see if these related to their level of political literacy. We asked what subjects they had studied at school or college; how often they went clubbing; their main source of news; and how much interest they and their parents took in politics.

Because of differences in the induction arrangements at each university, we asked Southampton students to complete the questions before they arrived. At Sheffield Hallam completion was, as before, part of induction. Students were asked not to look up any of the information or ask someone for the answers. We had no way of measuring their honesty, but the results have a ring of truth about them.

The Southampton students, who had opted to study politics, unsurprisingly scored higher than their Sheffield Hallam counterparts. But regardless of university, the students who performed best were those who had studied social sciences or regularly read a broadsheet newspaper to keep up to date.
In retrospect, the results themselves mattered less than what we learnt from the experience and how it could help teachers of politics and similar subjects. Student focus groups at both universities contributed a considerable amount to our appreciation of how teaching was perceived by those on the receiving end.
One lesson was that the period between students securing a place at university and commencing their studies provides a valuable window of opportunity for exercises of this kind. Survey fatigue and cynicism have not had time to set in, and most students are anxious to please.
The political agenda of students is not necessarily the same as that of lecturers, so it is sensible to involve students in the design of the questionnaire. When we asked students, they suggested including questions on more controversial topics of direct concern to them, such as drinking, drugs, personal safety and the cost of being a student.
It is important to get the results processed and fed back to students while the exercise is still fresh in their minds. If used wisely, the findings are a valuable teaching resource. They can help lecturers to establish a starting point for working with the grain of their students rather than against it.
In teaching introductory politics, a "one size fits all" approach is inappropriate. What this exercise shows is that full account should be taken of student expectations and entry-level knowledge. On this occasion, we did not make as much use of the data generated or interest stimulated as we might have done. But we alerted C-Sap to the potential of our approach and hopefully it will inspire others to try something similar. The instrument and basic results are available at www.politics.soton.ac.uk
Roger Ottewill is a research assistant at the Centre for Learning and Teaching, University of Southampton. He was formerly a business lecturer at Sheffield Hallam University.