Do students achieve significant “learning gain” during their time at university? And can this “distance travelled” be measured using standardised tests?
These are questions that are being asked around the world, not least in the UK, where standardised testing is to be piloted by the Higher Education Funding Council for England and could eventually form part of the teaching excellence framework.
Some answers may come from Brazil, one of the few countries to have required university students to take standardised exams.
The National Student Performance Exam (Enade) has been administered to students in their final year since 2004, and between 2004 and 2010 it was taken by first-year students, too.
It has a general part, which tests knowledge of cultural and social aspects of contemporary society, and a subject part, which covers broad discipline areas; both parts consist of multiple-choice and essay questions.
Between 2004 and 2009, the Enade was taken by a representative sample of students from all disciplines every year. Since 2010, however, it has been administered to students in different subject areas on a rotating basis.
An analysis of the results of 484,410 students who took the exam between 2008 and 2010, published in Higher Education, reports that students in most disciplines recorded an appreciable increase in general knowledge and even greater improvements in subject knowledge.
There were significant variations by discipline, with students in most biological science fields performing best, and, in general, students at private universities scored better than their counterparts at public institutions.
Tatiana Melguizo, associate professor of education at the University of Southern California, and Jacques Wainer, associate professor in computer science at the State University of Campinas, say that their findings contrast with the conclusions of the influential US study Academically Adrift: Limited Learning on College Campuses.
In this 2011 book, sociologists Richard Arum and Josipa Roksa, of New York University and the University of Virginia, respectively, analysed the results of Collegiate Learning Assessment tests to conclude that 45 per cent of students did not demonstrate any significant improvement in their learning in the first two years of university, and that 36 per cent showed no major gains after four years.
Dr Melguizo told Times Higher Education that Academically Adrift measured only general skills, whereas students in the main went to university to develop subject area knowledge. She also said that Academically Adrift did not properly control for prior academic preparation.
Standardised testing, she said, could be a very useful “formative indicator” that could help institutions to benchmark the progress of students and departments.
Hamish Coates, professor of higher education at the University of Melbourne, said that the Brazilian experience was a “sign of the shape of things to come” and argued that the results could form part of a measure of institutional performance.
But Dr Melguizo argued that using the data for institutional rankings would be “very dangerous” because exam results could change significantly from year to year, which could also increase insecurity for academics.
The Brazil study also highlights some of the difficulties with standardised testing because, although it is ostensibly compulsory, it is not a prerequisite for graduation and is not used by employers.
Of 587,000 students who were required to sit the Enade in 2012, only 469,000 actually did. And, taking the economics paper of that year as an example, one in 10 students did not answer any multiple-choice questions, and nearly one in three did not attempt the written questions.