Online exams: is technology or authentic assessment the answer?
With the coronavirus pandemic still raging, distance learning looks to remain a key part of universities’ activities for some time. And given the success they have had with experiments in online tuition, many institutions may well choose to continue it as an option into the long term.
But teaching at a distance means assessment at a distance, and finding the fairest and most secure way to do this has not been straightforward. Nevertheless, the surge in cheating cases reported by universities since the switch to online examinations makes the need for a solution urgent.
Online “closed-book” exams have proved comparatively easy to cheat on, while open-book exams with a time limit do not fix the problem because contract cheating websites, which write student essays or answer questions for a fee, now operate around the clock.
Sarah Eaton, an associate professor of education at the University of Calgary and a programme organiser with the International Center for Academic Integrity, said the sector’s response to the problem could be simplified into two options: the use of online proctoring technology or “authentic assessment”.
Authentic assessment asks students to apply what they have learned in different circumstances, rather than relying on what knowledge they have memorised, and is therefore considered difficult to cheat on. Online proctoring uses technology to act as the “invigilator” on a student’s computer, checking that students remain in the room or tracking their eye movements, analysing their keystrokes or blocking them from accessing other websites.
“We’re starting to see a divide in the academy between those who favour authentic assessment and new ways of doing things, and those in positivist or STEM disciplines where testing has been the norm and continues to be important,” Dr Eaton said.
Technology seems like an obvious answer, particularly for those courses accredited by professional, statutory and regulatory bodies, which require students to meet certain professional standards and call for closed-book exams.
Many institutions are taking the technological route. Scott McFarland, chief executive of US-based ProctorU, which offers remote oversight by invigilators, said he had had to double his staff of about 500 human proctors since the lockdowns began. Another US company, Proctorio, said it had experienced 900 per cent year-on-year growth in proctored exams completed on its platform.
In the UK, uptake has been more cautious. The Quality Assurance Agency (QAA) conducted a small survey of members about their use of proctoring services since the pandemic, and Simon Bullock, the organisation’s quality and standards specialist, said that of the 50 who responded, about half had used some form of online proctoring. This ranged from simple identity checking or invigilators monitoring students via Zoom to the kind of software that tracked eye movements or keystrokes, but “very few used the most advanced software”, he said.
This was likely because of the high cost of the more advanced software and the staff time needed for alternatives such as Zoom invigilation, he explained.
Mike Draper, professor of legal studies at Swansea University and a member of the QAA’s academic integrity advisory group, explained that additional resources were necessary to use the most advanced software. Before students sit an assessment, they have to do a “pre-assessment” to check that the software works, and there have been instances where the technology failed and students have had to resit the whole exam.
The issue of credibility and the need for some students preparing for certain professions to be able to demonstrate that they have the right competencies were important, Professor Draper said. However, online proctoring was not “entirely foolproof”, he continued. “You still find that there are ways to cheat it.”
Aside from resources, online proctoring throws up ethical issues, Professor Draper said. Over the summer, a number of legal news outlets reported that students taking the Bar Standards Board’s exam online resorted to relieving themselves in bottles because they feared that the online proctoring system would mark them down for leaving the room to use the toilet.
Privacy issues are a particular concern, with many students baulking at the idea of having someone take over their computer or gaining access to their camera. “Finding a quiet and private space to take an exam at home means using their bedroom for most students. That can be very intrusive, particularly for young women,” said Cath Ellis, associate dean of education at UNSW Sydney.
Last summer, students at the Australian National University launched a petition against the use of proctoring software Proctorio at the university, describing it as a “gross invasion of our privacy”. The issue was compounded in August 2020, when the universities of Sydney and Melbourne confirmed that they were investigating an alleged data breach at ProctorU, the online examination tool they were using.
Some companies say they have recognised the issue. Keith Straughan, chief executive of Axiologs and one of the developers of UK-based online testing platform Tenjin, said they took these “concerns very seriously indeed”. Tenjin does not “take control” of a candidate’s computer or use continuous video monitoring; instead it takes frequent photos, biometrically matched against identity documents, and supplements this with voice recognition, he said.
This also lowers the bandwidth needed, making the platform easy to use internationally, even in technically challenging geographies, which is particularly useful for English-language assessment, Professor Straughan added.
Some universities have opted against using any kind of proctoring software. Dr Eaton said Calgary had commissioned a group, of which she was a member, to look into whether the university should invest in such technology. It concluded that there was “not enough research done by independent and objective third parties to support its use”, she said.
“It didn’t justify the cost, not at a time when we’re cutting jobs,” she explained.
This was echoed by Phillip Dawson, associate director of the Centre for Research in Assessment and Digital Learning at Deakin University. “Proctoring companies claim that their tools can reduce cheating, but I’m yet to see any robust, independent peer-reviewed evidence in support of that,” he said. “I’ve approached several remote proctoring companies over the past couple of years to arrange a study where I try out a range of cheating approaches in their exams, but none have said ‘yes’ so far.
“To me, the question is: how much does it reduce cheating, and is that reduction worth the trade-offs that are made in terms of privacy?” he said.
Jesse Stommel, senior lecturer in digital studies at the University of Mary Washington, warned that some companies “are taking advantage of the crisis”.
The use of online proctoring software “creates a culture of anxiety and stress”, he said. Rather than focusing on tales of an increase in cheating, which Dr Stommel said was negligible, institutions should be making decisions about what was best for their students. “Students don’t need cameras in their homes, watching their every move. Students need support,” he said.
Irene Glendinning, academic manager for student experience at Coventry University, advised that “if you are going to [use online proctoring], you need to have the consent of the students and ensure that they understand what it will do and why it is needed. Overall, authentic assessment is a better way of assessing: you are asking students to demonstrate their understanding, rather than regurgitate facts, and learn more in the process.”
Dr Ellis agreed. “People need to look at what is the purpose of that assessment. What are we asking students to demonstrate? If we go back, a lot of it is about checking that students can do something themselves without outside assistance. So we need to find a better way of doing that,” she said.
Dr Ellis suggested that even subjects where high-stakes exams seem necessary, such as languages or medicine, could move towards more authentic assessment through mini-vivas, in which students are questioned about their body of work or asked to demonstrate their overall understanding.
Neil Morris, dean of digital education at the University of Leeds, explained that addressing assessment takes time. “We need to remember that everyone is still in emergency mode. To do this properly, you start with curriculum design and learning outcomes…people are doing their very best, but there hasn’t been the time to go back and rethink their curriculum from the very beginning,” he said.
Professor Morris believes that what is required is a move towards assessing students’ ability to solve problems and to apply knowledge to contextual situations through authentic assessment. A portfolio of such work, attesting to a student’s learning and demonstrating their abilities, would be more valuable to future employers, he said.
“Looking ahead, online proctoring would likely be used only where it’s absolutely necessary for professional exams, but at a minimum level,” Professor Morris said.
The problem is that assessment is often designed with efficiency in mind rather than educational effectiveness, argued Janice Orrell, emeritus professor of higher education and assessment at Flinders University. Institutions were failing to provide sufficient professional education or supportive mentoring for staff to take on this high-stakes aspect of university education, she said.
This was echoed by Dr Stommel, who said institutions should be investing in their teachers and their development. “Every dollar that is spent towards remote proctoring, towards new learning management systems, should go towards faculty development and student support,” he said. This would allow teachers to develop assessment methods that help learning. “It’s the most pedagogically sound thing to do,” he said.
For Professor Dawson, proctored exams have made it “easy to keep doing roughly the same exams we’ve always done. I worry this might get in the way of reimagining what assessment should be for the future.” Remote proctoring must not become “an artificial life-support system for a dying type of assessment: the invigilated exam,” he said.
However, Dr Glendinning was hopeful that the sector was starting to change. While the pandemic has posed many challenges, it has “had a really positive impact on people’s understanding of what makes good assessment, what works well and what not to do. It has shone a light on the problems with the old ways of doing things, ways that have persisted unchanged for centuries,” she said. “The genie is out of the bottle.”