The UK is taking a major leap forward in its use of data in higher education as the country gets the world’s first national learning analytics service.
Learning analytics, which uses data on student performance and activity to track engagement and identify undergraduates at risk of dropping out, is being used by a growing number of universities around the world. So far, however, the UK has been regarded as lagging behind institutions in the US, Australia and New Zealand.
However, on 1 August Jisc, the UK sector’s main technology body, will launch a national learning analytics service, with 30 institutions – 24 universities and six colleges – already signed up and a further 30 expressing an interest in following suit.
The programme will collate student data such as attendance and grades to create records of their learning, and will then produce dashboards that will enable staff to view visualisations of the information.
The service will also include the Study Goal app, which allows students to record their activities, set targets and compare their progress with friends, in a similar way to a fitness tracker app. The app can also register attendance, either via a code issued by the tutor during a teaching session or through location tracking.
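Dashboards of this kind typically condense several activity signals into a single engagement indicator that staff can scan at a glance. The sketch below is purely illustrative: the field names, weights and threshold are assumptions for the sake of example, not Jisc's actual model.

```python
from dataclasses import dataclass

@dataclass
class StudentActivity:
    # Illustrative fields; the real service collates data such as
    # attendance, VLE logins and grades that universities already hold.
    attendance_rate: float      # fraction of sessions attended, 0.0-1.0
    vle_logins_per_week: float  # visits to the virtual learning environment
    avg_grade: float            # mean mark, 0-100

def engagement_score(a: StudentActivity) -> float:
    """Weighted blend of activity signals; the weights here are arbitrary."""
    return (0.5 * a.attendance_rate
            + 0.3 * min(a.vle_logins_per_week / 5.0, 1.0)
            + 0.2 * a.avg_grade / 100.0)

def flag_at_risk(a: StudentActivity, threshold: float = 0.4) -> bool:
    """A tutor-facing dashboard might surface students below a threshold."""
    return engagement_score(a) < threshold

# A student who rarely attends or logs in falls below the threshold.
print(flag_at_risk(StudentActivity(0.3, 1.0, 45.0)))  # True
```

In practice a service like this would draw each field from existing institutional systems rather than ask anyone to enter them by hand, which is consistent with Jisc's point that the data is already being collected.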
Phil Richards, Jisc’s chief innovation officer, said that when a pilot was launched in 2016, universities saw the main benefit as identifying students who were disengaging from learning and improving retention rates.
But, as time has gone on, “more universities have recognised the wider benefits of analytics and signed up”, Dr Richards said.
“There is evidence that it is particularly beneficial for widening participation cohorts and helps with student success and well-being,” he said. “The University of East Anglia, for example, doesn’t have a retention problem but signed up because it is interested in the employability angle.”
A critical issue for learning analytics is the collection of data on individuals and how this links to issues of ethics and consent.
Dr Richards said that Jisc had spent a long time developing a code of practice to ensure that ethics were always front and centre. However, he added that much of the data was already collected by universities, and that the service was simply collating it in one place.
James Moir, senior lecturer in sociology at Abertay University, who has analysed a pilot of the Study Goal app at his institution, said it was important that universities were clear about the use of the collected data.
“If it is being used for more than just helping students organise and compare their study, such as sending them emails to say ‘you have not engaged with this module, did you know you run the risk of failing’, that’s not sinister, but it is another purpose for the data,” he said. “They have to be told about the different uses, and they have to be able to use it on a take-or-leave basis.”
Dr Moir said that his study had found that students were largely happy for their data to be collected, reflecting not only the attitudes of students today but also their recognition that the tool could help them.
However, Dr Moir added that another problem was that the information provided was very functional. “Measuring how many hours you spent reading or times you logged in to the VLE [virtual learning environment] does not take into account that learning is transformative and involves different ways of thinking, debating and analysing,” he said.
“There are also questions around using it to flag ‘at risk’ students. They might feel they are being unfairly labelled,” continued Dr Moir, who presented his research earlier this month at the Quality Assurance Agency’s enhancement conference.
Several UK institutions, including the Open University and Nottingham Trent University, already have their own well-established learning analytics programmes.
Eunice Simmons, Nottingham Trent’s deputy vice-chancellor, said that her university had deliberately avoided focusing the service on identifying “at risk” students.
“We wanted to share the narrative and data with students, rather than say ‘you are at risk of failure’,” she said. “We’re not doing a stereotyping exercise.”
Professor Simmons said that, although her university had benefited from creating its own analytics service, the Jisc service would shorten the journey for institutions that had not started developing their own.
Michael Hughes, education, research and enterprise services manager at City, University of London, which took part in the pilot and is continuing with the service, said that access to the data infrastructure and the dashboard was incredibly important.
“We’re still evaluating but we’re also hoping that a side-effect could be helping course leaders to see the effectiveness of their course design,” Mr Hughes said. “The main benefit will be that it assists personal tutors to have proper discussions with their students about why they aren’t performing as well as they should be.”