Teaching intelligence: how to use learning analytics

Your university’s new learning analytics service has flagged that a student is disengaging – what now?

November 15, 2018
Adrift: a change in attendance patterns could indicate a student is in difficulty (Source: Getty)

Universities globally are increasingly turning to learning analytics, the use of data on student performance and activities, to track engagement and identify undergraduates who are at risk of dropping out.

The topic is much discussed in the sector, with its use touted as a way for universities to provide better, more tailored support for students’ learning and to improve retention rates. But critics warn that the widespread collection of personal data has the potential for misuse.

Either way, learning analytics are here to stay, for the immediate future at least: earlier this year Jisc, the UK sector’s main technology body, launched a national service, and a number of universities already have their own services in place.

So how exactly does a learning analytics system work – specifically, what happens when a student is flagged up as “at risk”? How do universities and teachers use the analytics to ensure that students get the best from their course, stay on track and do not drop out?

Learning analytics services usually have a “dashboard” interface that holds data about individual students, measuring their engagement with the course. Typically, key gauges of engagement include a student’s attendance, use of library resources, marks for assessments and use of the institution’s virtual learning environment; and tutors are shown a red, amber or green indicator for each student.
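The exact metrics, weights and thresholds vary by institution and are not spelled out in detail here, but as a rough, hypothetical sketch, a red/amber/green indicator of the kind described above might be produced by combining a handful of normalised engagement metrics into a weighted score. Everything in the snippet below (the metric names, weights and cut-offs) is an illustrative assumption, not the actual model used by Jisc or any particular university.

```python
# Hypothetical sketch of a red/amber/green (RAG) engagement indicator.
# The metrics, weights and thresholds are illustrative assumptions only,
# not the actual model used by Jisc or any particular university.

WEIGHTS = {
    "attendance_rate": 0.4,   # share of timetabled sessions attended (0-1)
    "vle_activity": 0.3,      # activity in the virtual learning environment, normalised 0-1
    "assessment_marks": 0.2,  # average assessment mark, normalised 0-1
    "library_use": 0.1,       # borrowing and e-resource use, normalised 0-1
}

def engagement_score(metrics: dict) -> float:
    """Combine normalised metrics (each 0-1) into a single weighted score."""
    return sum(WEIGHTS[name] * metrics.get(name, 0.0) for name in WEIGHTS)

def rag_indicator(score: float) -> str:
    """Map the score onto the red/amber/green display shown to tutors."""
    if score >= 0.7:
        return "green"
    if score >= 0.4:
        return "amber"
    return "red"

# Example: a student attending regularly but barely touching the VLE.
student = {"attendance_rate": 0.9, "vle_activity": 0.1,
           "assessment_marks": 0.6, "library_use": 0.3}
print(rag_indicator(engagement_score(student)))  # -> "amber"
```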

According to James Hodgkin, associate director and university librarian at the University of Gloucestershire, attendance is the key indicator of engagement. At the start of a seminar or lecture, students at Gloucestershire use Jisc’s Study Goal app on their phones to input a four-digit pin code that is linked to a time stamp and geolocation.
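As a hypothetical illustration of the kind of record such a check-in might generate (this is not the Study Goal app’s real data model, which is not documented here), each submission could tie the session PIN to a timestamp and the phone’s location, and be accepted only when both match the session:

```python
# Hypothetical sketch of an attendance check-in of the kind described above:
# a session PIN tied to a timestamp and geolocation. Illustrative assumption
# only, not the real implementation of Jisc's Study Goal app.
from dataclasses import dataclass
from datetime import datetime, timezone
from math import radians, sin, cos, asin, sqrt

@dataclass
class CheckIn:
    student_id: str
    session_pin: str        # the four-digit code entered at the start of the session
    timestamp: datetime
    latitude: float
    longitude: float

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Haversine distance in metres between two points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

def accept_checkin(checkin: CheckIn, expected_pin: str,
                   room_lat: float, room_lon: float,
                   max_distance_m: float = 200.0) -> bool:
    """Accept a check-in only if the PIN matches and the phone is near the room."""
    return (checkin.session_pin == expected_pin
            and distance_m(checkin.latitude, checkin.longitude,
                           room_lat, room_lon) <= max_distance_m)

# Example usage with made-up coordinates and PIN.
record = CheckIn("s1234567", "4821", datetime.now(timezone.utc), 51.8646, -2.2380)
print(accept_checkin(record, expected_pin="4821",
                     room_lat=51.8645, room_lon=-2.2379))  # -> True
```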

“Early indication of a change in attendance patterns could be an extremely valuable sign of something more serious developing,” he said.

Mr Hodgkin added that tutors are asked to act if the data show that a student is disengaging. They might write a letter requesting a meeting or informal chat with the student, aimed at encouraging a change in behaviour, he said.

The university has already had feedback from tutors who found that it could be a real eye-opener to show students their data. “Seeing how they are out of step with the rest of the class can be powerful,” Mr Hodgkin said.

The data will not always be a surprise, but they can be a great way to illustrate the situation, he said. Explaining in very practical ways how students can re-engage can be supported by the data, which are “evidence-based and specific”. The information enables the tutor to home in on the issue – and also to offer praise in areas that are going well, he added.

Ed Foster, the student engagement manager at Nottingham Trent University, which has created its own learning analytics service, agreed that showing students their own dashboard was useful. It is a “powerful tool” because sometimes students are less engaged simply because they have not understood what is expected of them, he said.

NTU also “strongly recommends” that staff use the dashboard more frequently around high-pressure times for students and intervene more quickly, because if students are disengaging right before deadlines, that situation “is far more serious”, he said.

Mr Foster added that it was also important that tutors understand what the tool is showing them – and what it is not. To make learning analytics useful, the system must compress each complex individual student into relatively few points of data. Although this makes the figures easy to understand, it does “require users [tutors] to think about the data presented”, he said.

“It’s important to know that it cannot tell you why a student is disengaging; this has to be done through personal interaction,” Mr Foster said. He added that if a tutor sees a student with very low engagement, it is vital that the insight serves as a prompt for a supportive discussion, not as a reason to dismiss them as a lost cause.

If a tutor discovers that an individual has a serious problem, Mr Hodgkin said, they may need to refer the student to other support within the university. Mr Foster agreed: “Learning analytics work best as part of your institution’s support system.”

According to Bart Rienties, professor of learning analytics at the Open University, tutors armed with insights from learning analytics must be very tactful in dealing with students. He advises tutors to keep things fairly general, rather than pointing the finger at the student.

“What people always underestimate [about learning analytics] is the people aspect; no matter how good the data system…the biggest hurdle is to train tutors to make sense of data and have conversations that are actually supportive for the students rather than just ‘hey, the computer says you are at risk, go to the nearest library to solve your problem’,” Professor Rienties said.

anna.mckie@timeshighereducation.com

POSTSCRIPT:

Print headline: What to do when the data dashboard says ‘danger’

