More than 500 universities from around the world have submitted data to our new Times Higher Education University Impact Rankings, a league table based on the United Nations’ Sustainable Development Goals (SDGs).
But this ranking does more than provide a new mechanism for assessing university performance, one vastly different from the methodologies behind research-focused global league tables; just as importantly, it creates a new way of verifying data.
We have expanded the scope of the data we collect, primarily through a new type of metric: we not only ask universities whether they do something but also ask them to provide evidence to back it up.
For example, when we consider SDG 8, “Decent Work and Economic Growth”, one of the questions we ask is: “Does your university as a body have a policy on pay scale equity including a commitment to measurement and elimination of gender pay gaps?” If a university replies “yes”, we ask it for evidence.
In this case, we seek evidence that there is a policy – not just an ambition or a hope – and that it refers specifically to the issue of a gender pay gap.
Policies are important even when the law prescribes certain actions. Because laws are rarely specific about how they should be interpreted in every situation, a policy can help to define how a university will apply a law in its own specific circumstances. It may indicate the practical steps that people can take to raise their concerns and ensure that their rights are upheld. It also acts as a powerful indicator that the university leadership treats the issue seriously, and can (in some circumstances) go further than the law.
We have also decided to award extra credit where the evidence is public, which usually means that it is on a publicly accessible website. This encourages openness, and it also makes it possible for a university to be held to public scrutiny over its policy. If an institution says that it has a policy on pay scale equity but is not acting on it, then faculty, staff or the wider public are able to take it to task.
We are not in a position to explore whether the policy is the best possible one, but we hope to be able to assess the evidence that has been submitted consistently and fairly. Our team are working to an agreed framework and are recording their decisions.
We apply a few different rules, depending on the nature of the question we are asking:
- Policies – we ask for the date the policy was created and the date it was, or will be, reviewed. Policies that have been recently created or reviewed are more likely to be acted on, and it is best practice to review policies regularly to verify that they are still fit for purpose.
- Costs – for some services, for example sexual health services, we will give credit if they are free.
- Links to the SDGs – ideally, we want to see explicit links to the goals and their targets, rather than general links.
- Open data – we ask whether data are published in an open format, meaning in a way that supports interpretation, analysis and reuse while minimising data errors. Data presented in a table within a PDF do not qualify; data available as a downloadable file or via an application programming interface (API) do. (For more information, visit the Open Data Institute.)
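To illustrate why the open-data distinction matters, here is a minimal sketch of what machine-readable data enable. The figures and column names below are entirely invented for illustration; the point is only that a downloadable file such as a CSV can be parsed, analysed and reused in a few lines of code, whereas a table locked inside a PDF cannot.

```python
import csv
import io

# Hypothetical open-format data: an illustrative CSV of gender pay gap
# figures. These numbers are made up and do not describe any real university.
csv_text = """year,median_pay_gap_percent
2017,9.8
2018,8.5
2019,7.1
"""

# Because the format is open, standard tools can read it directly.
reader = csv.DictReader(io.StringIO(csv_text))
rows = list(reader)

# Reuse and analysis become trivial: extract the figures and track progress.
gaps = [float(row["median_pay_gap_percent"]) for row in rows]
print(gaps)          # the full series, ready for further analysis
print(min(gaps))     # the smallest reported gap in the series
```

The same information embedded as a table in a PDF would require error-prone extraction before any of this analysis could begin, which is precisely the difference the ranking's open-data rule rewards.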
If this sounds a little prescriptive, it certainly is not meant to be. Personally, I have found the responses to be very exciting. What has impressed me most about the universities that have submitted data is the quality of the evidence they are providing. This is a real credit to the fantastic work that universities are doing, often without recognition.
Duncan Ross is chief data officer at Times Higher Education.