The Times Higher Education University Impact Rankings are the only global performance tables that assess universities against the United Nations’ Sustainable Development Goals (SDGs). We use carefully calibrated indicators to provide comprehensive and balanced comparisons across three broad areas: research, outreach, and stewardship.
Which SDGs are included?
There are 17 UN SDGs and we are evaluating university performance on 11 of them in our first edition of the ranking (click on a category below to view its specific methodology):
- SDG 3 – Good health and well-being
- SDG 4 – Quality education
- SDG 5 – Gender equality
- SDG 8 – Decent work and economic growth
- SDG 9 – Industry, innovation, and infrastructure
- SDG 10 – Reduced inequalities
- SDG 11 – Sustainable cities and communities
- SDG 12 – Responsible consumption and production
- SDG 13 – Climate action
- SDG 16 – Peace, justice and strong institutions
- SDG 17 – Partnerships for the goals
Universities can submit data on as many of these SDGs as they are able. Each SDG has a series of metrics that are used to evaluate the performance of the university in that SDG.
Any university that provides data on SDG 17 and at least three other SDGs is included in the overall ranking.
As well as the overall ranking, we also publish the results of each individual SDG in 11 separate tables. This enables us to reward any participating university with a ranking position, even if it is not eligible for the overall table.
How is the ranking created?
A university’s final score in the overall table is calculated by combining its score in SDG 17 with its top three scores out of the remaining 10 SDGs. SDG 17 accounts for 22 per cent of the overall score, while each of the three highest-scoring other SDGs carries a weighting of 26 per cent (22 + 3 × 26 = 100). This means that different universities are scored on different sets of SDGs, depending on their focus.
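The combination rule above can be sketched in code. This is a hypothetical illustration rather than THE’s actual implementation; `sdg_scores` is assumed to map SDG numbers to scores on a common 0–100 scale, and the eligibility rule for the overall table (SDG 17 plus at least three other SDGs) is folded in.

```python
def overall_score(sdg_scores):
    """Combine per-SDG scores into an overall score (illustrative sketch).

    sdg_scores maps an SDG number (e.g. 17) to that university's score.
    Returns None when the university is ineligible for the overall table:
    it must provide SDG 17 plus at least three other SDGs.
    """
    if 17 not in sdg_scores or len(sdg_scores) < 4:
        return None
    # Take the top three scores from the SDGs other than SDG 17.
    top_three = sorted((s for sdg, s in sdg_scores.items() if sdg != 17),
                       reverse=True)[:3]
    # SDG 17 carries 22 per cent; each of the top three carries 26 per cent.
    return 0.22 * sdg_scores[17] + sum(0.26 * s for s in top_three)
```

For example, a university scoring 50 on SDG 17 and 80, 70 and 60 on its best three other SDGs would receive 0.22 × 50 + 0.26 × (80 + 70 + 60) = 65.6 overall.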
The score from each SDG is scaled so that the highest score in each SDG in the overall calculation is 100. This is to adjust for minor differences in the scoring range in each SDG and to ensure that universities are treated equitably whichever SDGs they have provided data for.
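The per-SDG rescaling can be illustrated with a minimal sketch (hypothetical; `raw` is assumed to map universities to their raw scores in a single SDG):

```python
def rescale_to_100(raw):
    """Scale raw scores in one SDG so the highest score becomes 100.

    Illustrative sketch: each score is divided by the best raw score in
    that SDG and multiplied by 100, so the top performer gets exactly
    100 and relative differences between universities are preserved.
    """
    best = max(raw.values())
    return {uni: 100.0 * score / best for uni, score in raw.items()}
```

Applying this independently within each SDG means a score of, say, 100 in SDG 4 and 100 in SDG 13 represent the same relative standing, even if the two SDGs produced different raw scoring ranges.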
The metrics for the 11 SDGs are included on their individual methodology pages.
Scoring within an SDG
There are three categories of metrics within each SDG:
Research metrics are derived from data supplied by Elsevier. For each SDG, a specific query has been created that narrows the scope of the metric to papers relevant to that SDG. As with the World University Rankings, we use a five-year window, from 2013 to 2017. The only exception is the metric on patents that cite research under SDG 9, which relates to the timeframe in which the patents were published rather than the timeframe of the research itself. The bibliometric metrics chosen differ by SDG, but at least two bibliometric measures are always used.
Continuous metrics measure contributions to impact that vary continuously across a range – for example, the number of graduates with a health-related degree. These are usually normalised to the size of the institution.
Evidence metrics ask about policies and initiatives – for example, the existence of mentoring programmes – and require universities to provide evidence to support their claims. In these cases we give credit both for the evidence and for the evidence being public. These metrics are not usually size-normalised.
Evidence is evaluated against a set of criteria, and decisions are cross-validated where there is uncertainty. Evidence is not required to be exhaustive – we are looking for examples that demonstrate best practice at the institutions concerned.
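The size normalisation applied to continuous metrics can be sketched as a simple per-capita calculation. This is a hypothetical example: the denominator and scale THE actually uses are not specified in this methodology.

```python
def per_thousand_students(metric_count, fte_students):
    """Express a continuous metric per 1,000 full-time-equivalent students.

    Hypothetical normalisation: a raw count (e.g. the number of
    health-related graduates) is divided by institution size so that
    large and small universities can be compared on the same footing.
    """
    return 1000.0 * metric_count / fte_students
```

Under this sketch, a university with 250 health-related graduates and 5,000 FTE students would score 50 graduates per 1,000 students.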
Unless otherwise stated, the data used refer to the academic year that falls closest to the January–December 2017 calendar year.
Universities must teach undergraduates and be validated by a recognised accreditation body to be included in the ranking.
Institutions provide and sign off their institutional data for use in the rankings. On the rare occasions when a particular data point is not provided, we enter a value of zero.
The methodology was developed in conjunction with our partners Vertigo Ventures and Elsevier, and with consultation and input from individual universities, academics and sector groups.