University Impact Rankings: FAQs

We answer your questions about the Times Higher Education University Impact Rankings

September 17, 2019

Times Higher Education’s University Impact Rankings capture universities’ impact on society based on institutions’ success in delivering the United Nations’ Sustainable Development Goals.

A set of performance metrics was developed last year and published as a league table at the THE Innovation and Impact Summit held at the Korea Advanced Institute of Science and Technology (KAIST) in South Korea in April 2019.

We have now confirmed the metrics for the THE University Impact Rankings 2020, which will be based on all 17 SDGs. A document detailing the new metrics is attached at the bottom of this article.

If you would like to submit data, ask questions or offer feedback, please contact us at

You can discover more about how we are evaluating impact in this rankings blog.

Watch our webinar on the THE University Impact Rankings 2019

Note: The slides are attached at the bottom of the article



What does this ranking do that other rankings do not?

The Times Higher Education World University Rankings are designed for research-intensive global universities and are dominated by indicators of research excellence.

THE’s data team has also successfully pioneered new teaching-led rankings, focusing on teaching excellence and student success, in Japan, in the United States (in partnership with The Wall Street Journal) and in Europe.

This ranking, however, explores the impact that a university can make, specifically by looking at themes of sustainability.

What is the THE University Impact Rankings about?

The THE University Impact Rankings shows how the global higher education sector is working towards the United Nations’ Sustainable Development Goals (SDGs).

Why is the THE University Impact Rankings important?

It provides a showcase for the work being delivered by universities in our communities, and it is an opportunity to shine a light on institutional activities and efforts not covered in other rankings. It will allow us to demonstrate the differences a university is making to the world we live in.

Can all institutions participate in this ranking?

This ranking is open to any higher education institution in the world. We want this ranking to be as inclusive as possible.

Data collection is open to any university that teaches at an undergraduate level and is validated by a recognised accreditation body. There are no bibliometric-related eligibility criteria.

We will also accept data from others outside this group for wider analysis and editorial purposes.

This is different from the THE World University Rankings, which includes a minimum publication volume as part of the eligibility criteria.

If an institution does not provide any data, it will not be ranked.

If you would like to take part in the rankings, please email us. There is no participation fee. 

For more information on data collection please take a look at our guide and the list of questions you will be required to answer.

What is the time frame for this ranking?

The first version of the University Impact Rankings was launched in April 2019. Data collection for 2020 via our data collection portal will open in October 2019 and close on 16 January 2020. 

What are the UN Sustainable Development Goals?

There are 17 SDGs, which were adopted by the UN in 2015 to provide a framework for developing the world in a sustainable way.

These include:

  • Ending poverty and hunger
  • Promoting good health and well-being and quality education
  • Achieving gender equality and economic growth
  • Providing access to clean water and sanitation, and affordable and clean energy
  • Fostering innovation and reducing inequalities
  • Building sustainable cities and communities, and achieving responsible consumption and production
  • Tackling climate change
  • Sustainably managing life below water and life on land
  • Promoting peaceful societies and revitalising global partnerships

How will the ranking work?

The ranking is based on the 17 SDGs. Not every target in the SDGs relates directly to universities, but we believe that the higher education sector has a significant role to play in helping nations to deliver on the SDGs agenda. For each SDG, we have identified a limited set of metrics that can give an insight into progress.

In the first year, we collected data on 11 of the 17 goals from participating universities. For 2020, we are expanding this to all 17 SDGs.

Universities may provide data on one or more of the SDGs.

We produce an overall ranking of universities based on institutions’ data on SDG 17 (the only mandatory goal) plus their top three scores on the remaining SDGs. This will allow universities to demonstrate their excellence in the areas that are most relevant to them, their community and their country.

Rankings of the universities that are best achieving the individual SDGs will also be published.

My university is not active (or does not record data) across all SDGs – is it worth participating?

Not all universities will be able to report on all the metrics covered in the rankings. To be included in the overall ranking, we ask that you submit data on SDG 17, which is mandatory, and at least three other SDGs of your choice.

A university that submits data in fewer than three other SDGs cannot be included in the overall ranking. However, it can still be ranked in the tables on individual goals. For example, if you have done great work on climate action, submitting in that category alone would enable you to be ranked for it.

The ranking will reflect local activities as well as international activities.

What happens if we submit data for more than four SDG areas?

We will evaluate your performance in all areas and then choose the three goals in which you excel; these will count towards the overall university score.

How many rankings will THE produce?

THE will use provided data to produce:

  • An overall ranking of universities based on the top three SDGs for each individual university, plus SDG 17 (revitalising global partnerships)
  • Individual rankings of universities based on performance in each SDG

Will you rank all universities?

We will rank all universities that submit data and meet our eligibility criteria. 

Are other stakeholders involved in this ranking?

We are collaborating with experts in the field, including Vertigo Ventures, an organisation that helps leading research institutions globally to identify, capture and report the impact of their work. Our bibliometric supplier is Elsevier. We are also consulting with higher education institutions and groups to help us better define our metrics.





What is the rankings methodology?

The THE University Impact Rankings is created using the UN Sustainable Development Goals as a reference.

Each SDG has a small number of metrics associated with it.

Data will come from a variety of sources, including:

  • Direct submissions from institutions
  • Bibliometric datasets from Elsevier

The overall score will be calculated by counting SDG 17 (revitalising global partnerships) as a mandatory data field and combining this with data on the best three SDGs per university.
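This combination rule can be sketched in code. The sketch below is illustrative only: the weights are hypothetical placeholders, not THE's published values, and the function name is our own.

```python
# Hypothetical sketch of the overall-score combination described above:
# SDG 17 is mandatory, and a university's best three other SDG scores
# are added to it. The weights below are illustrative assumptions only.

SDG17_WEIGHT = 0.22  # assumed weight for the mandatory goal
OTHER_WEIGHT = 0.26  # assumed weight for each of the top three other goals

def overall_score(sdg_scores: dict) -> float:
    """Combine SDG 17 with a university's best three other SDG scores.

    `sdg_scores` maps SDG number -> score out of 100, e.g. {17: 80.0, 4: 91.2}.
    """
    if 17 not in sdg_scores:
        raise ValueError("SDG 17 is mandatory for the overall ranking")
    # Take the three highest scores among the remaining SDGs
    others = sorted(
        (score for sdg, score in sdg_scores.items() if sdg != 17),
        reverse=True,
    )[:3]
    if len(others) < 3:
        raise ValueError("Data on at least three other SDGs is required")
    return SDG17_WEIGHT * sdg_scores[17] + OTHER_WEIGHT * sum(others)
```

Because only the top three non-mandatory goals count, submitting data on additional SDGs can only help: weaker areas are simply ignored in the overall score.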

The metric weightings are in the file attached at the bottom of this page. Alternatively, please email for a copy of the document. 

Won’t this just favour big, established universities?

We have tried to define the metrics in a way that allows all universities to participate – this has included focusing on definitions that rely on less complex calculations than in an ideal world. We have also tried to ensure that the choice of metrics is not overly biased towards wealth.

As with the World University Rankings, we will normalise for university size where appropriate, and use other measures to ensure equity between different countries and universities.

We do not expect universities in different parts of the world to have the same areas of focus. By combining the SDGs and allowing flexibility, we open up the project to universities that have very different missions and ensure that institutions in Western Europe and North America do not have an unfair advantage.

How did you come up with the methodology?

THE has been discussing aspects of university impact for several years. This has included a lengthy consultation with interested parties, culminating in an open session at the THE Young Universities Summit in Florida in June 2018.

Other crucial aspects informing our decision were feasibility and access to data.

How do you come up with the final scores?

Numerical answers are calculated and then z-scored.

Evidence-based answers are scored as described in the above webinar, with credit given for the answer, for openness, for recency, and (depending on the question) for factors such as cost.

Continuous data are evaluated and then z-scored to give a score that can be compared across universities. This is then weighted according to the percentage assigned to that metric.
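The z-scoring step described above can be sketched as follows. This is a minimal illustration of standardising a continuous metric across institutions and applying a metric weight; THE's exact rescaling details are not published here, and the function names are our own.

```python
# Minimal sketch of z-scoring a continuous metric across universities:
# each institution's score is its distance from the mean, in standard
# deviations, then scaled by the metric's weight. Illustrative only.

from statistics import mean, stdev

def z_scores(values):
    """Standardise a list of raw metric values across all universities."""
    mu, sigma = mean(values), stdev(values)
    return [(v - mu) / sigma for v in values]

def weighted_metric(values, weight):
    """Apply the metric's percentage weight to each standardised score."""
    return [weight * z for z in z_scores(values)]
```

Standardising in this way means an institution is rewarded for being ahead of its peers on a metric, regardless of the metric's raw units.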

How should a university interpret the results of the overall ranking if different universities are supplying data on different areas?

The overall ranking provides a sense of which universities are making positive steps towards the SDGs most strongly. 

Universities can be compared more easily in the individual SDG tables. 



Data collection and validation

Data collection 


What are your data sources?

We invite universities to submit data in a subset of SDGs. For each SDG, there will be some data that are collected from universities as well as bibliometric data provided by Elsevier.

How will you validate the data?

Universities will be asked to provide evidence or documentation to support their submissions. Clarifications and/or URLs to sites with evidence are requested under the caveat section and/or the data submission section in the portal.

Information will be cross-checked against external sources at our discretion, and we reserve the right to investigate institutions where we believe inappropriate data collection or submission has taken place.

We encourage universities to publish their evidence, and in many cases we expect the evidence to be sourced from existing public sources, for example, annual reports. Public documents do not have password protections or time limits. 

Our team of analysts will compare evidence that is provided to the definitions, and it will be marked accordingly.

What types of evidence do you accept?

We accept links to documents or websites and publicly available timetables, brochures, magazines and articles.

If provided documents are confidential, universities must explicitly indicate this in the caveats.

If evidence is included in a broader document, we expect the institution to point out the specific part of the document where the evidence for the question can be found.

We are not looking for a large volume of evidence; rather, we ask institutions to submit the best pieces of evidence. We allow up to three evidence items to be uploaded for each question, where relevant. 

We do not expect universities to submit all the evidence in English. 

If one piece of evidence is applicable for more than one question you can re-submit the same piece of evidence. 

More credit will be given to publicly available evidence, but we do not rate different formats of evidence differently. For example, we do not consider public websites more or less important than a public PDF document or brochure.

You cannot upload videos as evidence but you can provide a URL that includes a video on the page.

We look for evidence from the "university as a whole" rather than, for example, a course in a single department.

What is your process for assessing the quality of qualitative evidence?

Where we are looking for evidence of action – for example the existence of mentoring programmes – our metrics require universities to provide evidence to support their claims. In these cases we will give credit for the evidence, and for the evidence being public.

We are not in a position to explore whether a policy is the best possible one, but we do aim to assess the evidence submitted consistently and fairly.

Evidence is evaluated against a set of criteria and decisions are cross-validated where there is uncertainty. Evidence is not required to be exhaustive – we are looking for examples that demonstrate best practice at the institutions concerned.

Which year should data be based on for the 2020 ranking?

When quoting statistics (continuous data), we ask that the data come from the calendar year 2018, the academic year 2017-18 or the financial year that ended in 2018. These are only examples: you may use the annual cycle that best fits your data, provided it ends in 2018.

When answering the picklist yes/no questions, the evidence provided does not have to be from 2018. We ask that you provide evidence that best answers the question at hand and that is relatively recent (no more than two years old).

For policies, we ask an institution to submit the date the policy was created and the date it was last reviewed. At least one of these dates must be submitted to establish whether a policy is active. We expect policies to be regularly reviewed, meaning a policy should have been created or reviewed in the last five years.

If we provide evidence this year that we already provided last year, will we still receive credit for that evidence?

Where evidence from last year is still valid you can reuse it. We don’t necessarily expect policies to have changed.

How do we deal with measures that are already regulated by state or federal law?

Laws specify minimum standards and tell institutions what they cannot do. Policies should explain how particular laws are reflected in practice in the university. So, in most cases, we would expect a policy alongside the law. Please provide a URL to the relevant law on the government website. 

However, there are exceptions. For example, in Spain academic freedom is a constitutional requirement, so we will accept this as meaning that institutions in the country have a policy on supporting academic freedom. 

If you think there are other exceptions please contact us at

Must universities submit data for all SDGs in order to participate?

Only SDG 17 (global partnerships) is a mandatory data field.

Otherwise, universities may submit data on as many SDGs as they would like or are able to.

We do not have all the data needed for a specific SDG – what will happen?

If certain data points within an SDG cannot be answered because the data are not available, the institution will receive a score of zero for each of those data points. The institution can still be ranked in that SDG but will score lower than institutions that are able to provide all the data. We encourage you to provide data wherever you can, and to start recording data for future years, too.

This year, the data collection portal offers a 'not applicable' option. So, if an institution does not offer any programmes in a particular area, then 'not applicable' should be checked. Not applicable corresponds to a 'no'.

Do you have a detailed description of the data fields?

We are providing a data collection user guide that explains key aspects of the process, including data field definitions.

If you have any queries, please send your questions to:

How do you assess research publications for each SDG?

Together with Elsevier we have identified a series of metrics and devised a query of keywords for each SDG.

The research component for each SDG is made up of two or three metrics. These can include:

  • Proportion of a university’s output that is viewed or downloaded
  • Proportion of a university’s output that is cited in SDG-specific guidance
  • Number of publications
  • Proportion of papers in the top 10 per cent of journals as defined by CiteScore
  • Proportion of a university’s output that is authored by women
  • Number of patents
  • Field-weighted citation impact of papers produced by the university
  • Proportion of academic publications that are co-authored with someone from a university outside the home country

How do you define 'number of students'?

Number of students means number of full time equivalent students in all years and all programmes that lead to a degree, certificate, institutional credit or other qualification. We are looking for undergraduate and postgraduate students who are studying for higher education programmes such as bachelor’s, master’s, doctoral or other equivalent degrees or components of those programmes. Please do not include postdoctoral students. We use the International Standard Classification of Education (ISCED) as a guiding framework.

Some of the data fields must be answered with a yes/no response. How are these scored?

If you answer 'no' to indicate that your institution does not have a particular policy or does not do a particular activity then you will receive a score of zero in that area.

If you answer 'yes', we will request evidence from you. We then evaluate the evidence provided based on how specific it is in answering the question and whether it is public or not.

Some of the data fields must be answered with a response of available, unavailable or withheld. Why?

'Available' means you have the data and you therefore must enter a value. If you do not have the data you can click 'unavailable' and provide us with the reason. 'Withheld' means you have the data but you are not willing to share it with us.

This approach means that we know whether the data provider has skipped a question or left it blank for a reason.

Will participating institutions be able to benchmark their data against peers?

Yes. There will be an opportunity for benchmarking, but the process is yet to be determined.



