Impact Rankings: FAQs

We answer your questions about the Times Higher Education Impact Rankings

April 17, 2020

Times Higher Education’s Impact Rankings capture universities’ impact on society based on institutions’ success in delivering the United Nations’ Sustainable Development Goals.

A set of performance metrics was developed last year and published as a league table at the THE Innovation and Impact Summit held at the Korea Advanced Institute of Science and Technology (KAIST) in South Korea in April 2019.

We have now confirmed the metrics for the THE Impact Rankings 2020, which will be based on all 17 SDGs. A document detailing the new metrics is attached at the bottom of this article.

If you would like to submit data, ask questions or offer feedback, please contact us at impact@timeshighereducation.com

You can discover more about how we are evaluating impact in this rankings blog.


Watch our webinar on the THE Impact Rankings 2019

Note: The slides are attached at the bottom of the article


General

 

What does this ranking do that other rankings do not?

The Times Higher Education World University Rankings are designed for research-intensive global universities and are dominated by indicators of research excellence.

THE’s data team has also successfully pioneered new teaching-led rankings, focusing on teaching excellence and student success, in Japan and in the United States (in partnership with The Wall Street Journal).

The Impact Rankings, however, explore the impact that a university can make, specifically by looking at themes of sustainability.

What is the THE Impact Rankings about?

The THE University Impact Rankings shows how the global higher education sector is working towards the United Nations’ Sustainable Development Goals (SDGs).

Why is the THE Impact Rankings important?

It provides a showcase for the work being delivered by universities in our communities, and it is an opportunity to shine a light on institutional activities and efforts not covered in other rankings. It will allow us to demonstrate the differences a university is making to the world we live in.

Can all institutions participate in this ranking?

This ranking is open to any higher education institution in the world. We want this ranking to be as inclusive as possible.

Data collection is open to any university that teaches at an undergraduate level and is validated by a recognised accreditation body. There are no bibliometric-related eligibility criteria.

We will also accept data from others outside this group for wider analysis and editorial purposes.

This is different from the THE World University Rankings, which includes a minimum publication volume as part of the eligibility criteria.

If an institution does not provide any data, it will not be ranked.

If you would like to take part in the rankings please email impact@timeshighereducation.com. There is no participation fee. 

What is the time frame for this ranking?

The first version of the Impact Rankings was launched in April 2019. Data collection for 2020 via our data collection portal ran from October 2019 to January 2020. 

Our third data collection cycle is likely to commence in October 2020. The confirmed date will be announced soon.

What are the UN Sustainable Development Goals?

There are 17 SDGs, which were adopted by the UN in 2015 to provide a framework for developing the world in a sustainable way.

These include ending poverty and hunger; promoting good health and well-being and quality education; achieving gender equality and economic growth; providing access to clean water and sanitation and affordable and clean energy; fostering innovation; reducing inequalities; building sustainable cities and communities and achieving responsible consumption and production; tackling climate change; sustainably managing life below water and life on land; promoting peaceful societies; and revitalising global partnerships.

How will the ranking work?

The ranking is based on the 17 SDGs. Not every target in the SDGs relates directly to universities, but we believe that the higher education sector has a significant role to play in helping nations to deliver on the SDGs agenda. For each SDG, we have identified a limited set of metrics that can give an insight into progress.

In the first year, we collected data on 11 of the 17 goals from participating universities. For 2020, we are expanding this to all 17 SDGs.

Universities may provide data on one or more of the SDGs.

We produce an overall ranking of universities based on institutions’ data for SDG 17 (the only mandatory goal) plus their best three results on the remaining SDGs. This will allow universities to demonstrate their excellence in the areas that are most relevant to them, their community and their country.

Rankings of the universities that are best achieving the individual SDGs will also be published.

My university is not active (or does not record data) across all SDGs – is it worth participating?

Not all universities will be able to report on all the metrics covered in the rankings. To be included in the overall ranking, we ask that you submit data on SDG 17, which is mandatory, and at least three other SDGs of your choice.

A university that submits data in fewer than three other SDGs cannot be included in the overall ranking. However, it can still be ranked in the tables on individual goals. For example, if you have done great work on climate action, submitting in that category alone would enable you to be ranked for it.

The ranking will reflect local activities as well as international activities.

   
   

What happens if we submit data for more than four SDG areas?

We will evaluate your performance in all areas and then choose the three goals in which you excel; these will count towards the overall university score.

How many rankings will THE produce?

THE will use provided data to produce:

  • An overall ranking of universities based on the top three SDGs for each individual university, plus SDG 17 (revitalising global partnerships)
  • Individual rankings of universities based on performance in each SDG

Will you rank all universities?

We will rank all universities that submit data and meet our eligibility criteria. 

Are other stakeholders involved in this ranking?

We are collaborating with experts in the field, including Vertigo Ventures, an organisation that helps leading research institutions globally to identify, capture and report the impact of their work. Our bibliometric supplier is Elsevier. We are also consulting with higher education institutions and groups to help us better define our metrics.

What is the first step for an institution looking to participate in the THE Impact Rankings?

The first step to participate in our rankings is to create a profile for your institution if it does not already exist. There are no fees or costs to participate.

Participation (and evaluation) will depend on the provision of necessary data and the fulfilment of eligibility entry criteria. Data collection is open to any university that teaches at an undergraduate level and is validated by a recognised accreditation body. We also accept data from institutions outside this group for wider analysis and editorial purposes.

The institution needs to nominate a data provider and an approver (the head of institution). For both, the following details are required:

  • First name
  • Last name
  • Institutional email
  • Job title
  • Department
  • Telephone number

Once a data provider contact has been provided, they will receive an email with instructions on how to submit relevant information.

For further questions contact impact@timeshighereducation.com

 

 

Methodology

 

What is the rankings methodology?

The THE Impact Rankings is created using the UN Sustainable Development Goals as a reference.

Each SDG has a small number of metrics associated with it.

Data will come from a variety of sources, including:

  • Direct submissions from institutions
  • Bibliometric datasets from Elsevier

The overall score will be calculated by counting SDG 17 (revitalising global partnerships) as a mandatory data field and combining this with data on the best three SDGs per university.

The metric weightings are in the file attached at the bottom of this page. Alternatively, please email impact@timeshighereducation.com for a copy of the document. 

Won’t this just favour big, established universities?

We have tried to define the metrics in a way that allows all universities to participate – this has included favouring definitions that rely on simpler calculations than we might use in an ideal world. We have also tried to ensure that the choice of metrics is not overly biased towards wealth.

As with the World University Rankings, we will normalise for university size where appropriate, and use other measures to ensure equity between different countries and universities.

We do not expect universities in different parts of the world to have the same areas of focus. By combining the SDGs and allowing flexibility, we open up the project to universities that have very different missions and ensure that institutions in Western Europe and North America do not have an unfair advantage.

How did you come up with the methodology?

THE has been discussing aspects of university impact for several years. This has included a lengthy consultation with interested parties, culminating in an open session at the THE Young Universities Summit in Florida in June 2018.

Other crucial aspects informing our decision were feasibility and access to data.

How do you come up with the final scores?

Numerical answers are calculated and then z-scored.

Evidence-based answers are scored as described in the above webinar, with credit given for the answer, for openness, for recency, and (depending on the question) for factors such as cost.

Continuous data is evaluated and then z-scored to give a score that can be compared to other universities. This is then weighted to the correct percentage for that metric.
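
As a simplified illustration only – not our production calculation – the sketch below shows how a continuous metric might be z-scored across submitting universities and then weighted. The 0-100 rescaling step and the example 10 per cent weight are assumptions made purely for illustration.

    # Simplified illustration only: z-scoring a continuous metric across
    # universities and applying a metric weight. The 0-100 rescaling and the
    # example weight are assumptions for illustration.
    from statistics import mean, stdev

    def z_scores(values):
        """Standardise raw metric values submitted by all universities."""
        mu, sigma = mean(values), stdev(values)
        return [(v - mu) / sigma for v in values]

    def weighted_metric_scores(values, weight):
        """Rescale z-scores to 0-100 across the cohort, then apply the
        metric's percentage weight (e.g. 0.10 for a 10 per cent metric)."""
        z = z_scores(values)
        low, high = min(z), max(z)
        scaled = [100 * (v - low) / (high - low) for v in z]
        return [round(s * weight, 1) for s in scaled]

    # Example: four universities' raw values for a metric worth 10 per cent of an SDG
    print(weighted_metric_scores([12.0, 45.5, 30.2, 8.9], weight=0.10))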

How should a university interpret the results of the overall ranking if different universities are supplying data on different areas?

The overall ranking provides a sense of which universities are making positive steps towards the SDGs most strongly. 

Universities can be compared more easily in the individual SDG tables. 

Why did we select SDG 17 as mandatory?

SDG 17 can be considered a meta-SDG: it highlights the cooperation and publishing aspects of the goals – working together through partnerships and collaborations in order to achieve the 2030 agenda. However, because we have selected it as the mandatory SDG, we have reduced its weight in the overall score: while every other SDG counted towards the overall ranking is weighted at 26 per cent, SDG 17 makes up only 22 per cent of the overall score.
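
For illustration, the sketch below shows how those stated weights could combine into an overall score, assuming each SDG has already been scored out of 100. It simply mirrors the weighting described above (22 per cent for SDG 17 plus 26 per cent for each of the best three other SDGs, which together sum to 100 per cent) and is not our production code.

    # Simplified illustration: combining SDG 17 (22%) with the best three other
    # SDG scores (26% each), assuming each SDG is already scored out of 100.
    def overall_score(sdg_scores):
        """sdg_scores maps SDG number -> score out of 100; SDG 17 is mandatory."""
        best_three = sorted(
            (score for sdg, score in sdg_scores.items() if sdg != 17),
            reverse=True,
        )[:3]
        return 0.22 * sdg_scores[17] + sum(0.26 * s for s in best_three)

    # Example: mandatory SDG 17 plus strong results on SDGs 4, 13 and 5
    print(overall_score({17: 80.0, 4: 92.0, 13: 88.0, 5: 75.0, 3: 60.0}))
    # 0.22 * 80 + 0.26 * (92 + 88 + 75) = 17.6 + 66.3 = 83.9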

Can we participate in SDG 5 (gender equality) if we are a women-only institution?

Yes. The substance of this SDG is about addressing women's representation in, and access to, higher education. So, if you are a women-only institution with no enrolled male students, this will not negatively affect your score for this SDG.

 

 

Data collection and validation

Data collection 

 

What are your data sources?

We invite universities to submit data in a subset of SDGs. For each SDG, there will be some data that are collected from universities as well as bibliometric data provided by Elsevier.

How will you validate the data?

Universities will be asked to provide evidence or documentation to support their submissions. 

Information will be cross-checked against external sources at our discretion, and we reserve the right to investigate institutions where we believe inappropriate data collection or submission has taken place.

We encourage universities to publish their evidence, and in many cases we expect the evidence to be sourced from existing public sources, for example, annual reports. Public documents do not have password protections or time limits. 

Our team of analysts will compare evidence that is provided to the definitions, and it will be marked accordingly.

What types of evidence do you accept?

We accept links to documents or websites and publicly available timetables, brochures, magazines and articles.

If provided documents are confidential, universities must explicitly indicate this in the caveats.

We are not looking for a large volume of evidence; rather, we ask institutions to submit the best pieces of evidence. We allow up to three evidence items to be uploaded for each question, where relevant. 

We do not expect universities to submit all the evidence in English. 

If one piece of evidence is applicable for more than one question you can re-submit the same piece of evidence. 

More credit will be given to publicly available evidence but we do not rate different forms of evidence differently. For example, we do not consider public websites more or less important than a public PDF document or brochure.

You cannot upload videos as evidence but you can provide a URL that includes a video on the page.

We look for evidence from the "university as a whole" rather than, for example, a course in a single department.

What is your process for assessing the quality of qualitative evidence?

Where we are looking for evidence of action – for example the existence of mentoring programmes – our metrics require universities to provide evidence to support their claims. In these cases we will give credit for the evidence, and for the evidence being public.

We are not in a position to explore whether a policy is the best possible one, but we do aim to assess the evidence submitted consistently and fairly.

Evidence is evaluated against a set of criteria and decisions are cross-validated where there is uncertainty. Evidence is not required to be exhaustive – we are looking for examples that demonstrate best practice at the institutions concerned.

Which year should data be based on for the 2020 ranking?

When quoting statistics (continuous data), we ask that the data be specifically from the calendar year 2018, the academic year 2017-18 or the financial year that ended in 2018. However, these are only examples: you may use whichever annual reporting cycle best fits your data, provided it ends in 2018.

When answering questions requiring evidence, the evidence provided does not have to be from 2018. We ask that you provide evidence that best answers the question at hand and that is relatively recent (no more than two years old).

For policies, we ask an institution to submit the date the policy was created and the date it was last reviewed. At least one of these dates must be submitted so that we can establish whether a policy is active. We expect policies to be regularly reviewed, meaning a policy should have been created or reviewed in the last five years.
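
As a small illustration of that recency rule only – with the cut-off year hard-coded for the 2020 cycle – a policy's active status could be checked like this:

    # Illustration of the policy-recency rule above: a policy counts as active
    # if it was created or last reviewed within the past five years. The
    # cut-off year is hard-coded for the 2020 cycle purely for illustration.
    def policy_is_active(created_year=None, reviewed_year=None, current_year=2020):
        dates = [y for y in (created_year, reviewed_year) if y is not None]
        if not dates:
            return False  # at least one of the two dates must be submitted
        return max(dates) >= current_year - 5

    print(policy_is_active(created_year=2013, reviewed_year=2018))  # True
    print(policy_is_active(created_year=2012))                      # False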

If we provide evidence this year that we already provided last year, will we still receive credit for that evidence?

Where evidence from last year is still valid you can reuse it. We don’t necessarily expect policies to have changed.

How do we deal with measures that are already regulated by state or federal law?

Laws specify minimum standards and tell institutions what they cannot do. Policies should explain how particular laws are reflected in practice in the university. So, in most cases, we would expect a policy alongside the law. Please provide a URL to the relevant law on the government website. 

However, there are exceptions. For example, in Spain academic freedom is a constitutional requirement, and we will therefore accept this as meaning that institutions in the country have a policy supporting academic freedom. 

If you think there are other exceptions please contact us at impact@timeshighereducation.com.

Must universities submit data for all SDGs in order to participate?

Only SDG 17 (global partnerships) is a mandatory data field.

Otherwise, universities may submit data on as many SDGs as they would like or are able to.

We do not have all the data needed for a specific SDG – what will happen?

If certain data points within an SDG cannot be answered because the data are not available, the institution will receive a score of zero for those specific data points. The institution can still be ranked in that SDG but will score at a lower level than institutions that are able to provide all the data. We would encourage you to provide data wherever you can, and to look to record data for future years, too.

This year, the data collection portal offers a 'not applicable' option. So, if an institution does not offer any programmes in a particular area, then 'not applicable' should be checked. Not applicable corresponds to a 'no'.

Do you have a detailed description of the data fields?

We are providing a data collection user guide that explains key aspects of the process, including data field definitions.

If you have any queries, please send your questions to: impact@timeshighereducation.com

What do you mean by 'university as a body'?

When we refer to the 'university as a body', we mean that you should provide evidence wherever your institution, rather than individuals or faculties, works towards the metric.

The work done by individuals – for instance, a lecturer or researcher working for the university – can be accepted as evidence if their work is associated with an institutional action: for example, a local or national programme of environmental education that is delivered by the researcher but supported or carried out by the university.

How do you assess research publications for each SDG?

Together with Elsevier we have identified a series of metrics and devised a query of keywords for each SDG.

The research component for each SDG is made up of two or three metrics. These can include:

  • Proportion of a university’s output that is viewed or downloaded
  • Proportion of a university’s output that is cited in SDG-specific guidance
  • Number of publications
  • Proportion of papers in the top 10% of journals, as defined by CiteScore
  • Proportion of a university’s output that is authored by women
  • Number of patents
  • Field-Weighted Citation Impact of papers produced by the university
  • Proportion of academic publications co-authored with someone from a university outside the home country
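
As a simplified illustration only – the field names and records below are invented, and the real research metrics are calculated from Scopus data supplied by Elsevier – proportion-style metrics of this kind can be computed as follows:

    # Simplified illustration: computing two proportion-style metrics from an
    # invented list of publication records. Real values come from Scopus/Elsevier.
    publications = [
        {"title": "A", "female_authored": True,  "international_coauthor": False},
        {"title": "B", "female_authored": False, "international_coauthor": True},
        {"title": "C", "female_authored": True,  "international_coauthor": True},
    ]

    def proportion(pubs, flag):
        """Share of publications for which the given flag is true."""
        return sum(1 for p in pubs if p[flag]) / len(pubs) if pubs else 0.0

    print("Authored by women:", proportion(publications, "female_authored"))
    print("International co-authors:", proportion(publications, "international_coauthor"))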

Can the keyword search terms be accessed?

All research metrics are measured against a keyword search of the Scopus dataset. The search terms are available here: https://data.mendeley.com/datasets/87txkw7khs/1

How do you define 'number of students'?

'Number of students' means the number of full-time equivalent students in all years and all programmes that lead to a degree, certificate, institutional credit or other qualification. We are looking for undergraduate and postgraduate students who are studying on higher education programmes such as bachelor’s, master’s, doctoral or other equivalent degrees, or components of those programmes. Please do not include postdoctoral students. We use the International Standard Classification of Education (ISCED) as a guiding framework.

What is 'open data'?

Open data means that the data can be easily read and used by others – ideally under an open licence. Technically this can mean many things, but usually documents and images would not count, whereas spreadsheets, CSV files and API access would. Open data does not mean the data are available as a table within a PDF.
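
To make the distinction concrete, here is a small, hypothetical example of publishing a dataset as CSV, a format that spreadsheets, scripts and other tools can read directly, unlike a table embedded in a PDF. The file name and figures are invented.

    # Hypothetical example: writing a small dataset as CSV so that spreadsheets,
    # scripts or an API consumer can read it directly, in contrast to a table
    # embedded in a PDF. File name and figures are invented.
    import csv

    rows = [
        {"year": 2018, "metric": "water_reused_litres", "value": 125000},
        {"year": 2018, "metric": "renewable_energy_share", "value": 0.42},
    ]

    with open("sustainability_data.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["year", "metric", "value"])
        writer.writeheader()
        writer.writerows(rows)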

Will participating institutions be able to benchmark their data against peers?

Yes. There will be an opportunity for benchmarking, using the SDG Impact Dashboard. It will provide you with a detailed but easy-to-understand analysis of performance in the THE Impact Rankings. Strategic planners in particular will benefit from the user-friendly benchmarking and competitor analysis tools, which can be customised for both global and domestic regions.

Contact data@timeshighereducation.com if you would like to learn more about the SDG Impact Dashboard.

 



