THE University Impact Rankings 2019 by SDG: industry, innovation and infrastructure methodology

April 2, 2019

This ranking focuses on universities’ role in fostering innovation and serving the needs of industry. It explores institutions’ research on industry and innovation, their number of patents and spin-off companies, and their research income from industry.

View the methodology for the University Impact Rankings 2019 to find out how these data are used in the overall ranking.


Research on industry, innovation and infrastructure (11.6%)

This focuses on research that is relevant to industry, innovation and infrastructure, measuring the volume of research produced.

The data are provided by Elsevier’s Scopus dataset, based on a query of keywords associated with SDG 9 (industry, innovation and infrastructure). It includes all indexed publications between 2013 and 2017. The data are normalised across their range using z-scoring.
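Z-scoring rescales each institution’s raw value into the number of standard deviations it sits above or below the mean across all institutions, so metrics on very different scales can be compared. A minimal sketch (the publication counts below are hypothetical, not THE data):

```python
import statistics

def z_scores(values):
    """Return each value's z-score relative to the whole list."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)  # population standard deviation
    return [(v - mean) / stdev for v in values]

# Hypothetical SDG 9 publication counts for five institutions
counts = [120, 300, 450, 800, 1330]
print([round(z, 2) for z in z_scores(counts)])
```

By construction the z-scores of a cohort sum to zero; an institution at the cohort mean scores 0 regardless of the metric’s absolute scale.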

Patents (15.4%)

This is defined as the number of patents that cite research conducted by the university.

The data are provided by Elsevier and relate to patents published between 2013 and 2017 (not research published between these dates). Patents are sourced from the World Intellectual Property Organisation, the European Patent Office, and the patent offices of the US, UK, and Japan. The data are normalised across the range using z-scoring.

University spin-offs (34.6%)

University spin-offs are defined as registered companies set up to exploit intellectual property that has originated from within the institution. They must have been established at least three years ago and still be active.

The data were provided directly by universities and normalised across the range using z-scoring. 

Research income from industry (38.4%)

This metric reflects the ability of the university to generate new research and is also used in the Times Higher Education World University Rankings. It measures the amount of research income an institution earns from industry (adjusted for purchasing-power parity), scaled against the number of academic staff it employs.

The data are subject-weighted against three broad areas: STEM; medicine; and arts, humanities and social sciences. This is normalised by the number of full-time equivalent staff in each area.
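The subject-weighting described above can be sketched as income per full-time-equivalent staff member computed within each broad area and then combined. The figures and the equal weights below are illustrative assumptions, not THE’s published values:

```python
AREAS = ["STEM", "medicine", "arts, humanities and social sciences"]

def weighted_income_per_fte(income_by_area, fte_by_area, weights):
    """Combine per-area industry income per FTE staff into one score."""
    per_area = {a: income_by_area[a] / fte_by_area[a] for a in AREAS}
    return sum(weights[a] * per_area[a] for a in AREAS)

# Hypothetical institution: industry income (PPP-adjusted) and FTE staff
income = {"STEM": 4_000_000, "medicine": 2_500_000,
          "arts, humanities and social sciences": 300_000}
fte = {"STEM": 400, "medicine": 150,
       "arts, humanities and social sciences": 250}
weights = {a: 1 / 3 for a in AREAS}  # placeholder equal weights

print(round(weighted_income_per_fte(income, fte, weights)))
```

Normalising within each area before combining prevents a large medical school, say, from dominating the score purely through the scale of its income.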

The data were provided directly by universities and normalised across the range using z-scoring.


When we ask about policies and initiatives, our metrics require universities to provide evidence to support their claims. Evidence is evaluated against a set of criteria, and decisions are cross-validated where there is uncertainty. Evidence is not required to be exhaustive – we are looking for examples that demonstrate best practice at the institutions concerned.


Unless otherwise stated, the data used refer to the closest academic year to January to December 2017.


Universities must teach undergraduates and be validated by a recognised accreditation body to be included in the ranking.

Data collection

Institutions provide and sign off their institutional data for use in the rankings. On the rare occasions when a particular data point is not provided, we enter a value of zero.

The methodology was developed in conjunction with our partners Vertigo Ventures and Elsevier, and after consultation and input from individual universities, academics, and sector groups.
