The Wall Street Journal/Times Higher Education College Ranking is a pioneering ranking of US colleges and universities that puts student success and learning – based on 200,000 current student voices – at its heart.
The ranking includes clear performance indicators designed to answer the questions that matter most to students and their families when making one of the most important decisions of their lives – whom to trust with their education. These questions include: does the college have sufficient resources to teach me properly? Will I be engaged and challenged by my teachers and classmates? Does the college have a good academic reputation? What type of campus community is there? How likely am I to graduate, pay off my loans and get a good job?
The ranking includes the results of the THE US Student Survey, which examines a range of key issues including students’ engagement with their studies, their interaction with their teachers and their satisfaction with their experience.
The ranking adopts a balanced scorecard approach, with 15 individual performance indicators combining to create an overall score that reflects the broad strength of the institution.
For all enquiries and questions about this ranking, please email: email@example.com
Data comes from a variety of sources: the US government (Integrated Postsecondary Education Data System or IPEDS), the US Department of Education’s Federal Student Aid (FSA) centre, the College Scorecard and the Bureau of Economic Analysis (BEA), as well as the THE US Student Survey, the THE Academic Reputation Survey, and Elsevier’s bibliometric dataset.
Our data is, in most cases, normalised so that the value we assign in each metric can be sensibly compared with other metrics.
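The exact normalisation method is not spelled out here; purely as an illustrative sketch (assuming simple z-score standardisation, with hypothetical figures), rescaling a metric so it can be compared with others might look like this:

```python
from statistics import mean, stdev

def z_normalise(values):
    """Standardise a metric so scores from different metrics share a
    common scale (mean 0, standard deviation 1). Illustrative only:
    the ranking's actual normalisation procedure may differ."""
    mu, sigma = mean(values), stdev(values)
    return [(v - mu) / sigma for v in values]

# Hypothetical raw "finance per student" figures for three colleges
raw = [12000.0, 18000.0, 30000.0]
scores = z_normalise(raw)
```

After this transformation, a score of 1.0 on spending means the same thing (one standard deviation above the mean) as a score of 1.0 on, say, graduation rate, which is what makes combining metrics sensible.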
The overall methodology explores four key areas:
Does the college have the capacity to effectively deliver teaching? The Resources area represents 30 per cent of the overall ranking. Within this we look at:
- Finance per student (11%)
- Faculty per student (11%)
- Research papers per faculty (8%)
Does the college effectively engage with its students? Most of the data in this area is gathered through the THE US Student Survey. The Engagement area represents 20 per cent of the overall ranking. Within this we look at:
- Student engagement (7%)
- Student recommendation (6%)
- Interaction with teachers and students (4%)
- Number of accredited programmes (3%)
Does the college generate good and appropriate outputs? Does it add value to the students who attend? The Outcomes area represents 40 per cent of the overall ranking. Within this we look at:
- Graduation rate (11%)
- Value added to graduate salary (12%)
- Value added to loan default (7%)
- Academic reputation (10%)
Is the college providing a learning environment for all students? Does it make efforts to attract a diverse student body and faculty? The Environment area represents 10 per cent of the overall ranking. Within this we look at:
- Proportion of international students (2%)
- Student diversity (3%)
- Student inclusion (2%)
- Staff diversity (3%)
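Taken together, the 15 indicators above combine into the overall score as a weighted sum. The sketch below is illustrative only: the indicator names are shorthand of our own, and it assumes each indicator has already been normalised to a common 0–100 scale.

```python
# Weights from the methodology above, as fractions of the overall score
WEIGHTS = {
    # Resources (30%)
    "finance_per_student": 0.11, "faculty_per_student": 0.11, "papers_per_faculty": 0.08,
    # Engagement (20%)
    "student_engagement": 0.07, "student_recommendation": 0.06,
    "interaction": 0.04, "accredited_programmes": 0.03,
    # Outcomes (40%)
    "graduation_rate": 0.11, "value_added_salary": 0.12,
    "value_added_default": 0.07, "academic_reputation": 0.10,
    # Environment (10%)
    "international_students": 0.02, "student_diversity": 0.03,
    "student_inclusion": 0.02, "staff_diversity": 0.03,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # the 15 weights cover 100%

def overall_score(indicator_scores):
    """Weighted sum of the 15 normalised indicator scores (each 0-100)."""
    return sum(WEIGHTS[name] * indicator_scores[name] for name in WEIGHTS)
```

A college scoring 80 on every indicator would score 80 overall; strength in a heavily weighted indicator such as value-added salary (12%) moves the overall score six times as much as the same strength in international students (2%).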
Students and their families need to know that their college has the right resources to provide the facilities, tuition and support that are needed to succeed at college.
By looking at the amount of money that each institution spends on teaching per student (11%), we can get a clear sense of whether it is well funded and has the money to provide a positive learning environment. This metric takes into account spending on both undergraduate and graduate programmes, which is consistent with the way the relevant spend data is available in the Integrated Postsecondary Education Data System (IPEDS). Schools are required by the Department of Education to report key statistics such as this to IPEDS, making it a comprehensive source for education data. The data on academic spending per institution are adjusted for regional price differences, using regional price parities data from the US Department of Commerce’s Bureau of Economic Analysis.
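Because BEA regional price parities are published as an index with the national average set at 100, the adjustment can be sketched as a simple deflation (the figures below are hypothetical):

```python
def rpp_adjust(spend_per_student, regional_price_parity):
    """Deflate nominal teaching spend by the region's price level.
    BEA regional price parities are indexed so the US average = 100."""
    return spend_per_student / (regional_price_parity / 100.0)

# Hypothetical: the same $20,000 nominal spend in a high-cost region
# (RPP 115) and a low-cost region (RPP 90)
high_cost = rpp_adjust(20000, 115)  # ≈ 17391 in national-price terms
low_cost = rpp_adjust(20000, 90)    # ≈ 22222 in national-price terms
```

The point of the adjustment is visible in the example: identical nominal spending buys more teaching in a cheap region than in an expensive one, so the low-cost college's adjusted figure is higher.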
By looking at the ratio of students to faculty members (11%), we get an overall sense as to whether the college has enough teachers to teach. It gives a broad indication of how likely it is that a student will receive the individual attention that may be required to succeed at college, and hints at potential class sizes. The source of this statistic is IPEDS. For the 2018 ranking, we have chosen to use the average of two years’ data, which provides a more stable long-term view of this metric.
Faculty who are experts in their academic fields and who are engaged in pushing the boundaries of knowledge can significantly enhance a student’s educational experience by demonstrating, for example, the power of real-world problem-solving.
So our teaching resources pillar also offers a sense as to whether faculty are experts in their academic disciplines, by looking at research excellence. We look at the number of published scholarly research papers per faculty (8%) at each institution, giving a sense of research productivity, and testing to see whether faculty are able to produce research that is suitable for publication in the world’s top academic journals, as indexed by Elsevier.
Decades of research have found that the best way to truly understand teaching quality at an institution – how well it manages to inform, inspire and challenge its students – is through capturing what is known as “student engagement”. This was described by Malcolm Gladwell (in The New Yorker in 2011) as “the extent to which students immerse themselves in the intellectual and social life of their college – and a major component of engagement is the quality of a student’s contacts with faculty”.
Times Higher Education has captured student engagement across the US through its US Student Survey, carried out in partnership with two leading market research providers. For 2016 and 2017, we gathered the views of more than 200,000 current college and university students on a range of issues relating directly to their experience at college. Students answer 12 core questions about their experience, either multiple choice or on a scale of zero to 10, and also provide background information about themselves. The survey was conducted online and respondents were recruited by research firm Streetbees using social media, facilitated, in part, by student representatives at individual schools, and using a database of student email addresses collected by Ipsos Mori. Respondents were verified as students of their reported college through their email address. We used an aggregated group of the respondents from both years (2016 and 2017 surveys). In the 2017 survey, at least 50 responses were required for a university to be included.
The Engagement pillar of the ranking focuses on the data that we have gathered from the student survey.
To capture engagement with learning (7%), we look at the answers to four key questions: to what extent does the student’s college or university support critical thinking? (for example, developing new concepts or evaluating different points of view); to what extent does the teaching support reflection on, or making connections between, the things that the student has learned? (for example, combining ideas from different lessons to complete a task); to what extent does the teaching support applying the student’s learning to the real world? (for example, taking study excursions to see concepts in action); and, finally, to what extent do the classes taken in college challenge the student? (for example, presenting new ways of thinking to challenge assumptions or values).
To capture a student’s opportunity to interact with others (4%) to support learning, we use the responses to two questions: to what extent did the student have the opportunity to interact with faculty and teachers? (for example, talking about personal progress in feedback sessions); and to what extent does the college provide opportunities for collaborative learning (for example, group assignments).
The final measure in this area is around student recommendation (6%) and based on a question from the student survey: “If a friend or family member were considering going to university, based on your experience, how likely or unlikely are you to recommend your college or university to them?”
We also seek to help a student understand the opportunities on offer at the institution, and the likelihood of getting a rounded education, by providing an indicator on the number of different subjects taught (3%). While other components of the Engagement pillar are drawn from the student survey, the source of this metric is IPEDS.
At a time when US college debt stands at $1.3 trillion, and when the affordability of college and concerns about value for money in light of often very substantial tuition fees are at the top of many families’ minds, this section looks at perhaps the single most important aspect of any higher education institution: its record in delivering successful outcomes for students.
We look at the graduation rates for each institution (11%). This is a crucial way to help students understand whether colleges have a strong track record in supporting students enough to get them through their course.
This pillar also includes two essential value-added indicators – measuring the value added by the teaching at a college to both salary (12%) and to the ability to repay student debt (7%). Using a value-added approach means that the ranking does not simply reward the colleges that cream off all the very best students, and shepherd them into the jobs that provide the highest salaries in absolute terms. Instead, it looks at the success of the college in transforming people’s life chances, in “adding value” to their likelihood of success.
The THE data team uses statistical modelling to create an expected graduate salary and loan default rate (or rate of students not in default) for each college based on a wide range of factors, such as the make-up of its student body and the characteristics of the institution. The ranking looks at how far the college either exceeds expectations in getting students higher average salaries than one would predict based on the student body and its characteristics, or falls below what is expected. The value-added analysis uses research on this topic by the Brookings Institution, among others, as a guide.
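As a rough sketch of the value-added idea (not THE’s actual model, which uses many more institutional and student-body variables), one can fit a regression of the outcome on an intake characteristic and take each college’s residual. A positive residual means the college exceeds the outcome one would predict from its intake; a negative one means it falls short. The data below are entirely hypothetical.

```python
from statistics import mean

def fit_simple_ols(x, y):
    """Fit y = a + b*x by ordinary least squares. One predictor is used
    here for clarity; the real models use many independent variables."""
    mx, my = mean(x), mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

def value_added(x, y):
    """Residuals: actual outcome minus the outcome predicted from the
    intake characteristic x. Positive = exceeds expectations."""
    a, b = fit_simple_ols(x, y)
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

# Hypothetical data: x = average entrant test score (intake proxy),
# y = median graduate salary for four colleges
intake = [1000, 1100, 1200, 1300]
salary = [40000, 45000, 43000, 55000]
residuals = value_added(intake, salary)
```

In this toy example the third college, despite having stronger entrants than the first two, ends up with a negative residual: its graduates earn less than its intake would predict, which is exactly the kind of signal a raw salary league table would miss.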
This pillar also looks at the overall academic reputation of the college (10%), based on Times Higher Education’s annual Academic Reputation Survey, a poll of leading scholars that helps us determine which institutions have the best reputation for excellence in teaching. We used the total teaching votes from our 2016 and 2017 reputation surveys.
This category looks at the make-up of the student body at each campus, helping students understand whether they will find themselves in a diverse, supportive and inclusive environment while they are at college. We look at the proportion of international students on campus (2%), a key indicator that the university or college is able to attract talent from across the world and offers a multicultural campus where students from different backgrounds can, theoretically, learn from one another.
We also look more generally at student diversity – both racial and ethnic diversity (3%) and the inclusion of students with lower family earnings (2%). For the former, we use IPEDS data on diversity. For the latter, we combine the proportion of students who are first-generation students (the first person in their family to go to college), as reported in the College Scorecard, with the proportion of students who receive Pell Grants (paid to those in need of financial support), as reported in IPEDS. Finally, we use a measure of the racial and ethnic diversity of the faculty (3%), again drawing on IPEDS data.
Technical overview of metrics
- Finance per student – spending on teaching-associated activity per full-time equivalent student (IPEDS data). This is adjusted using regional price comparisons (BEA data)
- Faculty student ratio – the number of faculty per student as provided by IPEDS
- Papers per faculty – the number of academic papers published by faculty from a college in the period 2011-2015 (Elsevier data) divided by the size of the faculty (IPEDS)
The data from the student survey has been rebalanced by gender to reflect the actual gender ratio at the college.
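One standard way to achieve such rebalancing is post-stratification weighting; the sketch below assumes that approach (the shares and scores are hypothetical). Each response is weighted by the ratio of the group’s actual share of enrolment to its share of the survey sample.

```python
def gender_weights(sample_share_female, actual_share_female):
    """Weights that rebalance survey responses so the weighted gender mix
    matches the college's actual enrolment (simple post-stratification)."""
    w_female = actual_share_female / sample_share_female
    w_male = (1 - actual_share_female) / (1 - sample_share_female)
    return w_female, w_male

def weighted_mean(scores, genders, w_f, w_m):
    """Survey-question mean after applying the rebalancing weights."""
    weights = [w_f if g == "F" else w_m for g in genders]
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

# Hypothetical: the sample is 60% female but the college is 50% female,
# so female responses are down-weighted and male responses up-weighted
w_f, w_m = gender_weights(sample_share_female=0.6, actual_share_female=0.5)
rebalanced = weighted_mean([8, 8, 8, 6, 6], ["F", "F", "F", "M", "M"], w_f, w_m)
```

Here the unweighted mean of the five responses is 7.2, but once the over-represented group is down-weighted the rebalanced mean falls to 7.0.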
- Student engagement – the average score of four questions (on critical thinking, connections, applying learning to the real world, and challenges) in the THE US Student Survey
- Interaction – the average score of two questions (interaction with faculty, and collaborative learning) in the THE US Student Survey
- Student recommendation (THE US Student Survey)
- Subject breadth – number of courses offered (IPEDS)
- Graduation rate – the proportion of entering students who complete a bachelor’s degree or equivalent within six years of entry (IPEDS)
- Value-added salary – the average calculated residual of the value-added models for salary 10 years after entry. This is calculated using a range of independent variables and College Scorecard data representing the years 2011 and 2012. It also draws on data from IPEDS and the BEA.
- Value-added loan default rate – the average calculated residual of the value-added models for default three years after beginning repayment. The target variable is the proportion of students not in default, ie, 1 − the default rate. This is calculated using a range of independent variables and the “Num” and “Denom” variables from the Federal Student Aid data for the years 2011, 2012 and 2013. It also draws on data from IPEDS, the College Scorecard and the BEA.
- Reputation – the total votes received for teaching excellence from the THE Academic Reputation Survey, which is conducted in partnership with Elsevier. We use only votes provided by academics associated with US institutions.
- International students – the proportion of students identified as non-resident aliens (IPEDS)
- Student diversity – a Gini-Simpson calculation of the likelihood of two undergraduates being from different racial/ethnic groups (IPEDS)
- Faculty diversity – a Gini-Simpson calculation of the likelihood of two faculty members being from different racial/ethnic groups (IPEDS)
- Student inclusion – the post-normalisation average of the proportion of Pell Grant recipients (IPEDS) and proportion of first-generation students (College Scorecard)
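The Gini-Simpson calculation used for the two diversity metrics follows directly from its definition: one minus the sum of the squared group shares, which is the probability that two randomly chosen individuals belong to different groups. A minimal sketch with hypothetical enrolment counts:

```python
def gini_simpson(counts):
    """Probability that two randomly chosen individuals belong to
    different racial/ethnic groups: 1 - sum(p_i ** 2), where p_i is
    each group's share of the population."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

# Hypothetical enrolment counts for four racial/ethnic groups
index = gini_simpson([500, 300, 150, 50])  # shares 0.50, 0.30, 0.15, 0.05
```

A completely homogeneous campus scores 0; the score rises towards 1 as the student body splits more evenly across more groups.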
Why isn’t my college included?
There are two reasons why a college might not be included in the ranking.
First, the college may not meet the eligibility requirements (an abbreviated summary is below):
- Title IV eligible
- Awards four-year bachelor’s degrees
- Located in the 50 states, or DC
- Has more than 1,000 students
- Has 20% or fewer online-only students
- Is solvent
The second reason is missing data. Where possible, we impute missing values, but where that is not possible we have excluded colleges. In addition, some colleges did not meet our threshold of at least 50 valid respondents to the student survey in 2017.
We have also excluded private for-profit colleges.