View the full Wall Street Journal/Times Higher Education College Rankings results
The Wall Street Journal/Times Higher Education College Ranking is a pioneering ranking of US colleges and universities that puts student success and learning – based on 100,000 current student voices – at its heart.
The ranking includes clear performance indicators designed to answer the questions that matter most to students and their families when making one of the most important decisions of their lives – whom to trust with their education. Does the college have sufficient resources to teach me properly? Will I be engaged and challenged by my teachers and classmates? Does the college have a good academic reputation? What type of campus community is there? How likely am I to graduate, pay off my loans and get a good job?
The ranking includes the results of the 色盒直播 US Student Survey, which examines a range of key issues including students’ engagement with their studies, their interaction with their teachers and their satisfaction with their experience.
The rankings adopt a balanced scorecard approach, with 14 individual performance indicators combining to create an overall score that reflects the broad strength of the institution.
For all enquiries and questions about these rankings, please email:
usrankings@timeshighereducation.com
Data sources
Data comes from a variety of sources: from the US government (the Integrated Postsecondary Education Data System (IPEDS), the College Scorecard and the Bureau of Economic Analysis (BEA)), from the 色盒直播 US Student Survey and the 色盒直播 Academic Survey, and from the Elsevier bibliometric dataset.
Our data is, in most cases, normalised so that the value we assign in each metric can be compared sensibly with other metrics.
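To illustrate what normalisation can look like in practice, the short Python sketch below applies a simple z-score standardisation to a raw metric so that values sit on a common scale. This is only one possible approach, included for illustration; it is not a description of the ranking's exact procedure.

```python
import statistics

def z_score_normalise(values):
    """Standardise raw metric values to mean 0 and standard deviation 1.

    Illustrative normalisation only; the ranking's actual procedure
    is not specified in the text above.
    """
    mean = statistics.fmean(values)
    spread = statistics.pstdev(values)
    if spread == 0:
        return [0.0 for _ in values]
    return [(v - mean) / spread for v in values]

# Hypothetical example: teaching spend per student ($000s) at five colleges
spend_per_student = [12.4, 35.0, 18.7, 22.1, 55.3]
print(z_score_normalise(spend_per_student))
```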
Methodology
The overall methodology explores four key areas:
Resources
Does the college have the capacity to deliver teaching effectively? The Resources area represents 30 per cent of the overall ranking. Within this we look at:
- Finance per student (11%)
- Faculty per student (11%)
- Research papers per faculty (8%)
Engagement
Does the college effectively engage with its students? Most of the data in this area is gathered through the 色盒直播 US Student Survey. The Engagement area represents 20 per cent of the overall ranking. Within this we look at:
- Student engagement (7%)
- Student recommendation (6%)
- Interaction with teachers and students (4%)
- Number of accredited programmes (3%)
Outcomes
Does the college generate good and appropriate outputs? Does it add value to the students who attend? The Outcomes area represents 40 per cent of the overall ranking. Within this we look at:
- Graduation rate (13%)
- Value added to graduate salary (15%)
- Academic reputation (12%)
Environment
Is the college providing a learning environment for all students? Does it make efforts to attract a diverse student body and faculty? The Environment area represents 10 per cent of the overall ranking. Within this we look at:
- Proportion of international students (2%)
- Student diversity (3%)
- Student inclusion (2%)
- Staff diversity (3%)
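To make the weighting concrete, the sketch below combines 14 normalised indicator scores into an overall score using the percentages listed above. A plain weighted sum over scores on a 0-100 scale is assumed here purely for illustration; the ranking's precise aggregation may differ.

```python
# Indicator weights taken from the methodology above (as fractions of the overall score).
WEIGHTS = {
    "finance_per_student": 0.11,
    "faculty_per_student": 0.11,
    "papers_per_faculty": 0.08,
    "student_engagement": 0.07,
    "student_recommendation": 0.06,
    "interaction": 0.04,
    "accredited_programmes": 0.03,
    "graduation_rate": 0.13,
    "value_added_salary": 0.15,
    "academic_reputation": 0.12,
    "international_students": 0.02,
    "student_diversity": 0.03,
    "student_inclusion": 0.02,
    "staff_diversity": 0.03,
}

def overall_score(indicator_scores):
    """Combine normalised indicator scores (assumed 0-100) into one weighted score.

    Assumes a simple weighted sum; the ranking's own aggregation may differ.
    """
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-6  # weights cover 100% of the score
    return sum(WEIGHTS[name] * indicator_scores[name] for name in WEIGHTS)

# Hypothetical college with every indicator normalised to 75 out of 100
example = {name: 75.0 for name in WEIGHTS}
print(overall_score(example))  # 75.0
```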
Metrics used
Resources (30%)
Students and their families need to know that their college has the right resources to provide the facilities, tuition and support that are needed to succeed at college.
By looking at the amount of money each institution spends on teaching per student (11%), we can get a clear sense of whether it is well-funded and has the money to provide a positive learning environment. This metric takes into account spending on both undergraduate and graduate programs, which is consistent with the way the relevant spend data is available in the Integrated Postsecondary Education Data System, known as IPEDS. Schools are required by the Department of Education to report key statistics such as this to IPEDS, making it a comprehensive source for education data. The data on academic spending per institution are adjusted for regional price differences, using regional price parities data from the U.S. Department of Commerce's Bureau of Economic Analysis.
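As a simple illustration of the regional price adjustment, the sketch below divides a nominal spend-per-student figure by a BEA regional price parity (where 100 represents the national average). The exact adjustment formula used in the ranking is not spelled out above, so treat this as an assumption.

```python
def adjust_for_regional_prices(spend_per_student, regional_price_parity):
    """Deflate nominal spending by a BEA regional price parity (national average = 100).

    Illustrative only: dividing by the local price level means colleges in
    expensive regions are not credited simply for facing higher costs.
    """
    return spend_per_student / (regional_price_parity / 100.0)

# Hypothetical examples: the same nominal spend in regions with different price levels
print(adjust_for_regional_prices(20_000, 115.0))  # high-cost region -> about 17,391
print(adjust_for_regional_prices(20_000, 90.0))   # low-cost region  -> about 22,222
```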
By looking at the ratio of students to faculty members (11%), we get an overall sense as to whether the college simply has enough teachers to teach. It gives a broad sense of how likely it is that a student will receive the individual attention that can be required to succeed at college, and an indication of potential class sizes. The source of this statistic is IPEDS.
Having faculty who are experts in their academic fields, pushing the boundaries of knowledge at the forefront of their discipline – when they can distil this to their students, and demonstrate the power of real-world problem solving and enquiry – can significantly enhance a student’s educational experience. So our teaching resources pillar also gives us a sense as to whether faculty are expert in their academic disciplines, by looking at research excellence. We look at the number of published scholarly research papers per faculty (8%) at each institution, giving a sense of their research productivity, and testing to see whether faculty are able to produce research that is suitable for publication in the world’s top academic journals, as indexed by Elsevier.
Engagement (20%)
Decades of research have found that the best way to truly understand teaching quality at an institution – how well it manages to inform, inspire and challenge its students – is through capturing what is known as “student engagement”. This was described by Malcolm Gladwell (New Yorker, 2011) as “the extent to which students immerse themselves in the intellectual and social life of their college—and a major component of engagement is the quality of a student’s contacts with faculty.”
Times Higher Education has captured student engagement across the US through its US Student Survey, carried out in partnership with two leading market research providers. For 2016-17 we gathered the views of more than 100,000 current college and university students on a range of issues relating directly to their experience at college. Students answer twelve core questions about their experience – either multiple choice or on a scale of zero to ten – and also provide background information about themselves. The survey was conducted online; respondents were recruited by the research firm Streetbees via social media, facilitated in part by student representatives at individual schools, and via a database of student email addresses collected by Ipsos MORI. Respondents were verified as students of their reported college using their email address. At least 50 responses were required for each university included. The maximum margin of error is 10 per cent.
The engagement pillar of the ranking focuses on the data we have gathered from the student survey.
To capture engagement with learning (7%), we look at the answers to four key questions: to what extent does the student’s college or university support critical thinking (for example developing new concepts or evaluating different points of view); to what extent does the teaching support reflection upon, or making connections among, the things the student has learned (for example combining ideas from different lessons to complete a task); to what extent does the teaching support applying the student’s learning to the real world (for example taking study excursions to see concepts in action); and finally, to what extent did the classes taken in college challenge the student (for example presenting new ways of thinking to challenge assumptions or values).
To capture a student’s opportunity to interact with others (4%) to support learning, we use the responses to two questions: to what extent did the student have the opportunity to interact with faculty and teachers (for example talking about personal progress in feedback sessions); and to what extent does the college provide opportunities for collaborative learning (for example group assignments).
The final measure in this area is around student recommendation (6%): “If a friend or family member were considering going to university, based on your experience, how likely or unlikely are you to recommend your college or university to them?”
In this pillar of indicators we also seek to help a student understand the opportunities on offer at the institution, and the likelihood of getting a more rounded education, by providing an indicator on the number of different subjects taught (3%). While other components of the Engagement pillar are drawn from the student survey, the source of this metric is IPEDS.
Outcomes (40%)
At a time when US college debt stands at $1.3 trillion, and when the affordability of going to college, and the value for money delivered for often very substantial tuition fees, are at the top of many families’ concerns, this section looks at perhaps the single most important aspect of any higher education institution – its record in delivering successful outcomes for its students.
We look at the graduation rate for each institution (13%) – a crucial way to help students understand whether a college has a strong track record of supporting students through their courses, keeping drop-out rates low and ensuring they complete their degrees.
This pillar also includes an essential value-added indicator – measuring the value added by the teaching at a college to salary (15%). Using a value-added approach means that the ranking does not simply reward the colleges that cream off all the very best students and shepherd them into the jobs that provide the highest salaries in absolute terms. Instead it looks at the success of the college in transforming people’s life chances, in “adding value” to their likelihood of success. The 色盒直播 data team uses statistical modelling to create an expected graduate salary for each college based on a wide range of factors, such as the make-up of its students and the characteristics of the institution. The ranking looks at how far the college either exceeds expectations in getting students higher average salaries than one would predict based on its students and its characteristics, or falls below what is expected. The value-added analysis uses research on this topic by the Brookings Institution, among others, as a guide. We use median earnings data for students 10 years after they entered college.
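To illustrate the value-added idea, the sketch below fits a simple ordinary least squares model that predicts median graduate salary from a handful of institutional characteristics, and treats the residual (actual minus expected) as the value added. The variables, figures and model form are invented for illustration; the actual 色盒直播 model draws on a much wider range of factors.

```python
import numpy as np

# Hypothetical data for six colleges:
# columns = [share of Pell recipients, average SAT score (hundreds), spend per student ($000s)]
X = np.array([
    [0.45, 10.2, 14.0],
    [0.20, 13.5, 38.0],
    [0.35, 11.0, 20.0],
    [0.15, 14.2, 52.0],
    [0.50,  9.8, 12.0],
    [0.30, 12.1, 25.0],
])
actual_salary = np.array([41_000, 62_000, 47_000, 71_000, 39_000, 52_000])

# Fit an ordinary least squares model with an intercept
design = np.hstack([np.ones((X.shape[0], 1)), X])
coefs, *_ = np.linalg.lstsq(design, actual_salary, rcond=None)

# The residual is the "value added": how far actual salaries exceed (or fall
# below) what the college's student intake and characteristics would predict.
expected_salary = design @ coefs
value_added = actual_salary - expected_salary

for i, va in enumerate(value_added):
    print(f"College {i}: value added = {va:+.0f} dollars")
```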
This pillar also looks at the overall academic reputation of the college (12%), based on Times Higher Education’s annual Academic Reputation Survey, a survey of leading scholars which helps us determine which institutions have the best reputation for excellence in teaching.
Environment (10%)
This category looks at the make-up of the student body at each campus, helping students understand whether they will find themselves in a diverse, supportive and inclusive environment while they are at college. We look at the proportion of international students on campus (2%), a key indicator that the university or college is able to attract talent from across the world and offers a multi-cultural campus where students from different backgrounds can, theoretically, learn from one another.
We also look more generally at student diversity – both racial and ethnic diversity (3%), and the inclusion of students with lower family earnings (2%). For the former, we use IPEDS data on diversity. For the latter, we look at the proportion of first-generation students – the first in their family to go to college – as reported in the College Scorecard, and the proportion that receives Pell Grants (paid to students in need of financial support), as reported in IPEDS.
We also use a measure of the racial and ethnic diversity of the faculty (3%), again, drawing upon IPEDS data.
Technical overview of metrics
Resources
- Finance per student – spending on teaching-associated activity per full-time equivalent student (IPEDS). This is adjusted using regional price parities (BEA)
- Faculty student ratio – the number of faculty per student as provided by IPEDS
- Papers per faculty – the number of academic papers published by faculty from a college in the period 2011-2015 (Elsevier) divided by the size of the faculty (IPEDS)
Engagement
The data from the student survey has been rebalanced by gender to reflect the actual gender ratio at the college (an illustrative reweighting sketch follows the list below).
- Student engagement – the average score of the four questions (critical thinking, connections, applying learning to the real world, challenge) in the 色盒直播 US Student Survey
- Interaction – the average score of two questions (interaction with faculty, and collaborative learning) in the 色盒直播 US Student Survey
- Student recommendation (色盒直播 US Student Survey)
- Subject breadth – number of courses offered (IPEDS)
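As noted above, an illustrative way to rebalance survey responses by gender is post-stratification weighting, sketched below. The weighting scheme and the example data are assumptions for illustration, not the ranking's exact procedure.

```python
from collections import Counter

def gender_reweighted_mean(responses, campus_gender_shares):
    """Rebalance survey answers so each gender's weight matches its campus share.

    `responses` is a list of (gender, score) pairs from survey respondents;
    `campus_gender_shares` maps gender -> proportion of the student body.
    Illustrative post-stratification only; the ranking's procedure may differ.
    """
    respondent_counts = Counter(gender for gender, _ in responses)
    weighted_sum = 0.0
    for gender, score in responses:
        # Weight each response by (campus share) / (respondent share) for its gender
        respondent_share = respondent_counts[gender] / len(responses)
        weighted_sum += (campus_gender_shares[gender] / respondent_share) * score
    return weighted_sum / len(responses)

# Hypothetical survey where women are over-represented among respondents (3 of 4)
survey = [("female", 8), ("female", 7), ("female", 9), ("male", 5)]
campus = {"female": 0.5, "male": 0.5}
print(gender_reweighted_mean(survey, campus))  # 6.5, versus an unweighted mean of 7.25
```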
Outcomes
- Graduation rate – the proportion of Bachelor’s or equivalent graduates six years after entry (IPEDS)
- Value added salary – the average calculated residual of the value-added models for salary 10 years after entry. This is calculated using a range of independent variables, with College Scorecard data representing the years 2011 and 2012. It also draws on data from IPEDS and BEA.
- Reputation – the total votes received for teaching excellence from the 色盒直播 Academic Reputation Survey, which is conducted in partnership with Elsevier. We use only votes provided by academics associated with US institutions.
Environment
- International students – the proportion of students identified as non-resident aliens (IPEDS)
- Student diversity – a Gini-Simpson calculation of the likelihood of two undergraduate students being from different racial/ethnic groups (IPEDS); see the sketch after this list
- Faculty diversity – a Gini-Simpson calculation of the likelihood of two faculty members being from different racial/ethnic groups (IPEDS)
- Student inclusion – the post-normalisation average of the proportion of Pell Grant recipients (IPEDS) and the proportion of first-generation students (College Scorecard)
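The Gini-Simpson measure used for the two diversity indicators can be read as the probability that two randomly chosen people belong to different racial/ethnic groups, i.e. 1 minus the sum of squared group shares. The group counts in the sketch below are hypothetical.

```python
def gini_simpson(group_counts):
    """Probability that two randomly chosen individuals come from different groups.

    Computed as 1 - sum(p_i ** 2), where p_i is the share of group i.
    """
    total = sum(group_counts.values())
    return 1.0 - sum((count / total) ** 2 for count in group_counts.values())

# Hypothetical undergraduate headcounts by racial/ethnic group (IPEDS-style categories)
undergraduates = {"White": 5200, "Hispanic": 2100, "Black": 1300, "Asian": 900, "Two or more": 500}
print(round(gini_simpson(undergraduates), 3))  # 0.658
```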
Why isn’t my college included?
There are two reasons why a college might not be included in the ranking.
First, it must meet the eligibility requirements (this is an abbreviated summary). The college must:
- Be Title IV eligible
- Award four-year bachelor’s degrees
- Be located in the 50 states or DC
- Have more than 1,000 students
- Have 20% or fewer online-only students
- Not be insolvent
The second reason is missing data elements. Where possible we impute missing values, but where that is not possible we have excluded the college. In addition, some colleges did not meet our threshold for a valid number of respondents (at least 50) to the student survey. For 2016-17, we have also excluded private for-profit colleges.
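Purely as an illustration of imputation, the sketch below fills a missing metric value with the median of the observed values. Median imputation is an assumed choice for this example; the imputation method actually used in the ranking is not described above.

```python
import statistics

def impute_missing(values):
    """Fill gaps in a metric with the median of the observed values.

    An illustrative imputation only; the ranking's actual approach
    is not specified in the text above.
    """
    observed = [v for v in values if v is not None]
    median = statistics.median(observed)
    return [v if v is not None else median for v in values]

# Hypothetical metric with two colleges missing a value
print(impute_missing([14.0, None, 22.5, 31.0, None]))  # [14.0, 22.5, 22.5, 31.0, 22.5]
```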
Editor’s note: 16 June 2017
The U.S. Department of Education announced in January 2017 that it had identified an error in its calculations of the debt repayment variable used in the College Scorecard. The error inflated repayment rates for almost all U.S. institutions, and this had an impact on the Wall Street Journal/Times Higher Education College Rankings 2017, which drew on the College Scorecard debt repayment variable as one of 15 performance indicators.
Our analysis suggests there are challenges with using this variable to create value-added models in the rankings at this time. Our primary focus is always on data integrity and making sure that our rankings provide a balanced picture, so we have decided to drop this variable from the rankings calculations for the 2016-17 year. The rankings have now been republished in line with our corrections policy, which is available here, and the methodology information above has been changed to reflect the removal of this indicator and the re-weighting of the remaining 14 indicators.