Last updated 12 June 2024
Times Higher Education’s Impact Rankings capture universities’ impact on society based on institutions’ success in delivering the United Nations’ Sustainable Development Goals.
A set of performance metrics was developed in 2018 and published as a league table at the 色盒直播 Innovation and Impact Summit held at the Korea Advanced Institute of Science and Technology (KAIST) in South Korea in April 2019.
In 2020, we extended the scope of the rankings to measure universities across all 17 SDGs. In autumn 2021, we appointed an advisory board to suggest how we can improve the rankings and ensure that we act responsibly.
The sixth edition of the rankings was published in June 2024, including more than 2,100 institutions from 125 countries and regions. The methodology for the 2024 edition of the table is available here.
See the “How to participate” section below to take part in the Impact Rankings 2025.
If you would like to ask any questions or offer feedback, contact us at impact@timeshighereducation.com.
You can discover more about how we are evaluating impact in this rankings blog.
What are the Impact Rankings?
What are the 色盒直播 Impact Rankings?
The 色盒直播 Impact Rankings show how the global higher education sector is working towards the United Nations’ Sustainable Development Goals (SDGs).
Why are the 色盒直播 Impact Rankings important?
They provide a showcase for the work being delivered by universities in our communities, and they offer an opportunity to shine a light on institutional activities and efforts not covered in other rankings. They allow us to demonstrate the difference a university is making to the world we live in.
What does this ranking do that other rankings do not do?
The Times Higher Education World University Rankings are designed for research-intensive global universities and are dominated by indicators of research excellence. 色盒直播’s data team has also successfully pioneered new teaching-led rankings, focusing on teaching excellence and student success, in Japan and in the US (in partnership with The Wall Street Journal). The Impact Rankings, however, explore the impact that a university can make, specifically by looking at themes of poverty, inequality, climate change, environmental degradation, peace and justice.
What are the UN Sustainable Development Goals?
There are 17 Sustainable Development Goals, which were adopted by the UN in 2015 to provide a framework to achieve a better and more sustainable future for all. These include ending poverty and hunger; promoting good health and well-being and quality education; achieving gender equality and economic growth; providing access to clean water and sanitation and affordable and clean energy; fostering innovation; reducing inequalities; building sustainable cities and communities and achieving responsible consumption and production; tackling climate change; managing sustainably life below water and life on land; promoting peaceful societies; and revitalising global partnerships.
Can all institutions participate in this ranking?
This ranking is open to any higher education institution in the world. We want this ranking to be as inclusive as possible. This is different from the 色盒直播 World University Rankings, which include a minimum publication volume as part of the eligibility criteria. However, if an institution does not provide any data, it will not be ranked. If you would like to take part in the rankings, please email impact@timeshighereducation.com. There is no participation fee.
How will the ranking work?
The ranking is based on the 17 SDGs. Not every target in the SDGs relates directly to universities, but we believe that the higher education sector has a significant role to play in helping nations to deliver on the SDGs agenda. For each SDG, we have identified a limited set of metrics that can give an insight into progress. In the first year, we collected data on 11 of the 17 goals from participating universities. For 2020, we expanded this to all 17 SDGs. Universities may provide data on one or more of the SDGs. We produce an overall ranking of universities based on institutions’ data for SDG 17 (the only mandatory goal) plus their best three results on the remaining SDGs. This will allow universities to demonstrate their excellence in the areas that are most relevant to them, their community and their country. Rankings of the universities that are best achieving the individual SDGs will also be published.
My university is not active (or does not record data) across all SDGs – is it worth participating?
Not all universities will be able to report on all the metrics covered in the rankings. To be included in the overall ranking, we ask that you submit data on SDG 17, which is mandatory, and at least three other SDGs of your choice. A university that submits data on fewer than three other SDGs cannot be included in the overall ranking. However, it can still be ranked in the tables on individual goals. For example, if you have done great work on climate action, submitting in that category alone would enable you to be ranked for it. The ranking will reflect local activities as well as international activities.
What happens if we submit data for more than four SDG areas?
We will evaluate your performance in all areas and then choose the three goals (in addition to SDG 17) in which you excel; these will count towards the overall university score.
How many rankings will 色盒直播 produce?
色盒直播 will use the data provided to produce an overall ranking of universities, as well as individual rankings showing which universities are performing best against each of the 17 SDGs.
Are other stakeholders involved in this ranking?
An advisory board has been set up to help us develop and improve the rankings. During the initial development of the rankings we consulted widely with universities and individuals to ensure that the rankings are as fair and balanced as possible. Our bibliometric supplier for this ranking is Elsevier.
Methodology
What is the rankings methodology?
The 色盒直播 Impact Rankings are created using the UN Sustainable Development Goals as a reference. Each SDG has a small number of metrics associated with it. Data come from a variety of sources, including information submitted directly by universities and bibliometric data provided by Elsevier.
The overall score will be calculated by counting SDG 17 (revitalising global partnerships) as a mandatory data field and combining this with data on the best three SDGs per university. The metric weightings are in the file attached at the bottom of this page. Alternatively, please email impact@timeshighereducation.com for a copy of the document.
How did you come up with the methodology?
色盒直播 has been discussing aspects of university impact for several years. This has included a lengthy consultation with interested parties, culminating in an open session at the 色盒直播 Young Universities Summit in Florida in June 2018. Other crucial aspects informing our decision were feasibility and access to data.
How do you come up with the final scores?
Evidence-based answers are scored with credit given for the answer itself, for openness, for recency and (depending on the question) for factors such as cost. Quantitative data are evaluated using normal and exponential cumulative distribution functions (CDFs) to produce a score that can be compared with other universities. This score is then weighted to the correct percentage for that metric.
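To illustrate how CDF-based scaling works in general, here is a minimal Python sketch. It is not 色盒直播’s actual scoring code: the 0–100 scale and the choice of parameters (the metric’s mean and standard deviation for the normal CDF, and a rate derived from the mean for the exponential CDF) are assumptions made purely for demonstration.

```python
from statistics import NormalDist
import math

def normal_cdf_score(value, mean, std):
    """Map a raw metric value to a 0-100 score via the normal CDF of its z-score."""
    z = (value - mean) / std
    return 100 * NormalDist().cdf(z)

def exponential_cdf_score(value, mean):
    """Map a heavily skewed metric to a 0-100 score via the exponential CDF."""
    rate = 1 / mean  # assumption: rate parameter taken from the metric mean
    return 100 * (1 - math.exp(-rate * value))

# Hypothetical example: a university's publication count scored against the field
print(round(normal_cdf_score(450, mean=300, std=120), 1))   # ~89.4
print(round(exponential_cdf_score(450, mean=300), 1))       # ~77.7
```

Whichever distribution is used, the point is the same: each raw value is mapped onto a common comparable scale before the metric weighting is applied.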
How should a university interpret the results of the overall ranking if different universities are supplying data on different areas?
The overall ranking gives a sense of which universities are making the strongest progress towards the SDGs. Universities can be compared more easily in the individual SDG tables.
Why did we select SDG 17 as mandatory?
SDG 17 can be considered a meta-SDG: working together through partnerships and collaborations to achieve the 2030 agenda highlights the cooperation and publishing aspects of the goals. However, because we have selected it as the mandatory SDG, we have reduced its weight in the overall score. While each of the other three counted SDGs is valued at 26 per cent, SDG 17 makes up only 22 per cent of the overall score.
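As a worked example of how those weights combine (the per-SDG scores below are invented for illustration only):

```python
# Hypothetical SDG scores out of 100 for one university (invented numbers)
sdg17_score = 80.0
other_sdg_scores = [72.0, 91.0, 65.0, 88.0, 50.0]  # every other SDG submitted

# SDG 17 carries 22% of the overall score; the best three other SDGs carry 26% each
top_three = sorted(other_sdg_scores, reverse=True)[:3]  # 91, 88, 72
overall = 0.22 * sdg17_score + 0.26 * sum(top_three)
print(round(overall, 1))  # 82.9 out of 100
```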
Won’t this just favour big, established universities?
We have tried to define the metrics in a way that allows all universities to participate; this has included focusing on definitions that rely on simpler calculations than would otherwise be ideal. We have also tried to ensure that the choice of metrics is not overly biased towards wealth. We do not expect universities in different parts of the world to have the same areas of focus. By combining the SDGs and allowing flexibility, we open up the project to universities that have very different missions and ensure that institutions in Western Europe and North America do not have an unfair advantage.
Can we participate in SDG 5 (gender equality) if we are a women-only institution?
Yes. The substance of this SDG is about addressing women’s representation and access to higher education. So, if you are a women’s institution with no enrolled male students, this will not negatively affect your score for this SDG.
Are there any methodological changes for the 2024 edition compared with 2023?
No, there have been no changes to the methodology this year.
How to participate
What is the time frame for this ranking?
Data collection for the Impact Rankings 2025 will open on 16 September 2024 and close on 11 November 2024. The Impact Rankings 2024 were published on 12 June 2024.
What is the first step for an institution looking to participate in the 色盒直播 Impact Rankings?
The first step to participate in our rankings is to create a profile for your institution if one does not already exist. There are no fees or costs to participate. Participation (and evaluation) will depend on the provision of the necessary data. The institution needs to nominate a data provider and an approver (the head of institution), and contact details are required for both.
Once a data provider contact has been provided, they will receive an email with instructions on how to submit the relevant information. For further questions, contact impact@timeshighereducation.com.
How to submit data
What are your data sources?
We invite universities to submit data in a subset of SDGs. For each SDG, there will be some data that are collected from universities as well as bibliometric data provided by Elsevier.
Which year should data be based on for the 2025 ranking?
For this edition of the ranking we are clearly specifying the date range expected in the answers in our methodology; this is especially important given the impact of Covid-19 on university opening. Please note the dates identified by each question. For the 2025 ranking, we will request data from 2023. A university “year” may be a calendar year or may be seasonal, and some institutions’ academic years are different from their financial years. For policies, we ask an institution to submit the date the policy was created and the date it was last reviewed; at least one of these dates must be submitted to establish whether a policy is active. We expect policies to be reviewed regularly, meaning a policy should have been created or reviewed in the last five years.
Which types of evidence do you accept?
We accept links to documents or websites and publicly available timetables, brochures, magazines and articles. If provided documents are confidential, universities must explicitly indicate this in the caveats. We are not looking for a large volume of evidence; rather, we ask institutions to submit the best pieces of evidence. We allow up to three evidence items to be uploaded for each question, where relevant. We do not expect universities to submit all the evidence in English. If one piece of evidence is applicable to more than one question, you can resubmit the same piece of evidence. More credit will be given to publicly available evidence, but we do not rate different forms of evidence differently. For example, we do not consider public websites more or less important than a public PDF document or brochure. You cannot upload videos as evidence, but you can provide a URL to a page that includes a video. We look for evidence from the “university as a whole” rather than, for example, a course in a single department.
If we provide evidence this year that we already provided last year, will we still receive credit for that evidence?
Where evidence from last year is still valid, you can reuse it. We don’t necessarily expect policies to have changed.
How do we deal with measures that are already regulated by state or federal law?
Laws specify minimum standards and tell institutions what they cannot do. Policies should explain how particular laws are reflected in practice at the university, so in most cases we would expect a policy alongside the law. Please provide a URL to the relevant law on the government website. However, there are exceptions. For example, in Spain academic freedom is a constitutional requirement, and therefore we will accept that institutions in the country have a policy on supporting academic freedom. If you think there are other exceptions, please contact us at impact@timeshighereducation.com.
Must universities submit data for all SDGs in order to participate?
Only SDG 17 (global partnerships) is a mandatory data field. Otherwise, universities may submit data on as many SDGs as they would like or are able to.
We do not have all the data needed for a specific SDG – what will happen?
If certain data points within an SDG can’t be answered because data are not available, the institution will receive a score of zero for that specific data point. The institution can still be ranked in that SDG but will score at a lower level than institutions that are able to provide all the data. We would encourage you to provide data wherever you can, and to look to record data for future years, too.
Do you have a detailed description of the data fields?
We are providing a methodology document, which includes data submission guidance and explains key aspects of the process, including data field definitions. The document for 2022 can be accessed here. If you have any queries, please send your questions to impact@timeshighereducation.com.
What do you mean by 'university as a body'?
When we refer to 'university as a body' we mean you should provide evidence whenever your institution, rather than individuals or faculties, works towards the metric. The work done by individuals, for instance a lecturer or researcher working for the university, can be accepted as evidence if it is associated with an institutional action: for example, a local or national programme of environmental education that is delivered by the researcher but thoroughly supported or carried out by the university.
Can the keyword search terms be accessed?
All research metrics are measured against a keyword search of the Scopus dataset. The search terms are available here.
Have the keyword search queries been updated?
For the Impact Rankings 2021, Elsevier increased the number of keywords and included additional publications identified by artificial intelligence.
How do you define 'number of students'?
Number of students means the number of full-time equivalent students in all years and all programmes that lead to a degree, certificate, institutional credit or other qualification. We are looking for undergraduate and postgraduate students who are studying for higher education programmes such as bachelor’s, master’s, doctoral or other equivalent degrees or components of those programmes. Please do not include postdoctoral students. We use the International Standard Classification of Education (ISCED) as a guiding framework.
What is 'open data'?
Open data means that the data can be easily read and used by others – ideally under an open licence. Technically this can mean many things, but usually documents and images wouldn’t be counted; spreadsheets, CSV files and API access would. Open data does not mean the data are available as a table within a PDF.
How will the submission be evaluated?
How will you validate the data?
Universities will be asked to provide evidence or documentation to support their submissions. Information will be cross-checked against external sources at our discretion, and we reserve the right to investigate institutions where we believe inappropriate data collection or submission has taken place. We encourage universities to publish their evidence, and in many cases we expect the evidence to be sourced from existing public sources, for example, annual reports. Public documents do not have password protections or time limits. Our team of analysts will compare evidence that is provided to the definitions, and it will be marked accordingly.
We do not have a ‘policy’, but we do have a set of standards we apply. Will you accept this?
Ideally, the evidence should be a policy, but anything that shows that a set of standards or rules is implemented can be accepted.
What is your process for assessing the quality of qualitative evidence?
Where we are looking for evidence of action – for example, the existence of mentoring programmes – our metrics require universities to provide evidence to support their claims. In these cases we will give credit for the evidence, and for the evidence being public. We are not in a position to explore whether a policy is the best possible one, but we do aim to assess the evidence submitted consistently and fairly. Evidence is evaluated against a set of criteria and decisions are cross-validated where there is uncertainty. Evidence is not required to be exhaustive – we are looking for examples that demonstrate best practice at the institutions concerned.
How do you assess research publications for each SDG?
Together with Elsevier, we have identified a series of metrics and devised a keyword query for each SDG. For the Impact Rankings 2021, this was supplemented by additional publications identified by artificial intelligence. The research component for each SDG is made up of two or three metrics. These can include: the proportion of a university’s output that is viewed or downloaded, the proportion of a university’s output that is cited in SDG-specific guidance, the number of publications, the proportion of papers in the top 10 per cent of journals as defined by CiteScore, the proportion of a university’s output that is authored by women, the number of patents, the field-weighted citation impact of papers produced by the university, and the proportion of academic publications that are co-authored with someone from a university not in the home country.
Will participating institutions be able to benchmark their data against peers?
Yes. There will be an opportunity for benchmarking using the SDG Impact Dashboard, which provides detailed but easy-to-understand analysis of performance in the 色盒直播 Impact Rankings. Strategic planners in particular will benefit from the user-friendly benchmarking and competitor analysis tools, which can be customised for both global and domestic regions. Contact data@timeshighereducation.com if you would like to learn more about the SDG Impact Dashboard.