Find your university in the Arab University Rankings 2026
The Times Higher Education World University Rankings are the only global performance tables that judge research-intensive universities across all their core missions: teaching, research, knowledge transfer and international outlook.
This year the methodology behind the Arab University Rankings has been brought in line with the flagship World University Rankings, using broadly the same comprehensive and trusted set of performance indicators (listed below). However, the weightings have been recalibrated to reflect the priorities of Arab institutions.
This year's release is named the Arab University Rankings 2026 because it uses data from the World University Rankings 2026 edition and is part of the same annual data cycle. Last year's Arab ranking was named the 2024 edition because it used the previous methodology; no years have been skipped since the inaugural Arab University Rankings in 2021.
The methodology behind the Arab University Rankings is based on the same comprehensive and trusted framework as the global table, but some important adjustments have been made to reflect the features of universities in the Arab region.
Discover how universities are ranked, the indicators that are used, and why the THE Arab University Rankings is a trusted benchmark for university performance in the Arab World.
Download the full methodology as a PDF at the bottom of this page.

Key criteria for the Arab University Rankings
Five core pillars of evaluation
- Teaching (the learning environment)
- Research environment (volume, income and reputation)
- Research quality (research strength, research excellence and research influence)
- International outlook (staff, students and research)
- Industry (income and patents)
How are the Arab University Rankings calculated?
Metrics included
Teaching (the learning environment): 29.5%
- Teaching reputation: 15%
- Doctorates-awarded-to-academic-staff ratio: 5.5%
- Academic staff-to-student ratio: 4.5%
- Doctorates-awarded-to-undergraduate-degrees-awarded ratio: 2%
- Institutional income per academic staff: 2.5%
The Academic Reputation Survey that underpins this pillar was carried out between November 2024 and January 2025. This exercise examined the perceived prestige of institutions in teaching and research. The teaching reputation metric is based on the number of teaching votes obtained from the survey. We ran the survey to ensure a balanced spread of responses across disciplines and countries. Where disciplines or countries were over- or under-represented, THE's data team weighted the responses to fully reflect the global distribution of scholars. The 2025 data are combined with the results of the 2024 survey, giving more than 108,000 responses globally. Universities that received no votes score zero for this metric.
As well as giving a sense of how committed an institution is to nurturing the next generation of academics, a high proportion of postgraduate research students suggests teaching at the highest level, the kind that attracts graduates and develops them effectively. Because the volume of doctoral awards varies by discipline, this metric is generated by dividing the total subject-weighted number of doctorates by the total subject-weighted number of academic staff. The metric is normalised after calculation.
The academic staff-to-student ratio is defined as the total full-time equivalent (FTE) number of staff employed in an academic post divided by the FTE number of students in all years and of all programmes that lead to a degree, certificate, university credit or other qualification. This variable and the doctorates-awarded-to-undergraduate-degrees-awarded ratio are normalised after calculation.
Institutional income indicates an institution's general status and gives a broad sense of the infrastructure and facilities available to students and staff. This metric is generated by dividing the institutional income, adjusted to purchasing power parity (PPP), by the total number of academic staff. This variable is normalised after calculation.
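As an illustration, the staff- and student-scaled teaching metrics above reduce to simple ratios before normalisation. The sketch below uses invented figures and metric names, and shows the PPP adjustment as a single conversion factor, which is a simplification of the real process:

```python
def teaching_ratios(doctorates: float, ug_degrees: float,
                    staff_fte: float, student_fte: float,
                    income: float, ppp_factor: float) -> dict[str, float]:
    """Raw inputs for the four ratio-based teaching metrics.

    Each value would subsequently be normalised against all ranked
    institutions; subject weighting is omitted here for brevity.
    """
    return {
        "doctorates_per_staff": doctorates / staff_fte,
        "staff_per_student": staff_fte / student_fte,
        "doctorates_per_ug_degree": doctorates / ug_degrees,
        # Institutional income is PPP-adjusted before scaling by staff.
        "income_per_staff": (income / ppp_factor) / staff_fte,
    }

# Hypothetical institution.
ratios = teaching_ratios(doctorates=120, ug_degrees=2400,
                         staff_fte=800.0, student_fte=12000.0,
                         income=400_000_000, ppp_factor=2.0)
print(ratios["doctorates_per_staff"])  # 0.15
```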
Research environment: 29%
- Research reputation: 18%
- Research productivity: 5.5%
- Research income per academic staff: 5.5%
The most prominent indicator in this category looks at a university鈥檚 reputation for research excellence among its peers, based on the responses to our Academic Reputation Survey (see above).
To measure productivity, we count the number of publications published in the academic journals indexed by Elsevier鈥檚 Scopus database per scholar, scaled for institutional size and weighted by subject. This gives a sense of the university鈥檚 ability to get papers published in quality peer-reviewed journals. This measure includes a method to give credit for cross-subject research that results in papers being published in subjects where a university declares no staff.
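A minimal sketch of how a subject-weighted papers-per-scholar figure might be computed. The subject names, weights and the simple weighted-average scheme are illustrative assumptions, not THE's actual procedure:

```python
def weighted_productivity(pubs: dict[str, int],
                          staff: dict[str, float],
                          weights: dict[str, float]) -> float:
    """Combine per-subject papers-per-FTE-scholar rates into one figure.

    The weights (summing to 1) stand in for a subject-weighting scheme
    that corrects for differing publication volumes across fields.
    """
    return sum(weights[s] * pubs[s] / staff[s] for s in pubs)

# Hypothetical institution: high-volume medicine, smaller arts output.
pubs = {"medicine": 900, "arts": 60}
staff = {"medicine": 300.0, "arts": 40.0}
weights = {"medicine": 0.6, "arts": 0.4}
print(round(weighted_productivity(pubs, staff, weights), 2))  # 2.4
```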
Research income is scaled against academic staff numbers and adjusted for PPP. This is a somewhat controversial indicator because it can be influenced by national policy and economic circumstances. But income is crucial to the development of world-class research, and because much of it is subject to competition and judged by peer review, our experts suggested that it was a valid measure. This indicator takes account of each university鈥檚 distinct subject profile, reflecting the fact that research grants in science subjects are often bigger than those awarded for the highest-quality social science, arts and humanities research.
Research quality: 30%
- Research strength: 15%
- Research excellence: 7.5%
- Research influence: 7.5%
Our research quality pillar looks at universities' role in spreading new knowledge and ideas.
We examine research quality by capturing the number of times a university's published work is cited by scholars globally. This year, our bibliometric data supplier Elsevier provided more than 174.9 million citations to 18.7 million journal articles, article reviews, conference proceedings, books and book chapters published over five years. The data include more than 28,700 active peer-reviewed journals indexed by Elsevier's Scopus database and all indexed publications between 2020 and 2024. Citations to these publications made in the six years from 2020 to 2025 are also collected.
We consider the field-weighted citation impact (FWCI) for each institution, per subject and overall. The research strength score used in the ranking is determined by calculating the 75th percentile FWCI score of all papers published by each institution. We believe that this gives a more stable measure over time. The data are also normalised to reflect variations in citation volume between different subject areas.
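A percentile-based score like this can be sketched with the standard library. The quantile interpolation method is an assumption (THE does not publish one), and the FWCI values are invented:

```python
import statistics

def research_strength(fwci_scores: list[float]) -> float:
    """75th-percentile FWCI across an institution's papers.

    Taking the upper quartile rather than the mean damps the influence
    of a handful of very highly cited papers, which is what makes the
    measure more stable over time.
    """
    # quantiles(n=4) returns the three quartile cut points; index 2
    # is the 75th percentile.
    return statistics.quantiles(fwci_scores, n=4)[2]

# Hypothetical FWCI values (1.0 = world-average citation impact).
papers = [0.2, 0.8, 1.0, 1.1, 1.3, 2.0, 9.5]
print(research_strength(papers))  # 2.0
```

Note how the outlier paper (FWCI 9.5) barely moves the score, whereas it would dominate a mean-based measure.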
Two new citation measures were introduced in 2023. Our research excellence metric recognises an institution's contribution to the best research in each subject and overall. Excellence is measured by capturing the total number of publications by an institution that are among the top 10 per cent of publications worldwide by FWCI. We adjust this number by year, subject and the total number of academic and research staff.
Finally, research influence examines the reach of an institution's publications by analysing the citations they receive. The importance of a publication is determined by the importance of the papers citing it. We adjust this number by year, subject and the total number of academic and research staff.
International outlook: 7.5%
- Proportion of international students: 2.5%
- Proportion of international staff: 2.5%
- International co-authorship: 2.5%
The ability of a university to attract undergraduates, postgraduates and faculty from all over the planet is key to its success on the world stage.
International students and staff are defined as those whose nationality differs from the country where the institution is based. The first two metrics are calculated as the total FTE number of international students or staff divided by the total FTE number of students or staff.
In the third international indicator, we calculate the proportion of a university's total research journal publications that have at least one international co-author and reward higher volumes. This metric accounts for an institution's subject mix and uses the same five-year window as the "Research quality" category.
Historically, large countries have been disadvantaged compared with small countries in our international metrics, in that it is "easier" for staff and students in small countries to work or study abroad. This led us to change our normalisation approach for these measures in 2023, henceforth taking the population of a country into consideration when evaluating them.
A study abroad metric, assessing the provision of international learning opportunities for domestic students, complements the International outlook pillar, but is currently given a weight of 0 per cent.
Industry: 4%
- Industry income per academic staff: 2%
- Patents: 2%
A university's ability to help industry with innovations, inventions and consultancy has become a core mission of the contemporary global academy. The industry income metric seeks to capture such knowledge transfer activity by looking at how much research income an institution earns from industry (adjusted for PPP), scaled against the number of academic staff it employs. This suggests the extent to which businesses are willing to pay for research and a university's ability to attract funding in the commercial marketplace, both useful indicators of institutional quality.
But the extent to which universities are supporting their national economies through technology transfer is an area that deserves greater recognition. The patents metric, introduced in the Arab University Rankings for the first time this year, is defined as the number of patents from any source that cite research conducted by the university. The data are provided by Elsevier and relate to patents published between 2020 and 2024 (not research published between these dates). Sources for patents include the World Intellectual Property Organisation, the European Patent Office and the patent offices of the US, the UK and Japan, as well as more than 100 patent offices around the world; in total, 43 are relevant for the time period. This measure is subject-weighted to avoid penalising universities producing research in fields with low patent activity, and it is scaled for institutional size.
Producing the overall ranking
Data collection
Institutions provide and sign off their institutional data for use in the rankings. On the rare occasions when a particular datapoint is not provided, we enter a conservative estimate for the affected metric. By doing this, we avoid penalising an institution too harshly with a "zero" value for data that it overlooks or does not provide, but we do not reward it for withholding them.
Getting to the final result
Moving from a series of specific datapoints to indicators and then to a total score for an institution requires us to match values that represent fundamentally different data. To do this, we use a standardisation approach for each indicator, and then combine the indicators in the proportions shown above.
The standardisation approach we use is based on the distribution of data within a particular indicator, where we calculate a cumulative probability function, and evaluate where a particular institution鈥檚 indicator sits within that function.
For most indicators, we calculate a normal cumulative probability function. The distribution of data in the metrics on teaching reputation, research reputation, research influence, research excellence and patents requires us to use an exponential scoring function.
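The normal-CDF case can be sketched as follows; the 0-100 scale and the use of a population standard deviation are assumptions for illustration, not THE's published implementation:

```python
import math

def standardise(values: list[float]) -> list[float]:
    """Score each institution by where its raw indicator value falls in
    a normal cumulative distribution fitted to the indicator's data.

    An institution at the mean scores 50; scores approach 0 or 100 in
    the tails. Skewed indicators (e.g. reputation, patents) would use
    an exponential scoring function instead.
    """
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    # Normal CDF evaluated via the error function, scaled to 0-100.
    return [50.0 * (1.0 + math.erf((v - mean) / (sd * math.sqrt(2.0))))
            for v in values]

scores = standardise([1.0, 2.0, 3.0, 4.0, 5.0])
print(round(scores[2], 1))  # 50.0, the mean value
```

The standardised indicator scores are then combined using the weightings listed above to produce an institution's overall score.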
Inclusion criteria for Arab University Rankings
Universities must supply data to be included in the ranking. They must also have published more than 500 research publications between 2020 and 2024.
The Arab University Rankings considers only institutions based in the following countries: Algeria, Bahrain, Comoros, Djibouti, Egypt, Iraq, Jordan, Kuwait, Lebanon, Libya, Mauritania, Morocco, Oman, Palestine, Qatar, Saudi Arabia, Somalia, Sudan, Syria, Tunisia, the United Arab Emirates and Yemen.
Understanding the results
Banded institutions
Precise ranks and overall scores are shown for the institutions ranked in the top 100. We then display banded ranks and overall scores for institutions in the rest of the table because the difference between their scores is not statistically significant.
Reporter institutions
A small number of institutions have "reporter" status and are listed at the bottom of the table. This means that they provided data but did not meet our eligibility criteria to receive a rank.
