We can wonder why the world has taken to university rankings. Perhaps there is a deep longing for hierarchy, even aristocracy, in the human soul.
The celebrity culture suggests this. But unlike the older kind of aristocrats, the status of modern celebrities is mostly temporary. ‘Public opinion’, led by the tabloids, loves to push new celebrities up and then pull them down.
After all, ours is a democratic age, in which fame is determined by merit rather than by birth, and comparative merit can always be contested. We live also in a market age, and there is a market in status, in which no one stays on top for long unless they have a great deal of money (in markets, money always ranks high).
Thus it is with university rankings. A leading position in the university league tables is often insecure, especially in those rankings that have been deliberately designed to be volatile, which at the global level means QS and Times Higher Education.
In an aristocracy of merit, only those universities with the strongest inherited reputations and the deepest pockets, the Oxfords, Harvards and Stanfords, are guaranteed a position near the top. Other universities have to work at it.
Why should we have to work at it? After all, rankings are a distraction from the core business of a university: teaching, scholarship, research and public service. Competition between institutions does not necessarily improve their work and might drain precious resources away from teaching and research. Higher education institutions should be encouraged to cooperate, not compete.
All true. The problem is that global rankings, which originated largely from outside the higher education sector ten years ago, have become entrenched as a competitive process and as the main source of public information (a highly simplified and often misleading one) about higher education.
We can no more make rankings vanish simply by wishing it, than we can snap our fingers and obliterate the £9000 student fee.
And there are costs if we ignore rankings. Institutions with a declining rank suffer over time, losing their attractiveness to students, staff, government and the public. As stewards of their institutions, university leaders are obliged to take steps to maximize the ranked position of their institutions over the course of their tenure, even while pursuing other, often contrary, objectives. It is especially important to achieve this in the global university rankings, which are more significant than UK national rankings, especially outside the UK.
How then does a university perform well in the global rankings? It depends which ranking.
- The Shanghai ARWU is based entirely upon research performance, measured by the number of Nobel prizes won by former students and held by current staff, the number of high citation researchers, articles in Nature and Science, and total citations to published research papers;
- The Leiden and Scimago rankings measure the number of published journal papers and the rate at which these papers are cited by other scholars;
- Webometrics measures the number of web pages and hits on those pages;
- The Times Higher Education ranking measures academic reputation for both teaching and research, research output, number of PhD students, research income, international collaboration in publishing, staffing levels (the lower the student-staff ratio the better), and the proportion of students and staff who are international (the higher the better);
- The QS ranking is based on opinion surveys of both academics and graduate employers, citations per staff member, the student-staff ratio, and international students and staff.
All institutional rankings have one feature in common. They favour institutions such as UCL: comprehensive multi-discipline universities with high-performing science-based faculties. This is especially true in research rankings, but also true of rankings that use opinion surveys, which are dominated by large universities.
Specialist single-discipline higher education institutions like the IOE cannot figure in the global rankings at all, except for those rankings that offer discipline-based league tables — the ARWU, Times Higher and QS. All three rank universities in terms of engineering, medicine, science and business. But only QS provides a league table in Education.
The IOE is invisible in the global league tables except on the QS website, where it currently sits at number one in Education. To stay there IOE must continue to excel in academic and employer surveys of reputation, and in research paper citations, and maintain high proportions of foreign-born students and staff. But Harvard, with its stellar reputation coupled to its sheer size (it publishes 64 per cent more journal papers than the world’s second largest science university, Toronto) can just go on being Harvard. Its ranking is not going to change.
Simon Marginson, Professor of International Higher Education at the IOE, is a member of the International Advisory Board of the Shanghai Academic Ranking of World Universities (ARWU), and of the Editorial Board of the Times Higher Education.
He will be giving a keynote speech on Markets in Higher Education at a conference at the IOE on 20 and 21 March 2014 – The State and the Market in Education: Partnership or Competition? organised by Llakes (the Centre for Learning and Life Chances in Knowledge Economies and Societies) and the Grundtvig Study Centre, Aarhus University, Denmark.