Ranking the rankings

    The list of lists

    International rankings have got universities in a stranglehold, even though they are completely pointless.

    in short

    The RUG likes to profile itself as a top 100 university. Yet neither RUG president Poppema nor rankings expert Jules van Rooij takes the rankings very seriously. ‘No overall ranking is really very useful.’

    However, international students do place importance on these lists: a school’s position in the rankings plays an important role in their choice of university.

    This is why people pay attention to the rankings all the same, and why Van Rooij works hard to understand their methodology.

    The best, oldest, and most stable ranking is the Shanghai ranking. Poppema thinks the RUG might just make the top 50, due among other things to the expansion to Yantai.

    The other rankings – the Quacquarelli Symonds (QS) and Times Higher Education (THE) – are mainly concerned with reputation and therefore do not provide very much objective information.

    Furthermore, QS and THE are commercial businesses that earn money by ‘helping’ universities with their data analyses. It is also said that they often change their methodologies to bring about shifts in the rankings and thereby generate attention.

    full version

    Reading time: 10 min. (2,151 words)

    It is bizarre. Every single ranking that pops up in newspapers, policy meetings, or reports on the RUG website – the very rankings the RUG proudly uses to declare itself a ‘top 100’ university – is complete nonsense.

    It is hardly surprising that members of the University Council hold this opinion. That the RUG’s pre-eminent ranking expert, Jules van Rooij, shares it is rather more remarkable. But that even RUG president Sibrand Poppema, Mister Ranking Incarnate, endorses it as well is downright curious.

    A high score

    And yet, it is true. ‘The only reason rankings are important is because people think they are’, Poppema emphasises. Some are a bit better than others, he says, and it’s not as though they are completely out of touch with reality. But is Boston University – 73rd in the Shanghai ranking – really better than the RUG in 75th place? And is Japan’s Nagoya University really so much worse in 77th place? No, they are not.

    A high score simply indicates a good research university: academics who publish a lot of work and are cited often. But that’s about it. ‘Not a single overall ranking holds any meaning’, says Van Rooij. ‘It’s much more interesting to look at the subfields. Not even the Ivy League is number one in all fields.’

    International students

    But international students do look at those rankings. Ask anyone in an international student house, such as the one at the Winschoterdiep. Romanian student Elisabeth Efraim picked Groningen out of a list of possible universities because it had the highest score. Molly Qian, from China, says, ‘My choice was based more on the ranking than on the city.’

    When the RUG has a booth at an educational fair in Indonesia, the first question for the employees is always: ‘Are you in the top 200? Top 100?’

    If the answer is ‘no’, the students will immediately move on, and the university is left with nothing. But a university – and especially the RUG – needs those international students. Poppema: ‘This year marks the first time that the RUG has attracted fewer Dutch students, and that trend is only going to continue in the coming years. We are a regional university in a shrinking region. The number of young Dutch people is declining, especially in the northern provinces. We need international students for stability. It’s that simple.’

    Admission of weakness

    It is, Van Rooij confirms, a prisoner’s dilemma. The rankings do not deserve all the attention they are getting, but not participating is not an option. ‘And if you lose and claim the methodology is wrong, that’s seen as an admission of weakness. No matter how you look at it, those lists have a great influence on a university’s reputation. And they’re here to stay.’

    And that is why he spends a large part of his time trying to understand and analyse how those lists work. Take the Shanghai ranking, for instance. It is the oldest, most objective, and most stable of the three ‘greats’ and also the only one done by a university. ‘They collect their own data and they are therefore the only ranking you cannot influence by playing a clever game’, says Van Rooij.

    Zernike

    If there is one list that Poppema attaches value to, it’s the Shanghai ranking, precisely because it relies heavily on research citations, frequently cited authors, and publications in Nature and Science. And that is good for the RUG, because in those areas it is doing well. Having a Nobel prize or Fields medal (for mathematics) winner among your alumni or staff also counts heavily. Although it has been 100 years since Frits Zernike began working at the RUG and 60 years since his Nobel prize, that prize still counts towards the Shanghai ranking. ‘Although it decreases by 10 per cent every ten years’, Van Rooij calculates.
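
    To get a feel for what that decay means in practice, here is a minimal sketch – our own illustration, not the ranking’s published formula – that reads ‘10 per cent every ten years’ as ten percentage points per completed decade:

        # Illustration only: our reading of the decade-based decay Van Rooij
        # describes; the function and its rounding are assumptions, not the
        # Shanghai ranking's published method.

        def prize_weight(award_year: int, ranking_year: int) -> float:
            """Remaining weight (0.0 to 1.0) of a prize in a given ranking year."""
            decades_elapsed = (ranking_year - award_year) // 10
            return max(0.0, (10 - decades_elapsed) / 10)

        # Zernike's 1953 Nobel prize, as counted in a 2015 edition:
        print(prize_weight(1953, 2015))  # 0.4 -> the prize still carries 40%

    On this reading, Zernike’s prize still counts for about 40 per cent of its original weight, and would stop counting altogether once a full century has passed.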

    ‘The people here already work so hard’

    Poppema thinks it is realistic that the RUG will go up in this ranking, as the effect of a real increase in citations, good publications, and internationals. But the RUG is also confronted with its own limits, because it is not so simple to beat schools like MIT, Harvard, or Cambridge.

    Yantai

    However, growth is still possible, according to Poppema. He even believes the RUG could make it into the top 50 if it delivers either a Nobel prize winner – a feat he does not consider impossible – or substantially increases its publications. The latter can be achieved through the expansion of the RUG to Yantai, because in Groningen the RUG has reached its maximum potential. ‘The people here already work so hard. They really can’t work any harder.’

    He is quick to add that the RUG is not expanding to Yantai because of the rankings. The RUG wants to expand to bring about necessary growth. Any upward movement in the rankings is a welcome incidental benefit.

    Surveys

    But then there are those other two international rankings: the Times Higher Education (THE) and Quacquarelli Symonds (QS). Poppema wants nothing to do with those.

    Both THE and QS rely heavily on reputation, and the data are collected through surveys. ‘QS questions random people’, says Van Rooij. ‘[They approach] basically anybody who has a university e-mail address.’ In many cases, it is not even clear whether the person is an active academic. Van Rooij himself, for example, is always sent the survey. In previous years, THE used the database provided by publisher Thomson Reuters, which meant that the people surveyed were at least active peer reviewers or publishing academics. But that raises the question of whether a biologist in Alaska has anything useful to say about the level of education at a university in Japan.

    ‘And even when it does concern their own expertise, the question remains how reliable the information is’, Van Rooij says. ‘They ask you to name the 30 best universities in your field. You can come up with the first five from your network. But after that? There’s a good chance you’ll just consult the top 50 or fall back on universities you once heard of.’ This means that a top 100 university has a greater chance of staying in that top 100 based on brand awareness alone.

    Multimillion-dollar companies

    What may be even worse than the shaky methodology is the world behind these two rankers. Both QS and THE are multimillion-dollar companies that are mainly concerned with making money. QS can be sure that its ranking will be clicked on approximately 100 million times by prospective students, which means that universities will eagerly advertise on its website.

    ‘They have no problem admitting that they are not interested in stability’

    In addition, QS generates income by ‘helping’ universities with their data, for instance by analysing and processing the data or by sending out an advisor. ‘And that is a service you pay for’, Poppema says. How much? He does not know exactly, because the RUG has never retained this service, but it runs into the tens of thousands of euros.

    Changes in methodology

    Finally, QS and THE will do anything to draw attention to their rankings. This includes frequent changes to their methodology, which cause considerable shifts in the results. After all, that gets media attention, which in turn ensures more traffic on their websites. For instance, the RUG was in 134th place in the THE ranking in 2012, in 89th place a year later, and in 117th place in 2015. ‘They have no problem admitting that they are not interested in stability. They don’t want the universities to specifically adapt their policies to this’, says Poppema, who had a seat on the advisory board of THE for several years.

    Phil Baty, who works at THE, has a different explanation for the many changes: ‘THE Rankings are always striving to improve and we will make methodological improvements where we can. This can lead to some instability, but I believe that as long as we are very transparent about the changes we make, it is in everyone’s interests to get the clearest, most balanced picture possible.’

    Most reliable

    The fact remains that the Shanghai ranking has not needed any such changes in its twelve years of existence, and that students, too, see this ranking as the most reliable.

    Perhaps that is why Poppema is not losing any sleep over whether the RUG will go up or down in the Times list that comes out next week. And it is also why, even if the RUG does manage to get back into the top 100, there will be no glowing press release.

     

    The RUG in the rankings

    Below is further information about the four biggest rankings, followed by an overview of other, less influential rankings. The RUG’s position in each ranking is given in parentheses.

    Shanghai ranking  (75)

    For the Academic Ranking of World Universities – better known as the Shanghai ranking – it’s all about the science. A Nobel prize or Fields medal for a university’s alumni or staff is cause for celebration, and the Shanghai ranking dedicates ten per cent of its methodology to that, too – as long as the prize was won within the past hundred years, that is. It also measures Web of Science citations from the previous five years. The downside to the Shanghai ranking is that it does not measure teaching quality and pays little attention to the humanities.

    Of the three largest comprehensive rankings, the Shanghai ranking seems to be the most widely respected by experts and students alike. It’s also the oldest of the three, but that isn’t saying much: it was tabulated for the first time in 2003.

    The Shanghai ranking was launched with Chinese government backing and designed ‘to provide a global benchmark against which Chinese universities – enjoying billions in state and private investment – could assess their progress.’

    The Shanghai ranking gets part of its findings directly from Thomson Reuters, and part from yet another ranking: CWTS Leiden, a system that measures the impact of scientific publications at 500 universities worldwide. In the Leiden list, Groningen is in 120th position.

    Times Higher Education (117 in 2014)

    Nowadays, Times Higher Education (THE) arrives at its scores via a whole lot of tiny factors, ranging from research reputation to how internationally mixed a university is. Some aspects receive as little as 0.75 per cent of the total score (in case you were wondering, that’s public research income, a subcomponent of the research criteria). Starting this year, THE is switching from the Thomson Reuters database to Scopus. THE also surveys academics at other schools to rate ‘teaching and research quality’.
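
    As a rough sketch of how such a score comes together – the weights below are invented stand-ins, apart from the 0.75 per cent figure quoted above, and do not reproduce THE’s real scheme – a composite ranking score is essentially a weighted sum of normalised indicators:

        # Illustrative only: a composite score as a weighted sum of indicator
        # scores (each on a 0-100 scale). All weights except the 0.75%
        # subcomponent mentioned in the text are made up for this example.

        ILLUSTRATIVE_WEIGHTS = {
            "research_reputation":    0.18,
            "citations":              0.30,
            "teaching_survey":        0.15,
            "international_mix":      0.075,
            "public_research_income": 0.0075,  # the 0.75% subcomponent
        }

        def composite_score(indicators: dict) -> float:
            """Weighted sum over whichever indicators a university reports."""
            return sum(weight * indicators.get(name, 0.0)
                       for name, weight in ILLUSTRATIVE_WEIGHTS.items())

        print(composite_score({"citations": 90.0, "teaching_survey": 70.0}))
        # 37.5 -> a 0.75% factor can barely move the total either way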

    U.S. News ‘Best Global Universities’ (98 in 2014)

    In 2014, the American magazine U.S. News realised there’s a whole world out there beyond the States and published its first ‘Best Global Universities Rankings’. The publisher makes painstaking efforts to explain exactly how its methodology works, and like most of the big boys, it too relies on Thomson Reuters for citation data.

    Quacquarelli Symonds (100)

    Quacquarelli Symonds (QS) cares about international students more than most of the other rankings do, which is one way the former partner of THE sets itself apart. It also really loves surveys, deriving more than half of its score from them, and uses student-to-faculty ratio as a proxy for teaching quality. QS jumped ship in 2010, from Thomson Reuters to Scopus.

    Relying on surveys to assess reputation is far from an exact science, though. QS (and THE) seem to launch a slightly tweaked methodology each year, which can cause a school’s position to vary dramatically from one year to the next.
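
    To see how such a tweak can move a school, consider a toy example – the universities and numbers below are entirely made up – in which a modest change in how heavily the reputation survey counts is enough to swap two schools’ positions:

        # Illustrative only: two fictional universities. Shifting the survey's
        # weight from 50% to 60% reverses their order - the kind of
        # year-to-year jump described above.

        schools = {
            "University A": {"survey": 80.0, "citations": 60.0},
            "University B": {"survey": 60.0, "citations": 82.0},
        }

        def ranking(survey_weight: float) -> list:
            """Order schools by a two-factor weighted score, best first."""
            def score(name):
                s = schools[name]
                return survey_weight * s["survey"] + (1 - survey_weight) * s["citations"]
            return sorted(schools, key=score, reverse=True)

        print(ranking(0.5))  # ['University B', 'University A']
        print(ranking(0.6))  # ['University A', 'University B']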

    Other rankings:

    Centre for World University Rankings (110)

    National Taiwan University Ranking (81)

    Global Employability Survey (86)

    Essential Science Indicators (ESI) Citation Impact (54)

    Webometrics (107)

    UI GreenMetric World University Ranking (49)