The French version of this posting is available here.
The French newspaper Libération organizes regular chatrooms where questions are addressed to various politicians. On September 15, the guest was Mrs Valérie Pécresse, the French Minister of Higher Education and Research. She answered several interesting questions, among them one about the Shanghai ranking published a short while ago. I prefer to laugh rather than cry in despair when I read this…
So, the question was:
“What is your opinion about the Shanghai ranking? What do you think of the ranks French universities obtain?”
And here comes the answer:
I personally went to the Jiao Tong University, which establishes the Shanghai ranking, and I realized that there were biases in the ranking which lead to underestimating the performance of French universities. In particular, the publications of CNRS (Centre National de la Recherche Scientifique, the French National Center for Scientific Research) researchers working in joint laboratories in universities are not taken into account. Neither are the recent mergers of universities (Bordeaux, Lorraine, Aix-Marseille). We will work with them, because these rankings are read worldwide. It is better to use these rankings than to suffer from them.
Wow, we immediately feel better, don't we? Fortunately, Mrs Pécresse is there and, like Superwoman, she will fix this cruel world. May we know why the Superminister didn't read the paper, or even the dedicated Wikipedia page describing the ranking methodology, before spending taxpayers' money to fly to Shanghai to discover that the ranking is built in a particular way? I mean, you just type "Shanghai ranking methodology" into your favorite search engine and the first results explain how it is done. Better still, on the page of the ranking itself, there is a link at the very top labeled "Methodology". Clicking it brings you to the page describing the methodology used to establish this classification. Magic.
Well, the way the universities to be ranked are selected is very clear: it is described in the very first section of that page:
ARWU considers every university that has any Nobel Laureates, Fields Medalists, Highly Cited Researchers, or papers published in Nature or Science. In addition, universities with significant amount of papers indexed by Science Citation Index-Expanded (SCIE) and Social Science Citation Index (SSCI) are also included. In total, more than 1000 universities are actually ranked and the best 500 are published on the web.
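Read as a rule, this amounts to a simple inclusion test. The sketch below is only my own paraphrase of the quoted paragraph in Python: the field names are invented, and the threshold for a "significant amount of papers" is arbitrary, since the quoted text gives no figure.

```python
# My own reading of the ARWU selection rule quoted above; the field names are
# invented and the paper threshold is arbitrary (the quoted text gives no figure).
SIGNIFICANT_PAPERS = 1000

def is_considered(university):
    """True if a university enters the pool of institutions that get ranked."""
    return (university["nobel_laureates"] > 0
            or university["fields_medalists"] > 0
            or university["highly_cited_researchers"] > 0
            or university["nature_science_papers"] > 0
            or university["indexed_scie_ssci_papers"] >= SIGNIFICANT_PAPERS)
```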
In case someone at the ministry makes the effort to look up this methodology before flying to Shanghai next time, here it is, as given on the ARWU methodology page:

- Quality of Education: alumni of an institution winning Nobel Prizes and Fields Medals (code Alumni, weight 10%)
- Quality of Faculty: staff of an institution winning Nobel Prizes and Fields Medals (code Award, weight 20%)
- Quality of Faculty: highly cited researchers in broad subject categories (code HiCi, weight 20%)
- Research Output: papers published in Nature and Science (code N&S, weight 20%)
- Research Output: papers indexed in Science Citation Index-Expanded and Social Science Citation Index (code PUB, weight 20%)
- Per Capita Performance: per capita academic performance of an institution (code PCP, weight 10%)

Below that list, the section "Definition of Indicators" explains what each code means. For instance, the definition of the Award indicator is as follows:
The total number of the staff of an institution winning Nobel Prizes in Physics, Chemistry, Medicine and Economics and Fields Medal in Mathematics. Staff is defined as those who work at an institution at the time of winning the prize. Different weights are set according to the periods of winning the prizes. The weight is 100% for winners after 2001, 90% for winners in 1991-2000, 80% for winners in 1981-1990, 70% for winners in 1971-1980, and so on, and finally 10% for winners in 1911-1920. If a winner is affiliated with more than one institution, each institution is assigned the reciprocal of the number of institutions. For Nobel prizes, if a prize is shared by more than one person, weights are set for winners according to their proportion of the prize.
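To make the arithmetic concrete, here is a minimal sketch of how such a score could be computed under the rules quoted above. It is only an illustration: the function names and the example data are mine, not ARWU's.

```python
# Minimal sketch of the Award computation described above.
# Data layout and names are illustrative, not ARWU's actual code.

def decade_weight(year_won):
    """Weight by decade of the prize: 100% after 2001, 90% for 1991-2000,
    80% for 1981-1990, ..., down to 10% for 1911-1920."""
    if year_won > 2000:
        return 1.0
    decades_back = (2000 - year_won) // 10 + 1  # 1 for 1991-2000, 2 for 1981-1990, ...
    return max(1.0 - 0.1 * decades_back, 0.0)

def award_score(winners):
    """Sum the weighted contributions of an institution's prize winners.

    Each winner is a dict with:
      year           -- year the prize was won
      prize_share    -- fraction of the prize (e.g. 0.5 if shared by two laureates)
      n_affiliations -- number of institutions the winner was affiliated with
    """
    total = 0.0
    for w in winners:
        total += (decade_weight(w["year"])
                  * w["prize_share"]
                  / w["n_affiliations"])
    return total

# Example: a 2009 laureate sharing the prize with two others and holding a single
# affiliation, plus a 1985 laureate affiliated with two institutions.
winners = [
    {"year": 2009, "prize_share": 1 / 3, "n_affiliations": 1},
    {"year": 1985, "prize_share": 1.0,   "n_affiliations": 2},
]
print(award_score(winners))  # (1.0 * 1/3) + (0.8 * 1.0 / 2) = 0.733...
```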
The guys kindly provide the link to the Nobel Prize webpage. You can therefore browse and read who the Nobel Prize winners are for the year of your choice (2009, for example). The websites for finding Fields medalists and citation data are also provided.
So, why fly to Shanghai? There are very few Nobel Prize winners working in French universities. Why didn't she mention another, much more meaningful ranking: the Times Higher Education ranking? If you have a look at its methodology page, you will see that the criteria are more realistic. They use 13 indicators grouped into 5 main categories (a sketch of how these weights combine follows the list):
- Teaching — the learning environment (worth 30 per cent of the final ranking score)
- Research — volume, income and reputation (worth 30 per cent)
- Citations — research influence (worth 32.5 per cent)
- Industry income — innovation (worth just 2.5 per cent)
- International mix — staff and students (worth 5 per cent)
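To see how such weights add up, here is a minimal sketch of a weighted overall score using the percentages listed above; the category scores in the example are invented, not taken from any real university.

```python
# Minimal sketch of combining category scores with the THE weights listed above.
# The example scores are invented; only the weights come from the list.

THE_WEIGHTS = {
    "teaching":          0.30,
    "research":          0.30,
    "citations":         0.325,
    "industry_income":   0.025,
    "international_mix": 0.05,
}

def overall_score(category_scores):
    """Weighted sum of per-category scores (each on a 0-100 scale)."""
    return sum(THE_WEIGHTS[cat] * score for cat, score in category_scores.items())

# Hypothetical university: strong citations, modest industry income.
example = {
    "teaching": 70.0,
    "research": 65.0,
    "citations": 90.0,
    "industry_income": 40.0,
    "international_mix": 80.0,
}
print(overall_score(example))  # 0.3*70 + 0.3*65 + 0.325*90 + 0.025*40 + 0.05*80 = 74.75
```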
As you can see, there is no mention of Nobel Prizes or the like. Which, to me, makes far more sense: a university's purpose is to pass knowledge on to students and to teach them how to do good research. The university is thus supposed to ensure that all students have equal access to knowledge and careers. Winning a Nobel Prize is cool, but far from the main goal of university policy. Therefore, the Times Higher Education criteria give a much more meaningful estimation of the university being ranked than the Shanghai ranking's criteria do.
So, Mrs Pécresse, what about the biases in your “study”?