Sunday, 26 February 2012

Collective Actions: Enhancing Access to Justice and Reconciling Multilayer Interests?

BIICL
Soon to be available, based on last year's conference at Kyushu University. For the Amazon page see here.

Tuesday, 14 February 2012

What’s wrong with the QS world ranking of law schools (part 2: the performance data)

The previous post (here) discussed, and dismissed, the perception element of the QS world ranking of law schools. This post turns to the performance data, which are based on citations per faculty (see here) using the Scopus database (see here). This leads to the following – rather curious – ranking of the world's top twenty law schools (see the citations column of the law ranking):
(1) Victoria University of Wellington (NZ), (2) University of Cologne (GER), (3) University of Münster (GER), (4) University of Calgary (CAN), (5) King's College London (UK), (6) University of Florence (ITA), (7) Free University Berlin (GER), (8) Humboldt University Berlin (GER), (9) University of Freiburg (GER), (10) University of Illinois (US), (11) University of Otago (NZ), (12) University of British Columbia (CAN), (13) Erasmus University Rotterdam (NL), (14) University of Washington (US), (15) University of Bristol (UK), (16) Queen's University Belfast (UK), (17) University of Adelaide (AUS), (18) Monash University (AUS), (19) Queen's University (CAN), (20) University of Toronto (CAN).
What should we make of this? I’m sure the Victoria University of Wellington has a fine law school but, really, the best one in the world? And is it really justified that five of the top ten law schools are German? Is Calgary really the best Canadian law school, King’s College London the best British one, Illinois the best US one, and Adelaide the best Australian one?
So, what has gone wrong with this citation index? Presumably it has to do with the Scopus database. First, I checked which journals are part of it: eg it includes the big US law reviews and the mainstream UK law journals such as the MLR and OJLS, but not more specialised journals such as the EBLR or JCLS, nor many non-English-language publications. But this does not really explain the strange ranking.
Thus, second, I found the actual list of journals classified as ‘law journals’ (here): the top ones, in terms of citations, are ‘Accident Analysis and Prevention’, ‘Journal of Forensic Sciences’ and ‘Expert Opinion on Therapeutic Patents’. I had never heard of these journals, and they are of course not journals that should be part of a law ranking; in the list the first proper law journal (UPenn L. Rev.) only appears at rank 15. The ranking is thus obviously meaningless, because it greatly exaggerates the citations of law schools where legal academics (or perhaps just one of them) publish in such non-law journals.
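To make the distortion concrete, here is a toy sketch (entirely invented numbers, and my own reconstruction of how a citations-per-faculty score behaves, not QS’s actual method): a single colleague publishing in a highly cited forensic-science journal that the database labels ‘law’ can swamp the citations of the whole faculty.

```python
# Toy sketch of how misclassified journals distort a citations-per-faculty
# score. All numbers are invented for illustration.

# Hypothetical publication records of one law school: (journal, true subject, citations)
papers = [
    ("Modern Law Review", "law", 12),
    ("Oxford Journal of Legal Studies", "law", 9),
    ("Journal of Forensic Sciences", "forensic science", 450),  # counted as 'law' by the database
]

faculty_count = 40

def citations_per_faculty(papers, faculty_count, law_only=False):
    """Total citations per faculty member, optionally restricted to genuine law journals."""
    total = sum(cites for _, subject, cites in papers
                if not law_only or subject == "law")
    return total / faculty_count

print(citations_per_faculty(papers, faculty_count))                 # 11.775 -- dominated by one outlier
print(citations_per_faculty(papers, faculty_count, law_only=True))  # 0.525  -- the school's actual law profile
```

One misclassified journal thus multiplies the school’s score more than twentyfold, which would be enough to catapult an otherwise ordinary faculty up the table.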
Perhaps to add: the ranking of universities (not only law schools) based on citations per faculty (see here) looks somewhat more plausible – and I would not say that such performance data cannot be used, if done properly, as discussed previously for the Leiden ranking (see post here).
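To indicate what ‘done properly’ could look like, here is a minimal sketch of a field-normalized indicator in the spirit of the Leiden ranking’s mean normalized citation score (my simplification with invented numbers, not the ranking’s actual code): each paper’s citations are divided by the world average for its own field and year, so a law paper is only measured against other law papers.

```python
# Minimal sketch of a field-normalized citation score (in the spirit of the
# Leiden ranking's mean normalized citation score). Numbers are invented.

# Hypothetical world-average citations per paper, by (field, year)
world_average = {
    ("law", 2010): 3.0,
    ("forensic science", 2010): 25.0,
}

# One school's papers: (field, year, citations)
school_papers = [
    ("law", 2010, 6),                # twice the law average
    ("law", 2010, 3),                # exactly the law average
    ("forensic science", 2010, 50),  # twice its OWN field's average, not 16x the law average
]

def mean_normalized_citation_score(papers, averages):
    """Average of each paper's citations relative to its own field/year baseline."""
    ratios = [cites / averages[(field, year)] for field, year, cites in papers]
    return sum(ratios) / len(ratios)

print(mean_normalized_citation_score(school_papers, world_average))  # ~1.67
```

Under such a normalization, a forensic-science outlier of the kind discussed above no longer inflates the score, because it is compared with its own highly cited field.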

Saturday, 11 February 2012

What’s wrong with the QS world ranking of law schools (part 1: the perception data)

The QS World University Rankings have been around for a couple of years, but 2011 was the first time that QS provided a separate ranking for law, available here.
The top 10 – (1) Harvard, (2) Oxford, (3) Cambridge, (4) Yale, (5) Stanford, (6) Berkeley, (7) LSE, (8) Columbia, (9) Melbourne, (10) NYU – may sound plausible, at least for the English-speaking world, but there are a number of problems with this global law school ranking. The ranking is based on both perception and performance data. In this post I deal with the first type of data; a critique of the second will follow in the next post. For the perception data, QS asked both academics and employers to nominate up to thirty universities.
  • A first problem is that the nominators were fairly unevenly distributed around the world. For example, with respect to the academics (see here) there were five times as many responses from the US as from India and ten times as many from the US as from China. And with respect to the employer data (see here), there were three times as many responses from China as from India, and as many from Germany as from Hong Kong. (A toy sketch of a possible correction for such an imbalance follows at the end of this post.)
  • Second, it is already very doubtful how much academics or employers in general know about all the universities of the world in order to make even a remotely reliable comparison. With respect to law, this problem is even more pronounced due to the national character of legal research and legal practice. For example, even in Europe most lawyers from one country, say England, would have no clue how good legal education is at a particular French or German, or even Slovenian or Latvian, university.
  • Third, asking academics and employers can be interesting because it shows which stereotypes people hold about the quality of certain universities: for instance, one can expect old and large universities with impressive-sounding names, associated with well-known cities, to benefit. But quality is something quite different. Thus, the first part of the QS study should really be called a “University Perception Index” (eg like the Corruption Perception Index, here). To measure quality, we need performance data, as will be discussed in the post next week.
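As flagged under the first problem above, here is a toy sketch of the kind of correction one might expect for such an uneven response distribution (hypothetical numbers and my own construction, not anything QS is known to do): weighting each region’s nominations by its response count, so that an over-represented country cannot dominate the perception score simply by answering more often.

```python
# Toy sketch: correcting an unevenly distributed survey by using nomination
# *rates* per region instead of raw counts. All numbers are invented.

# Survey responses per region (mirroring the imbalance described above)
responses = {"US": 500, "India": 100, "China": 50}

# Nominations received by one university, per region
nominations = {"US": 200, "India": 10, "China": 5}

# Raw score: the over-represented region dominates
raw_score = sum(nominations.values())

# Weighted score: each region contributes the share of its own respondents
# who nominated the university, so 200/500 in the US counts no more than
# an equally enthusiastic smaller pool elsewhere
weighted_score = sum(nominations[r] / responses[r] for r in responses)

print(raw_score)       # 215
print(weighted_score)  # 0.4 + 0.1 + 0.1 = 0.6
```

Even such a simple reweighting would change the relative standing of universities that draw most of their nominations from over-sampled countries.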