- the schools' self-image,
- ability to recruit faculty and students, or
- relationships with various stakeholders,
various other rankings have emerged as possible enhancements of, or substitutes for, the USNWR rankings. Among these possibilities are SSRN downloads (with or without each school's "top 3 authors" counted) and the latest Vault ranking of the 25 most underrated schools. (See MoneyLaw's post on the Vault rankings here.)

A couple of thoughts before this year's madness begins:
1. I'm glad that I'm someone who can sit back and watch this year, as last year's episode was the continuation of some serious unpleasantness when I was the dean. (Take a look at the "Preview of some Managing by Ambush material," located on the right-hand side of my blog, if you're interested in rehashing the experience.) I don't envy the deans, each of whom has to decide what, if anything, to do with this year's ranking. Except at a few schools, deans face a no-win dilemma. Be nice to your dean. He or she is already chewing more antacids than usual.
2. Schools are doing the same freakin' things with the Vault survey, with its whopping 512 respondents, that they do with USNWR's rankings. They're touting where they sit on the "most underrated" lists. (Think I'm kidding? See this use, for one.) I agree with Vault that all of the schools in the various underrated lists (including some of my old stomping grounds--Ohio State, Houston--and my future school, Boyd/UNLV) are much better than USNWR indicates in its yearly rankings. (You can read comments about the underrated schools by region by clicking on any of Vault's "read comments" entries and then scrolling up and down for comments on other schools.)
What Vault's list cannot tell us, however, is that some of the underrated schools are more underrated than others. Even if it were possible to distinguish among these schools on statistically significant factors, a meager 512 recruiter comments just doesn't cut it.
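To see why 512 responses can't support fine-grained distinctions among 25 schools, here's a rough back-of-the-envelope sketch. All of the numbers in it (responses spread evenly across schools, a 1-to-5 rating scale, a standard deviation of about one point) are assumptions for illustration only, not Vault's actual methodology:

```python
import math

# Assumed setup (not Vault's real methodology): 512 recruiter responses
# spread across 25 schools, each scoring a school on a 1-5 scale with a
# standard deviation of roughly 1 point.
responses_total = 512
schools = 25
per_school = responses_total / schools   # about 20 responses per school
sd = 1.0                                 # assumed spread of individual scores

# 95% confidence interval half-width for each school's average score
margin = 1.96 * sd / math.sqrt(per_school)
print(f"responses per school: {per_school:.0f}")
print(f"95% CI half-width:   +/- {margin:.2f} points")
```

With only about 20 responses per school, each average score carries an uncertainty of roughly four-tenths of a point in either direction, which is far larger than the tiny gaps that separate adjacent schools on a 1-to-25 list. The "rankings" within such a list are mostly noise.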
3. Therefore, before the annual "we're number N!" madness begins, let's just take a page from Susan Powter, of the "stop the insanity!" infomercials. You want to refer to USNWR's rankings? Fine. You want to refer to Vault's "underrated" list? Fine again. You want to count SSRN downloads, or "top 10 downloads," or "downloads without provocatively titled articles," or any other measure of quality or status? Fine, fine, fine. But please don't prove to the world that you forgot to take a statistics course in college. Use any of these studies for what they say, but please don't use any study's ranking of law schools from 1 to 194 to argue that the rankings separate each school from the next by real, immutable differences. They don't: they bunch certain schools together along certain characteristics. Take a look at my diagram on page 361 of Eating Our Cake and Having It, Too: Why Real Change is So Difficult in Law Schools, which illustrates the bunching effect.
Are there differences among schools? Of course there are, and some of those differences actually matter. But the real differences involve things that mean far more than mere numbers, and they can't be teased out by easily manipulable studies.