Tuesday, August 04, 2009

Reforms Suggested by Modeling the Law School Rankings

As I recently observed, the close fit between law schools' scores in U.S. News & World Report's rankings and the scores of those same schools in my model of the rankings "suggests that law schools did not try to game the rankings by telling USN&WR one thing and the ABA . . . another." Since both Robert Morse, Director of Data Research for USN&WR, and the ABA Journal saw fit to comment on that observation, perhaps I should clarify a few points.

First, I have no way of knowing whether or not law schools misstated the facts, by accident or otherwise, to both the ABA and USN&WR. The fit between USN&WR's scores and my model's scores indicates only that law schools reported, or misreported, the same facts to each party.

Second, this sort of consistency test speaks only to those measures USN&WR uses in its rankings, that it does not publish with its rankings, and that the ABA collects from law schools: median LSAT, median GPA, overhead expenditures/student, financial aid/student, and library size. Measures that USN&WR uses and publishes—reputation among peers and at the Bar, employment nine months after graduation, employment at graduation, student/faculty ratio, acceptance rate, and Bar exam performance—go straight into my model, so I do not have occasion to test their consistency against ABA data. In some cases (the reputation scores and the employment-at-graduation measure), the ABA does not collect the data at all. This proves especially troubling with regard to the latter. We have little assurance that USN&WR double-checks what schools report under the heading of "Employment at Graduation," and no easy way to double-check that data ourselves.
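For readers who want a concrete picture of what this consistency test involves, here is a minimal sketch in Python. Everything in it is illustrative: the weights, the standardize-and-sum scoring scheme, and the function names are my assumptions for exposition, not my actual model and not USN&WR's actual methodology. The point it shows is the logic of the test: rebuild each school's overall score using ABA filings for the unpublished inputs and USN&WR's own published figures for the rest, then measure how closely the rebuilt scores track the scores USN&WR published.

```python
# Sketch of the consistency test described above (illustrative only).
# Requires Python 3.10+ for statistics.correlation.

import statistics

# Hypothetical per-measure weights; USN&WR's real weights differ.
WEIGHTS = {
    "median_lsat": 0.125,        # from ABA filings (unpublished by USN&WR)
    "median_gpa": 0.10,          # from ABA filings
    "expend_per_student": 0.10,  # from ABA filings
    "aid_per_student": 0.015,    # from ABA filings
    "library_size": 0.0075,      # from ABA filings
    "peer_reputation": 0.25,     # published by USN&WR
    "employed_9_months": 0.14,   # published by USN&WR
}

def standardize(values):
    """Convert raw values to z-scores so measures on different scales mix."""
    mean, sd = statistics.mean(values), statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def model_scores(schools):
    """Weighted sum of standardized measures for each school.

    `schools` is a list of dicts, one per school, keyed by the
    measure names in WEIGHTS.
    """
    cols = {m: standardize([s[m] for s in schools]) for m in WEIGHTS}
    return [sum(WEIGHTS[m] * cols[m][i] for m in WEIGHTS)
            for i in range(len(schools))]

def fit(modeled, published):
    """Pearson correlation between modeled and published overall scores."""
    return statistics.correlation(modeled, published)
```

With real inputs, a correlation near 1.0 (the sort of close fit reported above) would indicate that the figures schools filed with the ABA and the figures they sent to USN&WR produce the same scores, which is all the test can show.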

Third, and consequently, USN&WR could improve the reliability of its rankings by implementing some simple reforms. I suggested three such reforms some time ago. USN&WR has largely implemented two of them by making its questionnaire more closely mirror the ABA's and by publishing corrections and explanations when it discovers errors in its rankings. (I claim no credit for that development, however; I assume that USN&WR acted of its own volition and in its own interest.)

Another of my suggested reforms remains unrealized, however, so allow me to repeat it here: USN&WR should publish all of the data that it uses in ranking law schools. It could easily make that data available on its website, if not in the print edition of its rankings. Doing so would both provide law students with useful information and allow others to help USN&WR double-check its figures.

To that, I now add this proposed reform: USN&WR should either convince the ABA to collect data on law school graduates' employment rates at graduation or discontinue using that data in its law school rankings. That data largely duplicates the more trustworthy (but still notoriously suspect) "Employment at Nine Months" data collected by the ABA and used by USN&WR in its rankings. And unlike the nine-month figures, "Employment at Graduation" numbers are not reported under the threat of ABA sanctions. We cannot trust the employment-at-graduation figures, and USN&WR does not need them.

Among the reforms I suggested some two years ago, I also included one directed at the ABA, calling on it to publish online, in an easily accessible format, all of the data that it collects from law schools and that USN&WR uses in its rankings. I fear that, in contrast to USN&WR, the ABA has moved backward on that front. I leave that cause for another day, however; here I want to focus on what my model can tell us about USN&WR's rankings.

[Crossposted at Agoraphilia, MoneyLaw.]
