Reforming the USN&WR Law School Rankings
[Thanks, Jim, for inviting me to contribute to the MoneyLaw blog. I here offer something I just posted on Agoraphilia.]
Earlier this summer, I began a series of posts about the U.S. News & World Report's law school rankings. (Please see below for links to each post in the series.) My research uncovered many interesting and troubling things about the rankings. I discovered errors in the data that USN&WR used for the most recent rankings and, consequently, errors in the way that it ranked several law schools. More distressingly, I discovered that almost no safeguards exist to correct or prevent such errors. I think it fair to say that, but for my peculiar obsession with the USN&WR rankings, nobody would have noticed the errors I've documented. That won't do. We cannot rely on one nutty professor to keep the rankings honest. I thus here wrap up my series about the most recent USN&WR law school rankings by describing several reforms designed to make law school rankings more accurate and open. Although I suggest all of them, implementing any one of these reforms would make errors in the rankings less likely, and surviving errors more likely to get corrected.
1. USN&WR's Questionnaire Should Mirror the ABA's
Both the ABA and USN&WR send law schools questionnaires each fall. The latter apparently wants schools to repeat their answers to the former. Judging from how it asked schools to report their median LSAT and GPA data last fall, however, USN&WR could do a better job of clarifying exactly what data it wants. To avoid honest confusion or lawyerly logic-chopping, USN&WR's questionnaire should simply ask schools to repeat exactly the same answers that they put on the ABA's questionnaire.
2. USN&WR Should Commit to Publishing Corrections and Explanations
Law schools have a strong incentive to answer the ABA's fall questionnaire accurately, as that organization controls their accreditation. USN&WR, in contrast, wields no similar threat. Furthermore, law schools have a much more powerful incentive to dissemble on the USN&WR questionnaire, as their responses directly affect their rankings. What can USN&WR do to encourage law schools to give it accurate data?
USN&WR should commit now to publishing corrections to any inaccuracies it discovers in the data it uses to rank law schools. It should do so at all events, given that students use the rankings to make very important decisions. It can do so easily, too; it need only update its website. Yet USN&WR has thus far failed to correct the erroneous data it used to (mis)rank the University of Florida College of Law and Baylor University School of Law. For shame!
Perhaps USN&WR does not want to publicly acknowledge that its law school rankings sometimes contain errors, fearing that to do so would decrease the credibility of its rankings and, ultimately, its profits. Consumers will eventually discover the errors, though. Better that USN&WR should correct the rankings when necessary and thereby reassure its customers that it sells the best data available.
In addition to promising to correct errors in its rankings, USN&WR should also promise to document the cause of any errors it discovers. That double commitment would strongly discourage law schools from misreporting data on the USN&WR questionnaire. No school wants to earn a reputation for opportunistic lying. (Nor, of course, should any school suffer the wrongful imputation that it lied if, in fact, USN&WR causes errors in the rankings.)
3. USN&WR Should Publish All the Data it Uses in Ranking Law Schools
At present, USN&WR publishes only some of the data that it uses to rank law schools. Why? It is not at all clear. Even supposing that it would constitute an unwieldy amount of information in a print format, USN&WR could easily offer all the relevant data online. Specifically, USN&WR should publish the following additional categories of data for each law school it ranks:
- median LSAT;
- median GPA;
- overhead expenditures/student for the last two years, which includes
  - instruction and administration expenditures;
  - a cost-of-living index applied to the preceding sum;
  - library operations expenditures;
  - law school miscellaneous expenditures; and
  - full-time enrollments;
- financial aid expenditures/student for the last two years, which includes
  - direct expenditures on students;
  - indirect expenditures on students; and
  - (as well as) the same full-time enrollments figures used in calculating overhead expenditures/student, above; and
- library resources.
Publishing all that data would allow others to double-check it, thereby helping to keep law schools honest and the law school rankings accurate.
4. The ABA Should Publish the Data it Collects and that USN&WR Uses to Rank Law Schools
At present, a law school must pay the ABA $1430/year to receive "take-offs" summarizing the data that the ABA has required all member schools to report. The ABA marks that data as "confidential" and forbids its unauthorized publication. As I discussed earlier, the ABA apparently treats law school data that way not to protect law schools or the ABA from public scrutiny, but rather to increase ABA revenue.
Given that revenue model, the ABA has a strong disincentive to publicly disclose all the data it collects from member schools. Fortunately, however, it need not do so in order to improve the USN&WR rankings. Rather, the ABA need only publicly disclose those few categories of data that it collects and that USN&WR uses to rank law schools. Together with the Law School Admission Council, the ABA already publishes much of that data in the Official Guide to ABA-Approved Law Schools. It remains only for the ABA to publish data in the following categories:
- overhead expenditures/student, including
  - instruction and administration expenditures;
  - library operations expenditures; and
  - law school miscellaneous expenditures; and
- financial aid expenditures/student, including
  - direct expenditures on students; and
  - indirect expenditures on students.
It would greatly help, too, if the ABA would publish in a conveniently downloadable form that and the other data that USN&WR uses in its rankings. The Official Guide to ABA-Approved Law Schools currently comes only in paper or PDF formats, making it necessary to scan or re-key the data needed to double-check the USN&WR rankings. That grindingly tiresome process invites the introduction of errors, throwing a needless hurdle before those of us interested in improving the law school ranking process.
As I said, adopting any one of the reforms I suggest would improve how law schools get ranked. Adopting all four would prove better yet. Please note, though, that I do not promote these reforms for the sake of USN&WR. It seems quite capable of milking the rankings cash cow without my help. Rather, these reforms stand to benefit all of the rest of us—students, professors, and administrators—who live in the shadow of USN&WR's law school rankings.
By opening up public access to the data used to rank law schools, moreover, the reforms I've proposed make it more likely that alternatives to the USN&WR rankings will grow in popularity. Rankings require data, after all. In a better world, the ABA would make lots and lots of data about the law schools it accredits freely available in a convenient-to-use format. Those of us who doubt that USN&WR has discovered the one sole Truth about how to measure law schools might then easily offer the world our own, new and improved, rankings.
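To illustrate how little stands between freely downloadable data and an alternative ranking, here is a minimal sketch of the kind of z-score-and-weights computation my earlier modeling posts discussed. The school names, the two metrics, and the 50/50 weights are all invented for illustration; this is not USN&WR's actual methodology, which uses many more factors.

```python
import csv
import io
import statistics

# Hypothetical data; real rankings would draw on published ABA data.
RAW = """school,median_lsat,median_gpa
Alpha Law,165,3.6
Beta Law,160,3.8
Gamma Law,170,3.5
"""

def zscores(values):
    """Standardize a list of numbers to mean 0, sample stdev 1."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def rank_schools(raw_csv, weights):
    """Rank schools by a weighted sum of per-metric z-scores."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    metric_z = {m: zscores([float(r[m]) for r in rows]) for m in weights}
    scored = [
        (r["school"], sum(w * metric_z[m][i] for m, w in weights.items()))
        for i, r in enumerate(rows)
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Invented weights: half LSAT, half GPA.
ranking = rank_schools(RAW, {"median_lsat": 0.5, "median_gpa": 0.5})
for place, (school, score) in enumerate(ranking, 1):
    print(place, school, round(score, 3))
```

Anyone with a spreadsheet of the underlying data could run something like this with their own choice of factors and weights, which is precisely why open data matters more than any one ranker's formula.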
So ends my series of posts about the most recent USN&WR law school rankings. I thank my gracious host and co-blogger, Glen Whitman, for putting up with my often-dreary march through the necessarily statistical and administrative arcana. Readers—if any!—who share my interest in these matters may want to note that I plan to write an academic paper relating and expanding on the observations I've made here. Please feel free to drop me a line if you have any suggestions about how I might make such a paper useful to you.
Earlier posts about the 2007 USN&WR law school rankings:
- Change to U.S. News Law School Rankings Methodology
- "Financial Aid" Revised in U.S. News Methodology
- How USN&WR Counts Faculty for Rankings
- Whence Come the LSATs and GPAs Used in the Rankings?
- Gains and Losses Due to USN&WR's Use of Reported Median LSATs and GPAs
- How to Model USN&WR's Law School Rankings
- Why to Model USN&WR's Law School Rankings
- The ABA and USN&WR's Law School Rankings
- Accuracy of the Model of USN&WR's Law School Rankings
- Z-Scores in Model of USN&WR's Law School Rankings
- Further Tinkering with Model of USN&WR Law School Rankings
- Baylor's Score in the USN&WR Law School Rankings
- What USN&WR Asks About Law Schools' LSATs and GPAs
- USN&WR and Baylor on that School's Data
- The University of Florida's Score in the USN&WR Rankings
- Baylor Explains the Data it Reported for the USN&WR Rankings
- More Edits to Model of USN&WR's Law School Rankings
- Inside Higher Ed Reports on USN&WR's Ranking of Baylor
- Florida Explains the Data it Reported for the USN&WR Rankings
- Scores of All Law Schools in USN&WR Rankings
2 Comments:
I think the ABA already publishes a spreadsheet of a lot of that data in Excel format; I know I have the newest one in my docs folder. I don't know if they publish all of the exact same data as USNWR, but to me it looked like most of the pertinent statistics for someone applying.
Also, if USNWR messes up a metric, and a school gets ranked much more highly than it would otherwise, who cares? The elevated ranking, in and of itself, will lead to better employment prospects from that school. It's a chicken/egg or tail wagging the dog situation.
My hypothesis is that most ABA law education is the same. All the teachers went to Yale and Harvard, even at the worst schools. They are all smart, and in the most important year for grades in terms of employment, everybody takes the same classes and studies the same cases.
The only difference is the caliber of student. It's possible that one learns better in the company of better students. Other than that, I think it's the same everywhere.
Law school rank is a sorting mechanism. It sorts students into exceptional, great, good, OK, and bad. That's all it's there for. It serves a needed function in the legal community.
It also gives individual consumers a realistic picture of a product. Given that law schools lie outright and egregiously to people who have limited power to arrive at the truth, USNWR is needed. Prospective students should have a way to determine the expected payoff of a legal education from different schools. Let's not kid ourselves: life is drastically different in law, as in other fields, at the top and the bottom. USNWR is quite good at predicting average future earnings, and should be extremely important to prospective law students.
I also have a question for you, and I truly mean no offense or slight. Do professors at schools like yours, who have gone to Yale and Harvard themselves, ever feel bad about teaching? Is it ever a philosophical discussion that comes up in the lunch room, whether or not it is ethical to teach 100 students when only the top 5% will get a shot at top jobs, and maybe half won't ever be able to get a law job? Thank you very much.