How Top-Ranked Law Schools Got That Way, Pt. 3
Part one and part two of this series focused on the top law schools in U.S. News and World Report's 2010 rankings, offering graphs and analysis to explain why those schools did so well. This part rounds out the series by way of contrast. Here, we focus on the law schools that ranked 41-51 in the most recent USN&WR rankings, those that ranked 94-100, and the eight schools that filled out the bottom of the rankings.
The above chart shows the weighted and itemized z-scores of law schools about one-third of the way from the top of the 2010 USN&WR rankings. Note the sharp downward jog at Over$/Stu—a residual effect, perhaps, of the stupendously large Over$/Stu numbers we earlier saw among the very top schools. Note, too, that three schools here—GMU, BYU, and American U.—buck the prevailing trend by earning lower scores under PeerRep than under BarRep (GMU's line hides behind BYU's). As you work down from the top of the rankings, GMU offers the first instance of that sort of inversion; all of the more highly ranked schools have larger itemized z-scores for PeerRep than for BarRep. That raises an interesting question: why did lawyers and judges rank those schools so much more highly than fellow academics did?
The above chart shows the weighted, itemized z-scores of the law schools ranked 94-100 in the 2010 USN&WR rankings—about the middle of the 182 schools in the rankings. As we might have expected, the lines bounce around more wildly on the left, where they trace the impact of the more heavily weighted z-scores, than on the right, where z-scores matter relatively little, pro or con. Beyond that, however, no single pattern characterizes schools in this range.
The above chart shows the weighted and itemized z-scores of the law schools that probably did the worst in the 2010 USN&WR rankings. I say "probably" because USN&WR does not reveal the scores of schools in the bottom two tiers of its rankings; these eight schools did the worst in my model of the rankings. Given that uncertainty, as well as for reasons explained elsewhere, I decline to name these schools.
Here, as with the schools at the very top of the rankings, we see a relatively uniform set of lines. All of the lines trend upward, of course. These schools did badly in the rankings precisely because they earned strongly negative z-scores in the most heavily weighted categories, displayed to the left. Several of these schools did very badly on the Emp9 measure, and one had a materially poor BarPass score. Another did surprisingly well on Over$/Stu—which perhaps demonstrates that, while the very top schools boasted very high Over$/Stu scores, no amount of expenditures per student can salvage otherwise dismal z-scores.
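For readers curious about the mechanics behind these charts, the model they trace—an itemized z-score per category, each scaled by its category weight, summed into an overall score—can be sketched as follows. The category names echo the post's labels, but the weights and data here are placeholders for illustration, not USN&WR's actual figures.

```python
# Sketch of a weighted z-score ranking model, as described in the post.
# WEIGHTS are illustrative placeholders, not the real USN&WR weights.
from statistics import mean, pstdev

WEIGHTS = {
    "PeerRep": 0.25,   # academic reputation (assumed weight)
    "BarRep": 0.15,    # lawyer/judge reputation (assumed weight)
    "OverPerStu": 0.10,  # overall expenditures per student (assumed weight)
    "Emp9": 0.14,      # employment at nine months (assumed weight)
    "BarPass": 0.02,   # bar passage (assumed weight)
}

def weighted_z_scores(raw):
    """raw: {school: {category: value}} -> {school: {category: weighted z-score}}."""
    result = {school: {} for school in raw}
    for cat, weight in WEIGHTS.items():
        vals = [raw[s][cat] for s in raw]
        mu, sigma = mean(vals), pstdev(vals)
        for s in raw:
            # Standardize within the category, then scale by its weight.
            z = (raw[s][cat] - mu) / sigma if sigma else 0.0
            result[s][cat] = weight * z
    return result

def overall_scores(raw):
    """Sum each school's weighted, itemized z-scores into one overall score."""
    itemized = weighted_z_scores(raw)
    return {s: sum(itemized[s].values()) for s in itemized}
```

Because z-scores are centered on the field's mean, a school strong in a lightly weighted category (like the Over$/Stu outlier above) still cannot overcome deeply negative z-scores in the heavily weighted ones—the weights cap how much any single category can contribute.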
[Crossposted at Agoraphilia, MoneyLaw.]
Labels: law school rankings, U.S. News