What a law school calls such graduates can have a significant impact on its U.S. News & World Report ranking. A school's "employment at 9 months" score counts for 14% of its overall score in the rankings. Furthermore, because reported placement rates vary relatively little across law schools, a few percentage points can make a big difference. The upshot: a law school's "employment at nine months" score, and thus its overall score in the rankings, may reflect slippery semantics more than hard facts. Here I detail how U.S. News calculates the "employment at nine months" measure, setting things up to explain how a law school might exploit the formula to improve its USN&WR ranking.
Every fall, the American Bar Association ("ABA") sends a questionnaire to the law schools it accredits asking, among other things, about the employment status of each school's graduates nine months after graduation. Questionnaire Part 1, question 25b, asks a school to fit each such student into either "graduates whose employment status is unknown" or "graduates whose employment status is known." The latter category contains the following subdivisions:
- Graduates known to be employed;
- Graduates who are enrolled in a full-time degree program;
- Graduates who are unemployed and seeking work;
- Graduates who are unemployed and studying for the bar full-time; and
- Graduates who are unemployed and NOT seeking work.
U.S. News & World Report ("USN&WR") likewise sends the law schools it ranks a questionnaire each fall. With regard to questions concerning employment, as in many other areas, USN&WR asks each such school to repeat the answers it gave to the ABA. USN&WR thus collects data under each of the headings listed above. It does not handle the data the same way the ABA does, however.
Judging from the 2007 ABA-LSAC Official Guide to ABA-Approved Law Schools, the ABA calculates the percentage of graduates employed 9 months after graduation by simply dividing the number of graduates a school reports as "Employed" by the number the school reports as "Employment status known." So, for instance, the 2007 Guide reports that American University School of Law (to pick a school at random) had 301 employed graduates out of the 338 graduates the school had tracked down, meaning that 301/338, or 89.1%, of American's graduates were employed 9 months after graduation.
In the law school rankings it worked up based on that same year's data (the "2007" rankings published last spring), however, USN&WR says that 97.2% of American Law School's graduates were employed nine months after graduation. Whence comes that figure? USN&WR does not publish all the details of its calculations. We can easily figure it out, however, by combining what USN&WR does say about its rankings methodology and a bit of reverse engineering. That effort reveals that USN&WR calculates a law school's "Employment at 9 months" (or "Emp9") score thusly:
Emp9 = ["Employed" + ("Employment status unknown" × .25) + "Pursuing graduate degrees"] ÷ [All graduates − "Unemployed not seeking employment"].
(The data labels in quotes come from the ABA-LSAC Guide. "All graduates" equates to "Employment status known" plus "Employment status unknown.") American University Law School's "Emp9" score in the USN&WR rankings thus equals [301 + (4 × .25) + 15] / (342 − 16) = 317/326 = 97.2%.
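For readers who want to check the arithmetic, here is a minimal sketch of both calculations in Python. The function names are mine, not USN&WR's or the ABA's; the numbers are American University's, as reported above.

```python
def aba_rate(employed, status_known):
    """The ABA's calculation: employed over employment-status-known."""
    return employed / status_known

def emp9(employed, unknown, grad_degree, all_grads, not_seeking):
    """USN&WR's apparent Emp9 formula, as reverse-engineered in the text."""
    return (employed + 0.25 * unknown + grad_degree) / (all_grads - not_seeking)

# American University: 301 employed, 338 status known, 4 status unknown,
# 15 pursuing graduate degrees, 342 graduates total, 16 not seeking work.
print(round(aba_rate(301, 338) * 100, 1))        # ABA figure: 89.1
print(round(emp9(301, 4, 15, 342, 16) * 100, 1)) # USN&WR figure: 97.2
```

The eight-point gap between the two figures comes entirely from the formula, not the data: same graduates, different denominators and numerators.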
My fellow ranking geeks should find that information alone quite interesting. Don't thank me, though; credit someone who prefers to remain anonymous for putting me on notice that USN&WR's "Emp9" formula differs from the ABA's. More significantly, my source also tipped me off as to why law school administrators should care very much about how USN&WR's formula (mal)functions. In brief, USN&WR's Emp9 formula allows a law school to score notably higher in the rankings by characterizing those of its graduates both unemployed and studying full-time for the Bar as "unemployed and not seeking work" rather than as "unemployed and studying for the Bar full-time."
I for now leave as an exercise for the reader such questions as: "Why does that classification strategy benefit law schools?" "Which law schools pursue that strategy?" "How much do they benefit from it?" "Is that ethical?" "How did we get into this mess?" and "How do we get out of it?" Time and guts permitting, I'll offer my own answers in a later post.
[Crossposted to Agoraphilia.]
Earlier post about Emp9 measure: