The How, Who, and Why of Strategic Emp9 Reporting
In a prior post, I described how U.S. News & World Report calculates law schools' "employment at 9 months" figures and observed that its "Emp9" formula encourages semantic sleight of hand. Specifically, a law school can score notably higher in USN&WR's rankings by characterizing a graduate as "unemployed and not seeking work" rather than as "unemployed and studying for the Bar full-time." I closed that post with several questions, which I here start tackling.
Why does that classification strategy benefit law schools?
Working through the details of USN&WR's Emp9 formula, spelled out in my prior post, makes evident why a law school benefits from classifying its graduates as "unemployed and not seeking work" instead of "unemployed and studying for the Bar full-time." Putting students in the former category decreases the denominator in USN&WR's Emp9 formula, thus increasing a school's Emp9 score. Calling a student "unemployed and studying for the Bar full-time" has no like effect.
Recur to the randomly chosen example I used earlier: American University School of Law. In the fall of 2005, it reported to the ABA (and thus presumably to USN&WR) that it had had 16 students "unemployed and not seeking work" and 3 students "unemployed and studying for the Bar full-time" nine months after their 2004 graduation. Those figures, together with others used by USN&WR, gave it a 97.2% Emp9 in the 2007 rankings (which issued in March of 2006). Suppose that American had classified all 19 of those students as "unemployed and not seeking work." In that event, it would have boasted an Emp9 of 98.1%. Conversely, if it had classified all 19 as "unemployed and studying for the Bar full-time," it would have had an Emp9 of only 92.7%.
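To make the arithmetic concrete, here is a minimal sketch of the denominator effect. It is not USN&WR's full Emp9 formula (that is spelled out in my prior post), and the class size and employed count are hypothetical, chosen only so that this simplified calculation lands near the three figures above.

```python
# Minimal sketch of the denominator effect in a simplified Emp9 calculation.
# ASSUMPTION: this is not the full USN&WR formula; it keeps only the feature
# at issue here: graduates "unemployed and not seeking work" drop out of the
# denominator, while graduates "unemployed and studying for the Bar full-time"
# stay in it. The class size and employed count below are hypothetical.

def simplified_emp9(employed, total_grads, not_seeking):
    """Employment rate with 'not seeking work' grads removed from the denominator."""
    return 100.0 * employed / (total_grads - not_seeking)

EMPLOYED = 318  # hypothetical count of graduates treated as employed
TOTAL = 343     # hypothetical size of the graduating class

# 16 "not seeking work" and 3 "studying for the Bar full-time":
print(round(simplified_emp9(EMPLOYED, TOTAL, not_seeking=16), 1))  # 97.2
# All 19 reported as "not seeking work":
print(round(simplified_emp9(EMPLOYED, TOTAL, not_seeking=19), 1))  # 98.1
# All 19 reported as "studying for the Bar full-time":
print(round(simplified_emp9(EMPLOYED, TOTAL, not_seeking=0), 1))   # 92.7
```

The same number of graduates sits idle in every scenario; only the label changes, and with it the denominator.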
Which law schools pursue that strategy?
I do not know which law schools, if any, have embraced the characterization strategy I've described. I am not privy to any law school's deliberations on that count—not even my own school's. I venture, however, that if a law school has opportunistically pushed graduates into the "unemployed and not seeking work" category and out of the "unemployed and studying for the Bar full-time" one, it will tend to report a large percentage of graduates in the former category relative to the latter one.
To identify which law schools might have done that, I drew on data in the 2007 ABA-LSAC Official Guide to ABA-Approved Law Schools. I collected it in an Excel file, together with Emp9 data from last year's USN&WR rankings, and offer it to you for the asking; just drop me a line. (Contrary to what it promised last fall, the ABA has still not made its data available in an easily downloaded format.)
I invite you to analyze that data as you see fit. As I said, though, comparing a law school's "unemployed and not seeking work" figure with its "unemployed and studying for the Bar full-time" figure might help us to determine—not "prove"!—which law schools have reported placement data in ways that improved their Emp9 scores. When I subtracted the latter measure from the former, I found that these schools had "strategic reporting indicator" scores above 5%:
Again, I emphasize that I do not know why the law schools listed above had so many graduates "unemployed and not seeking work" relative to "unemployed and studying for the Bar full-time." I've written to a couple of those schools seeking explanations, but have as yet gotten no replies. For now, all I can say is that the law schools listed above, as well as many other schools with notably high though lesser "strategic reporting indicator" scores, reported placement numbers in a pattern consistent with what we would expect from a school that categorized its graduates so as to maximize its USN&WR ranking.
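For readers who want to run the numbers themselves, here is a minimal sketch of the indicator calculation. The file name and column names are assumptions about how you might lay out the spreadsheet, not the actual layout of the file I offer above; the indicator itself is simply the difference of the two percentages.

```python
# Minimal sketch of the "strategic reporting indicator": the percentage of
# graduates "unemployed and not seeking work" minus the percentage
# "unemployed and studying for the Bar full-time." The file name and column
# names are hypothetical; adjust them to match your copy of the spreadsheet.

import pandas as pd

df = pd.read_excel("emp9_data.xlsx")  # hypothetical file name

df["indicator"] = (
    df["pct_unemployed_not_seeking"] - df["pct_unemployed_studying_for_bar"]
)

# Schools whose indicator exceeds five percentage points, highest first.
flagged = df[df["indicator"] > 5.0].sort_values("indicator", ascending=False)
print(flagged[["school", "indicator"]].to_string(index=False))
```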
How much do they benefit from it?
Because the Emp9 measure counts for 14% of a law school's score in the rankings, and because most schools' scores cluster in a narrow range, relatively small changes in a school's Emp9 measure can have a large effect on its ranking. Ted Seto documents that phenomenon quite ably in his excellent paper. To give you an idea of how much strategically categorizing graduates can help a law school, allow me to run some numbers through my model of the 2007 rankings.
The 2007 USN&WR rankings, for instance, credited UCLA with a 99.7% Emp9, an overall score of 71, and a rank of 15. Suppose, however, that UCLA had reported not the "unemployed and not seeking work" and "unemployed and studying for the Bar full-time" percentages related above, but rather the average percentages reported by all law schools ranked by USN&WR—2.3% and 3.0%, respectively. In that event, holding all else equal, UCLA would have had a 90.0% Emp9, an overall score of about 68, and a rank of 17—neck-and-neck with its cross-town rival, USC.
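My model of the 2007 rankings involves more than I can reproduce here, but the basic mechanism is easy to sketch. As I understand USN&WR's method, it standardizes each input, applies its published weights (14% for Emp9), and rescales the weighted sum so that the top school scores 100. The standard deviation and rescaling factor in the sketch below are hypothetical, chosen only to show why a roughly ten-point drop in Emp9 can cost a school a few points of overall score and, in a tightly clustered field, a few ranks.

```python
# Sketch of how a change in Emp9 alone propagates to the overall score.
# ASSUMPTIONS: the rankings are generally described as standardizing each
# input, weighting it (0.14 for Emp9), summing, and rescaling so the top
# school scores 100. The standard deviation and rescaling factor below are
# hypothetical; this is not my actual model of the 2007 rankings.

def overall_score_change(delta_emp9, emp9_std_dev, weight=0.14,
                         rescale_factor=20.0):
    """Approximate change in overall score from a change in Emp9 alone."""
    delta_z = delta_emp9 / emp9_std_dev   # change in standardized units
    return weight * delta_z * rescale_factor

# UCLA's hypothetical fall from a 99.7% to a 90.0% Emp9, with an assumed
# nine-point standard deviation in Emp9 across ranked schools:
print(round(overall_score_change(delta_emp9=-9.7, emp9_std_dev=9.0), 1))
# -> about -3, consistent with an overall score near 68 rather than 71
```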
I here pause again, leaving for now unaddressed these questions: "Is that [i.e., strategic Emp9 reporting] ethical?" "How did we get into this mess?" and "How do we get out of it?" I'm not yet very sure about my answers. (With regard to the last two, at least, Andy P. Morriss' and William D. Henderson's impressive draft paper offers some leads.) Please feel free to comment here, or to email me, if you have your own answers.
[Crossposted to Agoraphilia.]
Earlier posts about the Emp9 measure: