The Emp9 Puzzle: Answers and Questions
I earlier related an odd discrepancy: for a few law schools, the "employment at nine months" figures in U.S. News & World Report's 2008 rankings differ from the Emp9 figures you'd get from using the American Bar Association's data. I offered only tentative explanations for that puzzle. Grace of some correspondence, I can now offer a solid explanation for one school's data discrepancies, reservations about another school's explanation, and an anonymous cynic's take on Emp9 reporting practices.
Northern Illinois University School of Law
I emailed several parties in my attempt to understand the Emp9 data divergences I'd observed. Specifically, I wrote to two administrators (the head of career services and the associate dean) at each of the four law schools whose USN&WR and ABA Emp9 figures differed by more than 3%. Only Greg C. Anderson, Director of Career Opportunities and Development at Northern Illinois University College of Law, wrote back. I quote his explanation with his permission:
[O]ur Director of Budget and Records had made a math error when calculating our total number of graduates (Line 165) for the USNWR survey. He reported the total as 109 instead of 99. This resulted in a reduction of our placement statistics on [USN&WR's] survey. Our "Employed at 9 Months" figure should have been 90.0% instead of 82.6% while our "Employed at Graduation" figure should have been 64.6% instead of 58.7%.
(According to my calculations, the Emp9 error alone cost Northern Illinois a spot in the third tier. I hadn't noticed the additional error in the school's Emp0 number, but changing that figure in my model didn't materially affect Northern Illinois' ranking.)
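For the curious, here is a rough, back-of-the-envelope check of how that single denominator error skews the percentages. The employed counts below are my own inferences rather than figures Mr. Anderson reported, and the simple ratio ignores how USN&WR treats graduates whose employment status is unknown, so it only approximately reproduces his corrected Emp9 figure:

```python
# A rough check of how one bad denominator (109 graduates instead of 99)
# drags down the reported employment percentages. The employed counts are
# inferred for illustration, not taken from the school's actual reports:
# 64 employed at graduation reproduces both Emp0 figures exactly, and about
# 90 employed at nine months matches the erroneous 82.6% Emp9 figure.
# USN&WR's real formula also adjusts for graduates of unknown status,
# which this simple ratio ignores.

def pct(employed, total):
    """Employment rate as a percentage, rounded to one decimal place."""
    return round(100 * employed / total, 1)

WRONG_TOTAL = 109  # total graduates as mistakenly reported to USN&WR
RIGHT_TOTAL = 99   # actual total number of graduates

emp0_employed = 64  # inferred count employed at graduation
emp9_employed = 90  # inferred count employed at nine months

print("Emp0:", pct(emp0_employed, WRONG_TOTAL), "->", pct(emp0_employed, RIGHT_TOTAL))
# Emp0: 58.7 -> 64.6  (matches Mr. Anderson's corrected 64.6%)
print("Emp9:", pct(emp9_employed, WRONG_TOTAL), "->", pct(emp9_employed, RIGHT_TOTAL))
# Emp9: 82.6 -> 90.9  (close to, though not exactly, his corrected 90.0%)
```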
Mr. Anderson had apparently discovered both these errors well before my email, and had contacted USN&WR asking it to correct them. He got no reply. My query prompted him to try again. He this time directed his plea to Robert Morse, Director of Data Research at USN&WR. Mr. Anderson's renewed request offered a trenchant critique of USN&WR's data management procedures (which I again quote with permission):
[M]y goal in bringing this to your attention is to request that USNWR consider [] having a longer time period to allow schools to react to the initial rankings. This year the initial notice was sent to the law schools on March 29 at 9:06 p.m. The survey was released to all print and broadcast media as of 12:01 a.m. on March 31st. This means that, at best, we had one business day to submit any changes . . . .
Given the amount of information disclosed to USNWR in the survey, I find it hard to believe that errors such as ours are not uncommon. Your publication has an impact on potential students and I am sure you will agree that accuracy is crucial. I certainly understand the importance and necessity of hard deadlines in your industry, but schools need more than 24 hours to react to your initial notice.
Mr. Anderson raises valid points. USN&WR should do more to correct errors in its law school rankings. Even if it's too late to fix the print version, USN&WR could easily update its website. Its failure to do so sorely disserves the would-be law students who rely on the rankings, not to mention all the law schools affected by those students' decisions.
Florida International University School of Law
[NB: This post originally, and erroneously, referred to Ohio Northern University School of Law instead of Florida International University School of Law.]
I did not receive any reply to the emails I directed to Florida International University School of Law. Fortunately, though, Maggie D. Austin, the school's Assistant Dean for Career Planning and Placement, discovered my blog post independently. She explained that Florida International reported different Emp9 figures to USN&WR than it reported to the ABA because "[T]he data submitted to USNWR is the official employment data as of February 15th, 2006 as required by NALP. However, the data submitted to the ABA was as of February 1st, 2006."
As I replied, however, "The ABA's 2006 questionnaire requested, 'The percentage of graduates who are employed (as a percentage of those whose employment status is known) as of February 15 as reported to NALP.'" I double-checked with administrators at two schools, who confirmed that the ABA questionnaire [PDF] means what it plainly says. I'm thus left puzzled why Florida International reported to the ABA its Emp9 data as of Feb. 1. Regardless, though Ms. Austin did not say as much, it would appear that USN&WR got the right Emp9 data for her school; the data that Florida International sent the ABA, being two weeks premature, presumably showed lower employment rates.
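As a side note, the ABA definition quoted above measures employment against graduates whose status is known, not against all graduates. The toy counts below are hypothetical, chosen only to illustrate how much that denominator choice can matter:

```python
# Illustrating the ABA's Emp9 definition quoted above: employed graduates as
# a share of graduates whose employment status is KNOWN, rather than of all
# graduates. All counts here are hypothetical, chosen only for illustration.

employed = 150        # graduates known to be employed at nine months
not_employed = 20     # graduates known to be unemployed or still seeking
status_unknown = 30   # graduates the school could not reach

known = employed + not_employed   # 170 graduates with known status
total = known + status_unknown    # 200 graduates overall

emp9_aba = round(100 * employed / known, 1)  # 88.2% under the ABA definition
emp9_all = round(100 * employed / total, 1)  # 75.0% if unknowns stayed in the denominator

print(emp9_aba, emp9_all)  # 88.2 75.0
```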
An Anonymous Cynic's Take
My postings on this subject led to an email exchange with a seasoned law school administrator who, for obvious reasons, prefers to remain anonymous. My correspondent first observed that Ohio Northern and several other schools reported to the ABA unusually large percentages of uncategorized graduates. That is to say, those schools claimed that relatively many of their graduates had found work nine months after graduation and yet declined to describe what sort of work those graduates had found. When I questioned the salience of that observation, my correspondent hypothesized that schools "don't know the categories of employment because they are classifying students as 'employed' based on a weak evidentiary standard and not always on affirmative knowledge of employment." The explanation continued with "a cynical hypothetical" illustration:
If a school wants to know whether Janet Jones has a job, they could rely on the knowledge that Professor Smith saw Janet's parents at graduation, and they told Professor Smith how 'excited they are about Janet beginning working in the Fall.' If the career office then asks the faculty/administration if they know what Janet Jones is doing, Professor Smith could relay the 'information' he had back at graduation. Granted, it is incomplete, 2nd hand information. But it is reason enough to fill out Janet's form for NALP (to later be reported to the ABA) and call her employed.
I relate my correspondent's comments because they convey a viewpoint I'd not considered, and one that doubtless reflects hard-won experience. I do not present them as my own opinion, much less as the Truth. Even my correspondent, in passages I've not quoted, offered alternative, and less damning, explanations of oddities in law schools' Emp9 data. But these sorts of worst-case scenarios will remain all too likely unless and until the ABA starts actively policing how law schools describe themselves.
[Crossposted to Agoraphilia.]
Earlier posts about the Emp9 measure:
- Change to U.S. News Law School Rankings Methodology
- How U.S. News Calculates "Employment at 9 Months"
- The How, Who, and Why of Strategic Emp9 Reporting
- USN&WR to Change Employment Measure
- ABA v. USN&WR on "Employment at 9 Months" Data
- The Ethics of Strategic Emp9 Reporting
- 2008 USN&WR Law School Rankings Under New Emp9 Formula