I should point out now that I began all of the work featured on Law School Almanac as an uninformed amateur, and I did not set out to produce anything approaching proper scholarship. I have read whatever law review articles, blogs, and other materials I happened to find during my research, but I never undertook a comprehensive survey of the literature on law school rankings (nor have I even started law school, as I write this). I will therefore apologize now for any point on which I have accidentally duplicated prior work without proper credit, or any respect in which, as a novice, I have reinvented wheels or repeated work done earlier and better by someone else.
Like many people who set out to attend law school today, I noticed several things soon after I began: 1) Our profession is gripped like no other by a strange fetish for scoring, sorting, and ranking, and this fetish dominates almost every aspect of the legal education and hiring markets; 2) The information we use to feed this fetish is often vague, unreliable, and corrupt; and 3) The methods used for scoring, sorting, and ranking are in the worst cases obviously invalid and unreliable, and even in the best cases often irrelevant to the goals of those who rely on their results.
I am sure that I don't need to mention the many faults with the US News & World Report rankings, as others on this forum and in the literature have done a far better job of that. When I set out to look beyond these rankings, I had three goals in mind: 1) Cut through the hype that pervades popular opinions on legal education; 2) Test anecdotes with data; and 3) Decide where to attend. I wanted to find information that related to my concepts of value in legal education, verify or refute anecdotal judgments that I heard from others, and find some rational basis for choosing schools that I would apply to and eventually attend.
I think that the posts on LSA cover the method for the ANNR study well enough that I won't need to say much on that topic here. Instead, I will address some problems with the data which readers and users should consider. And I'll do that by looking at the numbers for two schools: Notre Dame and Thomas M. Cooley. I chose these because they have similar numbers of alumni included in the data from the survey; they have about as large a gap in reputation and USN&WR ranking as any two schools we might pick; they follow very different models for admitting students; and yet they somehow both wind up ranked in the Top 20 by ANNR score.
The Thomas M. Cooley School of Law attracts a certain type of attention from prospective law students, in part due to its notorious Judging the Law Schools rankings, which also happen to place Cooley in the Top 20 among all schools. I won't try to explain the reasons behind Cooley's position in its own rankings, but I think we can find some likely factors that account for its success on the ANNR measure. We'll start by looking at a few numbers that describe the student bodies of these schools in the current ABA Official Guide to Law Schools:
NUMBER OF STUDENTS           ND       TC
Applied:                  3,499    4,978
Admitted:                   651    3,699
Enrolled:                   176    1,580

NUMBER OF GRADUATES          ND       TC
Awarded JD degree:          184      805
Sitting for bar exam:        69      229
Passing bar exam:            62      183
Bar passage rate:          0.90     0.80
Three things about these numbers stand out right away: 1) Cooley admits more students than even apply to Notre Dame; 2) Cooley enrolls ten times more students each year; and 3) Notre Dame graduates more or less the same number of students they enroll, but Cooley has a harrowing total attrition rate of around 50%. Another thing worth noting is that we cannot fully reconcile the bar exam numbers with enrollment and graduation numbers, because schools report to the ABA the results of graduates taking the bar exam in only one or two jurisdictions. This issue will come up again when we look at some problems with the Legal Education Value Added Rankings.
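The ratios behind these three observations can be checked with a few lines of arithmetic. A minimal sketch, using the figures copied from the table above; note that the graduates-to-enrolled ratio is only a rough proxy for attrition, since the degrees awarded and the entering class come from different years:

```python
# Back-of-the-envelope checks on the ABA Official Guide figures quoted above.
# "grads/enrolled" compares a recent graduating class to a recent entering
# class, so it only approximates true cohort attrition.

schools = {
    "Notre Dame": {"applied": 3499, "admitted": 651, "enrolled": 176,
                   "jd_awarded": 184, "bar_taken": 69, "bar_passed": 62},
    "Cooley":     {"applied": 4978, "admitted": 3699, "enrolled": 1580,
                   "jd_awarded": 805, "bar_taken": 229, "bar_passed": 183},
}

for name, s in schools.items():
    selectivity = s["admitted"] / s["applied"]
    survival = s["jd_awarded"] / s["enrolled"]
    bar_rate = s["bar_passed"] / s["bar_taken"]
    print(f"{name}: admit rate {selectivity:.0%}, "
          f"grads/enrolled {survival:.2f}, bar pass {bar_rate:.2f}")
```

Running this shows an admit rate near 19% for Notre Dame against about 74% for Cooley, and a graduates-to-enrolled ratio of roughly 0.5 for Cooley, which is the source of the ~50% attrition figure above.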
The numbers above make clear the radically different models on which these two schools operate. Notre Dame, like most law schools today, is relatively selective and admits fewer than 1 in 5 applicants each year. But of those who are admitted and attend, nearly all will graduate. And of those who sat for the bar exam in Illinois and New York, 9 in 10 passed on their first try.
Cooley, on the other hand, operates the closest thing to an open admissions program that we have left in the world of American legal education. Its application requires no personal statement, no letters of reference, and no submission fee. If a candidate can score at least 143 (20th percentile) on the LSAT and has a 3.0 GPA, then they're in. If they score 149 or higher (40th percentile), then that alone will suffice. If they hit 163 (89th percentile), then they can attend for free. Cooley even has a process for admitting candidates who never finished a college degree but who have completed 60 hours of college-level work. The hardest part of the whole application for most candidates will be detailing any criminal history they may have.
Once admitted to Cooley, students must face the much harder problem of graduating -- the school appears to grant only about half as many JD degrees each year as the number of students who enroll. But of those who graduated and who attempted the bar exam in Michigan, 8 in 10 passed on their first try. In fact, the number of Cooley alumni who passed the bar exam in Michigan was about the same as the total number who graduated from Notre Dame. As noted above, we can't really tell how many students from either school took or passed the bar in jurisdictions besides those reported. And there may be no good source today for finding those numbers. Some states such as California report bar exam statistics by school, but not all may disclose the detail needed to compile full results for all schools.
Looking at the 4,254 Notre Dame alumni listed by Martindale, and dividing by the number of JD degrees awarded in the most recent year, it appears that there may be about 23 years' worth of graduating classes represented. If we assume that the Martindale directory includes about the same number of graduating classes for Cooley, and if the school graduated as many lawyers per year in the past as it does now, then we might expect it to have something like 18,500 alumni listed. This is close to the total number for other large schools such as Harvard, and more than four times as many as the 4,287 that actually appear in the Martindale directory. Even if classes at Cooley were smaller two decades ago, there might still be as many as 16,000 matriculants just from the past ten years, and maybe 8,000 alumni.
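The estimate above can be reconstructed in a few lines. This is a sketch under the same simplifying assumptions the text makes: that both schools' Martindale listings span roughly the same number of graduating classes, and that class sizes were constant over that span.

```python
# Rough reconstruction of the alumni estimate in the paragraph above.
# Assumes constant class sizes and equal directory coverage -- both
# simplifications, as noted in the text.

nd_listed = 4254        # Notre Dame alumni listed in Martindale
nd_jd_per_year = 184    # JD degrees awarded in the most recent year
class_years = nd_listed / nd_jd_per_year
print(f"Implied class-years represented: {class_years:.1f}")

cooley_jd_per_year = 805
cooley_expected = cooley_jd_per_year * class_years
cooley_listed = 4287    # Cooley alumni actually listed in Martindale
print(f"Expected Cooley listings: {cooley_expected:,.0f} "
      f"vs. actual {cooley_listed:,}")
print(f"Shortfall factor: {cooley_expected / cooley_listed:.1f}x")
```

The implied span works out to about 23 class-years, the expected Cooley count to roughly 18,500, and the shortfall to a bit more than a factor of four -- the gap that motivates the extinction-bias discussion below.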
In the case of Cooley then, the huge gap between the number of students entering and the number listed in the Martindale directory (those who apparently graduate, pass the bar exam, and find work as lawyers) suggests caution in drawing certain conclusions based on the ANNR data. At the least, for some schools the survey of alumni listed in the Martindale directory must suffer from extreme extinction bias relative to the population of students who enroll. This extinction bias results if many enrolled students fail to graduate, fail to pass the bar exam, or cannot find work as attorneys after passing. And further, selection bias may result if many graduates choose not to take the bar exam or decide not to seek work as lawyers.
For these reasons, it seems unwise to rely on the results of the ANNR study as predicting anything about the prospects of entering students at any given school, without looking carefully at how the entering and "surviving" populations relate. At best, the ANNR statistics describe outcomes or behaviors for those from each school who succeed in becoming licensed attorneys and who choose to appear in the Martindale directory. That may include most students who enroll at some schools, and very few at others. We can assume as a general rule that the survey will include mainly the results of the most successful graduates. So the results may overstate outcomes or rates of success for all schools to some degree, and for some schools to an extreme degree, at least in relation to the population of entering students.
On the other hand, if we consider only the population of "survivors," then the examples of Notre Dame and Cooley may suggest that schools can achieve similar results with very different models. The selective admissions / high success model at Notre Dame may produce about the same number of licensed attorneys overall as the open admissions / heavy attrition program at Cooley. Given the great weight of LSAT scores in admissions decisions at more selective schools, and the modest correlation of those scores with law school grades (0.36 to 0.44), we might even argue that the selective admissions model excludes some candidates who would otherwise succeed under an open admissions / high attrition program. Of course we cannot judge the quality of jobs attained by graduates of either school by the distribution of alumni alone. But the fact that significant numbers of Cooley survivors land in regions other than the Midwest suggests that they find work in many different settings, and that alumni can expect to find modest numbers of their fellows in many areas.
Finally, the ANNR survey may also include significant selection bias for all schools if many successful graduates decline to register with Martindale. But the population of about 772,000 alumni found in the survey matches well with the Department of Labor's estimate that there were 761,000 lawyers working in America in 2006. So it seems reasonable to assume that the results of the study include most active licensed attorneys in the country today, even though they may cover only a small portion of those who attended or graduated from any given school.