Saturday, February 24, 2007

MoneyLaw schools in a non-MoneyLaw world, part II

(Part I is here.) As long as employers stick to their tried-and-not-so-much-true methods of choosing employees, graduates of MoneyLaw schools will head into a world that values group membership over individual abilities--a very anti-MoneyLaw world. Here's why:

Grades aren't bad measures of ability, but they're certainly not perfect ones. For one thing, curved grades show comparative ability, not absolute ability. For another, to the extent that law professors aren't particularly well trained at composing tests, law students always run the risk of being caught in a garbage-in, garbage-out problem. Moreover, taking law school exams is itself a learned skill, at least in the first year of law school. Students who "get" how to write exams have an enormous edge in the first semester, and their GPAs get a head start.

Grades aren't good measures of other skills that good lawyers must have, though: grades don't measure the ability to work in teams, emotional intelligence, public speaking ability, etc. In a way, many exams reward the ability to come up with every possible answer, even silly ones, as a way of demonstrating the student's capacity for wringing meaning from the hypothetical. Such exams don't reward those students who decide not to write about every possible answer--in other words, those exams don't reward the students who use common sense to restrict their answer to the most important possibilities. (That's one reason that I prefer to limit the space for answers--it forces students to focus on the most important answers, not every possible answer.)

Think about how students get selected for summer associate slots at many big law firms. Whether they get a 20-minute, on-campus screening interview is determined by their law school GPA. If they get call-backs, they go in for a full-day interview, broken into 20- to 30-minute, one-on-one in-office meetings with partners and associates, plus a lunch. Unless the in-office interviewers are particularly skilled, those 20- or 30-minute sessions are a rehash of the student's resume, coupled with routine questions ("Why are you interested in our firm?") and routine answers ("Because your firm is the leader in [insert language from firm's website]") that reveal nothing about whether there's a match between the student's abilities and interests and the firm's needs.

The advantage of these types of interviews? Relatively little lawyer-time. But the cost? HUGE. By not taking the time, at the front end, to look beyond (1) the rank of the school (group membership), (2) grades (and if grades are curved, then there's a group membership complication in the grades as well), and (3) whether the student can walk and talk, the firm is buying itself some serious associate attrition as soon as the associates are mobile (coincidentally, associates are mobile around the time they first turn profitable--after the second or third year of practice).

What if law firms used a MoneyLaw approach and searched for indicia of those several skills that the most successful lawyers in the firm have? (Heck, I'd settle for an analysis of the law school GPA of those associates who stayed and made partner....) Here are some possibilities:

1. Participation in clinical programs. (Reducing the effect of overall GPA on the decision to interview a student.)
2. Participation in moot court, mock trial, or writing competitions. (Ditto.)
3. Answering hypotheticals during the interview. (Focusing on individual skills, not group membership.)
4. Getting an assignment during the interview and then returning it within a short timeframe. (Ditto.)
5. Being willing to dip twice as low into a school's cutoff GPA to see if there's any significant difference in the quality of candidates.
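That last suggestion is an empirical question a firm could actually test. As a rough illustration only, here is a minimal Python sketch of the comparison: it uses entirely invented performance-review scores for two hypothetical groups of associates, one hired above the firm's usual GPA cutoff and one below it, and checks whether the gap between the groups is large relative to the spread within them. The group sizes, score scale, and numbers are all assumptions for the sake of the example, not data from any firm.

```python
import random
import statistics

random.seed(0)  # reproducible invented data

# Hypothetical first-year performance reviews (1-5 scale) for associates
# hired from above vs. below the firm's usual GPA cutoff. All invented.
above_cutoff = [random.gauss(3.6, 0.6) for _ in range(40)]
below_cutoff = [random.gauss(3.5, 0.6) for _ in range(40)]

def summarize(label, scores):
    """Print the mean and standard deviation of one group's scores."""
    print(f"{label}: mean={statistics.mean(scores):.2f}, "
          f"sd={statistics.stdev(scores):.2f}")

summarize("Above GPA cutoff", above_cutoff)
summarize("Below GPA cutoff", below_cutoff)

# If the gap in means is small relative to the spread within each group,
# the GPA cutoff is screening out candidates who perform about as well.
gap = statistics.mean(above_cutoff) - statistics.mean(below_cutoff)
print(f"Gap in means: {gap:.2f}")
```

The point of the sketch is only that the comparison is cheap to run once a firm collects review scores alongside law school GPAs; a real analysis would want more careful controls than a difference in means.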

Until law firms change the way that they hire, MoneyLaw schools--which would be lovely places to work--won't provide their graduates with one of the crucial benefits of a legal education: the ability to find satisfying work in the law.

OK, Bill and Al: I still don't have a MoneyLaw theory, per se, but maybe I'm on my way....

3 Comments:

Blogger William Henderson said...

Nancy, a great series of posts.

Two comments:

1) UCLA established a very substantive business law certificate program for its students, including a skills-based capstone requirement taught by adjuncts. I am told that employers are willing to dig deeper into the class to hire students who have bonded themselves with this program. I have heard similar anecdotes about Univ of Tenn. The burden is on the law schools to strike a deal to change employer behavior. We are not passive actors here.

2) Law schools control the exam / grading process. If grades are the product of something truly artificial and not strongly related to skills required by practicing lawyers, then we can come up with a grading system that more closely mirrors practice. (This was one of the takeaways from my 2004 Tex L Rev article.) Employers will use any GPA we provide. The Carnegie Report (Educating Lawyers (2007)) explicitly states that law faculty "privilege" the "cognitive apprenticeship" with summative high-stakes exams. So the education experts would favor the change!

By the way, I have a detailed memo, ready for law partner circulation, that lays out how legal employers can use statistical analysis to play Moneyball/Moneylaw in recruitment. If you know one hiring partner who is frustrated by attrition or associates with brains but little emotional intelligence, we need to talk. bh.

2/24/2007 11:35 AM  
Blogger Unknown said...

Actually, there's at least one law partner--one of the founding partners of a Dallas law firm--who's working with a team of social scientists on an instrument that tests rainmaking ability. To him (and, in full disclosure, to me, as I've been working with him on this project), the fact that rainmakers bring in the lion's share of the revenue means that those characteristics count more than does GPA. Happy to talk w/you offline about this in more detail. And I agree w/you (as usual!) about the fact that law schools could level the playing field by reforming the curriculum and by changing the ways in which they evaluate students.

2/24/2007 5:00 PM  
Blogger Anthony Ciolli said...

"1. Participation in clinical programs. (Reducing the effect of overall GPA on the decision to interview a student.)"

How would this work when most big firms hire summer associates during an OCI process that takes place during the 1st term of 2L year? No one would have taken a clinic as a 1L, and, given the high student demand for clinics, it's rare at many (most?) schools to even get into a clinic until 3L year.

"2. Participation in moot court, mock trial, or writing competitions. (Ditto.)"

Doesn't law review / journal membership already take this into account, at least at schools where most law review slots are determined by a writing competition and not grades? Not saying I like the idea of journal membership being used as a proxy for anything during the hiring process, but I think it is pretty safe to say that the assumption is that if you're on law review you likely have better writing skills than the rest of the student body. Now, as for whether mere participation in those other activities should also count, I have a feeling the signal would significantly lose its value after it becomes known that firms prefer people who submit to writing competitions and everyone then submits something somewhere just so they can say they "participated."

Agreed with Professor Henderson on the grades issue -- if grades don't correlate well with practice, and one of the purposes of grades is to be an indicator of one's ability as a practicing attorney, then it's up to the law schools to fix the broken system. Then again, that assumes that law schools should be in the business of providing easy sorting mechanisms for law firms.

2/24/2007 5:16 PM  
