Grades aren't bad measures of ability, but they're certainly not perfect ones. For one thing, curved grades show comparative ability, not absolute ability. For another, to the extent that law professors aren't particularly well-trained at composing tests, law students always run the risk of being part of the garbage-in, garbage-out problem. Moreover, taking law school exams is itself a learned skill, at least in the first year of law school. Students who "get" how to write exams have an enormous edge in the first semester, and their GPAs get a head start.
Grades aren't good measures of other skills that good lawyers must have, though: grades don't measure the ability to work in teams, emotional intelligence, public speaking ability, etc. In a way, many exams reward the ability to come up with every possible answer, even silly ones, as a way of demonstrating the student's capacity for wringing meaning from the hypothetical. Such exams don't reward those students who decide not to write about every possible answer--in other words, those exams don't reward the students who use common sense to restrict their answer to the most important possibilities. (That's one reason that I prefer to limit the space for answers--it forces students to focus on the most important answers, not every possible answer.)
Think about how students get selected for summer associate slots at many big law firms. Whether a student gets a 20-minute, on-campus screening interview is determined largely by law school GPA. Students who get call-backs go for a full-day interview, broken into 20- to 30-minute, one-on-one in-office meetings with partners and associates, plus a lunch. Unless the in-office interviewers are particularly skilled, those 20- or 30-minute sessions are a rehash of the student's resume, coupled with routine questions ("Why are you interested in our firm?") and routine answers ("Because your firm is the leader in [insert language from firm's website]") that reveal nothing about whether there's a match between the student's abilities and interests and the firm's needs.
The advantage of these types of interviews? Relatively little lawyer-time. The cost? HUGE. By not taking the time, at the front end, to look beyond (1) the rank of the school (group membership), (2) grades (and if grades are curved, then there's a group-membership complication in the grades as well), and (3) whether the student can walk and talk, the firm is buying itself some serious associate attrition as soon as the associates are mobile (coincidentally, associates become mobile around the time they first turn profitable--after the second or third year of practice).
What if law firms used a MoneyLaw approach and searched for indicia of those several skills that the most successful lawyers in the firm have? (Heck, I'd settle for an analysis of the law school GPA of those associates who stayed and made partner....) Here are some possibilities:
1. Participation in clinical programs. (Reducing the effect of overall GPA on the decision to interview a student.)
2. Participation in moot court, mock trial, or writing competitions. (Ditto.)
3. Answering hypotheticals during the interview. (Focusing on individual skills, not group membership.)
4. Getting an assignment during the interview and then returning it within a short timeframe. (Ditto.)
5. Being willing to dip twice as far below a school's usual GPA cutoff to see whether there's any significant difference in the quality of candidates.
Until law firms change the way that they hire, MoneyLaw schools--which would be lovely places to work--won't provide their graduates with one of the crucial benefits of a legal education: the ability to find satisfying work in the law.
OK, Bill and Al: I still don't have a MoneyLaw theory, per se, but maybe I'm on my way....