None of the Above: Multiple Choice, Machine Graded
Grading 190 essay contracts exams gets me riled up about multiple-choice-machine-graded (MCMG)-test-giving law professors. I am not talking about a few MCMG questions to get the students warmed up. And I am not talking about multiple choice with an explanation – essentially short essay questions. I am thinking about those who give, and then defend the giving of, MCMG as the primary evaluative tool.
When I wrote what is perhaps becoming my annual blog on the topic last year, I conducted a poll. I was happy to see that only a small percentage of those responding use MCMG as the exclusive method of evaluation. I suspect that understates reality. In addition, I did not include the choice “use MCMG for more than half but less than all of the exam.” Faculty using MCMG exams for a substantial portion of the exam are, to me, dodging their obligations to fairly evaluate students. Essays are the only way that students can actually be heard. Essay questions allow a two-way conversation in which the students have an opportunity to express the reality that law is nuanced, fuzzy, and often inconsistent. Of course, maybe MCMG testers aren’t teaching that at all, which raises a more fundamental question.
I have heard a couple of defenses of this practice. One is that the outcome in terms of grades is the same as an essay test. I do not know anyone who has asserted this who has actually tested it. Aside from that, the more important point is that as soon as the students know that MCMG is the principal testing tool, you are teaching a different course than people who are not using MCMG. The entire focus of the students changes. And, to the extent you fancy yourself teaching critical thinking and analysis, forget it.
Another defense is that people have different ways of learning. Huh? It’s not that I doubt that people learn in different ways; it’s that I do not see the connection between that and testing. MCMG exams and fact-intensive complex essays test different things, no matter how the material was learned. I doubt any attorney is going to advertise that he or she is not so good at the analysis of complex fact patterns but hell on wheels when it comes to multiple choice.
Maybe, just maybe, if law professors giving MCMG had some experience or training in testing theory and exam writing I could be swayed. But I doubt it and I have yet to see a multiple choice law exam that required analysis.
Basically, in law school, if the exam can be graded with a machine, the course could have been taught by a machine. I could be wrong but I have yet to hear an argument for MCMG exams that makes sense other than a rationalization for not grading.
17 Comments:
I share your views regarding multiple choice exams (I've never used MC questions), but essays aren't the only way that students can actually be heard. In all my upper-level courses, I have my students write multiple open research memos. I give no exams. This approach has its costs.
Across a full semester, I spend roughly twice as much time grading memos as I would grading exams for a class of the same size. The approach works much better than exams in terms of measuring student comprehension and providing detailed feedback before the semester ends.
Mike Madison
University of Pittsburgh
I have only taught as an adjunct and have exclusively relied on essay examinations. I generally favor essay examinations in all subjects. But could you not in a contracts, evidence or property course, for example, craft meaningful, workable, multiple choice questions? Would there be value to that given that these subjects will be tested on bar examinations via multiple choice questions? I am not advocating for multiple choice examinations, nor am I suggesting that law schools simply teach to bar examinations, but I question whether law school faculty members' use of machine-graded examinations is really much of an issue.
I've taken multiple-choice exams in both Contracts and Evidence (in the latter, 75% of the test was actually true/false). In Evidence, the questions were generally of the, "X, Y, Z: hearsay?" variety. In Contracts, the questions were absolutely crazy, but it's not clear that they were actually meaningful.
This is not to say that, as Doug suggests, such questions could not be written.
(Note: I've also had a multiple choice exam in Criminal Procedure, which strikes me as absurd.)
I'm afraid that you are right on the "money," Jeff. I agree that giving pure MC exams indeed constitutes a shirking of teaching obligations. However, here is a potential defense of them: since MC questions can be graded by machine, and since grading essays takes lots and lots of time, a faculty member can consequently use MC exams in order to devote her time to higher, far more important scholarly pursuits. There is, after all, a trade-off between teaching and scholarship, and if the former is beside the point, why not give MC exams? It goes back to your post about 50 million or so law review articles over the past decade!
Thanks for the comments. Mike, I like the memo idea. Doug, I just have not seen multiple choice questions that really permit the students to let the professor know that the law is imprecise and fuzzy and why. I would love to be proven wrong because it would eliminate about a month of misery.
The problem with multiple-choice questions is that law professors often don't know a thing about writing them or figuring out whether they're valid (as in item-validity). If the questions aren't valid--in other words, if the best students don't choose the correct answers more often than the worst students do--then the professor is just substituting ease of grading for precision of grading, which is pure laziness.
SURE, we'd love to having grading over and done with quickly. It's boring and time-consuming. But it's also part of our job. If there are other, better ways to evaluate our students (including grading memos, projects, etc.), then we should use those ways, either in substitution for exams or in addition to them.
Frankly, we law professors lead incredibly privileged lives of freedom and power, and we're asked to do comparatively few onerous duties. Grading is one of those duties. Compared to most other jobs in the world (anyone for chicken-sexing?), our lives are pure heaven.
Time to bask in our good fortune, and then buckle down to work.
I seem to be nearly alone among my colleagues at Acorn Law in giving an exclusively essay-based final. My more senior colleagues advised against it, urging that I use a combination of MCMG and short answers like they do. A cynic might suspect they're concerned that the students will realize they've been cheated.
I did give some MCMG quizzes during the semester, but those were merely by way of making sure the students were actually keeping up with the reading.
Let's cut through the fog: if you believe that what you have taught to first year Contracts students is adequately tested by multiple choice questions (even well-written multiple choice questions) then you are either fooling yourself or just trying to keep the students mollified. Law is not subject to multiple choice questions. Those who point to the bar exam are fools. If you are teaching your child to drive, do you teach to the test or to the reality of driving on real roads? Isn't law just a little bit more serious than introducing a new driver to the road (and I mean that seriously)?
Jeff, the bottom line is that a tenured professor teaching 10-11 credits who complains about grading exams is a complete loser who doesn't deserve the cushy job that he or she has. I am surprised that you weren't more forthright in your post.
What if we all agreed that law schools were created to make it possible for law professors to do what they want to do while pretending to add value to the student experience? Wouldn't that change a lot of the analysis of legal education and make everything crystal clear?
My school is dominated by faculty who rely almost exclusively on MCMG questions, which I view as pitiful and deliberately lazy, only made worse by the rampant (and erroneous) rationalization of this practice by those who employ it. That said, after a decade of giving only essay questions, I now give one-quarter MCMG and one-quarter short answer, and I think that's a valuable educational practice. I do a lot of statistical analysis of the results as well, and there's always a very strong correlation between MCMG scores and both essay grades and bar passage. And I think it's very possible to write accurate MC questions, at least in my areas (Civil Procedure and Legal Ethics). Those who believe that subjective essay exams alone are the best measure of performance go too far, in my view. A mix of testing measures seems right -- and more accurate -- to me than exclusive use of essays.
I think Lucky Jim comes close to equating multiple choice with short answer, which, if that's the position, isn't really fair. I've had short-answer exams in tax and payment systems that I thought were completely fair and accurate.
I may have a different conception of "short answer", though, as these exams were of the 3-4 sentence answer variety, and that may be longer than other people's ideas of "short" answer.
I completely agree with your characterization of MCMG exams as inadequate to test how well a student understands nuances of the law. I am currently a 2L, and while I have done very well on the MCMG exams I've taken in law school (only one criminal law exam and a property exam on future interests), I found both of them to invite rote memorization rather than analysis. Even worse, the criminal law exam was riddled with errors (changes in character names in the fact patterns, etc.), which illustrates a failure by the professor to even draft the exam carefully, and it often had two answers that a good student could have argued were correct.
Picking the correct answer with a 1/4 chance of guessing correctly is much different from finding an issue and analyzing it.
I have seen good multiple choice with explanation questions. They are really ways to focus students on a specific issue. I do not know if this is the same as "short answer." Lucky Jim, I am sorry to hear that about Acorn.
I don't think first year teachers come to the profession expecting to give MCMG exams. They must be advised to do so. The most likely advisors -- those giving MCMG exams and having a stake in having others do so.
I definitely didn't mean to conflate short answer and multiple choice, and I apologize if it sounded that way. What some of my colleagues call "short answer", though, is more like fill-in-the-blank, not explain-in-a-few-sentences.
Can you explain how an MCMG test would really be different from a typical essay question where the professor looks for issue spotting and key words? While a student may not be able to get away with just a list of bullet points, the actuality was not that far off at my top-5 school for the foundational courses.
What about multiple choice exams as a means to prepare for the bar exam? I teach AP government, and even though I'd prefer to give essay only exams, I'm encouraged to give some MC questions because, besides essays, that's what they'll see on the AP exam.
Same goes for my 10th graders (world history). Since they'll see MC questions on the FCAT, I give them MC questions on their tests (I give them essay and short answer as well).
But, if an argument can be made for MC questions, I think that might be it. We go to law school to learn how to think about the law, but also to prepare ourselves to practice law, a mandatory prerequisite for which is the protectionist and abominable bar exam. Though I passed, I found the test more difficult because I never wanted to say, "THIS is the answer." Perhaps a few MC questions in law school would better prepare us to handle the bar.
Incidentally, I like Mike Madison's approach. I'm certain I could've done better than a C+ in your class all those years ago if I could have written memos instead of taking tests (and I'm sure Paige, Justin, and Jason B., with whom I studied all semester, and who all got "A's", would've done just as well under that system).
And surely open research memos mimic the actual practice of law better than 5 pages of implausible "facts" that have to be analyzed and answered in 3 hours.
My favorite test format was the law and economics test, and because I got an A on that one.
A Freudian slip, perhaps, but the above should read, "...and NOT ONLY because I got an 'A' on that one."
It's because the test allowed us to explain ourselves, and because Justin thought he failed it, but he ended up booking the class.
Clearly your law and economics teacher knew how to test people properly but your comment suggests you did not get the memo about the grade change.
More seriously, I understand the bar exam matter, and it depends on whether you view law school as preparation for passing the bar exam itself or for being an effective attorney for 40 or 50 years. To me the bar exam, in part because it is multiple choice, sets a very low bar that has little to do with future success. In addition, just because the bar examiners have it wrong as far as testing goes, should law schools follow along? To me law school is about the long run, not about jumping over one hurdle that the vast majority of law students will clear without a hitch.