Faculty Perspectives


 
Faculty Perspectives is a new ongoing series in which AALS presents authored opinion articles from law faculty on a variety of issues important to legal education and the legal profession. Opinions expressed here are not necessarily the opinions of the Association of American Law Schools. If you would like to contribute to Faculty Perspectives or would like to offer a response to the opinion published here, email James Greif, AALS Director of Communications.
 

Validity, Competence, and the Bar Exam

 

by Deborah Jones Merritt, John Deaver Drinko/Baker & Hostetler Chair in Law, The Ohio State University Moritz College of Law

 
The bar exam is broken: it tests too much and too little. On the one hand, the exam forces applicants to memorize hundreds of black-letter rules that they will never use in practice. On the other hand, the exam licenses lawyers who don’t know how to interview a client, compose an engagement letter, or negotiate with an adversary.
 
This flawed exam puts clients at risk. It also subjects applicants to an expensive, stressful process that does little to improve their professional competence. The mismatch between the exam and practice, finally, raises troubling questions about the exam’s disproportionate racial impact. How can we defend a racial disparity if our exam does not properly track the knowledge, skills, and judgment that new lawyers use in practice?
 
We can’t. In the language of psychometricians, our bar exam lacks “validity.” We haven’t shown that the exam measures the quality (minimal competence to practice law) that we want to measure. On the contrary, growing evidence suggests that our exam is invalid: the knowledge and skills it tests diverge too sharply from those that clients require of their lawyers.
 
We cannot ignore the bar exam’s invalidity any longer. Every legal educator should care about this issue, no matter how many of her students pass or fail the exam. The bar exam defines the baseline of our profession. If the exam tests the wrong things, we have a professional obligation to change it.
 
Minimum Competence
 
Establishing an exam’s validity requires a clear definition of the exam’s purpose. What does the bar exam attempt to measure? Bar examiners tell us that the exam assesses “minimum competence to practice law”—but what do they mean by that phrase?
 
In the early days, bar examiners adopted a “know it when we see it” view of minimum competence. “Everybody in this room knows what minimum competency is,” one member of the National Conference of Bar Examiners (NCBE) declared in 1980. “I mean, we feel it in our bones.”
 
A feeling “in our bones,” however, is too vague to validate a professional licensing exam. Certainly that phrase cannot justify an exam that fails non-white test-takers at a higher rate than white ones. Today’s Code of Recommended Standards for Bar Examiners—jointly sponsored by NCBE, the ABA, and AALS—proposes a more rigorous definition. According to that Code:
 
The bar examination should test the ability of an applicant to identify legal issues in a statement of facts, such as may be encountered in the practice of law, to engage in a reasoned analysis of the issues, and to arrive at a logical solution by the application of fundamental legal principles, in a manner which demonstrates a thorough understanding of these principles.
 
The Code also specifies that “[t]he examination should not be designed primarily to test for information, memory, or experience.”
 
This definition of minimum competence offers a useful starting point, but it falls short in two ways. First, the definition omits some of the most important skills that clients expect from a minimally competent lawyer. This is the “too little” problem. Second, although the Code warns against testing too heavily for memorization, it has not prevented that practice: current exams require vast amounts of memorization. That is the “too much” problem.
 
Too Little
 
Testing experts recommend using a job analysis to define minimum competence. NCBE sponsored this type of study in 2012, surveying more than 1,500 junior lawyers about the knowledge and skills they actually use in their work.
 
Those survey results support a few facets of the current bar exam. The exam, for example, tests some of the doctrinal subjects that new lawyers draw upon. Similarly, almost all new lawyers rely upon written communication, critical reading, legal reasoning, and issue spotting in their practice; these are all skills that the bar exam currently tests.
 
NCBE’s job analysis, however, also reveals important gaps in our measure of minimum competence. New lawyers reported that knowledge of research methods was more important than knowledge of most subjects tested on the bar. Similarly, they stressed the importance of fact gathering, negotiating, and interviewing; more than 85 percent of new lawyers used each of these cognitive skills.
 
These competencies matter to clients. A lawyer who doesn’t know suitable research methods won’t find the regulations, legislative history, and data that will help her client. One who lacks knowledge of negotiation principles won’t get the best outcome for his client. Unskilled negotiators cost their clients money, business opportunities, family relationships, and even extra days in jail.
 
A recent study by the Institute for the Advancement of the American Legal System (IAALS) illustrates how many new lawyers lack essential practice skills. A group of 123 junior lawyers completed an assessment in which they interviewed a mock client with a simple legal problem. Only 16 percent of the practicing lawyers gathered all 10 of the relevant facts from the interview. On average, they obtained just 69 percent of the necessary information. These lawyers could not have properly assisted the client, simply because they didn’t know how to interview effectively.
 
Why doesn’t our definition of minimum competence include cognitive skills that are essential for effective client representation? The answer is not that these skills are too difficult to test on a written exam. Research, fact gathering, interviewing, and other lawyering skills are cognitive abilities. We could test for these skills by directing test-takers to outline a research plan, interview approach, or negotiation strategy based on a mock client file. Test-takers could also identify potential pitfalls, fallback positions, and ethical issues associated with their plan. These questions are no more difficult to draft and grade than classic issue-spotter essay questions.
 
The primary reason we don’t test bar candidates on these skills is that law schools don’t stress them. Schools teach some professional competencies (like appellate advocacy) quite effectively, but relegate others to a corner of the curriculum. Employers and state supreme courts have urged law schools to teach a fuller range of lawyer competencies, but most schools have resisted.
 
This resistance makes our licensing scheme incoherent. Law schools insist that they lack the time or resources to educate “practice ready” lawyers within three years. Yet 10 weeks after graduation, those students take an exam that purports to test their minimum competence to practice law. If they are not practice ready, how can they be minimally competent?
 
We have sidestepped this conundrum by adopting an artificially narrow definition of minimum competence. The bar exam tests some of the competencies that clients require, but it omits others. That selective definition accommodates legal educators, but it harms clients. Every client deserves a lawyer who knows how to gather facts, perform research, conduct an interview, identify the client’s goals, and negotiate on behalf of the client.
 
The IAALS study, like many others, confirms that law schools can teach these cognitive skills to students. Although licensed lawyers performed poorly on the IAALS client interview, law students who had studied interviewing did much better. Fifty-one percent of those students gleaned all of the relevant facts during the client interview; on average, they learned 89 percent of the relevant information. Those numbers were significantly higher than the scores earned by licensed lawyers with no education in client interviewing.
 
Too Much
 
At the same time that the bar exam tests too little of the competencies new lawyers need, it requires too much memorization. The bar examiners’ Code specifies that the exam should test legal reasoning and knowledge of “fundamental legal principles” rather than memorization of specialized rules. Even a cursory glance at the exam, however, reveals the extent of memorization it requires. Consider this sample Multistate Bar Exam (MBE) question from the NCBE website:
 
At a defendant’s trial for extortion, the prosecutor called a witness expecting her to testify that she had heard the defendant threaten a man with physical harm unless the man made payoffs to the defendant. The witness denied ever having heard the defendant make such threats, even though she had testified to that effect before the grand jury. The prosecutor now seeks to admit the witness’s grand jury testimony.
 
How should the court rule with regard to the grand jury testimony?
 
(A) Admit the testimony, because it contains a statement by a party-opponent.
 
(B) Admit the testimony, both for impeachment and for substantive use, because the witness made the inconsistent statement under oath at a formal proceeding.
 
(C) Admit the testimony under the former testimony exception to the hearsay rule.
 
(D) Exclude the testimony for substantive use, because it is a testimonial statement.
 
What is your answer? Although I regularly teach Evidence, supervise both criminal defense and prosecution clinics, and have authored a text on Evidence, I needed to refresh my memory before answering.
 
This question requires the test-taker to know not just the general concepts of hearsay, hearsay exceptions, and the criminal defendant’s right to confront witnesses, but the details of two complex hearsay exceptions. Even consulting the text of those exceptions wouldn’t point to the right answer; the test-taker would also have to know that a grand jury appearance counts as a “proceeding” for one of the exceptions.
 
Equally troubling, the rule tested by this question matters only in felony trials. The vast majority of crimes in the United States are misdemeanors, and grand juries don’t charge those crimes. This is a useful question for television script writers to research, but it has little relevance to newly licensed lawyers. The answer, by the way, is (B).
 
In 1992, the ABA’s Section on Legal Education examined the extent to which the MBE required memorization. Law professors and practitioners reviewed dozens of questions and scored each one on a scale from 1 to 5 in which “1” indicated that the question placed “almost total emphasis on memorization” and “5” signified that the question required “almost total emphasis on reasoning skills.”
 
Responses clustered close to “3,” with a mean of 3.1. Stephen Klein, an NCBE consultant, interpreted this result to suggest that the questions “were equally balanced between memorization and legal reasoning skills.” The bar exam, however, is not supposed to balance memorization and legal reasoning; it is supposed to test reasoning skills instead of memorization. The ABA study confirms what test-takers routinely report: most questions require examinees to recall a memorized rule before they can use legal reasoning to apply the rule.
 
No one, to my knowledge, has performed this type of study since 1992. It would be illuminating to repeat it. But there is an even simpler way to determine whether the bar exam requires too much memorization: administer the exam to practicing lawyers. The supreme court in each state could randomly choose 40 to 50 practicing lawyers and ask those attorneys to take a three-hour segment of the bar. The chosen lawyers would receive this assignment with two weeks’ notice, foreclosing extensive review, and would take the exam segment at the same time as candidates.
 
Lawyers have a professional obligation to accept court-appointed clients, so I hope they would also accept this type of assignment. Courts could sweeten the deal by awarding CLE credit to participants and promising to keep their scores confidential. These recruits would perform a significant public service: they would help assess the bar exam’s validity, thereby improving legal representation for all clients.
 
How would these licensed lawyers perform on the bar? If the exam tests fundamental legal principles and legal reasoning, they should score quite well; experienced lawyers possess more than minimum competence in these matters. But if the bar tests memorized rules from multiple practice areas, the experienced lawyers will perform poorly. That type of memorization does not pay off in practice.
 
But don’t all lawyers need to know the law? They do. Competent lawyers “know” the law in complex ways. They recall some basic principles from memory, but they consult codes, desk books, online sources, and personal notes more often than they draw from memory. Knowledge is essential for law practice, but professional knowledge is not the same as memorization.
 
Can We Fix It?
 
By testing too much and too little, the bar exam endangers clients and treats applicants unfairly. Our failure to adequately define minimal competence—or even to abide by the definition reflected in the bar examiners’ Code—also raises disturbing questions about the bar’s disproportionate racial impact. We cannot tolerate these problems any longer. The question is not “can we fix the bar exam?” but “how soon can we fix it?”
 
Individual states could address this problem, as a few states have tried to do. It would be more effective, however, for states to pool their efforts—especially as state supreme courts consider adoption of a uniform exam.
 
I propose the creation of a National Task Force on the Bar Exam. This group would study current approaches to the bar exam, develop a more realistic definition of minimum competence, and explore best practices for measuring that competence. AALS, the Conference of Chief Justices, the ABA Section of Legal Education and Admissions to the Bar, and NCBE could jointly sponsor the task force.
 
To inform its deliberations, the task force should commission more studies of the work that new lawyers perform; that knowledge is essential to refine our concept of minimum competence. The group should also explore innovative ways to test for that competence. The task force’s recommendations would not bind any state, but those proposals could inform decisions within the states. The recommendations could also guide development of testing instruments by NCBE and other organizations.
 
Here are some of the many ideas that the task force could consider:

 
These ideas do not foreclose others; the task force should rest its recommendations on research-based evidence. I offer these ideas simply to illustrate that we are not tied to the current bar exam; we have many options that could better serve clients, candidates, and the diversity of our profession.
 
Conclusion
 
Some legal educators have raised concerns about the bar exam because an increasing number of their students are failing. I am not part of that group. Law schools have an obligation to prepare students to satisfy our profession’s definition of minimum competence. We cannot change that definition simply because graduates find it harder to meet.
 
The problems with our bar exam, however, date back decades—encompassing years with high pass rates as well as low ones. An exam’s pass rate tells us little about the test’s validity. Rather than worry about pass rates, legal educators should focus on validity. Most important, we must develop a definition of minimum competence that tracks the real work of new lawyers.
 
This will not be an easy task for law schools. We will have to examine our assumptions about law practice and lawyering competence. If we want bar examiners to change their approaches, we may have to revise parts of our own educational model. The work, however, comes at a good time. Our profession is struggling to define itself in the face of changing technologies, business practices, and client needs. If we more fully identify our professional competencies, teach students to achieve those competencies, and develop a valid licensing system, we will help build a stronger profession.
 
Further reading on this topic:

Bar Examinations: “The State of the Art,” 49 B. Examiner 132.
 
Code of Recommended Standards for Bar Examiners, NCBE & ABA Sec. of Legal Educ. & Admissions to the Bar, Comprehensive Guide to Bar Admission Requirements: 2017, at vii (2017).
 
Educational Measurement (Robert L. Brennan ed., 4th ed. 2006).
 
Susan Case, Summary of the National Conference of Bar Examiners Job Analysis Survey Results (2013), http://www.ncbex.org/pdfviewer/?file=%2Fdmsdocument%2F55.
 
Daniel Webster Scholar Program (DWS), University of New Hampshire School of Law, https://law.unh.edu/academics/experiential-education/daniel-webster-scholar-program-dws (last visited Apr. 13, 2017).
 
Institute for the Advancement of the American Legal System, Ahead of the Curve: Turning Law Students into Lawyers (2015).
 
Stephen P. Klein, Summary of Research on the Multistate Bar Examination (1993).
 
Deborah J. Merritt et al., Raising the Bar: A Social Science Critique of Recent Increases to Passing Scores on the Bar Exam, 69 U. Cincinnati L. Rev. 929 (2001).
 
Steven S. Nettles & James Hellrung, A Study of the Newly Licensed Lawyer Conducted for the National Conference of Bar Examiners (July 2012).