California Supreme Court Questions State Bar’s Use of AI in Bar Exam
On Thursday, the California Supreme Court called on the State Bar of California to explain in detail its use of artificial intelligence (AI) in drafting multiple-choice questions for February's troubled bar exam. The inquiry marks a significant step in addressing concerns raised by hundreds of aspiring lawyers who faced technical problems during the examination.
Background on the Exam Controversy
After the February exam, the court revealed that it had not been informed beforehand that the State Bar had permitted its independent psychometrician to use AI in developing some questions. The disclosure has drawn intense scrutiny of the transparency and reliability of the examination process.
Key Demands from the Court
The court's statements signal a demand for accountability. It is seeking answers on several fronts, including:
- The rationale behind using AI for developing certain multiple-choice questions.
- Measures implemented to ensure the accuracy and reliability of these AI-generated questions.
- Whether any questions were excluded from the scoring process due to doubts about their reliability.
- The overall reliability of the remaining questions used for scoring purposes.
State Bar’s Current Position
The controversy involves not only the use of AI but also the State Bar's shift from the National Conference of Bar Examiners' Multistate Bar Examination to a new hybrid testing format intended to reduce costs. The State Bar engaged Kaplan to develop the primary set of questions, but it was disclosed last week that a portion of the questions was produced through other means, including AI.
In the State Bar's defense, its executive director, Leah Wilson, expressed confidence in the validity of the questions, asserting that they accurately assess candidates' legal competence. Alex Chan, chair of the Committee of Bar Examiners, emphasized that AI's use was limited and that the committee had not been made aware of these changes ahead of the exam.
Concerns Over Oversight and Validity
Experts in legal education have raised significant questions about the implications of allowing non-legal professionals to participate in question development using AI. Katie Moran, an associate professor at the University of San Francisco School of Law, expressed concerns regarding the decision-making process that led to hiring ACS Ventures for AI assistance.
Mary Basick, assistant dean of academic skills at UC Irvine Law School, noted that substantial changes to testing protocols require extended notice under California's Business and Professions Code. The rapid changes made for the February exam raised concerns about whether the development timeline allowed for the review processes typically required to ensure question validity.
Next Steps and Ongoing Inquiries
As the State Bar reassesses its testing framework, it faces pressure to explain why it deviated from its original plan with Kaplan. It has also yet to provide a detailed account of how AI was employed in generating exam questions, leaving many stakeholders waiting for transparency.
As this situation continues to unfold, the implications of technology’s role in legal education and examination will remain a pertinent topic among educators, legal professionals, and regulators alike.