AI and the Courts: Conference Highlights Dangers of New Tech
April 30, 2025
Discussions at the 2025 Judicial & Bar Conference on April 25 reflected on the impact of technological change on the legal profession, but several panelists also drew attention to the potential harms artificial intelligence poses to the orderly administration of justice.
At the session titled “Civility, Ethics, and Tradition in the Age of Artificial Intelligence,” panelists considered the adequacy of existing ethics rules in light of the emergence of AI, as well as its effects on litigants and court processes.
D.C. Court of Appeals Chief Judge Anna Blackburne-Rigsby, who led a team examining the impacts of AI on the judicial system as president of the Conference of Chief Justices (CCJ) and as chair of the board of the National Center for State Courts (NCSC), said that the existing canon of judicial ethics and code of professional responsibility for lawyers remain effective despite the new circumstances introduced by changes in technology.
“For example, the D.C. Code of Judicial Conduct Rule 2.5 imposes a duty on judges and court staff to be competent in dealing with technology,” Blackburne-Rigsby said.
The chief judge gave a brief overview of the resource center assembled by the CCJ and NCSC’s AI Rapid Response Team, with interim guidance and a list of events, activities, and materials to improve tech literacy, as well as information about AI-related state court orders and decisions, ethics rules, task forces, and policies.
On the same panel, D.C. Superior Court Chief Judge Milton C. Lee Jr. drew comparisons between his court’s adoption of virtual appearances during the pandemic and the increased use of AI, pointing to positive impacts of both in landlord–tenant court. “There’s a real value to it because that remote access enhanced our ability to [improve] access to justice,” he said. “We went from a 22 percent default rate to less than 1 percent because of doing virtual appearances.”
“But it comes at a cost,” Lee continued. “If you ask any judge, he or she has a top 10 list of things they’ve seen on Webex. You’d be shocked.” Lee said the degradation in civility and decorum in virtual hearings has included inappropriate attire, inadvertent sharing of information through screen sharing, and witnesses receiving coaching from off-camera third parties.
Lee also raised concerns about the use of AI as a research or drafting tool. For example, when an AI platform was asked about the admissibility of out-of-court statements, it produced answers that assumed the statements were hearsay without making threshold assessments of each statement’s purpose.
Ed Walters, chief strategy officer for legal tech company vLex and adjunct professor at Georgetown Law, described AI’s capabilities as a “jagged frontier.” As an illustration, he cited a 2011 episode of the quiz show Jeopardy! in which two champions, Ken Jennings and Brad Rutter, were handily beaten by IBM’s Watson. Walters pointed out that Watson, despite its victory, botched the Final Jeopardy! clue, naming the Canadian city of Toronto in a “U.S. Cities” category.
“There’s no way of knowing ahead of time whether the task you’re asking AI to accomplish is something it is going to be really capable at, or really bad at,” Walters said.
Walters noted that studies have shown that skilled and informed users of AI experience a lower incidence of hallucination and error, but attorneys and other professionals are not the only ones using the new technology. A video shared at the start of the session by moderator William P. Lightfoot of William Lightfoot Law, PLLC, illustrated this point: in a New York appeals court case in early April, a pro se litigant attempted to use an AI avatar to argue their claim, which the judge promptly prohibited.
A later session at the conference turned to more malicious uses of AI. The program “The Future of Domestic Violence in the Digital Age: Cyber Abuse and Deep Fake Technology” examined a number of disturbing and increasingly common uses of AI by stalkers and domestic abusers.
“Domestic violence is no longer confined to physical spaces,” said Lindsay Lieberman, founder and managing attorney of her eponymous firm. “It has expanded into the digital world, where abusers are using technology as a tool to harass, to stalk, and to surveil their survivors, often in ways that the legal system has not fully addressed.”
Stephanie Bergman, supervising attorney for the DC Volunteer Lawyers Project, defined cyber abuse as “the intentional use of digital technologies to cause harm.” She went on to describe instances in which abusers used technology to track and monitor their victims; to steal, create, and distribute intimate or embarrassing images; or to facilitate harassment.
One increasingly common weapon is deepfake technology, which abusers use to produce pornography depicting the target that is virtually indistinguishable from authentic media. Tara Zeiss, supervising attorney for the Project on Domestic Violence and Firearms at the DC Volunteer Lawyers Project, cited studies showing that the vast majority of deepfake pornography depicts women, describing it as a form of gender-based violence.
“It’s really important for courts to understand the emotional, reputational, and financial harm caused by this fake media, which can be just as bad as real images and videos,” Zeiss said.
An increasing number of jurisdictions have introduced legislation addressing the use of deepfake technology in cyber sexual abuse; however, panelists noted the limited ability of laws to end its creation and dissemination. Platforms hosting user-generated content are generally not liable for user-uploaded material, and civil judgments against abusers are often hollow, both because an abuser may have limited resources and because the content, once released onto the internet, may be beyond even its creator’s ability to claw back.
“It can happen to any of us,” Lieberman said. “We are all at risk, and we all have an interest in creating a safer space online. The legal profession must prioritize awareness, training, and education, ensuring that judges, lawyers, and law enforcement are equipped to protect survivors and uphold justice.”