Next month, AI will enter the courtroom and the US legal system may never be the same again.
An artificial intelligence chatbot, technology programmed to answer questions and hold a conversation, is expected to advise two people fighting speeding tickets in courts in undisclosed cities. The two will wear a wireless headset, which will transmit what the judge says to the chatbot run by DoNotPay, a company that typically helps people fight mail-in traffic tickets. The headset will then play the chatbot’s suggested responses to the judge’s questions, which people can choose to repeat in court.
It’s a hack. But it also has the potential to change how people interact with the law, and to bring many more changes over time. DoNotPay CEO Josh Browder says costly legal fees have historically kept people from hiring traditional attorneys to fight for them in traffic court, even though the fines at stake can run into the hundreds of dollars.
So his team wondered if an AI chatbot, trained to understand and argue the law, could intervene.
“Most people can’t afford legal representation,” Browder said in an interview. The use of AI in a real court situation “will be a proof of concept for courts to enable the technology in the courtroom.”
Whether or not Browder succeeds (he insists he will), his company’s experiment marks the first of what are likely to be many more efforts to bring AI further into our daily lives.
Modern life is already full of technology. Some people wake up to a song chosen by AI-powered alarms. Their news feed is also often curated by a computer program, one that is taught to pick the items they will find most interesting or are most likely to comment on and share via social media. AI chooses which photos to show us on our phones, asks us if it should add a meeting to our calendars based on the emails we receive, and reminds us to text birthday greetings to our loved ones.
But proponents say AI’s ability to sort through information, spot patterns and quickly pull up data means that, before long, it could become a “copilot” for our daily lives. Coders on Microsoft-owned GitHub are already using AI to help them build apps and solve technical problems. Social media managers rely on AI to determine the best time to post a new item. Even here at CNET, we’re experimenting with whether AI can help write explainer stories about the ever-changing world of finance.
So it may seem like only a matter of time before AI finds its way into knowledge-intensive fields like law as well. And considering that 80% of low-income Americans don’t have access to legal help, while 40% to 60% of the middle class still struggle to get it, there is clearly demand. AI could help fill that need, but lawyers shouldn’t worry that the new technology will take away their business, says Andrew Perlman, dean of Suffolk University Law School. The problem is simply one of scale.
“There is no way that the legal profession can provide all the legal services that people need,” Perlman said.
Moving on to AI
DoNotPay began its latest AI experiment in 2021, when businesses gained early access to GPT-3, the same AI tool the startup OpenAI used to create ChatGPT, which went viral for its ability to answer questions, write essays and even create new computer programs. In December, Browder pitched the idea in a tweet: have someone wear an Apple AirPod into traffic court so the AI could hear what was going on through the microphone and feed responses back through the earpiece.
Aside from people making fun of him for the stunt, Browder knew he would face other challenges. Many states and districts limit legal advice to those who are licensed to practice law, a clear hurdle that UC Irvine School of Law professor Emily Taylor Poppe says may cause problems for DoNotPay’s AI.
“Because the AI would be providing information in real time, and because it would involve applying the law to specific facts, it’s hard to see how it could avoid being seen as providing legal advice,” Poppe said. Essentially, the AI would legally be considered a lawyer practicing without a license.
AI tools also raise privacy concerns. The computer program technically needs to record audio to interpret what it hears, a move that is not allowed in many courts. Lawyers are also expected to follow ethics rules that prohibit them from sharing confidential client information. Can a chatbot, designed to share information, follow the same protocols?
Perlman says many of these concerns can be addressed if the tools are created carefully. If they succeed, he argues, these technologies could also help with the mountains of paperwork lawyers face on a daily basis.
Ultimately, he argues, chatbots can become as useful as Google and other research tools are today, saving lawyers from having to physically wade through law libraries to find information stored on the shelves.
“Attorneys trying to provide legal services without technology are going to be inadequate and insufficient to meet the legal needs of the public,” Perlman said. Ultimately, he believes, AI can do more good than harm.
The two cases DoNotPay is involved in will likely inform much of that conversation. Browder declined to say where the proceedings will take place, citing safety concerns.
Neither DoNotPay nor the defendants plan to inform the judges or anyone else in court that an AI is being used or that audio is being recorded, a fact that raises ethical concerns of its own. That prompted pushback on Twitter when Browder solicited traffic ticket volunteers in December. But Browder says the courts DoNotPay chose are likely to be more lenient if they find out.
The future of the law
Following these traffic ticket fights, DoNotPay plans to create a video presentation designed to advocate for the technology, ultimately aiming to change law and policy to allow AI in court.
States and legal bodies, for their part, are already debating these issues. In 2020, a California task force dedicated to exploring ways to expand access to legal services recommended allowing select unlicensed practitioners to represent clients, among other reforms. The American Bar Association has told judges who use AI tools to be mindful of the biases instilled in the tools themselves. UNESCO, the international organization dedicated to preserving culture, offers a free online course covering the basics of what AI can offer legal systems.
For his part, Browder says AI chatbots will become so popular in the next few years that courts will have no choice but to allow them anyway. Perhaps then AI tools will have a seat at the table, rather than having to whisper in our ears.
“Six months ago, you couldn’t even imagine that an AI could respond in these detailed ways,” Browder said. “No one has imagined, in any area of law, what this could look like in real life.”