Canadians who represent themselves in court might soon be able to employ artificial intelligence to improve their chances of success by using it to sift through court cases, draft documents and recommend next steps. But experts are raising legal and ethical concerns, saying the tools may perpetuate bias and further stratify Canada’s wealthy and poor.
Specially trained generative AI technology could help litigants sort through thousands of cases to find relevant precedents, draft court submissions in proper legal terminology and understand the next steps to take after filing legal documents. In some jurisdictions outside Canada, such technology is already helping litigants do exactly that.
North Carolina legal tech company Courtroom5, for example, uses simple AI to assess patterns in past cases and recommend next steps in a user’s own case, including filing documents, making a counterclaim or challenging the case entirely. The platform for self-represented litigants (SRLs) – or “pro se” litigants – is frequently used in cases of home foreclosure, medical debt and difficult divorce, co-founder Sonja Ebron said in an interview. The company is working on a generative AI chatbot based on anonymous case records held in its own database, with plans to release it this fall, she said.
ChatGPT has been criticized for its inaccuracies, but the bot can draft a wrongful dismissal challenge, divorce petition, trademark application or legal counterclaim within seconds.
And while the majority of AI-powered legal search tools are targeted at law firms, not SRLs, some are making their search tools available to law clinics. Toronto-based Blue J, a legal technology company that predicts outcomes based on previous cases, provides its software to more than 30 legal clinics across the country, according to the company’s co-founder Benjamin Alarie.
Against a background of broader questions about the future use of AI in Canada’s legal system, and with Canada’s first AI regulation currently under review, experts say it’s time to start thinking about how generative AI, which identifies patterns in large sets of data and then translates those patterns into intelligible content, will affect the most vulnerable litigants – for better and worse.
Canada lacks any substantial study of SRLs, but they are common within the country’s courts, representing between 50 and 80 per cent of litigants in family and civil court, according to Statistics Canada. In some jurisdictions, like Quebec’s small claims court, litigants are prohibited from having legal representation before a judge.
Affordability issues remain the most significant barrier to accessing justice in Canada, according to the National Self-Represented Litigants Project (NSRLP), a research and advocacy group. There’s a wide and growing gap between those who can afford legal support and those who qualify for legal aid.
But complex legal structures can be difficult to navigate, and SRLs are typically at a significant disadvantage to those with legal representation, the NSRLP has found.
Artificial intelligence could be far more useful than a simple Google search because it can make the kinds of assessments that are currently the exclusive domain of lawyers, said Jennifer Leitch, NSRLP’s executive director and an Osgoode Hall Law School adjunct professor.
This includes applying legal information from other cases to the particular facts of one’s own case, she said. AI applications will be most valuable when they can help SRLs understand how leading cases resemble or differ from their own situations, and therefore how those cases might help or hurt them, and how a judge is likely to weigh particular evidence.
Most Canadian legal AI companies, such as Clio and Alexi, currently target law firms rather than SRLs. The development of AI applications specifically for SRLs is further along in the U.S.
San Francisco-based DoNotPay promises to help SRLs “fight corporations, beat bureaucracy and sue anyone at the press of a button.” The company has created what it claims is the first robot lawyer: an AI chatbot that can listen to a case, generate comments and transmit them to the litigant to repeat to a judge through earbuds. But the company has faced criticism over its product’s efficacy, as well as pushback from legal regulators, who have warned that the technology constitutes the illegal practice of law.
This is just one of the challenges.
AI tools may not be trained on up-to-date jurisdiction-specific information, which could make their output inaccurate or too general to be of use. AI has also been known to “hallucinate” false information.
In a now-infamous case from May, two Manhattan lawyers included credible-seeming legal citations from ChatGPT in a court submission. ChatGPT, it turned out, had hallucinated those citations, and the lawyers were sanctioned.
Moreover, while assistance from AI could relieve some of the burden on Canada’s backlogged family and civil courts, there’s a risk it could also encourage more people to litigate, further clogging the system, said Courtroom5′s Ms. Ebron.
Others are raising concerns about procedural fairness: while basic AI tools may be available to the public for free or at low cost, more sophisticated platforms will only be available to larger law firms and could give them an unfair advantage, said University of Ottawa law professor Amy Salyzyn, board chair of the Canadian Association for Legal Ethics.
“There’s a question of whether the technology will further stratify the wealthy litigants versus the poor, or whether we’ll see some levelling of the playing field,” said Prof. Salyzyn.
Meanwhile, AI is increasingly being used by legal officials, including judges, to make decisions about the cases before them. Administrative tribunals, like civil tribunals, are using AI to parse and triage applications to settle cases more quickly, according to Ryan Fritsch, legal counsel with the Law Commission of Ontario.
This could make it more difficult for SRLs to challenge decisions when they don’t understand how the technology was used or how it informed a judge’s opinion, he said.
Possibly one of the most useful applications of AI for SRLs is studying their particular needs and the challenges they face in order to inform policy decisions. In a study for the NSRLP, researcher David Lundgren demonstrated that AI can be used to assess the experiences of SRLs based on socioeconomic and demographic information and highlight which groups are most in need of support.