An Alberta-based legal technology company says it is using artificial intelligence to help personal injury claimants navigate the court system.
Painworth, founded in 2020, recently began offering personalized AI-assisted representation for accident victims seeking injury settlements.
Mike Zouhri, one of three founders, said the company’s mission is to improve access to justice for people who may struggle to afford or reach traditional legal services.
“There’s a myriad of reasons why people have difficulty getting access to justice,” he told Global News in an interview, pointing to rural and remote communities, mobility issues, literacy barriers and what he called “bad case economics,” where a claim may be worth money but not enough to attract a lawyer.
The company previously provided AI-powered back-end tools to law firms, including software that can process thousands of pages of medical records and generate chronologies and draft documents.
Its newer offering adds what the company calls DAVID, an AI system that interacts directly with clients seeking representation. DAVID is available 24-7 and can chat in almost every language.
“In Alberta, the Law Society has given us permission to operate as a law firm, which means that we can now represent people in Alberta, even though we’re non-lawyers,” Zouhri said.
He said every represented file will be overseen by a human lawyer, as required by the regulator.
“Anyone who seeks representation will still have a human ultimately making the final calls,” he said. “These are human attorneys who could lose their bar license. They are ultimately responsible for the file.”
In an email, the Law Society of Alberta said it was unable to comment on the specifics of the company, but said the firm must adhere to a number of conditions to ensure the protection of the public.
“Those conditions include the Law Society’s ability to request information about a participant’s operations and the Law Society’s ability to conduct audits to ensure ongoing compliance with conditions.”
The personalized AI-assisted representation service has been operating since December. Zouhri said no case has yet reached trial or a final resolution, noting that personal injury cases typically take years to move through the courts.
“Statistically, traditional law firms get resolution on any particular personal injury file in about five years,” he said.
Under the model, initial intake with the AI system is free. If a client signs a retainer agreement, the company charges a flat 28 per cent contingency fee, which Zouhri said is lower than the 33 per cent or more commonly charged by traditional firms.
“We have a flat, which means it never ratchets up or down. It’s just flat, 28 per cent,” he said.
Zouhri said the idea for the company came from his own experience after being hit by a drunk driver in 2019.
“I’m patient zero. I’m the alpha tester number one,” he said.
He said he faced long delays and received little communication about his file.
“A huge and common frustration is ‘why won’t my lawyer get back to me? I don’t know what’s going on with my case’,” Zouhri said.
According to Kyle Wilson, a B.C.-based tech expert, the introduction of AI into legal representation raises questions about privacy and accuracy, particularly with large language models, or LLMs, which can generate human-like text.
Wilson said the concept could be beneficial if proper safeguards are in place.
“My reaction to this is that it is definitely interesting. And I do believe that it could be of serious benefit to people,” Wilson told Global News in an interview.
He said it is crucial that a human remains “in the loop,” especially given documented cases in the United States of lawyers submitting AI-generated filings that contained fabricated legal citations.
“In the U.S. there have been hundreds of hallucinated citations, where lawyers have used it to assist them and then blindly submitted it to the court and it had invented laws and citations,” Wilson said.
Wilson also raised concerns about how commercial large language models handle user data.
He said it is important to have clear non-retention agreements, or similar safeguards, to ensure that clients' data is protected and not used to train the models.
Zouhri said Painworth does not train its system on client case material and described each file as isolated.
“All of that stuff is completely black boxed. The [files] don’t mix,” he said, comparing it to locked filing cabinets that cannot share information with one another.
He also said the system uses structured databases and proprietary tools to avoid fabricating case law. “David has no ability to hallucinate that whatsoever,” Zouhri said.
Broader fears about AI should be balanced with recognition of its potential benefits, Wilson said.
“There is a lot of good that can occur out of AI,” Wilson said. But he cautioned that large language models are “kind of like stochastic parrots” that can convincingly generate text without understanding it.
Painworth’s DAVID remains in its early stages. Zouhri said the goal is to lower barriers to legal help.
“Access to justice,” he said. “That’s the mission.”