Ben Su
What followed was a two-year ordeal that I allege was marked by obstruction, delay and procedural abuse by the Toronto Police Service and the City of Toronto’s legal counsel. In March 2025, less than 30 seconds into trial, the Crown dropped all charges, ending what I maintain was a case of malicious prosecution, unlawful detention and violations of my constitutional rights.
I’m a lawyer. I know the law. In law school, I even won a national essay contest from the Canadian Bar Association for critiquing the Supreme Court’s interpretation of s. 15 of the Charter. And still, I barely held it together.
The only reason I was able to survive it — and fight back effectively — was because I had an unlikely co-counsel: artificial intelligence.
The legal system as a resource war

Most individuals, especially racialized citizens, don’t have the resources to fight a battle like this.
The playing field isn’t level. The deeper I got into this case, the clearer it became: the legal system can be used to wear people down — even when they’re right.
The truth was easy — the process wasn’t
The police officer who charged me claimed he had no notes. But video footage showed him writing in a memo book at the scene. That same officer changed his account only after I filed a formal complaint with the chief of police.
Yet I was prosecuted. The Crown eventually dropped all charges after reviewing the facts. The truth wasn’t hard to establish. But for two years, I had to fight for it against the weight of a system that was slow, opaque and resistant to transparency.
Where AI stepped in
I used large language models (LLMs) to:
- Draft my freedom of information (FOI) requests under the Municipal Freedom of Information and Protection of Privacy Act (MFIPPA).
- Prepare complaint filings to the Information and Privacy Commissioner.
- Cross-reference legal standards for perjury, obstruction of justice, and abuse of process.
- Generate outlines and citations for my constitutional arguments.
LLMs allowed me to work faster, cheaper and with better mental clarity. AI didn’t replace my legal skills — it amplified them.
Late at night, when no lawyer could answer my questions, AI could. That saved me time. That saved me money. That saved my case.
This isn’t about me
I had training, language skills and access. I could verify the AI’s output and use it responsibly. But what about the person who doesn’t?
If even I almost lost a case this broken, how does anyone else survive?
Access to justice is not just a question of having a lawyer. It’s about whether anyone can navigate the system, assert their rights and challenge public institutions when they overreach.
If we don’t scale up tools that help ordinary people access legal knowledge and protections, we’re complicit in preserving a system that works for insiders and no one else.
What needs to change
We need to stop pretending that procedural delays are neutral. They are power moves.
We need to recognize that misusing FOI exclusions (like s. 52(2.1) of MFIPPA) or denying access to records for tactical reasons is not bureaucracy — it’s obstruction of justice.
We need to invest in public-interest AI legal tools and embed transparency, accountability and community-driven design into them.
And we need lawyers and institutions to treat AI not as a threat to their profession but as a bridge to restore public trust in it.
Ben Su is a Toronto-based lawyer and legal tech founder. He is the co-founder of Capita, a legal technology company backed by U.S. venture capital firms including Drive Capital and Hyperplane VC. While in law school, Ben was the national winner of the Canadian Bar Association’s essay competition for his critique of Supreme Court jurisprudence on s. 15 of the Charter and served on the CBA’s Legal Futures Initiative. He has previously contributed to Law360 Canada on constitutional law and technology.
The opinions expressed are those of the author(s) and do not necessarily reflect the views of the author’s firm, its clients, Law360 Canada, LexisNexis Canada or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.