Alberta

Edmonton Law Student's AI Gamble Backfires in Suspension Appeal

Articling student faces stiffer rebuke after using artificial intelligence to draft legal submissions citing a non-existent case.


An Edmonton articling student has learned a costly lesson about artificial intelligence in the legal profession after Alberta's law society rebuked him for using AI to draft an appeal of his own suspension — submissions that cited a court case that never existed.

Manraj Tiwana was hit with a 20-month suspension last year after the Law Society of Alberta discovered he was taking on clients without informing his articling supervisor. When he appealed that decision, hoping to overturn or reduce the penalty, the strategy backfired spectacularly.

In his appeal submissions, Tiwana relied on AI-generated legal arguments that cited what regulators call a "hallucinated" case — jargon for a completely fabricated court decision that the AI system invented out of thin air.

A Cautionary Tale for Legal Profession

A seven-member appeal panel delivered a stinging rebuke in its March decision, acknowledging that Tiwana at least admitted his AI use upfront. However, the panel was unimpressed with his apparent lack of understanding about the gravity of his actions.

"While we appreciate that Mr. Tiwana was candid in admitting his use of AI, he did not seem to grasp the gravity of misusing it in this way, particularly in this setting, where he is arguing that his discipline was too harsh and his prospect for recovery is good," the panel wrote.

The regulators were particularly troubled by the combination of factors: Tiwana was asking the panel to believe he deserved leniency and could be trusted to practise law, yet he was simultaneously submitting AI-drafted arguments containing fabricated legal precedent.

"The misuse of AI to write his submissions, and in particular his reliance on a hallucinated case in his submissions, suggests otherwise," the panel concluded.

Growing Pains for AI in Law

The case highlights emerging tensions within Canada's legal community as artificial intelligence tools become increasingly accessible. While AI has legitimate applications in legal research and document drafting, the profession remains deeply concerned about reliability, accuracy, and ethical obligations to courts and clients.

For law students and articling candidates across Alberta, the Tiwana decision sends a clear message: AI is not a shortcut for legal work, and regulators will scrutinize its use heavily, especially when credibility is already on the line.

This article is based on reporting from the Edmonton Journal.
