
An Ontario lawyer submitted seven entirely fabricated quotations from court cases to a judge during litigation, but claims human error, not artificial intelligence tools, was behind them. A skeptical judge wonders if the lawyer’s claim makes things even worse.
Khalid Parvaiz has been referred to the Law Society of Ontario, the governing body of the legal profession in the province, although the judge mused a referral to police could have been an option.
“This decision may involve the next generation of A.I. hallucinations,” Ontario Superior Court Judge Frederick Myers writes in his written decision.
“In this case, counsel delivered a factum that cited real cases with correct neutral citations…. But then counsel added quotations from the cases. The quotations do not exist in the cases. The quotations are fake.”
The bizarre case emerged during court proceedings in a real estate investment dispute.
After losing his motion regarding legal costs at a hearing featuring some acrimony between lawyers — Parvaiz was accused of making “a scurrilous allegation against his colleagues” — both sides were told to submit a written factum arguing how much costs should be awarded.
The lawyers opposing Parvaiz asked for increased costs because, they alleged, he had submitted false, AI-generated material in the case.
Myers confirmed Parvaiz’s material contained false quotations attributed to cases that Parvaiz was citing. Myers asked the lawyer whether he used generative AI tools to create them.
On Feb. 26, Parvaiz replied in writing to the court.
“I acknowledge that on review my recitation of the legal principles and substance of the cases referenced was not accurate,” Parvaiz wrote, according to a published court decision released Tuesday.
He called them “clear errors on my part and the result of a lack of due care.
“These were, however, human errors and while I take full responsibility for them, I wish to advise the Court that I did not use or rely on artificial intelligence or other such tools in preparing the reply factum. The errors arose from my misreading of the cases cited,” he wrote. “I take full responsibility for these errors. I have reflected on them and learned from them.”
In a withering ruling, Myers recites seven paragraphs of made-up quotations, appearing inside quotation marks and attributed to real case law in Parvaiz’s submitted material.
“Nothing like this quotation appears in the case. It is wholly made up,” Myers writes after each false quotation.
“The most obvious explanation for these fake quotations is that counsel used A.I. to draft the factum. But I am not making that finding, as I have not had the benefit of full submissions on this issue,” Myers wrote.
“Try as I might, I do not understand Mr. Parvaiz’s response. If he did not use A.I., how did he come to make up seven paragraphs and call them quotations from real cases? If I accept that Mr. Parvaiz did not use A.I. for research or drafting, I am at a loss for how these quotations could be a result of human error, a lack of due care, misreading the cases cited, carelessness, or inadvertence as stated by Mr. Parvaiz.”
Myers said the made-up passages were not gobbledegook, but rather statements that supported Parvaiz’s arguments in the legal case.
“The only way I can understand Mr. Parvaiz having made up seven distinct quotations is if he believes that counsel is allowed to make up law in his factum. Perhaps doing it once could be some kind of slip or error that mistakenly found its way into the factum. But not seven times,” Myers wrote.
Parvaiz’s claim, Myers wrote, “leaves me in a quandary. Either Mr. Parvaiz used AI and has been untruthful about it, or he made up seven fake paragraphs and chose to present them as actual quotations from precedent cases. As is often the case, if Mr. Parvaiz has not been truthful, the cover-up may be worse than the initial error.”
Myers asked for the Law Society to consider the situation.
Parvaiz, who practices in Toronto and Mississauga, was called to the bar in 2022 and is currently listed by the Law Society of Ontario as a licensed lawyer.
His promotional website says he has “incredible courtroom presence and experience in procedure.”
Parvaiz did not respond to requests for comment by phone and email before the publishing deadline, nor did his lawyer.
The Law Society could not yet answer questions about the case or about Parvaiz prior to publishing.
Last April, Myers caught a lawyer citing cases in her legal submission that did not exist, while other citations claimed a judge had ruled the opposite of what the judge actually decided. He asked her if she had used AI tools to prepare it.
“It is the litigation lawyer’s most fundamental duty not to mislead the court,” Myers writes in that case. “With the sudden advent of A.I., this has quickly become a very important issue.”
In 2024, a judge in British Columbia found two non-existent cases, later discovered to have been invented by ChatGPT, in a lawyer’s materials in a high-conflict case over parental access to children.
• Email: ahumphreys@postmedia.com | Twitter: AD_Humphreys