
B.C. lawyer alleged to have used AI program for case law

Use of artificial intelligence programs in the legal system must meet the code of conduct requirements of the Law Society of BC, according to society guidance on AI.
Chong Ke of Westside Law may have used AI erroneously to advance her client's case.

A Vancouver-based family lawyer is the subject of a Law Society of BC investigation for allegedly using artificial intelligence (AI) to submit non-existent case law in a trial at B.C. Supreme Court.

The society says it is investigating the conduct of Chong Ke of Westside Family Law, “who is alleged to have relied on submissions to the court on non-existent case law identified by ChatGPT,” a popular AI program.

The society is examining whether Chong violated any rules through her actions. The society acknowledged that lawyers may use AI in the course of their work; however, there are boundaries.

“While recognizing the potential benefits of using AI in the delivery of legal services, the Law Society has also issued guidance to lawyers on the appropriate use of AI in providing legal services and expects lawyers to comply with the standards of conduct expected of a competent lawyer if they do rely on AI in serving their clients,” the society stated by email.

Chong's online profile states she graduated from the J.D. program at the University of B.C. and the University of Ottawa, and also received an LL.B. and an LL.M. from "top law schools" in China and a PhD from the University of Victoria Faculty of Law.

"She passed the competitive bar examination in China," the profile states.

Chong did not reply to Glacier Media's offer last week to comment on the investigation.

Underpinning the guidance is the society’s code of conduct that requires B.C. lawyers to “perform all legal services undertaken on a client’s behalf to the standard of a competent lawyer.”

The society’s guidance directs lawyers to check the rules of the court they are appearing before.

“Courts in some jurisdictions in Canada, as well as some U.S. states, require lawyers to disclose when generative AI was used to prepare their submissions. Some courts even require not just disclosure that generative AI was used, but how it was used. If you are thinking about using generative AI in your practice, you should check with the court, tribunal, or other relevant decision-maker to verify whether you are required to attribute, and to what degree, your use of generative AI,” the society states.

B.C. Supreme Court does not have a practice direction on AI. Manitoba and Yukon courts have issued such directions, law firm Bennett Jones noted in July 2023.

Legal consultant David J. Bilinsky wrote in August 2022 for the Canadian Bar Association – BC Branch that AI is increasingly used by lawyers for research: “AI is being used to analyze possible legal arguments and case strength by taking the case facts and using AI prediction technologies to forecast litigation outcomes. Legal analytics software can look at a judge’s past rulings, win/loss rates and other data points to look for trends and patterns in case law and predict a possible case’s outcome.

“AI can also be used to analyze a client’s legal position and determine if there are any logical inconsistencies, gaps in evidence, logic, or arguments in a client’s position. Once uncovered, the lawyer can then evaluate risks and see if there are additional documents, witnesses or such that can be used to tighten up a legal position,” wrote Bilinsky.

[email protected]