How Are Alberta Law Firms Using AI Without Risking Client Confidentiality?

Shaheer Tariq
Mar 13, 2026

Over 1,000 court cases globally involve AI-hallucinated legal citations. Here's how Alberta law firms can adopt AI safely and avoid joining that list.
Last updated: March 2026
As of early 2026, researcher Damien Charlotin's global database of court decisions involving AI-hallucinated content in legal filings has surpassed 1,000 cases, with multiple Canadian decisions resulting in personal cost awards, contempt citations, and disciplinary referrals. In Alberta specifically, the Court of Appeal released its first decision addressing generative AI hallucinations in Reddy v. Saroya in September 2025, and is considering enhanced costs against counsel personally. For Alberta law firms, the question is no longer whether to use AI, but how to use it without exposing the firm to the very risks the technology is supposed to reduce. This guide covers what's working, what's dangerous, and how firms of 5 to 50 lawyers can build a practical AI foundation that protects client confidentiality while recovering meaningful capacity.
The Real Risk: It's Not Just Hallucinations
The headline risk is fabricated case citations. In Zhang v. Chen (2024 BCSC 285), a BC lawyer was held personally liable for costs after citing two non-existent authorities generated by ChatGPT. In Ko v. Li (2025 ONSC 2766), the Ontario Superior Court ordered a lawyer to attend a contempt hearing after submitting hallucinated cases. And in Hussein v. Canada (2025 FC 1060), the Federal Court imposed special costs after counsel relied on Visto.ai, a legal-specific AI tool, without verifying its output.
But hallucinated citations are only the most visible risk. A less obvious and potentially more serious concern is data security. When legal staff use consumer AI tools like free ChatGPT to speed up their work, client names, file details, and privileged information can be entered into systems with no confidentiality guarantees and no control over how that data is stored or used. The Law Society of Alberta has published guidelines including "The Generative AI Playbook" and a 2023 notice emphasizing the importance of keeping a "human in the loop" to verify AI-generated materials.
Meanwhile, the landscape of AI tools continues to expand. Capabilities like computer-use agents, which can autonomously browse and interact with software on a user's behalf, are already publicly available. Without a clear policy in place, it becomes difficult to know what your team is using and whether client data is being handled appropriately.
The Competitive Landscape: AI-Native Legal Models Are Already Here
The pressure isn't just regulatory. A new class of AI-native law firms has emerged internationally. Crosby, backed by Sequoia Capital in the US, pairs a small number of lawyers with AI agents to review and redline contracts in under an hour at fixed rates. In Canada, McCarthy Tétrault launched MT>Forge in January 2026, a dedicated division combining AI, scalable legal talent, and the firm's existing expertise to deliver high-volume transactional legal work with greater speed and cost predictability.
These are not experimental pilots. They are funded, operational divisions competing on efficiency and value. For established Alberta firms, the takeaway is straightforward: the same AI capabilities that power these new models can be applied within your existing practice to reduce overhead, reclaim capacity, and improve client service.
Where AI Delivers Value for Alberta Law Firms
In our work with Calgary law firms, the highest-value AI opportunities cluster around tasks that consume professional time without requiring professional judgment:
Drafting and populating templated correspondence from client and file data. A 12-lawyer firm identified that routine correspondence drafting consumed an estimated 8 to 12 hours per week across the firm. With properly configured Copilot agents grounded in the firm's templates, first drafts that previously took 30 to 45 minutes each now take only a few minutes.
Summarizing incoming documents, contracts, or opposing counsel filings. Rather than spending 45 minutes reading and distilling a 30-page contract, lawyers can get a structured summary of key terms, obligations, and notable clauses in under five minutes.
Preparing first-draft pleading shells and standard court filings. AI handles the repetitive structural elements while the lawyer focuses on the substantive legal arguments.
Extracting key terms and obligations from contracts for review. Particularly valuable for commercial law practices handling high volumes of lease agreements, purchase orders, or vendor contracts.
Generating internal memos or file summaries from meeting notes. Microsoft Copilot in Teams can generate meeting transcripts and summaries automatically, which can then be refined into client file notes.
Tracking and organizing accounts payable and receivable. For firms where lawyers absorb AP/AR administration due to thin support staff, automation recovers directly billable time.
Each of these individually may save only minutes per task. But in aggregate, across a team of twelve lawyers and their support staff, they represent a substantial amount of recoverable capacity.
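To make that aggregation concrete, here is a rough back-of-envelope sketch. The task volumes and minutes saved are illustrative assumptions, not measurements from any particular firm; substitute your own figures.

```python
# Rough estimate of weekly capacity recovered across a firm from AI-assisted
# drafting and summarization. All task volumes and per-task savings below are
# illustrative assumptions, not measured figures from any specific firm.

LAWYERS = 12

# (task, tasks per lawyer per week, minutes saved per task) -- assumed values
tasks = [
    ("Templated correspondence", 4, 25),
    ("Document and contract summaries", 3, 35),
    ("First-draft pleading shells", 1, 40),
    ("Meeting notes to file summaries", 2, 15),
]

minutes_per_lawyer = sum(per_week * saved for _, per_week, saved in tasks)
firm_hours = minutes_per_lawyer * LAWYERS / 60
print(f"~{minutes_per_lawyer / 60:.1f} hours/lawyer/week, ~{firm_hours:.0f} hours firm-wide")
# With these assumptions: roughly 4.6 hours per lawyer and about 55 hours
# firm-wide each week, before counting support-staff time.
```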
Why Microsoft Copilot, Not Legal-Specific AI Tools
Many firms have evaluated purpose-built legal AI platforms like Harvey or CoCounsel. The practical barriers for mid-size Alberta firms are consistent: prohibitively expensive per-seat pricing, limited Alberta-specific legal research coverage, and the requirement to feed firm data into yet another external platform.
Microsoft Copilot offers a different path. Most Alberta law firms already hold Microsoft 365 licenses. Copilot operates within that existing infrastructure, meaning client data stays within the firm's own Microsoft tenant and inherits the firm's existing security, compliance, and data residency settings. There's no additional platform to vet, no new vendor security review, and no separate data processing agreement to negotiate.
The trade-off is that Copilot is a general-purpose tool, not a legal research engine. It won't replace Westlaw or CanLII for case research. But for the roughly 80% of AI value in legal practice that isn't legal research (drafting, summarization, document management, scheduling, and administrative coordination), Copilot is both sufficient and safer than the alternatives.
Building an AI Governance Policy for Your Firm
Ontario is currently the only jurisdiction with legislative requirements addressing AI in the courtroom. Amendments to the Ontario Rules of Civil Procedure introduced in December 2024 require certification of the authenticity of cited legal authorities. Alberta, BC, Saskatchewan, Manitoba, and Nova Scotia have all issued practice guidelines. The Federal Court requires parties to disclose if submissions include AI-generated content.
For Alberta firms, a practical AI governance policy should address four areas:
Permitted and prohibited uses. Which tasks can staff use AI for? Which are off-limits? Consumer AI tools like free ChatGPT should be explicitly prohibited for any work involving client data. Enterprise tools within the firm's Microsoft tenant can be permitted with appropriate guardrails.
Verification obligations. Every AI output used in client work or court submissions must be reviewed and verified by the responsible lawyer. This isn't a suggestion; it's a professional obligation under the Code of Conduct.
Data handling rules. No client names, file details, privileged information, or personally identifiable information should be entered into any AI system outside the firm's controlled environment.
Disclosure requirements. Staff should know when and how to disclose AI use, both to the firm and, where required by court rules, to the court.
Solway's AI Policy Framework, the Solway System, provides a structured approach with 14 components across three sections (Role and Purpose, Accountability and Trust, Ethical Use), each calibrated on a sliding scale from caution-oriented to innovation-oriented. For law firms, the framework typically skews toward the caution end on data handling and toward the innovation end on productivity applications.
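As a purely hypothetical illustration of the sliding-scale idea (the component names and scores below are placeholders, not the actual contents of the Solway System), a calibration can be thought of as a set of components, each positioned somewhere between caution and innovation:

```python
# Hypothetical sketch of a sliding-scale policy calibration. Component names
# and scores are placeholders for illustration, not the actual Solway System.

from dataclasses import dataclass

@dataclass
class PolicyComponent:
    section: str   # e.g. "Ethical Use"
    name: str
    score: float   # 0.0 = caution-oriented ... 1.0 = innovation-oriented

calibration = [
    PolicyComponent("Ethical Use", "Data handling", 0.2),                   # skew cautious
    PolicyComponent("Accountability and Trust", "Output verification", 0.3),
    PolicyComponent("Role and Purpose", "Productivity applications", 0.8),  # skew innovative
]

for c in calibration:
    leaning = "caution" if c.score < 0.5 else "innovation"
    print(f"{c.section} / {c.name}: {c.score:.1f} ({leaning}-leaning)")
```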
The CAPG Opportunity for Alberta Law Firms
The Canada-Alberta Productivity Grant (CAPG) reimburses up to 50% of eligible training costs for existing employees, capped at $5,000 per trainee per fiscal year (75% and $10,000 cap for newly hired unemployed Albertans). There is no minimum hour requirement. For a 12-lawyer firm investing in a Copilot Foundations Workshop, CAPG can meaningfully reduce the cost of the initial training engagement. The key requirement: the training must be instructional and delivered by an eligible training provider.
Consulting engagements, discovery work, and custom agent engineering don't qualify. But the workshop component, where your team learns to use Copilot effectively, build prompts, and configure basic agents, does. This makes a phased approach practical: start with a CAPG-eligible training engagement, then move to consulting and implementation work once the team has a foundation.
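For a rough sense of the grant math, the sketch below applies the 50% reimbursement rate and $5,000 per-trainee cap described above. The workshop cost and headcount are example assumptions, and current program terms should be confirmed before budgeting.

```python
# Illustrative CAPG reimbursement estimate. Workshop cost and trainee count are
# example assumptions; confirm current program terms before budgeting.

REIMBURSEMENT_RATE = 0.50   # standard rate for existing employees
PER_TRAINEE_CAP = 5_000     # per trainee, per fiscal year

def capg_reimbursement(training_cost: float, trainees: int) -> float:
    """Estimate reimbursement: 50% of eligible cost, capped per trainee."""
    return min(training_cost * REIMBURSEMENT_RATE, trainees * PER_TRAINEE_CAP)

# Example: a $6,000 Copilot Foundations Workshop for a 12-person team
print(f"${capg_reimbursement(6_000, 12):,.0f}")
# -> $3,000 reimbursed, leaving a net training cost of roughly $3,000.
```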
Frequently Asked Questions
Is it safe for lawyers to use AI for client work in Alberta?
Yes, when used within a controlled enterprise environment like Microsoft 365 with Copilot, and when outputs are verified by the responsible lawyer. Consumer AI tools without confidentiality guarantees should not be used for any work involving client data.
What are the Law Society of Alberta's requirements for AI use?
The Law Society has published "The Generative AI Playbook" and a 2023 notice emphasizing human verification of AI-generated materials. Lawyers' existing obligations under the Code of Conduct, including competence and supervision duties, apply fully to AI-assisted work.
How many Canadian court cases involve AI hallucinations?
Damien Charlotin's global database tracks over 1,000 cases as of early 2026, with multiple Canadian decisions including Zhang v. Chen (BC), Ko v. Li (Ontario), Hussein v. Canada (Federal Court), and Reddy v. Saroya (Alberta Court of Appeal).
Can a legal-specific AI tool also hallucinate?
Yes. In Hussein v. Canada, the Federal Court found that Visto.ai, a tool designed specifically for Canadian immigration law, hallucinated two non-existent cases and mis-cited an existing case. No AI tool is immune to errors, which is why human verification is mandatory.
How much does AI training cost for a law firm in Calgary?
A Copilot Foundations Workshop for a firm of 10 to 15 people typically costs $4,500 to $7,500, depending on scope and duration. CAPG funding reimburses up to 50% of eligible training costs with no minimum hours required.
What is the difference between Copilot agents and legal AI tools like Harvey?
Copilot agents operate within your existing Microsoft 365 environment and handle drafting, summarization, and workflow automation. Legal AI tools like Harvey are specialized for legal research and document analysis but require separate licensing, separate data processing, and separate security review. For mid-size Alberta firms, Copilot covers the majority of practical use cases at a fraction of the cost.
Should my firm have an AI policy even if we're small?
Absolutely. The risk of unguided AI use is proportionally higher at smaller firms where individual staff decisions have outsized impact. A clear, practical policy takes days to develop and can prevent the kind of incidents that have resulted in cost awards and contempt citations across Canadian courts.
What is Solway's approach to AI governance for law firms?
Solway uses the Solway System, a 14-component AI Policy Framework that maps every known AI consideration from transparency to data handling on a spectrum from caution-oriented to innovation-oriented. For law firms, we collaboratively calibrate each component to reflect both the firm's values and its professional obligations.