Legal Sector

AI Risk Register for Legal Teams

For Irish law firms, an AI risk register is not optional best practice — it is the foundation of demonstrating compliance with the Law Society's AI guidance and the EU AI Act.

A legal team AI risk register maps each AI tool in use to the risk it carries (EU AI Act tier, professional conduct exposure, confidentiality risk), the governance controls that apply, and the human oversight required.

Why legal teams need a specific AI risk register

A generic AI risk register does not work for a law firm. The risk profile of AI in legal practice is shaped by obligations that do not apply elsewhere: the duty of competence, the duty not to mislead the court, the client confidentiality rules, and the supervision obligations of principals over fee earners using AI. These are professional conduct issues, not just technology governance questions.

The Law Society of Ireland's December 2025 guidance established that solicitors remain personally responsible for AI-assisted work. A legal team AI risk register translates that principle into operational governance: it documents what oversight is applied to each AI tool, what verification is required before AI outputs are relied upon, and what conditions must be met before AI-assisted work leaves the firm.

The hallucination risk in legal AI is specific and well-documented. AI tools regularly produce legal citations that do not exist, statutory provisions that are inaccurate, and legal propositions that sound correct but are not. The risk register addresses this directly: for each tool, it specifies what verification process is required and who is responsible for it.

Confidentiality risk is the second major dimension. If fee earners are using AI tools that process client data — even for research or drafting support — the firm needs to understand where that data goes, whether it crosses jurisdictional boundaries, and whether it is used by the vendor for model training. The risk register documents these questions and the answers the firm has established for each tool in use.

EU AI Act and legal AI risk classification

AI used in the administration of justice is classified as high-risk under the EU AI Act. For solicitors, this means AI tools used in legal processes — document review for litigation, AI-assisted contract analysis, and decision-support tools used in the course of advisory work — may carry high-risk deployer obligations.

The LSRA is designated as a sectoral enforcement authority for AI in the legal sector under Ireland's implementation legislation. The risk register should reflect both the EU AI Act classification and the LSRA's sectoral oversight role — these are separate but overlapping enforcement channels.

What the register covers

  • AI tool inventory across all practice areas and fee earner use
  • EU AI Act risk tier classification for each legal AI tool
  • Professional conduct risk assessment: hallucination, confidentiality, supervision
  • LSRA oversight implications and sectoral enforcement exposure
  • Approved use cases and governance conditions for each tool
  • Human oversight requirements before AI outputs are relied upon
  • Review schedule and update triggers for maintaining the register

Why Acuity AI Advisory

Ger Perdisatt contributed the AI section of the Law Society of Ireland's Essentials in Practice Toolkit. The legal team AI risk register is built on the same framework — not on a generic risk register template applied to a legal context.

Acuity AI Advisory is vendor-neutral and fixed-fee. The register is designed to be maintained by the firm, not to create ongoing dependency on an advisory relationship.

Common questions

What should a legal team AI risk register include?

A legal team AI risk register should map each AI tool in use to three dimensions: the regulatory risk it carries (EU AI Act tier, LSRA oversight implications), the professional conduct risk it creates (confidentiality exposure, hallucination risk in legal outputs, supervision requirements), and the governance controls currently in place (or absent). For each tool, the register should document the use cases it is approved for, what human oversight is required before outputs are relied upon, and what client data it processes and how. The register is a living document — it must be updated when new AI tools are adopted or existing tools change significantly.
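The three dimensions above can be sketched as a simple record per tool. This is a minimal illustrative sketch only: the field names, the example tool, and the example values are assumptions for illustration, not a prescribed Law Society or EU AI Act schema.

```python
from dataclasses import dataclass

# Illustrative sketch: field names and values are assumptions,
# not a prescribed Law Society or EU AI Act schema.
@dataclass
class RegisterEntry:
    tool: str                    # AI tool in use
    approved_use_cases: list     # what the tool may be used for
    eu_ai_act_tier: str          # regulatory risk classification
    conduct_risks: list          # hallucination, confidentiality, supervision
    oversight: str               # verification required before outputs are relied upon
    client_data_processed: str   # what client data goes in, and where it goes
    last_reviewed: str           # review schedule / update trigger

# Hypothetical example entry.
entry = RegisterEntry(
    tool="Example research assistant",
    approved_use_cases=["first-draft legal research"],
    eu_ai_act_tier="to be classified",
    conduct_risks=["hallucinated citations", "client confidentiality"],
    oversight="Solicitor verifies every citation against primary sources",
    client_data_processed="No client-identifying data permitted",
    last_reviewed="pending next review trigger",
)
```

One record per tool, reviewed on a schedule, is enough to make the register a living document rather than a one-off audit.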

What are the main AI risks for solicitors?

The main AI risks for solicitors cluster around three areas:

  • Hallucination risk: AI tools generate plausible but inaccurate legal citations, statutory references, and legal propositions. The duty of competence and the duty not to mislead the court require verification that many solicitors have not yet built into their AI workflows.
  • Confidentiality risk: AI tools processing client data may operate outside Irish jurisdiction, share data with third parties, or use client data for model training, all of which raise issues under Law Society conduct rules and data protection law.
  • Supervision risk: principals who allow fee earners to use AI without a governance framework are not supervising that work, which is itself a professional obligation issue.

How does a legal AI risk register relate to the Law Society's guidance?

The Law Society of Ireland's December 2025 AI guidance makes clear that solicitors remain personally responsible for their work regardless of AI tool involvement, and that existing supervision, competence, and confidentiality obligations apply to AI-assisted work. A legal AI risk register is the operational implementation of that guidance: it documents how each AI tool in use is being governed, what oversight is applied, and how the firm is meeting its professional obligations in respect of AI. Firms that can produce an AI risk register are demonstrably taking their governance obligations seriously. Firms that cannot are exposed if a professional conduct issue arises from AI use.

Build an AI Risk Register for Your Legal Team

Fixed-fee. Grounded in Law Society guidance. Built to demonstrate compliance.