Legal Sector

AI Governance Toolkit for Solicitors

Built around the Law Society of Ireland's December 2025 guidance, EU AI Act obligations (with most provisions applying from August 2026), and the professional conduct rules that apply regardless of what technology is in use.

An AI governance toolkit for solicitors must map to three sets of obligations: the Law Society of Ireland's AI guidance (December 2025), the EU AI Act (enforcement August 2026), and existing professional conduct rules. Acuity AI Advisory contributed the AI section of the Law Society's Essentials in Practice Toolkit — this work is grounded in that framework.

What “AI governance” means for a solicitors' practice

For solicitors, AI governance is not a generic compliance exercise. It sits at the intersection of three distinct obligation sets, each of which applies independently and each of which creates exposure if unaddressed.

The Law Society's December 2025 guidance confirms that solicitors remain personally responsible for the accuracy and integrity of their work, regardless of AI tool involvement. That means hallucination risk in court submissions, discovery documents, or contract drafts is not a technology problem — it is a professional liability problem. The AI tool does not carry the liability. The solicitor does.

Client confidentiality rules apply to AI-processed data. If a solicitor is using a general-purpose AI tool that processes client information — even for summarisation or research — the firm needs to understand where that data goes, who can access it, and whether it is used to train AI models. These are not theoretical concerns. They are active obligations under the Law Society's conduct rules and data protection law simultaneously.

Supervision requirements also extend to AI use. A principal who allows staff to use AI tools without a governance framework in place cannot meaningfully supervise the output of that work. The Law Society's guidance addresses this directly.

EU AI Act obligations for Irish legal practice

The EU AI Act is not exclusively a technology company obligation. Solicitors who deploy AI tools, even off-the-shelf products, are deployers under the Act and carry corresponding obligations. The most immediately relevant for law firms are the Article 4 AI literacy obligation (in force since 2 February 2025, with the Act's enforcement regime fully in place by August 2026: all staff using AI must have proportionate literacy training) and the risk classification requirements that apply when AI is used in legal processes or the administration of justice.

The Legal Services Regulatory Authority (LSRA), the independent statutory regulator of legal practitioners in Ireland, is designated as a sectoral enforcement authority under Ireland's AI legislation. Enforcement of AI obligations in legal practice will therefore come via the LSRA, not just the central AI Office. The regulatory channel is already determined.

What the toolkit covers

  • AI tool inventory template covering all practice areas
  • EU AI Act risk classification guide for legal AI tools
  • Acceptable use policy aligned to Law Society professional conduct rules
  • Client confidentiality assessment for AI data processing
  • Human oversight checklist for AI-assisted legal work
  • Staff AI literacy training framework (Article 4 compliance)
  • Vendor due diligence questionnaire for legal AI suppliers
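As an illustration only (the toolkit itself is delivered as working documents, not software), the kind of structured record the AI tool inventory captures can be sketched in a few lines. The field names and risk tiers below are assumptions drawn from the EU AI Act's risk categories, not the toolkit's actual schema:

```python
from dataclasses import dataclass

@dataclass
class AIToolRecord:
    """One row of a firm's AI tool inventory (illustrative fields only)."""
    name: str
    use_case: str
    processes_client_data: bool
    risk_tier: str           # e.g. "minimal", "limited", "high"
    human_oversight: str     # who verifies output before it leaves the firm

inventory = [
    AIToolRecord(
        name="Example summariser",
        use_case="document summarisation",
        processes_client_data=True,
        risk_tier="limited",
        human_oversight="solicitor reviews all output",
    ),
]

# Any tool that touches client data is flagged for a
# confidentiality assessment before it is approved for use.
needs_assessment = [t.name for t in inventory if t.processes_client_data]
```

The point of the structure is that each tool, including AI embedded in practice management software, gets a named owner for oversight and an explicit confidentiality flag, rather than living in an unmanaged list.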

Why Acuity AI Advisory

Ger Perdisatt contributed the AI section of the Law Society of Ireland's Essentials in Practice Toolkit — the primary AI guidance document for Irish solicitors. This means the toolkit Acuity builds is not an external interpretation of that guidance; it is grounded in the same analytical framework.

Acuity AI Advisory has no commercial relationship with any AI vendor or legal technology platform. The toolkit is designed around your firm's actual obligations — not around a product we need you to adopt. Engagements are fixed-fee and structured to deliver a usable framework, not a report that requires further professional services to implement.

Relevant background: former Microsoft COO, current Non-Executive Director at Dublin Airport Authority and Tailte Éireann. The combination of operational AI experience and board-level governance experience is directly relevant to building governance frameworks that work in practice.

Common questions

What should a law firm AI governance toolkit include?

A law firm AI governance toolkit should cover five areas: an inventory of AI tools in use (including AI embedded in practice management software), a risk classification of each tool against the EU AI Act's risk tiers, an acceptable use policy calibrated to Law Society professional conduct rules, data handling protocols addressing client confidentiality and data jurisdiction, and human oversight requirements specifying what professional verification is needed before AI-assisted work leaves the firm. The toolkit is the operational infrastructure that sits behind a governance policy.

Does the Law Society have guidance on AI use?

Yes. The Law Society of Ireland published AI guidance in December 2025 as part of its Essentials in Practice Toolkit. The guidance addresses solicitors' duties of competence, supervision, and client confidentiality in the context of AI use. It makes clear that solicitors remain personally responsible for their work regardless of AI tool involvement. Acuity AI Advisory contributed the AI section of the Law Society's Essentials in Practice Toolkit — our advisory is grounded in that framework.

What EU AI Act obligations apply to Irish solicitors?

The EU AI Act obligations most relevant to solicitors are: Article 4 (AI literacy: all staff using AI must have proportionate literacy training, an obligation that has applied since 2 February 2025), deployer obligations for high-risk AI, which includes AI used in the administration of justice (applying from August 2026), and documentation and transparency requirements. The Legal Services Regulatory Authority (LSRA) is designated as a sectoral regulator under Ireland's AI implementation legislation, meaning enforcement of AI obligations in legal practice will come via the LSRA as well as the central AI Office.

Request an AI Governance Toolkit for Your Firm

Fixed-fee. Grounded in Law Society guidance. Vendor-neutral.