
Should Your Board Have a Dedicated AI Committee?

Ger Perdisatt

Founder, Acuity AI Advisory

As AI obligations under Irish and EU law intensify, boards face a structural question: does AI governance belong on the agenda of an existing committee, or does it warrant a dedicated sub-committee? The answer depends on the complexity of your AI use, your sector, and how close you are to the August 2026 enforcement deadline.

The question of whether boards need a dedicated AI governance committee has moved from theoretical to practical in the past twelve months. With Ireland's AI Office operational from August 2026, the Regulation of Artificial Intelligence Bill 2026 designating sectoral enforcement authorities, and the EU AI Act fully applicable from the same date, boards that have been treating AI as a management matter are running out of runway.

This is not an argument that every board needs a new committee. It is an argument that every board should make a deliberate decision about how AI governance is structured — and that for a significant proportion of Irish organisations, an existing audit or risk committee agenda item is no longer adequate.

Why AI governance has become a board matter

The EU AI Act places explicit obligations on operators — the organisations that deploy AI systems. These are not obligations that management can discharge without board-level oversight. They include documented risk assessments for high-risk AI systems, accountability structures that identify who is responsible for AI oversight, human oversight mechanisms for systems that affect individuals, and incident reporting processes.

For a board to sign off on these obligations being met, it needs a governance structure that gives it confidence the obligations are actually being discharged — not just a management assurance in the annual report.

There is also the liability question. The Regulation of Artificial Intelligence Bill 2026 allows penalties of up to 7% of worldwide annual turnover for serious violations. While directors are not personally liable for corporate AI Act violations in the same way they might be for some GDPR breaches, the reputational and fiduciary consequences of a regulatory enforcement action against a board-governed organisation are a board concern regardless.

The case for an existing committee

Most Irish boards govern AI through an existing structure — typically the audit and risk committee, sometimes the technology committee where one exists. There are genuine arguments for keeping it there.

Avoiding fragmentation. AI risk sits alongside cyber risk, operational risk, and strategic risk. A separate AI committee risks creating a governance silo where AI is addressed in isolation from the broader risk environment it operates in.

Board bandwidth. Creating a new committee has costs: director time, preparation requirements, reporting cadences. For organisations with modest AI use and manageable risk profiles, a dedicated committee may not justify the overhead.

Maturity. For many organisations, current AI use does not yet have the complexity to justify dedicated governance. The risk of over-engineering governance for a problem that has not yet materialised is real.

The argument for an existing committee is strongest for organisations that: use AI in low-risk categories; have mature audit and risk committee processes that have explicitly incorporated AI; have a small AI inventory with clear ownership; and face minimal exposure under the EU AI Act's Annex III high-risk categories.

The case for a dedicated committee

The argument for a dedicated AI sub-committee is strongest when three conditions are met.

AI use has grown to the point where an audit and risk committee agenda cannot do it justice. Most audit and risk committee meetings have packed agendas. AI as a standing item typically means a five-minute management update, not substantive governance. If your organisation has multiple AI systems, significant AI-related regulatory exposure, or is actively developing AI capabilities, the audit committee cannot give AI governance the attention it requires without structural support.

The regulatory environment demands documented board-level oversight. Under the EU AI Act, high-risk AI system deployers are required to have human oversight mechanisms in place. For a regulated financial services firm, healthcare provider, or public sector body, demonstrating that board oversight of AI is substantive — not procedural — is part of the compliance posture. A dedicated committee with clear terms of reference, regular reporting, and documented minutes provides that evidence.

The board has a strategic interest in AI adoption, not just compliance management. Governance exists to support good decision-making, not just to prevent bad outcomes. Boards that want to be genuinely engaged in AI strategy — understanding what AI capabilities the organisation is building, what competitive advantage is being developed, what talent and investment are needed — benefit from a structure that gives that topic the space it warrants.

What a dedicated AI committee should cover

A board-level AI governance committee — whether constituted as a standing committee or a sub-committee of the audit and risk committee — typically has responsibility for:

AI strategy oversight. Understanding the organisation's overall approach to AI: what capabilities are being built, what vendors are being engaged, what investment is being approved. The committee does not make these decisions — management does — but it provides the governance lens through which strategic AI decisions are reviewed.

Regulatory compliance. Maintaining board-level awareness of the organisation's EU AI Act compliance position: inventory status, risk classification, identified gaps, remediation progress, and readiness for August 2026 enforcement. The committee receives management reports and challenges where appropriate.

Risk and incident oversight. Reviewing material AI-related risks and incidents. This includes outputs from the organisation's AI risk management process, any incidents involving AI system failure or misuse, and emerging risks from new AI deployments.

External intelligence. Maintaining awareness of regulatory developments, enforcement actions against comparable organisations, and evolving best practice in AI governance. This is the committee's intelligence function — ensuring the board is not managing AI governance in isolation from what is happening in the market.

Terms of reference. Like any board committee, an AI governance committee should have documented terms of reference, a clear reporting line to the full board, and regular review of its own effectiveness.

Practical structures that work

For most Irish boards, a dedicated committee does not need to be large or time-intensive to be effective. Three practical models are in common use:

AI sub-committee of the audit and risk committee. Two or three board members, plus management representatives, meeting quarterly with a defined agenda. Reports to the full audit and risk committee and escalates to the board where warranted. This is the lightest-touch structure that still provides genuine governance.

Technology and AI committee. For organisations that already have a technology committee, expanding the terms of reference to include AI governance explicitly. This avoids creating a new committee while giving AI the structured attention it needs.

Full AI governance committee. For regulated organisations with significant AI use — financial services, healthcare, public sector bodies — a standing committee with its own terms of reference, independent membership, and direct reporting to the board. This is the more resource-intensive structure but provides the clearest evidence of substantive board oversight for regulatory purposes.

The NED's perspective

As a serving NED, I encounter the question of AI committees regularly on boards. The honest answer is that most boards are behind where they need to be — not through negligence, but because AI governance has moved fast and board structures adapt slowly.

The question is rarely whether to create a committee. It is whether the existing governance structures are producing the information and challenge that a board needs to discharge its responsibilities. If audit and risk committee minutes show no substantive AI governance discussion in 2025, that is not adequate preparation for August 2026 enforcement. The structure that fixes that is a secondary question.

The primary question is: does the board have confidence that it understands what AI systems are operating in the organisation, what risks they create, and what oversight mechanisms are in place? If the answer is uncertain, some form of structural change is warranted — committee or otherwise.


Acuity AI Advisory provides board-level AI governance advisory, including support for boards designing and implementing appropriate governance structures. If you are a board chair, company secretary, or audit committee chair thinking through these questions, a diagnostic conversation is a useful starting point. It is the work I do from a board seat as a serving NED — which means the advice is grounded in what actually works in the boardroom, not what looks good in a governance paper.

board advisory · AI governance