
Why AI Governance Is Now a Board-Level Risk, Not an IT Problem

Ger Perdisatt

Founder, Acuity AI Advisory

For most Irish organisations, AI governance still sits with the IT function. The EU AI Act has made that arrangement legally inadequate. Here is what the shift to board-level accountability requires.

There is a consistent pattern in how Irish organisations have handled AI governance to date. They have classified it as a technology matter, placed it under IT or digital transformation ownership, and expected the board to receive occasional briefings rather than exercise active oversight. That arrangement has been comfortable. It has also become legally inadequate.

The EU AI Act does not treat AI governance as an IT function. It treats it as an organisational accountability — one that sits at the level of senior management and, in effect, the board. Understanding why this shift has happened, and what it requires in practice, is now a governance priority.

How AI risk moved up the accountability chain

Three years ago, the risks associated with AI in most Irish organisations were primarily operational. A tool did not work as expected. A project was delayed. An investment did not deliver the promised return. Those are management risks, and it was reasonable to manage them at the management level.

The risk profile has changed substantially. AI tools are now embedded in processes that affect employees, customers, and in some cases the safety of third parties. AI outputs are influencing decisions in credit assessment, recruitment, performance management, and compliance monitoring. The potential for harm — and the potential for regulatory liability — has moved from marginal to material.

When the potential harm from a category of risk becomes material, governance accountability moves up the organisation. That is true of financial risk, health and safety risk, and data protection risk. It is now true of AI risk.

What the EU AI Act does to director liability

The Act places explicit obligations on operators — its umbrella term covering both the providers who build AI systems and the deployers who use them within the EU. Those obligations include risk classification, technical documentation, human oversight mechanisms, and in some cases conformity assessment. They apply regardless of whether the organisation built the AI system or purchased it from a vendor.

Boards that approve the use of AI systems without ensuring these obligations are being met are approving regulatory exposure. The Act's enforcement regime includes significant financial penalties. More consequentially for directors, a governance failure on AI in a regulated sector — financial services, healthcare, critical infrastructure — may constitute a breach of director duties under existing company law frameworks, independent of the Act itself.

The Act also creates mandatory obligations to report certain AI incidents to regulators. A board that is not aware of what AI systems the organisation is operating has no realistic ability to meet that obligation. See our EU AI Act compliance overview for the relevant framework and deadlines.

The governance gap most Irish boards have

The gap is structural, not attitudinal. Most Irish board members are not indifferent to AI risk — they simply have not been given the framework or the information to exercise oversight effectively. Management has not been required to report on AI governance in any structured way. The board has not specified what it wants to know. No escalation framework exists that would bring AI-related decisions to the board rather than remaining within management discretion.

The result is that AI deployment decisions — some of which have material compliance implications — are being made at a level of the organisation that does not have full visibility of the regulatory context and is not structured to manage board-level risk.

What boards need to do now

The practical steps are not complex, but they require explicit decisions at board level rather than unexamined assumptions.

Assign ownership. AI governance needs a named owner at senior management level with a clear reporting line to the board. The risk or audit committee is the natural home for AI governance oversight.

Require an inventory. The board should request — and receive — a complete picture of AI systems in use across the organisation before the next scheduled committee meeting.

Define escalation triggers. Decide, as a board, which categories of AI deployment require board or committee approval rather than management discretion.

Commission an independent assessment. A board that has been relying on management self-reporting for AI governance should understand its current compliance position through an independent review, not through the same channel that created the gap.


We provide independent AI governance advice to Irish boards and senior management teams, including board briefings, governance framework design, and EU AI Act readiness assessments. Talk to us about where your organisation currently stands.

ai governance · board advisory