Board Governance

AI Strategy Workshop for Boards

Boards are being asked to approve AI strategies and govern AI risk without the frameworks to do so independently of management. This workshop changes that.

An AI strategy workshop for boards is not the same as an AI training session. It focuses on governance: how the board evaluates management's AI proposals, what risk appetite it has set, how it exercises oversight without operational dependency on management, and what the EU AI Act requires of directors.

The governance gap boards need to close

Most boards are in the same position: management is presenting AI proposals, and directors are approving them without a framework to independently evaluate the governance, risk, or strategic logic. The gap is not knowledge — it is the absence of tools to exercise oversight without becoming dependent on the people being overseen.

This is the board's governance problem. The EU AI Act has made it a legal exposure as well. Directors of organisations that deploy high-risk AI systems carry oversight and accountability obligations under the Act. A board that approves an AI strategy without addressing these obligations is exposed — not hypothetically, but with an August 2026 enforcement date and designated sectoral regulators already in place.

The solution is not for directors to become AI specialists. It is for boards to develop the governance frameworks, the language, and the structured questions that allow them to exercise independent oversight of management's AI decisions. That is what this workshop delivers.

Director liability is a separate but related dimension. Where AI systems cause harm — to customers, employees, or third parties — the accountability question will reach board level. Directors who can demonstrate they exercised substantive, documented oversight are in a fundamentally different position from those who cannot.

EU AI Act obligations for directors

The EU AI Act does not create personal criminal liability for directors in most circumstances. What it does create is an expectation of board-level governance that regulators will assess. Boards of organisations that deploy high-risk AI systems are expected to have: clear accountability structures, documented governance frameworks, evidence of informed board-level oversight, and human oversight mechanisms that function independently of automated processes.

Ireland's AI Office, operational from August 2026, has designated fifteen sectoral enforcement authorities with powers to conduct inspections and impose penalties. The governance question the workshop addresses is not theoretical: it is practical preparation for a regulatory regime whose enforcement structures are already designated.

What the workshop covers

  • EU AI Act obligations that apply specifically at board level
  • Framework for evaluating management AI proposals independently
  • AI risk appetite: how to set it and what it should cover
  • Board governance structures: committees, reporting, escalation
  • Director liability and AI: what personal exposure looks like
  • Twenty questions every board should be asking management about AI

Delivered by a working NED

Ger Perdisatt is a current Non-Executive Director at Dublin Airport Authority and Tailte Éireann. The workshop is not a consultant presenting theory about how boards should work — it is a working NED presenting the governance frameworks he uses and recommends, in the language of the boardroom.

Former Microsoft COO experience provides the operational AI context; NED experience provides the governance context. Together they are the combination boards need: someone who understands both what AI actually does and what boards actually need in order to govern it.

Common questions

What should a board AI strategy workshop cover?

A board AI strategy workshop should cover: the EU AI Act obligations that apply at board level, how directors evaluate management's AI proposals without operational dependency on management, what an AI risk appetite statement looks like and how to set one, what governance structures — committees, reporting lines, escalation mechanisms — are needed, and what questions the board should be asking that it currently is not. The output is a governance framework for board-level AI oversight, not an AI implementation plan. Implementation is management's responsibility.

What are directors' AI governance obligations under the EU AI Act?

The EU AI Act places oversight and accountability obligations on organisations deploying high-risk AI systems. At board level, this means directors are expected to: understand the nature and risk classification of AI systems in use, ensure appropriate governance structures exist, exercise informed oversight rather than delegating entirely to management, and take accountability for material AI risks. The Act does not require directors to be technical specialists, but it does require them to be able to exercise substantive oversight. A board that cannot articulate its AI governance position exposes its directors to personal liability as well as the organisation to regulatory risk.

How does an AI strategy workshop for boards differ from management AI training?

Management AI training is about how to deploy and use AI effectively. A board AI strategy workshop is about governance: how the board exercises oversight of AI decisions made by management, what risk appetite the board has set, how the board holds management accountable for AI outcomes, and what the board's obligations are under the EU AI Act. The distinction is fundamental. A board that has attended management AI training has not addressed its governance obligations. The workshop is specifically calibrated to the board's oversight role; it is not a scaled-down version of a management briefing.

Arrange a Board AI Strategy Workshop

Delivered as a standalone board session. NED-led. Fixed-fee.