Service

Board AI Advisory

Boards are being asked questions about AI they have not been prepared to answer. We change that.

What boards are facing

Directors are being asked to approve AI strategies, govern AI risk, and take accountability for AI outcomes — without the briefing or frameworks to do so effectively. The gap between what boards are expected to govern and what they have been equipped to govern is real. EU AI Act obligations have made that gap a liability.

What boards need

A clear view of AI exposure. A governance framework that works in practice. The language and tools to exercise oversight effectively. And a way to ask the right questions without being dependent on management for the answers.

What we deliver

  • Board briefings on AI risk, liability and strategic positioning
  • AI governance framework development for board-level accountability
  • Director education programmes on AI oversight responsibilities
  • EU AI Act liability briefings tailored to board obligations
  • Language and frameworks that enable effective board oversight

Grounded in boardroom experience

Ger Perdisatt is a current Non-Executive Director at Dublin Airport Authority and Tailte Éireann. He understands what boards need to hear — and how to present it. Advisory grounded in the realities of board governance, not in theory.

Sectors served

  • Regulated professional services firms
  • Financial services and insurance
  • State bodies and publicly accountable organisations
  • Healthcare and life sciences

Common questions

What are board directors' responsibilities under the EU AI Act?

Under the EU AI Act, boards of organisations deploying high-risk AI systems carry oversight and accountability responsibilities. Directors are expected to understand the nature of AI systems in use, ensure appropriate governance structures exist, and take accountability for material AI risks. The Act does not require directors to be technical experts — but it does require them to exercise informed oversight. Boards that cannot articulate their AI governance position are exposed to regulatory, reputational and personal liability risk.

How should a board approach AI governance?

An effective board approach to AI governance starts with a clear picture of which AI systems are in use across the organisation, followed by an assessment of the associated risks and obligations. The board's role is oversight, not management, so the framework needs to give directors the language, metrics and escalation mechanisms to exercise that oversight without being dependent on management for interpretation. A board AI advisory engagement with Acuity AI typically delivers a governance framework, a set of board-level questions for management, and a director education component.

What questions should a board ask management about AI?

Boards should be asking: What AI systems are we currently using or developing? Which are classified as high-risk under the EU AI Act? Who is accountable for each system's performance and compliance? What is our process for reviewing AI before deployment? Have we assessed the liability implications of AI errors or failures? What is our position if an AI system causes harm to a customer, employee or third party? Together, these questions establish the accountability structure a board needs to govern AI risk.

Do non-executive directors need specialist AI knowledge?

No. Non-executive directors do not need to be AI specialists, but they do need sufficient literacy to ask the right questions, identify red flags, and hold management to account. The gap for most boards is not technical understanding; it is the language and frameworks to exercise effective oversight. Our board advisory work is specifically designed to bridge that gap: building the competence needed for governance without requiring directors to become technologists.

Arrange a Board AI Briefing

Get in touch