Boards approving AI projects are carrying accountability they may not be equipped to exercise. A structured due diligence framework changes that — without requiring directors to become technologists.
Boards of Irish organisations are increasingly being asked to approve material AI investments. A new platform. An automated decision system. A Copilot or AI productivity rollout. An AI-assisted customer service deployment. The investment case lands, management recommends approval, and the board is expected to exercise oversight.
The problem is that most boards do not have a framework for AI due diligence. They approve technology investments based on financial analysis — the cost, the projected return, the implementation timeline. They do not systematically assess the governance risks: the liability exposure from AI errors, the regulatory obligations the investment creates, the oversight mechanisms required to manage it, or the vendor's own compliance position.
This is a gap that the EU AI Act has made materially consequential.
What AI due diligence should cover
A structured AI due diligence process addresses five areas before a board approves a material AI investment.
Risk classification — does the proposed AI system fall into a high-risk category under the EU AI Act? If so, what specific obligations does deployment create for the organisation? This question should be answered before approval, not after implementation.
Vendor compliance position — what has the vendor committed to under the EU AI Act? Do they provide the technical documentation required for high-risk systems? What conformity assessments have been conducted? What support do they provide for the deploying organisation's compliance obligations?
Oversight design — what human oversight mechanism will be in place once the system is deployed? Is it genuinely operational — humans actually reviewing AI outputs with the authority to override them — or merely nominal, a sign-off step that involves no meaningful review?
Data governance — what data does the system use? Is it current, representative and appropriate for the decisions being made? Has a bias assessment been conducted? How is data quality monitored on an ongoing basis?
Liability analysis — if the AI system makes a wrong decision that harms a customer, employee or third party, what is the organisation's liability exposure? Is that exposure adequately managed by the oversight design and governance framework?
The accountability boards carry
The EU AI Act places accountability for AI governance on the organisations deploying AI systems. In publicly accountable organisations — listed companies, regulated firms, state bodies — that accountability ultimately rests with the board. If a board approves an AI system without adequate due diligence and that system subsequently causes harm or creates regulatory exposure, the board faces difficult questions about whether it exercised its governance responsibilities adequately.
This is not a theoretical risk. European courts are beginning to address AI liability questions. Regulators are developing enforcement frameworks. The organisations that will navigate this environment well are those that built governance infrastructure ahead of the pressure — not those that scrambled to explain their position after an incident.
A practical framework for board-level AI decisions
Before approving any material AI investment, boards should ask management to address:
- What is the EU AI Act risk classification of this system?
- What deployer obligations does this classification create?
- What has the vendor committed to in writing regarding compliance?
- What is the human oversight mechanism and how does it work in practice?
- Who in the organisation is accountable for this system once deployed?
- What is the incident response procedure if the system causes harm?
- How will AI performance and compliance be reported to the board on an ongoing basis?
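For organisations that want to operationalise this checklist — for example in a risk function supporting the board — the questions above can be captured as a simple pre-approval record that flags unanswered items. The sketch below is illustrative only: the field names are our own shorthand, not terms defined by the EU AI Act, and a real implementation would sit inside the organisation's existing governance tooling.

```python
from dataclasses import dataclass, field

# Shorthand keys for the seven board questions above (our own labels,
# not EU AI Act terminology).
BOARD_QUESTIONS = [
    "risk_classification",   # EU AI Act risk classification of the system
    "deployer_obligations",  # obligations the classification creates
    "vendor_commitments",    # written vendor compliance commitments
    "oversight_mechanism",   # human oversight design, and how it works in practice
    "accountable_owner",     # named owner of the system once deployed
    "incident_response",     # procedure if the system causes harm
    "board_reporting",       # ongoing performance/compliance reporting to the board
]

@dataclass
class AIDueDiligenceRecord:
    """Pre-approval record for one proposed AI investment."""
    system_name: str
    answers: dict = field(default_factory=dict)

    def unanswered(self) -> list:
        """Questions management has not yet answered substantively."""
        return [q for q in BOARD_QUESTIONS
                if not self.answers.get(q, "").strip()]

    def ready_for_board(self) -> bool:
        """Approval should only be tabled once every question is answered."""
        return not self.unanswered()

record = AIDueDiligenceRecord("customer-service assistant")
record.answers["risk_classification"] = "Classification documented (illustrative)"
print(record.ready_for_board())   # False: six questions remain open
print(len(record.unanswered()))   # 6
```

The point of the structure is the same as the point of the checklist: an incomplete record is a visible blocker, not something the investment case can quietly skip past.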
Management that cannot answer these questions does not have the governance infrastructure to deploy the system responsibly. The board's role is to require the answers, not to accept the investment case without them.
Acuity AI Advisory provides board briefings, AI due diligence frameworks and governance advisory for Irish boards. Contact us to discuss how to structure AI oversight in your organisation.