AI literacy is fast becoming a core competence for modern directors. Only 20% of companies report having a director with AI expertise. For Irish NEDs facing EU AI Act obligations, the gap between what boards must govern and what they understand is now a liability.
Here is a number that should unsettle every board chair in Ireland: only 20% of companies report having at least one director with AI expertise. That figure has doubled from 2022 — but it still means four in five boards are overseeing AI adoption without any member who can meaningfully interrogate what management is telling them.
This is not a technology problem. It is a governance competence problem. And for Irish directors, it is about to become a regulatory one.
What AI literacy means for directors
AI literacy at board level does not mean the ability to build, audit, or understand the technical internals of AI models. No board needs a machine learning engineer at the table.
What it does mean is the capacity to understand AI at the level of consequence:
- How AI systems make decisions, and what can go wrong
- What governance questions to ask management, and how to assess whether the answers are adequate
- What assurance evidence should look like for AI systems in use across the organisation
- Where the regulatory obligations sit, and whether the organisation is meeting them
This is the same standard boards apply to financial oversight. Directors are not expected to be accountants, but they are expected to understand financial statements well enough to exercise effective oversight. AI governance requires the equivalent.
The hidden influence problem
There is a specific risk that boards need to understand: AI is already influencing board-level decisions in many organisations, without disclosure or governance.
Directors and executives are using public AI tools informally — for drafting board papers, exploring strategic options, analysing market data, summarising reports. In many cases this happens with no guardrails and no disclosure to the rest of the board.
If AI shapes board choices but no one validates the AI output, discloses its role, or owns its errors, the board may make flawed decisions while believing it acted responsibly. This is the governance equivalent of relying on an unvetted external advisor without telling the other directors.
The regulatory dimension
The EU AI Act creates specific obligations for organisations that deploy AI systems. For Irish boards, the AI Bill 2026 establishes the enforcement framework, with 15 competent authorities and an AI Office operational from August 2026.
Directors cannot discharge their oversight obligations under the Act — or their duties of care under Irish company law — if they lack the literacy to understand what they are overseeing. A board that rubber-stamps management's assurances about AI compliance, without the competence to assess those assurances critically, is exposed.
This is not theoretical. When enforcement begins, regulators will look at governance structures. A board that cannot demonstrate it had adequate AI oversight mechanisms is in a weaker position than one that can show it invested in director competence.
What good looks like
Boards that are addressing AI literacy effectively are doing several things:
Structured education programmes. Not one-off awareness sessions, but a programme that builds AI governance competence across the full board over time. The content is calibrated for directors — focused on governance, risk, and oversight, not technical detail.
Dedicated board discussion. AI governance is a standing agenda item, not an occasional update buried in the CEO's report. The board allocates time to understand AI use across the organisation, review risk assessments, and challenge management's AI governance position.
Expert advisory. Boards that lack internal AI expertise engage independent advisers to provide the analysis and frameworks needed for effective oversight. The key word is independent — advice from someone without a product to sell.
Board composition review. When board renewal opportunities arise, AI governance competence is considered alongside other skills requirements. Nearly one-third of boards now disclose some form of AI oversight — whether via specific committees, expert directors, or dedicated governance structures.
AI is not optional governance territory
The comparison to cybersecurity a decade ago is instructive. In 2015, cybersecurity was seen as a technical domain that boards could delegate to IT. By 2020, it was clear that boards that failed to develop cybersecurity literacy were exposed — to regulatory action, to shareholder challenge, and to the reputational consequences of incidents they should have been better prepared for.
AI governance is following the same trajectory, on a compressed timeline. Boards that treat AI as a technology topic rather than a governance capability will find themselves in the same position — except the regulatory framework is more explicit, the obligations are more demanding, and the enforcement infrastructure is being built now.
If your board needs to build AI governance competence, Acuity AI Advisory delivers board-level AI briefings and director education grounded in practising NED experience. Get in touch to arrange a session.