Public Sector

AI Governance for Irish State Bodies

State bodies face the same EU AI Act obligations as the private sector — with higher public accountability stakes. Independent advisory from someone who sits on public boards.

The public sector AI governance challenge

Irish state bodies are adopting AI across operations — from citizen-facing services and case management to procurement, compliance monitoring and internal decision support. Much of this adoption has been pragmatic and incremental. But the governance structures have not kept pace.

The EU AI Act creates specific obligations for deployers of AI systems. For state bodies, these obligations intersect with existing public accountability requirements, Freedom of Information legislation, public sector duty of care, and ministerial oversight. The governance framework needs to address all of these dimensions — not just the AI Act in isolation.

Why state bodies face heightened exposure

When a private sector organisation deploys AI that produces a poor outcome, the consequences are commercial and reputational. When a state body deploys AI that affects a citizen's welfare payment, housing application, or legal status, the consequences are constitutional. AI systems used in areas such as public administration, justice, immigration and social services fall squarely within the EU AI Act's high-risk categories — the tier carrying the most demanding compliance requirements.

What our state body advisory covers

  • AI system inventory and risk classification under the EU AI Act
  • Governance framework design for public sector accountability structures
  • Board-level AI oversight mechanisms and reporting
  • Human oversight protocols for high-risk AI systems
  • Fundamental rights impact assessment for citizen-facing AI
  • Vendor management and procurement governance for AI tools
  • Incident response and escalation frameworks

Grounded in public board experience

Ger Perdisatt is a current Non-Executive Director at Dublin Airport Authority and Tailte Éireann — two of Ireland's most significant state and semi-state bodies. He understands how public boards operate: the accountability structures, the ministerial reporting lines, the public interest obligations, and the governance culture that distinguishes public sector oversight from private.

Before his board roles, Ger served as COO of Microsoft Western Europe, overseeing technology adoption at scale across regulated environments. That combination — public board governance and senior technology operations — is exactly what AI governance for state bodies requires.

Common questions

Does the EU AI Act apply to Irish state bodies?

Yes. The EU AI Act applies to all organisations that deploy AI systems within the EU, including state bodies, semi-state entities, and public sector organisations. Public sector deployers of high-risk AI systems — including those used in social welfare, immigration, justice, public safety, and critical infrastructure — face the same obligations as private sector organisations, and in some cases heightened scrutiny given the impact on citizens' rights.

What AI governance structures do state bodies need?

State bodies need an AI governance framework covering: an inventory of all AI systems in use, risk classification under the EU AI Act, documented accountability structures, human oversight mechanisms for high-risk systems, incident reporting processes, and regular board-level reporting. The framework must be proportionate to the organisation's AI use and integrated with existing governance structures rather than bolted on as a separate compliance exercise.
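To make the first two elements of that framework concrete — the inventory and the risk classification — the sketch below shows what a minimal AI system register might look like. This is an illustration only: the field names, risk tiers and example systems are our own simplification for this page, not an official EU AI Act schema or a prescribed template.

```python
from dataclasses import dataclass
from enum import Enum

# Simplified risk tiers loosely following the EU AI Act's structure
# (illustrative labels, not the Regulation's legal wording)
class RiskTier(Enum):
    PROHIBITED = "prohibited"
    HIGH_RISK = "high-risk"
    LIMITED_RISK = "limited-risk"
    MINIMAL_RISK = "minimal-risk"

@dataclass
class AISystemRecord:
    name: str
    business_owner: str      # documented accountability for the system
    use_case: str
    risk_tier: RiskTier
    human_oversight: bool    # oversight mechanism in place (required for high-risk)
    last_review: str         # date of last governance review

def high_risk_systems(inventory):
    """Systems that need human oversight protocols and board-level reporting."""
    return [s for s in inventory if s.risk_tier is RiskTier.HIGH_RISK]

# Hypothetical entries a state body's register might contain
inventory = [
    AISystemRecord("Case triage assistant", "Operations",
                   "social welfare case management",
                   RiskTier.HIGH_RISK, human_oversight=True, last_review="2025-06"),
    AISystemRecord("Internal document search", "IT",
                   "staff document retrieval",
                   RiskTier.MINIMAL_RISK, human_oversight=False, last_review="2025-06"),
]

print([s.name for s in high_risk_systems(inventory)])
```

Even a register this simple surfaces the questions a board needs answered: which systems are high-risk, who owns each one, and when each was last reviewed.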

How is AI governance different for public sector organisations?

Public sector AI governance involves dimensions that private sector frameworks do not have to address. State bodies must consider: the impact on citizens' fundamental rights, public accountability and transparency obligations, the intersection with public sector duty of care, Freedom of Information implications of AI-assisted decisions, and the reputational sensitivity of AI failures in publicly accountable organisations. The governance framework must address these dimensions explicitly.

What is the timeline for EU AI Act compliance for state bodies?

The timeline is the same as for all deployers. Prohibited AI practices are already banned (since February 2025). GPAI model obligations apply from August 2025. High-risk AI system requirements apply from August 2026, which is also when Ireland's AI Office becomes fully operational and enforcement begins. State bodies should be conducting readiness assessments now — not waiting for the deadline.

Free download

AI Governance Policy Template

A structured starting point for your organisation's AI governance policy — covers inventory, risk classification, accountability, approved use and review schedules. Adaptable for public sector governance requirements.

Request an AI Governance Assessment

Structured as a fixed-fee diagnostic. Designed for publicly accountable organisations.