Public Sector
AI Advisory for State Agencies
Irish state agencies operate AI in a context of heightened accountability: public transparency obligations, ministerial oversight, audit committee scrutiny, and stricter EU AI Act requirements for AI used by public authorities. AI advisory for state agencies must address all of these — not just the technical compliance question.
What makes public sector AI governance different
State agencies are not simply private sector organisations with an added regulatory compliance obligation. They are publicly accountable bodies whose AI use affects citizens who have no choice but to interact with them. That changes the governance requirements significantly.
Citizens affected by AI-influenced public decisions — in social welfare, planning, licensing, or public service access — have rights to transparency and explanation that go beyond standard data protection entitlements. The EU AI Act reinforces these rights with specific transparency requirements for AI used by public authorities. A state agency that cannot explain how its AI systems influenced a decision affecting a citizen is exposed — legally, politically, and reputationally.
The audit and accountability dimension is equally significant. Comptroller and Auditor General scrutiny, Oireachtas committee examination, and internal audit committee oversight all create evidence requirements that do not apply to private sector organisations. Governance frameworks for state agencies must generate the documentation and audit trail that these accountability mechanisms require.
Ministerial accountability adds a further dimension. Where a state agency's AI use creates controversy or causes harm, the accountability chain reaches the sponsoring Minister. Agencies that can demonstrate robust, documented governance are in a materially different position from those that cannot.
EU AI Act and public sector AI
The EU AI Act applies stricter requirements to AI used by public authorities than to equivalent AI used in the private sector. AI systems used in the administration of justice, law enforcement, immigration, and social benefit assessment are classified as high-risk with mandatory requirements including conformity assessment, human oversight, and transparency obligations.
The AI Office of Ireland, operational from August 2026, works alongside fifteen designated sectoral enforcement authorities covering specific public sector domains. State agencies should expect AI governance to be a subject of regulatory inspection in the period following the August 2026 enforcement date.
What the advisory covers
- AI use inventory and EU AI Act risk classification
- Public accountability framework for citizen-facing AI
- Board and audit committee AI governance structures
- Ministerial accountability and reporting requirements
- Transparency obligations for AI-influenced public decisions
- Human oversight mechanisms for consequential AI outputs
- Compliance readiness for AI Office of Ireland inspection
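The first two items above, an AI use inventory with EU AI Act risk classification, are in practice a structured register. The sketch below shows one minimal way an agency might model such a register in Python; all field names, tier labels, and example systems are illustrative assumptions, not a statutory template.

```python
from dataclasses import dataclass
from enum import Enum

# Illustrative risk tiers loosely mirroring the EU AI Act's classification scheme.
class RiskTier(Enum):
    PROHIBITED = "prohibited"
    HIGH_RISK = "high-risk"
    LIMITED_RISK = "limited-risk"
    MINIMAL_RISK = "minimal-risk"

@dataclass
class AISystemRecord:
    """One entry in an agency's AI use inventory (hypothetical fields)."""
    name: str
    business_owner: str
    affects_citizens: bool
    use_case: str
    risk_tier: RiskTier
    human_oversight: str  # who reviews consequential outputs, and how

def high_risk_systems(inventory: list[AISystemRecord]) -> list[AISystemRecord]:
    """Filter the inventory to systems that need documented governance evidence."""
    return [r for r in inventory if r.risk_tier is RiskTier.HIGH_RISK]

inventory = [
    AISystemRecord(
        name="benefit-triage-model",
        business_owner="Claims Operations",
        affects_citizens=True,
        use_case="prioritising social benefit applications for review",
        risk_tier=RiskTier.HIGH_RISK,
        human_oversight="case officer confirms every adverse recommendation",
    ),
    AISystemRecord(
        name="internal-doc-search",
        business_owner="Corporate Services",
        affects_citizens=False,
        use_case="staff knowledge-base retrieval",
        risk_tier=RiskTier.MINIMAL_RISK,
        human_oversight="none required",
    ),
]

for record in high_risk_systems(inventory):
    print(record.name)  # → benefit-triage-model
```

Even a register this simple makes the key governance question answerable on demand: which systems touch citizens, who owns them, and what oversight applies.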
Why Acuity AI Advisory
Ger Perdisatt is a current Non-Executive Director at Dublin Airport Authority and Tailte Éireann — both state bodies. The governance challenges he addresses in his NED roles are the same governance challenges the advisory is built to resolve. This is not advisory built from a private sector template and retrofitted to a public sector context.
Acuity AI Advisory is independent of all technology vendors and public sector service providers. The advisory is fixed-fee and structured to produce governance frameworks that satisfy audit committee, ministerial, and regulatory scrutiny.
Common questions
What EU AI Act obligations apply to Irish state agencies?
Irish state agencies are subject to the EU AI Act as deployers and, in some cases, as providers of AI-enabled public services. Several obligation categories are particularly relevant: AI used by public authorities in social benefit assessments or public service administration may be classified as high-risk, requiring documented governance, human oversight, and transparency to affected citizens. AI used in law enforcement or border control contexts faces the strictest restrictions in the Act. The AI Office of Ireland, fully operational from August 2026, is the primary enforcement authority, with sectoral regulators designated for specific public sector domains.
How should state agencies govern AI?
State agency AI governance needs to address the specific accountability context of public sector AI: ministerial accountability, audit committee oversight, Comptroller and Auditor General scrutiny, and citizen transparency obligations that do not apply to private sector organisations. An effective governance framework for a state agency includes: an AI use inventory with risk classification, board-level accountability structures, transparency mechanisms for AI-influenced decisions affecting citizens, human oversight requirements for consequential AI outputs, and documented evidence of governance for regulatory inspection. The framework must be operationally grounded — audit committees will expect to see governance operating in practice, not merely described in policy documents.
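"Documented evidence of governance" concretely means that every AI-influenced decision leaves a reviewable trail linking the AI output to the human who acted on it. A minimal sketch of such an audit-trail record follows; the field names and example values are hypothetical, not drawn from any statutory or regulatory template.

```python
import json
from datetime import datetime, timezone

def log_ai_influenced_decision(system_name, decision_ref, ai_recommendation,
                               human_reviewer, final_outcome, override_reason=None):
    """Build a JSON audit record tying an AI output to the human decision.

    Recording whether the reviewer overrode the AI, and why, is what turns a
    log line into evidence of human oversight operating in practice.
    """
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "system": system_name,
        "decision_ref": decision_ref,
        "ai_recommendation": ai_recommendation,
        "human_reviewer": human_reviewer,
        "final_outcome": final_outcome,
        "overridden": final_outcome != ai_recommendation,
        "override_reason": override_reason,
    }
    return json.dumps(entry)

# Hypothetical example: a case officer overrides the model's recommendation.
record = json.loads(log_ai_influenced_decision(
    system_name="benefit-triage-model",
    decision_ref="CASE-2026-0142",
    ai_recommendation="refer for manual review",
    human_reviewer="case.officer@example.ie",
    final_outcome="approved",
    override_reason="supporting documents received after model run",
))
print(record["overridden"])  # → True
```

A store of records like this is precisely what an audit committee or inspector can sample to confirm that human oversight is real rather than nominal.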
What does the AI Office of Ireland mean for public bodies?
The AI Office of Ireland becomes fully operational on 1 August 2026 under Ireland's Regulation of Artificial Intelligence Act 2026. For public bodies, this means: a designated central enforcement authority with inspection powers, fifteen sectoral enforcement authorities for specific domains, and an obligation to demonstrate AI governance on request. Public bodies that have not addressed their AI governance obligations before this date face regulatory exposure. The AI Office has indicated that its initial enforcement focus will include public sector AI given the public accountability dimension.
Request AI Advisory for Your State Agency
Independent. Fixed-fee. Grounded in public sector governance experience.