Charity and Non-Profit Sector
AI Advisory for Charities
AI governance obligations do not stop at the charity door. EU AI Act compliance, GDPR obligations for donor and beneficiary data, and Charities Regulator board oversight expectations all apply — and all are commonly overlooked. Proportionate does not mean absent.
TL;DR
Irish charities face AI governance obligations that are often overlooked: EU AI Act compliance for any AI used in staff assessments or beneficiary decisions, GDPR obligations for donor and beneficiary data in AI systems, and Charities Regulator expectations of board oversight. AI advisory for charities must be proportionate — but that does not mean it can be absent.
What AI governance means for an Irish charity
Irish charities have been slower than the private sector to engage with AI governance, partly because they assume it does not apply to them and partly because governance capacity is limited. The assumption is wrong, and limited capacity does not remove the obligation. The EU AI Act applies to organisations that deploy AI systems. It does not exempt charities. And the cost of inadequate AI governance in a charity is not just regulatory; it is mission-critical: AI errors in beneficiary-facing work cause real harm to the people the charity exists to serve.
The charity AI governance picture has three distinct layers. First, EU AI Act obligations — which apply to any AI used in processes that significantly affect individuals, including beneficiary assessment, staff recruitment screening, or grant allocation. Second, GDPR obligations — donor databases, beneficiary records, and volunteer data all contain personal information that creates specific obligations when entered into AI systems. Third, Charities Regulator expectations — the Regulator expects charity boards to exercise oversight of significant operational matters, and AI has become a significant operational matter.
AI advisory for charities is designed to be proportionate. The programme for a small charity with a staff of fifteen is different from the programme for a large national organisation. But proportionate does not mean cosmetic — it means genuinely calibrated to the organisation's scale, risk profile, and capacity.
What the advisory covers
- AI systems inventory — what is in use across the organisation, including staff-adopted tools
- EU AI Act risk classification for AI used in beneficiary and staff decisions
- Charities Regulator board oversight obligations for AI governance
- GDPR compliance assessment for donor and beneficiary data in AI systems
- AI use policy — proportionate, practical, and specific to charity operations
- Board briefing — what trustees need to know about AI risk and oversight
- Remediation roadmap — prioritised, proportionate, achievable
EU AI Act obligations for Irish charities
AI used in processes that have a significant effect on individuals can be classified as high-risk under the EU AI Act. For charities, this can include: AI used to assess beneficiary eligibility or prioritise service allocation; AI in staff or volunteer recruitment screening; AI that processes sensitive personal data (health, financial, or social circumstance data) about beneficiaries; and AI used in grant management or fraud detection that influences decisions about individual applicants.
High-risk classification triggers specific deployer obligations. The fact that an organisation is a charity does not reduce these obligations; it may affect how they are practically implemented, but not whether they apply. Charities using high-risk AI without meeting the Act's requirements from August 2026 are in breach of EU law.
GDPR obligations intersect with the EU AI Act for charities that hold donor, beneficiary, or volunteer data. When personal data enters AI tools — for analysis, communications, or administration — the charity's data protection obligations follow the data. The advisory addresses this intersection explicitly.
Why Acuity AI Advisory
Acuity AI Advisory understands that charity governance operates with limited resources and high accountability. The advisory is genuinely proportionate — designed to provide the governance that the organisation's risk profile requires, not a scaled-down enterprise programme applied without adaptation.
Ger Perdisatt, who leads all engagements, is a former COO of Microsoft Western Europe and holds non-executive directorships at DAA and Tailte Éireann. The advisory draws on board-level governance experience and operational AI expertise. It is also vendor-neutral — Acuity AI has no commercial interest in the technology choices the charity makes.
The advisory is fixed-fee. The output includes a written AI use policy adapted to the charity's operations, a board briefing note for trustees, and a prioritised action list. For charities that want to start with an assessment before committing to a full advisory programme, a scoping conversation is available at no charge.
Common questions
Do charities need to comply with the EU AI Act?
Yes. The EU AI Act applies to organisations that deploy AI systems — and most charities are deployers, whether they recognise it or not. Charities using AI for staff or volunteer recruitment screening, AI-assisted beneficiary triage, AI in assessment processes that influence decisions about individuals, or AI tools that process sensitive personal data are within scope. Size or legal form does not exempt an organisation from the Act. The Act does provide for proportionality in how requirements are met, but proportionate compliance is not the same as no compliance.
What AI governance do Irish charities need?
Irish charities need AI governance that is proportionate: genuinely proportionate, not a scaled-down version of enterprise governance that still does not fit. That means:
- an inventory of AI tools actually in use across the organisation, including tools used by individual staff on personal accounts for work purposes
- a basic assessment of which tools create compliance risk under the EU AI Act or GDPR
- a policy that sets clear boundaries on what AI may and may not be used for in work with beneficiaries
- board oversight that is active rather than nominal
- a named person accountable for AI governance in the organisation
For most charities, this is not a large programme, but it needs to be real.
What AI risks should charity boards be aware of?
Charity boards should be aware of four AI risk areas specifically:
- EU AI Act compliance risk — AI used in beneficiary assessment or staff decisions may be high-risk and carry specific obligations
- GDPR risk — donor and beneficiary data in AI tools creates data protection obligations that may not have been addressed in existing data protection policies
- Reputational risk — AI errors in beneficiary-facing work can cause real harm and attract public scrutiny
- Governance risk — the Charities Regulator expects boards to exercise oversight of significant operational matters, and AI use is now a significant operational matter
A board that is unaware of how AI is used in its organisation is not exercising adequate oversight.
Request an AI Advisory for Your Organisation
Proportionate. Fixed-fee. Designed for how charities actually work.
Get in touch