
Shadow AI: The Ungoverned Risk Already Inside Your Organisation

Ger Perdisatt

Founder, Acuity AI Advisory

86% of employees now use AI tools weekly. Only 37% of organisations have governance policies. Shadow AI is not a future risk — it is an operational reality that most Irish organisations have not addressed.

There is a term gaining traction in security and governance circles that every Irish board and executive team needs to understand: shadow AI.

Shadow AI is the use of AI tools by employees without organisational oversight, approval, or governance. It includes browser extensions that summarise internal documents, personal ChatGPT accounts used to analyse confidential data, AI note-takers connected to meeting platforms, and SaaS features with embedded AI that were enabled without IT or compliance review.

If you think this is not happening in your organisation, you are almost certainly wrong.

The scale of the problem

The numbers are stark. Research published in early 2026 found that 86% of employees now use AI tools at least weekly for work-related tasks. The average enterprise experiences 223 data policy violations per month related to AI usage. And only 37% of organisations have AI governance policies in place.

That last number is the one that should concern Irish boards most. If your organisation is among the 63% without a governance policy, AI is being used across your operations right now — processing client data, summarising internal communications, analysing confidential documents — and you have no visibility, no controls, and no audit trail.

Why shadow AI is different from shadow IT

Traditional shadow IT — employees using unsanctioned software — was primarily a procurement and security concern. Shadow AI is a governance concern, because AI tools do not just store or transmit data. They process it, learn from it, and generate outputs that may be used in business decisions.

When an employee pastes a client contract into a public AI tool for summarisation, the governance implications go beyond data leakage. The AI-generated summary may contain errors. It may be used to inform advice given to clients. There is no audit trail of what was submitted, what was generated, or how it was used. Under the EU AI Act, if that tool is used in a context classified as high-risk — such as employment decisions or contract assessment — the organisation has obligations it cannot meet because it does not even know the tool exists.

The financial impact

Shadow AI is not just a compliance risk — it is a financial one. Research from 2025-2026 found that shadow AI added $670,000 to average breach costs. One in five organisations reported breaches specifically caused by shadow AI. And the cost of remediation after a shadow AI incident is substantially higher than for traditional data breaches, because the scope of exposure is harder to determine when you do not know what tools were used or what data they processed.

What Irish organisations should do

The response to shadow AI is not to ban AI tools — that approach has been tried and consistently fails. Employees use AI because it makes them more productive. The response is to govern it.

1. Conduct an AI inventory. You cannot govern what you cannot see. A structured AI governance assessment should include discovery of AI tools in use across the organisation — not just those purchased centrally, but those adopted by individual employees and teams.

2. Establish an acceptable use policy. Define which AI tools are approved, what data can be processed through them, and what oversight mechanisms apply. This is not a 50-page document — it is a practical framework that employees can actually follow.

3. Provide approved alternatives. If employees are using personal AI accounts because the organisation has not provided governed alternatives, fix the supply problem rather than policing the demand.

4. Build monitoring capability. Implement the technical and procedural controls needed to detect ungoverned AI use and respond to policy violations.

5. Report to the board. Shadow AI is a governance risk that requires board-level visibility. Directors need to understand the organisation's exposure and the adequacy of its response. If your board has not discussed shadow AI, that conversation is overdue.
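The monitoring capability in step 4 does not have to start with an enterprise tool. As one illustrative sketch (not a product recommendation), an organisation with access to web proxy logs could flag requests to known public AI services. The domain watchlist, the log format, and the function name below are assumptions for demonstration only; adapt them to whatever your proxy actually exports.

```python
# Minimal sketch: flag proxy-log entries that contact known public AI services.
# The domain list and the "user,destination_host" log format are illustrative
# assumptions, not a reference to any specific proxy product.

# Hypothetical watchlist of public AI endpoints (extend to suit your risk register).
AI_DOMAINS = {"chat.openai.com", "chatgpt.com", "gemini.google.com", "claude.ai"}

def find_ai_usage(log_lines, ai_domains=AI_DOMAINS):
    """Return (user, host) pairs for log lines that hit a watched domain."""
    hits = []
    for line in log_lines:
        try:
            user, host = (field.strip() for field in line.split(",", 1))
        except ValueError:
            continue  # skip malformed lines rather than fail the whole scan
        if host.lower() in ai_domains:
            hits.append((user, host.lower()))
    return hits

sample = [
    "j.murphy,chatgpt.com",
    "a.byrne,intranet.example.ie",
    "s.kelly,claude.ai",
]
print(find_ai_usage(sample))  # [('j.murphy', 'chatgpt.com'), ('s.kelly', 'claude.ai')]
```

A real deployment would draw on DNS, CASB, or endpoint telemetry rather than a hand-fed list, but the shape of the control is the same: a watchlist, a log source, and an exception report that feeds the policy-violation process in step 2.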

The connection to the EU AI Act

Ireland's AI Bill 2026 establishes the enforcement framework for the EU AI Act in Ireland. When the AI Office becomes operational in August 2026, it will have powers to access documentation that deployers of AI systems are required to hold. An organisation that cannot produce an inventory of its AI systems — because half of them are shadow AI — is in a fundamentally weak position.

The time to address shadow AI is before the enforcement framework is operational, not after.


If your organisation has not conducted an AI governance assessment that includes shadow AI discovery, contact Acuity AI Advisory for a structured diagnostic. We assess what AI tools are actually in use, where the governance gaps are, and what needs to change.
