The EU AI Act applies to SMEs as well as large enterprises. The compliance obligations depend on how AI is used, not on company size. Here is a practical checklist for Irish SMEs to assess their current position.
The EU AI Act contains specific provisions designed to reduce compliance burden for small and medium enterprises. SMEs benefit from reduced fees for accessing testing environments, priority access to regulatory sandboxes, and simplified documentation requirements in certain contexts.
What the Act does not do is exempt SMEs from compliance. The obligations that apply to any organisation deploying AI — particularly in high-risk use cases — apply regardless of company size. The provisions for SMEs are about reducing barriers to the compliance process, not about reducing the obligations themselves.
This checklist is designed to help Irish SMEs assess where they currently stand and identify the most important steps to take.
Step 1 — Understand which role you play
The EU AI Act distinguishes between providers (organisations that develop AI systems or place them on the market) and deployers (organisations that use AI systems under their own authority in the course of business). The Act uses "operator" as an umbrella term covering both, along with importers and distributors.
Most Irish SMEs are deployers: they use AI tools developed and sold by other companies. This matters because deployer obligations are different from, and in most cases lighter than, provider obligations.
As a deployer, your primary obligations are to:
- Use AI systems in accordance with the provider's instructions
- Not modify systems in ways that alter their risk profile
- Implement adequate human oversight for high-risk systems
- Report serious incidents to the relevant authority
- Maintain records of high-risk AI system use
If your business has developed its own AI tools — even internal tools built on third-party AI platforms — you may have provider obligations as well. This is worth assessing carefully.
Step 2 — Build an AI inventory
List every AI system your business currently uses. This should include:
- Dedicated AI tools (ChatGPT, Claude, Gemini, Copilot, and similar)
- AI features embedded in business software (CRM platforms, accounting tools, HR systems, marketing platforms)
- AI tools used by individual employees that have not been formally approved by the business
- Any custom-built tools or workflows that use AI APIs
The inventory should capture: the tool, its main use cases, who uses it, and what data it processes. This exercise frequently surfaces tools in active use that management was not aware of.
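A spreadsheet is entirely adequate for this. As an illustrative sketch only (the tool names and field choices below are assumptions, not a prescribed template), one record per tool might look like:

```python
# Minimal AI inventory sketch: one record per tool, capturing the
# fields suggested above. Entries are illustrative examples.
from dataclasses import dataclass

@dataclass
class AIInventoryEntry:
    tool: str                  # the AI system or embedded AI feature
    use_cases: list[str]       # what it is used for
    users: str                 # who in the business uses it
    data_processed: list[str]  # categories of data it handles
    approved: bool = False     # formally approved by the business?

inventory = [
    AIInventoryEntry(
        tool="Customer-support chatbot",
        use_cases=["answering customer queries"],
        users="Support team",
        data_processed=["customer names", "order details"],
        approved=True,
    ),
    AIInventoryEntry(
        tool="CV screening feature in HR platform",
        use_cases=["shortlisting job applicants"],
        users="HR manager",
        data_processed=["CVs", "contact details"],
        approved=False,
    ),
]

# Flag tools that nobody formally signed off on (the "shadow AI" problem).
unapproved = [e.tool for e in inventory if not e.approved]
print(unapproved)  # → ['CV screening feature in HR platform']
```

The `approved` flag is worth keeping even in a spreadsheet version: it turns the inventory from a one-off list into a living register that supports the approval process described in Step 7.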
Step 3 — Apply the risk classification framework
For each system in your inventory, assess which risk category applies.
Prohibited. Practices banned outright since 2 February 2025: AI systems that manipulate individuals through subliminal techniques, exploit vulnerabilities of specific groups, carry out social scoring, infer emotions in workplace or education settings, or perform real-time remote biometric identification in publicly accessible spaces (a law-enforcement use case with narrow exceptions). These must not be used. Most Irish SMEs will not be running prohibited systems, but emotion-recognition features in workplace monitoring tools are worth checking for specifically.
High-risk. This is the category that requires the most attention. High-risk systems include those used in recruitment and HR decisions (CV screening, performance evaluation, workforce monitoring), credit assessment, and certain safety-critical applications. Check Annex III of the Act for the full list.
Limited risk. AI systems that interact with humans (chatbots) or generate synthetic content must meet transparency requirements. If you use a customer-facing chatbot, users must know they are interacting with AI.
Minimal risk. Everything else. Most AI tools used by SMEs fall into this category — general writing assistance, content generation, data analysis — and carry no mandatory compliance obligations under the Act, though responsible use practices remain important.
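The classification logic above can be sketched as a simple lookup. This is a deliberately simplified illustration, not a substitute for checking the Act's prohibited-practices list and Annex III; the use-case labels below are assumptions chosen to match the examples in this article:

```python
# Simplified, illustrative mapping from common SME use cases to the
# AI Act's four risk tiers. Real classification requires reading the
# Act itself; this sketch only encodes examples discussed above.
USE_CASE_RISK = {
    "social scoring": "prohibited",
    "cv screening": "high",
    "workforce monitoring": "high",
    "credit assessment": "high",
    "customer chatbot": "limited",
    "synthetic content generation": "limited",
}

def classify(use_case: str) -> str:
    """Return the risk tier for a known use case; default to 'minimal'."""
    return USE_CASE_RISK.get(use_case.lower(), "minimal")

print(classify("CV screening"))     # → high
print(classify("drafting emails"))  # → minimal
```

The important structural point the sketch captures: classification is driven by the use case, not by the tool. The same general-purpose assistant can be minimal risk when drafting marketing copy and high risk when screening CVs.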
Step 4 — Assess your high-risk exposure
If any system in your inventory falls into the high-risk category, you need to take specific action.
As a deployer of a high-risk system, you must:
- Verify that the system you are using has undergone a conformity assessment by the provider and carries a CE mark (from August 2026 when this requirement applies)
- Ensure you have adequate documentation about the system from the provider
- Implement human oversight — someone must be responsible for monitoring outputs and able to intervene
- Report any serious incidents or malfunctions to the relevant market surveillance authority
- Keep the logs automatically generated by the system for at least six months, unless a longer retention period applies under other EU or national law (for example, GDPR or sector-specific rules)
For most Irish SMEs, the most likely high-risk use cases involve recruitment tools and, for financial services businesses, credit assessment. If you use AI in either of these areas, this is where your compliance work should focus.
Step 5 — Check your transparency obligations
If you deploy AI systems that interact with your customers, you need to ensure transparency is in place.
- Customer-facing chatbots must be disclosed as AI to users at the start of any interaction
- If your AI generates content that could be mistaken for human-created content (synthetic audio, video, or images), this must be disclosed
- If your business makes automated decisions that significantly affect customers — for example, automated credit decisions — you have disclosure obligations under both the AI Act and GDPR
Step 6 — Review your data handling for AI
AI tools process data. The data protection obligations that apply to that processing are governed by GDPR, enforced in Ireland by the Data Protection Commission (DPC).
Key questions:
- What personal data is processed by each AI tool in your inventory?
- Do you have a lawful basis for that processing?
- Are your privacy notices updated to reflect AI processing?
- Have you conducted Data Protection Impact Assessments for high-risk processing?
- Do your contracts with AI tool providers include adequate data processing agreements?
The DPC has signalled active interest in AI and data protection. For Irish SMEs, this is not a theoretical risk.
Step 7 — Establish basic governance
Even without formal compliance obligations for low-risk AI use, establishing basic governance is good practice and positions your business well for the increasing regulatory attention AI will receive.
At a minimum:
- Designate someone responsible for AI governance in your business
- Create a simple approval process for new AI tools
- Set out in writing what data may be processed by external AI tools
- Ensure employees understand what AI use is permitted and what is not
Acuity AI Advisory provides EU AI Act readiness assessments for Irish SMEs, including AI inventory, risk classification, gap analysis, and remediation planning. If you would like to understand where your business stands, a diagnostic conversation is the right starting point.