EU AI Act Compliance Checklist 2026
Everything your SMB needs to do before August 2, 2026. Free, plain-English, no legal team required. Use this checklist or let Complizo automate it.
Free for up to 3 AI systems · No credit card required
Build your AI system inventory
Required: List every AI system your organisation uses, deploys, or integrates — including third-party tools like ChatGPT, Copilot, HireVue, or credit scoring APIs.
Estimated effort: 1–2 hours
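If you keep the inventory in a spreadsheet or script rather than a dedicated tool, a minimal sketch of one inventory row might look like the following. The field names (and the example systems) are illustrative assumptions, not mandated by the Act; the point is that each entry should capture the system, who built it, what you use it for, and your role.

```python
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    """One row of an AI system inventory (illustrative fields only)."""
    name: str       # e.g. a vendor product or internal model name
    vendor: str     # who built it
    purpose: str    # what it does in your business
    role: str       # "provider", "deployer", "importer", or "distributor"
    risk_tier: str  # filled in later, during classification

inventory = [
    AISystemRecord("ChatGPT", "OpenAI", "Drafting support emails",
                   role="deployer", risk_tier="unclassified"),
    AISystemRecord("CV screener", "Acme HR (hypothetical)",
                   "Shortlisting job applicants",
                   role="deployer", risk_tier="unclassified"),
]

for record in inventory:
    print(f"{record.name}: role={record.role}, tier={record.risk_tier}")
```

Starting every record with `risk_tier="unclassified"` keeps the inventory step separate from the classification step that follows.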
Determine your role for each system
Required: For each AI system, identify whether you are the Provider (built it), Deployer (use it in your business), Importer, or Distributor. Obligations differ significantly by role.
Estimated effort: 30 mins
Check for prohibited AI practices
Required: Confirm you are not using AI that is outright banned: social scoring systems, real-time biometric surveillance in public spaces, subliminal manipulation, or AI that exploits vulnerabilities.
Estimated effort: 15 mins
Classify each system by risk tier
Required: Assign each system to one of four tiers: Unacceptable Risk (banned), High Risk (Annex III), Limited Risk (transparency obligations), or Minimal Risk (no mandatory requirements).
Estimated effort: 1–3 hours
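As a rough first pass over a large inventory, you could triage systems by keywords before doing the real legal analysis. This is a hypothetical sketch only — the keyword lists below are simplified stand-ins, and actual classification requires checking each system's use against Annex III and the prohibited-practices list, not string matching.

```python
# Simplified, assumed keyword lists -- not the Act's actual legal tests.
PROHIBITED = {"social scoring", "subliminal manipulation"}
ANNEX_III_AREAS = {
    "hiring", "credit scoring", "biometric identification",
    "critical infrastructure", "education", "law enforcement",
}

def triage(use_case: str) -> str:
    """First-pass risk-tier guess from a free-text use-case description."""
    text = use_case.lower()
    if any(term in text for term in PROHIBITED):
        return "unacceptable"
    if any(area in text for area in ANNEX_III_AREAS):
        return "high"
    if "chatbot" in text or "deepfake" in text:
        return "limited"
    return "minimal"
```

For example, `triage("AI-assisted hiring shortlist")` returns `"high"`, flagging the system for the full Annex III document set, while `triage("customer chatbot")` returns `"limited"`, flagging only transparency duties.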
Technical Documentation (Annex IV)
Required: For each high-risk system, document intended purpose, design logic, training data, performance metrics, and architecture. Required before placing the system in service.
Estimated effort: 2–4 hours per system
Risk Management Plan
Required: Document identified risks, mitigation measures, and residual risks for each high-risk AI system. Must be updated throughout the system's lifecycle.
Estimated effort: 1–2 hours per system
Data Governance and Management Policy
Required: Document your training data sources, data quality measures, and how you handle bias. Primarily a provider obligation, but also relevant to deployers.
Estimated effort: 1–2 hours
Instructions for Use
Required: Written guidance for users/operators of the high-risk AI system, covering intended use, limitations, and human oversight requirements.
Estimated effort: 1 hour per system
Post-Market Monitoring Plan
Required: Document how you will monitor the AI system after deployment — tracking incidents, performance degradation, and unexpected outputs.
Estimated effort: 1 hour
EU Declaration of Conformity
Required: A formal declaration that your high-risk AI system complies with the EU AI Act requirements. Required before CE marking.
Estimated effort: 30 mins
Implement human oversight mechanisms
Required: Ensure humans can monitor, understand, and intervene in high-risk AI system outputs. Document who is responsible and how oversight works in practice.
Estimated effort: Varies
Set up incident logging
Required: Log all serious incidents involving high-risk AI systems and report to the relevant national market surveillance authority within defined timeframes.
Estimated effort: 2–4 hours
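A lightweight incident log can be as simple as an append-only file of structured records. The sketch below is one possible shape, assuming a JSON Lines file and illustrative field names (`severity`, `reported_to_authority` are our labels, not terms from the Act); what matters is that each entry is timestamped and tracks whether the reporting duty has been discharged.

```python
import datetime
import json

def log_incident(system_name: str, description: str, severity: str,
                 path: str = "incident_log.jsonl") -> dict:
    """Append one structured incident record to an append-only log file."""
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "system": system_name,
        "description": description,
        "severity": severity,            # e.g. "serious" triggers reporting
        "reported_to_authority": False,  # flip once the report is filed
    }
    with open(path, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry

entry = log_incident("CV screener", "Unexpected bias in shortlist", "serious")
```

An append-only format makes the log itself auditable: past entries are never edited, and the authority report is recorded as a status change in a later entry.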
Add transparency notices for limited-risk AI
Conditional: If you use chatbots, deepfakes, or emotion recognition AI, users must be informed they're interacting with AI. Add clear disclosures.
Estimated effort: 1–2 hours
Train relevant staff (Article 4 — AI Literacy)
Required: Providers and deployers must ensure staff who use or oversee AI systems have sufficient AI literacy. Document your training programme.
Estimated effort: Ongoing
Register high-risk AI systems in EU database
Required: Providers of high-risk AI systems under Annex III must register them in the EU AI Act database (EUAIDB) before placing them on the market.
Estimated effort: 1 hour
Appoint an EU representative (if non-EU provider)
Conditional: Non-EU companies providing AI systems to EU users must appoint an authorised EU representative.
Estimated effort: Varies
Don't do this manually
Complizo automates the entire checklist: AI system inventory, risk classification, and all 6 required compliance documents — generated in minutes, not weeks.
No credit card · No sales call · Pro from $99/month
Frequently Asked Questions
When does the EU AI Act come into full effect?
The main obligations for high-risk AI systems (Annex III) take effect on August 2, 2026. Prohibited AI practices were already enforceable from February 2, 2025. GPAI model obligations (for foundation model providers) applied from August 2, 2025.
Does the EU AI Act apply to SMBs?
Yes. The EU AI Act applies to any organisation that places AI systems on the EU market or uses AI in the EU — regardless of company size. SMBs receive some proportionality considerations (e.g. QMS obligations proportional to organisation size), but the core obligations still apply. Fines can reach €35 million or 7% of global annual turnover.
What is a high-risk AI system under the EU AI Act?
High-risk AI systems are defined in Annex III of the EU AI Act. They include AI used in: hiring and HR decisions, credit scoring and financial access, biometric identification, critical infrastructure, educational access, law enforcement, and border control. Many standard SaaS tools used by SMBs — especially in HR and finance — fall into this category.
What documents does the EU AI Act require?
For high-risk AI systems, the EU AI Act requires: Technical Documentation (Annex IV), a Risk Management Plan, a Data Governance Policy, Instructions for Use, a Post-Market Monitoring Plan, and an EU Declaration of Conformity. Complizo generates all six automatically.
How long does EU AI Act compliance take for an SMB?
Manual compliance typically takes 4–8 weeks with legal support. Using Complizo, an SMB can complete its initial assessment and generate all required documents in a few hours. The key is starting early — the August 2026 deadline will arrive quickly.