The regulation in 60 seconds
Three obligations. The rest is detail.
The EU AI Act doesn't regulate all AI. It singles out specific use cases — HR screening, credit scoring, school admissions, medical triage, biometric ID, a few others — as high-risk. If your product is one of them, the law gives you three jobs.
01 / Classify
Confirm where you sit
Show, in writing, whether each AI system you ship is high-risk under Annex III, and under which paragraph. This is roughly half the work in a Diagnostic engagement.
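The written determination can be as light as one structured record per system. A minimal sketch, assuming a record format of our own design (the Act fixes what you must conclude, not how you file it; the field names and the example paragraph reference are illustrative):

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class ClassificationRecord:
    """One written high-risk determination per shipped AI system.

    Field names are our own convention, not a legal template.
    """
    system_name: str
    is_high_risk: bool
    annex_iii_paragraph: Optional[str]  # illustrative, e.g. the paragraph number; None if out of scope
    rationale: str                      # a sentence or two is enough if it is specific
    decided_on: date = field(default_factory=date.today)

# One record per system, kept under version control next to the code it covers.
record = ClassificationRecord(
    system_name="cv-screening-ranker",
    is_high_risk=True,
    annex_iii_paragraph="4",  # hypothetical value for illustration
    rationale="Ranks job applicants; falls under the employment use cases.",
)
```

Keeping these records in the repository, rather than a shared drive, means the classification travels with the system it describes.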
02 / Document
Keep an evidence pack
Maintain a technical file describing the system, the data, what you tested, and what you found. The structure is fixed by Annex IV; the content already lives in your MLflow, W&B, and Langfuse logs.
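Assembling the file is then an export job, not an authoring job. A minimal sketch, assuming each tracking run has already been exported to a plain dict with `params`, `metrics`, and `dataset` keys (our assumption for illustration, not any tracker's real schema), and using paraphrased section names rather than the Annex IV legal headings:

```python
import json
from datetime import date

def build_evidence_pack(system_name: str, runs: list) -> dict:
    """Fold exported experiment-tracking runs into a skeleton technical file.

    Section keys below paraphrase the kind of content Annex IV asks for;
    they are not the legal headings.
    """
    pack = {
        "system": system_name,
        "generated": date.today().isoformat(),
        # What the system is and how much evidence backs it.
        "system_description": {"name": system_name, "runs_covered": len(runs)},
        # Which datasets the runs touched (deduplicated, stable order).
        "data_and_data_governance": sorted({r["dataset"] for r in runs}),
        # What was tested: the configuration of each run.
        "testing_and_validation": [r["params"] for r in runs],
        # What was found: the recorded metrics of each run.
        "results_and_findings": [r["metrics"] for r in runs],
    }
    return pack

runs = [
    {"dataset": "applicants-2024-q1", "params": {"model": "xgb", "depth": 6}, "metrics": {"auc": 0.87}},
    {"dataset": "applicants-2024-q1", "params": {"model": "xgb", "depth": 8}, "metrics": {"auc": 0.88}},
]
pack = build_evidence_pack("cv-screening-ranker", runs)
print(json.dumps(pack, indent=2))
```

The point of the sketch is the shape: the evidence pack is a deterministic function of data you already log, so it can be regenerated on every release instead of maintained by hand.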
03 / Monitor
Watch it after release
Run a written post-market monitoring plan and report serious incidents on Article 73's clocks. If you have on-call, drift alarms and an incident process, this is mostly paperwork on top.
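Article 73's deadlines are tiered by incident class, which makes them easy to wire into an existing on-call process. A minimal sketch; the day counts below are placeholders standing in for the tiered windows in the Article, so verify them against the current legal text before relying on this:

```python
from datetime import datetime, timedelta

# Illustrative reporting windows keyed by incident class.
# Placeholder values: check the deadlines in Article 73 itself.
REPORTING_WINDOWS = {
    "serious_incident": timedelta(days=15),
    "critical_infrastructure": timedelta(days=2),
}

def report_deadline(became_aware: datetime, incident_class: str) -> datetime:
    """Latest point at which the report must be filed with the authority."""
    return became_aware + REPORTING_WINDOWS[incident_class]

def is_overdue(became_aware: datetime, incident_class: str, now: datetime) -> bool:
    """True once the reporting window has closed without a filing."""
    return now > report_deadline(became_aware, incident_class)

aware = datetime(2025, 3, 1, 9, 0)
deadline = report_deadline(aware, "serious_incident")
```

Hooked into the same pager that handles drift alarms, a check like `is_overdue` turns the legal clock into just another alert.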
Fines reach €35M or 7% of global annual turnover, whichever is higher. A defensible technical file is what spares you from that exposure, and that file is the work we do.