ISO 42001 (AI Management System Standard)
The first certifiable international standard for AI management systems, establishing requirements for responsible development, deployment, and governance of AI.
ISO/IEC 42001:2023, "Information technology — Artificial intelligence — Management system," published in December 2023, is the first certifiable international standard for AI management systems (AIMS). Because it follows the ISO harmonized structure (Annex SL), it is compatible with ISO 27001, ISO 27701, and ISO 9001, enabling integrated implementation. The standard applies to any organization that develops, provides, or uses AI systems and seeks to ensure responsible AI across the AI system lifecycle. Core requirements include establishing an AI policy (Clause 5.2), defining AI-specific objectives (Clause 6.2), conducting impact assessments (Annex A, Control 6.1.6), ensuring human oversight mechanisms (A.6.1.4), maintaining records of AI system performance (Clause 7.5), and addressing AI-specific risks through a risk management process (Clause 6.1).
ISO 42001's Annex A contains 38 controls organized across eight domains: AI policies (A.2), internal organization and accountability (A.3), resource management for AI (A.4), impact assessment (A.5 and A.6), AI system lifecycle (A.7), data for AI (A.8), and system by design (A.9). Control A.6.1.6 requires organizations to conduct AI impact assessments covering intended use, potential misuse, impacts on individuals and groups (including discriminatory impacts), environmental impacts, and societal impacts before deployment. Control A.8.2 requires that data used for AI training be relevant, representative, and of sufficient quality, with processes to detect and mitigate data quality issues. Control A.6.2 addresses AI system transparency: documentation of the AI system's purpose, capabilities, limitations, and intended contexts of use must be created and maintained.
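The impact assessment dimensions listed above can be captured in a structured record so that incomplete assessments are caught before deployment. The following Python sketch is illustrative only — the field names and the class itself are hypothetical, not prescribed by ISO 42001.

```python
from dataclasses import dataclass, field

@dataclass
class AIImpactAssessment:
    """Hypothetical record for an AI impact assessment per Control A.6.1.6.

    Field names are illustrative assumptions, not terms from the standard.
    """
    system_name: str
    intended_use: str
    potential_misuse: list[str] = field(default_factory=list)
    # Includes discriminatory impacts on individuals and groups.
    individual_group_impacts: list[str] = field(default_factory=list)
    environmental_impacts: list[str] = field(default_factory=list)
    societal_impacts: list[str] = field(default_factory=list)

    def missing_dimensions(self) -> list[str]:
        """Return assessment dimensions not yet addressed.

        Every dimension must hold at least one documented finding
        (even "none identified") before the system is deployed.
        """
        dimensions = {
            "potential_misuse": self.potential_misuse,
            "individual_group_impacts": self.individual_group_impacts,
            "environmental_impacts": self.environmental_impacts,
            "societal_impacts": self.societal_impacts,
        }
        return [name for name, entries in dimensions.items() if not entries]
```

A workflow built on a record like this can block release while `missing_dimensions()` is non-empty, giving the A.6.1.6 requirement an enforceable shape.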
For organizations subject to the EU AI Act, ISO 42001 certification is positioned as supporting evidence of conformity: the Act's Article 9 risk management system and Article 17 quality management system requirements map substantially onto ISO 42001's clauses and controls. However, ISO 42001 certification does not automatically constitute EU AI Act conformity: the Act requires notified body involvement for certain high-risk systems and imposes specific technical documentation requirements under Article 11 that go beyond ISO 42001's documentation controls. ISO 42001 is therefore best treated as a strong foundation for, but not a sufficient demonstration of, EU AI Act high-risk compliance.
We implement ISO 42001 AIMS programs that integrate with existing ISO 27001 management systems, adding AI-specific impact assessment workflows, training data quality controls, and human oversight mechanisms. For clients subject to the EU AI Act, we map ISO 42001 controls to Article 9, 10, and 17 requirements and identify the additional technical documentation and conformity assessment obligations that certification alone does not satisfy.
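A control-to-article mapping of the kind described above can be kept as structured data so that coverage gaps fall out mechanically. This sketch is an illustrative assumption, not an authoritative crosswalk; the article labels come from the text above, and the mapped clauses are examples rather than a complete analysis.

```python
# Illustrative mapping (an assumption, not an official crosswalk) from
# EU AI Act articles to the ISO 42001 clauses/controls discussed above.
# An empty list marks obligations that certification alone does not satisfy.
ISO42001_COVERAGE: dict[str, list[str]] = {
    "Article 9 (risk management system)": [
        "Clause 6.1 (risk management process)",
        "A.6.1.6 (AI impact assessment)",
    ],
    "Article 10 (data and data governance)": [
        "A.8.2 (training data quality)",
    ],
    "Article 17 (quality management system)": [
        "Clause 5.2 (AI policy)",
        "Clause 7.5 (documented information)",
    ],
    # Article 11 technical documentation goes beyond ISO 42001's
    # documentation controls, so it is modeled here as uncovered.
    "Article 11 (technical documentation)": [],
}

def compliance_gaps(coverage: dict[str, list[str]]) -> list[str]:
    """Return articles with no mapped ISO 42001 control — the
    obligations an organization must address beyond certification."""
    return [article for article, controls in coverage.items() if not controls]
```

Running `compliance_gaps(ISO42001_COVERAGE)` surfaces the articles needing work outside the AIMS, which is the gap analysis the paragraph above describes.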
Compliance-Native Architecture Guide
Design principles and a structured checklist for building software that is compliant by default — not compliant by retrofit. Covers data architecture, access controls, audit trails, and vendor due diligence.