A significant issue involves the European Commission’s proposed requirements for conformity assessments of various high-risk AI systems. As it stands, Member States and national authorities will appoint third-party “Notified Bodies” to conduct these assessments for AI systems that serve as safety components of products already covered by EU product-safety legislation, such as machinery, medical devices, and personal protective equipment. Conversely, standalone high-risk AI systems require only industry self-assessment, provided certain conditions are met.
For internal assessments, entities such as providers, distributors, and importers must demonstrate compliance through “self-certification”. They need to:
- Confirm their quality management systems meet the required standards.
- Review the technical documentation to ensure it satisfies high-risk AI system requirements.
- Validate that the AI system’s design, development, and post-market monitoring align with the technical documentation.
Entities can either follow their own compliance plans or adhere to harmonized technical standards. Following the EU-established harmonized standards is strongly advisable: compliance with them creates a presumption of conformity, making an internal self-assessment sufficient. If the standards are not followed, an assessment by a Notified Body will be required.
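To make this routing concrete, the following minimal Python sketch models the decision process as described above. It is purely illustrative: the class, the field names, and the reduction of each obligation to a binary check are hypothetical simplifications for exposition, not terms or tests taken from the Act itself.

```python
from dataclasses import dataclass

# Illustrative model only: names and logic are a hypothetical simplification
# of the conformity-assessment routing described in the text, not legal advice.

@dataclass
class HighRiskAISystem:
    name: str
    is_product_safety_component: bool   # embedded in e.g. machinery, medical devices, PPE
    follows_harmonized_standards: bool  # EU harmonized technical standards applied
    qms_compliant: bool                 # quality management system meets requirements
    tech_docs_adequate: bool            # technical documentation satisfies requirements
    design_matches_docs: bool           # design, development, monitoring align with docs


def assessment_route(system: HighRiskAISystem) -> str:
    """Return which conformity-assessment route applies under this simplified model."""
    # AI embedded as a safety component of a regulated product:
    # third-party assessment by a Notified Body.
    if system.is_product_safety_component:
        return "third-party assessment by a Notified Body"

    # A standalone system that does not follow harmonized standards loses the
    # presumption of conformity and likewise requires a Notified Body.
    if not system.follows_harmonized_standards:
        return "third-party assessment by a Notified Body"

    # Otherwise internal self-certification suffices, but only if all three
    # self-certification checks listed above pass.
    checks_pass = (
        system.qms_compliant
        and system.tech_docs_adequate
        and system.design_matches_docs
    )
    return (
        "internal self-assessment (presumption of conformity)"
        if checks_pass
        else "not yet conformant: remediate self-certification gaps"
    )


if __name__ == "__main__":
    example = HighRiskAISystem(
        name="standalone hiring-screening system",
        is_product_safety_component=False,
        follows_harmonized_standards=True,
        qms_compliant=True,
        tech_docs_adequate=True,
        design_matches_docs=True,
    )
    print(assessment_route(example))
    # -> internal self-assessment (presumption of conformity)
```

Reducing the obligations to booleans glosses over the substantive judgment each check requires in practice, but it makes the structure of the regime visible: the third-party route is triggered by what the system is or by a failure to follow harmonized standards, while the self-certification checks only gate whether the internal route succeeds.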
This reliance on self-assessment raises concerns about potential inconsistencies and varying levels of compliance, which could undermine consumer protection and fundamental rights; it is an open question whether self-assessment without third-party verification will prove sufficient. Additionally, this approach may impose significant compliance costs and administrative burdens, especially on small and medium-sized enterprises (SMEs).
The costs of complying with the EU AI Act vary depending on the size of the company and the complexity of the AI systems involved. Generally, expenses can range from tens of thousands of euros to several hundred thousand or even millions of euros for the largest enterprises. These costs may include investments in technology, employee training, development and maintenance of technical documentation, and fees for assessments and certifications. Companies may also need to engage legal and technical experts to ensure full compliance with the Act.
Considering these issues, the European Economic and Social Committee has suggested making third-party assessments compulsory for all high-risk systems. The AI Act’s framework may also lead to the establishment of AI auditing standards. As noted above, high-risk AI systems that comply with harmonized standards from the European Standards Organizations are presumed to meet the conformity requirements. These standards aim to harmonize AI assessment practices globally, potentially eliminating the need for additional self-assessment.
However, varying standards and capacities across sectors and EU Member States may hinder the creation of mutual recognition agreements with other countries, affecting international AI trade. Non-EU countries may also face challenges, as they will be subject to EU standards without having had substantial input into their development.