Article 76

Supervision of Testing in Real World Conditions by Market Surveillance Authorities

1. Market surveillance authorities shall have competences and powers to ensure that testing in real world conditions is in accordance with this Regulation.

2. Where testing in real world conditions is conducted for AI systems that are supervised within an AI regulatory sandbox under Article 58, the market surveillance authorities shall verify the compliance with Article 60 as part of their supervisory role for the AI regulatory sandbox. Those authorities may, as appropriate, allow the testing in real world conditions to be conducted by the provider or prospective provider, in derogation from the conditions set out in Article 60(4), points (f) and (g).

3. Where a market surveillance authority has been informed by the prospective provider, the provider or any third party of a serious incident or has other grounds for considering that the conditions set out in Articles 60 and 61 are not met, it may take either of the following decisions on its territory, as appropriate:

(a) to suspend or terminate the testing in real world conditions;

(b) to require the provider or prospective provider and the deployer or prospective deployer to modify any aspect of the testing in real world conditions.

4. Where a market surveillance authority has taken a decision referred to in paragraph 3 of this Article, or has issued an objection within the meaning of Article 60(4), point (b), the decision or the objection shall indicate the grounds therefor and how the provider or prospective provider can challenge the decision or objection.

5. Where applicable, where a market surveillance authority has taken a decision referred to in paragraph 3, it shall communicate the grounds therefor to the market surveillance authorities of other Member States in which the AI system has been tested in accordance with the testing plan.
