CHAPTER VI – Measures in support of innovation (Art. 57-63)

Art. 57 AI Act – AI regulatory sandboxes

Art. 58 AI Act – Detailed arrangements for and functioning of AI regulatory sandboxes

Art. 59 AI Act – Further processing of personal data for developing certain AI systems in the public interest in the AI regulatory sandbox

Art. 60 AI Act – Testing of high-risk AI systems in real world conditions outside AI regulatory sandboxes

Art. 61 AI Act – Informed consent to participate in testing in real world conditions outside AI regulatory sandboxes

  1. For the purpose of testing in real world conditions under Article 60, freely-given informed consent shall be obtained from the subjects of testing prior to their participation in such testing and after their having been duly informed with concise, clear, relevant, and understandable information regarding:
    1. the nature and objectives of the testing in real world conditions and the possible inconvenience that may be linked to their participation;
    2. the conditions under which the testing in real world conditions is to be conducted, including the expected duration of the subject or subjects’ participation;
    3. their rights, and the guarantees regarding their participation, in particular their right to refuse to participate in, and the right to withdraw from, testing in real world conditions at any time without any resulting detriment and without having to provide any justification;
    4. the arrangements for requesting the reversal or the disregarding of the predictions, recommendations or decisions of the AI system;
    5. the Union-wide unique single identification number of the testing in real world conditions in accordance with Article 60(4) point (c), and the contact details of the provider or its legal representative from whom further information can be obtained.
  2. The informed consent shall be dated and documented and a copy shall be given to the subjects of testing or their legal representative.

Recital 141

In order to accelerate the process of development and the placing on the market of the high-risk AI systems listed in an annex to this Regulation, it is important that providers or prospective providers of such systems may also benefit from a specific regime for testing those systems in real world conditions, without participating in an AI regulatory sandbox. However, in such cases, taking into account the possible consequences of such testing on individuals, it should be ensured that appropriate and sufficient guarantees and conditions are introduced by this Regulation for providers or prospective providers. Such guarantees should include, inter alia, requesting informed consent of natural persons to participate in testing in real world conditions, with the exception of law enforcement where the seeking of informed consent would prevent the AI system from being tested. Consent of subjects to participate in such testing under this Regulation is distinct from, and without prejudice to, consent of data subjects for the processing of their personal data under the relevant data protection law. It is also important to minimise the risks and enable oversight by competent authorities and therefore require prospective providers to have a real-world testing plan submitted to the competent market surveillance authority, register the testing in dedicated sections in the EU database subject to some limited exceptions, set limitations on the period for which the testing can be done and require additional safeguards for persons belonging to certain vulnerable groups, as well as a written agreement defining the roles and responsibilities of prospective providers and deployers and effective oversight by competent personnel involved in the real world testing.
Furthermore, it is appropriate to envisage additional safeguards to ensure that the predictions, recommendations or decisions of the AI system can be effectively reversed and disregarded and that personal data is protected and is deleted when the subjects have withdrawn their consent to participate in the testing without prejudice to their rights as data subjects under the Union data protection law. As regards transfer of data, it is also appropriate to envisage that data collected and processed for the purpose of testing in real-world conditions should be transferred to third countries only where appropriate and applicable safeguards under Union law are implemented, in particular in accordance with bases for transfer of personal data under Union law on data protection, while for non-personal data appropriate safeguards are put in place in accordance with Union law, such as Regulations (EU) 2022/868 (42) and (EU) 2023/2854 (43) of the European Parliament and of the Council.


(42) Regulation (EU) 2022/868 of the European Parliament and of the Council of 30 May 2022 on European data governance and amending Regulation (EU) 2018/1724 (Data Governance Act) (OJ L 152, 3.6.2022, p. 1).
(43) Regulation (EU) 2023/2854 of the European Parliament and of the Council of 13 December 2023 on harmonised rules on fair access to and use of data and amending Regulation (EU) 2017/2394 and Directive (EU) 2020/1828 (Data Act) (OJ L, 2023/2854, 22.12.2023, ELI: http://data.europa.eu/eli/reg/2023/2854/oj).

Art. 62 AI Act – Measures for providers and deployers, in particular SMEs, including start-ups

Art. 63 AI Act – Derogations for specific operators