CHAPTER III – High-risk AI systems (Art. 6-49)

Art. 6 AI Act – Classification rules for high-risk AI systems

Art. 7 AI Act – Amendments to Annex III

Art. 8 AI Act – Compliance with the requirements

Art. 9 AI Act – Risk management system

Art. 10 AI Act – Data and data governance

Art. 11 AI Act – Technical documentation

Art. 12 AI Act – Record-keeping

Art. 13 AI Act – Transparency and provision of information to deployers

Art. 14 AI Act – Human oversight

Art. 15 AI Act – Accuracy, robustness and cybersecurity

Art. 16 AI Act – Obligations of providers of high-risk AI systems

Art. 17 AI Act – Quality management system

Art. 18 AI Act – Documentation keeping

Art. 19 AI Act – Automatically generated logs

Art. 20 AI Act – Corrective actions and duty of information

Art. 21 AI Act – Cooperation with competent authorities

Art. 22 AI Act – Authorised representatives of providers of high-risk AI systems

Art. 23 AI Act – Obligations of importers

Art. 24 AI Act – Obligations of distributors

Art. 25 AI Act – Responsibilities along the AI value chain

Art. 26 AI Act – Obligations of deployers of high-risk AI systems

Art. 27 AI Act – Fundamental rights impact assessment for high-risk AI systems

Art. 28 AI Act – Notifying authorities

Art. 29 AI Act – Application of a conformity assessment body for notification

Art. 30 AI Act – Notification procedure

Art. 31 AI Act – Requirements relating to notified bodies

Art. 32 AI Act – Presumption of conformity with requirements relating to notified bodies

Art. 33 AI Act – Subsidiaries of notified bodies and subcontracting

Art. 34 AI Act – Operational obligations of notified bodies

Art. 35 AI Act – Identification numbers and lists of notified bodies

Art. 36 AI Act – Changes to notifications

Art. 37 AI Act – Challenge to the competence of notified bodies

Art. 38 AI Act – Coordination of notified bodies

Art. 39 AI Act – Conformity assessment bodies of third countries

Art. 40 AI Act – Harmonised standards and standardisation deliverables

Art. 41 AI Act – Common specifications

Art. 42 AI Act – Presumption of conformity with certain requirements

Art. 43 AI Act – Conformity assessment

  1. For high-risk AI systems listed in point 1 of Annex III, where, in demonstrating the compliance of a high-risk AI system with the requirements set out in Section 2, the provider has applied harmonised standards referred to in Article 40, or, where applicable, common specifications referred to in Article 41, the provider shall opt for one of the following conformity assessment procedures based on:
    (a) the internal control referred to in Annex VI; or
    (b) the assessment of the quality management system and the assessment of the technical documentation, with the involvement of a notified body, referred to in Annex VII.

    In demonstrating the compliance of a high-risk AI system with the requirements set out in Section 2, the provider shall follow the conformity assessment procedure set out in Annex VII where:

    (a) harmonised standards referred to in Article 40 do not exist and common specifications referred to in Article 41 are not available;
    (b) the provider has not applied, or has applied only part of, the harmonised standard;
    (c) the common specifications referred to in point (a) exist, but the provider has not applied them;
    (d) one or more of the harmonised standards referred to in point (a) has been published with a restriction and only on the part of the standard that was restricted.

    For the purposes of the conformity assessment procedure referred to in Annex VII, the provider may choose any of the notified bodies. However, where the high-risk AI system is intended to be put into service by law enforcement, immigration or asylum authorities or by Union institutions, bodies, offices or agencies, the market surveillance authority referred to in Article 74(8) or (9), as applicable, shall act as a notified body.

  2. For high-risk AI systems referred to in points 2 to 8 of Annex III, providers shall follow the conformity assessment procedure based on internal control as referred to in Annex VI, which does not provide for the involvement of a notified body.
  3. For high-risk AI systems covered by the Union harmonisation legislation listed in Section A of Annex I, the provider shall follow the relevant conformity assessment procedure as required under those legal acts. The requirements set out in Section 2 of this Chapter shall apply to those high-risk AI systems and shall be part of that assessment. Points 4.3., 4.4., 4.5. and the fifth paragraph of point 4.6 of Annex VII shall also apply.

    For the purposes of that assessment, notified bodies which have been notified under those legal acts shall be entitled to control the conformity of the high-risk AI systems with the requirements set out in Section 2, provided that the compliance of those notified bodies with requirements laid down in Article 31(4), (5), (10) and (11) has been assessed in the context of the notification procedure under those legal acts.

    Where a legal act listed in Section A of Annex I enables the product manufacturer to opt out from a third-party conformity assessment, provided that that manufacturer has applied all harmonised standards covering all the relevant requirements, that manufacturer may use that option only if it has also applied harmonised standards or, where applicable, common specifications referred to in Article 41, covering all requirements set out in Section 2 of this Chapter.

  4. High-risk AI systems that have already been subject to a conformity assessment procedure shall undergo a new conformity assessment procedure in the event of a substantial modification, regardless of whether the modified system is intended to be further distributed or continues to be used by the current deployer.

    For high-risk AI systems that continue to learn after being placed on the market or put into service, changes to the high-risk AI system and its performance that have been pre-determined by the provider at the moment of the initial conformity assessment and are part of the information contained in the technical documentation referred to in point 2(f) of Annex IV, shall not constitute a substantial modification.

  5. The Commission is empowered to adopt delegated acts in accordance with Article 97 in order to amend Annexes VI and VII by updating them in light of technical progress.
  6. The Commission is empowered to adopt delegated acts in accordance with Article 97 in order to amend paragraphs 1 and 2 of this Article in order to subject high-risk AI systems referred to in points 2 to 8 of Annex III to the conformity assessment procedure referred to in Annex VII or parts thereof. The Commission shall adopt such delegated acts taking into account the effectiveness of the conformity assessment procedure based on internal control referred to in Annex VI in preventing or minimising the risks to health and safety and protection of fundamental rights posed by such systems, as well as the availability of adequate capacities and resources among notified bodies.
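The procedure-selection logic of paragraphs 1 to 3 can be sketched as a short decision function. This is an illustrative, simplified reading only, not legal advice: the enum, function name, and parameters are my own, and it collapses the provider's Art. 43(1) choice (where standards are fully applied, the provider may still opt for the Annex VII procedure) into the least burdensome permitted option.

```python
from enum import Enum, auto
from typing import Optional


class Procedure(Enum):
    INTERNAL_CONTROL = auto()  # Annex VI, no notified body
    NOTIFIED_BODY = auto()     # Annex VII, notified body involved
    SECTORAL = auto()          # procedure under Annex I, Section A legislation


def conformity_procedure(annex_iii_point: Optional[int],
                         annex_i_section_a: bool,
                         standards_fully_applied: bool) -> Procedure:
    """Simplified reading of Art. 43(1)-(3) AI Act (illustrative only).

    annex_iii_point: the point of Annex III listing the system (None if not listed)
    annex_i_section_a: covered by Union harmonisation legislation in Annex I, Section A
    standards_fully_applied: harmonised standards (Art. 40) or, where applicable,
        common specifications (Art. 41) fully applied by the provider
    """
    if annex_i_section_a:
        # Art. 43(3): follow the relevant sectoral procedure; the Section 2
        # requirements become part of that assessment.
        return Procedure.SECTORAL
    if annex_iii_point == 1:
        # Art. 43(1): with standards/common specifications fully applied, the
        # provider may opt for internal control (or still choose Annex VII);
        # otherwise the Annex VII procedure is mandatory.
        return (Procedure.INTERNAL_CONTROL if standards_fully_applied
                else Procedure.NOTIFIED_BODY)
    if annex_iii_point is not None and 2 <= annex_iii_point <= 8:
        # Art. 43(2): internal control, no notified body involvement.
        return Procedure.INTERNAL_CONTROL
    raise ValueError("system is not covered by Art. 43 in this sketch")
```

Note that paragraph 6 empowers the Commission to amend this allocation by delegated act, so any such encoding would need to track the applicable version of the Regulation.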

Recital 78

The conformity assessment procedure provided by this Regulation should apply in relation to the essential cybersecurity requirements of a product with digital elements covered by a regulation of the European Parliament and of the Council on horizontal cybersecurity requirements for products with digital elements and classified as a high-risk AI system under this Regulation. However, this rule should not result in reducing the necessary level of assurance for critical products with digital elements covered by a regulation of the European Parliament and of the Council on horizontal cybersecurity requirements for products with digital elements. Therefore, by way of derogation from this rule, high-risk AI systems that fall within the scope of this Regulation and are also qualified as important and critical products with digital elements pursuant to a regulation of the European Parliament and of the Council on horizontal cybersecurity requirements for products with digital elements and to which the conformity assessment procedure based on internal control set out in an annex to this Regulation applies, are subject to the conformity assessment provisions of a regulation of the European Parliament and of the Council on horizontal cybersecurity requirements for products with digital elements insofar as the essential cybersecurity requirements of that regulation are concerned. In this case, for all the other aspects covered by this Regulation the respective provisions on conformity assessment based on internal control set out in an annex to this Regulation should apply. Building on the knowledge and expertise of ENISA on the cybersecurity policy and tasks assigned to ENISA under the Regulation (EU) 2019/881 of the European Parliament and of the Council (37), the Commission should cooperate with ENISA on issues related to cybersecurity of AI systems.


(37) Regulation (EU) 2019/881 of the European Parliament and of the Council of 17 April 2019 on ENISA (the European Union Agency for Cybersecurity) and on information and communications technology cybersecurity certification and repealing Regulation (EU) No 526/2013 (Cybersecurity Act) (OJ L 151, 7.6.2019, p. 15).

Recital 123

In order to ensure a high level of trustworthiness of high-risk AI systems, those systems should be subject to a conformity assessment prior to their placing on the market or putting into service.

Recital 124

It is appropriate that, in order to minimise the burden on operators and avoid any possible duplication, for high-risk AI systems related to products which are covered by existing Union harmonisation legislation based on the New Legislative Framework, the compliance of those AI systems with the requirements of this Regulation should be assessed as part of the conformity assessment already provided for in that law. The applicability of the requirements of this Regulation should thus not affect the specific logic, methodology or general structure of conformity assessment under the relevant Union harmonisation legislation.

Recital 125

Given the complexity of high-risk AI systems and the risks that are associated with them, it is important to develop an adequate conformity assessment procedure for high-risk AI systems involving notified bodies, so-called third party conformity assessment. However, given the current experience of professional pre-market certifiers in the field of product safety and the different nature of risks involved, it is appropriate to limit, at least in an initial phase of application of this Regulation, the scope of application of third-party conformity assessment for high-risk AI systems other than those related to products. Therefore, the conformity assessment of such systems should be carried out as a general rule by the provider under its own responsibility, with the only exception of AI systems intended to be used for biometrics.

Recital 126

In order to carry out third-party conformity assessments when so required, notified bodies should be notified under this Regulation by the national competent authorities, provided that they comply with a set of requirements, in particular on independence, competence, absence of conflicts of interests and suitable cybersecurity requirements. Notification of those bodies should be sent by national competent authorities to the Commission and the other Member States by means of the electronic notification tool developed and managed by the Commission pursuant to Article R23 of Annex I to Decision No 768/2008/EC.

Recital 128

In line with the commonly established notion of substantial modification for products regulated by Union harmonisation legislation, it is appropriate that whenever a change occurs which may affect the compliance of a high-risk AI system with this Regulation (e.g. change of operating system or software architecture), or when the intended purpose of the system changes, that AI system should be considered to be a new AI system which should undergo a new conformity assessment. However, changes occurring to the algorithm and the performance of AI systems which continue to ‘learn’ after being placed on the market or put into service, namely automatically adapting how functions are carried out, should not constitute a substantial modification, provided that those changes have been pre-determined by the provider and assessed at the moment of the conformity assessment.

Recital 147

It is appropriate that the Commission facilitates, to the extent possible, access to testing and experimentation facilities to bodies, groups or laboratories established or accredited pursuant to any relevant Union harmonisation legislation and which fulfil tasks in the context of conformity assessment of products or devices covered by that Union harmonisation legislation. This is, in particular, the case as regards expert panels, expert laboratories and reference laboratories in the field of medical devices pursuant to Regulations (EU) 2017/745 and (EU) 2017/746.

Art. 44 AI Act – Certificates

Art. 45 AI Act – Information obligations of notified bodies

Art. 46 AI Act – Derogation from conformity assessment procedure

Art. 47 AI Act – EU declaration of conformity

Art. 48 AI Act – CE marking

Art. 49 AI Act – Registration