
CHAPTER III – High-risk AI systems (Art. 6-49)

Art. 6 AI Act – Classification rules for high-risk AI systems

Art. 7 AI Act – Amendments to Annex III

Art. 8 AI Act – Compliance with the requirements

  1. High-risk AI systems shall comply with the requirements laid down in this Section, taking into account their intended purpose as well as the generally acknowledged state of the art on AI and AI-related technologies. The risk management system referred to in Article 9 shall be taken into account when ensuring compliance with those requirements.
  2. Where a product contains an AI system, to which the requirements of this Regulation as well as requirements of the Union harmonisation legislation listed in Section A of Annex I apply, providers shall be responsible for ensuring that their product is fully compliant with all applicable requirements under applicable Union harmonisation legislation. In ensuring the compliance of high-risk AI systems referred to in paragraph 1 with the requirements set out in this Section, and in order to ensure consistency, avoid duplication and minimise additional burdens, providers shall have a choice of integrating, as appropriate, the necessary testing and reporting processes, information and documentation they provide with regard to their product into documentation and procedures that already exist and are required under the Union harmonisation legislation listed in Section A of Annex I.

Recital 46

High-risk AI systems should only be placed on the Union market, put into service or used if they comply with certain mandatory requirements. Those requirements should ensure that high-risk AI systems available in the Union or whose output is otherwise used in the Union do not pose unacceptable risks to important Union public interests as recognised and protected by Union law. On the basis of the New Legislative Framework, as clarified in the Commission notice ‘The “Blue Guide” on the implementation of EU product rules 2022’ (20), the general rule is that more than one legal act of Union harmonisation legislation, such as Regulations (EU) 2017/745 (21) and (EU) 2017/746 (22) of the European Parliament and of the Council or Directive 2006/42/EC of the European Parliament and of the Council (23), may be applicable to one product, since the making available or putting into service can take place only when the product complies with all applicable Union harmonisation legislation. To ensure consistency and avoid unnecessary administrative burdens or costs, providers of a product that contains one or more high-risk AI systems, to which the requirements of this Regulation and of the Union harmonisation legislation listed in an annex to this Regulation apply, should have flexibility with regard to operational decisions on how to ensure compliance of a product that contains one or more AI systems with all applicable requirements of the Union harmonisation legislation in an optimal manner. AI systems identified as high-risk should be limited to those that have a significant harmful impact on the health, safety and fundamental rights of persons in the Union and such limitation should minimise any potential restriction to international trade.


(20) OJ C 247, 29.6.2022, p. 1.
(21) Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, amending Directive 2001/83/EC, Regulation (EC) No 178/2002 and Regulation (EC) No 1223/2009 and repealing Council Directives 90/385/EEC and 93/42/EEC (OJ L 117, 5.5.2017, p. 1).
(22) Regulation (EU) 2017/746 of the European Parliament and of the Council of 5 April 2017 on in vitro diagnostic medical devices and repealing Directive 98/79/EC and Commission Decision 2010/227/EU (OJ L 117, 5.5.2017, p. 176).
(23) Directive 2006/42/EC of the European Parliament and of the Council of 17 May 2006 on machinery, and amending Directive 95/16/EC (OJ L 157, 9.6.2006, p. 24).

Recital 64

To mitigate the risks from high-risk AI systems placed on the market or put into service and to ensure a high level of trustworthiness, certain mandatory requirements should apply to high-risk AI systems, taking into account the intended purpose and the context of use of the AI system and according to the risk-management system to be established by the provider. The measures adopted by the providers to comply with the mandatory requirements of this Regulation should take into account the generally acknowledged state of the art on AI, be proportionate and effective to meet the objectives of this Regulation. Based on the New Legislative Framework, as clarified in Commission notice ‘The “Blue Guide” on the implementation of EU product rules 2022’, the general rule is that more than one legal act of Union harmonisation legislation may be applicable to one product, since the making available or putting into service can take place only when the product complies with all applicable Union harmonisation legislation. The hazards of AI systems covered by the requirements of this Regulation concern different aspects than the existing Union harmonisation legislation and therefore the requirements of this Regulation would complement the existing body of the Union harmonisation legislation. For example, machinery or medical devices products incorporating an AI system might present risks not addressed by the essential health and safety requirements set out in the relevant Union harmonised legislation, as that sectoral law does not deal with risks specific to AI systems. This calls for a simultaneous and complementary application of the various legislative acts. 
To ensure consistency and to avoid an unnecessary administrative burden and unnecessary costs, providers of a product that contains one or more high-risk AI system, to which the requirements of this Regulation and of the Union harmonisation legislation based on the New Legislative Framework and listed in an annex to this Regulation apply, should have flexibility with regard to operational decisions on how to ensure compliance of a product that contains one or more AI systems with all the applicable requirements of that Union harmonised legislation in an optimal manner. That flexibility could mean, for example a decision by the provider to integrate a part of the necessary testing and reporting processes, information and documentation required under this Regulation into already existing documentation and procedures required under existing Union harmonisation legislation based on the New Legislative Framework and listed in an annex to this Regulation. This should not, in any way, undermine the obligation of the provider to comply with all the applicable requirements.

Art. 9 AI Act – Risk management system

Art. 10 AI Act – Data and data governance

Art. 11 AI Act – Technical documentation

Art. 12 AI Act – Record-keeping

Art. 13 AI Act – Transparency and provision of information to deployers

Art. 14 AI Act – Human oversight

Art. 15 AI Act – Accuracy, robustness and cybersecurity

Art. 16 AI Act – Obligations of providers of high-risk AI systems

Art. 17 AI Act – Quality management system

Art. 18 AI Act – Documentation keeping

Art. 19 AI Act – Automatically generated logs

Art. 20 AI Act – Corrective actions and duty of information

Art. 21 AI Act – Cooperation with competent authorities

Art. 22 AI Act – Authorised representatives of providers of high-risk AI systems

Art. 23 AI Act – Obligations of importers

Art. 24 AI Act – Obligations of distributors

Art. 25 AI Act – Responsibilities along the AI value chain

Art. 26 AI Act – Obligations of deployers of high-risk AI systems

Art. 27 AI Act – Fundamental rights impact assessment for high-risk AI systems

Art. 28 AI Act – Notifying authorities

Art. 29 AI Act – Application of a conformity assessment body for notification

Art. 30 AI Act – Notification procedure

Art. 31 AI Act – Requirements relating to notified bodies

Art. 32 AI Act – Presumption of conformity with requirements relating to notified bodies

Art. 33 AI Act – Subsidiaries of notified bodies and subcontracting

Art. 34 AI Act – Operational obligations of notified bodies

Art. 35 AI Act – Identification numbers and lists of notified bodies

Art. 36 AI Act – Changes to notifications

Art. 37 AI Act – Challenge to the competence of notified bodies

Art. 38 AI Act – Coordination of notified bodies

Art. 39 AI Act – Conformity assessment bodies of third countries

Art. 40 AI Act – Harmonised standards and standardisation deliverables

Art. 41 AI Act – Common specifications

Art. 42 AI Act – Presumption of conformity with certain requirements

Art. 43 AI Act – Conformity assessment

Art. 44 AI Act – Certificates

Art. 45 AI Act – Information obligations of notified bodies

Art. 46 AI Act – Derogation from conformity assessment procedure

Art. 47 AI Act – EU declaration of conformity

Art. 48 AI Act – CE marking

Art. 49 AI Act – Registration