
CHAPTER III – High-risk AI systems (Art. 6-49)

Art. 6 AI Act – Classification rules for high-risk AI systems

Art. 7 AI Act – Amendments to Annex III

Art. 8 AI Act – Compliance with the requirements

Art. 9 AI Act – Risk management system

Art. 10 AI Act – Data and data governance

Art. 11 AI Act – Technical documentation

Art. 12 AI Act – Record-keeping

Art. 13 AI Act – Transparency and provision of information to deployers

Art. 14 AI Act – Human oversight

Art. 15 AI Act – Accuracy, robustness and cybersecurity

Art. 16 AI Act – Obligations of providers of high-risk AI systems

Art. 17 AI Act – Quality management system

Art. 18 AI Act – Documentation keeping

Art. 19 AI Act – Automatically generated logs

Art. 20 AI Act – Corrective actions and duty of information

Art. 21 AI Act – Cooperation with competent authorities

Art. 22 AI Act – Authorised representatives of providers of high-risk AI systems

Art. 23 AI Act – Obligations of importers

Art. 24 AI Act – Obligations of distributors

Art. 25 AI Act – Responsibilities along the AI value chain

Art. 26 AI Act – Obligations of deployers of high-risk AI systems

Art. 27 AI Act – Fundamental rights impact assessment for high-risk AI systems

Art. 28 AI Act – Notifying authorities

Art. 29 AI Act – Application of a conformity assessment body for notification

Art. 30 AI Act – Notification procedure

Art. 31 AI Act – Requirements relating to notified bodies

Art. 32 AI Act – Presumption of conformity with requirements relating to notified bodies

Art. 33 AI Act – Subsidiaries of notified bodies and subcontracting

Art. 34 AI Act – Operational obligations of notified bodies

Art. 35 AI Act – Identification numbers and lists of notified bodies

Art. 36 AI Act – Changes to notifications

Art. 37 AI Act – Challenge to the competence of notified bodies

Art. 38 AI Act – Coordination of notified bodies

Art. 39 AI Act – Conformity assessment bodies of third countries

Art. 40 AI Act – Harmonised standards and standardisation deliverables

Art. 41 AI Act – Common specifications

Art. 42 AI Act – Presumption of conformity with certain requirements

Art. 43 AI Act – Conformity assessment

Art. 44 AI Act – Certificates

Art. 45 AI Act – Information obligations of notified bodies

Art. 46 AI Act – Derogation from conformity assessment procedure

  1. By way of derogation from Article 43 and upon a duly justified request, any market surveillance authority may authorise the placing on the market or the putting into service of specific high-risk AI systems within the territory of the Member State concerned, for exceptional reasons of public security or the protection of life and health of persons, environmental protection or the protection of key industrial and infrastructural assets. That authorisation shall be for a limited period while the necessary conformity assessment procedures are being carried out, taking into account the exceptional reasons justifying the derogation. The completion of those procedures shall be undertaken without undue delay.
  2. In a duly justified situation of urgency for exceptional reasons of public security or in the case of specific, substantial and imminent threat to the life or physical safety of natural persons, law-enforcement authorities or civil protection authorities may put a specific high-risk AI system into service without the authorisation referred to in paragraph 1, provided that such authorisation is requested during or after the use without undue delay. If the authorisation referred to in paragraph 1 is refused, the use of the high-risk AI system shall be stopped with immediate effect and all the results and outputs of such use shall be immediately discarded.
  3. The authorisation referred to in paragraph 1 shall be issued only if the market surveillance authority concludes that the high-risk AI system complies with the requirements of Section 2. The market surveillance authority shall inform the Commission and the other Member States of any authorisation issued pursuant to paragraphs 1 and 2. This obligation shall not cover sensitive operational data in relation to the activities of law-enforcement authorities.
  4. Where, within 15 calendar days of receipt of the information referred to in paragraph 3, no objection has been raised by either a Member State or the Commission in respect of an authorisation issued by a market surveillance authority of a Member State in accordance with paragraph 1, that authorisation shall be deemed justified.
  5. Where, within 15 calendar days of receipt of the notification referred to in paragraph 3, objections are raised by a Member State against an authorisation issued by a market surveillance authority of another Member State, or where the Commission considers the authorisation to be contrary to Union law, or the conclusion of the Member States regarding the compliance of the system as referred to in paragraph 3 to be unfounded, the Commission shall, without delay, enter into consultations with the relevant Member State. The operators concerned shall be consulted and have the possibility to present their views. Having regard thereto, the Commission shall decide whether the authorisation is justified. The Commission shall address its decision to the Member State concerned and to the relevant operators.
  6. Where the Commission considers the authorisation unjustified, it shall be withdrawn by the market surveillance authority of the Member State concerned.
  7. For high-risk AI systems related to products covered by Union harmonisation legislation listed in Section A of Annex I, only the derogations from the conformity assessment established in that Union harmonisation legislation shall apply.
Related recital:

Recital 130

Under certain conditions, rapid availability of innovative technologies may be crucial for health and safety of persons, the protection of the environment and climate change and for society as a whole. It is thus appropriate that under exceptional reasons of public security or protection of life and health of natural persons, environmental protection and the protection of key industrial and infrastructural assets, market surveillance authorities could authorise the placing on the market or the putting into service of AI systems which have not undergone a conformity assessment. In duly justified situations, as provided for in this Regulation, law enforcement authorities or civil protection authorities may put a specific high-risk AI system into service without the authorisation of the market surveillance authority, provided that such authorisation is requested during or after the use without undue delay.

Art. 47 AI Act – EU declaration of conformity

Art. 48 AI Act – CE marking

Art. 49 AI Act – Registration