CHAPTER IX – Post-market monitoring, information sharing and market surveillance (Art. 72–94)

Art. 72 AI Act – Post-market monitoring by providers and post-market monitoring plan for high-risk AI systems

Art. 73 AI Act – Reporting of serious incidents

Art. 74 AI Act – Market surveillance and control of AI systems in the Union market

Art. 75 AI Act – Mutual assistance, market surveillance and control of general-purpose AI systems

  1. Where an AI system is based on a general-purpose AI model, and the model and the system are developed by the same provider, the AI Office shall have powers to monitor and supervise compliance of that AI system with obligations under this Regulation. To carry out its monitoring and supervision tasks, the AI Office shall have all the powers of a market surveillance authority provided for in this Section and Regulation (EU) 2019/1020.
  2. Where the relevant market surveillance authorities have sufficient reason to consider general-purpose AI systems that can be used directly by deployers for at least one purpose that is classified as high-risk pursuant to this Regulation to be non-compliant with the requirements laid down in this Regulation, they shall cooperate with the AI Office to carry out compliance evaluations, and shall inform the Board and other market surveillance authorities accordingly.
  3. Where a market surveillance authority is unable to conclude its investigation of the high-risk AI system because of its inability to access certain information related to the general-purpose AI model despite having made all appropriate efforts to obtain that information, it may submit a reasoned request to the AI Office, by which access to that information shall be enforced. In that case, the AI Office shall supply to the applicant authority without delay, and in any event within 30 days, any information that the AI Office considers to be relevant in order to establish whether a high-risk AI system is non-compliant. Market surveillance authorities shall safeguard the confidentiality of the information that they obtain in accordance with Article 78 of this Regulation. The procedure provided for in Chapter VI of Regulation (EU) 2019/1020 shall apply mutatis mutandis.

Recital 161

It is necessary to clarify the responsibilities and competences at Union and national level as regards AI systems that are built on general-purpose AI models. To avoid overlapping competences, where an AI system is based on a general-purpose AI model and the model and system are provided by the same provider, the supervision should take place at Union level through the AI Office, which should have the powers of a market surveillance authority within the meaning of Regulation (EU) 2019/1020 for this purpose. In all other cases, national market surveillance authorities remain responsible for the supervision of AI systems. However, for general-purpose AI systems that can be used directly by deployers for at least one purpose that is classified as high-risk, market surveillance authorities should cooperate with the AI Office to carry out evaluations of compliance and inform the Board and other market surveillance authorities accordingly. Furthermore, market surveillance authorities should be able to request assistance from the AI Office where the market surveillance authority is unable to conclude an investigation on a high-risk AI system because of its inability to access certain information related to the general-purpose AI model on which the high-risk AI system is built. In such cases, the procedure regarding mutual assistance in cross-border cases in Chapter VI of Regulation (EU) 2019/1020 should apply mutatis mutandis.

Art. 76 AI Act – Supervision of testing in real world conditions by market surveillance authorities

Art. 77 AI Act – Powers of authorities protecting fundamental rights

Art. 78 AI Act – Confidentiality

Art. 79 AI Act – Procedure at national level for dealing with AI systems presenting a risk

Art. 80 AI Act – Procedure for dealing with AI systems classified by the provider as non-high-risk in application of Annex III

Art. 81 AI Act – Union safeguard procedure

Art. 82 AI Act – Compliant AI systems which present a risk

Art. 83 AI Act – Formal non-compliance

Art. 84 AI Act – Union AI testing support structures

Art. 85 AI Act – Right to lodge a complaint with a market surveillance authority

Art. 86 AI Act – Right to explanation of individual decision-making

Art. 87 AI Act – Reporting of infringements and protection of reporting persons

Art. 88 AI Act – Enforcement of the obligations of providers of general-purpose AI models

Art. 89 AI Act – Monitoring actions

Art. 90 AI Act – Alerts of systemic risks by the scientific panel

Art. 91 AI Act – Power to request documentation and information

Art. 92 AI Act – Power to conduct evaluations

Art. 93 AI Act – Power to request measures

Art. 94 AI Act – Procedural rights of economic operators of the general-purpose AI model