CHAPTER IX – Post-market monitoring, information sharing, market surveillance (Art. 72-94)

Art. 72 AI Act – Post-market monitoring by providers and post-market monitoring plan for high-risk AI systems

Art. 73 AI Act – Reporting of serious incidents

Art. 74 AI Act – Market surveillance and control of AI systems in the Union market

Art. 75 AI Act – Mutual assistance, market surveillance and control of general-purpose AI systems

Art. 76 AI Act – Supervision of testing in real world conditions by market surveillance authorities

Art. 77 AI Act – Powers of authorities protecting fundamental rights

Art. 78 AI Act – Confidentiality

Art. 79 AI Act – Procedure at national level for dealing with AI systems presenting a risk

Art. 80 AI Act – Procedure for dealing with AI systems classified by the provider as non-high-risk in application of Annex III

Art. 81 AI Act – Union safeguard procedure

Art. 82 AI Act – Compliant AI systems which present a risk

Art. 83 AI Act – Formal non-compliance

Art. 84 AI Act – Union AI testing support structures

Art. 85 AI Act – Right to lodge a complaint with a market surveillance authority

Art. 86 AI Act – Right to explanation of individual decision-making

Art. 87 AI Act – Reporting of infringements and protection of reporting persons

Art. 88 AI Act – Enforcement of the obligations of providers of general-purpose AI models

Art. 89 AI Act – Monitoring actions

Art. 90 AI Act – Alerts of systemic risks by the scientific panel

Art. 91 AI Act – Power to request documentation and information

  1. The Commission may request the provider of the general-purpose AI model concerned to provide the documentation drawn up by the provider in accordance with Articles 53 and 55, or any additional information that is necessary for the purpose of assessing compliance of the provider with this Regulation.
  2. Before sending the request for information, the AI Office may initiate a structured dialogue with the provider of the general-purpose AI model.
  3. Upon a duly substantiated request from the scientific panel, the Commission may issue a request for information to a provider of a general-purpose AI model, where the access to information is necessary and proportionate for the fulfilment of the tasks of the scientific panel under Article 68(2).
  4. The request for information shall state the legal basis and the purpose of the request, specify what information is required, set a period within which the information is to be provided, and indicate the fines provided for in Article 101 for supplying incorrect, incomplete or misleading information.
  5. The provider of the general-purpose AI model concerned, or its representative shall supply the information requested. In the case of legal persons, companies or firms, or where the provider has no legal personality, the persons authorised to represent them by law or by their statutes, shall supply the information requested on behalf of the provider of the general-purpose AI model concerned. Lawyers duly authorised to act may supply information on behalf of their clients. The clients shall nevertheless remain fully responsible if the information supplied is incomplete, incorrect or misleading.

Recital 164

The AI Office should be able to take the necessary actions to monitor the effective implementation of and compliance with the obligations for providers of general-purpose AI models laid down in this Regulation. The AI Office should be able to investigate possible infringements in accordance with the powers provided for in this Regulation, including by requesting documentation and information, by conducting evaluations, as well as by requesting measures from providers of general-purpose AI models. When conducting evaluations, in order to make use of independent expertise, the AI Office should be able to involve independent experts to carry out the evaluations on its behalf. Compliance with the obligations should be enforceable, inter alia, through requests to take appropriate measures, including risk mitigation measures in the case of identified systemic risks as well as restricting the making available on the market, withdrawing or recalling the model. As a safeguard, where needed beyond the procedural rights provided for in this Regulation, providers of general-purpose AI models should have the procedural rights provided for in Article 18 of Regulation (EU) 2019/1020, which should apply mutatis mutandis, without prejudice to more specific procedural rights provided for by this Regulation.

Art. 92 AI Act – Power to conduct evaluations

Art. 93 AI Act – Power to request measures

Art. 94 AI Act – Procedural rights of economic operators of the general-purpose AI model