CHAPTER IX – Post-market monitoring, information sharing and market surveillance (Art. 72-94)

Art. 72 AI Act – Post-market monitoring by providers and post-market monitoring plan for high-risk AI systems

  1. Providers shall establish and document a post-market monitoring system in a manner that is proportionate to the nature of the AI technologies and the risks of the high-risk AI system.
  2. The post-market monitoring system shall actively and systematically collect, document and analyse relevant data which may be provided by deployers or which may be collected through other sources on the performance of high-risk AI systems throughout their lifetime, and which allow the provider to evaluate the continuous compliance of AI systems with the requirements set out in Chapter III, Section 2. Where relevant, post-market monitoring shall include an analysis of the interaction with other AI systems. This obligation shall not cover sensitive operational data of deployers which are law-enforcement authorities.
  3. The post-market monitoring system shall be based on a post-market monitoring plan. The post-market monitoring plan shall be part of the technical documentation referred to in Annex IV. The Commission shall adopt an implementing act laying down detailed provisions establishing a template for the post-market monitoring plan and the list of elements to be included in the plan by 2 February 2026. That implementing act shall be adopted in accordance with the examination procedure referred to in Article 98(2).
  4. For high-risk AI systems covered by the Union harmonisation legislation listed in Section A of Annex I, where a post-market monitoring system and plan are already established under that legislation, in order to ensure consistency, avoid duplications and minimise additional burdens, providers shall have a choice of integrating, as appropriate, the necessary elements described in paragraphs 1, 2 and 3 using the template referred to in paragraph 3 into systems and plans already existing under that legislation, provided that it achieves an equivalent level of protection.
  The first subparagraph of this paragraph shall also apply to high-risk AI systems referred to in point 5 of Annex III placed on the market or put into service by financial institutions that are subject to requirements under Union financial services law regarding their internal governance, arrangements or processes.
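Paragraph 2 describes, in effect, a data-collection and evaluation loop: systematically gather performance data from deployers and other sources, exclude sensitive operational data of law-enforcement deployers, and use the collected records to evaluate continuous compliance. A minimal illustrative sketch of such a loop follows; every class, field and method name is an assumption made for illustration, not a term defined by the Act, and the threshold check is a toy stand-in for the Chapter III, Section 2 requirements:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class PerformanceRecord:
    """One piece of performance data, as collected under Art. 72(2)."""
    recorded_on: date
    source: str            # e.g. "deployer feedback", "telemetry"
    metric: str            # e.g. "accuracy"
    value: float
    from_law_enforcement_ops: bool = False  # sensitive operational data

@dataclass
class PostMarketMonitoringSystem:
    """Sketch of a provider-side post-market monitoring system."""
    system_id: str
    records: list[PerformanceRecord] = field(default_factory=list)

    def collect(self, record: PerformanceRecord) -> None:
        # Art. 72(2): the obligation does not cover sensitive operational
        # data of deployers which are law-enforcement authorities.
        if record.from_law_enforcement_ops:
            return
        self.records.append(record)

    def evaluate_compliance(self, metric: str, threshold: float) -> bool:
        # Toy compliance check: all collected values for a metric must
        # stay at or above an assumed threshold.
        values = [r.value for r in self.records if r.metric == metric]
        return bool(values) and min(values) >= threshold
```

As a usage example, a record flagged as law-enforcement operational data is silently skipped by `collect`, while the remaining records feed the compliance evaluation.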

Recital 155

In order to ensure that providers of high-risk AI systems can take into account the experience on the use of high-risk AI systems for improving their systems and the design and development process or can take any possible corrective action in a timely manner, all providers should have a post-market monitoring system in place. Where relevant, post-market monitoring should include an analysis of the interaction with other AI systems including other devices and software. Post-market monitoring should not cover sensitive operational data of deployers which are law enforcement authorities. This system is also key to ensure that the possible risks emerging from AI systems which continue to ‘learn’ after being placed on the market or put into service can be more efficiently and timely addressed. In this context, providers should also be required to have a system in place to report to the relevant authorities any serious incidents resulting from the use of their AI systems, meaning an incident or malfunctioning leading to death or serious damage to health, serious and irreversible disruption of the management and operation of critical infrastructure, infringements of obligations under Union law intended to protect fundamental rights or serious damage to property or the environment.

Art. 73 AI Act – Reporting of serious incidents

Art. 74 AI Act – Market surveillance and control of AI systems in the Union market

Art. 75 AI Act – Mutual assistance, market surveillance and control of general-purpose AI systems

Art. 76 AI Act – Supervision of testing in real world conditions by market surveillance authorities

Art. 77 AI Act – Powers of authorities protecting fundamental rights

Art. 78 AI Act – Confidentiality

Art. 79 AI Act – Procedure at national level for dealing with AI systems presenting a risk

Art. 80 AI Act – Procedure for dealing with AI systems classified by the provider as non-high-risk in application of Annex III

Art. 81 AI Act – Union safeguard procedure

Art. 82 AI Act – Compliant AI systems which present a risk

Art. 83 AI Act – Formal non-compliance

Art. 84 AI Act – Union AI testing support structures

Art. 85 AI Act – Right to lodge a complaint with a market surveillance authority

Art. 86 AI Act – Right to explanation of individual decision-making

Art. 87 AI Act – Reporting of infringements and protection of reporting persons

Art. 88 AI Act – Enforcement of the obligations of providers of general-purpose AI models

Art. 89 AI Act – Monitoring actions

Art. 90 AI Act – Alerts of systemic risks by the scientific panel

Art. 91 AI Act – Power to request documentation and information

Art. 92 AI Act – Power to conduct evaluations

Art. 93 AI Act – Power to request measures

Art. 94 AI Act – Procedural rights of economic operators of the general-purpose AI model