
The AI Act is a new regulation in the European Union that aims to provide people with safe and trustworthy artificial intelligence. But how does this regulation influence medical devices? Today, we will go through the most important information on the AI Act in healthcare.
What is the AI Act?
The AI Act (Artificial Intelligence Act/AIA) regulates the development and usage of artificial intelligence in the European Union. This law is an element of the European AI Strategy, which aims to develop human-centric technologies.
The AI Act consists of thirteen Chapters and thirteen Annexes, which comprehensively describe how AI may be used within EU countries and what developers of these systems must do to ensure the highest level of safety for their users. You can read the full text of the AI Act here.
Who supports the AI Act?
The effective implementation of the AIA will be supported by the European AI Office. This body will analyse emerging systemic risks, investigate incidents of non-compliance with the regulation, and more. One of its goals is to provide manufacturers with codes of practice; we therefore recommend following the AI Office website.
When will the AI Act be adopted?
You might wonder: is the AI Act in force? The European Parliament and the Council of the EU reached a provisional agreement in December 2023. The regulation was adopted on 13 March 2024 and published in the Official Journal on 12 July 2024.
The rules take effect twenty days after publication in the Official Journal. So, when is that? The AI Act entered into force on 1 August 2024.
It will fully apply two years after this date (2 August 2026), but some provisions apply earlier or later. As stated on the official website, the AI Act timeline is as follows:
AI prohibitions will take effect after six months.
The governance rules and the obligations for general-purpose AI models will become applicable after 12 months.
The rules for AI systems embedded into regulated products will apply after 36 months. This date is especially important, as it concerns medical devices.
This AI Act timeline allows manufacturers a reasonable period to ensure their products and processes are in compliance with the new regulation.
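As an illustration, the timeline above can be turned into a small lookup: given any date, which obligations already apply? Below is a minimal sketch in Python; the calendar dates are the applicability dates corresponding to the 6-, 12-, 24-, and 36-month milestones counted from the 1 August 2024 entry into force.

```python
from datetime import date

# AI Act applicability milestones (the Act entered into force on 1 August 2024)
AI_ACT_MILESTONES = {
    date(2025, 2, 2): "prohibitions on unacceptable-risk AI practices",
    date(2025, 8, 2): "governance rules and general-purpose AI model obligations",
    date(2026, 8, 2): "general application of the AI Act",
    date(2027, 8, 2): "rules for AI embedded in regulated products (incl. medical devices)",
}

def milestones_in_force(on: date) -> list[str]:
    """Return the milestones that already apply on the given date."""
    return [label for deadline, label in sorted(AI_ACT_MILESTONES.items())
            if on >= deadline]
```

For example, `milestones_in_force(date(2025, 6, 1))` returns only the prohibitions entry, since only the six-month milestone has passed by then.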
9 things to know about the AI Act in healthcare
Now that we've gone through the crucial information about the regulation, let's focus on the AI Act's impact on the healthcare industry.
1. Meet the AIMDSW
The AI Act defines AI systems as “a machine-based system designed to operate with varying levels of autonomy, that may exhibit adaptiveness after deployment and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments” (source: AI Act).
But how can we refer to AI-based medical software? In a document prepared by the TÜV AI.LAB, we find the abbreviation AIMDSW, which stands for Artificial Intelligence in Medical Device Software. And how can AI be used in medical software? For example, an app might support the diagnosis of a disease by interpreting a patient's health data. You can find other examples of this kind of technology here.
GOOD TO KNOW
AIMDSW stands for Artificial Intelligence in Medical Device Software.
Although AIMDSW is not yet a commonly used term, it's worth remembering this abbreviation as the use of AI in healthcare grows.
2. AI Act and MDR/IVDR harmonisation
You should know that MDR (Medical Device Regulation) and IVDR (In Vitro Diagnostic Device Regulation) are included in the List of Union Harmonisation Legislation of the AI Act (Annex I).
This is excellent news, as it means the EU aims to make conformity assessments under the MDR/IVDR and the AI Act more complementary, logical, and straightforward.
3. AI Act and MDR risk classes relationship
Another thing you should learn about the AI Act in healthcare is the different approaches to devices’ risk classification.
As you probably already know, the MDR divides medical devices into four risk classes: I, IIa, IIb, and III. Depending on the risk class, your medical software will have to meet different requirements to maintain compliance with the MDR. It's worth noting that for classes IIa and above, you will have to undergo a conformity assessment with a notified body.
TIP
Do you plan to go through MDR certification? Read our guide on medical software certification.

The AI Act is also a risk-based regulation. This regulation classifies artificial intelligence systems based on four levels of risk:
minimal risk – no conformity assessment is needed for AI systems of this class (e.g., spam filters),
limited risk – certain transparency obligations apply (e.g., people talking to chatbots must be informed that they are conversing with AI),
high risk – these AI systems require comprehensive documentation and a conformity assessment by a notified body (e.g., AI systems that could put people's life and health at risk, such as medical devices),
unacceptable risk – these AI systems are prohibited (e.g., AI systems that deploy subliminal techniques beyond a person's consciousness).
TIP
You can find information about prohibited AI systems in Chapter II, Article 5 of the AI Act. We recommend familiarising yourself with it to make sure that your systems are not considered unacceptable by the EU.
The correlation between MDR/IVDR and AI Act classes
The general rule, stated in Chapter III, Section 1 and Recital 50 of the AIA's preamble, is that if medical software that uses AI requires a third-party conformity assessment under the MDR, then the AI system should be considered high-risk under the AI Act.
Medical devices of class IIa and higher that incorporate AI solutions will therefore automatically be categorised as high-risk. During the conformity assessment conducted under the MDR (Medical Device Regulation), the notified body will also evaluate compliance with the requirements established by the AI Act. This ensures that all necessary medical safety and AI standards are thoroughly met.
TIP
If you develop software that uses artificial intelligence solutions (e.g. skin cancer detection aids) and it is classified as a class IIa, IIb or III medical device under the MDR, the software will automatically be classified as high-risk AI.
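The classification rule above can be sketched as a simple decision function. This is a deliberately simplified illustration: it ignores edge cases such as class I sterile or measuring devices, which also involve a notified body for limited aspects.

```python
def ai_act_risk_for_mdr_class(mdr_class: str) -> str:
    """Illustrative mapping from MDR risk class to AI Act risk level,
    per the general rule: devices requiring a notified-body conformity
    assessment (class IIa and above) are high-risk under the AI Act.
    Simplified sketch; not a substitute for a regulatory assessment."""
    needs_notified_body = {"IIa", "IIb", "III"}
    if mdr_class in needs_notified_body:
        return "high risk"
    if mdr_class == "I":
        return "not automatically high risk (assess case by case)"
    raise ValueError(f"unknown MDR class: {mdr_class!r}")
```

So a skin cancer detection aid classified as class IIa would map to `"high risk"`.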
4. AI Act compliance process step-by-step
First of all, you should remember that the conformity assessment obligations of the AI Act concern only high-risk AI solutions. If you have developed an AIMDSW, you will go through a process consisting of five crucial steps:
Your company's quality management system (QMS) is crucial for compliance with various regulations. Thus, we suggest that you review your QMS to guarantee alignment with the AI Act requirements.
To ensure comprehensive compliance, we encourage you to familiarise yourself with Article 11 and Annex IV of the AI Act, which outline the necessary details for technical documentation. When preparing your technical documentation under the MDR, incorporate the additional requirements specified by the AI Act. This will ensure that your documentation meets both regulatory standards.
Undergo a single conformity assessment covering both regulations, the AI Act and the MDR, involving a notified body.
Register your system in the EU database, as all high-risk AI systems must be registered there. Whether this will be a separate database from the MDR's is not yet known.
Create a comprehensive declaration of conformity that addresses the requirements of both the MDR and the AI Act.
And that's it! Now your AI system can be placed on the market. However, remember that if the AI system changes during its lifecycle, you must return to the second step. You are also obliged to conduct post-market monitoring to ensure ongoing compliance with the MDR and the AI Act.
5. Requirements for compliance with AI Act
There are many requirements an AIMDSW manufacturer must meet to comply with the AI Act if their system is considered high-risk:
risk management system – “a continuous iterative process planned and run throughout the entire lifecycle of a high-risk AI system, requiring regular systematic review and updating”,
data governance – you have to apply appropriate data governance and management practices to the training, validation, and testing data sets,
technical documentation – information including intended purpose, model’s training and testing process, description of the user interface, and more (you can find information about technical documentation in Annex IV),
record-keeping – you have to maintain automatic recording of events over the AI’s lifetime,
transparency and provision of information to deployers – “AI systems shall (...) ensure that their operation is sufficiently transparent to enable deployers to interpret a system’s output and use it appropriately” (more in Chapter III, Section 2),
human oversight – AI system should have “human-machine interface tools, that can be effectively overseen by natural persons during the period in which they are in use”,
cybersecurity – AI systems should “achieve an appropriate level of accuracy, robustness, and cybersecurity, and that they perform consistently in those respects throughout their lifecycle” (source: AI Act).
You can read about the requirements for high-risk AI systems in Chapter III (Section 2) of the AI Act. What’s more, in Chapter III, Section 3 you will find the obligations of providers and deployers of high-risk AI systems.
6. A more costly and time-consuming compliance process
Although the AI Act provides us with safer and more trustworthy AI, it sets some challenges for medical software manufacturers.
While we can't know for sure yet, the combined conformity assessment by a notified body against both the MDR/IVDR and the AI Act is likely to increase the cost of the process and the time it takes to complete.
The audits will need to cover a broader range of requirements, including those specific to medical devices under the MDR and those related to AI systems under the AI Act, making it necessary to allocate more notified body resources (hours).
Many regulatory experts point out that while this scenario might be an inconvenience for large companies, startups and SMEs (small and medium-sized enterprises) might face an impassable barrier. There is therefore a risk that some AI-based healthcare solutions will never be placed on the market because companies won't be able to pay for notified body services.
7. New tasks for notified bodies
The AIA brings challenges not only for medical software manufacturers but also for notified bodies. These organisations must conduct conformity assessments according to the MDR/IVDR and the AI Act.
The AI Act sets out a number of requirements for notified bodies that will be authorised to assess the compliance of AI-based solutions with the EU regulation (Chapter III, Section 4, Article 31). For now, no notified body meets these requirements.
This means that notified bodies must build up their competence in AI-related technologies. The process might involve hiring more experts and training existing staff to provide manufacturers with the highest quality of service.
8. Non-compliance penalties for healthcare in the AI Act
It is worth being aware that the European Union will impose penalties on AI manufacturers that do not comply with the requirements set out by the AI Act.
For companies that don't comply with the prohibition of AI practices pointed out in Article 5, penalties could reach €35 million or up to 7% of a company's total worldwide annual turnover for the preceding financial year, whichever is higher.
What’s more, “non-compliance of an AI system with any (...) provisions related to operators or notified bodies (...), shall be subject to administrative fines of up to €15 million or, if the offender is a company, up to 3% of its total worldwide annual turnover for the preceding financial year, whichever is higher” (source: AIA).
Last but not least, supplying incorrect, incomplete, or misleading information to notified bodies and national competent authorities is subject to fines of up to €7.5 million or up to 1% of the company’s total worldwide annual turnover for the preceding financial year, whichever is higher.
You will find all the necessary information about penalties in Chapter XII (Penalties) of the AI Act.
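The "whichever is higher" rule for companies is simple arithmetic, sketched below; the tier constants are taken from the figures above.

```python
def max_fine_eur(turnover_eur: float, fixed_cap_eur: float, pct: float) -> float:
    """Maximum AI Act fine for a company: the fixed cap or the given
    share of worldwide annual turnover, whichever is HIGHER."""
    return max(fixed_cap_eur, turnover_eur * pct)

# The three penalty tiers described above: (fixed cap in EUR, share of turnover)
PROHIBITED_PRACTICES = (35_000_000, 0.07)   # Article 5 violations
OTHER_PROVISIONS = (15_000_000, 0.03)       # e.g., operator/notified body provisions
MISLEADING_INFORMATION = (7_500_000, 0.01)  # incorrect or misleading information
```

For a company with €1 billion in annual turnover, for instance, the Article 5 cap would be max(€35M, 7% × €1B) = €70 million.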
9. Keep an eye on ISO 42001
December 2023 was a groundbreaking time for AI regulation. That month, ISO/IEC 42001, the first management system standard for artificial intelligence, was published after almost 3.5 years of work by the standardisation committee.
ISO 42001 “specifies requirements for establishing, implementing, maintaining, and continually improving an Artificial Intelligence Management System (AIMS) within organisations”.
Many assume that ISO 42001 will become a harmonised European standard (harmonisation here means reducing conflicting and inconsistent definitions, concepts, and terminology between standards). Compliance with ISO 42001 may thus help you meet the requirements stated in the AI Act. We therefore recommend keeping an eye on this standard, too.
GOOD TO KNOW
What if you have implemented multiple ISO standards in your organisation? The solution is to develop an Integrated Management System, which will allow you to optimise internal processes and improve your company’s performance. It can be a challenge, but fortunately, ISO created a guide on maintaining several standards, which you can access here.

So, how to prepare for the AI Act in healthcare?
The AIA is the first law to regulate AI in the European market. While adapting to it may be challenging, do not worry. We have a few recommendations that might help you through this process.
Verify whether your medical software is an AIMDSW. If you don’t use the technologies specified in the AI Act, you don’t have to do anything until this changes.
If you implement an AI system in your medical software and it is classified as medical device class IIa or higher according to the MDR, it is advisable to familiarise yourself with the information regarding high-risk AI systems and the requirements for them (Chapter III of the AI Act).
TIP
An NGO named the Future of Life Institute (FLI) provides an AI Act Compliance Checker. We recommend using the form on their website to check whether your system requires a conformity assessment.
Familiarise yourself with the transition periods. Find out how much time you have to conduct a conformity assessment (check Article 111), then talk to your notified body. Don’t wait to find out whether they perform assessments under the AI Act.
Consult the European AI Office if you have any questions concerning the AI Act’s role in healthcare. The Office aims to help companies comply with the AI Act and is thus the best source of information on this regulation.
Last but not least, we would like to offer you a free Regulatory Consultation with our experts. During this meeting, you can ask all the questions you might have about developing a medical device in accordance with European regulations, such as the AI Act or MDR/IVDR.