Introduction
With the adoption of the Cyber Resilience Act (CRA, Regulation (EU) 2024/2847) and the Artificial Intelligence Act (AI Act, Regulation (EU) 2024/1689), the European Union has created two landmark legislative acts that will fundamentally shape the digital landscape.
For companies developing products that contain both digital elements and AI components, a crucial question arises:
How do these two regulations interact with each other?

Executive Summary
In brief: The AI Act generally takes precedence. However, products with digital elements that fall within the scope of the CRA and are simultaneously classified as high-risk AI systems within the meaning of Article 6 of the AI Act must additionally fulfill the essential cybersecurity requirements of the CRA.
If these high-risk AI systems meet the cybersecurity requirements set out in Annex I Parts I and II of the CRA, they are presumed to also fulfill the requirements under Article 15 of the AI Act.
By way of exception, the CRA takes precedence for products classified as “important” or “critical” products with digital elements according to Annex III or IV of the CRA. However, this precedence applies exclusively with regard to cybersecurity requirements.
The Central Interface: Article 12 CRA
The main provision for coordination between both regulations is found in Article 12 of the CRA (“High-risk AI systems”). This is supplemented by Recitals 63 to 65 and Article 52(14) CRA on market surveillance.
The legislator has recognized that many products may simultaneously fall under both regulations – particularly when dealing with high-risk AI systems that also qualify as products with digital elements.
The Principle of Conformity Presumption
Dual Applicability
Products that fall within the scope of the CRA and are simultaneously classified as high-risk AI systems under Article 6 AI Act must, in principle, comply with both regulatory frameworks. However, the CRA provides significant relief through the principle of conformity presumption.
The Conformity Presumption in Practice
When a high-risk AI system:
- meets the essential cybersecurity requirements in Annex I Part I of the CRA, and
- is developed under manufacturer processes that comply with the requirements in Annex I Part II of the CRA,
it is automatically presumed that the cybersecurity requirements according to Article 15 AI Act are also fulfilled. This compliance must be documented in the EU declaration of conformity issued under the CRA.
Practical Note: This provision avoids duplicate compliance work and creates legal certainty for manufacturers.
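The two-part test above can be expressed as a small decision function. This is an illustrative sketch only: the class and field names are hypothetical shorthand for the legal criteria, and the legal text alone is authoritative.

```python
# Illustrative sketch of the Article 12 CRA conformity presumption.
# All names are hypothetical labels for the legal criteria, not terms
# taken from either regulation.
from dataclasses import dataclass

@dataclass
class Product:
    is_high_risk_ai_system: bool   # classified under Article 6 AI Act
    meets_cra_annex1_part1: bool   # essential cybersecurity requirements
    meets_cra_annex1_part2: bool   # manufacturer's processes per Annex I Part II

def presumed_conform_art15_ai_act(p: Product) -> bool:
    """Conformity with Article 15 AI Act is presumed only when the
    high-risk AI system satisfies BOTH parts of Annex I CRA."""
    return (p.is_high_risk_ai_system
            and p.meets_cra_annex1_part1
            and p.meets_cra_annex1_part2)
```

Note that all three conditions must hold together; meeting Annex I Part I alone, for example, does not trigger the presumption.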
Enhanced Risk Assessment for AI Systems
AI-Specific Threat Scenarios
When conducting the risk assessment required under the CRA, manufacturers of high-risk AI systems must pay special attention to AI-specific cyber threats:
- Data Poisoning: Manipulation of training data to corrupt AI behavior
- Adversarial Attacks: Targeted inputs to deceive the AI system
- Model Extraction: Unauthorized access to the trained model
- Manipulation of system behavior and performance
Fundamental Rights Protection as Assessment Criterion
It is particularly noteworthy that the risk assessment must also consider potential impacts on fundamental rights according to the AI Act. This establishes a direct connection between technical cybersecurity and legal fundamental rights protection.
The Conformity Assessment Procedure: A Complex Regulatory System
General Rule: Priority of the AI Act
As a general rule, the conformity assessment procedure under the AI Act takes precedence. This means:
- The procedure provided in Article 43 AI Act also applies to the assessment of CRA cybersecurity requirements
- Notified bodies under the AI Act are also responsible for CRA conformity, provided they meet the requirements of Article 39 CRA
The Exception: Priority of the CRA
However, CRA conformity assessment procedures take precedence when the following conditions are cumulatively met:
- Product classification under the CRA: either
  - the product is classified as an “important product with digital elements” (Annex III CRA) and is subject to the procedures under Article 32(2) and (3) CRA, or
  - the product is classified as a “critical product with digital elements” (Annex IV CRA) and requires a European cybersecurity certificate or is subject to Article 32(3) CRA
- Simultaneously: the conformity assessment procedure under the AI Act is based on internal control according to Annex VI AI Act
Important: In these exceptional cases, CRA priority applies only to cybersecurity aspects. All other AI Act requirements continue to be assessed according to the internal control procedure of the AI Act.
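The cumulative test described above can be sketched as a decision function. The parameter names are illustrative abbreviations of the legal criteria; nothing here substitutes for a case-by-case legal analysis.

```python
# Hedged sketch: the precedence rules between CRA and AI Act conformity
# assessment, as described above. Parameter names are illustrative only.
def conformity_assessment_regime(
    important_product_art32: bool,   # Annex III CRA, Art. 32(2)/(3) procedures
    critical_product: bool,          # Annex IV CRA (certificate or Art. 32(3))
    ai_act_internal_control: bool,   # Annex VI AI Act (internal control)
) -> str:
    """Return which regime governs the cybersecurity assessment."""
    cra_classification = important_product_art32 or critical_product
    if cra_classification and ai_act_internal_control:
        # Exception: CRA procedures apply, but only to cybersecurity
        # aspects; all other AI Act requirements remain under the
        # AI Act's internal control procedure.
        return "CRA (cybersecurity aspects only)"
    # General rule: the Article 43 AI Act procedure also covers the
    # CRA cybersecurity requirements.
    return "AI Act (Article 43)"
```

The sketch makes the cumulative character visible: if either leg of the test fails, the general rule (priority of the AI Act) applies.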
Synergies and Practical Benefits
AI Regulatory Sandboxes
An important advantage for innovators: Manufacturers of products falling under both regulations can participate in AI regulatory sandboxes according to Article 57 AI Act. This enables controlled testing environments for innovative developments.
Coordinated Market Surveillance
Market surveillance is conducted in a coordinated manner:
- AI Act authorities are also responsible for CRA aspects of high-risk AI systems
- Close cooperation with CRA market surveillance authorities, CSIRTs, and ENISA
- Information exchange about relevant findings between authorities
Practical Recommendations
- Early Classification: Determine early whether your product falls under both regulations
- Integrated Compliance Strategy: Develop a holistic compliance strategy that considers both regulatory frameworks
- Documentation: Utilize the conformity presumption through careful documentation of CRA compliance
- Risk Management: Implement comprehensive risk management covering both classical cybersecurity and AI-specific threats
Legal Living Hub supports you as an equal partner with modern data protection consulting and AI compliance.
Conclusion
The overlap provisions between CRA and AI Act demonstrate the European legislator’s commitment to creating practical solutions despite complex regulation. The conformity presumption and coordinated market surveillance are important instruments for avoiding duplicate burdens.
At the same time, practical application remains complex and requires careful analysis on a case-by-case basis. Companies should seek legal advice early and align their compliance processes accordingly.
Successfully navigating this regulatory landscape is increasingly becoming a competitive advantage – both in terms of legal certainty and regarding the trust of customers and partners in the security and legal compliance of offered products.
This article provides initial guidance on the overlap provisions between CRA and AI Act. For specific legal advice on your particular case, we recommend consulting specialized legal advisors.
Date: November 2024
Legal Basis: Regulation (EU) 2024/2847 (CRA), Regulation (EU) 2024/1689 (AI Act)




