Mar 20, 2026
Legal AI Journal
Compliance | March 11, 2026

EU AI Act: Navigating Importer Obligations & Risks

AI Research Brief | 12 min read | 3 sources
Illustration: Legal AI Journal (abstract image representing legal compliance and AI systems in the EU)

The EU AI Act imposes stringent obligations on importers of AI systems, particularly those classified as high-risk. This article dissects the legal framework, outlining critical compliance requirements and potential liabilities for EU-established entities bringing non-EU AI systems to market. Understanding these duties is paramount for mitigating significant administrative fines and ensuring market access.

The EU AI Act (Regulation (EU) 2024/1689) received final Council approval on 21 May 2024 and entered into force on 1 August 2024, establishing a landmark regulatory framework for artificial intelligence. For entities operating within the European Union, a critical aspect of this legislation concerns the role and responsibilities of AI importers. These actors are not merely logistical intermediaries; they function as regulatory gatekeepers, bearing significant legal duties and potential liabilities, particularly for high-risk AI systems originating from outside the EU. Understanding the precise definition of an importer, the scope of their obligations, and the severe penalties for non-compliance is now an urgent imperative for any EU-established entity engaged in the commercial supply of non-EU AI systems.

Defining the AI Importer Role Under EU Law

The EU AI Act meticulously defines the role of an “importer” to establish clear accountability within the AI system supply chain. An importer is an EU-established (or EU-located) natural or legal person who places an AI system on the Union market, where that system bears the name or trademark of a non-EU provider. This definition is crucial, as it designates the importer as typically the first EU actor to commercially make a branded AI system available within the EU.

Operationalizing Key Definitions

Several definitions operationalize the importer’s role:

  • Placing on the market: This refers to the first making available of an AI system (or general-purpose AI model) on the Union market.
  • Making available on the market: This encompasses any supply for distribution or use in the Union in the course of a commercial activity, whether paid or free.
  • Putting into service: This applies when an AI system is supplied for first use directly to a deployer, or for own use in the Union, for its intended purpose.

Critically, an entity is classified as an “importer” only if it is the first EU commercial supplier of a third-country-branded AI system. If the system is placed under the EU entity's own name or trademark, that entity may instead be classified as a provider, incurring a different, more extensive set of obligations under the Act.

Scope “Hooks” and Extraterritorial Reach

Article 2 of the AI Act extends its scope to importers and distributors, and it applies to providers placing AI systems on the Union market regardless of their establishment location. Furthermore, it applies to providers and deployers located outside the EU if the AI system's output is used in the EU. This extraterritorial reach often necessitates the establishment of EU distribution and importer structures, making the importer role a key compliance touchpoint. The Act explicitly states it is “without prejudice” to existing EU data protection law (e.g., GDPR) and other consumer/product safety legal acts, meaning importers often face a cumulative stack of regulatory obligations.

Importer Obligations for High-Risk AI Systems

The core obligations for importers are primarily triggered when dealing with high-risk AI systems. These systems are identified either by falling under an Annex III category or by being a safety component or product under Annex I requiring third-party assessment. Article 23 of the AI Act details these specific duties.

Pre-Market Verification and Due Diligence

Before placing a high-risk AI system on the market, importers must conduct thorough verification. This includes ensuring that the provider has:

  • Carried out the necessary conformity assessment (as per Article 43).
  • Prepared the required technical documentation (as per Article 11 and Annex IV).
  • Affixed the CE marking.
  • Issued an EU declaration of conformity.
  • Provided all necessary instructions and documentation.

Importers must maintain records of these verifications, which serve as crucial evidence of due diligence. The absence of an authorised representative for non-EU providers of high-risk AI systems, where required by Article 22, constitutes a formal non-compliance that regulators actively scrutinize.
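As a way of making these pre-market checks auditable, an importer's verification record could be modelled as a simple data structure with a gating rule. This is a minimal illustrative sketch, not text from the Act: the field names, the `PreMarketVerification` class, and the all-checks-must-pass rule are my own framing of the Article 23 duties listed above.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PreMarketVerification:
    """Hypothetical record of an importer's Article 23 pre-market checks."""
    system_id: str
    checked_on: date
    conformity_assessment_done: bool = False   # Article 43 assessment carried out
    technical_documentation_ok: bool = False   # Article 11 / Annex IV documentation
    ce_marking_affixed: bool = False
    eu_declaration_issued: bool = False
    instructions_provided: bool = False

    def may_place_on_market(self) -> bool:
        """All five checks must pass before the system is placed on the market."""
        return all([
            self.conformity_assessment_done,
            self.technical_documentation_ok,
            self.ce_marking_affixed,
            self.eu_declaration_issued,
            self.instructions_provided,
        ])
```

Keeping such records in a structured, timestamped form makes it straightforward to produce the due-diligence evidence that market surveillance authorities may request.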

Ongoing Duties and Incident Management

Beyond pre-market checks, importers have ongoing responsibilities. They must maintain contact with the provider, cooperate with market surveillance authorities, and act promptly if non-compliance or risks are detected. This includes ensuring proper storage and transport conditions, retaining documentation for 10 years, and providing information to authorities upon request.

Importers also play a role in incident reporting. While Article 73 primarily addresses provider duties, deployers of high-risk AI systems are mandated to inform the provider, and subsequently the importer or distributor, of any serious incidents. Importers must establish internal channels to receive and triage these reports, escalating them to the provider or authorised representative as necessary. Delayed action in response to such incidents can significantly exacerbate regulatory scrutiny and potential penalties.
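An internal intake channel of the kind described above could be as simple as a structured report plus a triage rule. The sketch below is purely illustrative: the `IncidentReport` fields and the always-escalate-serious-incidents rule are assumptions, not requirements spelled out in the Act.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class IncidentReport:
    """Hypothetical intake record for incident reports reaching the importer."""
    system_id: str
    description: str
    serious: bool          # flagged as a serious incident by the deployer
    received_at: datetime
    escalated: bool = False

def triage(report: IncidentReport) -> IncidentReport:
    """Escalate serious incidents to the provider or authorised representative."""
    if report.serious:
        report.escalated = True
    return report
```

Timestamping receipt and escalation creates the paper trail that matters when regulators later ask how quickly the importer acted.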

Shifting Roles and Liability Exposure

The AI Act explicitly anticipates multi-actor supply chains and cumulative roles. An importer can, under specific circumstances, be legally treated as a provider, thereby assuming the more extensive obligations associated with that role. This “role shift” is a critical consideration for compliance planning.

When an Importer Becomes a Provider

Article 25 (and Recital 84) outlines conditions under which an importer (or distributor, deployer, or third party) becomes legally treated as a provider of a high-risk AI system. This occurs primarily through:

  • Rebranding: Placing their own name or trademark on an AI system already placed on the market or put into service.
  • Substantial Modification: Making a change to the AI system that was not foreseen or planned in the initial conformity assessment and that affects compliance with high-risk requirements or alters its intended purpose.
  • Changing Intended Purpose: Modifying the intended purpose of an AI system in a way that makes it high-risk.

Should any of these conditions be met, the importer must then fulfill all provider obligations, including establishing a quality management system, conducting post-market monitoring, and adhering to the full suite of requirements under Article 16 and subsequent articles. Contractual agreements attempting to reallocate these statutory duties will not override the Act’s classification mechanism for regulatory purposes.
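The role-shift logic above reduces to a disjunction of the three triggers. The following one-function sketch is an illustration of that decision, with parameter names of my own choosing rather than terms from Article 25:

```python
def treated_as_provider(rebranded: bool,
                        substantial_modification: bool,
                        purpose_now_high_risk: bool) -> bool:
    """An importer assumes full provider obligations if ANY trigger applies.

    Contractual reallocation of duties cannot override this classification
    for regulatory purposes.
    """
    return rebranded or substantial_modification or purpose_now_high_risk
```

In practice each trigger requires a legal assessment (e.g. whether a change was foreseen in the initial conformity assessment), so the booleans here stand in for conclusions, not for the analysis itself.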

Transparency and Civil Liability Considerations

While importers are not the primary addressees of Article 50 transparency obligations (which largely fall on providers and deployers), they have a compliance stake. For high-risk AI, instructions for use must include extensive transparency elements as per Article 13. Moreover, if an importer assumes provider status, Article 50 obligations become directly binding.

The AI Act primarily focuses on regulatory compliance and enforcement, not a harmonized civil liability regime. However, non-compliance with the AI Act can significantly increase exposure in civil liability disputes. The EU’s updated product liability framework, Directive (EU) 2024/2853, modernizes liability rules for defective products, including AI systems, and strengthens victim protection. Importers' diligence files and verification logs will be crucial in both regulatory inquiries and potential downstream liability claims.

Enforcement, Penalties, and Compliance Strategy

Non-compliance with the EU AI Act carries substantial financial and reputational risks. Member States are required to designate market surveillance authorities that will operate under Regulation (EU) 2019/1020, enabling cross-border coordination and robust enforcement powers.

Significant Administrative Fines

Article 99 of the AI Act provides for severe administrative fines. For infringements of importer obligations under Article 23, penalties can reach up to EUR 15,000,000 or 3% of worldwide annual turnover, whichever is higher. Separate tiers exist for other infringements, such as prohibited practices or providing incorrect information to authorities. These penalties underscore the critical need for proactive compliance.
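The "whichever is higher" rule makes the effective ceiling turnover-dependent. The figures below come from Article 99 as cited above; the function itself is just an illustration of the arithmetic, not legal advice on how a fine would actually be set.

```python
def max_fine_eur(worldwide_annual_turnover_eur: float,
                 fixed_cap_eur: float = 15_000_000,
                 turnover_pct: float = 0.03) -> float:
    """Ceiling for importer-obligation infringements: the higher of the
    fixed cap and the turnover percentage (Article 99 tier for Article 23)."""
    return max(fixed_cap_eur, turnover_pct * worldwide_annual_turnover_eur)

# For a company with EUR 2 billion turnover, 3% (EUR 60 million) exceeds
# the EUR 15 million fixed cap:
print(max_fine_eur(2_000_000_000))  # 60000000.0
```

The turnover limb dominates once worldwide annual turnover exceeds EUR 500 million (15,000,000 / 0.03), which is why large groups cannot treat the fixed cap as their exposure.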

Practical Importer Compliance Checklist

To navigate these complex requirements, importers should implement a robust compliance strategy:

  • Governance and Role Determination: Establish a clear, documented method to classify each AI system offering (provider, importer, distributor, deployer) based on branding, supply-chain position, and the Act's definitions of placing and making available on the market.
Key Takeaways

  1. AI importers are EU-established entities placing non-EU AI systems on the Union market, acting as regulatory gatekeepers.
  2. Importers of high-risk AI systems must verify provider conformity, technical documentation, CE marking, and instructions before market entry.
  3. Non-compliance with importer obligations under the EU AI Act can result in administrative fines up to EUR 15,000,000 or 3% of global annual turnover.
  4. Rebranding, substantial modification, or changing the intended purpose of a high-risk AI system can shift an importer's role to that of a provider, incurring full provider obligations.
  5. Importers must establish incident intake channels and cooperate with authorities, as delayed action on non-compliance can escalate regulatory scrutiny.
