August 2024

The conclusion of the first independent DSA audit period for VLOPs and VLOSEs

The Digital Services Act (DSA) is a set of rules designed to create a secure and trustworthy online environment in the European Union (EU). It imposes specific obligations on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), i.e. services with 45 million or more average monthly active users in the EU, including disclosing information, implementing complaint-handling mechanisms, and undergoing annual independent audits. The first audit period closed on 25 August 2024; within three months of receiving the audit report, VLOPs and VLOSEs must submit it, describe how they will address any operational recommendations, and make the report publicly available. Resources are available for those who want to learn more about the DSA.

Experimenting before marketing: Regulatory sandboxes under the EU AI Act

The EU AI Act seeks to strike a balance between regulating the risks associated with artificial intelligence (AI) technologies and promoting innovation. To that end, it introduces regulatory sandboxes that allow providers to experiment with AI systems before placing them on the market. The sandboxes will be established by national authorities in physical, digital, or hybrid form and may involve testing AI systems under real-world conditions. SMEs and start-ups receive priority access to the sandboxes. Providers must observe the conditions and requirements of the agreed sandbox plan and remain liable for any harm inflicted on third parties as a result of sandbox activities. The sandboxes are planned to become fully operational within 24 months of the Act's entry into force. Because every relevant AI system must comply with the Act once it is placed on the market, early preparation for compliance is vital.

How the EU AI Act interacts with EU product safety legislation

The evolving technology landscape, particularly the emergence of AI-driven products, poses new challenges for ensuring product safety. The EU already has a comprehensive product safety framework consisting of general cross-sectoral legislation, chiefly the General Product Safety Regulation (GPSR), and product-specific Union harmonization legislation (UHL). The AI Act, which entered into force on 1 August 2024, adds specific requirements for AI systems and models, including those related to cybersecurity and human oversight, to ensure their safety, and thereby complements the EU's existing product safety laws, including the GPSR and the UHL. AI and product safety intersect in the risks and safety concerns that AI-enabled products may pose, calling for solid testing, robust security measures, and ongoing precautions to mitigate those risks and build a safer technological ecosystem. The blog post details the scope and key safety aspects of the GPSR, explains the New Legislative Framework, outlines the mandatory requirements of the AI Act relevant to product safety, and examines how the product safety of high-risk AI (HRAI) systems will be ensured and monitored.

July 2024

AI and Competition: How the EU AI Act will shape dynamics and enforcement

The EU's AI Act and Digital Markets Act have the potential to positively influence competition dynamics by promoting transparency, accountability, and fair competition. AI technologies can benefit market competition by enhancing efficiency and innovation, providing deeper insights and personalization, disrupting established markets, and enabling optimized pricing strategies. However, AI can also harm competition by increasing market concentration, enabling algorithmic collusion, facilitating abuse of dominance, and erecting barriers to entry for smaller firms. The AI Act's transparency and risk assessment requirements for AI systems could help allay concerns about market concentration, while its provisions on information sharing with competition authorities could bolster competition law enforcement. The Digital Markets Act's rules on self-preferencing, data usage, and access rights for business users and third parties may prevent technology giants from gaining unfair competitive advantages from AI technologies. The EU is also investigating competition in virtual worlds and generative AI systems, which may supplement efforts to apply EU competition rules in AI-related contexts. Given these overlapping regulatory frameworks, prioritizing AI Act readiness is key to mitigating the potential negative effects of AI usage on competition dynamics.

What considerations have been made for SMEs under the EU AI Act?

The EU AI Act aims to establish a trustworthy environment for AI within the EU market and employs a risk-based approach under which obligations are proportionate to the risks posed by AI systems. The Act sets out specific considerations for small and medium-sized enterprises (SMEs) and start-ups to support innovation among emerging players, such as free, priority access to regulatory sandboxes and more lenient documentation requirements. It requires that conformity assessment fees be reduced in proportion to the size and market share of SMEs, and that guidelines and codes of conduct take account of the interests and needs of SMEs. The Act also provides for SME participation in its governance structures and processes so that their views and interests are represented. Compliance with the AI Act will still be costly overall, but SMEs will benefit from these initiatives and support measures to reduce the financial burden.