Category: Regulatory Compliance

Enforcing the EU AI Act: Challenges and Responsibilities

The European Union Artificial Intelligence Act (AI Act), which entered into force on August 1, 2024, introduces a risk-based framework for regulating AI, prohibiting certain unacceptable practices and imposing requirements on high-risk AI systems. One of the key challenges ahead is ensuring consistent, effective enforcement of the AI Act across Member States and their national authorities.


EU AI Act: Key Compliance Challenges Ahead

On 2 February 2025, the first provisions of the European Union’s Artificial Intelligence Act (EU AI Act) became applicable, introducing significant obligations, including bans on prohibited AI practices and AI literacy requirements. The Act establishes a risk-based framework for AI systems, categorizing them into four risk tiers and prohibiting certain harmful AI applications.


EU Bans AI Systems with Unacceptable Risks

Under the AI Act, AI systems that pose an “unacceptable risk” are now banned in the EU, including practices such as social scoring and harmful manipulation. The Act categorizes AI systems into four risk levels, with high-risk systems requiring strict compliance and conformity assessments before they can be placed on the market.


AI Accountability: Ensuring Trust in Technology

The AI Accountability Policy Report emphasizes the importance of establishing a framework for assessing the trustworthiness of AI systems and ensuring transparency in their operations. It highlights the collaborative efforts of the Biden-Harris Administration and various stakeholders to promote responsible AI development and address potential risks associated with AI technologies.


Ensuring Accountability in AI Systems

AI actors must be accountable for the proper functioning of AI systems and adhere to established principles, ensuring traceability throughout the AI system lifecycle. This includes applying a systematic risk management approach to address potential risks associated with AI, such as harmful bias and human rights concerns.


European AI Act Compliance: What Medical Device Companies Need to Know

Compliance with the EU’s AI Act for Class IIa or above AI-related medical devices is not a cause for panic, according to Stephen Gilbert, a professor of medical device regulatory science. He suggests that those already preparing for compliance with the EU Medical Device Regulation (MDR) can expect a similar process for the AI Act.


Mastering EU AI Act Compliance Before the Deadline

As the 2 February 2025 deadline approaches, organizations must align with the EU AI Act’s requirements, focusing on AI literacy and compliance to avoid legal pitfalls. This proactive approach not only ensures adherence to the regulation but also fosters a culture of responsible AI use.


European AI Regulation: A New Era of Responsible Innovation

The European regulation on artificial intelligence (AI) came into force on August 1, 2024, aiming to promote responsible development and deployment of AI within the EU. It establishes clear requirements for developers and deployers based on the level of risk their AI systems pose.


AI Policy Deadline: What Employees Need to Know

The deadline for Belgian employers to implement an artificial intelligence (AI) policy is approaching, with a requirement to establish guidelines for AI use by February 2, 2025. This obligation stems from the EU AI Act, which aims to enhance AI literacy among employees while banning AI applications that violate fundamental European norms.
