AI Adoption Trends and Governance Challenges in 2025

AI Survey: Adoption Grows, Governance Gaps Remain

The survey, conducted in the first half of 2025, gathered responses from 80 professionals across a range of industries, with the majority drawn from the financial services and ICT sectors. Respondents included technology leaders, legal counsel, risk and compliance professionals, and C-suite executives.

AI Adoption is Growing, but Maturity Varies

The survey shows that nearly all organisations are engaging with AI in some form, with 10% saying that AI is fundamental to their operations. Many are still in the early stages of adoption, with 21% providing employee access to generative AI tools and 33% developing or testing proofs of concept.

AI is clearly on the agenda for most organisations, but the journey from experimentation to integration is still underway. The challenge now is to move from isolated use cases to enterprise-wide strategies that are legally sound and ethically grounded.

Governance Gaps and Role Uncertainty

Despite growing adoption, 38% of organisations have not yet assigned responsibility for AI implementation to a specific individual. Where responsibility has been assigned, there is no clear consensus on where it should sit—roles range from CTOs and Heads of Data to General Counsel and COOs.

As AI strategies mature, there is a growing need for clear ownership. Establishing defined accountability structures will help organisations manage risk effectively and meet evolving regulatory expectations.

More clients are asking how to embed AI governance into their existing risk and compliance frameworks. It’s not just about meeting regulatory requirements—it’s about building trust and resilience into how AI is used across the business.

EU AI Act: Awareness Rising, but Impact Potentially Underestimated

While 61% of respondents identify as Deployers under the EU AI Act and a further 14% identify as Providers, 25% are still unsure of their classification—an important distinction given the differing compliance obligations. Only 14% believe the Act will have a high impact on their organisation, despite 28% developing AI tools in-house and 40% working with third parties to build bespoke solutions.

Misclassification under the EU AI Act could lead to serious compliance gaps. Organisations need to understand their role in the AI ecosystem and prepare accordingly.
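To illustrate the distinction in simplified terms, the sketch below (in Python) maps an organisation's activities to the roles referred to above. It is an illustrative simplification under stated assumptions, not legal guidance; the function name and input labels are hypothetical.

# Illustrative sketch only: a simplified mapping of common scenarios to
# EU AI Act roles. Actual classification depends on the Act's definitions
# and should be confirmed with legal advice; these inputs are hypothetical.
def classify_ai_act_role(develops_ai_system: bool,
                         places_on_market_under_own_name: bool,
                         uses_ai_under_own_authority: bool) -> set[str]:
    """Return the role(s) an organisation may hold under the EU AI Act."""
    roles = set()
    # A provider develops an AI system (or has one developed) and places it
    # on the market or puts it into service under its own name or trademark.
    if develops_ai_system and places_on_market_under_own_name:
        roles.add("provider")
    # A deployer uses an AI system under its own authority in the course
    # of a professional activity.
    if uses_ai_under_own_authority:
        roles.add("deployer")
    return roles or {"unclear - seek a legal assessment"}

# Example: an organisation that commissions a bespoke tool, puts it into
# service under its own name and uses it internally may hold both roles.
print(classify_ai_act_role(develops_ai_system=True,
                           places_on_market_under_own_name=True,
                           uses_ai_under_own_authority=True))

As the example suggests, an organisation building bespoke tools with third parties can hold more than one role at once, which is why the 25% who are unsure of their classification is a notable figure.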

There is a shift in how AI is viewed—not just as a technology issue, but as a strategic business priority. This shift is driving more cross-functional collaboration, particularly between legal, compliance, and technology teams.

Governance Frameworks Still Developing

While 53% of organisations have an AI usage policy and a technology committee in place, other governance elements are less mature:

  • Only 20% have an approved AI risk policy
  • Only 20% have an AI procurement policy
  • Just 25% have approved AI literacy training for employees and board members

The uneven development of governance frameworks highlights the need for a more coordinated approach. AI literacy, in particular, is becoming not just a best practice but a legal obligation.
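As a rough illustration of what a coordinated approach might track, the sketch below models the governance elements listed above as a simple checklist. The item names and the summary function are hypothetical, and the True/False values shown are only an example scenario, not survey data.

# Illustrative sketch only: a minimal checklist an organisation might use
# to track governance elements. Names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class GovernanceItem:
    name: str
    in_place: bool

def readiness_summary(items: list[GovernanceItem]) -> str:
    # Summarise which governance elements exist and which are missing.
    done = [i.name for i in items if i.in_place]
    missing = [i.name for i in items if not i.in_place]
    return (f"{len(done)}/{len(items)} elements in place; "
            f"missing: {', '.join(missing) or 'none'}")

checklist = [
    GovernanceItem("AI usage policy", True),
    GovernanceItem("Technology committee", True),
    GovernanceItem("AI risk policy", False),
    GovernanceItem("AI procurement policy", False),
    GovernanceItem("AI literacy training", False),
]
print(readiness_summary(checklist))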

Global Outlook

Despite regulatory divergence globally, 76% of respondents say EU regulations take precedence for their organisation, reinforcing the EU’s role as a global leader in AI governance.

More Insights

AI Readiness Framework for the Pharmaceutical Industry

This article presents an AI readiness assessment framework tailored for the pharmaceutical industry, emphasizing the importance of aligning AI initiatives with regulatory standards and ethical...

AI as a Strategic Partner in Governance

The UAE has announced that a National Artificial Intelligence System will become a non-voting member of all federal and government company boards, marking a significant shift in governance. This...

New Code of Practice for AI Compliance Set for 2025

The European Commission announced that a code of practice to help companies comply with the EU's artificial intelligence rules may only be implemented by the end of 2025. This delay follows calls from...

AI Governance: The Key to Successful Enterprise Implementation

Artificial intelligence is at a critical juncture, with many enterprise AI initiatives failing to reach production and exposing organizations to significant risks. Effective AI governance is essential...

AI Code Compliance: Companies May Get a Grace Period

The commission is considering providing a grace period for companies that agree to comply with the new AI Code. This initiative aims to facilitate a smoother transition for businesses adapting to the...

Texas Enacts Groundbreaking AI Governance Law

On June 22, 2025, Texas enacted the Responsible Artificial Intelligence Governance Act, making it the second state to implement comprehensive AI legislation. The act establishes a framework for the...

Laws in Europe Combatting Deepfakes

Denmark has introduced a law that grants individuals copyright over their likenesses to combat deepfakes, making it illegal to share such content without consent. Other European countries are also...