Lobbyists Push Back Against AI Code of Practice
Lobbyists are making a last-ditch attempt to delay rules for general-purpose AI (GPAI) ahead of the European Commission’s expected publication of the much-anticipated voluntary Code of Practice in the coming days.
The Code, which will apply to multipurpose AI models that underpin technologies like OpenAI’s ChatGPT, has been surrounded by significant tension. Nearly 1,000 lobbyists and experts have participated in the drafting process alongside independent chairs.
Calls to Delay Implementation
In parallel, industry representatives and the Council’s Polish Presidency have suggested that the Commission “stop the clock” on the implementation of the AI Act, given that multiple guidelines and standards are still pending. In early June, Tech Commissioner Henna Virkkunen indicated to the Council that postponing parts of the act should “not be ruled out” if the necessary implementation tools are not ready.
The Code of Practice is intended to assist AI developers in complying with the law’s rules for GPAIs, which are expected to take effect on August 2.
Concerns Over Innovation
While the Commission has not formally closed the door on extending some AI Act deadlines, it has indicated that the rules for GPAIs will indeed apply in August. This has not deterred Big Tech lobby group CCIA Europe from appealing to EU heads of government to delay the GPAI rules. With the Code still not finalized weeks before the rules are set to take effect, CCIA Europe’s Head of Office, Daniel Friedlaender, warned that the EU risks “stalling innovation altogether.”
Support for AI Act Implementation
In response, academics and civil society groups have voiced their support through an open letter defending the AI Act and urging the Commission to “resist pressure” to derail the rules. The letter highlights systemic risks associated with GPAI models, citing potential threats related to cybersecurity as well as biological, radiological, and nuclear capabilities.
Members of the European Parliament (MEPs) have also expressed their support for the timely implementation of the act. MEP Michael McNamara, co-chair of the Parliament’s working group on the AI Act, emphasized that considerable effort is now required to finalize the Code of Practice and the necessary standards for conformity assessment without further delay.
“A failure to bring the Code of Practice and the governance rules for GPAI models into force as planned this year would result in a significant loss of credibility for the EU that would extend far beyond the AI Act,” McNamara stated.
MEP Sergey Lagodinsky, who was involved in negotiating the AI Act, echoed the sentiment, asserting that “robust mechanisms” are necessary to ensure the law is effectively implemented and enforced.