South Korea’s Landmark Laws Require Labels on AI-Generated Content
South Korea has introduced a landmark set of laws regulating artificial intelligence, including rules that require companies to clearly label AI-generated content.
The AI Basic Act
The new legislation, the six-chapter, 43-article “AI Basic Act”, took effect in South Korea today. The government bills it as a “world first”: the first AI law to be fully enforced at a national level. South Korean officials say the law is designed to improve trust and safety in AI while supporting the country’s ambition to become one of the world’s top three AI powers, alongside the United States and China.
Labeling Requirements
Under the law, companies providing AI services must label content created using artificial intelligence. According to a report by The Guardian, clearly artificial outputs such as cartoons or stylized artwork must include invisible digital watermarks, while more realistic deepfakes must carry visible labels.
High-Impact AI Systems
The legislation also introduces new obligations for so-called “high-impact AI” systems. These include tools used in areas such as medical diagnosis, hiring decisions, and loan approvals. Operators of these systems are required to carry out risk assessments and document how their systems make decisions. If a human makes the final decision, the system may fall outside this category.
Safety Reports and Compliance
Developers of extremely powerful AI models must submit safety reports, although government officials acknowledge that the threshold has been set so high that no AI models currently in use worldwide meet it.
Companies that fail to comply with the labeling requirements could face fines of up to 30 million won (about $20,400). However, the government has introduced a grace period of at least one year before penalties are enforced, allowing time to refine the rules.
Criticism and Response
According to The Guardian, local tech startups have criticized the AI Basic Act, arguing that the disclosure and labeling requirements go too far and could slow innovation. But government officials have pushed back on claims that the law is overly restrictive, maintaining that 80 to 90 percent of the legislation is focused on promoting the AI industry rather than limiting it.
Global Context
South Korea’s move comes as other countries take different approaches to AI regulation. The European Union (EU) has begun implementing its AI Act, but the main provisions — including requirements for generative AI companies to disclose copyrighted material used in training datasets — will be phased in through 2027. Meanwhile, last month the White House issued an executive order targeting what it describes as overly restrictive state laws that it says are holding back AI development.
South Korea’s AI Basic Act represents a significant step in AI regulation, aiming to create a safer and more transparent environment for AI use while also fostering innovation.