Copyright Law Set to Govern AI Under Trump’s Executive Order
The legal landscape for artificial intelligence is entering a period of rapid consolidation. With President Donald Trump’s executive order in December 2025 establishing a national AI framework, the era of conflicting state-level rules may be drawing to a close.
However, this doesn’t signal a reduction in AI-related legal risk. Instead, it marks the beginning of a different kind of scrutiny—one centered not on regulatory innovation but on the most powerful legal instrument already available to federal courts: copyright law.
Emerging Legal Risks
The lesson emerging from recent AI litigation, most prominently Bartz v. Anthropic PBC, is that the greatest potential liability for AI developers stems not from what their models generate but from how those models were trained and the provenance of the content used in that training.
As the federal government asserts primacy over AI governance, the decisive question will be whether developers can demonstrate that their training corpora were acquired lawfully, licensed appropriately (unless in the public domain), and documented thoroughly.
Inputs Are Key
To date, no U.S. court has held that an AI model’s outputs become infringing derivative works solely because the model was trained on copyrighted content. However, using an AI program to copy and incorporate protectable copyrighted elements into an output can still constitute infringement.
Courts have focused on two settled propositions:
- Training on lawfully acquired materials can qualify as fair use. Both Bartz and Kadrey v. Meta Platforms, Inc. held that using legally obtained books in large-scale training is “quintessentially transformative,” a major factor courts consider in determining fair use.
- Fair use collapses when the underlying material was unlawfully obtained. This distinction is crucial. When training data includes pirated works or copies acquired outside a licensing framework, the fair-use defense evaporates, leaving straightforward infringement under 17 USC Section 106.
Bartz as Blueprint
Judge William Alsup’s ruling in Bartz bifurcated the case with precision: Training on legally purchased or licensed works qualified as fair use, while training on pirated copies did not. The court found Anthropic’s use of purchased books “exceedingly transformative,” supporting fair use. However, its downloading of over seven million pirated books led to a damages trial.
The case’s significance crystallized when Alsup certified a class of 482,460 copyright holders whose works appeared in datasets allegedly downloaded from shadow libraries, transforming what might have been modest damages into an existential threat, with potential liability exceeding $360 million.
Executive Order’s Impact
Trump’s executive order, often described as deregulatory, actually recentralizes AI-related legal risk. A unified federal standard means federal courts will dominate AI enforcement. The order’s creation of a Department of Justice AI litigation task force and its directive to the Department of Commerce to identify conflicting state laws ensure that future AI disputes will migrate exclusively to federal forums.
Moreover, copyright law, a mature body of doctrine already enforced in federal courts, will become central. The Copyright Act imposes strict liability and authorizes statutory damages without proof of actual harm, amplifying intellectual-property exposure as conflicting state AI laws are preempted.
Licensed Training Data
Fair use isn’t a blanket excuse for opaque data acquisition. It protects only those who start with lawfully obtained copies, and it can’t cure pirated inputs or shadow-library corpora. Licensing, by contrast, is a complete answer to claims under the reproduction right most implicated in AI training. Training a model on licensed data eliminates the statutory-damage exposure tied to unlawful acquisition and preserves fair-use arguments.
Outlook
Trump’s executive order may limit the regulatory chaos created by competing state mandates, but it won’t eliminate AI risk. It will shift the arena. With national preemption will come increased reliance on copyright law, making it essential for companies to demonstrate that their models rest on lawfully obtained and licensed data.