How Maine is Wading into the Regulation of Explicit AI-Generated Images of Children
Child sexual abuse material is illegal under both state and federal law, a prohibition that courts have upheld for decades. But the rapid advancement of artificial intelligence has opened significant loopholes in those laws.
A recent case in Maine illustrated the problem: a man used AI to turn images of children playing soccer into sexually explicit content. Although police knew what he had done, they could not charge him under existing Maine law, the Bangor Daily News reported.
Proposed Legislative Changes
In response, a proposal now before the Maine Legislature would amend the state’s criminal code so that such cases can be prosecuted.
The bill, known as LD 524 and sponsored by Rep. Amy Kuhn (D-Falmouth), seeks to expand the definition of disseminating child sexual abuse material to include materials created or modified through generative AI or machine learning.
“It’s not a hypothetical anymore,” stated Kuhn. “This technology has created new ways to victimize children, and our laws haven’t been updated to adequately protect against that.”
Historical Context
Last year, constitutional concerns led lawmakers to pass a diluted version of the bill, LD 1944, which added the dissemination of “morphed images” as a form of harassment under the state’s “revenge porn” law. Now a broader proposal is back on the table, and lawmakers from both parties appear receptive to it.
During the first week of this year’s legislative session, the Judiciary Committee advanced the bill on a unanimous, bipartisan vote. The Criminal Justice and Public Safety Committee separately concluded that the changes would have a moderate impact on Maine’s criminal justice system. Floor votes in both chambers are expected in the coming weeks.
State Trends and Federal Pushback
Currently, 28 states have enacted bans on the creation of AI-generated child sexual abuse material, as tracked by MultiState, a government relations firm. Nearby New Hampshire has classified the use of AI to create “intimate visual representations” of children as a Class B felony.
This legislative momentum comes despite an executive order President Donald Trump signed in December that aims to prevent states from enacting their own AI regulations. Such a move departs from the traditional federalist structure of American governance and is likely to invite legal challenges.
Exponential Increase in Reports
According to Maine State Police Lt. Jason Richards, who has spent more than 21 years investigating child sexual abuse and exploitation, the ways offenders can victimize children have multiplied dramatically. Richards testified that AI has been exploited in ways never before seen, noting that “you can’t really put the genie back in the bottle” once explicit images reach the internet.
In 2024, the National Center for Missing and Exploited Children received 67,000 reports of AI-generated child sexual abuse material. In the first half of 2025 alone, that number skyrocketed to 485,000, an increase of roughly 624% over the full-year 2024 total.
Debate Surrounding the Bill
The bill in Maine was developed in collaboration with various public safety organizations, including the Maine Chiefs of Police Association and the Maine Coalition Against Sexual Assault. However, the Maine Association of Criminal Defense Lawyers has expressed opposition, arguing that the bill is “well-intentioned but dangerously overbroad.”
Critics argue that revising the definition of “child sexual abuse material” to include AI-created images that “appear to depict” a minor risks violating First Amendment protections. That concern traces to Ashcroft v. Free Speech Coalition, a 2002 U.S. Supreme Court ruling that struck down parts of a federal child pornography law, holding that computer-generated sexual imagery that appears to depict minors but involves no real children is protected speech.