A Maine man went to watch a children’s soccer game. He snapped photos of kids playing. Then he went home and used artificial intelligence to take the otherwise innocuous pictures and turn them into sexually explicit images.
Police know who he is. But there is nothing they can do, because the images are legal to possess under state law, according to Maine State Police Lt. Jason Richards, who is in charge of the Computer Crimes Unit.
While child sexual abuse material has been illegal for decades under both federal and state law, the rapid development of generative AI — which uses models to create new content based on user prompts — means Maine’s definition of those images has lagged behind other states. Lawmakers here attempted to address the proliferating problem this year but took only a partial step.
“I’m very concerned that we have this out there, this new way of exploiting children, and we don’t yet have a protection for that,” Richards said.
Two years ago, it was easy to discern when a piece of material had been produced by AI, he said. It’s now hard to tell without extensive experience. In some instances, it can take a fully clothed picture of a child and make the child appear naked in an image known as a “deepfake.” People also train AI on child sexual abuse materials that are already online.
Nationally, the rise of AI-generated child sexual abuse material is a growing concern. At the end of last year, the National Center for Missing and Exploited Children reported a 1,325% increase in the number of tips it received related to AI-generated materials. Investigators are also finding such material more often when looking into cases of possession of child sexual abuse materials.
On Sept. 5, a former Maine state probation officer pleaded guilty in federal court to accessing child sexual abuse materials with intent to view them. When federal investigators searched the man’s Kik account, they found he had sought out the content and had at least one image that was “AI-generated,” according to court documents.
Explicit material generated by AI has rapidly become intertwined with real abuse imagery at the same time that Richards’ staff are fielding more reports. In 2020, his team received 700 tips relating to child sexual abuse materials and reports of adults sexually exploiting minors online in Maine.
By the end of 2025, Richards said he expects his team will have received more than 3,000 tips. They can investigate only about 14% in any given year. His team now has to discard any material that is touched by AI.
“It’s not what could happen, it is happening, and this is not material that anyone is OK with in that it should be criminalized,” Shira Burns, the executive director of the Maine Prosecutors’ Association, said.
Across the country, 43 states have created laws outlawing sexual deepfakes, and 28 states have banned the creation of AI-generated child sexual abuse material. Twenty-two states have done both, according to MultiState, a government relations firm that tracks how state legislatures have passed laws governing artificial intelligence.
Rep. Amy Kuhn, D-Falmouth, proposed a similar ban in Maine earlier this year. But lawmakers on the Judiciary Committee had concerns that the proposed legislation could raise constitutional issues.

She agreed to drop that portion of the bill for now. The version of the bill that passed expanded the state’s pre-existing law against “revenge porn” to include dissemination of altered or so-called “morphed images” as a form of harassment. But it did not label morphed images of children as child sexual abuse material.
The legislation, which was drafted chiefly by the Maine Prosecutors’ Association and the Maine Coalition Against Sexual Assault, was modeled after already enacted law in other places. Kuhn said she plans to propose the expanded definition of sexually explicit material mostly unchanged from her early version when the Legislature reconvenes in January.
Maine’s lack of a law at least labeling morphed images of children as child sexual abuse material makes the state an outlier, said Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered AI. She studies the abusive uses of AI and the intersection of legislation around AI-generated content and the Constitution.
In her research, Pfefferkorn said she’s found that most legislatures that have considered changing pre-existing laws on child sexual abuse material have at least added that morphed images of children should be considered sexually explicit material.
“It’s a bipartisan area of interest to protect children online, and nobody wants to be the person sticking their hand up and very publicly saying, ‘I oppose this bill that would essentially better protect children online,’” Pfefferkorn said.
There is also pre-existing federal law and case law that Maine can look to in drafting its own legislation, she said. Morphed images of children are already banned federally, she said. While federal agencies have a role in investigating these cases, they typically handle only the most serious ones. It mostly falls on the state to police sexually explicit materials.
Come 2026, both Burns and Kuhn said they are confident that the Legislature will fix the loophole because there are plenty of model policies to follow across the country.
“We’re on the tail end of addressing this issue, but I am very confident that this is something that the judiciary will look at, and we will be able to get a version through, because it’s needed,” Burns said.
Bangor Daily News investigative reporter Sawyer Loftus may be reached at sloftus@bangordailynews.com.