Kansas lawmakers are taking a new approach in the battle against child pornography.
They are considering legislation that would prohibit anyone from using artificial intelligence to create sexually explicit images of children, or of adults who are victims of revenge porn.
Opponents say they believe that the proposal is unconstitutional, highlighting a 2002 U.S. Supreme Court decision striking down similar language in federal law – long before artificial intelligence emerged as an influencing force in the culture.
This comes about a year after the Legislature enacted a law requiring anyone 18 and older to verify their age before viewing online pornography.
A challenge to a similar age-verification law from Texas is now pending before the U.S. Supreme Court.
The new Kansas legislation clarifies the crime of sexual exploitation of a child so that it includes photographs, film, videos, digital or computer-generated images or pictures created by artificial intelligence.
Law enforcement says that it’s already seeing cases where photos of children and adults are manipulated to make it falsely appear a person is nude or engaging in sexual activities.
They say the images are often used to embarrass or intimidate the person depicted, and occasionally they are used for blackmail.
Shawnee County District Attorney Mike Kagay said the bill is about trying to modernize the definition of what constitutes “visual depiction” in the child sexual exploitation law.
“AI is exploding,” Kagay said. “I believe this bill is absolutely necessary to protect our kids from those who would commit sexual abuse upon them.”
“We’re not talking about real kids. We’re talking about made-up computer kids,” he said.
So why does that matter? he asked.
“The technology that exists today can take any of your pictures and can morph it into whatever they want,” Kagay said.
“They can take video from this hearing, plug it into the model and generate that and send it out. You may not be able to tell the difference,” he said.
“To me that’s a horrifying thought, especially for a kid who may have to live through that.”
He said child sexual abuse material is used to groom children, and variations of that material generated by artificial intelligence are no different.
“This legislation equips law enforcement with an essential tool to identify offenders and prevent further harm to children,” he said.
“Offenders who possess or distribute visual depictions, whether real or artificially created, often leave behind digital trails.
“By expanding the legal framework to include AI generated images, we increase our ability to identify offenders and intervene before they cause further harm.”
He acknowledged there could be times when law enforcement doesn’t know whether a child in sexually explicit content is real.
“Sometimes we don’t know anymore,” he said. “We’re not going to be able to tell the difference a lot of times. This covers that.”
The bill was opposed by the Board of Indigent Defense Services, which said the 2002 Supreme Court case addressed the very language included in the Kansas bill.
The U.S. Supreme Court struck down language in the Child Pornography Prevention Act of 1996 that outlawed “any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture” that “is, or appears to be, of a minor engaging in sexually explicit conduct.”
Representing the Board of Indigent Defense Services at a hearing on Tuesday, Emily Brandt said the same arguments used to support the Kansas bill were used by the federal government in defending the child pornography language before the U.S. Supreme Court.
“The United States Supreme Court found that language is unconstitutional. It infringes on people’s free speech right. With all due respect to proponents’ testimony, one of the arguments is that (child abuse sex material) is actually used to groom children.
“That exact same argument was used by the federal government…and the United States Supreme Court explicitly said that was not good enough,” she said.
“You cannot punish people, you cannot infringe on their rights to free speech because free speech might encourage them later to do unlawful acts,” she said.
“We might not like virtual child pornography, it might be offensive to us, it does not mean we have the right to tell other people that they have no right to it,” she said. “We cannot criminalize people for something that they have not done.”
She said morphed images are already against the law in Kansas. She said the state Court of Appeals has already ruled that morphed images of real children are not protected free speech because they harm real children.
The bill goes beyond that, she said.
“We’re talking about material that, frankly, doesn’t harm any real children,” she said.
“If it doesn’t harm any real children, then the United States Supreme Court says you can’t criminalize it,” she said.