Texas recently passed a law that criminalizes AI-generated child sexual abuse material—even in cases where no real child exists. On the surface, the intention seems straightforward: prevent harm to children. But legally, ethically, and practically, this new category of offense raises difficult questions that shift the foundations of criminal law.
As a criminal defense attorney, I spend a lot of time thinking about the why behind laws—not just the what. And the truth is: this area is complicated.
Why Child Pornography Is Illegal in the First Place
Traditional child pornography laws rest on an unshakable premise: a real child is harmed in the creation of the material. The criminalization exists to prevent exploitation and protect actual victims.
But AI-generated material challenges that core justification. When there is no real child involved, no exploitation, and no victim, the entire theory behind the law changes. We’re no longer talking about preventing harm—we’re talking about punishing a thought, a fantasy, a disorder, or a perceived risk.
That’s a major shift.
When Criminal Law Moves Into “Victimless” Territory
Victimless crimes have always presented problems. They rely on predicting harm or assuming danger rather than responding to an actual injury. When you start criminalizing things where no one is harmed, you risk expanding the reach of government into what people think, not what they do.
As I mentioned in a recent conversation:
“When a child isn’t harmed, it’s a big shift in the underpinnings of criminal law.”
And that’s the heart of the issue here.
Mental Health Is Not a Crime—But This Law Treats It Like One
Many people who might be drawn to this type of content suffer from sexual paraphilias—recognized clinical disorders. Criminalizing a disorder is a dangerous path, akin to punishing someone for having schizophrenia or depression.
If we create laws that equate a mental health condition with criminal intent, we’re blurring the line between treatment and punishment.
And the reality is, we don’t understand this issue well enough to legislate it effectively.
We Don’t Study It—Because No One Wants to Touch It
This entire field is understudied. Not because it’s unimportant, but because no one wants to be associated with the research. That means lawmakers are legislating based on guesswork, fear, and political pressure—not data.
And in any guessing game involving sex offenses, the outcome almost always skews toward:
“Screw the sex offenders—criminalize it and sort out the rest later.”
This approach may look tough on paper, but it doesn’t actually help anyone—not children, not people struggling with mental health conditions, and not the justice system.
So What’s the Right Answer?
There’s no easy answer here. AI is changing the landscape faster than lawmakers can understand it. But before we create new crimes, we should at least be honest about what we’re criminalizing—and why.
Are we preventing harm?
Or are we punishing a disorder, a thought, or a fear of what might happen?
These are the questions the legal community must grapple with as technology pushes us into unfamiliar territory.