
Texas Gets Tougher On Owning Sex Dolls That Look Like Children

Starting Sept. 1, penalties will increase for possessing childlike sex dolls and for intending to promote them. About time, right?
The state is cracking down on sex dolls that look like children, but an unregulated AI market still allows for the easy production of generated child pornography. Adobe Stock

Come Sept. 1, Texas will tighten its laws regarding sex dolls that look like children.

You read that right. Somehow, they weren’t strict enough. The average person with a basic moral compass likely never even considered the existence of said dolls, let alone the need for heightened penalties, but as it happens, there were loopholes in our existing laws. During the 89th Legislative Session, a bill to increase penalties for the possession of childlike sex dolls passed unanimously. Looks like both sides of the aisle can easily agree on one thing: child sex dolls are bad, and anyone who has them should be headed straight to the pen.

“Current state law already criminalizes possession with intent to promote obscene devices, which could include child-like sex dolls,” reads a bill analysis published by the Senate Research Center. “However, state law does not currently criminalize possession without intent to promote, leaving a gap for enforcement against these devices.”

The bill, filed by Republican Rep. Nate Schatzline of Fort Worth, creates felony offenses for the possession and promotion of child sex dolls, and establishes a “two-doll limit for the presumption that the individual possessed the dolls with the intent to promote them.”

Promotion of a child sex doll will be a second-degree felony, punishable by two to 20 years in prison and up to a $10,000 fine. Possession with the intent to distribute, which is presumed when an individual obtains more than two dolls, will be a third-degree felony, punishable by two to 10 years in prison and up to a $10,000 fine. Possession alone, without promotion or the intent to distribute, will be a state jail felony, punishable by up to two years in prison and up to a $10,000 fine.

“It’s shocking that we even have to do this, but these HORRIFIC devices that resemble real kids are being used to train predators,” the representative wrote on social media. “It’s time we END the normalization of pedophilia & protect Texas kids!”

The bill specifically bans any “anatomically correct doll, mannequin, or robot that has the features of a child and that is intended to be used for sexual stimulation or gratification.”


Sexual Technology Developments Create Easy Access To Child Pornography

Similar laws have been passed in Louisiana, Wisconsin and Washington within the last five years. Though it feels like there shouldn’t be so many states, including our own, that need to revamp their child sex doll laws, the quickening pace of sexual tech development necessitates it.

“The presence of sex dolls in our pop cultural imaginaries is growing just as rapidly as — or, likely, more rapidly than — the development and use of sex dolls themselves,” reads an analysis on the history of sex dolls from Bo Ruberg, a former technology journalist and current video game scholar at the University of California, Irvine.

Simply put, the market moves faster than the legislature can keep up, and sex dolls are only one example. A new monster has emerged with the advent of artificial intelligence: digitally generated child pornography.

According to data from the Internet Watch Foundation, a non-profit that collects data on child pornography reports, AI child sexual abuse imagery reports increased 400% in 2025. In the first six months of the year, 1,286 individual AI videos of child sexual abuse were discovered. In the same time period in 2024, there were only two.

“It’s a canary in the coal mine,” Derek Ray-Hill, interim chief executive of the Internet Watch Foundation, said to the New York Times. “There is an absolute tsunami we are seeing.”

AI systems have been built to create “deepfake nudes,” and there is little limit to what they can generate. Entire explicit pornographic films can be produced from a single photo lifted from social media, and the concern is highest around pictures parents have posted of their own children. Most of the apps known as “nudifiers” are headquartered and operated in foreign countries, which makes their software difficult to regulate, though American companies are giving it their best efforts.

In June 2025, social media’s reigning conglomerate, Meta, filed a lawsuit against a Hong Kong-based AI developer responsible for one of the nudifying apps, seeking to bar it from advertising on Meta-owned platforms. But according to Brian Chen, lead consumer technology writer at the Times, such efforts do little to actually stop the production of AI child porn.

“Social media companies like Snap, TikTok and Meta prohibit advertising of nudifiers on their apps, and some states are beginning to discuss legislation that would ban companies from offering nudifier apps… To put it another way, anyone can still easily use a nudifier app on a child and keep the photos, and no one would know,” writes Chen.