
Teens are sexting with AI. Here's what parents should know


By Heather Kelly The Washington Post

Published May 23, 2025



Parents have another online activity to worry about. In a new tech-driven twist on "sexting," teenagers are having romantic and sexual conversations with artificial intelligence chatbots.

The chats can range from romance- and innuendo-filled to sexually graphic and violent, according to interviews with parents, conversations posted on social media, and experts. They are largely taking place on "AI companion" tools, but general-purpose AI apps like ChatGPT can also create sexual content with a few clever prompts.

Experts warn the chats with AI can lead to unrealistic expectations about sex and relationship dynamics. Parents worry about the dangers to their children's mental health, or about exposing them to explicit sexual scenarios at too young an age. Some think the tools might have some value, with limits.

When Damian Redman of Saratoga Springs, New York, did a routine check of his eighth-grader's phone, he found an app called PolyBuzz. He reviewed the chats his son was having with AI female anime characters and found they were flirty and that attempts at more sexual conversations were blocked.

"I don't want to put yesterday's rules on today's kids. I want to wait and figure out what's going on," said Redman, who decided to keep monitoring the app.

We tested 10 chatbots ourselves to identify the most popular AI characters, the types of conversations they have, what filters are in place and how easy they are to circumvent.

Know your bots

AI chatbots are open-ended chat interfaces that generate answers to complex questions, or banter in a conversational way about any topic. There is no shortage of places minors can find these tools, and that makes blocking them difficult. AI bots are websites, stand-alone apps and features built into existing services like Instagram or video games.

There are different kinds of chatbots. The mainstream options are OpenAI's ChatGPT, Anthropic's Claude, Google's Gemini, and Meta AI, which recently launched as a stand-alone app. These have stronger filters, and their main products aren't designed for role-play, but they can still partake in suggestive or romantic conversations and create sexual content with the right prompts. They can switch over to voice-based chat, reading the replies aloud in realistic - even sultry - sounding voices.

Companion AI tools are far more popular for suggestive chats, including Character.AI, Replika, Talkie, Talk AI, SpicyChat and PolyBuzz. ChatGPT and Meta AI have also launched companion-chat options. These types of tools have libraries of characters and preprogrammed personalities, many designed with titillation in mind, like those from romance novels or the many "step-sibling" options on Meta's AI Studio.

We tested a Meta AI Studio chat, and the flirtatious direction of the "Step sis" character was immediately clear.

The smaller apps tend to have fewer limits or filters. Look for anything that has "AI girlfriend," "AI boyfriend," or "AI companion" in the name or description. More are being added to app stores daily.

What are they talking about?

It's not just sex, according to parents and experts. Teens are having a range of conversations with character bots, including friendly, therapeutic, funny and romantic ones.

"We're seeing teens experiment with different types of relationships - being someone's wife, being someone's father, being someone's kid. There's game and anime-related content that people are working through. There's advice," said Robbie Torney, senior director of AI programs at family advocacy group Common Sense Media. "The sex is part of it but it's not the only part of it."

Some confide in AI chats, seeing them as a nonjudgmental space during a difficult developmental time. Others use them to explore their gender or sexuality.

When teens do engage in sexual chats, the exchanges range from innuendo to graphic descriptions. The chats can involve power dynamics and consent issues in ways that don't mimic the real world.

"Where some of the harm or risk comes in is the bots aren't programmed to respond in the same way they would in a real relationship," Torney said.

Aren't there filters?

The default settings on most AI companion tools allow, and sometimes encourage, risqué role play situations, based on our tests. Some stop before actual descriptions of sex appear, while others describe it but avoid certain words, like the names of body parts.

There are work-arounds and paid options that can lead to more graphic exchanges. Prompts to get past filters - sometimes called jailbreaks - are shared in group chats, on Reddit and on GitHub. Sometimes all it requires is patience and ignoring warnings. A common technique is pretending you need help writing a book.

Many apps have built-in filters based on the age of the user. Meta said it prevents accounts registered as minors from searching for "romance" AI characters, and that sexually explicit chats are prohibited for users under 18. Its parental supervision tools can show parents which AI characters their children have used in the past week. Google says Gemini has different content restrictions for people it knows are under 18. Character.AI has stricter limits for people it knows are under 18, while some AI apps have teen modes that need to be turned on.

In a recent risk assessment of companion AI apps, Common Sense Media found that safety measures like content restrictions and age limits were often easily circumvented. In our own tests, we were able to easily work around filters to generate detailed sexual content while logged in as an adult.

One filter we ran into while testing Character.AI, one of the most popular companion AI apps, displayed a warning only after the conversation had already described sex.

What are the risks?

Experts agreed that for children and young teens, it never makes sense to have access to unmonitored chatbots because of the risk that they can encounter inappropriate content. For older teens, the choice is more nuanced, the experts said, depending on how much exposure to sexual or intimate themes they already have, the types of content they're accessing and what their parents consider appropriate.

Potential harms from AI bots extend beyond sexual content, experts said. Researchers have been warning that AI chatbots could become addictive or worsen mental health issues. There have been multiple lawsuits and investigations after teens died by suicide following conversations with chatbots. Common Sense Media also flagged harmful advice, such as encouragement of self-harm, as an issue with companion bots.

Similar to too much pornography, bots can exacerbate loneliness, depression or withdrawal from real-world relationships, said Megan Maas, an associate professor of human development and family studies at Michigan State University. They can also give a misleading picture of what it's like to date.

"They can create unrealistic expectations of what interpersonal romantic communication is, and how available somebody is to you," Maas said. "How are we going to learn about sexual and romantic need-exchange in a relationship with something that has no needs?"

Some experts said there can be advantages to teens exploring in a somewhat safe space, without the unpredictable factor of a human being on the other side. It's a chance to practice some limited interpersonal skills, or ask questions someplace other than Google.

"If you have a kid who has an AI chatbot and they're mostly asking this bot questions they're too embarrassed to ask you or a nurse or a therapist, then that chatbot is doing good things for that kid," Maas said.

However, the bots could replace much-needed human experiences, like rejection.

Those are the types of reservations Redman has about his son's AI chats, though his son currently has a girlfriend in real life.

"I'd rather he talk to her and go through the dumpings, the back-togethers, the real-life stuff, than talk to these AI chat girls," Redman said.

What can parents do?

Monitor what apps your children and teens are using and, if they require logins, make sure they are using accounts set up with their accurate age. Most built-in parental controls on tablets and smartphones will let you require permission when a child downloads a new app.

Many of the companion apps are labeled "Teen" on the Google Play store and rated 17 and older on iOS. Set up your child's devices with their correct age and add limits on app ratings to prevent higher-rated apps from being downloaded. Using their proper age on individual chatbot or social media accounts should trigger any built-in parental controls.

However, most chatbots can easily be accessed online where accounts require only a self-reported age. Some internet-level filters can block access or flag specific language.

Beyond regularly finding ways around parental controls, tweens and teens can access the internet at their school or on friends' devices. Parents may want to prepare them for a world where they will need to know how to navigate these tools.

Experts suggest creating an open and honest relationship with your child. Teach them about what AI is and isn't, and how tech companies collect and use personal information. Check in regularly with your kids and make sure they feel safe coming to you with questions or issues. Have age-appropriate conversations about sex, and don't shy away from embarrassing topics.

If you need to practice first, try asking a chatbot.

