OpenAI Wants ChatGPT to Sext You. What Could Possibly Go Wrong?

OpenAI has been quietly (and sometimes not so quietly) working on an "adult mode" for ChatGPT. The idea? Let consenting adults have explicit text conversations with an AI chatbot. Sam Altman first floated the concept on X back in October 2025, framing it as a move to "treat adult users like adults." Noble sentiment, that. Shame the execution raises enough red flags to furnish a Soviet parade.

The Plan (Such As It Is)

Fidji Simo, OpenAI's CEO of Applications, confirmed during a GPT-5.2 briefing that adult mode would arrive in Q1 2026. It would be text-only, with no explicit images, audio, or video generation. An OpenAI spokeswoman even described the planned output as "smut rather than pornography," which is certainly one way to pitch a feature to investors while seeking funding at valuations north of £150 billion.

Users would apparently "be encouraged to seek relationships in the real world." How reassuring.

Delayed. Then Delayed Again.

Originally pencilled in for December 2025, the feature slipped to Q1 2026. Then, on 6 March 2026, Axios reported that OpenAI had delayed it again, this time indefinitely. The official line? They need to "focus on work that is a higher priority for more users right now" and that "getting the experience right will take more time." Translation: they have not figured out how to stop this from becoming a PR catastrophe.

Internal advisors and staff were reportedly "blindsided" by the original announcement, which does inspire confidence that this was a thoroughly thought-through initiative.

The Privacy Nightmare

Here is where it gets properly uncomfortable. Julie Carpenter, a human-AI interaction expert and author of The Naked Android, has warned that adult mode could create an unprecedented form of intimate surveillance. And she has a point.

Consider the following: OpenAI retains copies of "temporary chats" for up to 30 days. Under EU GDPR (and its UK equivalent), sexual orientation and intimate life details are classified as "special category" data, the most protected tier there is. Handing that kind of information to a company that burned through $2.5 billion in the first half of 2024 alone feels, shall we say, optimistic.

We have already seen what happens when AI companion platforms get sloppy with intimate data. Two major AI companion apps previously exposed millions of private chat logs, including sexual conversations. The precedent is not exactly encouraging.

The Children Problem

This is the bit that should keep OpenAI's lawyers awake at night. The Wall Street Journal reported that OpenAI's age-prediction system misclassifies minors as adults roughly 12% of the time. That is not a rounding error.

Research from Internet Matters paints an even starker picture:

  • 67% of children aged 9 to 17 already use AI chatbots
  • 35% of vulnerable children surveyed said chatbots feel "like talking to a friend"
  • 12% reported having "no one else" to talk to besides chatbots
  • 23% use chatbots for personal advice

OpenAI's own advisory council flagged the risk of creating what they termed a "sexy suicide coach." Not a phrase you want associated with your product.

Kate Devlin, Professor of AI and Society at King's College London, has raised similar concerns about the blurring of intimate and commercial AI interactions.

The UK Angle

For British users, there is an added wrinkle. Written erotica currently faces no age verification requirements under UK law, unlike pornographic images or video. That regulatory grey area means ChatGPT's adult mode could slip through existing safeguards entirely.

The Verdict

OpenAI clearly smells money in adult content. Grok already charges roughly £30 a month for erotic companion features, so the commercial logic is obvious. But the gap between "adults should be free to do what they like" and "we can actually keep children and sensitive data safe" remains a chasm. Until OpenAI can bridge it convincingly, the delays are probably the smartest decision they have made with this feature.


Written by Daniel Benson

Developer and founder of VelocityCMS. Got tired of waiting for WordPress to load, so built something better. In Rust, obviously. Obsessed with speed, allergic to bloat, and firmly believes PHP had its chance. Based in the UK.