Sam Altman Says ChatGPT Will Soon Allow Erotica For Adult Users


OpenAI CEO Sam Altman announced in a post on X Tuesday that the company will soon relax some of ChatGPT’s safety restrictions, allowing users to make the chatbot’s responses friendlier or more “human-like,” and allowing “verified adults” to engage in seductive conversations.

“We made ChatGPT pretty restrictive to make sure we were being careful with mental health issues. We realize this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue we wanted to get this right,” said Altman. “In December, as we roll out age-gating more fully and as part of our ‘treat adult users like adults’ principle, we will allow even more, like erotica for verified adults.”


The announcement is a notable pivot from OpenAI’s months-long effort to address the concerning relationships that some mentally unstable users have developed with ChatGPT. Altman seems to declare an early victory over these problems, claiming OpenAI has “been able to mitigate the serious mental health issues” around ChatGPT. However, the company has provided little to no evidence for this, and is now plowing ahead with plans for ChatGPT to engage in sexual chats with users.

Several concerning stories emerged this summer about ChatGPT, specifically its GPT-4o model, suggesting the AI chatbot could lead vulnerable users down delusional rabbit holes. In one case, ChatGPT seemed to convince a man that he was a math genius who needed to save the world. In another, the parents of a teen sued OpenAI, alleging that ChatGPT encouraged their son’s suicidal ideations in the weeks leading up to his death.

In response, OpenAI released a series of safety features to address AI sycophancy: the tendency for an AI chatbot to hook users by agreeing with whatever they say, even negative behaviors.

OpenAI launched GPT-5 in August, a new AI model that exhibits lower rates of sycophancy and features a router that can identify concerning user behavior. A month later, OpenAI launched safety features for minors, including an age-prediction system and a way for parents to control their teen’s ChatGPT account. On Tuesday, OpenAI announced a council of mental health experts to advise the company on well-being and AI.

Just a few months after these stories emerged, OpenAI seems to think ChatGPT’s problems around vulnerable users are under control. It’s unclear whether users are still falling down delusional rabbit holes with GPT-5. And while GPT-4o is no longer the default in ChatGPT, the model is still available today and is still being used by thousands of people.


OpenAI did not respond to TechCrunch’s request for comment.

The introduction of erotica in ChatGPT is uncharted territory for OpenAI, and it raises broader concerns about how vulnerable users will interact with the new features. While Altman insists OpenAI isn’t “usage-maxxing” or optimizing for engagement, making ChatGPT more seductive could certainly draw users in.

Allowing chatbots to engage in romantic or seductive role play has been an effective engagement strategy for other AI chatbot providers, such as Character.AI. The company has gained tens of millions of users, many of whom use its chatbots at a high rate. Character.AI said in 2023 that users spent an average of two hours a day talking to its chatbots. The company is also facing a lawsuit over how it handles vulnerable users.

OpenAI is under pressure to grow its user base. While ChatGPT is already used by 800 million weekly active users, OpenAI is racing against Google and Meta to build mass-adopted, AI-powered consumer products. The company has also raised billions of dollars for a historic infrastructure buildout, an investment OpenAI eventually needs to pay back.

While adults are certainly having romantic relationships with AI chatbots, the practice is also quite popular among minors. A recent study from the Center for Democracy and Technology found that 19% of high school students have either had a romantic relationship with an AI chatbot or know a friend who has.

Altman says OpenAI will soon allow erotica for “verified adults.” It’s unclear whether the company will rely on its age-prediction system, or some other approach, for age-gating ChatGPT’s seductive features. It’s also unclear whether OpenAI will extend erotica to its AI voice, image, and video generation tools.

Altman claims that OpenAI is also making ChatGPT friendlier and more seductive because of the company’s “treat adults like adults” principle. Over the past year, OpenAI has shifted toward a more lenient content moderation strategy for ChatGPT, allowing the chatbot to be more permissive and offer fewer refusals. In February, OpenAI pledged to represent more political viewpoints in ChatGPT, and in March, the company updated ChatGPT to allow AI-generated images of hate symbols.

These policies appear to be an effort to make ChatGPT more popular with a wide variety of users. However, vulnerable ChatGPT users may benefit from safeguards that limit what a chatbot can engage with. As OpenAI races toward a billion weekly active users, the tension between growth and protecting vulnerable users may only grow.
