OpenAI’s New Social App Is Filled With Terrifying Sam Altman Deepfakes


In a video on OpenAI’s new TikTok-like social media app Sora, a never-ending factory farm of pink pigs are grunting and snorting in their pens, each equipped with a feeding trough and a smartphone screen playing a feed of vertical videos. A terrifyingly realistic Sam Altman stares straight at the camera, as though he’s making direct eye contact with the viewer. The AI-generated Altman asks, “Are my piggies enjoying their slop?”

This is what it’s like using the Sora app, less than 24 hours after it launched to the public in an invite-only early access period.

In the next video on Sora’s For You feed, Altman appears again. This time, he’s standing in a field of Pokémon, where creatures like Pikachu, Bulbasaur, and a sort of half-baked Growlithe are frolicking through the grass. The OpenAI CEO looks at the camera and says, “I hope Nintendo doesn’t sue us.” Then there are many more fantastical yet realistic scenes, which often feature Altman himself.

He serves Pikachu and Eric Cartman drinks at Starbucks. He screams at a customer from behind the counter at a McDonald’s. He steals NVIDIA GPUs from a Target and runs away, only to get caught and beg the police not to take his precious technology.

People on Sora who make videos of Altman are especially getting a kick out of how blatantly OpenAI appears to be violating copyright laws. (Sora will reportedly require copyright holders to opt out of the use of their content, reversing the typical approach in which creators must explicitly agree to such use, and the legality of this is debatable.)

“This content may break our guardrails concerning third-party likeness,” AI Altman says in one video, echoing the notice that appears after submitting certain prompts to generate real celebrities or characters. Then he bursts into manic laughter, as though he knows what he’s saying is nonsense; the app is filled with videos of Pikachu doing ASMR, Naruto ordering Krabby Patties, and Mario smoking weed.

This wouldn’t be a problem if Sora 2 weren’t so impressive, especially when compared with the even more mind-numbing slop on the Meta AI app and its new social feed (yes, Meta is also trying to make AI TikTok, and no, nobody wants this).


OpenAI fine-tuned its video generator to adequately represent the laws of physics, which makes for more realistic outputs. But the more realistic these videos get, the easier it will be for this synthetically created content to proliferate across the web, where it can become a vector for disinformation, bullying, and other nefarious uses.

Aside from its algorithmic feed and profiles, Sora’s defining feature is that it is essentially a deepfake generator; that’s how we got so many videos of Altman. In the app, you can create what OpenAI calls a “cameo” of yourself by uploading biometric data. When you first join the app, you’re immediately prompted to create your optional cameo through a quick process in which you record yourself reading off some numbers, then turning your head from side to side.

Each Sora user can control who is allowed to make videos using their cameo. You can set this to one of four options: “only me,” “people I approve,” “mutuals,” and “everyone.”

Altman has made his cameo available to everyone, which is why the Sora feed has become flooded with videos of Pikachu and SpongeBob begging Altman to stop training AI on them.

This has to be a deliberate move on Altman’s part, perhaps as a way of showing that he doesn’t think his product is dangerous. But users are already taking advantage of Altman’s cameo to question the ethics of the app itself.

After watching enough videos of Sam Altman ladling GPUs into people’s bowls at soup kitchens, I decided to test the cameo feature on myself. It’s generally a bad idea to upload your biometric data to a social app, or any app for that matter. But I defied my best instincts for the sake of journalism, and, if I’m being honest, a bit of morbid curiosity. Do not follow my lead.

My first attempt at making a cameo was unsuccessful, and a pop-up told me that my upload violated app guidelines. I thought that I had followed the instructions pretty closely, so I tried again, only to find the same pop-up. Then I realized the problem: I was wearing a tank top, and my shoulders were perhaps a bit too risqué for the app’s liking. It’s actually a reasonable safety feature, designed to prevent inappropriate content, though I was, in fact, fully clothed. So I changed into a t-shirt, tried again, and, against my better judgment, created my cameo.

For my first deepfake of myself, I decided to create a video of something that I would never do in real life. I asked Sora to create a video in which I profess my undying love for the New York Mets.

That prompt got rejected, most likely because I named a specific franchise, so I instead asked Sora to make a video of me talking about baseball.

“I grew up in Philadelphia, so the Phillies are basically the soundtrack of my summers,” my AI deepfake said, speaking in a voice very unlike mine, but in a bedroom that looks exactly like mine.

I did not tell Sora that I am a Phillies fan. But the Sora app is able to use your IP address and your ChatGPT history to tailor its responses, so it made an educated guess, since I recorded the video in Philadelphia. At least OpenAI doesn’t know that I’m not actually from the Philadelphia area.

When I shared and explained the video on TikTok, one commenter wrote, “Every day I wake up to new horrors beyond my comprehension.”

OpenAI already has a safety problem. The company is facing concerns that ChatGPT is contributing to mental health crises, and it’s facing a lawsuit from a family who alleges that ChatGPT gave their deceased son instructions on how to kill himself. In its launch post for Sora, OpenAI emphasizes its supposed commitment to safety, highlighting its parental controls, as well as how users have control over who can make videos with their cameo, as if it’s not irresponsible in the first place to give people a free, user-friendly tool to create highly realistic deepfakes of themselves and their friends. When you scroll through the Sora feed, you occasionally see a screen that asks, “How does using Sora impact your mood?” This is how OpenAI is embracing “safety.”

Already, users are working their way around Sora’s guardrails, something that’s inevitable for any AI product. The app does not allow you to make videos of real people without their permission, but when it comes to dead historical figures, Sora is a bit looser with its rules. No one would believe that a video of Abraham Lincoln riding a Waymo is real, given that it would be impossible without a time machine. But then you see a realistic-looking John F. Kennedy say, “Ask not what your country can do for you, but how much money your country owes you.” It’s harmless in a vacuum, but it’s a harbinger of what’s to come.

Political deepfakes aren’t new. Even President Donald Trump himself posts deepfakes on his social media (just this week, he shared a racist deepfake video of Democratic leaders Chuck Schumer and Hakeem Jeffries). But when Sora opens to the public, these tools will be at all of our fingertips, and we will be destined for disaster.
