
Sora A.I. Video Exposes Netflix Star’s Chest, Company Says They’re Working on It
Sora Vid Exposes Influencer’s Chest!!!
OpenAI Says This One Slipped Past
The Sora A.I. video generator may be the hot new toy from OpenAI … but it's already creating an explosion of drama, with one influencer speaking out about how someone used it to fake a clip that made it look like she was flashing the camera!
Avori Strib tells TMZ the bogus Sora clip, which showed her exposing her chest in a deepfake video, shines a spotlight on the growing danger of unregulated A.I. technology spitting out deeply invasive, harmful content … especially when real people are used as subjects.
Strib says A.I. companies like OpenAI, the company behind Sora, need to take real responsibility for protecting people from privacy and identity violations.
Avori — an influencer, streamer and star of Netflix’s “Battle Camp” and “The Mole” — says she’s all for A.I.’s creative potential … but she stresses the smart move would be for companies to develop the software safely before unleashing it on the public.
ICYDK … Sora's an app that creates realistic videos from users' text or image prompts. OpenAI already took some heat this week after users created videos using Martin Luther King Jr.'s likeness … which the company acknowledged were "disrespectful depictions" of the civil-rights giant.
Avori’s now teaming up with her crew and a legal advisor to take the fight straight to the platform … saying she hopes this mess sparks real awareness and accountability in the A.I. world so no one else has to go through it.
She’s also asking folks to stop spreading or engaging with shady, unauthorized content — especially stuff that invades her privacy without consent.
An OpenAI spokesperson responded to TMZ, saying sexually explicit or pornographic content isn't allowed on Sora, and the fake Avori clip has already been yanked. They admitted this one slipped past their systems, but stressed they're beefing up safeguards to prevent it from happening again … adding their tech also blocks people from using cameos to make or recreate this kind of content.