
For most of internet history, the rule was simple: “Pics or it didn’t happen.” Then came Photoshop, and that confidence cracked. Now, with OpenAI’s Sora and other AI video generators like Meta’s Vibes and Google’s Veo 3, even the videos you watch can’t be trusted.
We’ve officially entered the era of visual lies—and it’s about to change how scams work forever.
From Entertainment to Manipulation
Sora looks like harmless fun at first. You type a prompt — “a raccoon flying a plane” or “a cat climbing a bouldering wall” — and out pops a hyper-realistic video in seconds. It’s silly, addictive, and oddly magical.
But beneath that novelty is a darker possibility: anyone can now create a convincing video of anything, anywhere, at any time.
A fake police chase. A doctored “bodycam” clip. A fabricated news broadcast showing your favorite celebrity admitting to crimes.
That’s not sci-fi — that’s this month.
“AI is getting INSANE! 🤯 Stephen Hawking escapes the Police (Made on Sora 2) 👇 pic.twitter.com/Z5hfKr38iV” — John Savage (@johnsavage_eth), October 8, 2025
And just like deepfake audio has been used to mimic a CEO’s voice for wire-fraud schemes, Sora-style videos are about to give scammers an entirely new arsenal.
Scams Built for the Visual Age
Here’s how this changes the game:
1. The “Proof” Scam
Until now, fraudsters relied on screenshots or fake documents to “prove” their claims — fake investment dashboards, counterfeit invoices, spoofed bank statements.
Now, they can send dashcam footage, security-camera clips, or even “FaceTime videos” showing events that never happened.
Example: an AI-generated video of a supposed hit-and-run used to frame a driver for insurance fraud. The video looks authentic, down to the reflections on the windshield and license-plate numbers.
2. The “Fake Arrest” or “Fake Crisis” Scam
Imagine getting a video of your child or spouse “in police custody,” pleading for help. The voice and setting look real. Your panic overrides logic — and before you can think, you’ve wired the ransom money.
This is the next generation of voice-cloning kidnapping scams, but now with visual confirmation to seal the emotional manipulation.
3. The “Authority” Scam
A fake video of a mayor, health official, or police chief making an announcement could go viral in minutes.
Think about the chaos that could come from an AI-generated “breaking news” clip warning of a chemical spill or fake bank run. It doesn’t need to stay online for long — just long enough to make people panic, click, and comply.
4. Reputation Sabotage
Scammers and extortionists can fabricate incriminating “leaks” — footage of you saying or doing something outrageous — and demand payment to delete it.
The burden of proof shifts unfairly: you’ll have to prove something didn’t happen.
Why Humans Are Easy to Fool
As UC Berkeley professor Ren Ng points out, our brains are wired to believe what we see.
We evolved that way: visual confirmation meant survival. The tiger you saw was the tiger that could eat you.
But now, that wiring is being exploited.
Every emotion scammers prey on —
