Former TV Meteorologist Battles AI Sextortion Scam That Hijacked Her Face

Bree Smith - Deepfake Sextortion

By any measure, Bree Smith should have been celebrating a fulfilling career and life. She was a beloved meteorologist in Nashville, a mother, and a public figure whose presence was built on years of hard work. But instead of charting cold fronts, she's now at the epicenter of a chilling new digital storm, fighting for her identity, her dignity, and her children's peace of mind.

Smith’s life was upended by a modern horror story: deepfake sextortion. It started with a single email. A concerned stranger reached out to let her know an impersonator was using her face. What followed was a nightmare: dozens—eventually hundreds—of online accounts sharing explicit images and videos, falsely featuring Smith’s face on someone else’s body, offering sexual favors in exchange for money.

“They’re weaponizing me,” Smith said through tears. “You’re taking someone’s identity and using it to hurt them—and hurt others.”

And make no mistake—this isn’t just about embarrassment. It’s about theft. Of reputation. Of safety. Of the right to exist online without being turned into someone else’s profit machine.

The FBI says more than 54,000 Americans were victims of sextortion last year alone. The usual victims are teens—often boys—but adults, especially public-facing ones like Smith, are increasingly targeted too. Why? Because they can pay.

For months, Smith fought it alone. She tracked the impersonator accounts in a spreadsheet, blocking what she could. But it was like whack-a-mole: 24 new accounts appeared in one week alone. "I'm just supposed to look at them and say, 'nothing we can do, bud?'" she told Tennessee lawmakers.

But instead of folding inward with shame—what these scammers count on their victims to do—Bree Smith decided to fight back.

She testified in front of lawmakers. She went public. She backed the Preventing Deepfake Images Act, which just passed the Tennessee Senate and is expected to become law. The legislation will allow victims to sue anyone who distributes nonconsensual AI-generated images of them.

And she’s not stopping there.

“This is my story,” Smith declared. “I’ve worked hard, I’ve loved well, and I’m not going to roll over and take this.”

Her advocacy has come at great personal cost. “I cry myself to sleep most nights,” she admits. “Mostly because I don’t want my kids to see me like this.”

But she’s also lighting the path forward—not just for herself, but for the next wave of victims, many of whom won’t have the platform she does.

Because here’s the truth: this could happen to anyone. Your face, your name, your photos are all online. And the tools to manipulate them are getting faster, cheaper, and more convincing by the day. What was once science fiction is now an economic model for criminals.

So what can you do?

  1. Stay vigilant: Set up alerts for your name, and reverse image search your photos occasionally.

  2. Report quickly: If you find content, report it to the platform and to the FBI’s Internet Crime Complaint Center.

  3. Talk about it: The silence around these scams is what allows them to flourish. Shame thrives in secrecy.

And most importantly, support victims who speak up. Because they’re doing what our laws and platforms are still catching up to: drawing a line in the digital sand and saying—this is mine. My story. My face. My life.

And no, you don’t get to take it.
