Scenario · 5 minutes · Free
What would you do if a deepfake of you started spreading?
It's 7:34 PM on a Thursday. Your phone buzzes nine times in a row.
What do you do?
Why this scenario is real in 2026
One open-source face-swap model, one 30-second clip of anyone's voice from their TikTok, and the output looks believable on a phone screen. Research from Northwestern Kellogg shows humans detect AI-generated faces and voices only slightly better than a coin flip. Teens are now being targeted both for reputation harm (class pranks, bullying) and for sextortion (fake explicit imagery as blackmail).
The first 10 minutes matter
- Screenshot first. Everything. Usernames, timestamps, the video itself. Evidence disappears fast when a group realizes it's a deepfake.
- Don't fight in the chat. Emotional replies accelerate spread. "Lol it's fake" often reads as confirmation it's not.
- Tell an adult. A parent, a counselor, a teacher — anyone you trust. Even if you're embarrassed. Especially if you're embarrassed.
- Report to the platform. Snapchat, TikTok, and Instagram all have dedicated reporting options for deepfakes and non-consensual imagery. Use them immediately.
- If the content is sexual, stop everything and report to the NCMEC CyberTipline at 1-800-843-5678 or report.cybertip.org. This is sextortion and it's taken seriously by law enforcement.
For parents reading this
Full parent guide: Deepfake Scams Teens Fall For (And How to Spot Them). Covers detection limits, recovery pathways, and what to say to a teen who just got targeted.
More "What would you do if..." scenarios
Your best friend's Discord got hacked
9:47 PM. Alex sends "omg is this you?" with a link. What do you do?
Social · Belonging
You find a secret group chat — without you
Screenshots leak. You weren't invited. Now what?
Safety · Money
Someone DMs you free Robux
10,000 free Robux. "Just verify your account." Is this real or a scam?