AI-Powered Fake Kidnapping Scam
Scammers use AI-cloned voices to stage fake kidnappings of family members, demanding ransom while keeping the victim on the phone.
🚩 Red Flags
- ⚠ A kidnapping call demanding immediate ransom
- ⚠ Insistence that you stay on the phone
- ⚠ The caller won't let the "victim" speak freely
- ⚠ Payment demanded in crypto or by wire transfer
- ⚠ You can't verify the victim's location
🛡️ Protect Yourself
- → STAY CALM: most such calls are fake
- → TRY TO CONTACT the "victim" separately
- → ASK questions only they would know
- → CALL 911 if you can't verify their safety
- → ESTABLISH a family code word in advance
What the Call May Sound Like
- “[Cloned voice crying] Mom, help me!”
- “We have your daughter, don't call police”
- “Send $10,000 Bitcoin or she dies”
- “Stay on the line or we hurt her”
- “You have one hour”
Common Questions
Can AI really clone a loved one's voice?
Yes. AI needs only about 3 seconds of audio from TikTok, Instagram, or a voicemail. The clone can say anything and sounds nearly identical.

How do I protect my family?
Establish a family code word for emergencies. If you get an urgent call, hang up and call back on a known number. Be cautious about posting voice content online.

Can scammers fake video calls too?
Yes. AI can replace the scammer's face with an executive's face in real-time video calls, typically to authorize fraudulent wire transfers. In one case, a company lost $25M.

Can I spot a deepfake?
Not completely. Deepfakes now run in real time. Look for odd movements and verify through a separate channel.
Report This Scam
If you've encountered this scam, report it to help protect others.
Warn Someone You Know
Know someone who might fall for this? Share this warning with them.
Related Scams
AI Voice Cloning Scam
Scammers use AI to clone voices from social media videos. One short clip can create a convincing fake emergency call. This is among the fastest-growing scam threats.
Grandparent Emergency Scam
Scammers pose as a grandchild in crisis—arrested, hospitalized, stranded. Modern versions use AI voice cloning. FBI IC3: 357 complaints, $2.7M in losses in 2024.
Think you received a message like this?
Free • Private • No signup required
🔒 We analyze your message — then it's gone.
73% of Americans targeted (Pew, 2025) • $470M lost to text scams in 2024 (FTC) • $16.6B total losses (FBI IC3, 2024)