You get a call from your mom. She's panicked. Your nephew's been in an accident and needs bail money right now.
Except it's not your mom.
It's a machine.
Welcome to the world of deepfake voice scams, and they're exploding.
How Bad Is It?
According to Hiya's 2026 State of the Call report, one in four Americans has received a deepfake voice call in the past 12 months. Another 24% aren't sure they could tell the difference. That means nearly half the population has either been targeted or wouldn't know if they were.
And it's not just random robocalls anymore. AI can now clone a person's voice from just a few seconds of audio, scraped from a TikTok, an Instagram Story, a YouTube video, or even a voicemail greeting.
One real case: a 90-year-old woman received a call from a voice that sounded exactly like her grandson's, begging for money. She was so shaken that she refused to answer her phone for months afterward.
Who's Being Targeted?
Everyone.
Seniors are hit with the classic "grandparent scam" — a fake grandchild calling in a panic. Average loss per senior victim: $1,298.
Professionals are targeted with fake calls from their "CEO" or "CFO." In one case, a finance worker transferred $25 million after a video call where every person on screen was a deepfake.
Parents get calls with their "child's" voice, crying and begging for help.
Regular people get fake jury duty warrants, IRS threats, and law enforcement impersonation calls.
How Does It Work?
The scammer doesn't need much:
They find a short clip of someone's voice online
They feed it into an AI voice cloning tool (many are free)
They call the target, sounding exactly like a trusted person
They create urgency: accidents, arrests, emergencies, anything to stop you from thinking clearly
The whole point is to bypass your brain's natural defenses. You hear a familiar voice. You feel the panic. You act before you think.
How to Protect Yourself
Create a family safe word. Pick a random, nonsensical phrase, something like "Purple Cactus" or "Midnight Penguin." Share it only with close family and friends, never online. If someone calls in distress, ask for the safe word. An AI clone can't guess what it was never trained on.
Hang up and call back. If you get an urgent call from someone you know, hang up and call them directly using the number you already have saved. Don't use any number the caller gives you.
Watch your "audio footprint." Every public video, voice note, or voicemail you post is raw material for a voice clone. That doesn't mean you have to go silent; just be aware that your voice is now a piece of your digital identity.
Don't act under pressure. Scammers rely on urgency. A real emergency can wait 60 seconds for you to verify. A scam can't.
Be skeptical of any call asking for money, especially via gift cards, wire transfers, crypto, or apps like Zelle or CashApp. No legitimate person or organization operates that way.
📌 Quick Takeaways
🎙️ AI can clone anyone's voice from just a few seconds of audio.
📊 1 in 4 Americans has received a deepfake voice call in the past 12 months.
👴 Seniors lose an average of $1,298 per incident.
🏢 Businesses aren't safe either: one company lost $25 million to a deepfake video call.
🔑 A family safe word is the simplest, most effective defense.
✅ Bottom Line
If someone calls you in a panic, whether it's a loved one, a boss, or a government agency: pause. Hang up. Verify independently.
Your ears can no longer be trusted. But your brain still can.
Stay safe out there.
Until next time — stay private, stay safe.
— Peter Oram
Chief Cyber Safety Evangelist
P.S.: I’m working on a practical iPhone safety guide for parents.
Reach out if you’re interested in early access.
Join the Community! A Facebook group where you can ask questions, get tips, and help others.
Want more practical tips like this?
👉 Subscribe or read past issues at newsletter.cybersafety.group
Have a topic you’d like covered?
📬 Email me directly: [email protected]