A terrifying scam is spreading fast: criminals use AI to clone a loved one’s voice and call families with a fake emergency, like a kidnapping or accident, then pressure them to wire money immediately.

In one recently reported case, a mother received a call that appeared to come from her daughter’s number and sounded exactly like her daughter. She lost thousands of dollars before realizing it was a scam.

This is not rare anymore. McAfee’s research found that one in ten people surveyed said they had received a message from an AI voice clone, and 77% of those victims said they lost money.

Meanwhile, the Federal Trade Commission says Americans reported $12.5 billion in fraud losses in 2024, with imposter scams totaling $2.95 billion.


Why the scam works

Voice is emotional proof. We’re wired to trust it, especially when it sounds like a child, parent, or partner. With modern cloning tools, scammers may only need a short sample from social media videos, voicemails, or other recordings to generate convincing audio.

Analysis shows AI voice scams now require as little as 30 seconds of audio to clone a loved one’s voice, which is why experts stress learning to spot and stop these fake emergency calls in real time.

Olga Scryaba, head of product at isFake.ai, explains: “Voice used to be a strong ‘trust signal’. Now it’s just another file that can be copied. If a call tries to rush you into money or secrecy, treat the voice as unverified until you confirm it through a second channel.”

She advises anyone who gets a terrifying call from a “loved one” to take the following steps, in order:

  • Interrupt the script with a “proof question” – Ask something a cloner won’t know: “What was the name of our first pet?” “What did we eat last Sunday?” Avoid info visible on social media.
  • Use a family safe word (yes, really) – advice echoed by the National Cybersecurity Alliance: agree on a code word or phrase for emergencies and make it universal across the family (including grandparents).
  • Hang up and call back on a saved number – Do not trust caller ID. Call the person (or another family member) using a number you already have saved.
  • Slow the money down – Scammers rely on urgency. Any request to wire money, buy gift cards, or send crypto during an “emergency call” is a flashing red flag.
  • If you have a recording, verify it – If the scam left a voicemail/voice note, run it through an AI audio checker before you treat it as evidence. (This is where detection tools help when your ears can’t.)


Some red flags

Even good fakes can slip. Still, listen for:

  • Odd timing: unnatural pauses, latency before responses, or answers that dodge follow-up questions.
  • Too-clean speech: perfect pronunciation, limited natural breathing, weirdly consistent cadence.
  • Context gaps: vague details (“I’m in trouble, send money”) instead of specifics only the real person would share.

As Olga notes: “The biggest tell is usually not the sound, it’s the behaviour. Real people will let you verify. Scammers will punish you for trying.”