Scammers Are Cloning Your Voice – And Using It to Trick Your Family

One of the most frightening uses of AI today is voice cloning: scammers can replicate a person’s voice from just a few seconds of audio and use it to trick that person’s loved ones into sending money. The result is a scam that feels more personal and believable than anything we’ve seen before.

The Rise of Voice Cloning Scams

For decades, phishing scams relied on fake emails to trick people into giving up sensitive information. The technique has since branched into smishing (phishing by text message) and vishing (phishing by phone call, increasingly with faked or cloned voices). But what makes voice cloning different is how intimate it feels.

Scammers are now able to mimic a person’s voice so accurately that even close family members can’t tell the difference. According to Nathan House, CEO of cybersecurity firm StationX, “The everyday vishing script is a high-pressure, ‘urgent problem’ phone call. The caller spoofs your bank’s number, claims your account is compromised, and needs you to verify a one-time passcode they just texted – actually your real two-factor code.”

House explained that scammers impersonate “police, utility companies, or a panicked relative demanding emergency funds.” The common pattern is clear: the call appears to come from someone you trust, the message creates a sense of emergency, and the caller demands immediate action, like reading a code or wiring money.

A Technology That’s Becoming Harder to Spot

Voice cloning technology used to require several minutes of high-quality audio to work. But according to cybersecurity journalist Jurgita Lapienyte, “These days, with AI and automation, and other innovations, you just need a couple of seconds of someone’s voice, and you can fake someone’s voice.” She explained that scammers can now grab audio from a TikTok post or a wrong-number voicemail, then feed that into a free cloning engine and start calling potential victims.

The voice often follows a script because the current generation of AI can’t react naturally in a conversation. But that’s changing. “It’s only a matter of time until it actually learns to be more like us and can be weaponized against us,” Lapienyte warned.

A Real and Terrifying Example

In one disturbing case, a woman named Robin received a late-night phone call. On the other end, she heard what sounded like her mother-in-law’s voice crying out, “I can’t do it, I can’t do it.” Robin believed she was being told something tragic had happened, maybe a car crash involving her parents and in-laws. She woke her husband Steve, who works in law enforcement. When he answered the phone, a man told him, “You’re not gonna call the police. You’re not gonna tell anybody. I’ve got a gun to your mom’s head, and I’m gonna blow her brains out if you don’t do exactly what I say.”

Steve kept the caller on the line while texting a colleague with hostage negotiation experience. He asked to hear his mother’s voice. The man refused and threatened to kill her if Steve asked again. Then he demanded $500 via Venmo. “It was such an insanely small amount of money for a human being,” Steve said. But they sent the money. The scammer then asked for more – another $250 – for a plane ticket. Only after the call ended did they reach Steve’s real parents, who were safe and asleep the whole time.

Robin later reflected, “We told everyone we knew to be aware of this very sophisticated thing.” Their family now uses a shared password to verify calls, though her mother-in-law admitted she had already forgotten what it was. “Seven hundred and fifty dollars,” she said. “I still can’t believe that’s all I was worth.”

The Scale of the Problem

The FBI’s Internet Crime Complaint Center received 193,407 complaints of phishing and spoofing in 2024, making the category the most reported cybercrime of the year. By comparison, malware drew just 441 complaints. The agency said that voice phishing and spoofing are particularly dangerous because they are cheap to run and can scale to millions of people.

Older adults are the most common targets. People over the age of 60 reported losses nearing $5 billion in 2024. The agency also found that many of these crimes go unreported. “They are feeling lonely and they don’t want to be ridiculed,” said Lapienyte, explaining why many victims, especially seniors, choose not to come forward.

In another case, a British executive transferred roughly $243,000 to scammers after being tricked by a cloned voice he believed was his boss. “Voice-deepfake heists remain rare headlines, but they’re almost certainly under-reported because companies dread the reputational hit,” said House.

A U.K. man named Tejay Fletcher was sentenced to more than 13 years in prison in 2023 for operating iSpoof, a subscription website that let scammers spoof caller IDs and pose as banks, government agencies, and even relatives. That single operation brought in more than $1 million in Bitcoin from users around the world.

Smishing: The Silent Threat in Your Texts

While voice cloning grabs headlines, smishing remains a major threat. The Internal Revenue Service has warned of mass campaigns that send fake text messages claiming to offer tax refunds, COVID relief, or other benefits. Clicking the links in these messages can lead to websites that collect personal data or install malicious code.

“This is phishing on an industrial scale,” said former IRS Commissioner Chuck Rettig. In some cases, scammers used just a few dozen fake email addresses to create over a thousand fraudulent domains.

Smishing scams may claim to be from banks, delivery services, or government agencies. Some texts even pose as customer service messages, asking users to verify account details.
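
Basic link hygiene helps here. As a rough illustration of the kind of check a filter (or a cautious reader) can apply, the Python sketch below flags links in a text message whose domain doesn’t match a known sender. The allowlist, the function name, and the example message are all hypothetical, and real spam filters rely on much richer threat-intelligence data.

    import re
    from urllib.parse import urlparse

    # Hypothetical allowlist of senders a user actually deals with.
    KNOWN_DOMAINS = {"irs.gov", "usps.com", "chase.com"}

    def suspicious_links(message: str) -> list[str]:
        """Return links whose host is not (a subdomain of) a known sender."""
        flagged = []
        for link in re.findall(r"https?://\S+", message):
            host = urlparse(link).hostname or ""
            if not any(host == d or host.endswith("." + d) for d in KNOWN_DOMAINS):
                flagged.append(link)
        return flagged

    # A lookalike domain fails the check even though it mentions "irs".
    print(suspicious_links("IRS refund waiting: https://irs-refund-now.example.com/claim"))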

What Can You Do?

Experts and agencies say that awareness and preparation are your best defense.

1. Use a Family Safe Word
Choose a phrase that only your family would know—something that isn’t posted on social media or easily guessed. Actor James Nesbitt suggested using “a family in-joke” to confirm identities. If someone calls asking for money, ask them for the phrase. If they can’t say it, hang up.

Nathan House agreed: “It’s a simple, effective speed-bump. If someone calls sounding like your son begging for bail money, asking for the agreed phrase forces the impostor to break character.”

2. Verify Every Emergency Call
If you get a call from a loved one asking for money, hang up and call them back using a number you know is real. If you can’t reach them, contact another family member.

3. Never Send Money Over the Phone
The FTC warns that real emergencies rarely involve wiring money, buying gift cards, or sending crypto. If someone pressures you to act fast and avoid contacting others, it’s likely a scam.

4. Train Yourself and Your Family
Teach elderly relatives about the risks of voice cloning and smishing. Talk about what to do if they receive a suspicious call or message. Lapienyte pointed out, “When someone close to you is calling, you don’t try to verify their identity. You don’t put up protective shields.”

5. Push for Better Protections
The FTC has launched a Voice Cloning Challenge to develop tools that detect fake voices in real time. One winning entry uses a “liveness score” to identify whether a voice is human or synthetic. Others use invisible audio watermarks to confuse cloning software.
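
The winning detectors are trained models and the FTC hasn’t published their code, but the core idea of a liveness score, measuring acoustic properties of a clip and flagging values outside the range of live speech, can be sketched in a few lines. In the Python toy below, the feature (average spectral flatness), the function name, and the synthetic test clip are illustrative assumptions, not the challenge winners’ actual method.

    import numpy as np

    def liveness_score(samples: np.ndarray) -> float:
        """Toy score: average spectral flatness across short frames.

        Illustrative only; real detectors use trained models over
        many features, not one hand-picked statistic.
        """
        frame, hop = 1024, 512
        window = np.hanning(frame)
        flatness = []
        for start in range(0, len(samples) - frame, hop):
            # Magnitude spectrum of one windowed frame
            mag = np.abs(np.fft.rfft(samples[start:start + frame] * window)) + 1e-12
            # Spectral flatness = geometric mean / arithmetic mean, in (0, 1]
            flatness.append(np.exp(np.mean(np.log(mag))) / np.mean(mag))
        return float(np.mean(flatness))

    # Hypothetical usage: two seconds of 16 kHz audio (random noise here).
    clip = np.random.default_rng(0).normal(size=32000)
    print(f"liveness score: {liveness_score(clip):.3f}")

In practice such a score is only meaningful once it has been calibrated against labeled live and synthetic recordings.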

Still, experts like Berkeley professor Hany Farid say the technology is advancing faster than regulation can keep up. He noted that scammers only need to succeed once in a hundred attempts to profit: “The bad guy can fail ninety-nine percent of the time, and they will still become very, very rich.”

The Bigger Picture

Voice cloning isn’t always used for crime. It’s helping people with diseases like ALS keep their voices. It’s letting actors dub films in other languages. Even former President Obama has used it for official announcements. But when tools like ElevenLabs and VALL-E can replicate your voice from a three-second clip, it’s clear that criminals have the same power as corporations.

Senator Jon Ossoff said the issue of deepfake voices is urgent for lawmakers. “Can we get good enough fast enough at discerning real from fake, or will we lose the ability to verify the authenticity of voices, images, video, and other media?” he asked.

For now, protecting yourself means slowing down, verifying everything, and trusting your instincts. Technology may be making life easier, but it’s also making lies sound more like the truth than ever before.