
Heads up! That call from a panicky relative may be a scammer voice clone.

The FTC is warning people to look out for the latest scam trend.

A man makes a phone call from prison. (via Pexels)

One of the oldest frauds in the book is the “your loved one is in trouble” scam. Scammers call posing as a grandchild or other loved one in distress, claiming to have been kidnapped or thrown in jail. The scammer may also impersonate a nurse, police officer, lawyer or other authority figure speaking on the loved one’s behalf.

The scammer claims that the loved one needs money wired to the fraudster immediately to bring them to safety.

The scam works because the victim is under pressure to send money quickly and doesn’t have time to consider that the call may be a fraud. All the while, they imagine the torment their loved one is going through. That sense of urgency makes it much more likely the victim will hand over the money.


The FTC is warning people that scammers have given this con a new technological twist: faking the loved one’s voice with AI-powered voice cloning.

“Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie,” the FTC’s warning reads. “We're living with it, here and now. A scammer could use AI to clone the voice of your loved one. All he needs is a short audio clip of your family member's voice—which he could get from content posted online—and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one.”

All fraudsters need to realistically clone someone’s voice is a 30-second audio clip, which they can easily rip from Facebook, TikTok, Instagram, podcasts or commercials. Voice-generating software analyzes what makes a person’s voice sound unique, then searches vast databases for similar voices. That allows scammers to replicate someone’s voice in real time and place a phone call that sounds authentic.

The scam is terribly difficult to detect because voice-cloning software has become increasingly accurate, and as AI technology improves, it will only get harder to avoid being fooled.


A scammer finds his next victim. (via Pexels)

"It's terrifying," Hany Farid, a professor of digital forensics at University of California, Berkeley, told The Washington Post. "It's sort of the perfect storm ... [with] all the ingredients you need to create chaos."

FTC officials say Americans lost $8.8 billion to fraud in 2022, with imposter scams being the most common, and there’s usually no way to get the money back. Scammers typically demand payment through cryptocurrency, money wires or gift cards, which makes the money nearly impossible to trace.

The FTC wants people to think twice if they receive a phone call from a loved one in distress or someone claiming to be their representative, especially if they ask for money.

“Don’t trust the voice,” the FTC warning reads. “Call the person who supposedly contacted you and verify the story. Use a phone number you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friends.”

If you are targeted by one of these voice clone scams, report it to the FTC immediately. You could prevent the next person from being scammed.