Audio deepfake scams: Criminals are using AI to sound like family and people are falling for it

Audio deepfakes are being used to scam elderly family members out of thousands of euros.
By Sophia Khatsenkova

Artificial intelligence (AI) is being used to recreate the sound of family members' voices in order to scam people out of thousands of euros.


Imagine you receive a phone call from a family member. They sound distressed and beg you to send them money.

In the end, you find out you’ve been scammed. The call you received was never them but a fake voice created using artificial intelligence (AI).

This isn’t the plot of a science-fiction novel. In fact, for thousands of people, it's very much a reality.

This new technology is called an audio deepfake (a portmanteau of "deep learning" and "fake"). With just a few euros to spare, you can pay a number of companies to produce high-quality voice clones that can be convincing enough to fool someone.

In one recent example that went viral, users of the online forum 4chan replicated British actress Emma Watson’s voice reading Adolf Hitler’s Mein Kampf.

They used a voice cloning tool called Prime Voice by the start-up ElevenLabs. Another example of how eerily accurate the technology can be is a replication of actor Leonardo DiCaprio speaking at the United Nations.

Many experts are becoming concerned. First, the technology can be used for misinformation, for instance by making people believe a politician made a shocking statement they never did. Second, it can be used to scam people, especially the elderly.

It doesn't require a lot of work, according to computer science experts. 

"They don't need very much of the voice to make a pretty good reproduction," said Matthew Wright, Chair of Computer Science at the Rochester Institute of Technology, in an interview with Euronews.

"You can imagine they just call them up, pretend to be a salesperson, and capture just enough of the audio to make it work. And that's maybe all they would need to fool someone," he explained. 

Recently, a Vice journalist managed to break into his own bank account using an AI replica of his voice. He tricked the bank into thinking it was him in order to access his transactions – calling into question the reliability of voice biometric security systems.

The company whose tool was used to create the Emma Watson audio deepfake announced it had increased the price of its service and begun manually verifying new accounts.

But experts warn that legislation should be put in place to prevent such scams in the future.

What should you do to avoid being scammed?

First of all, experts suggest you should be wary of unexpected, urgent calls asking for money that appear to come from loved ones or from work.

Wright suggests asking the caller some personal questions to verify their identity.

"You should try using a different source, a different channel. Make up an excuse and say you have to call them back. Then, call back on what you know to be their number. If they're calling and they've spoofed the number, they won't receive the call back," he told Euronews.

"But in general, just take a step back and think. Make sure that this makes sense. Does it really make sense that your boss is asking you to buy ten $250 [€231] gift certificates? Probably not."

For more on this story, watch our report from The Cube in the media player above.
