Chronicle Live
National
Catherine Furze

Scammers need just three seconds of your voice to con your friends and family

Scammers may need just a three-second snippet of your voice to con your friends and family into handing over money.

Researchers at online protection specialists McAfee found that artificial intelligence (AI) technology is fuelling a rise in online voice scams, with nearly a quarter of British people saying that they or a friend have already been targeted.

Half of all UK adults share their voice online or on social media at least once a week, and criminals can clone that voice, then use it to target friends and family over the phone or via a voice note, convincing them you are in an emergency and asking them to send money. The research found that 78% of AI voice scam victims lost money, with 40% losing over £1,000 and 6% losing between £5,000 and £15,000.


Everybody’s voice is unique, which is why hearing somebody speak is a widely accepted way of establishing trust. But cloning how somebody sounds is now a powerful tool in the arsenal of a cybercriminal, and McAfee's research team concluded that it has never been easier to commit cybercrime.

Over 1,000 UK adults were among the 7,000 people surveyed worldwide for McAfee's The Artificial Imposter report. It found that almost a quarter of Britons report that they or a friend have already experienced some kind of AI voice scam, with one in 12 targeted personally and 16% saying it happened to someone they know. Nearly four in five victims confirmed they had lost money as a result.

With the rise in artificial intelligence tools, it is easier than ever to manipulate images, videos, and the voices of friends and family members. McAfee found scammers are using AI technology to clone voices, then send a fake voicemail to the victim's contacts, or call them, pretending to be in distress. And with 65% of adults not confident that they could distinguish the cloned version from the real thing, it's no surprise that this technique is gaining momentum.

More than three in 10 respondents said they would reply to a voicemail or voice note purporting to be from a friend or loved one in need of money, particularly if they thought the request had come from their partner or spouse (36%), child (28%) or parent (26%). Messages most likely to elicit a response were those claiming that the sender had been involved in a car accident (47%), been robbed (45%), lost their phone or wallet (43%), or needed help while travelling abroad (42%).

“Artificial Intelligence brings incredible opportunities, but with any technology there is always the potential for it to be used maliciously in the wrong hands. This is what we’re seeing today with the access and ease of use of AI tools helping cybercriminals to scale their efforts in increasingly convincing ways,” said Vonny Gamot, head of Europe, the Middle East and Africa at McAfee.

“Advanced artificial intelligence tools are changing the game for cybercriminals. Now, with very little effort, they can clone a person’s voice and deceive a close contact into sending money. It’s important to remain vigilant and to take proactive steps to keep you and your loved ones safe. Should you receive a call from your spouse or a family member in distress and asking for money, verify the caller. Identity and privacy protection services will also help limit the digital footprint of personal information that a criminal can use to develop a compelling narrative when creating a voice clone.”

Using the cloning tools they found, McAfee's researchers had no trouble replicating accents from around the world, whether from the US, UK, India or Australia, but more distinctive voices proved harder to copy. For example, the voice of a person who speaks with an unusual pace, rhythm or style requires more effort to clone accurately, and such people are less likely to be targeted as a result.

How to Protect Yourself from AI Voice Cloning

  • Set a verbal ‘codeword’ - Agree a word with friends and family that only they could know. Make a plan to always ask for it if they call, text or email to ask for help, particularly if they’re older or more vulnerable.
  • Always question the source – If it’s a call, text or email from an unknown sender, or even if it’s from a number you recognise, stop, pause and think. Does that really sound like them? Hang up and call the person directly or try to verify the information before responding and certainly before sending money.
  • Think before you click and share – Be thoughtful about the friends and connections you have online. The wider your connections and the more you share, the more risk you may be opening yourself up to having your identity cloned for malicious purposes.
