AI Voice-Cloning Scams Target Millions, Warns UK Bank

[Image: An AI sign at the World Artificial Intelligence Conference (WAIC) in Shanghai]

Starling Bank, a UK-based online lender, has warned that fraudsters can exploit artificial intelligence to clone people's voices and carry out scams. According to the bank, scammers can replicate a person's voice from just three seconds of audio, which could be sourced from videos posted online. By mimicking the victim's voice, scammers can then deceive friends and family members into sending money.

According to Starling Bank, these AI voice-cloning scams could deceive millions of people. A recent survey conducted by the bank in collaboration with Mortar Research found that more than a quarter of respondents had been targeted by such a scam in the past year. Notably, 46% of those surveyed were unaware that these scams exist, and 8% admitted they would send money in response to a request from a call they found suspicious.

Scammers use voice replication to trick friends and family into sending money.
AI technology can clone a voice from just three seconds of audio.
More than a quarter of survey respondents have been targeted by AI voice-cloning scams.

Lisa Grahame, the Chief Information Security Officer at Starling Bank, emphasized the importance of safeguarding against these scams. She recommended establishing a unique 'safe phrase' with loved ones to verify their identity over the phone. This phrase should be easy to remember and distinct from other passwords. Grahame cautioned against sharing the safe phrase via text, as scammers could exploit this information.

As artificial intelligence technology advances, the risks associated with voice-cloning scams are escalating. Concerns have been raised about the potential misuse of AI to access individuals' bank accounts and spread misinformation. OpenAI, the developer of the ChatGPT chatbot, recently introduced a voice replication tool called Voice Engine but withheld public access due to concerns about synthetic voice misuse.

It is crucial for individuals to remain vigilant and take proactive measures to protect themselves from AI voice-cloning scams. By staying informed and implementing security measures like safe phrases, people can reduce their vulnerability to fraudulent activities.
