Starling Bank, a UK-based online-only bank, has warned that fraudsters can exploit artificial intelligence to clone people's voices and carry out scams. The bank says scammers can replicate a person's voice from just three seconds of audio, which could be lifted from videos posted online. By mimicking the victim's voice, scammers can then deceive friends and family members into sending money.
According to Starling Bank, these AI voice-cloning scams could deceive millions of people. A survey the bank conducted with Mortar Research found that more than a quarter of respondents had been targeted by such a scam in the past year. Notably, 46% of those surveyed did not know these scams existed, and 8% said they would send money requested in such a call even if it seemed suspicious.
Lisa Grahame, Chief Information Security Officer at Starling Bank, emphasized the importance of guarding against these scams. She recommended agreeing a unique 'safe phrase' with loved ones to verify their identity over the phone; the phrase should be easy to remember and distinct from other passwords. Grahame also cautioned against sharing the safe phrase by text, since a scammer who saw the message could use it.
As artificial intelligence technology advances, the risks associated with voice-cloning scams are escalating. Concerns have been raised about the potential misuse of AI to access individuals' bank accounts and spread misinformation. OpenAI, the developer of the ChatGPT chatbot, recently introduced a voice replication tool called Voice Engine but withheld public access due to concerns about synthetic voice misuse.
Individuals should remain vigilant and take proactive steps to protect themselves from AI voice-cloning scams. Staying informed and adopting simple safeguards such as a safe phrase can meaningfully reduce exposure to this kind of fraud.