PC Gamer
Rich Stanton

Financial worker attends company meeting with AI deepfakes of senior 'colleagues' and is duped into transferring the scammers $26 million

(Image: a face emerging from a sea of data.)

A finance employee at an unnamed major multinational corporation has been fooled into transferring HK$200 million (around $25.6 million) to scammers using deepfake technology to impersonate his colleagues. The AI-created simulacra of the man's fellow workers included a deepfake of the company's Chief Financial Officer (CFO), and Hong Kong police say the scam took place via a video conference call (as reported by CNN).

The scammers "invited the worker to a video conference that would have many participants," said senior superintendent Baron Chan Shun-ching to broadcaster RTHK. "Because the people in the video conference looked like the real individuals concerned, the worker made 15 transactions as instructed to five local bank accounts, which came to a total of HK$200 million. It turns out that everyone [the worker saw] was fake."

Chan is from the Hong Kong police's cyber security division and, regarding the deepfake technique used, says: "I believe the fraudster downloaded videos in advance, and then used artificial intelligence to add fake voices within the video conference." He said the incident began in January, when the worker received a message purporting to be from the company's UK-based CFO. The message invited him to a video call to discuss a confidential transaction the company had to make.

If you're thinking "well there's your first red flag", so did the victim here. The worker was suspicious of the message, and thought the secrecy surrounding the transaction suggested this might be a phishing attack. Unfortunately the worker allowed their doubts to be assuaged and, once on the video call, was convinced that the video and audio deepfakes of several colleagues were the real deal.

The worker made the transactions as requested. Officer Chan says that after the conference call the worker contacted the company's head office, and realised it had been a scam.

"We want to alert the public to these new deception tactics," says Chan. "In the past, we would assume these scams would only involve two people in one-on-one situations, but we can see from this case that fraudsters are able to use AI technology in online meetings, so people must be vigilant even in meetings with lots of participants."

The Hong Kong police has a list of recommendations that will be familiar to anyone who's had to go through corporate infosec training, but with specific regard to this case recommends that any employee invited to a suspicious meeting try to confirm the details through their company's regular communication channels. They also suggest that you ask questions during meetings to check if the other participants are real and who they say they are… which is going to make PC Gamer's next few staff meetings a lot more fun.

At Friday's press briefing, the Hong Kong police said they'd made six arrests in connection with scams such as this, but did not say whether any were related to this particular crime. 

Chan also offered an interesting, if lower-level, example of how deepfake technology is being used in the region, by talking about stolen identity cards. If you live in Hong Kong you have a mandatory ID card (HKID) from the age of 11, and this government document is necessary to do practically anything. Chan said eight stolen HKID cards (all of which had been reported as lost by their owners) were used to make 90 loan applications and register for 54 bank accounts over a period of three months last year. During at least 20 of these application processes, deepfakes of the HKID cards' owners had been used to fool facial recognition programs.

Examples of deepfake technology have been around for years, some certainly better than others, but it feels like 2024 is gearing up to be the year of the deepfake. It began with a scandal around fake sexual images of the pop star Taylor Swift, part of which was how quickly various platforms allowed the content to go viral, but the acid test will come later this year in the US presidential election. Whether it's audio or video, there are many examples of notable figures like President Biden being deepfaked into saying or doing things they haven't done and, amid this slurry, it doesn't seem that the likes of Facebook have much of a clue what to do about it.
