Meta has just announced plans to bring facial recognition technology back to Facebook and Instagram. This time, it's pitched as a security measure to help combat "celeb-bait" scams and restore access to compromised accounts.
"We know security matters, and that includes being able to control your social media accounts and protect yourself from scams," wrote the Big Tech giant in a blog post published on Monday, October 21.
Meta wants to use facial recognition technology to detect scammers who use images of public figures to carry out attacks. The company proposes to compare images in adverts or on suspicious accounts against celebrities' legitimate photos. The facial recognition tech will also allow regular Facebook and Instagram users to regain access to their own accounts if they get locked out or hijacked. They'll be able to verify their identity through video selfies, which can then be matched to their profile pictures. Handy, sure, but can I trust Meta with my biometrics?
The Big Tech giant promises to take a "responsible approach" which includes encrypting video selfies for secure storage, deleting any facial data as soon as it's no longer needed, and not using these details for any other purpose. Yet Meta's track record of protecting, and misusing, its users' information leaves me concerned.
Meta's broken promises
Facebook's parent company has repeatedly breached the privacy, and trust, of its users in the past.
The 2018 Cambridge Analytica scandal was probably the turning point. It shed light on how the personal information of up to 87 million Facebook users was misused to target political advertising, predominantly during Donald Trump's 2016 presidential campaign.
The company implemented significant changes around user data protection after that, but Meta's privacy breaches have continued.
Only this year, Meta admitted to having scraped all Australian Facebook posts since 2007 to train its AI model, without giving users the option to opt out. The company was also hit with a major fine (€91 million) in Europe for storing social media account passwords in unencrypted databases. The year before, in January 2023, Meta received an even bigger fine (€390 million) for serving personalized ads without an opt-out and for unlawful data handling practices.
It's certainly enough to make me skeptical of Meta's good intentions and big promises.
I’m curious what privacy advocates think about Meta’s new plan to use facial recognition to help locked-out users. Getting locked out of Facebook or Instagram is a huge problem. But this is also a reminder we have no federal laws protecting our faces. https://t.co/jvX9NIWYPu (October 23, 2024)
It's also worth noting that Meta itself decided to shut down its previous facial recognition system in 2021 over privacy concerns, promising to delete all the "faceprints" collected. Now, three years later, it's back on the agenda.
"We want to help protect people and their accounts," wrote Meta in its official announcement, "and while the adversarial nature of this space means we won’t always get it right, we believe that facial recognition technology can help us be faster, more accurate, and more effective. We’ll continue to discuss our ongoing investments in this area with regulators, policymakers, and other experts."
We won’t always get it right – that's not very reassuring. So, something is certain to go wrong at some point? If that's the case, no thanks, Meta, I don't trust you with my biometric data. I'd rather lose my Facebook or Instagram account. What's the benefit of solving a problem only to create an even bigger one?
What's certain is that Mark Zuckerberg doesn't need to lose any sleep over EU fines on this front for the time being. Meta's facial recognition tests aren't running globally: the company has excluded the UK and EU markets, where GDPR imposes stringent privacy rules around personal information.
Elsewhere, Meta's testing will eventually show whether the new security feature is the right solution to the growing issue of social media scams, or whether it becomes yet another privacy nightmare. Well, in the name of my privacy, I'm not sure it's worth the trouble of finding out.