Instagram is taking steps to protect teens from falling victim to financial sextortion, a disturbing trend where scammers coerce young individuals into sending nude photos and then extort them for money. The social media platform announced that it is testing new features designed to address this issue.
Financial sextortion involves scammers threatening to post explicit images online unless victims pay money or send gift cards. This form of exploitation particularly targets children and teenagers, often leaving victims feeling isolated and ashamed.
As part of its efforts to combat this crime, Instagram will be rolling out various new tools in the coming weeks. These features include blurring nude images sent in direct messages and notifying users when they have interacted with someone involved in financial sextortion. The tools will eventually be available to all users worldwide.
Antigone Davis, Meta's director of global safety, emphasized the importance of raising awareness about financial sextortion and providing users with the necessary tools to protect themselves. The company aims to stay ahead of this growing problem and continuously evolve its safety measures.
While the non-consensual sharing of nude images has been a longstanding issue, the FBI has recently observed an increase in financial sextortion cases initiated by strangers, often scammers based overseas. In some tragic instances, sextortion has even led to suicide.
Meta's new tools complement existing safety features for teens, such as strict messaging settings, safety notices, and the option to report threatening or inappropriate messages. Additionally, Meta has collaborated with organizations like the National Center for Missing & Exploited Children to develop initiatives like Take It Down, which helps young people get explicit images of themselves removed from the internet.
Instagram's upcoming nudity protection feature will automatically blur explicit images sent in direct messages and alert recipients about the content. The platform will also prompt users to block senders of such images and discourage the sharing of nude photos. These measures will be enabled for users under 18, with adults encouraged to turn the feature on as well.
Furthermore, Meta is leveraging machine learning technology to identify accounts engaged in sextortion scams and make them more difficult to interact with. By monitoring suspicious behavior, the company aims to protect users from falling victim to such exploitation.
Through partnerships like the Tech Coalition's Lantern program, Meta is enhancing its ability to share information about child safety violations across tech platforms. By integrating sextortion prevention tools and promoting awareness among parents and young users, Instagram is taking proactive steps to combat this harmful practice.
Davis also emphasized the importance of parental involvement in educating children about online safety and encouraging open communication about any concerning incidents. By empowering users with knowledge and tools, Instagram aims to create a safer online environment for all.