Meta has recently removed tens of thousands of racy advertisements for artificial intelligence-generated 'girlfriends' from its platforms. An investigation by Wired identified more than 29,000 ads for 'AI girlfriend' apps on Facebook, Instagram, and Messenger. The ads featured chatbots that sent sexually explicit messages and AI-generated images of scantily clad women in suggestive poses, and more than half were labeled 'NSFW'.
Meta's policies explicitly prohibit adult content in ads, including nudity, explicit or suggestive depictions of people, and sexually provocative activity. Even so, around 2,700 of the ads were still active when Wired contacted Meta about the issue.
A Meta spokesperson stated that the company is committed to swiftly removing violating ads and continuously improving its systems to detect and address content that goes against its policies. The spokesperson emphasized that Meta is vigilant in monitoring for new tactics used by groups or individuals to circumvent detection and evade enforcement.
By enforcing its adult-content advertising policy and promptly removing these explicit ads, Meta aims to uphold its community standards and keep users from encountering inappropriate material on its platforms.