An oversight board has criticized Facebook owner Meta's policies surrounding manipulated media, calling them 'incoherent' and insufficient to address the growing issue of online disinformation targeting elections worldwide. The board's review of an altered video of President Joe Biden that spread on Facebook exposed gaps in the company's policy. As a result, the board recommended that Meta expand its policy to include all types of manipulated media, not just videos generated with artificial intelligence.
One specific area that the board highlighted was the need to address fake audio recordings, which have already convincingly impersonated political candidates in the United States and other countries. Additionally, the board suggested that Meta clarify the harms it aims to prevent and labeled images, videos, and audio clips as manipulated instead of removing them outright.
The board's feedback reflects the ongoing scrutiny faced by tech companies regarding their handling of election-related falsehoods as numerous countries gear up for elections this year. With the rise of generative artificial intelligence deepfakes and lower-quality 'cheap fakes' on social media, platforms are racing to respond to false posts while preserving users' right to free speech.
Oversight Board co-chair Michael McConnell described Meta's policy as making little sense in its current form. He emphasized the need for the company to close gaps in its policy while ensuring that political speech remains protected. Meta has stated that it is reviewing the board's guidance and will publicly respond to the recommendations within 60 days.
Corey Chambliss, a spokesperson for Meta, clarified that while audio deepfakes aren't explicitly mentioned in the company's manipulated media policy, they are eligible to be fact-checked and can be labeled or down-ranked if fact-checkers determine they are false or altered. Chambliss also noted that the company takes action against any content that violates Facebook's Community Standards.
With Facebook remaining the most popular social media site for Americans to consume news, the repercussions of manipulated media are significant. Yet other platforms, including Meta's Instagram, WhatsApp, and Threads, as well as X, YouTube, and TikTok, also serve as potential hubs for deceptive media that can mislead voters.
Meta established the oversight board in 2020 to weigh in on content moderation across its platforms. The board's recent recommendations followed its review of an altered video clip involving President Biden and his adult granddaughter. The board upheld Meta's decision to leave the clip on Facebook because, although misleading, it did not technically violate the company's existing manipulated media policy.
Nevertheless, the board advised Meta to update its policy and label similar manipulated videos in the future. It argued that to protect users' right to freedom of expression, Meta should label such content as manipulated rather than removing it if it does not violate any other policies. The board also emphasized that some forms of manipulated media are intended for humor, parody, or satire and should be protected.
Meta welcomed the oversight board's ruling on the Biden post and said it would update the post after reviewing the board's recommendations. While Meta is required to follow the oversight board's rulings on specific content decisions, it is not obliged to adhere to the board's broader recommendations. Nonetheless, the board has succeeded in prompting Meta to make some changes over the years, including providing more specific explanations to users who violate its policies.
Jen Golbeck, a professor at the University of Maryland's College of Information Studies, believes that Meta, as a major player in the social media landscape, has the potential to lead in labeling manipulated content. However, she emphasizes that more important than changing the policy is how Meta implements and enforces it, especially in the face of political pressure from those who want to spread misinformation. Golbeck argues that failing to enforce such changes would further erode trust in the platform.
As the issue of manipulated media continues to pose significant challenges, tech companies like Meta must strike a balance between combating disinformation and protecting users' freedom of expression. The recommendations from the oversight board are an important step toward refining Meta's approach and addressing the growing concerns surrounding online disinformation during election seasons worldwide.
___ Associated Press technology writer Barbara Ortutay in San Francisco contributed to this report. ___ The Associated Press receives support from several private foundations to enhance its explanatory coverage of elections and democracy. See more about AP’s democracy initiative here. The AP is solely responsible for all content.