The Guardian - UK
Technology
Alex Hern

Facebook moderators call on firm to do more about posts praising Bucha atrocities

Moderators working for third-party contracting firms say they are lacking input from Facebook. Photograph: Jeppe Gustafsson/REX/Shutterstock

Facebook moderators have called on the company to let them take action against users who praise or support the Russian military’s atrocities in Bucha and across Ukraine.

Almost a month after evidence of widespread murder and mass graves was uncovered by Ukrainian forces retaking the Kyiv suburb, the social network still has not flagged the atrocity as an "internally designated" incident, the moderators say.

That restricts how they can treat content related to the killings, they say, and forces them to leave up some content they believe ought to be removed.

“It’s been a month since the massacre and mass graves in Bucha, but this event hasn’t been even designated a ‘violating event’, let alone a hate crime,” said one moderator, who spoke to the Guardian on condition of anonymity. “On that same day there was a shooting in the US, with one fatality and two casualties, and this was declared a violating event within three hours.”

Under Facebook’s public moderation guidelines, users are barred from posting content that makes violent threats through “references to historical or fictional incidents of violence”. But in private documents issued to moderators, who work for third-party contracting firms such as Accenture or Bertelsmann, they are told to wait for regional input from Facebook itself before determining whether a “documented violent incident” counts.

In the absence of that input, content praising the events in Bucha is hard to remove if it is even slightly coy about celebrating the killings. One post, for instance, showed a T-shirt featuring a butcher carving up a pig, with Russian text reading "РеZня в Буче Можем поVторить" – "Slaughter in Bucha, we can repeat".

“My suspicion is that this is just not as close, not as important to American audiences or the American public, so it just doesn’t get the attention,” the Facebook moderator said. “After two weeks I realised that they probably aren’t going to do anything about it.

“I was quite happy with the initial reaction of Facebook to the war,” they added. “I was quite happy with the exceptions that were made that allowed dehumanising speech against soldiers. Those changes brought some balance into the policies: victims and oppressors were not treated the same and were not given the same rights. But now, it has become clear that what counts for Facebook is American public opinion. They only care if they look good in the US media.”

In a statement, a spokesperson for Meta, Facebook’s parent company, said: “It’s wrong to suggest we wouldn’t remove graphic content that celebrates or glorifies the atrocities in Bucha, or any post that mocks the death of individual victims or advocates for violence against Ukrainians in any way.

“We’ve longstanding policies that make clear this content is not allowed on Facebook and Instagram,” they added. “We have been providing additional guidance to content reviewers on these policies and content associated with the war in Ukraine to explain how our policies apply in the current context, and will continue to do so.”

State use of force is treated differently to criminal acts of violence under its policies, Facebook said, based on extensive consultations with outside experts.

The issue was raised by the nonprofit campaign group Foxglove Legal. Its director, Martha Dark, said: “Facebook can act swiftly on mass killings when it wants to. In the US, for example, mass shootings tend to be classed within hours as a ‘violating event’ – this blocks posters from praising the murders, celebrating the perpetrators, or calling for more violence.

“The Bucha massacre happened a month ago. Facebook has a simple policy it could adapt to end this now. It just can’t be bothered. Facebook will claim for the hundredth time that it ‘takes these matters seriously’ and is ‘working closely with moderators’ to fix them. That’s just spin from a tech giant that cares only about the bottom line.”
