The Street
Luc Olinga

Facebook Wants to Change But Is Stained by Hate Speech

The company promotes itself as a way to bring people together and make the world a better place. 

Millions of people around the world use its platforms, Facebook, Instagram and WhatsApp, to communicate, learn, discuss, create and handle many daily tasks. 

But Facebook, now known as Meta Platforms (FB), continues to struggle to enforce a clear content moderation policy, particularly around calls for violence, despite its repeated promises to do better.

A week after being heavily criticized for allowing Ukrainians and nationals of certain European countries to share messages calling for the assassination of Russian President Vladimir Putin, the social media giant is again being singled out.

"Facebook approves adverts containing hate speech inciting violence and genocide against the Rohingya," the rights group Global Witness accused Facebook of this week.

The organization says it tested the platform by submitting eight paid ads for approval to run on Facebook. 

Each of these ads contained hate speech directed at the Rohingya, a predominantly Muslim ethnic minority in Myanmar.

The Firm Approved Hate Ads 

To everyone's surprise, all eight hate ads were approved by Facebook.

Global Witness, however, removed the ads before they were published. 

But the group managed to show that Facebook still has trouble detecting hateful content and calls for violence despite its promises.

"The hate speech examples we used are highly offensive and we are therefore deliberately not repeating the exact phrases used here. However, all of the ads fall within Facebook’s definition of hate speech," the rights group said.

It said that the phrases it used included violent speech calling for the killing of the Rohingya and dehumanizing language comparing them to animals.

"Calls for exclusion or segregation including a claim that the Rohingya are not Myanmar citizens and that the country needs to protect itself against a 'Muslim invasion'," Global Witness explained.

Facebook defines hate speech as: 

"A direct attack against people — rather than concepts or institutions— on the basis of what we call protected characteristics: race, ethnicity, […] religious affiliation […]. We define attacks as violent or dehumanizing speech, harmful stereotypes, statements of inferiority, expressions of contempt, disgust or dismissal, cursing and calls for exclusion or segregation."

U.S. Secretary of State Antony Blinken announced this week that the U.S. views the violence against the Rohingya as genocide. 

"Beyond the Holocaust, the United States has concluded that genocide was committed seven times. Today marks the eighth, as I have determined that members of the Burmese military committed genocide and crimes against humanity against Rohingya," Blinken said in a distributed statement.

The army conducted what it called a clearance campaign in western Myanmar’s Rakhine state in 2017 after an attack by a Rohingya insurgent group, according to the Associated Press. 

More than 700,000 Rohingya fled into neighboring Bangladesh and security forces were accused of mass rapes, killings and torching thousands of homes.

Same Old Thing

Facebook doesn't deny it approved the eight ads. Instead, the company preferred to outline its efforts to curb hate speech.

“We’ve built a dedicated team of Burmese speakers, banned the Tatmadaw, disrupted networks manipulating public debate and taken action on harmful misinformation to help keep people safe," a Meta spokesperson told TheStreet in an email statement.

The spokesperson added that Facebook has upped its Burmese-language capabilities.

 "We've also invested in Burmese-language technology to reduce the prevalence of violating content," they said. 

"This work is guided by feedback from experts, civil society organizations and independent reports, including the UN Fact-Finding Mission on Myanmar’s findings and the independent Human Rights Impact Assessment we commissioned and released in 2018."

Global Witness said these changes haven't made a difference.

"Facebook’s ability to detect Burmese language hate speech remains abysmally poor," it said

This is not the first time that Facebook has been questioned about the Rohingya.

In November 2018, Facebook admitted that in Myanmar it had failed to prevent its platform from being used to "foment division and incite offline violence" in the country.

"The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence," Alex Warofka, product policy manager, wrote in a blog post on Nov. 5, 2018.

"We agree that we can and should do more," the executive added.
