The Conversation
Franz Krüger, Associate researcher, University of the Witwatersrand

South African rapper AKA's murder video went viral - it shouldn't have

Rapper Kiernan 'AKA' Forbes during the Metro FM awards nominations in Johannesburg in January. Veli Nhlapo © Sowetan.

In the days after the killing of rapper Kiernan Jarryd Forbes, known as AKA, and his friend Tebello “Tibz” Motsoane, the murders kept playing out on social media. Again and again, leaked CCTV footage of the two being gunned down was viewed and shared – some 490,000 times in the version shared by just one Twitter account.

The explosive viral spread of the grainy but dramatic footage shows the limits of mainstream media ethics. Beyond the reach of press and broadcast codes and complaints mechanisms, social media platforms are driven by algorithms that measure and reward success in millions of clicks. This often means boosting the worst and most sensational material. It’s urgently necessary to find ways of ensuring the platforms show greater responsibility.

Mainstream media ethics, as captured in the South African Press Code and the Broadcasting Code, make it clear that footage of this kind can only be used if there is good reason. Violence should not be glorified, the press code says, and the depiction of violent crime should be avoided “unless the public interest dictates otherwise”.


Read more: The media often conflates malicious criticism with genuine critique: why it shouldn't


Public curiosity about the assassinations is undoubtedly high, but it’s not the same as what the codes understand as public interest. That is defined as

information of legitimate interest or importance to citizens.

The concern about material of this kind is less about the possibility of hampering police work, as some have argued, than about the potential harm: the pain caused to a grieving family and the offence caused to audiences by gratuitous and shocking violence. Where the value of material lies more in offering grisly entertainment than in its news value, publication becomes questionable.

The duty to shock

Editors do sometimes decide that disturbing, graphic images can be used. Examples include photographs of assassinated South African Communist Party leader Chris Hani, of a Mozambican man set alight in xenophobic violence in South Africa in 2008, and the footage of the police killing of George Floyd in the US.

Journalists argue there is sometimes a positive obligation to show unpleasant realities. Kelly McBride, vice-president of the Poynter Institute, a US nonprofit media organisation, says some images may have the “power to galvanise the public”, adding:

it’s irresponsible for a news organisation to shield its audience from hard truths.

However, much depends on context and the handling of the images. Responsible editors will include audience advisories so that viewers can choose to avoid the image. Some effort to provide names and other details can help to humanise the victims, evoking empathy rather than simple ghoulish fascination.

In the case of the AKA and Tibz murders, most South African mainstream publishers seem to have taken the view that the circumstances did not justify publication of the actual shooting. Most simply reported the existence of the footage.

Tebello ‘Tibz’ Motsoane. Darryl Hammond © Sowetan.

But no such restraint was shown on social media. Fascinated by the sensational murder of a music star, users shared the footage in their tens and hundreds of thousands.

Clearly, professional codes and mechanisms are powerless against a truly viral phenomenon of this sort. The Press Council and the Broadcasting Complaints Commission of South Africa handle complaints against mainstream media, but they have no authority over the wider public on social media.


Read more: Journalism makes blunders but still feeds democracy: an insider's view


There is increasing concern about the spread of harmful content on social media platforms – not just gratuitous violence, but also hate speech, misinformation and much else. Several governments are developing legislation to fight toxic content. But the UN High Commissioner for Human Rights, among others, has voiced concern that the laws may be a pretext to act against dissent.

Peggy Hicks, director of thematic engagement at UN Human Rights, says:

Some governments see this legislation as a way to limit speech they dislike and even silence civil society or other critics.

The social media giants themselves – such as Twitter, Google and Facebook – have emphasised that they are not publishers but simply platforms for sharing, and therefore don’t have to take responsibility for content. However, they increasingly accept the need for content moderation.

Machines are necessary to cope with the sheer volume of material. But human content moderators have a critical role, as artificial intelligence is not always smart enough to deal with complex contexts and linguistic nuance, as emerged in leaks from inside Facebook. Moderators in their thousands have the unenviable task of sifting through a vast and unending flood of truly terrible material, from decapitation videos to child pornography.

The United Nations Educational, Scientific and Cultural Organisation (Unesco) is looking into the regulation of social media platforms. A draft set of guidelines emphasises the need for platforms to have policies based on human rights and to be accountable.

Fundamentally, the platforms’ algorithms operate on a logic of rewarding traffic, which needs to be tempered with considerations of the common good. According to Unesco:

The algorithms integral to most social media platforms’ business models often prioritise engagement over safety and human rights.

Gossip sites in sensationalist feeding frenzy

In the example of the AKA video, sensationalist gossip sites also traded on and drove much of the traffic. A Google search for mentions of the video is dominated by obscure sites written in poor language, for which the video is simply clickbait. Their business model relies on bulk traffic to earn advertising income, and that in turn relies on the platform giants’ algorithms.

That, perhaps, is the most important lesson of the uncontrollable spread of the AKA video: ways need to be found to write elements of information ethics into the platforms’ algorithms. It is deeply damaging to social cohesion to have machine logic systematically boosting the worst and most disturbing material.

Franz Krüger is the deputy ombud of the South African Press Council. He writes in his personal capacity.

This article was originally published on The Conversation. Read the original article.
