The Conversation
Ezenwa E. Olumba, Doctoral Research Fellow, Conflict, Violence, & Terrorism Research Centre, Royal Holloway University of London

‘Killer robots’ are becoming a real threat in Africa

PHOTOCREO Michal Bednarek/Shutterstock

The use of drones in the Sahel, a region of Africa that has been plagued by violence driven by jihadist insurgency for much of the past decade, has become a real problem. In April, for example, Al Qaeda’s affiliate in the Sahel, Jama’at Nusrat al Islam wa al Muslimeen, reportedly carried out an attack against a rival militia using drones modified with grenades and mortars in central Mali.

But even more concerning is the fact that their AI-powered variants, known as lethal autonomous weapons systems (Laws), have been deployed in Africa in recent years. Laws are a special class of weapons systems that can conduct surveillance, select targets and carry out attacks autonomously.

There are worries that these weapons could fall into the hands of terrorist groups if their deployment in Africa is scaled up. Several advanced drones have already been lost in counter-terrorism operations both on the continent and elsewhere, due to technical error or insurgent attack.

The US reportedly lost three MQ-9 Reaper drones to the Houthis in Yemen in May 2024, and MQ-1 Predator drones in Libya and Niger in November 2019 and February 2023, respectively.

There are also a number of ethical concerns associated with autonomous weapons. They give machines the freedom and capability to take human life, and they reduce targets to mere data points. Nevertheless, the use of drones that are equipped with AI looks to be on the rise.

Ukrainian forces are reportedly using drones that have been integrated with a basic form of AI to help them navigate and avoid being jammed when carrying out attacks inside Russia. Israel is using an AI recommendation system in Gaza called “Lavender” that is designed to use algorithms to identify Hamas operatives as targets. And the US has also deployed systems that use AI to find targets for airstrikes in Syria and Yemen.


These are not examples of autonomous weapons. And, at the time of writing, there have only been two cases where autonomous weapons may have been deployed in Africa – of which just one occurred in an active combat situation. But the implications of deploying these weapons systems in Africa are still stark.

According to a report released by the UN in March 2021, forces backed by the Libyan government deployed a drone made in Turkey called Kargu-2 during a battle with enemy militia fighters the previous year. The report says the drone “hunted down and remotely engaged” the fighters, and suggests it may have selected the targets autonomously. The report does not say whether there were any casualties or injuries.

This news is concerning. The Kargu attack drone generally detonates an explosive charge in close proximity to the target to avoid collateral damage. However, doubts have been raised over the drone’s ability to distinguish legitimate military targets from civilians.

More recently, in May 2024, US Africa Command tested an autonomous drone in Libreville, Gabon, in an exercise with west African nations to support their efforts to combat piracy. The drone, which is known as Triton, uses high-resolution scanners and sensors to gather intelligence autonomously and carry out countermeasures.

The Triton drone has aerial and maritime variants, and is capable of loitering below water for over a week or carrying out surface operations over a period of two weeks.

Regulating autonomous weapons

The prospect of autonomous weapons falling into the hands of terrorist groups and militias in the Sahel is alarming. They would introduce a new deadly dimension to an already brutal conflict.

There is a need for legal frameworks that can regulate the growth of these systems and sanction those who contravene them. In December 2023, the UN unveiled plans to ban, by 2026, weapons that operate without human oversight.

Delaying the ban until 2026 will allow the development and use of these weapons to continue. But, in any case, the focus now looks to be shifting towards developing voluntary guidelines for best practices rather than imposing a total ban on these systems.

It is up to African countries to develop a common position on Laws and work with global bodies like the UN to ban their deployment on the continent. Those who manufacture them must find alternative venues to use these lethal machines.


Samuel Oyewole is also affiliated with the Federal University Oye-Ekiti, Nigeria.

Tony Onazi Oche is affiliated with the University of Pretoria.

Ezenwa E. Olumba does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

This article was originally published on The Conversation. Read the original article.
