Pilotless bombers play a key role in modern warfare, but could AI increase the number of civilian deaths?
The RAF is pioneering new mini-helicopter drones that can fire missiles at targets four miles away.
Air force chiefs ordered the new lightweight drones after they played a successful role in Ukraine’s resistance against Moscow, said the Daily Mail. UK trials showed the drones can drop grenades on tanks while hovering above and can also launch 35lb missiles.
Drones have been part of warfare since the 19th century, said The Bureau of Investigative Journalism, when Austrians used “pilotless hot-air balloons to bomb Venice”.
But there are growing concerns about the ethical implications of their use.
Have drones helped in Ukraine?
With the West “dithering about long-range munitions”, drones have offered an alternative for Ukraine in its war with Russia, said The Economist.
On 28 February, the skies above some western Russian regions “buzzed with the sound of hostile drones”, it reported, one of which reached less than 100km from Moscow. Meanwhile, the Ukrainian army has established 60 new attack-drone squadrons.
However, an expert told the newspaper, Russia’s ground-based air defences mean that Ukrainian reconnaissance drones struggle to get more than 15km behind enemy lines.
What does the law say?
There is “no provision in international law specifically referring to the use of drones”, said Global Voices. The “main legal reference” is the Geneva Convention that established norms for international humanitarian law in times of war.
However, speaking to the site, Khalil Dewan, a lawyer and investigator specialising in cases involving civilian victims of military drone attacks, said this still left huge problems.
He wondered “how legal and ethical is it to kill suspected combatants instead of capturing them and providing a fair trial”, particularly “in countries that are not officially in a state of war”.
What about civilian deaths?
In 2021, the US admitted that a drone strike in Kabul days before its military withdrawal killed 10 innocent civilians, reported the BBC.
That “botched strike” could be “indicative of a larger trend”, wrote Sarah Kreps, Paul Lushenko and Shyam Raman for Brookings. They pointed to research from The Bureau of Investigative Journalism showing that US strikes in Afghanistan, Pakistan, Somalia and Yemen between 2002 and 2020 killed as many as 1,750 civilians.
This form of warfare can also take a toll on those who wage it. Although drone strikes are “often viewed as an antiseptic, dehumanised form of killing”, wrote Murtaza Hussain for The Intercept, “operators describe experiencing physiological stress during their missions”.
What about AI?
The manufacture of artificially intelligent military drones that use facial recognition technology to detect targets “has led many people to raise concerns about the ethics involved”, said Newsweek.
“It’s the pre-emptive nature of these operations that is the really problematic part here,” Mike Ryder, a researcher at Lancaster University, told the news site.
Ryder added that “strikes work on the assumption that the person being monitored has the potential to carry out a terror attack in the future”, even though “they may never have engaged directly in terror activities up to that point”.
Edward Santow, an industry professor of responsible technology at the University of Technology Sydney, added that facial recognition technology (FRT) is prone to error, particularly for people of colour.
There is a “serious risk” that using FRT “to target lethal autonomous weapons, results in errors causing unlawful death and injury”, he said.
In Libya and Gaza, wrote Henry Bodkin and Aisling O’Leary for The Telegraph, drones are suspected to have killed people “autonomously, independent of human control”.
“Across the world a moral line is being crossed”, they added, as we enter “the much prophesied age of the killer robot”.
“The question is”, they asked, “do we know what we unleashed” and “by the time we’ve worked it out, will it be too late?”