Kamikaze Drones in Russia’s War Against Ukraine Point to Future “Killer Robots”
There is evidence that Russia may have used “killer robot” drones in Ukraine. Even if that’s not the case, such weapons will almost certainly be used in the near future.
“Artificial intelligence is the future, not only for Russia, but for all humankind. It comes with colossal opportunities, but also threats that are difficult to predict. Whoever becomes the leader in this sphere will become the ruler of the world.”[1] – Russian President Vladimir Putin, 2017.
Russia’s shocking war against Ukraine – the biggest war in Europe since 1945 – is also among the first major wars to have started after a decade of exponential advances in the field of AI. In fact, a “killer robot” in the form of a “kamikaze drone” may have been used for the first time. Regardless of whether that’s the case, such drones have been used extensively by both Ukraine and Russia, and are almost certain to be used with AI functionality soon. In this article I will explain what we know about the use of AI-enabled drones in this war, and touch on its implications for the future.
Concerns about Lethal Autonomous Weapons Systems (LAWS), aka ‘Killer Robots’ that can participate in warfare without human control, have been expressed for decades. Many weapons today are semi-automated, but these non-LAWS systems are either “human-in-the-loop” – a human has to make the decision to use lethal force – or “human-on-the-loop” – a human supervises the system’s decisions and can override them in real time. In contrast, once deployed, LAWS could conceivably use AI to perceive targets, categorize them as enemies, and take lethal action against them without human involvement.
Unmanned drones and remotely controlled tanks have been developed over recent decades, with drones being used extensively in the Russia-Ukraine War, but these remain fundamentally human-controlled. However, drones that act as autonomous “loitering munitions” – meaning they fly over an area until they detect a target below them and then dive-bomb to hit it – have, in a few cases, possibly been used under AI control. In 2021, a UN report about the end of the Second Libyan Civil War included the following passage about the Turkish-made STM Kargu-2 drone:
Logistics convoys and retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2 (see annex 30) and other loitering munitions. The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true “fire, forget and find” capability.

As covered well by The Verge, articles with headlines such as “The Age of Autonomous Killer Robots May Already Be Here” soon followed, despite the fact that loitering munitions are not at all new; whether the Kargu-2 drones should be seen as LAWS is debatable, given that autonomous loitering munitions such as the IAI Harpy were developed decades ago. That question – and the question of how to regulate LAWS more generally – is once again coming to the forefront as both Russia and Ukraine are using “kamikaze drone” loitering munitions:
The smaller, exploding drones now in use by both Russia and Ukraine differ from older, more traditional drones. Instead of taking off, launching missiles and then returning to a base, they fly above the battlefield and turn into missiles themselves, dive bombing vehicles or groups of soldiers and exploding on impact. Some can be carried in a backpack and launched in the midst of combat, making them especially deadly in urban or guerrilla warfare.
… Most loitering munitions are still human-controlled, but could be upgraded with software that allows them to pick their own targets. That’s generated concern among arms-control experts who worry about allowing computers to make decisions about who to kill, something that could lead the world toward a future where deadly, autonomous weapons are the norm in both big and small conflicts.

In fact, just as with the Kargu-2 drone in Libya, Russia may have already used an autonomous “kamikaze drone” in this war:
Using pictures out of Ukraine showing a crumpled metallic airframe, open-source analysts of the conflict there say they have identified images of a new sort of Russian-made drone, one that the manufacturer says can select and strike targets through inputted coordinates or autonomously. When soldiers give the Kalashnikov ZALA Aero KUB-BLA loitering munition an uploaded image, the system is capable of “real-time recognition and classification of detected objects” using artificial intelligence (AI), according to the Netherlands-based organization Pax for Peace (citing Jane’s International Defence Review). In other words, analysts appear to have spotted a killer robot on the battlefield.

ZALA claims that this drone features “intelligent detection and recognition of objects by class and type in real time.” However, just as in Libya, it remains unclear whether the drone was actually acting autonomously:
The drone appears intact enough that digital forensics might be possible, but the challenges of verifying autonomous weapons use mean we may never know whether it was operating entirely autonomously. Likewise, whether this is Russia’s first use of AI-based autonomous weapons in conflict is also unclear: Some published analysis suggests the remains of a mystery drone found in Syria in 2019 were from a KUB-BLA (though, again, the drone may not have used the autonomous function).
Still, as far as we know, LAWS are not being used in this war, but human-controlled drones very much are. Turkey, Israel, and Iran have all been developing drones in recent years, and Ukraine has been using Turkey’s TB2 drone to great effect. It is only a matter of time until loitering munitions strike targets under AI control – particularly given that this may have already happened. Development of AI-powered weaponry has been a priority for Russia for years, including such capabilities for drones. For example, the company behind the KUB-BLA claims it is working on enabling its drones to act as a sort of airborne minefield:
ZALA say that they are working on turning the Lancet into a drone interceptor. A formation of Lancets would patrol the airspace above friendly troops, each inside its own 20-mile-square box, putting them ready in place to tackle any hostile drones. Attackers would be met with an explosive ramming maneuver – concept video here, with a TB2-type drone conspicuous as the target.
Russian soldiers were testing such systems just months before the invasion:
Just before invading Ukraine, Russian forces were seen testing new “swarm” drones, as well as unmanned autonomous weapons capable of tracking and shooting down enemy aircraft. However, there is no evidence they have been used in Ukraine for that purpose.

Investment in such technologies is sure to increase given the degree to which drones have helped the Ukrainian army:
But drones have also highlighted a key vulnerability in Russia’s invasion, which is now entering its third week. Ukrainian forces have used a remotely operated Turkish-made drone called the TB2 to great effect against Russian forces, shooting guided missiles at Russian missile launchers and vehicles. The paraglider-sized drone, which relies on a small crew on the ground, is slow and cannot defend itself, but it has proven effective against a surprisingly weak Russian air campaign.

In summary, LAWS may or may not have been used in this war, but it is clear that AI-powered “kamikaze drones” will be used in warfare – and not just in isolated instances – sooner or later.