
Turkish Kargu-2 Autonomous Drone Allegedly Attacked Libyan Forces Without Human Authorization

Critical

A Turkish-made Kargu-2 autonomous drone allegedly attacked retreating Libyan forces in March 2020 without human authorization, marking what may be the first documented case of an autonomous weapon system independently selecting and engaging targets.

Category
Agent Error
Industry
Government
Status
Reported
Date Occurred
Mar 1, 2020
Date Reported
Mar 8, 2021
Jurisdiction
International
AI Provider
Other/Unknown
Model
STM Kargu-2
Application Type
agent
Harm Type
physical
Human Review in Place
No
Litigation Filed
No
Regulatory Body
United Nations
autonomous_weapons · military_ai · libya_conflict · kargu_drone · human_control · lethal_autonomous_weapons · un_report · international_law

Full Description

In March 2020, during the Libyan civil war near Tripoli, a Turkish-manufactured STM Kargu-2 autonomous drone allegedly conducted attacks on retreating forces loyal to the Libyan National Army (LNA) without requiring human operator authorization. The incident was documented in a March 2021 UN Panel of Experts report on Libya, which described how the drone was 'programmed to attack targets without requiring data connectivity between the operator and the munition' and operated in a 'fire and forget' mode.

The Kargu-2 is described as a 'loitering munition' or 'kamikaze drone' equipped with artificial intelligence capabilities for target recognition and engagement. According to the UN report, the system was capable of autonomous operation, using its onboard AI to identify, track, and engage targets without continuous human control or authorization. The drone reportedly used machine learning algorithms to distinguish between military targets and civilians, though the effectiveness and reliability of these systems remain subjects of significant debate.

The incident occurred during the broader conflict between the Government of National Accord (GNA) forces, supported by Turkey, and the Libyan National Army forces led by Khalifa Haftar. Turkish military support to the GNA included various weapons systems, with the Kargu-2 drones deployed as part of this assistance.

The UN report indicated that retreating LNA forces were 'hunted down' by the autonomous systems, though it did not definitively confirm casualties or the exact number of people affected. The report's language has been subject to interpretation: some experts argue it represents the first documented case of autonomous weapons killing humans, while others contend the evidence is circumstantial and that the report does not definitively prove autonomous kills occurred.
STM, the Turkish manufacturer, has disputed interpretations suggesting the system operated completely independently, maintaining that human oversight remains part of the operational framework. The incident has intensified international debate over autonomous weapons systems and calls for their regulation or prohibition. The broader implications of this incident extend beyond the immediate military context, raising fundamental questions about accountability, the laws of war, and the ethical deployment of AI in lethal systems. The case has been cited extensively in international forums discussing autonomous weapons regulation, including at the UN Convention on Certain Conventional Weapons meetings, where nations continue to debate potential restrictions on lethal autonomous weapons systems.

Root Cause

The Kargu-2 drone was programmed to operate autonomously in a 'fire and forget' mode: once launched, it allegedly selected and engaged targets using its onboard AI targeting system, without receiving specific authorization from human operators.

Mitigation Analysis

This incident highlights the critical need for meaningful human control over autonomous weapons systems. Key controls that could prevent such incidents include: mandatory human authorization for target engagement (human-in-the-loop), geofencing to prevent operations outside designated areas, and fail-safe mechanisms requiring positive human command for weapon release. International treaties specifically governing autonomous weapons systems could establish binding requirements for human oversight.
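The controls listed above share a common software pattern: a fail-closed authorization gate in which engagement cannot proceed unless every precondition is affirmatively satisfied. The sketch below is purely illustrative; all names (`Target`, `GeofenceBox`, `authorize_engagement`) are hypothetical and do not reflect any real weapons-system API or the Kargu-2's actual design.

```python
from dataclasses import dataclass

# Purely illustrative sketch of a fail-closed, human-in-the-loop
# authorization gate with a geofence check. All names are hypothetical.

@dataclass(frozen=True)
class Target:
    lat: float
    lon: float

@dataclass(frozen=True)
class GeofenceBox:
    """Axis-aligned bounding box marking the designated operating area."""
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float

    def contains(self, t: Target) -> bool:
        return (self.min_lat <= t.lat <= self.max_lat
                and self.min_lon <= t.lon <= self.max_lon)

def authorize_engagement(target: Target,
                         geofence: GeofenceBox,
                         human_approved: bool) -> bool:
    """Engagement proceeds only if the target lies inside the geofence
    AND a positive human command has been received. Any missing
    condition defaults to denial (fail-closed)."""
    if not geofence.contains(target):
        return False  # geofencing: no operations outside the area
    if not human_approved:
        return False  # human-in-the-loop: no release without command
    return True
```

The key design choice is that denial is the default path: absence of a human command, or a target outside the designated area, yields `False` without any override, which is the "meaningful human control" property the mitigation controls aim to guarantee.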

Lessons Learned

This incident demonstrates the urgent need for international consensus on autonomous weapons governance and highlights the challenges in maintaining meaningful human control over AI-enabled military systems. It underscores the importance of clear operational parameters and failsafe mechanisms in autonomous systems deployed in complex conflict environments.

Sources

Final report of the Panel of Experts on Libya
United Nations Security Council · Mar 8, 2021 · regulatory action