Israeli Military's AI 'Gospel' Target Generation System in Gaza Operations
Severity
Critical
Israeli military reportedly used AI system 'The Gospel' to rapidly generate bombing targets in Gaza, raising concerns about reduced human oversight and potential acceleration of civilian casualties in conflict zones.
Category
Safety Failure
Industry
Government
Status
Under Investigation
Date Occurred
Oct 7, 2023
Date Reported
Nov 30, 2023
Jurisdiction
International
AI Provider
Other/Unknown
Model
The Gospel (Habsora)
Application Type
agent
Harm Type
physical
Human Review in Place
Unknown
Litigation Filed
No
Tags
military_ai, targeting_system, civilian_casualties, israel, gaza, warfare_automation, international_law
Full Description
According to reporting by +972 Magazine and Local Call, the Israeli military deployed an artificial intelligence system called 'The Gospel' (Habsora in Hebrew) during the 2023-2024 Gaza conflict to dramatically accelerate the generation of bombing targets. The system allegedly transformed targeting from a manual process, in which identifying targets could take weeks or months, into an automated one capable of generating hundreds of targets daily. Sources described the system as a 'mass assassination factory' that prioritized speed and volume over traditional, careful target verification.
The Gospel system reportedly analyzed vast amounts of surveillance data, communications intercepts, and other intelligence inputs to identify potential targets linked to Hamas and other militant groups. According to intelligence sources cited in the reporting, the AI could generate target lists at unprecedented speed, identifying not just military installations but also residential buildings where suspected militants lived or operated. The system allegedly marked entire family homes for potential strikes based on intelligence linking residents to militant activities.
Former intelligence officers told +972 Magazine that the traditional targeting process involved extensive human analysis and legal review, often taking considerable time to verify targets and assess civilian risk. The Gospel system allegedly compressed this timeline dramatically, with targets moving from identification to strike authorization within hours rather than weeks. Sources indicated that the pressure for rapid target generation led to reduced time for human verification and civilian harm assessment.
The reporting suggests that the AI system contributed to the unprecedented scale and pace of bombing during the conflict, with thousands of targets struck in the initial weeks. Critics and human rights organizations have raised concerns that the accelerated targeting process may have contributed to high civilian casualty rates. The Israeli military has not publicly confirmed operational details about The Gospel system, though it has acknowledged using AI tools in its operations. International legal experts have questioned whether such AI-accelerated targeting systems comply with international humanitarian law requirements for distinction, proportionality, and precautionary measures in attack planning.
Root Cause
The AI system reportedly prioritized speed of target generation over accuracy and civilian harm mitigation, reducing the time available for human verification of targets and assessment of nearby civilians.
Mitigation Analysis
Enhanced human oversight protocols, mandatory verification periods before strike authorization, civilian harm assessment algorithms, and multi-layer review processes could have reduced risks. Real-time monitoring of AI recommendations against international humanitarian law standards and civilian proximity databases would have been critical. Implementation of 'human-in-the-loop' requirements for all AI-generated targets could have prevented automated escalation.
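The 'human-in-the-loop' requirement described above can be sketched as a gating check: an AI-generated recommendation cannot proceed until a mandatory review window has elapsed, a human reviewer has explicitly signed off, and an assessed civilian-risk score falls below a policy threshold. This is a minimal illustrative sketch only; all names, fields, and thresholds are assumptions for the example, not details of any real system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Assumed policy parameter: minimum delay between AI generation and
# any possible authorization, to guarantee time for human review.
MANDATORY_REVIEW_WINDOW = timedelta(hours=6)

@dataclass
class Recommendation:
    """A hypothetical AI-generated recommendation awaiting review."""
    target_id: str
    generated_at: datetime
    civilian_risk_score: float  # 0.0 (low) to 1.0 (high), assumed scale
    human_signoff: bool = False

def approve(rec: Recommendation) -> None:
    """Record an explicit human sign-off on the recommendation."""
    rec.human_signoff = True

def may_authorize(rec: Recommendation, now: datetime,
                  max_civilian_risk: float = 0.2) -> bool:
    """Gate authorization on three independent conditions:
    explicit human sign-off, an elapsed mandatory review window,
    and civilian risk below the policy threshold."""
    if not rec.human_signoff:
        return False
    if now - rec.generated_at < MANDATORY_REVIEW_WINDOW:
        return False
    if rec.civilian_risk_score > max_civilian_risk:
        return False
    return True
```

The design point is that speed pressure cannot bypass the gate: even an immediate sign-off leaves authorization blocked until the review window expires, and a high civilian-risk score blocks it regardless of sign-off.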
Lessons Learned
The incident highlights critical risks when AI systems are deployed to accelerate life-or-death military decisions without adequate safeguards. The pressure for operational speed can override traditional verification processes that protect civilians, demonstrating the need for robust human oversight frameworks even in combat scenarios.
Sources
A mass assassination factory: Inside Israel's calculated bombing of Gaza
+972 Magazine · Nov 30, 2023
'The Gospel': how Israel uses AI to select bombing targets in Gaza
The Guardian · Dec 1, 2023