Abstract
The use of Artificial Intelligence (AI) in Automated Target Recognition (ATR) optimises martial prophecies of perpetual threat while simultaneously exonerating the politically inclined prosecution of “forever” wars. The affordances of AI in data-centric warfare are, as a result, not only in line with military demands but also increasingly consistent with government mandates and the zero-sum game of national security. This article proposes that the use of AI in ATR systems such as The Gospel (Habsora) and Lavender, deployed by the Israel Defense Forces (IDF) in Gaza since October 2023 (and in service there since at least 2021), demonstrates these invariably fatal techno- and thanato-political alignments. Although the so-called human-in-the-loop (HITL) defence is regularly offered up to deny that automated prototypes of killing are a prevailing reality in contemporary wars, I will observe how the safeguards nominally associated with it are effectively nothing more than a convenient fallacy. A stark reality has therefore emerged in modern warfare: through the use of ATR, and Automated Weapons Systems (AWS) more broadly, AI is reliably providing an alibi for the prosecution of wholesale methods of killing without, in turn, provoking much by way of substantive political censure or legal accountability.
| Original language | English |
|---|---|
| Article number | 9 |
| Journal | Digital War |
| Volume | 6 |
| Publication status | Published (VoR) - 16 Jun 2025 |
Keywords
- artificial intelligence
- digital analysis system
- Automated Target Recognition
- conflict zones
- Habsora
- Gaza