The Alibi of AI: Algorithmic models of automated killing

Anthony Downey (Corresponding / Lead Author)

    Research output: Contribution to journal › Article › peer-review

    Abstract

    The use of Artificial Intelligence (AI) in Automated Target Recognition (ATR) optimises martial prophecies of perpetual threat while simultaneously exonerating the politically inclined prosecution of “forever” wars. The affordances of AI in data-centric warfare are, as a result, not only in line with military demands but also increasingly consistent with government mandates and the zero-sum game of national security. This article proposes that the use of AI in ATR systems such as The Gospel (Habsora) and Lavender, deployed by the Israel Defense Forces (IDF) in Gaza since October 2023 (and in service there since at least 2021), demonstrates these invariably fatal techno- and thanato-political alignments. I will observe how the safeguards nominally associated with the so-called human-in-the-loop (HITL) defence, although regularly offered up to deny that automated prototypes of killing are a prevailing reality in contemporary wars, are effectively nothing more than a convenient fallacy. A stark reality has therefore emerged in modern warfare: through the use of ATR, and Automated Weapons Systems (AWS) more broadly, AI reliably provides an alibi for the prosecution of wholesale methods of killing without, in turn, provoking much by way of substantive political censure or legal accountability.
    Original language: English
    Article number: 9
    Journal: Digital War
    Volume: 6
    DOI:
    Publication status: Published (VoR), 16 Jun 2025

    Keywords

    • artificial intelligence
    • digital analysis system
    • Automated Target Recognition
    • conflict zones
    • Habsora
    • Gaza
