TY - JOUR
T1 - Algorithmic Predictions and Pre-Emptive Violence
T2 - Artificial Intelligence and the Future of Unmanned Aerial Systems
AU - Downey, Anthony
PY - 2024/1/1
Y1 - 2024/1/1
N2 - The military rationale of a pre-emptive strike is predicated upon the calculation and anticipation of threat. The underlying principle of anticipation, or prediction, is foundational to the operative logic of AI. The deployment of predictive, algorithmically driven systems in unmanned aerial systems (UAS) would therefore appear to be all but inevitable. However, the fatal interlocking of martial paradigms of pre-emption and models of predictive analysis needs to be questioned insofar as the irreparable decisiveness of a pre-emptive military strike is often at odds with the probabilistic predictions of AI. The pursuit of a human right to protect communities from aerial threats therefore needs to consider the degree to which algorithmic auguries—often erroneous but nevertheless evident in the prophetic mechanisms that power autonomous aerial apparatuses—essentially authorise and further galvanise the long-standing martial strategy of pre-emption. In the context of unmanned aerial systems, this essay will outline how AI actualises and summons forth “threats” through (i) the propositional logic of algorithms (their inclination to yield actionable directives); (ii) the systematic training of neural networks (through habitually biased methods of data-labelling); and (iii) a systemic reliance on models of statistical analysis in the structural design of machine learning (which can and do produce so-called “hallucinations”). Through defining the deterministic intentionality, systematic biases and systemic dysfunction of algorithms, I will identify how individuals and communities—configured upon and erroneously flagged through the machinations of so-called “black box” instruments—are invariably exposed to the uncertainty (or brute certainty) of imminent death based on algorithmic projections of “threat”.
KW - AI
KW - algorithms
KW - drones
KW - UAVs
KW - LAWs
KW - international humanitarian law
UR - https://www.open-access.bcu.ac.uk/15228/
DO - 10.1057/s42984-023-00068-7
M3 - Article
SN - 2662-1975
VL - 4
JO - Digital War
JF - Digital War
IS - 1-3
ER -