‘Machine Speed Warfare’: AI, Autonomous Weapons and the Future of Global Diplomacy

John Lillywhite
Published in Hard Disc · 3 min read · Jun 13, 2020


Flickr Commons (Hindrik Sijens)

Near-horizon advances in AI are changing the speed, chains of command, security relationships and conventions of accountability that govern global security in the twenty-first century.

A U.S. Air Force general summarized a sense of the change as follows: “The B-52 lived and died on the quality of its sheet metal. Today our aircraft will live or die on the quality of our software.” (Ekelhof, 2018, p. 81)

Machine Speed Warfare

The changing nature of warfare could see a future in which placing human soldiers in harm’s way becomes untenable in instances where autonomous unmanned underwater, surface and air vehicles can perform the same tasks (Leys, 2018, p. 63).

Two definitional policy challenges facing the deployment of autonomous weapons involve the speed at which these systems operate, and the degree of oversight and control which they require (Ibid, p. 50).

Human Rights Watch uses the control paradigm to distinguish three classes of weapons systems: “Human-in-the-Loop” (humans select targets), “Human-on-the-Loop” (robots select and engage targets under human supervision) and “Human-out-of-the-Loop” (fully autonomous weapons systems that involve no human oversight) (Human Rights Watch, Losing Humanity: The Case Against Killer Robots, 19 Nov 2012, cited in Leys, 2018, p. 51).

In Japan, there are plans to develop robotic aircraft as ‘helpers’ for manned fighters, with a pilot issuing commands (Ibid, p. 52). A former U.S. Department of Defense chief has suggested human soldiers could serve as “quarterbacks” for teams of Autonomous Weapons Systems, referencing a need for “human-machine-collaboration” (Ibid, p. 52).

Both discussions here satisfy the Human Rights Watch definition, with the legitimate and ethical selection of targets (Ekelhof, 2018) as the center of gravity around which debates on human control, hybrid human-machine control and fully autonomous decisions take place. A 2018 Center for Naval Analyses (CNA) report concluded that “[f]ailure to recognize and mitigate factors besides the platform in the targeting process resulted in an increased risk to civilians from the use of drones, despite some desirable characteristics of these systems” (Lewis, ‘Drone Strikes in Pakistan,’ pp. 1–2, cited in Ekelhof, 2018, p. 85).

In reality, the increasing speed of warfare and the arms race for AI supremacy and first-mover advantage is rendering aspects of this discussion obsolete (Leys, 2018, p. 51).

The Phalanx close-in weapons system (CIWS) mounted on U.S. Navy warships can already be set to acquire and engage incoming missiles automatically, because the “time waiting for a crew member to approve a defensive action against an incoming threat could prove fatal to the ship” (Ibid). During the Iraq War, operators of the Patriot missile system were “trained to trust the system software” in situations where human reaction times were inadequate (Defense Science Board, Report of the Defense Science Board Task Force on Patriot System Performance, 2005, cited in Ibid, p. 53). Notably, both of these case studies involve defensive rather than offensive systems.

This may change as “warfare accelerates to machine speed,” creating a battlefield dynamic in which “human cognition may prove unable to keep pace with the new operational tempo of intelligentized warfare” (Kania, 2017). Here there are clear implications for the balance of military power between states, particularly the United States and China.

An essay submitted in partial fulfillment of the requirements for the degree of MA in Global Diplomacy at the University of London International Programmes. Read the full article on Academia.
