The Peloponnesian War and Killer Robots: Norms of Protection in Security Policy

How do state leaders define ‘security threats’ and appropriate responses to them? Is this a straightforward weighing of other countries’ threats to ‘our safety’? Or is it the result of social norms that condition leaders to think about the world and ‘Other People’ in certain ways? These questions – central to the study of international relations – subtly undergird the global debate on autonomous weapons systems (AWS). AWS, or ‘killer robots,’ are an emerging class of robotic weapons that would be able to select and destroy targets independently of any direct human involvement (Singer, 2009). One’s position in the killer robots debate hinges largely on whether one sees AWS as an attempt to increase ‘our’ security against ‘enemies’ through high-tech defenses, or as threats in themselves, created by a blinkered politico-military system.

Even a couple of years ago, few people in diplomatic circles were talking about AWS. But since 2013, thanks to the prodding of the Campaign to Stop Killer Robots, an official debate has begun at the UN in Geneva under the auspices of the Convention on Certain Conventional Weapons (CCW). What happened? Did AWS suddenly become more dangerous? Or had something shifted diplomatic perceptions of them? In this article, we argue that this change was the result of collective action by academics, activists, and NGO workers, who reframed AWS from the ‘inevitable’ result of states’ national security efforts into a humanitarian and human rights threat.

In the next section, we show how one of international relations’ founding debates – the causes of the Peloponnesian War (431-404 BCE) – offers interesting insights into current political deliberations about killer robots. We then trace how concerns about AWS moved from the margins of academia to the NGO disarmament community and finally into official diplomatic forums. We demonstrate how campaigners have resisted states’ claims to ‘know best’ about security matters, by pushing officials to answer questions about the necessity and risks of killer robots as well as the underlying social norms that guide global security policymaking.

The Peloponnesian War and Killer Robots

The Peloponnesian War in ancient Greece pitted the upstart Athens against the established hegemon Sparta, drawing into the fray their alliances of smaller city-states. In his history of the conflict, Thucydides argued that Athens’ growing prosperity and military power, particularly its naval build-up, had provoked fear among Sparta’s elites and ‘made war inevitable’ (2014). Viewing conflict as a tragedy – shaped by his experience as a general and, later, a displaced person in the war – Thucydides was fatalistic about our potential to halt the downward spiral of suspicion in insecure times. Modern realists have drawn on Thucydides to argue that states’ rational pursuit of their interests in an anarchic international system leads inexorably to arms races, economic competition and often war (see, for example, Waltz, 1959). To achieve any modicum of security, they argue, states must arm themselves and prepare to fight.

In his play Lysistrata, Aristophanes (a contemporary of Thucydides) imagined an alternative view. The title character, a strong-willed and witty Athenian woman, suggests that the conflict between Athens and Sparta is driven in part by gender norms and the politics of Greek society. She and her female friends from all over Greece – including opposing city-states – organize a sex strike, refusing to sleep with men until they end the war. Lysistrata also organizes older women to seize the Acropolis and state treasuries, aiming to reorganize the political system in a more just and peaceful way. Whereas the armed men of Athens demand that Lysistrata be grateful for their protection, she and her fellow activists reveal how the very structure of ‘protection’ threatens women (Aristophanes, 2012).

But what does this have to do with killer robots?

Contrasting these two ancient writers, we find two views of international politics. The first – Thucydides’ ‘realist’ view – sees war as the inevitable outcome of competition between states; one can only prepare for the unavoidable war by being better armed than one’s enemies. The second – Aristophanes’ ‘constructivist’ view – sees security threats as the result of how society is organized; social systems can be reorganized in less violent ways through collective action.

There are parallels in the killer robots debate. After initially dismissing fears of killer robots as ‘science fiction’, proponents of AWS now claim that such weapons are the ‘inevitable’ result of a dangerous world, in which states must seek the most high-tech, deadly weapons to protect themselves (Anderson and Waxman, 2012:36; Reeves and Johnson, 2014). According to this realist view, killer robots are not a security threat; our enemies are. But academics, activists, and human rights organizations associated with the Campaign to Stop Killer Robots are unconvinced by these arguments. Influenced by the liberal, constructivist, feminist, and post-structuralist schools of international relations, they argue that societies are not solely or irrevocably driven by base self-interest. Instead, the behavior of states (and the development of technology) is shaped by changing social norms (see, for example, Carpenter, 2014a). These campaigners believe, as Sharkey (2012) puts it, that the development of AWS is ‘evitable’ – people can shape the technological and political future. Thus, while some argue that democratic countries like the United States need to develop AWS to prevent a ‘robotics gap’ and that prohibition is ‘unrealistic, ineffective, or dangerous’ (Horowitz, 2014; also Marchant et al., 2011; Anderson and Waxman, 2012:36; Reeves and Johnson, 2014), campaigners see this framing as a political choice. It directs people’s fear away from killer robots themselves by constructing a shadowy competing ‘Other’.

Early Concerns in Academia and the News Media

Early warnings about the humanitarian, human rights and security implications of killer robots came not from those in military and state security circles, but rather from the media and academia. A 1984 Newsweek article titled ‘The Birth of the Killer Robots’ represents one of the earliest calls for regulation of autonomy in military robotics (Rogers, 1984:51), and the 1991 Gulf War prompted debate over the impact of so-called ‘smart’ weapons (see, for example, De Landa, 1991). But for much of the 1990s and early 2000s, public discussions of high-tech weapons were dominated by those who trumpeted America’s technological military prowess. For example, writing in the military journal Parameters, Metz (2000) looked forward to the day when ‘killer robots the size of a grain of sand might search for and kill the future Saddam Husseins of the world’ while ‘leaving noncombatants untouched.’

Critical debate began in earnest in the 2000s, perhaps catalyzed by the growing use of military robots in the wars in Afghanistan and Iraq, and by extrajudicial killings by the US in Pakistan, Yemen, and Somalia. In a 2002 paper, physicists Juergen Altmann and Mark Gubrud reviewed the risks of new military technologies and called for a ‘prohibition of autonomous “killer robots”’ (p. 147). Robert Sparrow (2007), Noel Sharkey (2007; 2008; 2012), and Peter Asaro (2008; 2012) also wrote key early academic papers dismissing claims that AWS could adequately follow the laws of war. They characterized AWS as dehumanizing, lacking the capacity for empathy, mercy, and wisdom that human soldiers can bring to the ethical complexities of modern combat. These scholars and others banded together in 2009 to establish the International Committee for Robot Arms Control (ICRAC), calling on ‘the international community to urgently commence a discussion about an arms control regime’ for military robotics (ICRAC, 2009).

Getting Policymakers’ Attention

Despite the academic debate, interest in the policymaking, NGO, and advocacy arenas remained low – killer robots were ‘virtually ignored’ (Carpenter, 2014a:1). However, between 2011 and 2012, the academic conversation began to diffuse into the policymaking arena. In April 2011, Nobel Peace Laureate Jody Williams, founding coordinator of the International Campaign to Ban Landmines, called for a ban on ‘fully autonomous attack and kill robotic weapons’ in the International Journal of Intelligence Ethics (Williams, 2011). Other international NGOs issued reports (Oudes and Zwijnenburg, 2011) or statements (Bolton, Nash, and Moyes, 2012) on AWS, while the International Committee of the Red Cross raised concerns about the ‘humanitarian impact of developing technologies’ (Kellenberger, 2011). (See Carpenter 2014a for a detailed review of the emergence of the issue in the NGO community.)

At an October 2012 summit for disarmament NGOs in New York, a core group led by Human Rights Watch (HRW) began planning a global civil society coalition to fight for a ban on AWS. To prepare the ground for the campaign, HRW and the Harvard Law School’s International Human Rights Clinic released their landmark policy report Losing Humanity: The Case Against Killer Robots (HRW, 2012). A couple of days after the report’s release, the US Defense Department (2012) issued the first ever governmental policy on fully autonomous weapons, Directive 3000.09. The directive – which is not law and will expire in 2022 – requires robotic weapons systems to ‘allow commanders and operators to exercise appropriate levels of human judgment over the use of force’ (Section 4(a)). However, it has many loopholes that green-light ongoing research, provide formal mechanisms to approve weapons that contradict the directive, and allow any AWS ‘to apply non-lethal, non-kinetic force’ to nonhuman targets (see Gubrud, 2012, for a critical analysis of the directive).

The Campaign to Stop Killer Robots launched officially in April 2013, coordinated by Mary Wareham of HRW and a steering board of nine NGOs, including ICRAC. As of August 2014, it had 44 additional NGO members. It calls for immediate national moratoria and an eventual global prohibition of AWS (Garcia, 2014). A month after the Campaign’s launch, UN Special Rapporteur on extrajudicial, summary or arbitrary executions Christof Heyns presented a report to the UN Human Rights Council, echoing the call for a moratorium and international deliberations on potential regulation (Heyns, 2013). In the ensuing exchange of views – the first ever in a diplomatic forum – 20 states made statements on AWS (Campaign to Stop Killer Robots, 2013:10-28). An additional 16 countries made statements on AWS in October 2013 at the UN General Assembly First Committee.

AWS finally became an official diplomatic agenda item in November 2013. Meeting at the UN in Geneva, the States Parties to the CCW agreed to mandate an informal four-day Meeting of Experts in May 2014 (UN Office in Geneva, 2014). Eighty-seven countries participated in this meeting, listening to presentations from 18 military, legal, and NGO experts, as well as statements from civil society. While the chair’s report offered no specific recommendations (Simon-Michel, 2014), at the time of writing it was expected that the CCW states parties would renew the AWS mandate in November 2014.

Questioning States’ Gendered Policymaking Processes

Civil society campaigners have been remarkably successful in getting states talking about AWS. But like Lysistrata, many campaigners see their fight to ban killer robots as part of broader efforts to make global security policymaking more humane, just, and inclusive (Williams, 2014). During the May 2014 CCW meeting, all 18 of the ‘experts’ asked to make presentations were men. When campaigners asked conference organizers why no experts of other gender identities were included, they were told there were ‘no suitable women’ qualified to present (Williams, 2014). This, in the words of one NGO’s statement, is ‘of course preposterous’ (Article 36, 2014). UN Security Council Resolution 1325 commits states to include women in global policymaking on peace and security. Despite this, women experts were relegated to the margins of the CCW meeting, allowed to speak only in civil society statements from the back of the room or during ‘side-events.’

There were more subtle discourses, too, with proponents of killer robots depicting the civil society campaign as hysterical, or claiming that robotic weapons would avoid soldiers’ ‘emotional responses’ to war (see Roff-Perkins, 2014, and Carpenter, 2014b, for a challenge to this discourse). Some even argued that AWS have the potential to be more ethical than human soldiers, because they could supposedly one day transcend the human ‘emotions’ – namely fear, lust, or revenge (Arkin, 2009) – that Thucydides saw as the roots of violence. However, like Lysistrata, opponents of AWS question the claims of those who say they protect us, aiming to reshape the norms – including gendered ones – that govern how states conduct war. For instance, Roff-Perkins (2013) argues that killer robots are an expression of military masculinity, a technophiliac fantasy that trusts machines more than human emotions and judgment. Pro-killer robot discourse reproduced the traditional denigration of female-gendered individuals as hyperreactive, emotional, and incapable of rational or detached decision-making (see, for example, Prokhovnik, 1999).

Some of the women involved in the conference resisted this gender discrimination. Sarah Knuckey of Columbia Law School drew up a list of women actively engaged in research and policymaking regarding AWS. University of Massachusetts at Amherst scholar Charli Carpenter (2014c) and Nobel Peace Laureate Jody Williams (2014) wrote articles condemning the lack of female panelists. At the urging of women civil society activists at the CCW meeting, a group of 52 men involved in disarmament policy issued a pledge to boycott any all-male panels in forums of ‘global policymaking on peace and security’ (Article 36, 2014). Inspired by these initiatives, NGOs working on disarmament and arms control will strategize about how ‘to advance a gendered approach’ at their annual October forum in New York this year (Control Arms, 2014).

Conclusion        

The proponents of killer robots argue that AWS are the inevitable result of a rivalrous inter-state system – the same explanation Thucydides offered for the Peloponnesian War. Killer robots, in this view, are the result of a rational assessment of what militaries will need to defend themselves from 21st century security threats. However, in her now classic article on nuclear weapons discourse, the international relations scholar Carol Cohn (1987) showed how supposedly ‘common sense’ and ‘rational’ claims by the American defense establishment were actually fraught with misogynistic and racist assumptions. In the context of the killer robots debate, we must question whether such ‘realism’ is actually an objective assessment or, in fact, freighted with ideology, even fantasy. The academics, activists, and NGO workers associated with the Campaign to Stop Killer Robots argue that AWS are the result of social processes that have conditioned us to consider ‘Other People’ with contempt. By excluding a wide range of voices – including those of women – privileged policymakers can suggest that there are no alternatives to their own claims. But, à la Lysistrata, social norms – both specific laws and the processes that make them – can be challenged through the collective action of those who have been excluded. For, as the constructivist international relations theorist Alexander Wendt (1992) has put it, the anarchy of the international system is what we ‘make of it’ – it is a human system, which humans can change. We need not be grateful for the ‘protection’ of killer robots, as realists would have us believe; we may instead mimic Lysistrata and humanize the very structure of protection in the 21st century, saving ourselves from killer robots’ dehumanizing violence.

Bibliography

Altmann J and Gubrud M. 2002. Risks from Military Uses of Nanotechnology – The Need for Technology Assessment and Preventative Control. Nanotechnology – Revolutionary Opportunities and Societal Implications. Roco M and Tomellini R (eds.). Luxembourg: European Communities, 144-148.

Anderson K and Waxman M C. December 2012 and January 2013. Law and Ethics for Robot Soldiers. Policy Review: 35-49.

Aristophanes. 2012. Lysistrata. Accessed 16 August 2014.

Arkin R. 2009. Governing Lethal Behavior in Autonomous Robots. Boca Raton: Taylor & Francis.

Article 36. 2014. Against gender discrimination in global policymaking. Accessed 16 August 2014.

Asaro P. 2008. How Just Could a Robot War Be? Current Issues in Computing and Philosophy. Briggle A, Waelbers K and Brey P (eds.). Amsterdam: IOS Press, 50-64.

Asaro P. 2012. On banning autonomous weapon systems: human rights, automation, and the dehumanization of lethal decision-making. International Review of the Red Cross. 94(886): 687-709.

Bolton M, Nash T and Moyes R. 5 March 2012. Ban autonomous armed robots. Article 36. Accessed 15 June 2014.

Carafano JJ. 2014. Autonomous Military Technology: Opportunities and Challenges for Policy and Law. Heritage Foundation Backgrounder. 2932. Accessed 15 August 2014.

Cohn C. 1987. Sex and Death in the Rational World of Defense Intellectuals. Signs. 12(4): 687-718.

Campaign to Stop Killer Robots. 31 July 2013. Report on outreach on the UN report on ‘lethal autonomous robotics’. Accessed 15 June 2014.

Carpenter C. 2014a. “Lost” Causes: Agenda Vetting in Global Issue Networks and the Shaping of Human Security. Ithaca: Cornell University Press.

Carpenter C. 2014b. “Robot Soldiers Would Never Rape”: Un-packing the Myth of the Humanitarian War-Bot. Duck of Minerva. Accessed 16 August 2014.

Carpenter C. 2014c. Men in Disarmament Push for Inclusion of Women’s Voices as CCW Process on Autonomous Weapons Moves Forward. Duck of Minerva. Accessed 16 August 2014.

Chan M. 18 October 2013. Intervención en el Primer Comité de la Asamblea General: Armas convencionales (Statement at the First Committee of the General Assembly: Conventional Weapons). Accessed 15 June 2014.

Control Arms. 2014. Welcome to the Forum! Accessed 16 August 2014.

De Landa M. 1991. War in the Age of Intelligent Machines. New York: Zone Books.

Garcia D. 2014. The Case against Killer Robots. Foreign Affairs. Accessed 8 July 2014.

Gubrud M. 27 November 2012. DoD Directive on Autonomy in Weapon Systems. ICRAC. Accessed 8 July 2014.

Heyns C. 1 April 2014. Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns. A/HRC/26/36. Accessed 3 July 2014.

Horowitz M C. 5 May 2014. The Looming Robotics Gap. Foreign Policy. Accessed 8 July 2014.

Human Rights Watch and the Harvard Law School International Human Rights Clinic. 2012. Losing Humanity: The Case against Killer Robots. New York: Human Rights Watch.

International Committee for Robot Arms Control. September 2009. Mission Statement. Accessed 16 June 2014.

Kellenberger J. 8 September 2011. International Humanitarian Law and New Weapon Technologies. Accessed 8 July 2014.

Marchant GE, et al. 2011. International Governance of Autonomous Military Robots. The Columbia Science and Technology Law Review. 12(7): 272-315.

Metz S. Autumn 2000. The Next Twist of the RMA. Parameters: 40-53. Accessed 24 June 2014.

Prokhovnik R. 1999. Rational Woman: A Feminist Critique of Dichotomy. London: Routledge.

Oudes C and Zwijnenburg W. May 2011. Does Unmanned Make Unacceptable? Exploring the Debate on using Drones and Robots in Warfare. IKV Pax Christi. Accessed 7 July 2014.

Reeves SR and Johnson WJ. April 2014. Autonomous Weapons: Are You Sure These are Killer Robots? Can We Talk About It? The Army Lawyer. 1: 25-31.

Rogers M. 25 June 1984. Birth of the Killer Robots. Newsweek: 59. Accessed 24 June 2014.

Roff-Perkins H. 2013. Robots don’t lactate: Humanoid robots, warfighting and gender. The Washington Post. Accessed 15 August 2014.

Schechter E. 2014. In Defense of Killer Robots. Wall Street Journal. Accessed 15 August 2014.

Sharkey N. November 2007. Automated Killers and the Computing Profession. Computer. 40(11): 123-124.

Sharkey N. 2012. The evitability of autonomous robot weapons. International Review of the Red Cross. 94(886): 787-799.

Simon-Michel J-H. May 2014. Draft Report of the 2014 Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS). Accessed 12 July 2014.

Singer PW. 2009. Wired for War: The Robotics Revolution and Conflict in the 21st Century. New York: Penguin.

Sparrow R. February 2007. Killer Robots. Journal of Applied Philosophy. 24(1): 62-77.

Thucydides. 2014. The State of Greece from the earliest Times to the Commencement of the Peloponnesian War. The History of the Peloponnesian War. Chapter 1. Accessed 16 August 2014.

United Nations Office in Geneva. 2014. Disarmament: Lethal Autonomous Weapons. Accessed 15 June 2014.

United States Department of Defense. 2012. Directive No. 3000.09. Accessed 16 August 2014.

Waltz KN. 1959. Man, the State, and War: A Theoretical Analysis. New York: Columbia University Press.

Wendt A. 1992. Anarchy is what states make of it: the social construction of power politics. International Organization. 46(2): 391-425.

Williams J. 2011. Borderless Battlefield: The CIA, the US Military and Drones. International Journal of Intelligence Ethics. 2(1): 2-23.

Williams J. 2014. Even Killer Robots Have a Gender Gap. Foreign Policy. Accessed 8 July 2014.
