Emerging and Disruptive Technologies: New Weapons in the Making?

This content was originally written for an undergraduate or Master's program. It is published as part of our mission to showcase peer-leading papers written by students during their studies. This work can be used for background reading and research, but should not be cited as an expert source or used in place of scholarly articles/books.

‘Emerging and disruptive technologies (EDTs)’ have become a recent buzzword in the technology industry and among state governments alike. The term refers to new and potentially cutting-edge technologies that are still in the early to middle stages of development (James 2013, 2). Currently, the most visible EDTs arguably come from the information technologies (IT) domain—as evidenced by the hype surrounding commercially available artificial intelligence (AI) platforms like ChatGPT and Lensa AI. EDTs also exist beyond the IT realm, finding physical form in nanotechnologies, focused-energy technologies, bio-robotics, et cetera (Klare 2018, 11; James 2013, 2).

Besides consumers, EDTs have also taken defence industries by storm, especially due to the potential for EDTs to be adapted for military use. The integration of AI with weapons systems is already in development; observers fear that the resultant autonomous weapons systems (AWS) might not only jeopardise human control over lethal force but might also disrupt the logic of the present balance of power and nuclear stability (Klare 2018, 13). In the near future, focused-energy technologies and nanotechnologies could also create directed-energy weapons (DEW), armed swarming bots, and so on—each bearing the potential to become weapons of mass destruction (Borja 2023, 353; James 2013, 2).

Given this background, it seems natural to ask: will the military adoption of EDTs improve or erode international security? This paper demonstrates that realist-oriented analyses, despite still dominating discussions of armaments and security, offer an inconclusive answer to this question. Instead, the research question can only be fully answered with a more critically oriented understanding of security and a view of armaments as socio-technical assemblages rather than purely material objects. Such an approach reveals that the military adoption of EDTs not only will exacerbate international insecurity but is already doing so.

This paper proceeds in four sections to make this argument. The first section clarifies and justifies the scope of the paper, particularly in relation to the definition of “international security”. The second section, broadly following traditional realist-oriented security studies, analyses the potential material impacts of EDTs on the battlefield and, from this, deduces the significance of the military adoption of EDTs for international security. Meanwhile, the third section responds to calls from within critical security studies to examine the “symbolic and discursive” dimensions of EDTs and armaments in general (Bousquet et al. 2017, 3). Both of these sections will first discuss the theoretical answers to the research question before applying the theoretical debate to the case studies of the military adoption of AI and space technologies—two salient EDT industries that could double in size in the upcoming years (Schwarz 2023, 297). The final section will conclude with a summary of the paper’s findings and recommendations for further research.

Unpacking “international security”

The idea of “international security” is increasingly explored at a variety of levels, including the human, social, and environmental levels (Peoples and Vaughan-Williams 2021, 3). However, most of the existing academic literature on EDTs takes a statist view of “international security”, focusing on the impacts of EDTs on state security and the prospects for armed conflict.

Even within a statist view, there are multiple ways to define and understand security. The more traditional way, broadly aligned with realist theoretical paradigms, defines security as the existential survival of a state within an international system of anarchy and distrust (Powell 2019, 52). Specifically, a state’s survival entails not only protecting its territorial integrity but also upholding its ability to make independent decisions free from external coercion. As such, “security” can be summarised as the “pursuit of freedom from threat and the ability of states and societies to maintain their independent identity and their functional integrity against forces of change which they see as hostile” (Buzan 1991, 432). In this view, achieving security involves fortifying against external threats, usually by building a large and effective military force that can either fend off attacks or deter attackers in the first place (Walt 1991; Steinbruner 1987, 23). Conversely, the outbreak of armed conflict or forced supplication to external demands are both symptomatic of poor security. In sum, the realist-oriented definition frames security as an absolute value, measurable through a military’s size and technological strength.

On the other hand, the critical turn in security studies contends that international security is best conceptualised not as an objective or material value but as a social construct (Bueger and Gadinger 2016, 88). While the sub-discipline of critical security studies (CSS) is very broad, its scholars agree that the realist implication that security is an absolute value is problematic. As Wolfers (1952, 485) seminally pointed out, even states with the greatest military capacity rarely, if ever, judge themselves to be completely secure; this highlights the difference between objective security, provided by superior material power, and subjective security, derived from the feeling of confidence that one indeed faces no threats. Moreover, this view posits that since “security” is a feeling, the achievement of “security” is essentially a social practice. More specifically, when a state builds up its own military forces, it engages in the practice of socially producing security (Bueger and Gadinger 2016, 89). On the flip side, when a state labels changes in its external security environment as threats, it engages in an act of socially producing insecurity (Bueger and Gadinger 2016, 89).

In military and war studies, the traditional, realist-oriented definition of security remains more prevalent than its critical counterpart—largely owing to how military affairs and the conduct of war are still mostly seen as calculable and logical topics that befit a ‘hard’, objective line of inquiry (Barkawi 2011, 704; Powell 2019, 52). Some recent scholarship has attempted to challenge this tendency. Notably, in contrast to strategic studies that focus on the material planning and conduct of wars, Barkawi and Brighton (2011) suggest that a school of “critical war studies” is needed to examine the socially, politically, and culturally constitutive dimensions of war (126-127). Others have also argued that, within military studies, studies of armaments would particularly benefit from more critically oriented approaches. In particular, Bousquet et al. (2017) lament that armaments studies are still dominated by discussions about the political economy of the arms trade and the geopolitical effects of armaments—while “saying very little about the weapon” and its distinct character as a “socio-technical assemblage” (4-6). Similarly, Meiches (2017) suggests that armaments, rather than being mere inanimate objects, are agents capable of influencing the psyche of their users—calling for recognition of the role the social construction of weapons plays in the production of war and insecurity (10).

This paper responds to these calls for a more socially oriented understanding of armaments, using the increasingly prominent military adoption of emerging AI and space technologies as its case studies. The next section explores the topic through a realist-oriented understanding of armaments and security as material, objective things. The following section then contrasts the inconclusiveness of realist-oriented discussions with the insights produced by a critically oriented understanding of armaments and security as possessing socially constructed dimensions.

Traditional debates around armaments and security

Having clarified the definition of “international security”, this paper now evaluates the impacts of the military adoption of EDTs on it, first through the realist paradigm and then through a CSS paradigm. As explained above, most current literature about the role of armaments in international politics adopts a realist-oriented definition of international security. Within this literature, three bodies of theory are particularly relevant to analysing the military adoption of EDTs: diffusion, arms racing, and deterrence.

However, despite sharing a definition of international security, these theories offer contrasting predictions about the impact of the military adoption of EDTs. On the one hand, the theories of diffusion and arms racing point towards a worst-case scenario in which EDTs will worsen security for all states, whether they possess them or not. The former emphasises how the pathways for the military adoption of EDTs differ from those of conventional military armaments. Indeed, as introduced above, EDTs typically emerge in civilian research realms and are only subsequently adopted by militaries. This distinctive innovation process prompted Horowitz (2020) to revisit his classical ‘military diffusion’ thesis. In his perspective, the dual civilian-military use cases of many EDTs will increase their availability, lowering barriers to adoption and facilitating the proliferation of military EDT-enabling technology to weaker actors (Horowitz 2020, 34; Leys 2018, 54). At the same time, should EDTs truly constitute a revolution in military affairs (RMA), their military adoption is likely to pose a significant challenge to stronger states with more firmly established military operating procedures, which will have to dedicate considerable capacity to overhauling their current doctrines (Horowitz 2020, 36). The combination of both predictions leads Horowitz to conclude that military EDTs will disproportionately benefit smaller powers compared to major powers (Horowitz 2020, 36). This could catalyse significant changes to the military balance of power and create the conditions for armed conflict that would indicate a poor state of international security.

Meanwhile, theories of arms racing arrive at a similarly bleak prediction for the impact of the military adoption of EDTs on international security. As Maiolo (2016) points out, weapons procurement across the international community seems to be intensified by “rapid” technological advancements, even if the new technology is confined to just one state (7). According to these observations, even if EDTs do not diffuse as easily as Horowitz (2020) predicts, the procurement of EDTs by just one state would still be enough to spark an action-reaction cycle of arms buildup among other states. According to multiple data-based studies, this process of arms buildup is, in turn, correlated with an increased chance of conflict—becoming symptomatic of poor international security as defined above (Colaresi and Thompson 2005; Gibler et al. 2005). For Glaser (2004), such a scenario is all the more likely within today’s specific security context. His concept of “suboptimal” arms racing describes how, when an already-strong state acquires too much military power, it fuels an excessive buildup of distrust and fear, ironically escalating into armed conflict (Glaser 2004, 45). Today, research into the military adoption of EDTs is mainly conducted by already-strong great or major powers, namely the US, China, Russia, and NATO (Schwarz 2023, 298). On the one hand, this could set the conditions for Glaser’s suboptimal arms racing, causing interstate tensions to skyrocket and ultimately leading to the outbreak of conflict that would indicate poor international security for all. On the other hand, in more zero-sum terms, the exclusive possession of EDTs by strong militaries could enable them to coerce smaller states. If smaller states remain unable to match these larger states by similarly adopting military EDTs, the independence and freedom of their decision-making could be compromised, again meeting the definition of poor security.

The predictions of diffusion and arms race theories are directly opposed by arguments from theories of deterrence. According to deterrence theory, the military adoption of EDTs might so alter the strategic calculus that armed conflict becomes obsolete—boosting international security as defined by the absence of conflict. Some scholars hypothesise that military EDTs could constitute an RMA comparable to the 20th-century emergence of nuclear weapons, which created the potential for mutually assured destruction (MAD) and allegedly assured an absence of conflict among nuclear-armed states (Buzan and Herring 1998, 2). Thus, in contrast to the views expressed by theories of diffusion and arms racing, the military adoption of EDTs could “solve many of today’s most wicked security problems” (Fjäder 2022, 51).

However, the predictions of deterrence theory arguably rest on shaky logical assumptions. Most notably, the explanations provided by deterrence theory assume purely defensive use of EDTs; EDT-armed states seeking to use them for coercion and aggression could instead pose an unstoppable security threat to weaker states without EDTs. Moreover, empirical examples—such as the Cuban Missile Crisis and nuclear-armed North Korea’s penchant for brinkmanship—highlight how even MAD might not completely deter conflict. Given these caveats, it appears that deterrence theory cannot predict any significant impact of the military adoption of EDTs on the current state of international security.

Having explored the theoretical debates presented by traditional security studies, this paper now turns to testing these debates against case studies. As introduced, compared with other types of EDTs, the field of AI has made the most progress in its integration with military weapons and systems. Unfortunately, the case study of the impact of military AI on security provides no clear empirical evidence or support for any of the aforementioned theories either. Many militaries have already integrated AI with surveillance and reconnaissance tools (Lague 2023). Moving forward, AI technologies are envisaged to eventually develop into AWS, which would bring the key contributions of unmatchable speed and unpredictability to the battlefield. Specifically, the algorithmic ‘black box’ of AWS makes their battlefield behaviour almost impossible to predict (Leys 2018, 53). Moreover, automated decision-making would be much faster than that of conventional weapons systems, which require human operators to seek approval for engagement through a lengthy chain of command (Schwarz 2023, 301). This will be especially so if “human-out-of-the-loop” systems, where decision-making is delegated entirely to algorithms with no requirement for human vetoes, are successfully operationalised (Leys 2018, 51). Both characteristics mean that adopting AI could allow militaries to deploy lethal force in a way that is almost impossible to counter. This technical significance of AI seems to speak to deterrence theorists; indeed, the sheer dangers of AWS could make conflict more unlikely, helping to achieve security defined as the absence of conflict. However, the military adoption of AI could also create the conditions for arms racing, which is conversely correlated with conflict. Indeed, today’s great powers seem to be competing in developing AWS, with the US, China, and Russia spending “billions of dollars” on military AI annually (Schwarz 2023, 297; Klare 2018, 10). Overall, deterrence and arms racing theories present contrasting predictions about the impact of military AI on security.

These contrasting predictions are further complicated by practical issues with the current state of AI technology, which hinder a more thorough testing of the theoretical viewpoints. While the sheer dangers of AWS might act as a compelling deterrent against the outbreak of armed violence, Wyatt (2023) points out that conflict escalation might still occur. In particular, flawed decision-making algorithms, or even random electrical errors, could cause AWS to discharge weapons or engage targets against commanders’ intentions; this could, in turn, increase the risk of unintended escalation, especially if AWS are employed near potential flashpoints (Wyatt 2023, 336). Additionally, deeply integrating AI into combat systems could exacerbate network reliance, increasing vulnerability (Leys 2018, 55). Thus, based on the current state of AI technology, it is difficult to conclude whether AWS will even be desirable enough for widespread military adoption and arms racing, or effective enough to factor into deterrence calculations. Overall, the case study of military AI seems unable to bring more clarity to the inconclusive theoretical debate about the impacts of the military adoption of EDTs on international security.

Applying the aforementioned theories to the military adoption of nascent space technologies runs into similar roadblocks. Practically, the militarisation of EDTs in the space domain could take a range of forms, from hypothetical space-launched missiles to already-existing anti-satellite technology (Borja 2023, 353; Peperkamp 2020, 46). While the former could have primarily tactical applications as an anti-aircraft or anti-missile system, the latter could have more profound implications for combat, completely preventing an adversary from using the network-reliant systems on which most advanced militaries depend today (Housen-Couriel 2015, 126). Despite the seemingly tremendous implications that military space technologies bear for disabling an adversary, whether they will actually support deterrence theory remains unclear. Like AWS, space technologies could be especially vulnerable to network attacks; in their current state, they are also limited by line-of-sight operating requirements or even unfavourable weather conditions such as cloud coverage (Borja 2023, 356). Space weapons might therefore prove insufficiently reliable to act as a strong deterrent. On the other hand, empirical evidence for the worst-case scenario presented by theories of arms racing and diffusion is also difficult to find. This is largely due to the difficulties of identifying weapons intended for military use in space. Unlike most conventional weapons, which typically have a clear warhead and delivery system, space weapons could take either kinetic forms, like conventional missiles, or non-kinetic forms, like high-energy beams or even cyber-software targeted at space infrastructure (Peperkamp 2020, 48; Czajkowski 2023, 370). Thus, unlike conventional arms races, an “arms race in space” will likely not even be detectable—in which case it does not fulfil arms race theory’s basic criterion that two or more states be in direct and observable competition in arms production (Peperkamp 2020, 48). Moreover, compared to other EDTs, space technologies are costly and niche, even within civilian industries (Czajkowski 2023, 366). It thus remains to be seen whether they will diffuse and destabilise the international security environment, as Horowitz predicted above.

Overall, despite originating from the same school of thought, working from the same definition of security, and examining the same case studies, the three concepts from conventional security studies produce very different predictions for the impact of EDTs on international security. Moreover, the overall ambiguity surrounding the tactical details of EDTs means that, at this stage, no prediction offered by realist theories is markedly more compelling than the others.

The critical contribution: Towards a socio-technical understanding of armaments and security

Given the inconclusiveness of realist-oriented predictions for the impact of the military adoption of EDTs on security, this paper now turns to more critically oriented discussions. As introduced above, critical studies of armaments differ from traditional studies in two ways. Firstly, critical studies reframe “security” not as an objective quality derived from differences in material strength but as a subjective practice influenced by perceptions and feelings of threat. Secondly, critical studies posit that weapons are not just objects but objects informed by social context to produce perceptions of power and/or threat. This section demonstrates how this socio-technical reframing of both security and armaments foregrounds the military adoption of EDTs as intrinsically linked to the production of insecurity.

To begin with, it is possible to argue that the military adoption of EDTs represents an inherent challenge to international security—especially because the very idea aggravates states’ perceptions of their insecurity. This becomes evident upon reflecting that the process of military adoption of EDTs is essentially a process of weaponisation. As Bousquet et al. (2017) point out: “no human artefact is intrinsically a weapon” (1). The very framing of EDTs as technologies for military adoption thus invents a source of threat to states, even though that threat is materially almost non-existent; this amounts to a process of producing insecurity. The insecurities generated by the weaponisation of EDTs become all the more pronounced when analysing the etymology surrounding EDTs. Indeed, Csernatoni and Martins (2023) examine what it means to be “disruptive”, arguing that the term implies the existence of a ‘normal’ or default context and that the impact of the disruption depends on the context to which it is applied. Originally, “disruptive” technologies described relatively benign consumer products such as the smartphone, which had the potential to overturn trends in the commercial market (Van Horn 2002; Christensen et al. 2001). When EDTs are viewed as potential military tools, images of disruption are transposed from the context of market trends to the context of international security. This, in turn, implies that the disruption affects not just market trends but state security as a whole. In this view, the discursive framing of the military adoption of EDTs is inherently a process of threat construction that produces insecurity rather than security (Csernatoni and Martins 2023, 6).

Ideas about the military adoption of EDTs as a source of state insecurity are also encompassed in Stevens’ (2015) conceptualisation of security as an “inherently temporal proposition” (1). According to Stevens, the practice of achieving security draws on experience of past threats and speculation about future ones. In attempting to guarantee security against threats that do not yet exist, states always seem to be playing a catch-up game to align their capabilities with imagined and hypothetical threats—thus inevitably creating a perpetual feeling of insecurity (Stevens 2015, 183). While Stevens applied this argument to the ever-expanding cyber domain, which forces states into a nonstop catch-up game, his argument is clearly suitable for assessing EDTs, too. EDTs are, by definition, “emerging” and uncertain; this provides infinite potential for EDT-related threats and prompts states to fall into a perpetual process of generating insecurity.

Critical security studies’ contributions to understanding the impact of the military adoption of EDTs are not merely theoretical. Indeed, practical examples of official political discourse demonstrate that the topic is already producing international insecurity. To illustrate this, the rest of the paper will analyse the “representation practices” surrounding the military adoption of EDTs (Dunn and Neumann 2016, 262). Specifically, this paper will conduct a thematic discourse analysis, identifying the themes and images that commonly surround discourse about the military adoption of EDTs. The aims of this discourse analysis are twofold. Firstly, it will compare and contrast the material status of the military adoption of EDTs with the way the matter has been represented textually; any discrepancies could reveal more about EDTs’ role in producing security or insecurity. Secondly, it will investigate the intersubjective nature of discourse on the military adoption of EDTs by observing the impact of such discourse on international hostilities and tensions, which could have ramifications for international security.

The official discourse surrounding the military adoption of EDTs suggests that the topic indeed provides material for threat construction, generating insecurity rather than security for states. This is highlighted especially by how the military adoption of EDTs becomes caught up in broader conversations about geopolitical rivalry, particularly within Western security discourse. For instance, the fact sheet for the NATO 2030 strategy highlights how, with the military adoption of EDTs, “NATO allies can no longer take their technological edge for granted” (NATO 2021, 2, italics added). This is taken even further by the US; its most recent National Security Strategy white paper states that EDTs can “pose novel threats to the United States and [its] allies and partners”, especially if adopted by “authoritarian” and “revisionist” states, with China and Russia being named explicitly (The White House 2022, 21, italics added). Two things are of note in both statements. First, there is a clear discrepancy between the discursive certainty of the threat of military EDTs and the reality that most military EDTs do not yet exist. Second, the clear themes of ‘othering’ reveal a common, deliberate choice to represent the military adoption of EDTs from a discursive angle of geopolitical rivalry and competition between alliances, which is already laced with themes of distrust and suspicion. Overall, both observations highlight how the very idea of military adoption provides an outlet through which the US and NATO construct a collective threat, further fuelling perceptions of insecurity.

Beyond the broad military adoption of EDTs, the case studies of military AI and the military adoption of space technologies provide further demonstration of the production of insecurity and show how discourse becomes intersubjective, creating tensions that exacerbate insecurity. Specifically, the discourse surrounding the military adoption of AI not only inflates the threat of the technology but also becomes a domain in which destabilising superpower competition occurs. This is particularly evident in the US establishment of AI “battle labs” in Europe and INDOPACOM (US DoD 2023). In reality, these “battle labs” refer to hackathon events used to scout individuals who can assist the US in developing AI-based support systems like language models and communications systems (US DoD 2023). The discursive selection of the term “battle” thus presents a discrepancy between the reality of military AI’s primarily support-based role and the images of violence generated. The discourse surrounding military AI thus seems to create insecurities, echoing the conflation of military EDTs and geopolitical rivalry described above.

Such discourse has, moreover, been met with responses from beyond the Western sphere, which further generate insecurity and exacerbate international tensions. Notably, an article published on Xinhua Net, the state-owned news agency of the People’s Republic of China (PRC), accused the US military of promoting AI to “suppress its opponents (打压对手 / da ya dui shou)” (Xinhua 2022). The same phrase also appeared in an article on the popular Chinese military news site Zhongguojunwang, which was later reposted on the state-controlled China.org.cn news page (Fu 2022). This year, the phrase appeared to gain further traction within more official discourse, appearing on the official webpages of the Chinese foreign ministry and Chinese embassies to describe the dangers of American technological hegemony (MFA of the PRC 2023; PRC Embassy in France 2023; PRC Embassy in Samoa 2023). Evidently, the military adoption of EDTs has generated insecurity not only for Western states but also among their rivals like China. However, a deeper discursive analysis suggests that this insecurity is more extensive than meets the eye. “Da ya” is not the only Chinese phrase expressing ‘suppression’: an alternative, milder term is “压制 / ya zhi”, which directly translates as ‘using pressure to control’ an opponent’s actions (my translation). In contrast, the repeated phrase “da ya” directly translates as ‘beating [someone] into submission’, connoting a more combative and aggressive form of suppression (my translation). Thus, there is a clear discrepancy between the US military’s current employment of AI in mainly support and logistical roles and popular Chinese discursive representations of imagined physical violence.

Overall, discourse analysis of the military adoption of AI reveals two important findings. Firstly, the military adoption of AI has certainly become a trigger for the production of insecurity rather than security. Secondly, the official statements and language surrounding the military adoption of AI contribute to a more tense and hostile international security environment, increasing the chances of conflict and thereby diminishing international security in the more traditional sense as well.

A similar pattern of insecurity is observed in the discourse surrounding the military adoption of space technologies. On the one hand, as with the military adoption of AI, the militarisation of space technologies is often framed and justified within the context of great power competition, providing fuel and justification for continued rivalry. Indeed, the US uses the “return to great power competition” as justification for its outer space missile defence programme, explicitly identifying China as its “pacing stick” and main rival (Borja 2023, 360; US DoD 2022, 111). Meanwhile, China and Russia cite the US space programme as proof of its ambitions for “dominance” (Borja 2023, 360). Interestingly, such discourse provides a direct counterpoint to the above-mentioned arms-racing theories, highlighting how a security dilemma surrounding the militarisation of outer space still seems to be occurring despite the impossibility of identifying how much other states are really investing in their programmes.

On the other hand, other discourse also reveals how the military adoption of space technologies seems to decouple the production of insecurity from the identification of material and empirical threats to physical territory, making the former possible even in the absence of the latter. The possibilities for extremely asymmetric space-based military technologies seem relatively unrealistic, especially given the UN Outer Space Treaty’s ban on the placement of WMD in space and of armaments on the moon and other celestial bodies (Quinn 2008, 476). In fact, some optimistic scholars even suggest that military-owned space systems could potentially provide protection against incoming asteroids and other cosmic events—achieving security for all states (Duvall and Havercroft 2008, 762). Despite these considerations, the military adoption of space technologies is more often framed as a security issue for states, even without clear evidence of threats to territory. For the US, the “serious threat” emanating from the military adoption of space technologies is attributed to how competitors can use military space technology to “refashion” and “destabilise” the “free and open order”; physical threats to the US homeland are only included on a separate page, like an afterthought (US DoD 2022, 4-5; US DoD 2023, 4). Similarly, China’s 2019 defence white paper claims that other major powers like the US and NATO have “undermined strategic stability” by propagating “threats to outer space” (State Council Information Office of the PRC 2019, Section 1). In both examples, no concrete explanation is given for how kinetic weapons threaten abstract ideas like freedom and stability. Moreover, in both examples, the militarisation of outer space is presented alongside other more tangible and physical security concerns despite obviously being a more abstract topic. Thus, for scholars like Peoples (2011), such statements not only link but also securitise the discrete ideas of values, world order, and the military adoption of space technologies, using the latter as an outlet for threat construction and the production of insecurity.

Overall, the discourse surrounding the military adoption of space technologies suggests that the topic is often exaggerated for the purposes of threat construction. Like AI and other EDTs more generally, the military adoption of space technologies has thus provided a platform for the generation of international insecurities.

Conclusion

In conclusion, this paper discussed the effects of the military adoption of EDTs on international security, focusing specifically on the case studies of AI and space technologies as emblematic representatives of the EDT industry. This paper argues that the military adoption of EDTs not only will challenge international security but is already doing so. This argument was supported less by realist-oriented theories, which offered no clear predictions, than by considering the subjective dimensions of “international security”, which opened up room to examine the social impacts of the military adoption of EDTs. In this perspective, both the discourse surrounding the military adoption of EDTs and the very etymological implications of the term highlight how the topic seems to generate more insecurity than security.

Beyond answering the research question, this paper also presents important contributions to the broader theoretical debates within the sub-discipline of security studies. Specifically, due to the highly technical nature of contemporary weapons systems, the armaments sub-field remains dominated by traditionalist, materialist approaches. However, this paper demonstrates that analyses that over-emphasise these technical aspects run the risk of becoming techno-fetishist. Both armaments and security have material as well as social dimensions; questions about the relationship between both can only be comprehensively answered when such duality is acknowledged.

One key decision this paper made was to focus on “international security” from a statist perspective in order to engage more deeply with the current literature on military EDTs and to highlight the limitations of its over-emphasis on technical analysis. Moving forward, considering other definitions of “international security” could form the basis for further projects, which could supplement the views provided by this paper. For instance, throughout its analysis, this paper primarily drew evidence from the superpowers, since they are at the forefront of EDT development and evidence from these states is the most widely available. However, given the low barriers to adoption for some EDTs, further research could consider the impact of the diffusion of military EDTs to violent non-state actors. Such trends could speak to ideas about the erosion of state sovereign power, an important aspect of international security that this paper chose not to focus on (Gruszczack and Kaempf 2023, 2). From a human security perspective, EDTs could also boost the survivability of individual soldiers or even civilians caught in conflict-ridden regions, for instance through augmented personal protective equipment derived from bio-robotics technologies (Fish et al. 2018, 18). Exploring alternative definitions of “security” could, therefore, form a compelling and vital starting point for further discussions of the topic.

References

Barkawi, Tarak and Shane Brighton. 2011. “Power of War: Fighting, Knowledge, and Critique”. International Political Sociology 5: 126-143. https://doi.org/10.1111/j.1749-5687.2011.00125.x.

Barkawi, Tarak. 2011. “From War to Security: Security Studies, the Wider Agenda and the Fate of the Study of War”. Millennium: Journal of International Studies 39 (3): 701-716. https://doi.org/10.1177/0305829811400656.

Borja, Lauren J. 2023. “High-Energy Laser Directed Energy Weapons: Military Doctrine and Implications for Warfare”. In Routledge Handbook of the Future of Warfare, ed. Artur Gruszczack and Sebastian Kaempf. London: Routledge.

Bousquet, Antoine, Jairus Grove and Nisha Shah. 2017. “Becoming weapon: an opening call to arms”. Critical Studies on Security 5 (1): 1-8. https://doi.org/10.1080/21624887.2017.1343010.

Bueger, Christian and Frank Gadinger. 2016. “International Practice Theories”. In Praxeological Political Analysis, ed. Michael Jonas and Beate Littig. New York: Routledge.

Buzan, Barry. 1991. “New Patterns of Global Security in the Twenty-First Century”. International Affairs 67 (3): 431-451. https://www.jstor.org/stable/2621945.

Buzan, Barry and Eric Herring. 1998. The Arms Dynamic in World Politics. Colorado: Lynne Rienner Publishers.

Christensen, Clayton, Thomas Craig and Stuart Hart. 2001. “The Great Disruption”. Foreign Affairs 80 (2): 80-95. https://www.jstor.org/stable/20050066.

Colaresi, Michael P. and William R. Thompson. 2005. “Alliances, Arms Buildups and Recurrent Conflict: Testing a Steps-to-War Model”. The Journal of Politics 67 (2): 345-364. https://www.jstor.org/stable/10.1111/j.1468-2508.2005.00320.x.

Csernatoni, Raluca and Bruno Oliveira Martins. 2023. “Disruptive Technologies for Security and Defence: Temporality, Performativity and Imagination”. Geopolitics 28 (5): 1-24. https://doi.org/10.1080/14650045.2023.2224235.

Czajkowski, Marek. 2023. “Space-Based Systems and Counterspace Warfare”. In Routledge Handbook of the Future of Warfare, ed. Artur Gruszczack and Sebastian Kaempf. London: Routledge.

Dunn, Kevin C. and Iver B. Neumann. 2016. “Discourse Analysis”. In Routledge Handbook of International Political Sociology, ed. Xavier Guillaume and Pınar Bilgin. Milton: Taylor and Francis Group.

Duvall, Raymond and Jonathan Havercroft. 2008. “Taking Sovereignty out of This World: Space Weapons and Empire of the Future”. Review of International Studies 34 (4): 755-775. https://www.jstor.org/stable/40212501.

Fish, Lauren, Paul Scharre, Katherine Kidder and Amy Schafer. 2018. “Emerging Technologies”. Center for a New American Security. October 2018. https://www.jstor.org/stable/resrep20410.

Fjäder, Christian. 2022. “Emerging and Disruptive Technologies and Security: Considering Trade-Offs Between New Opportunities and Emerging Risks”. In Disruption, Ideation and Innovation for Defence and Security, ed. Gitanjali Adlakha-Hutcheon and Anthony Masys. Springer. Published online 2022. Accessed via https://link.springer.com/book/10.1007/978-3-031-06636-8.

Fu, Bo. 2022. “美持续推动人工智能作战运动” [The US Continues to Promote the Use of AI in Combat Operations]. News article. Zhongguoguofangbao. Zhongguojunwang. Published 6 Jul, 2022. http://www.81.cn/gfbmap/content/21/2022-07/06/04/2022070604_pdf.pdf.

Fu, Bo. 2022. “美持续推动人工智能作战运动” [The US Continues to Promote the Use of AI in Combat Operations]. News article. China.org.cn. Reposted from Zhongguojunwang, 07 Jul, 2022. http://home.china.com.cn/txt/2022-07/07/content_42028556.htm.

Gibler, Douglas M., Toby J. Rider and Marc L. Hutchinson. 2005. “Taking Arms Against a Sea of Troubles: Conventional Arms Races During Periods of Rivalry”. Journal of Peace Research 42 (2): 131-147. https://doi.org/10.1177/0022343305050687.

Glaser, Charles. 2004. “When Are Arms Races Dangerous? Rational versus Suboptimal Arming.” International Security 28 (4): 44-84. https://www.jstor.org/stable/4137449.

Gruszczack, Artur and Sebastian Kaempf. 2023. Routledge Handbook of the Future of Warfare. London: Routledge.

Horowitz, Michael. 2020. AI and the Diffusion of Global Power. Research report chapter. Centre for International Governance Innovation. Accessed 18 Oct, 2023 via https://www.jstor.org/stable/resrep27510.8.

Housen-Couriel, Deborah. 2015. “Cybersecurity and Anti-Satellite Capabilities (ASAT): New Threats and New Legal Responses”. Journal of Law and Cyber Warfare 4 (3): 116-149. https://www.jstor.org/stable/26441259.

James, Andrew D. 2013. “Emerging Technologies and Military Capability”. S. Rajaratnam School of International Studies. Policy Brief. Accessed 3 Oct, 2023. https://www.jstor.org/stable/resrep05804.

Klare, Michael. 2018. “US, Russia Impede Steps to Ban ‘Killer Robots’”. Opinion article. Arms Control Today 48 (4): 31-33. Accessed 05 Oct, 2023. https://www.jstor.org/stable/10.2307/90025262.

Lague, David. 2023. “Human-machine teams driven by AI are about to reshape warfare”. News article. Reuters. Updated 8 Sep, 2023. Accessed 9 Nov, 2023, via https://www.reuters.com/technology/human-machine-teams-driven-by-ai-are-about-reshape-warfare-2023-09-08/.

Leys, Nathan. 2018. “Autonomous Weapons Systems and International Crises”. Strategic Studies Quarterly 12 (1): 48-73. https://www.jstor.org/stable/10.2307/26333877.

Maiolo, Joseph. 2016. “Introduction”. In Arms races in international politics: from the nineteenth to the twenty-first century, ed. Thomas Mahnken, Joseph Maiolo, and D. Stevenson, 1-10. Oxford: Oxford University Press.

Meiches, Benjamin. 2017. “Weapons, desire, and the making of war”. Critical Studies on Security 5 (1): 9-27. https://doi.org/10.1080/21624887.2017.1312149.

MFA of the PRC. 2023. “US Hegemony and its Perils”. Report. Released February 2023. Accessed via http://cy.china-embassy.gov.cn/eng/xwdt/202302/t20230220_11028013.htm.

NATO. 2021. NATO 2030: Factsheet. Fact sheet. https://www.nato.int/nato_static_fl2014/assets/pdf/2021/6/pdf/2106-factsheet-nato2030-en.pdf.

Peoples, Columba. 2011. “The Securitization of Outer Space: Challenges for Arms Control”. Contemporary Security Policy 32 (1): 76-98. https://doi.org/10.1080/13523260.2011.556846.

Peoples, Columba and Nick Vaughan-Williams. 2021. Critical Security Studies: An Introduction (Third Edition). New York: Routledge.

Peperkamp, Lonneke. 2020. “An Arms Race in Outer Space?”. Atlantisch Perspectief 44 (4), Special Edition: 46-50. https://www.jstor.org/stable/10.2307/49600572.

Powell, Rhonda. 2019. Rights as Security: the Theoretical Basis of Security of Person, First edition. Oxford: Oxford University Press.

PRC Embassy in France. 2023. 美国的霸权霸道霸凌及其危害 [Report: The US’s Hegemony, Bullying, and its Dangers]. Official report. PRC Embassy in France. Published 20 Feb, 2023. http://fr.china-embassy.gov.cn/zgyw/202302/t20230220_11027609.htm.

PRC Embassy in Samoa. 2023. 美国的霸权霸道霸凌及其危害 [The US’s Hegemony, Bullying, and its Dangers]. Official report. PRC Embassy in Samoa. Published 03 Feb, 2023. http://ws.china-embassy.gov.cn/sgxw/202303/t20230302_11033816.htm.

Quinn, Adam G. 2008. “The New Age of Space Law: The Outer Space Treaty and the Weaponization of Space”. Minnesota Journal of International Law 17: 475-502. Accessed via https://core.ac.uk/download/pdf/217210297.pdf.

Schwarz, Elke. 2023. “Cybernetics at War: Military Artificial Intelligence, Weapon Systems and the De-Skilled Moral Agent”. In Routledge Handbook of the Future of Warfare, ed. Artur Gruszczack and Sebastian Kaempf. London: Routledge.

State Council Information Office of the PRC. 2019. China’s National Defense in the New Era. White Paper. Published July 2019. Accessed via https://english.www.gov.cn/atts/stream/files/5d3943eec6d0a15c923d2036.

Steinbruner, John. 1987. “The Principles of Defensive Deterrence”. The Brookings Review 5 (3): 22-28. https://www.jstor.org/stable/20079982.

Stevens, Tim. 2015. Cyber Security and the Politics of Time. Cambridge: Cambridge University Press.

The White House. 2022. National Security Strategy. White paper. Published October, 2022. https://www.whitehouse.gov/wp-content/uploads/2022/10/Biden-Harris-Administrations-National-Security-Strategy-10.2022.pdf.

US DoD. 2022. 2022 National Defense Strategy of the United States of America. White paper. Published October, 2022. https://media.defense.gov/2022/Oct/27/2003103845/-1/-1/1/2022-NATIONAL-DEFENSE-STRATEGY-NPR-MDR.PDF.

US DoD. 2023. Space Policy Review and Strategy on Protection of Satellites. Report. Published September 2023. Accessed via https://media.defense.gov/2023/Sep/14/2003301146/-1/-1/0/COMPREHENSIVE-REPORT-FOR-RELEASE.PDF.

Van Horn, Royal. 2002. “Disruptive Technology”. The Phi Delta Kappan 83 (7): 492, 494. https://www.jstor.org/stable/20440180.

Walt, Stephen. 1991. “The Renaissance of Security Studies”. International Studies Quarterly 35 (2): 211-239. https://www.jstor.org/stable/2600471.

Wolfers, Arnold. 1952. ““National Security” as an Ambiguous Symbol”. Political Science Quarterly 67 (4):481-502. https://www.jstor.org/stable/2145138.

Wyatt, Austin. 2023. “Lethal Autonomous Weapon Systems and their Potential Impact on the Future of Warfare”. In Routledge Handbook of the Future of Warfare, ed. Artur Gruszczack and Sebastian Kaempf. London: Routledge.

Xinhua. 2022. “美军加速人工智能技术在实战方面的应用 引发前沿军事竞赛” [The US Military Accelerates the Application of Artificial Intelligence in Real Combat, Triggering an Arms Race]. News article. Xinhua Net. Published 08 Jul, 2022. http://www.xinhuanet.com/mil/2022-07/08/c_1211665114.htm.
