Lethal Autonomous Weapons: Means of Defense and Attack

This article has been initially published in the Revista Seguridad y Poder Terrestre
Vol. 1 N.° 2 (2022): October – December


The deployment and use of lethal autonomous weapons (LAW) systems with self-determination in their critical functions can have an impact on international security by initiating a new arms race, promoting the escalation of unpremeditated violence, and lowering the threshold for the legitimate use of force for decision-makers. In this regard, this article identifies some of the military applications assigned or potentially assignable to LAWs and analyzes the challenges for their use in twenty-first century conflicts.

Keywords: Autonomous Weapons, Means of Defense, Means of Attack, Artificial Intelligence.


War is a constant in the history of human evolution and is considered the last resort for settling a dispute between two States. However, under the current international legal framework, the warring parties must restrict the use of force, avoid disproportionate military action, and minimize collateral damage. In that context, States have used technological advances to improve the accuracy and effectiveness of weapons, modify the way they wage war, and rewrite their military doctrines. It is precisely from these advances and modifications in the available means of warfare that weapons systems have emerged with a certain degree of autonomy in their critical functions of selecting and attacking targets. This semi- or full autonomy invites reflection on the potential applications and implications of these means of waging war, since they are used not only as means of defense but, increasingly, as offensive weapons.

Lethal autonomous weapons (LAWs) are instruments of war that have benefited from progress in technology and artificial intelligence. However, they have brought with them new conflict scenarios that pose significant challenges to the existing legal framework and to international organizations, owing to their intensive use and the blurring of the lines between military forces, non-State actors, and non-combatants. For this reason, the United Nations has initiated efforts to regulate the use of LAWs; its Secretary-General, António Guterres, has stated that machines with the power to decide over a person’s life or death are “politically unacceptable, morally repugnant and should be prohibited by international law.”[1]

The analysis in this article comprises three sections: the existing debate on the definition of LAWs, their classification as means of defense/attack, and the challenges to their use in the conflicts of the twenty-first century.

What are Lethal Autonomous Weapons (LAWs)?

Currently, there is no single globally accepted definition of LAWs. However, there is some international convergence regarding their characteristics, uses, capabilities, and dangerousness. Given this lack of cohesion, some international bodies are striving to find a universally accepted concept that responds to the human rights, security, and ethical concerns raised by the use of LAWs as weapons of defense/attack. For example, the United Nations Office for Disarmament Affairs (UNODA) states that LAWs are weapons systems with autonomy in the “critical functions” of selecting and attacking targets. This definition acknowledges that some functions of such weapons are autonomous, but does not claim that all of them are.

In an attempt to reach a consensual definition, the governmental experts who participated in the Convention on Certain Conventional Weapons (CCW) in 2016 accepted the definition proposed by the International Committee of the Red Cross (ICRC), which expands on the UNODA proposal and defines LAWs as any weapons system “that can select and attack targets independently.”[2] It further states that LAWs “are weapons systems with autonomy in their ‘critical functions’ of acquisition, tracking, selection and attack of targets.”[3]

Consequently, UNODA, the Group of Governmental Experts (GGE), and the ICRC define LAWs based on their autonomous functions of detecting, identifying, and attacking targets. These bodies also note that the critical functions include the detection, identification, tracking, and selection of designated targets and the use of force to neutralize, damage, or destroy them, while stressing that these functions must remain under human control. To this end, the CCW relies on two customary rules of International Humanitarian Law (IHL)[4] and four protocols governing the use of specific weapons.[5] Likewise, the work of the GGE has made it possible to establish eleven guiding principles for the defensive/offensive use of LAWs.

Thus, thanks to technical, academic, legal, and political considerations, there is an international debate that seeks to define the realities and risks of LAWs, as well as the opposing actors and visions surrounding them. LAWs demand a search for the best cost-benefit alternative on which the interests of international actors can converge, in order to advance the construction of an international legal framework and a universal definition. However, although the participation of the private sector is important, it is the representatives of States, through international organizations, who are responsible for building the consensus needed to regulate LAWs through dialogue and negotiation.

Offensive and defensive uses of LAWs

Armed confrontation between States is the last option for settling differences. Even so, war is a constant in human history and an activity reshaped by the emerging technologies of each period of technological development. In this context, the current military applications of LAWs can be classified as offensive and defensive.

The world is experiencing what the Taiwanese scientist, entrepreneur, and writer Kai-Fu Lee calls the Third Revolution of War,[6] characterized by the use of weapons with artificial intelligence.[7] In this revolution, LAWs can perform a range of defensive and offensive tasks, offering the military sector a means to demonstrate power, provide protection, reduce casualties, mitigate costs, achieve success, ensure security, and win the will of society.

Defensive applications

Some strategic visions hold that invincibility lies in defense, while the opportunity for success lies in attack. This justifies the primarily defensive use of LAWs to monitor borders, block missile attacks, and destroy improvised explosives, as well as to keep a device hovering in the sky waiting for the best opportunity to disable air defenses or strike targets of opportunity. It should be noted, however, that many autonomous defensive means can be reconfigured to act as means of attack.

LAWs have a long history as a means of missile defense. Currently, the only fully automated systems for defense against missiles, artillery, and mortars are: (1) the Israeli “Iron Dome” program, (2) the U.S. “Terminal High Altitude Area Defense” anti-missile system (used to shoot down short-, medium-, and intermediate-range ballistic missiles), and (3) the Russian S-400 “Triumph” air defense system. In the fight against improvised explosives, the “Special Weapons Observation Reconnaissance Detection System” (SWORDS) is a good example: it was the first armed unmanned ground vehicle, although it remained under the remote control of humans. Were state-of-the-art technology incorporated, it could easily become an offensive LAW.

On the other hand, the use of LAWs as loitering munitions dates back to the 1980s, as part of the Suppression of Enemy Air Defenses[8] (SEAD) mission.[9] LAWs in this category are autonomous “fire and forget” weapons that combine the capabilities of drones and guided missiles. Among the producers and users of loitering munitions are Turkey, Israel,[10] Taiwan, China, Russia, and the United States.[11] Although there are many more, these are a clear example of the capabilities developed in the LAW sector worldwide.

Likewise, the South Korean government deployed the SGR-1 robot system to protect its border with North Korea, particularly in the demilitarized zone.[12] The SGR-1 can detect targets miles away, and this information is relayed to a control center for analysis and subsequent decision-making by a human. Even in a volatile situation, the order to shoot is not automatic; it depends on a rational decision made by a human. This is a small sample of the defensive usefulness of LAWs; some details of their offensive potential follow.

The best defense is a good offense

Today it is known that LAWs can be employed to fulfill the so-called “dull, dirty and dangerous” (“3 Ds”) missions. Moreover, thanks to their smaller size and the multiplicity of functions they can perform, tactics and maneuvers have been adapted to new scenarios that prioritize massed attack. These are known as swarm attacks, which raise serious concerns about human control over attacks coordinated by the LAWs themselves.

It follows that LAWs are increasingly used to carry out offensive actions in order to: (1) guide munitions against selected targets, (2) carry out bombing in hot zones, (3) eliminate terrorists, (4) eliminate targets of opportunity, (5) keep loitering munitions lurking over a specific area to attack later, (6) disable radar systems, (7) destroy strategic installations or anti-aircraft artillery, (8) monitor and control airspace, (9) attack armored vehicles, (10) neutralize enemies on an individualized basis, (11) perform missions in urban environments, (12) perform escort and even air combat functions, and, of course, (13) detect, identify, track, and attack any type of target. Consequently, concerns have been raised in international society about the challenges implicit in the use of these weapons.

Challenges for the use of LAWs in future conflicts

In the conflicts of this century, one of the key challenges of using LAWs is finding the most efficient way for humans to preserve, at all times, control over life-and-death decisions. This has generated an extensive debate about questions of authority and responsibility in the new scenarios of war; that is, there is an international concern to prevent LAWs from becoming fully autonomous and causing the unlawful death of innocent people. Consequently, a middle ground must be found in the use of artificial intelligence for the development of LAWs.

Building new and better AI-powered weapons requires reflection on ethical issues, since both authority and responsibility over people’s lives and deaths would be delegated to an “intelligent machine.” The importance of this topic has given rise to new disciplines of study, such as roboethics[13] and machine ethics. Under these assumptions, will human beings be able to allow an artificially intelligent automaton to decide their fate?

It is also necessary to preserve international security and avoid a new arms race. To that end, States must reach some agreement to regulate the use of autonomous weapons as instruments of attack and to define a proportional response threshold. Otherwise, the intensive use of LAWs in present and future conflicts will drive an arms race of measures and countermeasures, as each side seeks to counter its potential opponents’ capabilities to design, build, and field LAWs in a never-ending cycle.

Finally, safeguarding the cybersecurity of LAWs will be essential to the safe and accurate use of these weapons in a scenario where cyberspace is the protagonist. Otherwise, there is a risk of greater conflicts arising from the loss of command and control of LAWs, the militarization of cyberspace, and the development of autonomous cyberweapons.


As this article has shown, a universally accepted definition of LAWs is required, one that would make it possible to measure and evaluate compliance with IHL. The use of LAWs in the conflicts of this century is limited only by human ingenuity: the possibilities are numerous and move well beyond the traditional role of performing only “dull, dirty and dangerous” tasks. Everything seems to indicate that LAWs will conduct precise, limited, regulated actions (defensive and offensive) under human control.

Likewise, the deployment and use of LAW systems with autonomy in their critical functions can affect international security by initiating a new arms race, promoting the escalation of unpremeditated violence, and lowering the threshold for the legitimate use of force for decision-makers. Admittedly, LAWs will not reproduce some of the erroneous behaviors of human beings, such as attacking civilians for revenge, raping defenseless women, or acting irrationally out of fear. However, the moral dilemmas of life-and-death decision-making must be faced, as well as questions about how just remote-controlled warfare is.


  1. United Nations Organization, “Machines Capable of Taking Lives without Human Involvement Are Unacceptable, Secretary-General Tells Experts on Autonomous Weapons Systems”, (March 25, 2019), (Accessed June 6, 2022).
  2. International Committee of the Red Cross, “Autonomous Weapon Systems: Technical, Military, Legal and Humanitarian Aspects”, (Geneva: November 2014), 5,
  3. Ibid.
  4. The rules are: (1) the prohibition of using weapons that have indiscriminate effects and (2) the prohibition of using weapons that cause superfluous harm. Information extracted from: International Committee of the Red Cross, “Convención de 1980 sobre ciertas armas convencionales” (1980 Convention on Certain Conventional Weapons), (Geneva: March 2002),
  5. The protocols are: Protocol I (Non-Detectable Fragments), Protocol II (Mines, Booby Traps and Other Devices), Protocol III (Incendiary Weapons), and Protocol IV (Blinding Laser Weapons), Ibid.
  6. The two previous revolutions made use of gunpowder and nuclear weapons, respectively.
  7. Kai-Fu Lee, “The Third Revolution of Warfare. First there was gunpowder. Then nuclear weapons. Next: artificially intelligent weapons”, The Atlantic (September 11, 2021) (Accessed June 1, 2022).
  8. Christopher Bolkcom, “Military suppression of enemy air defenses (SEAD): Assessing future needs”, CRS Report for Congress (May 11, 2005),
  9. These loitering munitions are also known as kamikaze drones for obvious reasons.
  10. Israel is by far the State with the largest number of manufacturers of loitering munitions.
  11. Some examples of these loitering munitions are: Turkey (Kargu, Alpagu and Togan), Israel (Harpy and Harop), Taiwan (Chien Hsiang), China (ASN-301), Russia (Lancet), and the United States (Switchblade).
  12. For more information, see: Mark Prigg, “Who goes there? Samsung unveils robot sentry that can kill from two miles away”, Daily Mail (September 15, 2014). The South Korean government worked with the company Samsung Techwin.
  13. It could be understood as the ethics for an artificial intelligence with moral principles such as justice, equality and goodness.



The ideas contained in this analysis are the sole responsibility of the author, without necessarily reflecting the thoughts of the CEEEP or the Peruvian Army.

Image: CEEEP