
Lawful Means of Warfare in the Age of Technology: Examining the Use of Loitering Munitions in Conformity with International Humanitarian Law

  • Guusje van der Vorst

Introduction

International humanitarian law (IHL) governs international and non-international armed conflicts and seeks to limit the harmful effects of armed conflict. With its dual function of regulating the conduct of parties to a conflict and protecting those not or no longer participating in the hostilities, IHL balances legitimate military action with the objective of limiting harm and human suffering.2 The core IHL instruments are the four Geneva Conventions of 1949, which are universally ratified, and the three Additional Protocols of 1977 and 2005.3 As today’s world witnesses the significant and continuous advancement of technology, its risks and consequences in the context of contemporary armed conflict must be considered. The present essay investigates to what extent loitering munitions may be used in line with IHL requirements by examining how their legitimate use is affected by, first, the extent to which they qualify as autonomous weapons systems (AWS) and, second, the degree of adequate human judgment and control maintained in the development and use of these munitions. 

Loitering munitions are defined as unmanned, expendable aircraft that integrate sensor-based analysis to detect their targets and crash into them; as single-use technologies, they are destroyed once used to attack a target.4 These munitions are distinguished by their ability to remain airborne over the battlefield, blurring the line between drones and missiles.5 Moreover, owing to technological advances, loitering munitions may be able to identify their targets autonomously through software, giving rise to a debate that will be explored below using the fundamental principles and rules of IHL on lawful means and methods of warfare.6 

Qualification of loitering munitions as autonomous weapons systems

When comparing loitering munitions with other warfare munitions, several distinctions can be made. First, loitering munitions differ from precision-guided munitions in that the former can be launched by their operators with a larger window for detonation and target attack.7 Like drones, loitering munitions can stay airborne for an extended period of time while they search for and identify their intended targets, and they can be remotely operated by military personnel. However, loitering munitions are further distinguished from drones in that they are fitted with a warhead, making them similar to a projectile that cannot be reused after an attack, whereas drones carry projectiles and can be reused.8 The expendability of loitering munitions, reflected in their comparatively lower unit costs, therefore serves as a distinguishing feature and allows them to be used in large numbers. 

However, their expendability as ‘kamikaze’ weapons, coupled with their relatively autonomous capacities based on artificial intelligence-driven detection and recognition systems, has given rise to significant concerns. While in many cases a human operator must approve an attack by a loitering munition before it can be executed, limiting its autonomous capabilities, technical system upgrades involving AI software and sophisticated sensors may increasingly expand that autonomy.9 These concerns appear justified considering that many loitering munitions can be launched by their human operators without knowledge of the specific time or place of the attack.10 

Weapons systems are commonly divided into three categories: those with humans ‘in the loop’, selecting and engaging targets; ‘on the loop’, overseeing the weapon’s selection and engagement of targets after activation; and ‘out of the loop’, where the weapon selects and engages targets after activation with no possibility of human intervention to stop an attack.11 According to the authoritative view of the International Committee of the Red Cross (ICRC), an autonomous weapons system (AWS) may be defined as a weapons system with autonomy in its functions of selecting and attacking targets without human intervention.12 Similarly, under United States policy, lethal autonomous weapons systems are defined as those that can independently identify, engage, and destroy targets using sensors and computer algorithms, without manual human control over the system.13 Applying the common element of these definitions, namely autonomy in the functions of identifying and attacking targets without human intervention, it is clear that certain loitering munitions fall within this scope. For example, loitering munitions that leave human control behind once launched and can autonomously select their targets and attack without prior human approval may be considered autonomous weapons systems.14 In other cases, as set out above, loitering munitions will select a target independently but can execute the attack only if approval is given by their human operators. In such instances, they may not fall within the scope of autonomous weapons systems. 

Therefore, the extent to which loitering munitions can be qualified as autonomous weapons systems depends on the (non-)existence of human intervention in their functions of selecting and attacking targets. It may subsequently be argued that loitering munitions qualifying as fully autonomous weapons systems are more susceptible to committing violations of IHL principles, which will be analyzed in the third section. 


Adequate human judgment and control in development and use of loitering munitions

The increasing development and use of unmanned systems significantly impersonalize combat by enhancing the physical and emotional distance from the battlefield.15 The degree of human judgment and control maintained in the development, and especially the use, of loitering munitions may affect their legitimate use under IHL. A higher degree of human judgment and control in the use of loitering munitions may generate greater accountability for possible violations, but may also prevent violations of fundamental IHL rules and principles in the first place. Without such human control and judgment, loitering munitions may be unable to make the judgment calls necessary to distinguish between surrendering combatants, civilians participating in hostilities, and non-participants carrying weapons. Similar concerns were expressed by several state parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects16, including the United States, which is itself developing autonomous weapons. These parties formulated draft articles on AWS emphasizing the need for human control “as required in the circumstances of their use, by the principles of distinction and proportionality”.17 

It may be argued that developing loitering munitions with a high degree of human control and judgment may improve their capability to conduct attacks in line with IHL, by running on well-designed software that allows adequate calculation of complex circumstances without the interference of human emotions, such as anger, influencing conduct.18 Nevertheless, as even human judgment is prone to faulty interpretation in context-specific cases, leaving the identification of targets and execution of attacks to fully autonomous loitering munitions operating on a predetermined set of factors and scenarios may inevitably lead to mistaken military targets and violations of IHL, rendering their use illegitimate. Therefore, maintaining a high degree of human control may enhance their use in conformity with IHL. 


Loitering munitions’ conformity with fundamental principles of IHL

Under Article 36 of Additional Protocol I to the Geneva Conventions (hereinafter ‘AP I’), which has been largely recognized as customary international law, states must determine, when studying, developing, acquiring, or adopting a new weapon or other means of warfare, whether it would in some or all circumstances of its employment be contrary to the Protocol or another rule of international law.19 In its commentary to AP I, the ICRC already emphasized its concerns regarding (partly) autonomous weapons and potential issues of compliance with IHL, concerns which have become increasingly relevant in light of present technological developments.20 The basic rule in Article 35(1) AP I on the means and methods to be used in combat demonstrates that parties to a conflict do not have an unlimited right to choose methods and means.21 Moreover, all new weapons must be reviewed for their compatibility with IHL and other international obligations, whereby attention must be paid to their predictability in use.22 The fundamental IHL principles of distinction, military necessity and proportionality, precautions in attack, and environmental protection are particularly important for evaluating the lawfulness of means of combat.

First, the principle of distinction obliges all participants in armed conflict to distinguish between combatants and non-combatants, limiting the scope of lawful attack to military objectives.23 Non-participating civilians and civilian objects may therefore not be subjected to military attack, and the means or methods used may not result in indiscriminate killing. In the case of fully autonomous weapons, the ability to accurately distinguish between combatants and civilians, persons hors de combat, or surrendering individuals in contemporary combat environments may be undermined, hindering compliance with IHL.24 For example, for combatants not identifiable by physical markings, or civilians carrying weapons while not involved in hostilities, accurate distinction requires contextual considerations that an autonomous weapon may not be able to make on the basis of generalized software. Moreover, the ongoing debate and uncertainty regarding the exact scope of direct participation in hostilities underline the indispensability of human judgment to take account of context in novel or particular cases. Therefore, the extent to which loitering munitions qualify as autonomous weapons systems and the extent to which human control and judgment are diminished adversely affect the likelihood of such munitions complying with the principle of distinction. 

Secondly, under the principle of military necessity, destruction must be imperatively demanded and unnecessary suffering should be avoided.25 Any weapons used must therefore not cause suffering unnecessary or superfluous to advancing the military objective. This relates to the principle of proportionality, which prohibits attacks if the expected harm caused to civilians outweighs the anticipated direct military advantage of the attack.26 As with distinction, the determination that a military operation is necessary and proportional is context-dependent, and the essential case-by-case approach defeats the argument that AWS can be pre-programmed to interpret all potential scenarios.27 The International Criminal Tribunal for the Former Yugoslavia (ICTY) confirmed that this involves a complex process of determining the relative value of the military advantage and the injury to civilians or civilian objects.28 The reasonable military commander standard, despite its margin of discretion, requires detailed examination in every specific case.29 Moreover, beyond the principles of military necessity and proportionality, adherence to the principle of precautions in attack to avoid or minimize incidental harm to civilians or civilian objects30, the principle of humanity31, and other ethical considerations, such as showing mercy, may be inadequately ensured by fully automated systems without human control or judgment.32 

Thirdly, it is prohibited to employ methods or means of warfare intended or expected to “cause widespread, long-term and severe damage to the natural environment”.33 Loitering munitions, with or without human intervention, can be developed to select and attack targets other than the natural environment. Even where pursuing a military objective leads loitering munitions operating without human intervention to damage the natural environment, such damage must not reach this prohibited threshold if their use is to remain lawful. Albeit a high threshold, retaining adequate human control and judgment can help ensure conformity with this principle of environmental protection. 

In conclusion, while loitering munitions qualifying as AWS are not generally prohibited, their employment as fully autonomous weapons systems may increase the risk of IHL violations if they fail to adhere to the fundamental principles and rules on means of combat, which would render their use unlawful. In particular, the principles of distinction, military necessity, and proportionality may require human judgment that a fully autonomous weapon does not possess. Therefore, maintaining a higher degree of human control and judgment in the development and use of loitering munitions can improve conformity with the fundamental IHL principles and may positively affect the extent to which loitering munitions may be lawfully used in line with the requirements of IHL. 


Reference List

1 Essay instructions: Loitering munitions are playing an increasingly prominent role in contemporary armed conflict. Advise on whether and if so how the following aspects affect the extent to which loitering munitions may be used in line with the requirements of IHL: a) the extent to which these munitions qualify as autonomous weapons systems; and b) the extent to which adequate human judgment and control is maintained in the development and use of these munitions.

2 Emily Crawford and Alison Pert, International Humanitarian Law (2nd edn, Cambridge University Press 2020) 

3 Jamie A Williamson, ‘Challenges of Twenty-First Century Conflicts: A Look at Direct Participation in Hostilities’ (2010) 20(3) Duke Journal of Comparative & International Law 457

4 Ingvild Bode and Thomas Frank Arthur Watts, ‘Loitering Munitions and Unpredictability: Autonomy in Weapon Systems and Challenges to Human Control’ (Center for War Studies, 2023) 

5 Ibid. 

6 Dan Gettinger and Arthur Holland Michel, ‘Loitering munitions’ (Center for Study of the Drone, 2017) <https://dronecenter.bard.edu/files/2017/02/CSD-Loitering-Munitions.pdf> accessed 29 November 2023 

7 Bode and Watts (n 4) 

8 Ibid. 

9 Peter Burt, ‘Loitering munitions, the Ukraine war, and the drift towards “killer robots”’ (Drone Wars, 8 June 2022) 

10 Bode and Watts (n 4) 

11 Advisory Council on International Affairs and Advisory Committee on Issues of Public International Law, Autonomous Weapon Systems: The Need for Meaningful Human Control (October 2015) <https://www.advisorycouncilinternationalaffairs.nl/documents/publications/2015/10/02/autonomous-weapon-systems> accessed 12 December 

12 International Committee of the Red Cross (ICRC), ‘Views of the ICRC on autonomous weapon systems’ (Convention on Certain Conventional Weapons Meeting of Experts on Lethal Autonomous Weapons Systems, 11 April 2016) <https://www.icrc.org/en/document/views-icrc-autonomous-weapon-system> accessed 29 November 2023 

13 Congressional Research Service, ‘Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems’ (May 2023) <https://crsreports.congress.gov/product/pdf/IF/IF11150> accessed 29 November 2023 

14 Berenice Boutin, ‘Legal Questions Related to the Use of Autonomous Weapon Systems’ (Asser Institute, 2021) <https://www.asser.nl/media/795707/boutin-legal-questions-related-to-the-use-of-aws.pdf> accessed 12 December 2023 

15 Peter Warren Singer, Wired for war: The robotics revolution and conflict in the 21st century (Penguin 2009), cited in Bonnie Docherty and others, ‘Losing Humanity: The Case Against Killer Robots’ (Human Rights Watch, 2012) <https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots> accessed 12 December 2023

16 Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects 1980 (entered into force 1983) 

17 Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Draft articles on autonomous weapon systems – prohibitions and other regulatory measures on the basis of international humanitarian law (“IHL”) (March/May 2023) <https://docs-library.unoda.org/Convention_on_Certain_Conventional_Weapons_-Group_of_Governmental_Experts_on_Lethal_Autonomous_Weapons_Systems_(2023)/CCW_GGE1_2023_WP.4_Rev1.pdf> accessed 12 December 2023 

18 Council of Europe Parliamentary Assembly, Emergence of lethal autonomous weapons systems (LAWS) and their necessary apprehension through European human rights law (2022) 

19 Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts (Protocol I) (1977) 1125 U.N.T.S. 3, art. 36

20 International Committee of the Red Cross (ICRC), Commentary on the Additional Protocols of 8 June 1977 to the Geneva Conventions of 12 August 1949 (Geneva: Martinus Nijhoff Publishers, 1987)

21 AP I (n 19), art. 35(1) 

22 Ibid., art. 36 

23 Ibid., art. 48 

24 Human Rights Council, ‘Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns’ (9 April 2013) A/HRC/23/47 

25 American Military Tribunal, United States v List (The Hostage Case), Case No 7 (1948) 

26 AP I (n 19), art. 51(5)(b)

27 Bonnie Docherty and others, ‘Losing Humanity: The Case against Killer Robots’ (Human Rights Watch, 2012) <https://www.hrw.org/sites/default/files/reports/arms1112_ForUpload.pdf> accessed 29 November 2023 

28 International Criminal Tribunal for the Former Yugoslavia, Final Report to the Prosecutor by the Committee Established to Review the NATO Bombing Campaign Against the Federal Republic of Yugoslavia (2000) <https://www.icty.org/sid/10052> accessed 29 November 2023 

29 Jonathan Hasson and Ariel H Slama, ‘IHL’s Reasonable Military Commander Standard and Culture: Applying the Lessons of ICL and IHRL’ (2023) 58 Tulsa L Rev 183 

30 International Committee of the Red Cross, Customary IHL (2005) 

31 AP I (n 19), art. 1(2) 

32 Michael N Schmitt, ‘Military Necessity and Humanity in International Humanitarian Law: Preserving the Delicate Balance’ (2010) 50(4) Virginia Journal of Intl Law 795 

33 AP I (n 19), art. 35(3) 





© 2024 by ASA International Law.
