Open Access Research Article

CONTEMPORARY DILEMMAS IN ENFORCING INTERNATIONAL HUMANITARIAN LAW DURING MODERN WARFARE CONTEXTS

Author(s):
AJAYITA SANDHU
Journal IJLRA
ISSN 2582-6433
Published 2024/05/14
Access Open Access
Issue 7

Abstract:
Emerging technologies are reshaping warfare, with nations investing heavily in digital tools like cyber capabilities, autonomous weapons, and AI. The International Committee of the Red Cross (ICRC) monitors these developments to apply international humanitarian law (IHL) effectively. Throughout history, legal frameworks have evolved to regulate new weaponry, emphasising principles to minimise suffering. However, applying existing laws to emerging technologies requires careful consideration. Cyber operations, for example, raise concerns about civilian harm and infrastructure disruption, necessitating adherence to IHL principles. Autonomous weapon systems pose challenges regarding human control, accountability, and ethical implications, prompting calls for international limits. AI and machine learning present concerns about decision-making, transparency, and bias in armed conflict, highlighting the need for human-centered approaches. Additionally, the weaponisation of outer space poses humanitarian risks, necessitating adherence to existing legal frameworks and multilateral efforts to minimise civilian harm. Legal evaluations of novel weaponry face obstacles like non-compliance and resource disparities, highlighting the need for standardised processes and collaboration. Strategies to address these challenges include early initiation of legal assessments, multidisciplinary approaches, training, and rigorous testing. The researcher aims to show how prioritising compliance with international legal norms and responsible innovation allows stakeholders to navigate the complexities of technological advancements in warfare while upholding humanitarian values.
 
Keywords: armed; artificial; conflict; technology; weapon
 
 
 
 
 
I. INTRODUCTION:
Emerging technologies are revolutionising human interactions, even in times of conflict, with many nations investing heavily in digital technology for warfare, including cyber tools, autonomous weapons, and AI. The International Committee of the Red Cross (ICRC) closely monitors these developments and collaborates with stakeholders to apply international humanitarian law (IHL) to these new methods. Throughout history, advancements in technology and weaponry have transformed warfare. From the chariot era to the nuclear age, innovations like gunpowder and airplanes have reshaped the battlefield. Efforts to regulate these advancements began with the St. Petersburg Declaration of 1868, limiting certain projectiles' use. Over time, international humanitarian law evolved to address new weaponry, emphasising minimising unnecessary suffering through overarching principles applicable to all methods of warfare and specific agreements banning or restricting particular weapons, such as chemical and biological agents or cluster munitions, to protect combatants and civilians from indiscriminate harm. As technology advances, the flexibility of international humanitarian law allows it to adapt to unforeseen developments, as outlined in Article 36 of Additional Protocol I. However, applying existing legal frameworks to emerging technologies requires careful consideration of their unique characteristics and potential humanitarian consequences, prompting some states to establish more tailored regulations. Today, information technology has proliferated onto the battlefield, enabling increasingly sophisticated weapons systems, including smaller, more affordable drones and remote infrastructure manipulation.
 
While such advancements can improve civilian protection by enabling more precise targeting and informed military decisions, they introduce new risks and challenges for combatants and civilians, necessitating a comprehensive assessment of their humanitarian impact. IHL applies to the development and use of these technologies in warfare, with states responsible for ensuring compliance. Recent conflicts highlight the enduring toll on civilians, with estimates suggesting 80 to 90 percent of casualties are non-combatants, emphasising the need for humanitarian aid and training of armed conflict participants, including military personnel and NGOs, in the laws and customs of warfare. Contemporary armed conflicts exhibit shifts, with civil wars and engagements against elusive terrorist groups increasingly common, posing fresh challenges to IHL. Advancements in weaponry, such as combat drones and lethal autonomous weapons, along with ongoing arms technology developments, introduce legal and ethical uncertainties. Addressing these challenges requires continuous collaboration among stakeholders to ensure the protection of human life and dignity amidst evolving warfare landscapes.
 
II. CYBER OPERATIONS, THEIR POTENTIAL HUMAN COST, AND THE PROTECTION PROVIDED BY IHL
The utilisation of cyber operations in armed conflicts is becoming increasingly prevalent, with a growing number of states developing military cyber capabilities. The International Committee of the Red Cross (ICRC) defines "cyber warfare" as operations targeting computers, networks, or connected devices, raising questions about how existing provisions of international humanitarian law (IHL) apply and whether further development of these laws is necessary. While cyber operations may offer alternatives to traditional methods of warfare, they also carry risks, such as disrupting essential services to civilian populations. Understanding the potential human cost of cyber capabilities, the ICRC convened experts to assess their technical aspects and effects. Cyber operations pose threats to sectors like healthcare due to increased digitisation, connectivity, and vulnerabilities in infrastructure. Additionally, attacks on critical civilian infrastructure, such as electrical grids, can have significant humanitarian consequences. Cyber operations also raise concerns about overreaction, proliferation of tools, and attribution challenges. Despite not causing major harm thus far, uncertainties persist regarding technological advancements and their implications. IHL provides protections against cyber warfare, including safeguarding medical facilities and personnel and prohibiting attacks on civilian infrastructure. While some cyber tools may be designed with precision, adherence to IHL principles, such as distinction, proportionality, and precautions, is essential. The ICRC emphasises the importance of recognising civilian data as protected under IHL and stresses the need for precautionary measures to minimise harm to civilians and civilian objects. Affirming the applicability of IHL to cyber warfare does not endorse militarising cyberspace but enhances civilian protection during hostilities. 
The ICRC will continue monitoring cyber operations' evolution and advocate for consensus on interpreting existing IHL rules and potentially developing additional regulations to protect civilians.
 
The use of digital technology beyond warfare purposes during armed conflicts has led to adverse effects on civilian populations, including misinformation campaigns and increased surveillance. While IHL does not explicitly prohibit such activities, it prohibits acts aimed at spreading terror among civilians and prohibits parties from encouraging IHL violations. International human rights law may also be relevant in assessing surveillance and disinformation. The digital transformation is also reshaping humanitarian action, offering opportunities to improve response efforts through data analysis and two-way communication with affected populations. However, it also necessitates enhanced digital literacy and data protection measures to mitigate risks. The ICRC encourages further research and action to ensure humanitarian organisations can adapt their operations safely to digital changes while upholding the principle of doing no harm.
 
III. AUTONOMOUS WEAPON SYSTEMS AND HUMAN CONTROL
The International Committee of the Red Cross (ICRC) defines autonomous weapon systems as those with autonomy in critical functions, enabling them to select and attack targets without human intervention. Unlike traditional weapons where users determine specific targets and timing, autonomous systems self-initiate attacks based on environmental cues and generalised target profiles. The loss of human control over the use of force raises significant humanitarian, legal, and ethical concerns, particularly regarding the potential risks to civilians and compliance with international humanitarian law (IHL).
 
From an IHL perspective, individuals responsible for planning, deciding, and executing military operations must comply with legal obligations. Human accountability for the effects of weapon systems remains paramount, regardless of the level of autonomy. Existing IHL rules, such as those on distinction, proportionality, and precautions in attack, provide frameworks for assessing and mitigating risks during armed conflicts. Combatants must retain human control over weapon systems to make context-specific judgments and ensure compliance with these rules.[1]
 
Human control over autonomous weapon systems occurs at various stages: during development, activation, and operation. While control measures during development can establish parameters for human oversight, they alone are insufficient to ensure compliance. The most critical aspect is human control during activation and operation, allowing for real-time decision-making aligned with IHL principles. However, existing IHL rules do not address all concerns regarding autonomous weapon systems comprehensively. Questions remain about the level of human supervision, intervention, and the ability to deactivate such systems. Additionally, ethical considerations, such as the loss of human agency in decisions to use force and the diffusion of moral responsibility, may necessitate additional limitations beyond legal requirements.[2]
 
The ICRC advocates for internationally agreed limits on autonomous weapon systems to uphold compliance with IHL and protect humanity. While existing IHL rules provide some constraints, further clarification is needed regarding the degree of human control necessary in practice. Ethical concerns, particularly regarding systems designed to target humans directly, may require more stringent limitations or prohibitions. Addressing these challenges requires urgent agreement on the type and extent of human control required to ensure both legal compliance and ethical acceptability in the use of autonomous weapon systems. The ICRC continues to advocate for dialogue and consensus-building among stakeholders to address these complex issues effectively.[3]
 
IV. ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
Artificial intelligence (AI) and machine learning systems are sophisticated computer programs capable of performing tasks that traditionally require human intelligence, such as cognition, planning, reasoning, or learning. These systems are distinct from simple algorithms and have broad applications across various domains. In the context of armed conflict and humanitarian efforts, AI and machine learning present several areas of concern and consideration.[4]
 
The first area of concern is the use of AI and machine learning to control military hardware, particularly unmanned robotic systems. AI may enhance the autonomy of these systems, potentially leading to concerns about loss of human control, especially in the case of autonomous weapon systems. While not all autonomous weapons incorporate AI, the development of AI-enabled software for tasks like automatic target recognition could contribute to future autonomous weapon systems, raising questions about unpredictability and ethical implications. The second area of concern involves the application of AI and machine learning in cyber warfare. AI-enabled cyber capabilities could increase the speed, frequency, and severity of cyber attacks, including the creation and dissemination of false information for propaganda purposes.[5] These developments have significant implications for discussions surrounding the humanitarian impact of cyber warfare, as digital risks pose real dangers for civilians. The third and perhaps most consequential area of concern is the use of AI and machine learning for decision-making purposes. AI systems can analyse vast amounts of data to identify patterns, make recommendations, or predict future actions, potentially influencing decisions ranging from military strategy to specific operations, including targeting and detention practices. While AI systems may improve decision-making in compliance with international humanitarian law (IHL), they also pose risks of wrong decisions, IHL violations, and increased harm to civilians due to factors like unpredictability, lack of transparency, and bias.
 
A human-centered approach is essential in navigating the use of AI and machine learning in armed conflict. Preserving human control and judgment is crucial, particularly in tasks and decisions governed by IHL rules and where the consequences may impact human lives. These technologies should serve to augment and improve human decision-making rather than replace it entirely. Ensuring human control and judgment, especially in situations with significant risks to human life and dignity, is necessary for IHL compliance and maintaining a humane approach to armed conflict. Human-AI interaction will vary depending on the specific application, associated consequences, relevant legal frameworks, and ethical considerations. However, alongside human control, building trust in AI systems requires ensuring predictability, reliability, transparency, and lack of bias in their operation and consequences. Weapon reviews and other mechanisms can play a crucial role in assessing and verifying the functioning of AI systems to uphold ethical and humanitarian standards in armed conflict.
 
V. WEAPONISATION OF OUTER SPACE
The integration of space assets into military operations has been a significant aspect of warfare for many years. This involves utilising satellite imagery for target identification, employing satellite communication systems for command and control purposes, and more recently, utilising satellites for remotely controlled military operations. However, the weaponisation of outer space could escalate hostilities in space, potentially resulting in severe humanitarian consequences for civilians on Earth. While the exact extent of these consequences remains uncertain, it is evident that the use of weapons in outer space, whether through kinetic or non-kinetic means like electronic or directed energy attacks, could directly or indirectly disrupt, damage, or destroy civilian or dual-use space objects essential for various civilian activities and services. These include navigation satellite systems vital for civilian transportation, weather services crucial for disaster prevention, and satellite phone services essential for humanitarian assistance and emergency relief efforts.
 
The use of weapons in outer space is subject to existing legal frameworks, including the Outer Space Treaty, the UN Charter, and International Humanitarian Law (IHL), which governs the conduct of warfare. IHL applies to military operations in outer space, regardless of the legality of the use of force under the UN Charter. It aims to uphold a level of humanity in armed conflicts, particularly by protecting civilians. The Outer Space Treaty prohibits the placement of weapons of mass destruction in space, the installation of such weapons on celestial bodies, and the establishment of military bases or conduct of military activities on celestial bodies. Additionally, IHL prohibits indiscriminate attacks and requires parties to distinguish between military and civilian objects, taking precautions to minimise harm to civilians and civilian objects.
 
However, even when using weapons not explicitly prohibited, parties must adhere to IHL rules governing the conduct of hostilities. This includes avoiding targeting civilian objects and considering the potential impact on civilians when attacking dual-use satellites. Furthermore, the creation of space debris resulting from kinetic attacks poses risks to other satellites supporting civilian activities and services.
 
The potential humanitarian consequences of weaponising outer space underscore the importance of multilateral efforts to address these risks. Recognising the protections provided by IHL and acknowledging the significant civilian harm that could result from the use of weapons in space are essential steps in mitigating these risks. States have the option to agree on additional rules or limitations to minimise civilian harm and uphold humanitarian principles. In conclusion, while the use of space for military purposes is not new, the weaponisation of outer space presents significant humanitarian challenges. Adhering to existing legal frameworks and exploring additional measures to reduce civilian harm are crucial in navigating the complexities of space warfare and preserving humanitarian principles.
 
VI. CHALLENGES PRESENTED BY SPECIFIC EMERGING TECHNOLOGIES OF WARFARE TO LEGAL EVALUATIONS OF NOVEL WEAPONRY
New technologies have consistently been significant in warfare, from ancient weapons like the crossbow to modern cyber tools. Once developed, such technologies tend to be utilised in military operations, as noted by U.S. Deputy Secretary of Defence William J. Lynn, III, who stated that few weapons throughout history remain unused. The integration of new technologies into military strategies poses intricate legal challenges within the framework of armed conflict law. Legal progress often reacts to specific situations, with real-world circumstances driving the need for legal regulations. This is particularly evident in international law, where treaties and customary practices, rooted in consensus, serve as the primary legal sources. Nevertheless, law also functions as a signalling mechanism, offering guidance to nations as they weigh the incorporation and militarisation of emerging technologies.
 
In tandem with these constraints, international law, notably Article 36 of the 1977 Additional Protocol to the 1949 Geneva Conventions[6], mandates that states evaluate whether the use of new weaponry complies with international legal standards. This process, known informally as a 'weapon review', 'legal review', or 'Article 36 review', is essential for ensuring that the adoption of novel technologies aligns with humanitarian principles. The significance of conducting such reviews is widely acknowledged, particularly in light of ongoing advancements in civilian and military technologies. The thorough assessment of Article 36[7] reviews is pivotal in determining whether the adoption of new technologies poses significant humanitarian concerns. Moreover, these evaluations ensure that the armed forces of states can engage in hostilities while upholding their international obligations. As the landscape of technology continues to evolve, the importance of these reviews becomes increasingly emphasised, underscoring their role in maintaining compliance with international legal standards amidst technological advancements.

The effectiveness of Article 36 reviews in regulating technological advancements can face several obstacles. Primarily, there is a notable issue of non-compliance, as only a limited number of states have established formal mechanisms for Article 36 reviews. Additionally, Article 36 lacks specific guidelines on how states should formalise this process, resulting in variations in format, methodology, mandate, and authority level among countries. Furthermore, disparities in resources and expertise among states may hinder their ability to fulfil the obligations outlined in Article 36, especially as new technologies increase the complexity and costliness of the review process.[8]
 
The lack of widespread adherence to Article 36 and the varying approaches to conducting reviews, coupled with the complexities introduced by new technologies, raise concerns among civil society organisations focused on arms control and humanitarian issues. They express skepticism regarding Article 36's adequacy in preventing the development or usage of weapons that breach international law. Consequently, they propose the need for new regulations or prohibitions to oversee specific categories of emerging military applications, such as lethal autonomous weapon systems.
 
VII. CONCLUSION AND RECOMMENDATIONS
Several strategies can address the challenges outlined above, focusing on (a) leveraging existing best practices, (b) fostering transparency and collaboration in Article 36 reviews, and (c) supporting targeted research. Key guidelines include:
  • Early initiation: commencing the review process at the project study stage and integrating it into procurement decisions.
  • Multidisciplinary approach: involving expertise from legal, medical, operational, and technical fields to comprehensively assess new technologies.
  • Training: providing technical training to military lawyers and educating engineers on international legal requirements.
  • Testing and evaluation: conducting thorough tests and evaluations, potentially utilising computer simulations to reduce costs.
 
In conclusion, the rapid progress of new technologies in warfare poses intricate legal and humanitarian dilemmas. As countries heavily invest in digital tools like cyber capabilities, autonomous weapons, and AI for military purposes, it's vital to uphold compliance with international humanitarian law (IHL) and safeguard civilians amidst changing conflict scenarios. The International Committee of the Red Cross (ICRC) assumes a crucial role in monitoring these advancements and advocating for adherence to IHL. Regulating these emerging technologies requires a comprehensive approach, including early initiation of legal assessments, involvement of diverse expertise, ongoing training, and rigorous testing. Openness and cooperation in conducting Article 36 reviews are vital to ensure adherence to standards and manage risks linked to new weaponry. Recommendations encompass drawing upon established practices, fostering transparency and cooperation in Article 36 evaluations, and backing targeted research efforts. Prioritising compliance with international legal norms and responsible innovation allows stakeholders to navigate the complexities of technological advancements in warfare while upholding humanitarian values.


[1] Paul Scharre, Army of None: Autonomous Weapons and the Future of War (New York: W.W. Norton, 2018), p. 305.
[2] McFarland, T. and Assaad, Z., Legal reviews of in situ learning in autonomous weapons, Ethics and Information Technology 25, 9 (2023). https://doi.org/10.1007/s10676-023-09688-9.
[3] McFarland, T. (2020). Autonomous Weapon Systems and the Law of Armed Conflict: compatibility with International Humanitarian Law. Cambridge University Press. https://doi.org/10.1017/9781108584654.
[4] Robert O. Work, "Preface," in Artificial Intelligence: What Every Policymaker Needs to Know, ed. Paul Scharre and Michael C. Horowitz (Washington, DC: Center for a New American Security, June 2018), p. 2.
[5] Lewis, D. (2019, March 21). Legal Reviews of Weapons, Means and Methods of Warfare Involving Artificial Intelligence: 16 Elements to Consider. Humanitarian Law & Policy. https://blogs.icrc.org/law-and-policy/2019/03/21/legal-reviews-weapons-means-methods-warfare-artificial-intelligence-16-elements-consider/.
[6] 1977 Protocol I Additional to the Geneva Conventions, and Relating to the Protection of Victims of International Armed Conflicts, opened for signature 12 Dec. 1977, entered into force 7 Dec. 1978.
[7] International Committee of the Red Cross (ICRC), A Guide to the Legal Review of Weapons, Means and Methods of Warfare (ICRC: Geneva, 2006).
[8] McClelland, J., The review of weapons in accordance with Article 36 of Additional Protocol I, International Review of the Red Cross, vol. 85, no. 850 (June 2003), p. 404.

About Journal

International Journal for Legal Research and Analysis

  • Abbreviation IJLRA
  • ISSN 2582-6433
  • Access Open Access
  • License CC 4.0

All research articles published in International Journal for Legal Research and Analysis are open access and available to read, download and share, subject to proper citation of the original work.


Disclaimer: The opinions expressed in this publication are those of the authors and do not necessarily reflect the views of International Journal for Legal Research and Analysis.