13 May 2024
Our latest report, ‘Artificial Intelligence And Related Technologies In Military Decision-Making On The Use Of Force In Armed Conflicts: Current Developments And Potential Implications’, highlights and examines some of the themes that arose during two expert workshops on the role of AI-based decision support systems (AI DSS) in decision-making on the use of force in armed conflicts. Drafted by Anna Rosalie Greipl, Geneva Academy Researcher, with contributions from Neil Davison and Georgia Hinds of the International Committee of the Red Cross (ICRC), the report aims to provide a preliminary understanding of the challenges and risks related to such use of AI DSS and to explore what measures may need to be implemented in their design and use to mitigate risks to those affected by armed conflict.
Anna Rosalie Greipl explained, ‘The introduction of AI to DSS for military decision-making on the use of force in armed conflicts adds a new dimension to the existing challenges of non-AI-based DSS and raises legal and conceptual issues distinct from those posed by Lethal Autonomous Weapon Systems. These systems carry the potential to reduce the human judgment involved in military decision-making on the use of force in armed conflicts in a manner that raises serious humanitarian, legal, and ethical questions.’
Georgia Hinds went on to underscore the critical role of human judgment in addressing these concerns. ‘It’s crucial to preserve human judgment when it comes to decisions on the use of force in armed conflicts. And this will have implications for the way these systems are designed and used, particularly given what we already know about how humans interact with machines, and the limitations of these systems themselves.’
The report shows that certain technical limitations of AI DSS may be insurmountable and that many existing challenges of human-AI interaction are likely to persist. It may therefore be necessary to adopt additional measures and constraints on the use of AI DSS in military decision-making on the use of force, both to reduce risks for people affected by armed conflicts and to facilitate compliance with international humanitarian law (IHL). To that end, further research and dialogue will be needed to better understand the precise measures and constraints that may be required for the design and use of AI DSS. Further analysis will also be needed to identify which applications of AI DSS in this context have the greatest impact on decisions on the use of force.
This report is part of the joint initiative ‘Digitalization of Conflict: Humanitarian Impact and Legal Protection’ between the ICRC and the Swiss Chair of International Humanitarian Law at the Geneva Academy of International Humanitarian Law and Human Rights. Other themes addressed under the project include the societal risks and humanitarian impact of cyber operations and, more recently, the legal implications of the rising civilian involvement in cyber operations.