General Statement by the GCSP at the First session of the 2024 CCW Group of Governmental Experts on emerging technologies in the area of lethal autonomous weapons systems (LAWS)

By Simon Cleobury, Head of Arms Control and Disarmament

Mr Chair,

As this is our organisation’s first time taking the floor, please allow us to thank you and your team for your efforts in organising this meeting. The GCSP welcomes this year’s mandate as a concrete step that moves the discussion forward, and we pledge our organisation’s full support.

The GCSP firmly believes that the work of the group should remain grounded in a realistic assessment of the technologies underpinning Lethal Autonomous Weapon Systems and of the ways in which these technologies are used at present. We would therefore remind the group that a sober assessment of the state of autonomous weapons systems is needed. In 2024, it is high time the group recognised that we are already dangerously close to the full realisation of LAWS. A recent RUSI study found that “the software, hardware and expertise necessary to develop a minimally functioning LAWS are widely available.” Evidence from the battlefields of Ukraine and Gaza further reveals an already highly autonomous battlefield, where decision-making, targeting and even target engagement are heavily algorithmically assisted, with varying levels of actual human oversight and involvement. These active warzones are acting as a catalyst for the development and deployment of such weapons. The GCSP urges the group to draw more directly on evidence from the real-world use of these autonomous capabilities in its work.

Should belligerents continue their efforts in this domain with tangible battlefield results, and absent international regulation, the development and deployment of these capabilities will greatly accelerate worldwide. Chair, it would seem the pressures of the battlefield are stronger than the normative and ethical pressures currently holding back these weapons. The time to act is now.

In light of the group’s convergence towards the need for some level of human control over the use of autonomous weapons to guarantee their compliance with IHL, and with a view to discussing elements of an instrument, the GCSP encourages states to use the 2024 GGE to further clarify the human control requirements for the appropriate use of autonomous weapons systems, and to consider making these clarifications part of any instrument resulting from the group’s work. Such specifications are direly needed, lest states resort to instances of so-called “nominal human control”.

Chair, research into automation bias clearly shows that the more autonomous a system is, the more humans tend to defer to its suggestions. Increasingly autonomous weapons systems may therefore result in increasingly “performative” or “nominal” human control measures. Failure to clarify what amounts to “meaningful” human control, the kind that actually ensures compliance with IHL, will result in weapons systems being developed with human oversight mechanisms that are “meaningful”, “effective” or “appropriate” in name only. This would not be a satisfactory level of human involvement: such oversight would legitimise these weapons systems in the eyes of IHL without acting as an effective failsafe. Whatever the qualifier, it should refer to control in which a human has the necessary contextual understanding and the cognitive and physical capacity to critically engage with a system’s suggestions or actions. The GGE should therefore not rely on vague commitments to “human control”, without clarity as to what that looks like in practice, as a safeguard for IHL compliance.

Chair, these assertions are based not on thought exercises but on evidence from current conflicts. Contrary to what some delegations have argued, we have clear evidence today that AI-enabled targeting does not result in fewer civilian casualties. We also have evidence of “nominal” or “performative” human control and of its devastating effects. These instances underscore the urgent need to define the required degree of human control, as the only way for human oversight to be an actual guarantor of IHL compliance. As part of these discussions, it will also be important for states to consider how compliance with the agreed definition could be verified.

Finally, Chair, the GCSP would like to emphasise the growing momentum outside the GGE format as evidence of an emerging global consensus on the need for regulation in this area. The GGE should not be sealed off from these developments. The CCW remains the appropriate forum for these discussions and should act as an amplifier of this outside momentum, focusing on concrete outcomes and showing the way for future processes on the use of AI in the military domain.

Thank you.

Simon Cleobury is Head of Arms Control and Disarmament at the GCSP. He is a former British Deputy Disarmament Ambassador (2017–2023), in which role he represented the UK at the Conference on Disarmament and other disarmament fora in Geneva. Before that, he served in the Security Council Team and then the Peacebuilding Team at the UK Mission to the UN in New York (2012–2016). Prior to his diplomatic career, he was a corporate lawyer with the global law firm Baker McKenzie. Simon holds a Bachelor’s degree in Modern History from University College London and a Master’s degree in Historical Research from Oxford University. He studied law at BPP Law School, London.