Artificial Intelligence and Arms Control – How and Where to Have the Discussion

By Simon Cleobury, Head of Arms Control and Disarmament

The UN Security Council will discuss the implications of artificial intelligence for the maintenance of international peace and security for the first time in July 2023. The impact on arms control is a crucial element. So far, though, discussions have been limited and disjointed. 

The UN Security Council meeting on artificial intelligence (AI) on 18 July, organised by the United Kingdom (UK), will be the first time that the body has discussed the implications of AI for international peace and security. The meeting will set out the opportunities that AI is likely to open up for the UN’s peace and security, humanitarian, and development pillars, as well as the risks it poses, such as the military adoption of AI-enabled capabilities and the design of new weapons of mass destruction. 

The rapid developments in AI will affect every sector of society; more specifically, they are likely to have a significant impact on arms control. Disarmament diplomats in Geneva are already talking about some aspects of this emerging technology, but discussions so far have been limited and without significant agreement. This GCSP In Focus will analyse the options for future discussions. 

The story so far 

The Conference on Disarmament (CD) has a long-standing agenda item on new types of weapons of mass destruction, which is interpreted to mean new technologies. The most recent discussion of this agenda item occurred last month under the French presidency. The CD has periodically established a subsidiary body with a broad mandate to address “emerging technology”. However, these discussions have only been an exercise in risk flagging, and no substantial proposals for negotiations are on the table. 

The UN General Assembly’s First Committee on International Security and Disarmament has taken some tentative steps towards discussing the issue. In 1996 Belarus first introduced a resolution entitled “Prohibition of the development and manufacture of new types of weapons of mass destruction and new systems of such weapons”, which calls on the CD to identify possible areas for negotiation. The resolution is now tabled every three years and is expected to be tabled again in October this year. 

In 2018 India introduced a new First Committee resolution entitled “Role of science and technology in the context of international security and disarmament”. It makes a broad call for states to remain vigilant about scientific developments and asks the UN Secretary-General to report on current developments. 

For the last ten years Geneva has hosted talks on Lethal Autonomous Weapons Systems (LAWS) under the framework of the Convention on Certain Conventional Weapons (CCW). So far, these discussions have produced a set of 11 guiding principles, stressing the full applicability of international law and the need for human responsibility for decisions. However, there has been no agreement to start negotiations on a legally binding instrument to regulate the development of LAWS. 

In February this year the Dutch government hosted a Summit on Responsible AI in the Military Domain (REAIM). The summit attendees issued a call to action urging a better understanding of military AI and a multistakeholder dialogue on the issue, particularly with the private sector. However, there were no specifics about how or where such discussions could take place, other than the promise of a follow-up summit in Seoul next year. 

Possible options 

The simplest option would be to bring all discussions of AI and arms control under one roof. In an ideal world an entirely new body could be established that allowed for intergovernmental discussions and active engagement from civil society and the private sector. This body could review scientific developments and make recommendations for regulation, which could then be taken up in the appropriate forum. 

However, many would argue that we do not need yet another body, and that it would be more realistic to establish a working group within an existing structure. The CD could set up a permanent committee and ask it to make recommendations for negotiations on specific areas. Given that agreeing on anything in the CD is difficult, the UN General Assembly is a more realistic option. The recent Open-Ended Working Groups on cyber and space, set up to come to common understandings on behaviours in these areas rather than the technology itself, are useful examples. We could soon see a resolution on a topic like the military use of AI, although it would take a few more years to start a new process. 

Less ambitiously, states could decide to make emerging technology one of the two subjects addressed in the next three-year cycle of the UN Disarmament Commission, but the Commission only meets for three weeks each year. Some have called for the Secretary-General’s Advisory Board on Disarmament Matters to opine on the topic; however, its recommendations are unlikely to carry much weight. 

Governments may prefer to keep the current approach of discussing science and technology issues under the respective legal instruments. Discussions in the CCW could be expanded to cover other aspects of AI, rather than focusing narrowly on LAWS. Under the Biological Weapons Convention, it is hoped that states parties are close to agreeing a mechanism to review and assess scientific and technological developments relevant to biological security. The Treaty on the Prohibition of Nuclear Weapons has also established a Scientific Advisory Group to review scientific developments. It is unlikely that such a group would be established under the Nuclear Non-Proliferation Treaty, but there is nothing to stop states parties from raising the issue there. 

There is always the option of taking discussions outside existing structures. For example, those who argue for a legally binding instrument on LAWS could set up their own negotiations outside the CCW. However, any resulting instrument would be less effective if militarily significant countries did not participate. 

Another option outside formal structures is that of summits. The UK has announced its intention to host a Summit on AI Safety this autumn. It is to be hoped that the organisers will include an arms control element. 

Away from the multilateral arena, emerging technology could be included in bilateral arms control discussions or added to the agenda of the meetings of the Nuclear Five – China, France, Russia, the UK and the United States – known as the P5 Process. Sharing thinking on these issues is an important transparency and confidence-building measure in itself. 

Ultimately, though, in an era of renewed great power tensions, the major military countries will be wary of regulating too soon, for fear of stifling innovation and constraining themselves unnecessarily. 

Conclusion 

Given this context, disarmament diplomats must focus on the art of the possible. In the absence of agreement on a single perfect option, they should maximise opportunities within existing structures and see what works. They must also be alive to new opportunities. Next year’s Summit of the Future could be a catalyst for future discussions. If the long-talked-about UN General Assembly’s Fourth Special Session on Disarmament ever happens, governments could create a dedicated emerging technology agenda item or body within the disarmament architecture. 

Finally, AI and arms control should not operate in a vacuum. It would be a highly constructive step to see the disarmament community engaging with the many other actors working on AI, notably at the International Telecommunication Union and the World Intellectual Property Organization. In the same spirit, others interested in the implications of AI should pay attention to disarmament discussions. This should include the Security Council. The 18 July meeting on AI is a great way to start. 

Simon Cleobury is Head of Arms Control and Disarmament. He is a former British Deputy Disarmament Ambassador (2017–2023), a role in which he represented the UK at the Conference on Disarmament and other disarmament fora in Geneva. Prior to that he worked in the Security Council Team and then the Peacebuilding Team at the UK Mission to the UN in New York (2012–2016). Before his diplomatic career, he was a corporate lawyer with the global law firm Baker McKenzie. Simon obtained a Bachelor’s degree in Modern History at University College London and a Master’s degree in Historical Research from Oxford University. He studied law at BPP Law School, London.

Disclaimer: The views, information and opinions expressed in this publication are the author’s/authors’ own and do not necessarily reflect those of the GCSP or the members of its Foundation Council. The GCSP is not responsible for the accuracy of the information.