The integration of AI and automation into nuclear command and control would mark a new era of exponential risk to humankind.
There have been several moments in history when launch orders were issued, or when protocol, based on the available information, called for a nuclear strike. Catastrophe was narrowly avoided in these moments only because human judgement intervened to disregard protocol or to decline to carry out the orders to launch. On many other occasions, technical errors, breakdowns in communication, and weather-related phenomena have caused dangerously close false alarms. Almost 80 years into the nuclear era, we have survived not because of wise leaders, sound military doctrine, or infallible technology, but because of luck.
If human agency is removed from nuclear command and control, the fate of humanity will rest with self-taught machines making the snap decision of whether to launch on warning. The increased application of advanced machine learning in defense systems can speed up warfare, giving decision-makers even less time to consider whether to launch nuclear weapons. Turning control of nuclear weapons over to autonomous AI is irresponsible madness.
IPPNW’s calls to decision makers
1. As we work toward the complete abolition of nuclear weapons, the N-9 (China, France, India, Israel, North Korea, Pakistan, Russia, the United Kingdom, and the United States) must not integrate AI into nuclear weapons command and control systems. Humans must maintain control over the use of nuclear weapons.
2. Nuclear power, including Small Modular Reactors, is not the solution to the massive amounts of energy required to meet the needs of AI expansion.
This work is coordinated by an IPPNW working group on AI and nuclear threat established at IPPNW’s 24th World Congress held in Nagasaki in October 2025. For more information contact Molly McGinty.
Autonomous Armageddon: Nuclear Weapons and AI
Webinar from IPPNW, ICAN, and Pugwash, January 2025.
The webinar explored the alarming dangers posed by the integration of artificial intelligence (AI) into nuclear weapons systems, featuring expert speakers:
- Terumi Tanaka, Co-Chairperson of Nihon Hidankyo, 2024 Nobel Peace Prize;
- Professor Geoffrey Hinton, 2024 Nobel Prize in Physics;
- Connor Leahy, CEO of Conjecture (AI safety research);
- Dr. Ruth Mitchell, neurosurgeon and Chair of IPPNW, 1985 Nobel Peace Prize;
- Melissa Parke, Executive Director of ICAN, 2017 Nobel Peace Prize; and
- Moderator: Professor Karen Hallberg, Secretary General of Pugwash Conferences on Science and World Affairs, 1995 Nobel Peace Prize.