The Doomsday Clock ticks closer to catastrophe, but we can turn it back
The harmful trajectory of AI and automation is not inevitable. There is another path.
With the Bulletin of the Atomic Scientists setting the Doomsday Clock to 85 seconds to midnight earlier this week, it is evident that the world must act urgently to alter the self-destructive path we are on. This path is driven by an escalating arms race in disruptive technologies such as unregulated AI, autonomous weapons, nuclear arms and other dangerous weapons, each of which poses a significant threat to humanity and our environment.
For over 10 years, our campaign, Stop Killer Robots, has called for a new international regulation to protect people against the harms of autonomous weapons systems. These weapons can independently select and engage targets, whether civilians, combatants or critical infrastructure, without any human oversight. Lethal force is applied on the basis of pre-programmed AI or algorithmic decisions, leaving the human operator unaware of what the machine will actually do.
AI weapons and tools are already transforming the way current conflicts are fought, policing is conducted, and borders are managed. The dangers of the widespread use of AI weapons and autonomous weapons are clearer than ever.
We must hold political leaders accountable for ensuring that the extreme risks these technologies pose on the battlefield and in civilian life do not come to pass.
Throughout 2025, when the Bulletin of the Atomic Scientists set the Doomsday Clock at 89 seconds to midnight, the closest it had previously been to catastrophe, we saw increasing integration of AI and automation in warfare. In the genocide against the Palestinian people in Gaza, Israel used target suggestion systems such as Lavender, which drew on civilian data to mark people for strikes, and the Gospel. We also saw loitering munitions and a new generation of AI drones being tested on infantry soldiers in Ukraine. It is evident that AI and automation are becoming a fundamental part of how war is and will be fought.
On the battlefield and in the streets, the development of AI and automated weapons and tools, such as predictive policing, facial recognition and AI-assisted armed drones, is outpacing regulation, making everyone a potential victim of automated violence. Despite the devastating harms these technologies are already causing, the current trajectory of AI and automation is not inevitable.
Companies and militarised states have put us on this path, and the actions of people in government, together with those they represent, can take us off it. A different path is possible.
As we continue through a period of global instability where norms and rules are being bent and broken, we must collectively reject technology that dehumanises people and operates without human control. Machines should not make life-or-death decisions.
In November, State representatives will meet at the UN in Geneva for the CCW Review Conference, where they must urgently launch negotiations on an autonomous weapons treaty.
This week’s Doomsday Clock announcement underscores the crucial need for them to act in the interest of the citizens they are trusted to represent and protect. We must turn the clock back to protect our humanity; this issue is too important for inaction.
The CCW Review Conference occurs every five years and brings together the High Contracting Parties (HCPs) to the Convention on Certain Conventional Weapons (CCW), which prohibits or restricts weapons considered to be excessively injurious or to have indiscriminate effects. At this meeting, the HCPs review the operation, scope and effectiveness of the Convention. It is the UN forum in which autonomous weapons have been discussed for over 10 years. The 2026 Review Conference will take place 16-20 November at the United Nations in Geneva.