From a fantasy perspective, for modeling incentives?
An equally powerful, cost-effective weapon that leaves no radiation damage and no burning city afterwards, even if there is still only a crater left.
That way you can have mutually assured destruction without destroying the rest of the world as collateral. It would be a technology replacement.
You would not lose any strategic capability, meaning you can make progress without the "timed-collective-action-threshold-conditional-commitment" thing that is never going to happen with the incentive structure as it is now.
There is no incentive for weapons that can destroy the entire world if the entire world is not aligned, only for weapons that can destroy nations.
That said, this line of thinking makes me sick. At the same time, disarmament seems impossible to me without something similar: you have to make them obsolete.
One idea I employ often is to "increase the temperature" of the system: non-essential complexity, bureaucracy, excessive variation of materials, and so on become more apparent through a lens of being more demanding than necessary, so that the physical limits of the problem dominate the solution.
Another idea I use is to mentally time travel and try to visualize how context-sensitive my decision/design/process is. Decisions that require a "superior" (or even accurate, for that matter) understanding of the context I'd frown upon. Even if they are not mistaken or biased, to me they promote a dynamic of continued survival through non-obvious and increasingly complex actions, rather than forcing a simple and obvious environment.
Another is engineering with a bias for optimizing recovery first, then reliability (sort of a minimax).
So, for example, combining these: I would rather have a way to recover from broken code and produce a teaching moment than try to prevent a developer from merging said code in the first place.
I'd rather exercise a process to recover all data from backups within a timeline the company would survive, before dedicating resources to redundancy and leaving that scenario for later due to its low probability.
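To make "exercise the process" concrete, here is a minimal sketch of what a scheduled restore drill could look like. The restore-tool CLI and the checksum manifest are assumptions for illustration, not any particular product; the point is that the restore path runs routinely and gets timed, instead of being tested only during a disaster.

```python
import hashlib
import subprocess
import sys
import tempfile
import time
from pathlib import Path


def checksum(path: Path) -> str:
    """SHA-256 of a file, streamed so large files stay out of memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def restore_drill(backup_id: str, manifest: dict[str, str]) -> bool:
    """Restore a backup into scratch space, verify every file, time the run.

    `manifest` maps relative paths to expected SHA-256 checksums.
    """
    start = time.monotonic()
    with tempfile.TemporaryDirectory() as scratch:
        # "restore-tool" is a hypothetical placeholder; substitute your backup CLI.
        subprocess.run(
            ["restore-tool", "restore", backup_id, "--target", scratch],
            check=True,
        )
        for rel_path, expected in manifest.items():
            if checksum(Path(scratch) / rel_path) != expected:
                print(f"MISMATCH: {rel_path}", file=sys.stderr)
                return False
    print(f"drill passed in {time.monotonic() - start:.0f}s", file=sys.stderr)
    return True
```

Run it on a schedule; the elapsed time it prints is the number you compare against what the business can actually tolerate.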
Take all of this with a "while there is value in the other thing, I value this more" wrap.
I like models that would force the space to be simpler and more obvious.
To the degree that you can pursue both or combine them, do so; do not try to live your life like https://www.xkcd.com/761/
To the degree that you have to make decisions under uncertainty, try to minimax.
To the degree that you can't reconcile what different pieces of you want, or that other people want of you, or that you want of other people, learn to negotiate.
To the degree that you are really stuck, the world works a little like Miegakure: changing the question might free you.
To the degree that those don't free you from uncertainty, reduce the scope of planning and relax the goals; expand them as you master the environment.
Not making a decision is itself a decision that guarantees failure and maximizes regret.
Advice can be given, but ultimately it is hit or miss, depending on how you tick.
What I would do is write an essay about it. You need to formulate your arguments and convince yourself, to your own standard. Answer your own questions, and do any research you might need if you are at a loss for words.
If you reach a question in this same class, apply this comment recursively.
This is the best I can do without attempting to write my own essay.
parallel is nice. In the past I would run a small script like https://gist.github.com/CMAD/3077918 because I had to migrate email accounts from a list, and it was all very old servers with broken package managers. Good old times; now I would have to run some super convoluted orchestration.
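For reference, the old pattern is tiny. A minimal sketch, assuming a hypothetical migrate-account command and an accounts file with one account per line (both made up here; the real work would be done by something like imapsync or a vendor API):

```python
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path


def migrate(account: str) -> tuple[str, int]:
    # "migrate-account" is a placeholder for whatever actually moves the mailbox.
    proc = subprocess.run(["migrate-account", account])
    return account, proc.returncode


def main(list_file: str) -> None:
    accounts = [
        line.strip()
        for line in Path(list_file).read_text().splitlines()
        if line.strip()
    ]
    # Threads are fine here: the heavy lifting happens in the subprocesses.
    with ThreadPoolExecutor(max_workers=8) as pool:
        for account, code in pool.map(migrate, accounts):
            status = "ok" if code == 0 else f"failed ({code})"
            print(f"{account}: {status}", file=sys.stderr)


if __name__ == "__main__":
    main(sys.argv[1])
```

GNU parallel collapses the same idea into something like a one-liner, e.g. `parallel migrate-account :::: accounts.txt`, which is part of why it's nice.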