Keep the Coalition Large
We’ve heard some people argue that we should take a strong stand against AI art or robotic weapons, in order to send a simpler message: not anti-superintelligence, but anti-AI.
Setting aside the merits of the various positions on AI art, deepfakes, and so on, we don’t think that this is the best option politically. We want to build a coalition to ban superintelligence. We consider this issue extraordinarily urgent and pressing, and we want this coalition to be as large as possible, including people with a wide variety of views on AI art, drone warfare, self-driving cars, use of AI in schools, and so on.
We all have a common interest in preventing the creation of rogue superintelligence, regardless of our stance on other issues.
Should humanity go extinct and be replaced by something bleak? Everyone who agrees that the answer is “no” can cooperate in an urgent effort to stop the scramble for superintelligence.
We don’t think the coalition will survive if you say you can’t work with anyone who disagrees with you about AI art or drone warfare.
The coalition won’t survive if a bundle of other issues gets packaged with superintelligence, either, such that everyone needs to agree on a long list of semi-related questions before they’ll work together on superintelligence.
If you care about other AI-related issues, we urge you to work on addressing them. But we ask that you not package those issues with superintelligence. If we’re going to make it through this, nothing should be packaged with the survival of humanity.
Part of why you live in whichever country you live in, and not in a heap of radioactive rubble left in the wake of World War III, is that decades ago the East and the West managed to agree that nuclear war was a realistic and serious threat to humanity. Each side also said that the other was an additional terrible threat to humanity. But they wisely treated the two threats — nuclear annihilation versus ideological defeat — as different in kind.
From the perspective of the West, it was better that humanity should be less threatened by nuclear war even if still threatened by the East, which meant cooperating with the East long enough to lay a direct line between Washington and Moscow and to negotiate nonproliferation treaties and other arrangements.
Too many countries need to coordinate, and too many factions are divided (even internally), for it to be possible to avert catastrophe if only people who agree on everything can act together.
We happily and unreservedly make common cause with people who are concerned about other issues in the world. We will unhesitatingly work with people we disagree with politically. We have issued this desperate message to the world because we believe it, and because we think this problem needs to be addressed immediately at the international level.
Whoever you are — whatever you’re fighting for, here or elsewhere — if you want an end to the breakneck development of smarter-than-human AI, we’re in this fight together.