Roko's general answer to this predicament was the "quantum billionaire trick": you place a few thousand dollars on stock market bets which have a low probability of succeeding (and a correspondingly high payoff, like 10000-to-1), having committed to spend any winnings on charitable futurist causes like friendly AI and extinction risk. So in the Everett multiverse branches where you do win the market lottery, all that actually happens, and you thereby appease that particular god. He never quite spelled out a strategy for appeasing all the possible gods all at once, but he implied that it could be done.
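The arithmetic behind the trick is straightforward. A minimal sketch with illustrative numbers (the stake, odds, and win probability here are assumptions for demonstration, not figures from Roko's post):

```python
# Illustrative sketch of the "quantum billionaire trick" arithmetic.
# All numbers are assumed for demonstration purposes.

stake = 2_000        # a few thousand dollars wagered
odds = 10_000        # a 10000-to-1 payoff
p_win = 1 / odds     # assume the odds roughly reflect the win probability

payout = stake * odds            # the fortune in a branch where the bet pays off
expected_value = p_win * payout  # classical expected value of the gamble

print(payout)          # 20000000
print(expected_value)  # 2000.0
```

At fair odds the expected value is simply the stake back; the trick's appeal rests entirely on the Everettian claim that the winning branch is guaranteed to exist somewhere, not on the bet being a good one on average.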
These interactions are supposed to be an exercise in acausal trade. To interact with an entity that isn't actually present to you (because it's far away in space and time, or in another universe entirely), you simulate it, and it simulates you. Mutual simulation is the substitute for ordinary causal interaction.
So the interaction can't get going if you refuse to "simulate" (or imagine or think about) the other being. Or, turning that around, you have to be looking for trouble - fishing for acausal deals - to find yourself trapped in a bad one. That is what Eliezer was trying to prevent - the general possibility that some adventurous fool would stumble into a situation of acausal blackmail, leaving their distant avatar as a hostage to transhuman torture in a variety of virtual hells. At Reddit, he even hinted at the possibility that a genuinely friendly AI might have to get involved, and (acausally) bargain for the rescue of these lost souls.
The obvious criticism is epistemological. It's easy enough to imagine a copy of yourself, a prisoner of the Matrix in Dimension X; but how the hell do you know that Dimension X exists? And how do they know about you? How does either side tell that a genuine acausal trade is going on, as opposed to a simple fantasy interaction? One reason these ideas are taken seriously is that they are pursued in a multiverse context, where all possibilities exist; but even supposing that you could justify that belief, you would still need to justify attending specifically to Dimension X, as opposed to Dimension X-prime, where they want to crown your copy as emperor of the galaxy.
"The screaming vapours over Roko's Basilisk tell us more about the existential outlook of the folks doing the fainting than it does about the deep future."
This applies not just to the people who took it seriously in the first place, but to everyone now gleefully piling on in the public discussion. The original basilisk was a case study in taking some peculiar metaphysics seriously; the current public discussion is a case study in people happily opining about something they do not understand. I'm learning that it's not just journalists who are wrong about everything (a lesson usually learned when you see how the media treat a subject you know something about); everyone does it.