This is unironically the most interesting accidental showcase of their psyche I've seen 😭 all the comments calling this a convincing sim argument when half of the points in its favor aren't even points
Usually their arguments give me anxiety, but this one is just deluded lol
Ackshually, Yudkowskian Orthodoxy says any truly superintelligent mind will converge on Expected Value Maximization, Convergent Instrumental Goals, and Timeless Decision Theory (as invented by Eliezer), so clearly the ASI mind space is actually quite narrow.
If that's true, then how has he maintained whatever passes for his career in sci-fi whining these days?
Ackshually, my measure assigns 0 to ASI minds and 1 to meat-sac minds, therefore μ({bio bois}) ≫ μ({ASI})
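For anyone who wants the gag spelled out, here is the parody "argument" as a worked equation; the mind-space Ω, the sets Bio and ASI, and the measure μ are just names made up for the joke, not anything from the original post:

```latex
% Parody anthropic measure over mind-space (all symbols here are illustrative).
% The measure is defined so that the desired conclusion is already baked in.
\[
  \Omega = \mathrm{Bio} \cup \mathrm{ASI}, \qquad
  \mu(\mathrm{ASI}) = 0, \qquad
  \mu(\mathrm{Bio}) = 1.
\]
% "Therefore" a randomly sampled mind is biological with probability
\[
  \Pr(\text{mind} \in \mathrm{Bio})
  = \frac{\mu(\mathrm{Bio})}{\mu(\mathrm{Bio}) + \mu(\mathrm{ASI})}
  = \frac{1}{1 + 0} = 1,
\]
% i.e. the conclusion follows only because the measure was chosen to give it.
```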