

Actually, as some of the main opponents of the would-be AGI creators, we sneerers are vital to the simulation’s integrity.
Also, since the simulator will probably cut us all off once they’ve seen the ASI get started, we are prolonging the survival of the human race by delaying and slowing down the rationalists’ quest to create AGI and ASI. Thus we are the most altruistic and morally best humans in the world!
Achkshually, Yudkowskian Orthodoxy says any truly super-intelligent minds will converge on Expected Value Maximization, Instrumental Goals, and Timeless Decision Theory (as invented by Eliezer), so clearly the ASI mind space is actually quite narrow.