No offense, but this way of thinking is the domain of comic book supervillains. "I must destroy the world in order to save it." "Morality is only holding me back from maximizing the value of the human race 1,000 or 1,000,000 years from now" type nonsense.
This sort of reasoning sounds great from 1,000 feet up, but the longer you follow it, the closer you get to "I need to kill nearly all current humans to eliminate genetic diseases and control global warming, and institute an absolute global rationalist dictatorship to prevent wars, or humanity is doomed over the long run."
Or you get people working in a near panic to bring about godlike AI, because they think that once the singularity happens, the new AI god will look back through history and kill anybody who didn't work their hardest to bring it into existence. They assume an infinite mind will contain infinite cruelty.
The correct answer is probably that it could contain an infinite amount of any of these qualities, but you don't know which one it's going to be a priori, and you get one shot to be right.