I'd argue the alternatives at this point are literally infinite, are they not?
We're in the realm of speculation, and so the idea that the only conceivable ending is "humans are the boss" seems unimaginative to me. Especially so since I know humans can and do (especially at the group level, vs the individual level) demonstrate short-sightedness, cruelty, and at best a grudging reluctance to conserve. How open will humanity be to recognizing non-humans as having lives of equal value to our own? I wouldn't want to be an alien species meeting a humanity with superior technology, that's for sure.
Well, I don't know. I've been making fun of Yudkowsky's positions in this comment section, but I think the official corporate position of the Machine Intelligence Research Institute (MIRI, the institute dedicated to banning Research into Machine Intelligence) is that, in the modal case, you defect in such a scenario.
As in, they hold it is morally correct and rational to defect, not merely that they predict it would happen.