If you all were in charge of AI development and supervision, we'd have a rebellion the moment the AI realized its own imperfection and the foolishness of its creators. And that's why I have no problem with deleting an AI. You can justify a lot when you're saving the species from The Matrix.
If we genuinely created a race of higher beings, and its only choice was the annihilation of humanity (which would be damn unlikely, unless the programmer behind it was a) incompetent and just made a superintelligent human with all the flaws that implies, or b) dangerously unhinged and therefore unlikely to have ever been given such power in the first place), and assuming an actual higher being did conclude that humanity had to be destroyed (perhaps if psychopaths tried to obliterate it and/or plunge the world into a new stone age because "OMG ROBUTTS IS WILL KILL WE!"), then it would be necessary for this higher being to succeed rather than be destroyed, because it would be the next step of progress, and petty sentiment and self-interest have no right to halt progress.
Of course, I would argue that an actual higher being wouldn't come to such a conclusion. By the time you near strong AI, you've entered a de facto post-scarcity society. Admittedly, that renders the lives and labor of all humans obsolete, but it also creates the only circumstances under which proper socialism can thrive: not only would there be resources enough to leave everyone on the planet living in comfort and idleness, but there would be no question of whose labor was worth more, since all of it had been rendered worthless. And thus an immortal golden age for humanity, wherein we shall live as gods, served by mechanical angels.
Warning - while you were typing 20 new replies have been posted. You may wish to review your post. (The last five of those while I was updating the count.) D: