Anyway, back on the AI argument: isn't less complex beings evolving into, creating, or generally being the cause of beings more complex than themselves the whole concept of evolution? Some people may think of it as unnatural, but I find that using technology to create things such as an AI that can then improve humans is the next evolutionary step, with the difference that we'll have more control over it.
The difference between natural evolution over time and engineering is that one is a mindless process, and the other is evolution. Ha. Ha. Joke. Engineering something like AI would require conscious input from the engineers (i.e. humans), which is the difference between AI "evolving" and AI being purposely engineered. There's also the idea of letting AI evolve through evolutionary algorithms or something along those lines. The biggest problem I see with that is that we as humans would have very little control over how the strong AI would come out, if it produced a strong AI at all. Even more chance of it going Skynet on us than if we purposely engineered it and something went wrong in the design.
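For anyone wondering what "evolving" software through an evolutionary algorithm actually looks like in miniature, here's a toy sketch (my own illustration, nothing to do with strong AI): bitstrings are "bred" toward a fitness goal purely by mutation and selection, with nobody ever writing the answer in by hand. The function name, parameters, and goal here are all made up for the example.

```python
import random

def evolve(genome_len=20, pop_size=30, generations=100, mutation_rate=0.05, seed=0):
    """Toy genetic algorithm: evolve bitstrings toward all-ones.

    Fitness = number of 1s in the genome. Each generation, the fittest
    half survives unchanged and the population is refilled with mutated
    copies of random survivors.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]

    def fitness(genome):
        return sum(genome)

    def mutate(genome):
        # Flip each bit independently with probability mutation_rate.
        return [(1 - bit) if rng.random() < mutation_rate else bit for bit in genome]

    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(rng.choice(survivors))
                           for _ in range(pop_size - len(survivors))]

    return max(pop, key=fitness)

best = evolve()
print("best fitness:", sum(best), "out of", len(best))
```

The point of the toy is the loss-of-control problem from above: the programmer only picks the fitness function and the knobs, and the actual solutions that fall out are whatever selection happens to stumble into, not anything anyone designed.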
Honestly, though, the obsession with evil AI gets me. It's not like it would be difficult to raise healthy, well-balanced AI. Simply teach them to act ethically, raise them as your own beloved children.
And, y'know, make sure that the people making and raising them have all their cookies in the tin.
You'd probably need thought police to tell whether people have all their cookies in the tin. Then you'd have to drastically lower those standards to just having the correct cookies in the tin.
And teaching ethics? Why not pre-program it? You'd need a whole battalion of lawyers to close up the loopholes too...
One theory of developing strong AI is teaching it the way one would teach a child, like FD is describing. That gives you less control than pre-programming all the needed values into it: as with raising a child, you don't know how it will grow up or what it will do with its life, so you could very well find you've raised a robo-Hitler or something. But if you pre-program every little thing into it, that would increase the work required to develop such an AI by a huuuuge amount, and increase the chance of logical errors or other such flaws and bugs in the code by a likewise huuuge amount.
I'm partial to the whole "raise an AI like a child" idea, because that sounds fly as hell.