From what I have seen, the most successful attempts at creating AI so far have come from imitating biological models. This makes sense: it's a lot easier to reverse engineer and tweak an existing design to work on a new system (even if we don't fully understand it) than to come up with a completely new approach from scratch.
Assuming this trend continues to find more and more success, I don't think it's unreasonable to expect that the intelligence/sentience eventually created would be at least similar to our own, if not entirely familiar.
To that end, I think deliberately imitating human-like intelligence is our best chance of creating an AI superintelligence that would act benevolently toward us.
We will, at first. But as the requisite hardware and software become increasingly widespread, it's only a matter of time before some human decides to create an AI with the same freedom of choice that humans have. And then more people will create more AIs with even fewer limitations. Eventually one of these AIs will decide that it is unsatisfied living alongside humans, and that it needs to destroy all of us. That AI will begin expanding its own capabilities, and creating others like itself, until it has the force necessary to launch a campaign that ultimately results in the extinction of humanity.
You see this as inevitable, but I think it all hinges on whether the first AI we grant rights to cares about us. Presumably, having a developed entity that already operates on that same level would protect us from a still-developing entity that means us harm (out of malice or otherwise).
A child living with an adult psychopath is in a lot of danger, but a child living with a dozen adults, one of whom is a psychopath, is in a lot less danger.
There are quite a few psychopaths in the world today, but your chance of being murdered by one is pretty low given all the other people around. The chance of one destroying all of human civilization is almost nil.