Okay then see point two.
Why make a "lol sifi computer" seems like a bad idea.
But whether or not one gets made is not up to us. We are discussing when we need to look at the morality of what we do to AIs.
When you get excited, your heart rate speeds up and your blood pressure rises. You can feel excitement throughout your body. Your breathing changes. Your reactions to new events change from what they would be if you were upset.
When the AI gets excited, a flag in its code gets set to "Excited=True", and it carries on with its day. It's not really excited. It's just told that it is because of events happening around it.
You might as well question if it would be cruel to shoot at Kazuo Kuriyama or Hannibal Lecter.
If you think it's as easy as 'coding some emotion', you haven't really stopped to look at what makes YOU human. Why shouldn't I just kill YOU? Consider that, and then apply it to the AI. You'll be hard-pressed to actually do it properly.
Also, Freud is not the answer to much of anything anymore.
"Excited=True" will be factored into every decision it makes. Humanity's "excited=true" hormone is what causes blood pressure to rise. The effects are what you're outlining, whereas they are all caused by one decision in our "code" that tells everything else that we are excited.
And every single one of us here could be a sociopath; we don't know. Would you kill everyone you suspect to be a sociopath? We effectively only emulate what's in our DNA and our memory of how we should react to something. Your view of morality is no doubt heavily influenced by others' views. Are you not simply emulating the morality taught to you? By your logic that would make you a sociopath, and apparently deserving of no rights.
Your solution was 'by simulating the human brain', which is impossible without a body to go along with it.
Are you actually saying that intelligence comes even partially from your leg or heart or mouth or any other body part? They are simply sources of input and output; the thinking (and thus the intelligence) comes entirely from the brain.