Thing is, there's no such thing as a "true" utopia. Even if there was, you are imperfect; you will not be 100% responsible with your power, so you'd need a counterbalance to keep you in check.
I don't care how righteous and responsible one thinks they are, it is incredibly presumptuous to believe one's conception of "good" is ubiquitous and/or "right." You have a responsibility to not implement your version of a utopia, because without fail it will be someone else's dystopia, and they will not be able to escape from under your iron fist. If that ain't "evil," I dunno what is.
Mm, Kaij, that's limited human knowledge being expressed, there.
We do have a responsibility to be really frakking careful in trying to implement some sort of utopia, but this limited-scale omnibeing can know exactly how to implement a conception of good* without it becoming a dystopia**.
There are, actually, pretty ubiquitous conceptions of right (usually some variation of either "And it harm none" or "Do unto others"; the major limitation in human cultures is that these run into our mostly-natural xenophobia and everything buggers right up, or else people start stepping beyond the former concept or taking the latter too far) that could be implemented by an omniscient/omnipotent being without things buggering up.
*Limited to stuff like, hey, no <insert sundry list of violent crimes>, no hunger, no long-term environmental destruction, etc.
**Well, except maybe to folks like psychopaths or violent xenophobes and so forth, but when I'm considering moral issues I give limited weight to the worth of those who have internalized harming others without sufficient cause as a core behavioral pattern.
I'm more on the international law/anthropological side of the argument than the theological one. The entity is not completely omnipotent, and certainly not omniscient. It's nearly omnipotent for practical purposes, or at least powerful enough to impose its will unopposed. When you add complete omnipotence and omniscience to the mix, it can simply disregard logic and do two contradictory things at the same time. A less extreme case would be: does Superman have a responsibility to depose the North Korean regime? What about other tyrannical governments? Should he intervene in societies whose members don't want him to? Does he have the right to impose his definition of evil on other societies?
Again, it depends on its definition of evil. The entity certainly has the right and responsibility to prevent violence, to stop hunger, etc., as a baseline. For an entity with limited knowledge, the possible consequences become an issue, and that's where the problem arises.
But yeah, the specific important point for your omnicritter was "say he could guarantee things would go exactly as he desired, not terribly misfire in his face?"
This entity could prevent all direct harm on the planet without negative consequence. If it is capable of doing that without everything buggering up, yes, it should. It only needs to be sufficiently knowledgeable and potent, in other words.
The right to cause harm without sufficient reason isn't a genuine moral right, I'm afraid. Preventing that from happening doesn't run into any issues, except when the prevention itself causes other problems.
The question is how far beyond that prevention the entity goes.
Basically what I'm getting at is that stopping stuff like genocide, wars of aggression, government-controlled rape camps, and so on down your list of atrocities, is a no-brainer. There's not really a moral question there -- if an entity can do so without negative consequence, it must do so to be a moral entity. It may be amoral instead of immoral if it doesn't, but morality would require a baseline prevention of insufficiently justified harm.
That doesn't necessarily mean, as per your later example, that Superman should topple the DPRK, but it would mean that he should prevent the situation in the DPRK from harming the people within.