All this does sound good. A minor addition to it would be to use a different variant of EUE--IIRC, the Doc said that the EUE could be altered to access a different set of universes. If we had some scientists make variant EUEs, then send the super secret colonies to universes only accessible by those variant EUEs, and then we killed the scientists and destroyed the EUEs, it would be essentially impossible to ever deliberately hunt down those other human colonies no matter how determined and powerful an intelligence was.
Even better, if we programmed those colonies to replicate the trick, we'd essentially be forkbombing the multiverse with the human virus. Eventually, humanity would be almost ubiquitous throughout all possible multiverses. We'd make humans the only constant in a multiverse of infinities.
Sure, using variant EUEs could work well. And yes, the idea was indeed to keep spreading across other universes (in a continuation of the 'once sufficient supplies are stockpiled, start spreading out again' plan outlined earlier) after the initial wave of colonization. As said, that would make it just about impossible to ever truly eradicate the human race from the multiverse.
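Just to put some toy numbers on how fast that recursive seeding compounds: here's a minimal sketch, assuming each established colony seeds a fixed number of new universes per generation (the function name and the seeding rate are purely illustrative, not anything defined in-setting).

```python
# Toy model of the recursive colonization ("forkbomb") idea above.
# Assumes every colony, once established, seeds k new universes per
# generation via its own variant EUE. All numbers are illustrative.

def colonies_after(generations: int, seeds_per_colony: int = 2) -> int:
    """Total colonies after some generations of recursive seeding."""
    total = 1      # the original colony
    frontier = 1   # colonies founded in the latest generation
    for _ in range(generations):
        frontier *= seeds_per_colony
        total += frontier
    return total

print(colonies_after(10))  # 2047 colonies after ten generations at k=2
```

Even at a modest two seeds per colony, growth is exponential, which is the whole point: no finite hunter can keep up with it.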
We can *probably* just use whatever Steve defines as human, so my opinion is probably irrelevant, but I could see this resulting in some very nonhuman humans. For instance, do limbs really need bones if we can make tentacles with greater strength and flexibility? Is skin important if we can biomod humans to have hexsand surfaces which allow them to eat light and energy, despite making them look like holes in reality? Must humans have organic brains, if we can make streamlined computer simulations of humans which run at much higher speed with near-perfect simulations of emotions and such? If those humans edit themselves so that they don't experience emotions such as anger or sadness, instead remaining constantly euphoric, should the AI consider them humans? If it does, and if it values happiness, could it decide that it should replace all humans with computer simulations that are happy?
Alternatively, if you're very strict and accurate on the definition of a human, does the AI start to enforce genetic purity, or enslave mutant children who are just *slightly* outside the definition of a human, despite not appearing too different to our own eyes?
Personally, I doubt any of these will ever be problems, but I'm trying to list everything I could possibly see going wrong.
Yeah, basing ourselves off of Steve should go a long way already. As for modding humanity, I think that the colony AI would only modify humans itself if it's absolutely necessary for the colonists' survival, while otherwise allowing humans to modify themselves (but again, not if it's too demanding on resources, dangerous, and such). That should prevent the colony AI from replacing humans with anything against their will.
And yes, getting the right balance of what it means to be human to prevent weird excesses wouldn't be easy, but issues like that should become apparent during the testing phase.
Okay. All this seems logical and good.
I'm unsure if it's what PW was asking for--I think he wanted more aesthetic descriptions? I remember him asking whether you wanted windows everywhere or something.
One possible issue is the definition of "natural". One could argue that it is very natural for humans to kill things--would the AI create animals for them to kill? Or let them kill each other, because it's "natural"? Another question would be about the "natural" behaviors of deviant individuals: if a sociopath naturally wanted to eat the flesh of his fellow man, would the AI allow it? If not, what of an artist who had an unusual method of self-expression--could they be defined as having an unnatural behavior, and be prevented from expressing themselves in some way not approved by the AI?
Then we've got the current crazies' definition of natural: is it natural to enjoy video games and movies?
What of genemodded humans? Can the AI genemod humans to naturally want to be unconditionally kind to others, never harming another human's enjoyment of life? How about making them naturally want to sleep twenty hours a day and repetitively play Jenga for the remaining four, all the while being blissfully happy? You said the AI would allow some risk, but what if the AI genemodded the humans to feel no desire for risk, or to have the desires but have them be met by the aforementioned Jenga (THEY MIGHT COLLAPSE, OH MY GOD!)?
How about the risk? Does it have to be included, or is it only a "if the humans ask for X, allow them to do so if it carries low risk"? If it's a guarantee, can the AI cheat by providing fake risk, or can it provide risk by mixing in a small amount of toxic chemicals into the air supply every now and then? If it's not a guarantee, is it okay if the AI just locks everyone in VR anyway, because it's made VR as good as real life?
Well, as I told him, I don't think any sort of aesthetic descriptions/demands would be worth a damn due to the massively different environments and needs that could mark any particular universe.
'Natural' was a bit of a shorthand there; I guess one way to define it would be 'let each colonist do what he wants, allow for some inefficiencies (e.g. in regards to resources used) in that, but keep things within reason'. So the artist can express himself however he wants as long as it doesn't use a mass of resources or endanger other colonists (such as by eating them). 'Allowing natural behavior' was more in the sense of not just keeping humans cooped up in VR machines forever if that isn't absolutely strictly necessary.
For genemodding, see earlier.
For risk, yeah, again a weighing of 'allowing as much freedom as possible' vs 'potential risks' is made according to some sort of internal weighting factors, which are calibrated during the training and testing phases. Locking someone in VR can be done, but only if either the individual asks for it, or if it's the only way to keep them in a living state (the even more extreme alternative being cryosleep until the environment is amenable to more normal life again).
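For what it's worth, that freedom-vs-risk weighing could be sketched as something like the toy rule below. The weights and threshold here are made-up stand-ins for whatever factors would actually get calibrated during the training and testing phases; nothing in this sketch is canon.

```python
# Toy sketch of the freedom-vs-risk weighing described above.
# risk_weight and threshold are illustrative placeholders for the
# internal weighting factors calibrated during training/testing.

def allow_activity(freedom_value: float, risk: float,
                   risk_weight: float = 2.0, threshold: float = 0.0) -> bool:
    """Permit an activity when its freedom value outweighs weighted risk."""
    return freedom_value - risk_weight * risk > threshold

print(allow_activity(0.8, 0.1))  # True: low-risk self-expression passes
print(allow_activity(0.5, 0.4))  # False: the danger outweighs the freedom
```

The interesting design question is less the formula than who gets to tune `risk_weight`--set it too high and you get the lock-everyone-in-VR failure mode, too low and the AI lets colonists eat each other.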