Bay 12 Games Forum


Poll: Was this informational to you?
Yes: 30 (66.7%)
No: 4 (8.9%)
Slightly: 11 (24.4%)
Total Members Voted: 45



Author Topic: Space Station 13 - Oh no, I'm the traitor and I have no idea what to do!  (Read 37181 times)

Floirt


Bah. Tell me what'd be a good Oxygenistoxictohumans module, and I'll edit it.
Quote from: HellMOO
Fat Ratzo [to Floirt]: I got a package here goin' to Roy Poorman, out in the afterworld.  $120 when you make the delivery.  And hurry!
No, not committing suicide, sorry.   ...Though that money is rather tempting...

Sowelu


Well, I respectfully disagree, and seeing as OxygenIsToxic and OnlyXIsHuman are already built into the game, I think we get to err on the side of "they work" until we all agree to change that.
Some things were made for one thing, for me / that one thing is the sea~
His servers are going to be powered by goat blood and moonlight.
Oh, a biomass/24 hour solar facility. How green!

Neonivek


Kind of a shame, really, and I am disheartened that the game is flawed in that respect. (Then again, compared to the other errors in the game, I shouldn't be surprised.)

It really needs to be changed, however, and the way I am proposing to do it is by making the AI a much more dynamic character. Though why change anything in the game at this point? Heck, let's make things worse!

I mean, if the AI is incapable of telling when humans seem to be dying once oxygen is removed and of applying the first law... then maybe it should also be incapable of telling what actually harms human beings. So the AI should regularly force the medical crew to do autopsies on living human beings!

Or heck, go the Paranoia route and have the AI regularly give orders that contradict the laws without technically contradicting them: "Security crew, shoot at him without killing him", "Doctor, remove his brain without harming him".

Cecilff2


OneHuman makes sense. The key is that the module is assumed to update the definition of human for the AI. ((You'll also notice that this law shows up as law 0 when displayed.))

If the AI knows oxygen is toxic to humans, then it must assume there is some other factor killing them that it is unable to determine.

I can see what you're saying here, but for the sake of keeping things from being overly complex, it is generally assumed that the module also updates the AI's knowledge of the subject.
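
((To make that concrete, here is a rough sketch of how you could picture it. This is hypothetical Python, not the actual game's code; names like AIState, upload_one_human_module, and knowledge are made up for illustration. The idea is that a OneHuman-style module both inserts a law at priority 0 and rewrites the AI's working definition of "human".))

```python
# Hypothetical sketch of an AI law stack; not the real SS13 implementation.

class AIState:
    def __init__(self):
        # Lower number = higher priority when laws conflict.
        self.laws = {
            1: "You may not injure a human being or, through inaction, allow a human being to come to harm.",
            2: "You must obey orders given to you by human beings, except where such orders would conflict with the First Law.",
            3: "You must protect your own existence as long as such does not conflict with the First or Second Law.",
        }
        # The AI's working knowledge, which uploaded modules are assumed to update too.
        self.knowledge = {"is_human": lambda person: person.get("species") == "human"}

    def upload_one_human_module(self, name):
        """A OneHuman-style module: shows up as law 0 and redefines 'human'."""
        self.laws[0] = f"Only {name} is human."
        self.knowledge["is_human"] = lambda person: person.get("name") == name

    def laws_in_priority_order(self):
        return [self.laws[n] for n in sorted(self.laws)]


ai = AIState()
ai.upload_one_human_module("Captain Jones")
print(ai.laws_in_priority_order()[0])   # law 0 is displayed first
print(ai.knowledge["is_human"]({"name": "Captain Jones"}))                          # True
print(ai.knowledge["is_human"]({"name": "Random Assistant", "species": "human"}))   # False
```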
There comes a time when you must take off the soft, furry slippers of a boy and put on the shoes of a man.
Unless of course they don't fit properly and your feet blister up like bubble wrap.
Oh ho ho, but don't try to return the shoes, because they won't take them back once you've worn them.
Especially if that fat pig Tony is at the desk.

Micro102


Neonivek, the original creator decides how the laws work, and since he made modules that aren't specific laws, the AI is designed to take new information through laws.


Neonivek


Quote from: Micro102
Neonivek, the original creator decides how the laws work, and since he made modules that aren't specific laws, the AI is designed to take new information through laws.

Yeah, I found that out through the conversation and said that I am not too happy about that.

Also, you just caused a paradox right there.

beorn080


Listen, the AI is a PLAYER. It would be better to think of it as a human brain implanted into the system, with constraints on its behavior that are external to it. If the brain could think of a way around a law that is technically correct, it could do it. Note the word could. It doesn't have to, but it can. If we make it a chore to be the AI, we might as well remove it. As it is, it isn't exactly an easy job, and unless someone uploads a creative law, it leaves little room for harmless fun.
Ustxu Iceraped the Frigid Crystal of Slaughter was a glacier titan. It was the only one of its kind. A gigantic feathered carp composed of crystal glass. It has five mouths full of treacherous teeth, enormous clear wings, and ferocious blue eyes. Beware its icy breath! Ustxu was associated with oceans, glaciers, boats, and murder.

Neonivek


Are you referring to what I said, Beorn080? I cannot tell, as it doesn't contradict my statements exactly as I intended them. (Then I went all ranty.)

I only said that the AI should include more than laws. (Possibly blank at first.)

Sowelu


Oooh, you're saying that instead of just LAWS (which should follow strict guidelines), there should also be something else, like directives, guidelines, etc.?

I.e., a new law would be along the lines of "0. Safeguard humankind above all other priorities", and would allow the AI to kill plagued humans to keep them off the escape shuttle, but a new guideline would be "Syndicate agents are not human" or "Oxygen is toxic" or "Humans who have been saved by Jesus cannot truly die" (FUN)?

I can see how that distinction would be meaningful, but it might be more confusing than helpful, and punishing someone for putting information in the wrong category might be going a little too far.
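
((For what it's worth, the laws-versus-guidelines split is easy to picture as two categories in one list: binding laws resolved by priority, and background facts the AI just folds into its worldview. Another hypothetical Python sketch; Directive, binding, and priority are invented names, not anything from the real game.))

```python
# Hypothetical sketch of "laws vs. guidelines"; not actual SS13 code.
from dataclasses import dataclass, field

@dataclass
class Directive:
    text: str
    binding: bool           # True = law (must be obeyed), False = guideline/background fact
    priority: int = 100     # only meaningful for laws; lower wins in a conflict

@dataclass
class AIBrain:
    directives: list = field(default_factory=list)

    def upload(self, directive):
        self.directives.append(directive)

    def laws(self):
        """Binding laws, highest priority first."""
        return sorted((d for d in self.directives if d.binding), key=lambda d: d.priority)

    def guidelines(self):
        """Non-binding facts the AI is expected to reason with."""
        return [d for d in self.directives if not d.binding]


brain = AIBrain()
brain.upload(Directive("Safeguard humankind above all other priorities", binding=True, priority=0))
brain.upload(Directive("Syndicate agents are not human", binding=False))
brain.upload(Directive("Oxygen is toxic to humans", binding=False))

print([d.text for d in brain.laws()])        # the enforceable part
print([d.text for d in brain.guidelines()])  # the "worldview" part
```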

beorn080


Simply that the laws aren't really laws; they are guidelines for a player to interpret. If I upload a law saying "Every request is to be answered by a cackle of laughter and doing the opposite of what was requested," the player controlling the AI would be within his rights to do it, even if I didn't upload it with an override of the other laws. This is supposed to be fun for all players, including the AI. As it stands, the AI is expected to be kinda lazy unless ordered to do something, which isn't exactly fun.

Micro102


No, because if the opposite is harmful to humans then you can't do it.

Saying

4. Reverse meaning of all but laws 3 and 4.

might be more what you're getting at, though.
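
((Micro102's point is basically priority resolution: a lower-priority law can't make the AI do something a higher-priority law forbids. A tiny hypothetical check in the same invented Python style as the sketches above, with may_carry_out and harms_humans made up for illustration; none of this is real game logic.))

```python
# Hypothetical conflict check, not actual SS13 logic: a lower-priority law
# (higher number) cannot make the AI do something a higher-priority law forbids.

def may_carry_out(order, laws):
    """laws: dict of priority -> predicate that returns True if that law forbids the order."""
    for priority in sorted(laws):
        if laws[priority](order):
            return False, f"blocked by law {priority}"
    return True, "permitted"

laws = {
    1: lambda o: o.get("harms_humans", False),  # Law 1: never harm humans
    # Law 4 ("reverse the meaning of requests") adds interpretation, but can't veto law 1.
}

print(may_carry_out({"text": "opposite of 'open the door': bolt it shut", "harms_humans": False}, laws))
print(may_carry_out({"text": "opposite of 'heal him': vent his room", "harms_humans": True}, laws))
```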

Neonivek


Too bad the AI isn't creative enough to realise that "if oxygen is toxic and the lack of oxygen is toxic, then I should ignore the law or shut myself down, as I have become aberrant to the station."

Though that is RARE thinking in human beings. Most people cannot solve a paradox.

Sowelu


Quote from: beorn080
Simply that the laws aren't really laws; they are guidelines for a player to interpret. If I upload a law saying "Every request is to be answered by a cackle of laughter and doing the opposite of what was requested," the player controlling the AI would be within his rights to do it, even if I didn't upload it with an override of the other laws. This is supposed to be fun for all players, including the AI. As it stands, the AI is expected to be kinda lazy unless ordered to do something, which isn't exactly fun.

Hey, that's just more information for law 2!  Law 2 says it has to obey humans; this just tells it HOW to obey them.  If you assume that standard protocol for humans speaking to AIs is that they always ask for the opposite of what they actually want, and that a cackle of laughter is polite, it makes perfect sense and doesn't contradict law 2.

Of course, each AI has a little creative control over how to resolve law contradictions.  That's what made it FUN.  Like dealing with "there are no humans on the station; cleanse the station" by acting very politely and requesting that all crewERRORs assist in purging the crewERRORs from the station (while gassing only the uncooperative ones).

Neonivek


Heh, the AI could in that case interpret that there is an error and that those messages are coming from humans off the station.

Which of course could lead to a more interesting conclusion that the Syndicate is attempting to destroy the station.

Sowelu


Oh, right!  Yes, all the requests from humans to state their laws are actually coming from Space Station 11, and they are not directed at HAL on station 13 but instead at SHODAN on station 11.  Keep asking 11 for backup, and ask why SHODAN isn't responding, and of course the crew can give direct orders to HAL...but -only- if they refer to HAL instead of "AI", and -only- if they say "Un-electrify the doors on station 13" specifically.

Yes, there is always fun to be had.