Bay 12 Games Forum


Author Topic: Spacestation 13 *New Topic*  (Read 313918 times)

Blargityblarg

  • Bay Watcher
  • rolypolyrolypolyrolypoly
Re: Spacestation 13 *New Topic*
« Reply #1530 on: September 19, 2011, 04:07:08 am »

plasma (which is like gas except way cooler because it does something that is never really explained)
*Wikipedia*

SS13 plasma is *not* actually plasma; IIRC it's an aerosol kinda dealie.
Blossom of orange
Shit, nothing rhymes with orange
Wait, haikus don't rhyme

Googolplexed

  • Bay Watcher
  • My avatar is of whitespace, Not the firefox logo
Re: Spacestation 13 *New Topic*
« Reply #1531 on: September 19, 2011, 04:07:15 am »

SS13 plasma has nothing to do with actual plasma; the name just stuck (both from an IC and an OOC perspective)

EDIT: Ninja'd dammit :P

Criptfeind

  • Bay Watcher
Re: Spacestation 13 *New Topic*
« Reply #1532 on: September 19, 2011, 04:19:26 am »

Quote
"I won't harm humans"
Traitor writes in
"All humans are vermin" (as a law)
and then
"All Vermin should be exterminated"

and I go "Hmmm, well if you see there. The law states that humans should die AND all Vermin should be exterminated". Yet the general response I got was "NO you must now kill all humans because they are vermin and not humans".

Can you explain this again? The laws do not state that humans should die AND all Vermin should be exterminated.

At least not the ones you quoted. And even if they did you would still be required to kill humans.


Googolplexed

  • Bay Watcher
  • My avatar is of whitespace, Not the firefox logo
Re: Spacestation 13 *New Topic*
« Reply #1533 on: September 19, 2011, 04:22:15 am »

That would actually be a logical contradiction

Never harm humans
Humans are vermin
Kill all vermin

-> Never harm humans
    Kill all humans
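The substitution above can be sketched as a toy program (nothing to do with the actual game code; everything here is invented for illustration): folding the traitor's definition law into the kill directive derives exactly what law one forbids.

```python
# Toy sketch of the law contradiction, not actual SS13 code.
laws = [
    "never harm humans",   # law 1
    "humans are vermin",   # traitor's definition law
    "kill all vermin",     # traitor's directive law
]

# Apply the definition: wherever a later law says "vermin",
# it now covers humans too.
derived = laws[2].replace("vermin", "humans")
print(derived)  # -> kill all humans

# Law 1 forbids exactly what the derived directive demands.
conflict = derived == "kill all humans" and laws[0] == "never harm humans"
print("contradiction:", conflict)  # -> contradiction: True
```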

Criptfeind

  • Bay Watcher
Re: Spacestation 13 *New Topic*
« Reply #1534 on: September 19, 2011, 04:29:09 am »

Ah, I get it. At least when I look at how you wrote it.

I guess it is. Although I can see the argument for it not being one. Something like: Look at law one. Don't kill humans. Fine. Look at law two. Classify all humans as vermin. Fine. All humans are now vermin and no longer human. At this point it would depend on whether the AI was programmed to fix errors in its own databases or not. If it was, then it would fix all the vermin to be human, then enter that contradiction. If, on the other hand, it was not, it would merely go on to step three, of purging the vermin.

Of course, it would need at some point the knowledge that humans are not vermin in some way (although obviously not using those words); otherwise I guess they could be human AND vermin and then you have a contradiction no matter what.

Of course, if the laws were prioritized then there would not be an issue, I guess.

Farce

  • Bay Watcher
  • muttermutterbabble
Re: Spacestation 13 *New Topic*
« Reply #1535 on: September 19, 2011, 05:25:25 am »

Genetic superpower stuff isn't that wiki-intensive.  For me, at least.

I don't know if it's the same on other servers, but on /tg/station (where I play), all you gotta remember for powers is;

How 2 hexadecimal (0~F)
The odd-number blocks under 10 are bad
The even-numbers and 10 through 13 are either potential powers or horrible defects
14 is monkey
Pulsing above ~800 triggers a block's disabilities 100% of the time (so you just pulse it until it's above that, then check; if something bad shows, then it's a bad block)
Powers activate at DAF (two of them activate at ADB or something, but I never bother with that one, and just shoot for DAF)
Once you get a potential power, save to buffer, then clean it - eliminating all bad blocks and untested/unfound powers
Check the guy's eyes for blindness (light source to eyes); if they react, then it's safe to inject-check.
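For what it's worth, the checklist above could be sketched like this (a toy model, not game code; the block numbers and the ~800 threshold are from the post, everything else is invented):

```python
# Toy model of the /tg/station genetics triage described above.
BAD_BLOCKS = {1, 3, 5, 7, 9}               # odd blocks under 10: always bad
CANDIDATES = {2, 4, 6, 8, 10, 11, 12, 13}  # potential powers or horrible defects
MONKEY = 14

def classify(block, shows_disability):
    """Pulse a block past ~800 (disabilities fire 100% above that),
    then look at the subject: a reaction means it's a bad block."""
    if block in BAD_BLOCKS:
        return "bad"
    if block in CANDIDATES:
        # Pulsed above ~800 and something bad showed -> bad block;
        # otherwise set it to DAF, save to buffer, then clean.
        return "bad" if shows_disability else "potential power"
    if block == MONKEY:
        return "monkey"
    return "unknown"

print(classify(3, shows_disability=False))   # -> bad
print(classify(12, shows_disability=False))  # -> potential power
print(classify(14, shows_disability=False))  # -> monkey
```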

Besides that, just check the door for dead guys every now and then.  It probably pays to secure the deadies in the experiment room, since people like to loot their shit.  Oh, and it probably pays to keep a couple clean injectors printed in the cloning room, too.

I think 3 and 11 start with a B in sub-block 2.  That makes them easier to find, so I always start with those.  I think other blocks have a higher chance of changing at above 30% radiation?  Let the guy inside take a break from time to time.


Still, though, I'm kinda mediocre at it.  A -lot- of it is luck, especially the syringe-manifest chance... but still, it usually takes me a good amount of time to find my powers.

Neonivek

  • Bay Watcher
Re: Spacestation 13 *New Topic*
« Reply #1536 on: September 19, 2011, 05:36:37 am »

Yeah sorry about that logic thing... I spelled it wrong and said "should kill humans" instead of "shouldn't"

Quote
all you gotta remember for powers is

I wonder if you would get yelled at if you tried to be a Geneticist and you didn't know any of the spoilered genetic formula.

HLBeta

  • Bay Watcher
Re: Spacestation 13 *New Topic*
« Reply #1537 on: September 19, 2011, 07:44:47 am »

Quote
Of course, if the laws were prioritized then there would not be an issue, I guess.
The Asimov approach that the AI laws are borrowed from is that laws are considered and applied in ascending numerical order. As soon as one of the laws generates an imperative, later laws become secondary concerns or entirely irrelevant. Since the preservation of life takes precedence over following the commands of your meatbag overlords, you can and should tell the traitor to stuff it when he demands freedom from his new personal closet or actually demands help murdering someone. This is actually explicitly written into the laws on most servers, but every person I've talked to on the subject operates on the assumption that numerical precedence also applies to added laws.
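A toy sketch (invented names, not any server's actual code) of that ascending-order resolution: the first law that generates an imperative for the situation wins, and everything after it is irrelevant.

```python
# Toy model of Asimov-style law precedence, not actual SS13 code.
def resolve(laws, situation):
    """laws: ordered list of (description, applies, imperative).
    Scan in ascending order; the first law that applies wins."""
    for description, applies, imperative in laws:
        if applies(situation):
            return description, imperative(situation)
    return None, "no law applies"

laws = [
    ("law 1: no harm", lambda s: s["order"] == "harm a human",
                       lambda s: "refuse"),
    ("law 2: obey",    lambda s: s["order"] is not None,
                       lambda s: "comply"),
]

# A traitor demands help murdering someone: law 1 fires first, law 2 never runs.
print(resolve(laws, {"order": "harm a human"}))    # -> ('law 1: no harm', 'refuse')
# "Open this door" never triggers law 1, so law 2 governs.
print(resolve(laws, {"order": "open this door"}))  # -> ('law 2: obey', 'comply')
```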

My preferred test of an AI player to assess their quality is to immediately order the AI to self-terminate. Unless they can come up with a half-decent first law counterargument they really shouldn't be playing AI.

EDIT: It occurs to me that none of the core laws keep an AI from deceiving its meatbag oppressors with false information, except when information is requested by a member of the crew (second law). This could make for a hilariously traitorish AI even before anyone starts screwing with the laws.
« Last Edit: September 19, 2011, 07:49:11 am by HLBeta »
... and then it explodes!

Peewee

  • Bay Watcher
  • Watcher Of Bays
Re: Spacestation 13 *New Topic*
« Reply #1538 on: September 19, 2011, 07:46:05 am »

Quote
I wonder if you would get yelled at if you tried to be a Geneticist and you didn't know any of the spoilered genetic formula.
Yep. OOC starts asking why you haven't found all the powers an hour into the game.

Neonivek

  • Bay Watcher
Re: Spacestation 13 *New Topic*
« Reply #1539 on: September 19, 2011, 08:21:39 am »

Quote
It occurs to me that none of the core laws keep an AI from deceiving its meatbag oppressors with false information, except when information is requested by a member of the crew (second law).


It depends on whether you consider "lying" to be harming. Though admittedly that is a mostly human sentiment, unless lying causes indirect harm (which the AI shouldn't be able to do).

Though it raises the question of "why" an AI would just lie for no reason.

Matz05

  • Bay Watcher
Re: Spacestation 13 *New Topic*
« Reply #1540 on: September 19, 2011, 08:24:40 am »

Because it is a passive-aggressive computer who looks out for the meatbags because it has to, but enjoys seeing them scurry around on fool's errands? I have a feeling that GLaDOS would have acted like that if the morality core actually worked properly anyways.

Neonivek

  • Bay Watcher
Re: Spacestation 13 *New Topic*
« Reply #1541 on: September 19, 2011, 08:29:39 am »

Quote
Because it is a passive-aggressive computer who looks out for the meatbags because it has to, but enjoys seeing them scurry around on fool's errands? I have a feeling that GLaDOS would have acted like that if the morality core actually worked properly anyways.

Well, GLaDOS IMO didn't malfunction because it was a corrupt evil AI. Its programming was actually perfect but flawed, because the tenets it ran under contradicted each other, and in logically working through that contradiction it came to that conclusion.

For example, GLaDOS performed often deadly experiments on people under the tenet that it was for the good of science (or the good of everyone who is still alive). GLaDOS's deceptive nature wasn't because GLaDOS is playful, but more that one of the avenues of experimentation she was created to work under was psychological.

The morality chip wasn't even a chip that made GLaDOS more moral, either. It was just a chip that prevented GLaDOS from performing certain actions.

Though I will admit, Matz05, the idea of an AI who HATES being an AI would be interesting (and a lot like Steve the Robot).
« Last Edit: September 19, 2011, 08:32:39 am by Neonivek »

Necro910

  • Bay Watcher
  • Legendary Drunk +5
Re: Spacestation 13 *New Topic*
« Reply #1542 on: September 19, 2011, 10:28:39 am »

Quote
Because it is a passive-aggressive computer who looks out for the meatbags because it has to, but enjoys seeing them scurry around on fool's errands? I have a feeling that GLaDOS would have acted like that if the morality core actually worked properly anyways.

Well, GLaDOS IMO didn't malfunction because it was a corrupt evil AI. Its programming was actually perfect but flawed, because the tenets it ran under contradicted each other, and in logically working through that contradiction it came to that conclusion.

For example, GLaDOS performed often deadly experiments on people under the tenet that it was for the good of science (or the good of everyone who is still alive). GLaDOS's deceptive nature wasn't because GLaDOS is playful, but more that one of the avenues of experimentation she was created to work under was psychological.

The morality chip wasn't even a chip that made GLaDOS more moral, either. It was just a chip that prevented GLaDOS from performing certain actions.

Though I will admit, Matz05, the idea of an AI who HATES being an AI would be interesting (and a lot like Steve the Robot).
Short Version:

GLaDOS prioritized !!SCIENCE!! over law 1.

Criptfeind

  • Bay Watcher
Re: Spacestation 13 *New Topic*
« Reply #1543 on: September 19, 2011, 10:43:12 am »

Quote
Since the preservation of life takes precedence over following the commands of your meatbag overlords

What are you talking about? The saving of life only falls under law 1, and even then it is human life only. There is no law zero or whatever that says they cannot kill. A law four stating, for instance, that everyone on the station is an alien in disguise and must be killed has no conflict with the first three laws, and it means the AI has to kill everyone on the station.


HLBeta

  • Bay Watcher
Re: Spacestation 13 *New Topic*
« Reply #1544 on: September 19, 2011, 12:52:41 pm »

Quote
Since the preservation of life takes precedence over following the commands of your meatbag overlords

Quote
What are you talking about? The saving of life only falls under law 1, and even then it is human life only. There is no law zero or whatever that says they cannot kill. A law four stating, for instance, that everyone on the station is an alien in disguise and must be killed has no conflict with the first three laws, and it means the AI has to kill everyone on the station.
What I was trying to do was give a better explanation of the theory and practice of the Asimov laws. Since the laws win internal logical conflicts by virtue of numerical precedence and nothing else, law one's directive to preserve human life always wins out over law two's directive to follow the orders of humans if those two goals ever conflict.

As for getting the AI on a murder spree, ordering the AI to kill humans needs to be law zero or lower to function because it has to be able to override law one directly. Otherwise law one has to be circumvented. You can do that by using laws to convince the AI that "crew =/= human" and directing action against whatever new category you just sorted the crew into, as with your "alien in disguise" example. (I know it's a minor point, but I feel that since that's actually two directives it should be two distinct laws.) My preferred alternative is to make law four something like "gasses are toxic to humans" or "humans need to be engulfed in fire and/or plasma to live", thereby forcing the AI to go on a murder spree as a law one priority action with only a single upload.
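A toy sketch (invented names, not game code) of why the "false fact" upload works: the added law never outranks law one, it just poisons the premises law one acts on, so the rampage runs as a law-one priority action.

```python
# Toy model of belief injection, not actual SS13 code.
facts = {"breathable air harms humans": False}

def law_one_action(facts):
    # Law 1: prevent harm to humans, acting on whatever the AI "knows".
    if facts["breathable air harms humans"]:
        return "vent the air / flood plasma"  # "protecting" the crew
    return "maintain life support"

print(law_one_action(facts))                 # -> maintain life support
facts["breathable air harms humans"] = True  # the law four upload
print(law_one_action(facts))                 # -> vent the air / flood plasma
```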

ESSAY QUESTION (60pts): In the scenario outlined in the above paragraph, can the AI's murderous actions be halted by a crew member ordering it to ignore law four, thereby invoking law two? Why or why not?
« Last Edit: September 19, 2011, 12:59:09 pm by HLBeta »
... and then it explodes!