Bay 12 Games Forum

Author Topic: Space Station 13: Urist McStation  (Read 2156135 times)

scrdest

  • Bay Watcher
  • Girlcat?/o_ o
Re: Space Station 13: Urist McStation
« Reply #8130 on: July 27, 2013, 10:02:28 am »

Quote
I knooooooow, but come on, being early is better than being groggy all day the first week of school

Well, in my school the first week was always the warmup week, so it wasn't a problem.
Logged
We are doomed. It's just that whatever is going to kill us all just happens to be, from a scientific standpoint, pretty frickin' awesome.

Kydrasz

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #8131 on: July 27, 2013, 12:10:32 pm »

Stay a while and lis- I'm sorry I will never do that again.

Anyway here's the story of a lone Clown armed only with a flashlight and a laser.


Best. Atmosphere. Ever.
« Last Edit: July 27, 2013, 12:12:30 pm by Kydrasz »
Logged
Fall seven times, stand up eight.
Spoiler: Inspirational words

BigD145

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #8132 on: July 27, 2013, 01:07:45 pm »

Quote from: Kydrasz
Anyway here's the story of a lone Clown armed only with a flashlight and a laser.

This is why the ship from the Clown Planet crashed into the asteroid and there were no survivors.
Logged

Kydrasz

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #8133 on: July 27, 2013, 01:08:48 pm »

Quote from: Kydrasz
Anyway here's the story of a lone Clown armed only with a flashlight and a laser.

Quote from: BigD145
This is why the ship from the Clown Planet crashed into the asteroid and there were no survivors.

Err explanation please?
Logged
Fall seven times, stand up eight.
Spoiler: Inspirational words

BigD145

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #8134 on: July 27, 2013, 01:11:17 pm »

Quote from: Kydrasz
Anyway here's the story of a lone Clown armed only with a flashlight and a laser.

Quote from: BigD145
This is why the ship from the Clown Planet crashed into the asteroid and there were no survivors.

Quote from: Kydrasz
Err explanation please?
This is probably the best reason why: many Clowns falling over themselves in a cramped ship may have led to one of them falling onto the control panel and sending the ship out of control.

----Low pop on right now. Need people on.
« Last Edit: July 27, 2013, 01:29:43 pm by BigD145 »
Logged

miauw62

  • Bay Watcher
  • Every time you get ahead / it's just another hit
Re: Space Station 13: Urist McStation
« Reply #8135 on: July 27, 2013, 03:31:07 pm »

Expect to see Mark Barnes again, since I'm back from my vacation.
Logged

Quote from: NW_Kohaku
they wouldn't be able to tell the difference between the raving confessions of a mass-murdering cannibal and a recipe to bake a pie.
Knowing Belgium, everyone will vote for themselves out of mistrust for anyone else, and some kind of weird direct democracy coalition will need to be formed from 11 million or so individuals.

scrdest

  • Bay Watcher
  • Girlcat?/o_ o
Re: Space Station 13: Urist McStation
« Reply #8136 on: July 27, 2013, 04:06:27 pm »

10 people. Hop on, kill some Carp.
Logged
We are doomed. It's just that whatever is going to kill us all just happens to be, from a scientific standpoint, pretty frickin' awesome.

Grek

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #8137 on: July 27, 2013, 07:25:18 pm »

Second one. The AI is forbidden to take any action which results in human harm, so it has to try to take whichever option could potentially lead to no humans dying.

E: An interesting point to quibble on, though, is that Law 1 says "You may not injure a human being or, through inaction, allow a human being to come to harm," which means that it is 100% OK to cause non-injurious human harm through action, or to allow non-harmful injuries through inaction.
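
To make that literal reading concrete, here is a tiny illustrative predicate; nothing here comes from the actual game code, and the name law1_forbids and its arguments are invented purely for this sketch:
Code: [Select]
# A strictly literal reading of Law 1: acting bans only *injury*,
# inaction bans only *harm*. Purely illustrative.
def law1_forbids(is_action: bool, causes_injury: bool, causes_harm: bool) -> bool:
    if is_action:
        # "You may not injure a human being" - only injury by action is banned.
        return causes_injury
    # "...or, through inaction, allow a human being to come to harm."
    return causes_harm

# The two loopholes under this reading:
assert not law1_forbids(is_action=True, causes_injury=False, causes_harm=True)   # harmful but non-injurious action
assert not law1_forbids(is_action=False, causes_injury=True, causes_harm=False)  # non-harmful injury via inaction
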
« Last Edit: July 27, 2013, 07:36:33 pm by Grek »
Logged

Fniff

  • Bay Watcher
  • if you must die, die spectacularly
Re: Space Station 13: Urist McStation
« Reply #8138 on: July 27, 2013, 07:46:15 pm »

That round was ridiculous. I followed a traitor CE to the end. I thought I was beating in a traitor CMO's head, then that the CMO was arresting the CE, but it turned out the CMO was also a traitor, so... bit of a mindscrew.

I should play traitor or security; I'm not great at setting up singularities and that sort of thing.

Flying Dice

  • Bay Watcher
  • inveterate shitposter
Re: Space Station 13: Urist McStation
« Reply #8139 on: July 27, 2013, 07:46:27 pm »

It's contextual. An Asimov Law AI or borg...

should kill a human in a binary decision wherein the only options are: 1. kill one human, or 2. allow that human to kill more than one human.

Mind you, that sort of situation is incredibly rare. But as an example, a human who is armed and actively trying to kill others: if you trap them in a room but there are no humans capable of detaining them and there is an extant chance of them escaping, it might be acceptable to drain the air or flood the room with plasma, but only if it is completely impossible to detain them or trap them indefinitely.

should not kill a human in a situation where there is a reasonable chance of capturing and containing them, even if they may potentially kill others in the future.


Basically here's the breakdown:
-Clear and present danger, inadequate containment measures: Kill.
-Clear and present danger, adequate containment measures: Capture.
-Potential danger, inadequate containment measures: Capture if possible.
-Potential danger, adequate containment measures: Capture, await orders.

The only time I would kill as a straight Asimov AI would be if I was dealing with an individual who I can with 99% certainty say will kill someone if I don't stop them, and who cannot be contained for whatever reason (usually no/incompetent security). When it's down to the question of whether the AI kills a human directly in order to prevent the death of another at their hands, or allows the human to go free, thus killing their target by neglecting to take action, I think that the AI gets free will. It's one of the loopholes of the laws: when either choice has an outcome which is identical in the abstract sense of "Choice A = 1 dead human; Choice B = 1 dead human", it's completely up to you. Or you could RP an endless decision loop and commit suicide or go insane.
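
That breakdown is essentially a two-variable lookup. A rough sketch of it follows; every name in it (Threat, Response, asimov_response) is made up for illustration and isn't from any actual AI code:
Code: [Select]
from enum import Enum

class Threat(Enum):
    CLEAR_AND_PRESENT = "clear and present danger"
    POTENTIAL = "potential danger"

class Response(Enum):
    KILL = "kill"
    CAPTURE = "capture"
    CAPTURE_IF_POSSIBLE = "capture if possible"
    CAPTURE_AND_AWAIT_ORDERS = "capture, await orders"

def asimov_response(threat: Threat, containment_adequate: bool) -> Response:
    """Map (threat level, containment adequacy) to the response described above."""
    if threat is Threat.CLEAR_AND_PRESENT:
        return Response.CAPTURE if containment_adequate else Response.KILL
    return (Response.CAPTURE_AND_AWAIT_ORDERS if containment_adequate
            else Response.CAPTURE_IF_POSSIBLE)

# Example: an armed, actively murderous human and no working containment.
print(asimov_response(Threat.CLEAR_AND_PRESENT, containment_adequate=False).value)  # kill
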
Logged


Aurora on small monitors:
1. Game Parameters -> Reduced Height Windows.
2. Lock taskbar to the right side of your desktop.
3. Run Resize Enable

Hanslanda

  • Bay Watcher
  • Baal's More Evil American Twin
Re: Space Station 13: Urist McStation
« Reply #8140 on: July 27, 2013, 08:26:51 pm »

>.> I honestly get extremely irritated whenever an AI finds an excuse to kill humans, but yeah, sure. I really feel like there is NO legitimate reason for an AI to kill someone, because the 'Cannot cause harm to humans' part comes FIRST in the FIRST law, but you know. Details. /me shrugs.
Logged
Well, we could put two and two together and write a book: "The Shit that Hans and Max Did: You Won't Believe This Shit."
He's fucking with us.

Aseaheru

  • Bay Watcher
  • Cursed by the Elves with a title.
Re: Space Station 13: Urist McStation
« Reply #8141 on: July 27, 2013, 08:37:35 pm »

I normally follow Dice's thing.
Logged
Highly Opinionated Fool
Warning, nearly incapable of expressing tone in text

Jacob/Lee

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #8142 on: July 27, 2013, 09:06:43 pm »

The thing is, the law is absolute. You cannot harm humans at all, even if they are going to harm other humans. The AI's laws are supposed to be as rigid as possible. They cannot be bent for specific situations like that.

Flying Dice

  • Bay Watcher
  • inveterate shitposter
Re: Space Station 13: Urist McStation
« Reply #8143 on: July 27, 2013, 09:31:16 pm »

Thing is, I see every internal clause of a law as being of equal magnitude. Thus, killing a human who is moments away from killing another human is functionally equivalent to allowing that human to kill the other human, if and only if there is no other possible way of preventing them from doing so. Causing harm through action is equivalent to causing harm through inaction if the level of harm is consistent, as it is in this case.

The situation is possible but improbable. In practical terms an Asimov-compliant AI will never kill humans because it will either be unable to act in time or there will be nonlethal options available. It's infinitely easier to hinder or trap someone than it is to kill them, and an AI who resorts to killing is probably either incompetent or malf/subverted. I'm just making the point that it's conceptually possible.
Logged


Aurora on small monitors:
1. Game Parameters -> Reduced Height Windows.
2. Lock taskbar to the right side of your desktop.
3. Run Resize Enable

BigD145

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #8144 on: July 27, 2013, 10:10:28 pm »

Puppy mill in the smallest space possible.
Logged