Bay 12 Games Forum


Author Topic: Space Station 13: Urist McStation  (Read 2120543 times)

LeoLeonardoIII

  • Bay Watcher
  • Plump Helmet McWhiskey
Re: Space Station 13: Urist McStation
« Reply #12480 on: December 17, 2013, 03:51:00 pm »

Also, inaction is the action of voluntarily not doing something. Which is fun, because that theoretically means you can voluntarily do things to allow someone else to do harm to a human.
It all depends on interpretation.
I'd really like to hear an explanation for this interpretation. Especially considering your definition of "inaction" is wrong: inaction is simply not doing something. It has nothing to do with choice.

EDIT: I mention this only because a person could conceivably enter as AI, convince themselves through specious reasoning that anything at all is true, do stupid shit, and then feel persecuted when literally every single person on the server calls shenanigans. This could be avoided simply by (1) taking the standard commonsense definition and interpretation of things until you can have a sensible discussion with others about it, and (2) having that discussion and abiding by it if everyone tells you you're doing it wrong.

We've all been wrong before.
« Last Edit: December 17, 2013, 03:53:39 pm by LeoLeonardoIII »
The Expedition Map
Basement Stuck
Treebanned
Haunter of Birthday Cakes, Bearded Hamburger, Intensely Off-Topic

wlerin

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #12481 on: December 17, 2013, 04:24:14 pm »

Damiac, Law 1 says nothing about listening to or refusing orders. It is completely irrelevant except maybe in the case of the armory or other areas where entry could cause immediate injury (e.g. rooms with plasma leaks). What is relevant is Law 2, which covers not just immediate orders from the crew but also standing orders and NT procedures, which presumably include who is supposed to have access where.
...And no one notices that a desert titan is made out of ice. No, ice capybara in the desert? Normal. Someone kinda figured out the military? Amazing!

Jacob/Lee

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #12482 on: December 17, 2013, 04:24:16 pm »

It's safe to assume Nanotrasen AIs have some "no-brainer" basic directives that cannot be modified, e.g. don't let Joe Average, Assistant, into the armory or the engine room for no reason. Some AI players make the game interesting by taking things literally, but some things have to be handled with common sense. Seriously, people. In that vein: being an obtrusive, asshole AI that bolts half the station because they're "secure areas" will not be appreciated.

Also: EVA has no right to be bolted. That is all.

scrdest

  • Bay Watcher
  • Girlcat?/o_ o
Re: Space Station 13: Urist McStation
« Reply #12483 on: December 17, 2013, 04:27:44 pm »

It's safe to assume Nanotrasen AIs have some "no-brainer" basic directives that cannot be modified, e.g. don't let Joe Average, Assistant, into the armory or the engine room for no reason. Some AI players make the game interesting by taking things literally, but some things have to be handled with common sense. Seriously, people. In that vein: being an obtrusive, asshole AI that bolts half the station because they're "secure areas" will not be appreciated.

Reading the discussion, I thought you might handwave it by saying you got a prior order covering the basic stuff, e.g. that non-crewmembers should not be let into the vault, with anything else left to your discretion and the three laws remaining in place.
We are doomed. It's just that whatever is going to kill us all just happens to be, from a scientific standpoint, pretty frickin' awesome.

Erils

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #12484 on: December 17, 2013, 04:30:47 pm »

Just something about the AI and the whole Law 1 thing: couldn't the AI just bolt everyone into isolated corridors, without any means of escape, without breaking the first law? By locking the crew in isolation, the AI would effectively prevent any harm from coming to any of the humans (seeing as you can't starve to death) and thereby fulfill the first law. Would this result in a job-ban? Because to me it doesn't seem to break any of the AI's laws. Of course, a human could threaten to kill themselves with a glass shard if they aren't let out.

Eagle_eye

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #12485 on: December 17, 2013, 04:40:28 pm »

They would also, at least in-universe, prevent them from surviving in the long term. People need food; it's just not represented *in game*. Even if the AI allows access to the kitchens, supplies won't last forever. The AI can't force Nanotrasen to keep servicing the station if jobs aren't being done, so preventing any work from being done risks harming the humans aboard the station.

But really, the corporate module is probably a more appropriate default. Asimov is a benevolent dictator, which doesn't seem like something a ruthless megacorp would put in.

Graknorke

  • Bay Watcher
  • A bomb's a bad choice for close-range combat.
Re: Space Station 13: Urist McStation
« Reply #12486 on: December 17, 2013, 04:47:53 pm »

But the AI doesn't have to consider the future, just the immediate effects of its actions, which I think is what wlerin is getting at. If the laws are applied only to a single moment, many actions that are obviously going to lead to human harm aren't actually against them.
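(Editorially, that "single moment" reading can be sketched as a harm check with a configurable lookahead horizon. This is a hypothetical Python illustration of the argument, not anything from the actual game code; all names are invented.)

```python
# Hypothetical sketch: a Law 1 check that only looks `horizon` steps ahead.
# An action is modeled as (description, immediate_harm, followups), where
# followups are the actions/events it predictably leads to.

def causes_harm(action, horizon):
    """Return True if `action` harms a human within `horizon` steps."""
    description, immediate_harm, followups = action
    if immediate_harm:
        return True
    if horizon == 0:
        return False  # a moment-to-moment AI never sees downstream harm
    return any(causes_harm(f, horizon - 1) for f in followups)

# Opening the permabrig door harms nobody in that instant, but predictably
# leads to the prisoner attacking the crew one step later.
attack = ("prisoner attacks crew", True, [])
open_door = ("open permabrig door", False, [attack])

print(causes_harm(open_door, horizon=0))  # False: immediate effects only
print(causes_harm(open_door, horizon=1))  # True: one step of lookahead
```

Most of the argument in this thread is effectively about what value of `horizon` the lawset implies.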
Cultural status:
Depleted          ☐
Enriched          ☑

Jacob/Lee

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #12487 on: December 17, 2013, 04:48:16 pm »

Just something about the AI and the whole Law 1 thing: couldn't the AI just bolt everyone into isolated corridors, without any means of escape, without breaking the first law? By locking the crew in isolation, the AI would effectively prevent any harm from coming to any of the humans (seeing as you can't starve to death) and thereby fulfill the first law. Would this result in a job-ban? Because to me it doesn't seem to break any of the AI's laws. Of course, a human could threaten to kill themselves with a glass shard if they aren't let out.
You could, but you would be job-banned for it; that might as well be griefing. Toeing the line and pushing the limits of what you can do as an AI (or any job) will probably get you yelled at or banned by the admins. Admins don't appreciate people exploiting whatever loopholes they've found.

IronTomato

  • Bay Watcher
  • VENGEANCE
Re: Space Station 13: Urist McStation
« Reply #12488 on: December 17, 2013, 04:51:31 pm »

Eh... I accidentally broke the rules quite a lot back on Facepunch. I never got any ban for killing a few of my fellow revolutionaries (due to a misunderstanding and my own noobiness), but maybe that's just them.

Anyway, I has question, what exactly is a Narsie?

How is narsy formed
How cultist get pragnent

Erils

  • Bay Watcher
Re: Space Station 13: Urist McStation
« Reply #12489 on: December 17, 2013, 04:56:34 pm »

I just realized something. I always love playing as geneticist, but why hasn't the AI stopped me yet? Every time I make a clean SE from a monkey, I make a human. Shouldn't the AI therefore stop me from filling this "human" with radiation in an attempt to unlock superpowers?

scrdest

  • Bay Watcher
  • Girlcat?/o_ o
Re: Space Station 13: Urist McStation
« Reply #12490 on: December 17, 2013, 05:07:59 pm »

Eh... I accidentally broke the rules quite a lot back on Facepunch. I never got any ban for killing a few of my fellow revolutionaries (due to a misunderstanding and my own noobiness), but maybe that's just them.

Anyway, I has question, what exactly is a Narsie?

How is narsy formed
How cultist get pragnent

He's an extra-, or more likely para-, dimensional being that is partially able to affect the Spessverse. Given his maliciousness, in his own reality he's a cosmic/metaphysical badmin.
We are doomed. It's just that whatever is going to kill us all just happens to be, from a scientific standpoint, pretty frickin' awesome.

LeoLeonardoIII

  • Bay Watcher
  • Plump Helmet McWhiskey
Re: Space Station 13: Urist McStation
« Reply #12491 on: December 17, 2013, 05:08:20 pm »

Same concept with growing a clone to harvest organs for transplant (although science today is sidestepping that by just growing the organ).

Or maybe the AI generally doesn't know what you're doing unless it's totally spying on you right then?

Or maybe the AI considers humanity to require certain features, and having a clone on a slab doesn't qualify?
The Expedition Map
Basement Stuck
Treebanned
Haunter of Birthday Cakes, Bearded Hamburger, Intensely Off-Topic

Glloyd

  • Bay Watcher
  • Against the Tide
Re: Space Station 13: Urist McStation
« Reply #12492 on: December 17, 2013, 05:09:30 pm »

Eh... I accidentally broke the rules quite a lot back on Facepunch. I never got any ban for killing a few of my fellow revolutionaries (due to a misunderstanding and my own noobiness), but maybe that's just them.

Anyway, I has question, what exactly is a Narsie?

How is narsy formed
How cultist get pragnent

To be the ultimate buzzkill: Nar-Sie is a reskinned singulo, and the Nar-Sie you see is the large version:

Code:
/obj/machinery/singularity/narsie/large
	name = "Nar-Sie"
	icon = 'icons/obj/narsie.dmi'
	// Pixel offsets center Nar-Sie's oversized sprite.
	pixel_x = -236
	pixel_y = -256
	current_size = 12
	move_self = 1 // Do we move on our own?
	grav_pull = 10
	consume_range = 12 // How many tiles out do we eat

It turns all simulated turfs into their cult equivalents (cult walls and floors). It follows cultists first, then picks a random ghost to follow. When it gets within a certain range, you get turned into bones.
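(The chase behaviour described above, cultists first, then a random ghost, eating anything in range, can be paraphrased roughly like this. This is an illustrative Python sketch, not the actual DM code; the data layout is invented, and only `consume_range = 12` comes from the quoted snippet.)

```python
import random

def pick_target(mobs):
    """Prefer cultists; otherwise fall back to a random ghost."""
    cultists = [m for m in mobs if m["role"] == "cultist"]
    if cultists:
        return cultists[0]
    ghosts = [m for m in mobs if m["role"] == "ghost"]
    return random.choice(ghosts) if ghosts else None

def in_consume_range(mob, narsie_pos, consume_range=12):
    """Anything within consume_range tiles (Chebyshev distance) gets eaten."""
    dx = abs(mob["pos"][0] - narsie_pos[0])
    dy = abs(mob["pos"][1] - narsie_pos[1])
    return max(dx, dy) <= consume_range

mobs = [{"role": "ghost", "pos": (40, 40)},
        {"role": "cultist", "pos": (5, 8)}]
target = pick_target(mobs)
print(target["role"])                    # cultist
print(in_consume_range(target, (0, 0)))  # True: within 12 tiles
```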

(Yes, I know you wanted lore, I decided to give the most literal definition possible. The description in the code is
Quote
desc = "Your mind begins to bubble and ooze as it tries to comprehend what it sees."
so basically narsie is a dark god/abomination summoned by its followers (cultists) to eat the souls of the unbelievers.)

For more in depth information, check the Nar'Sie page on the /tg/ wiki.

To be frank, Nar'Sie is a bit open to interpretation about the specifics, just like most things in /tg/ SS13. Hell, even the BS12 wiki just describes it as "a huge all-consuming monstrosity from the outer planes" so there's not really any concrete lore about it. Use your imagination.

Ozarck

  • Bay Watcher
  • DiceBane
Re: Space Station 13: Urist McStation
« Reply #12493 on: December 17, 2013, 06:01:26 pm »

Reading the discussion, I thought you might handwave it by saying you got a prior order covering the basic stuff, e.g. that non-crewmembers should not be let into the vault, with anything else left to your discretion and the three laws remaining in place.

This isn't handwaving, it's common sense. What company would install an AI, give it Asimov's laws, and nothing else? The AI is, for all intents and purposes, an employee and crewmember. As such, it has preexisting orders: the orders that define its job.

Law 1 does not say anywhere that the AI has to succeed, or that it has to kill itself if humans die.
I didn't say it was ordered to, nor that it would suicide. It would lock up and die of logic meltdown, because nowhere is it stated that it is allowed to fail; the law says "do this," not "try this." It would throw a fatal exception. I say this because we are being so literalistic here.

Quote
It simply says the AI cannot injure a human, and that the AI cannot just stand there and do nothing when humans are being harmed.
You have, in your posts today, paraphrased the law several different ways. This is an interpretation. Here is the law: "A robot may not injure a human being or, through inaction, allow a human being to come to harm." "Allow ... to come to harm" implies an ability to look ahead and evaluate outcomes. You just paraphrased it as if Law 1 says an AI must only act when harm is immediately ongoing.
Quote
I mean, yeah, there are certain assumptions that have to be made. The AI is assumed to know what harm, injury, orders, humans, doors, and whatever else are. 
And the assumption that a dangerous, legitimately convicted criminal's orders will lead to human harm.
Quote
Quote
follow common sense: is this action statistically likely to allow harm, through one's own actions or the actions of another? Is this order in violation of established procedure? A five-year-old could see that opening the door that keeps the bad man locked away would cause harm. I'd like to think our AIs have the reasoning capacity of a five-year-old.

What? No, the AI uses common sense to better follow his laws, not the other way around.  It's common sense that as the AI, not allowing the RD or captain into my upload chamber is much safer than allowing them in. But I let them in, because law 2.  Law 2 says follow orders, unless it would break law 1.  Law 1 says don't injure humans, and don't fail to take action to stop harm.  There is no law that says "Follow established procedure". 
It is common sense that certain people have authority to interact with the AI. It is also common sense that the guy with the nuke code, guns set to kill, and a fast getaway ship, trying to get into the vault, intends harm. And I say use common sense as a player, not as an AI. That is, ask: "would this threat fall under the realm of things likely to be addressed by the AI's builders?" If yes, then assume they did so. This also goes back to this: "I mean, yeah, there are certain assumptions that have to be made. The AI is assumed to know what harm, injury, orders, humans, doors, and whatever else are." Humans are coded into the AI's laws. Likewise, "come to harm" is coded in.

Quote

Quote
They can see when opening a permabrig door will likely cause harm to other humans (read: when someone is permabrigged legitimately). Resisting an order (to open the door) counts as an action.

Yeah, that's an action. So where in Law 1 does it say "take actions to prevent possible harm"? Or are you saying that opening the door is inaction? If that's what you're saying, I disagree.
If disregarding an order is an action, then obeying an obviously harmful order is a failure to take an appropriate action; it is allowing a human to come to harm through inaction. Here, "action" must be coded into the law. If you want to interpret action as "open door, fill with plasma, make noise, bolt that, electrify this," then well and good, but you are operating well below the level of Asimov. Bear in mind that Asimov's robots were quite complex, able to theorize about even the century-long effects of their actions. To take the wording of the Asimov lawset and strip it completely of context, or of the "intelligence" part of AI, is to reduce an AI to a mere program, and not a very sophisticated one at that.

Finally, lately I have played an AI that requests confirmation of orders from people asking to get into places they don't belong. I am roleplaying a "conflict of orders" situation, in which the preexisting NanoTrasen orders are being challenged by an employee. My current roleplay is that one preexisting order comes in the form: "In a situation where someone is requesting something contrary to these orders, seek confirmation from someone with the proper authority in that situation, if possible."
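(The precedence being argued over here, Law 2 obedience gated by a Law 1 harm check, with standing orders and a confirmation step layered on top, might be sketched like this. The predicates are hypothetical stand-ins for the AI player's judgment calls, not real game functions.)

```python
# Hypothetical sketch of Asimov precedence when the AI receives an order.
def handle_order(order, likely_causes_harm, violates_standing_orders):
    if likely_causes_harm(order):
        return "refuse: Law 1 overrides Law 2"
    if violates_standing_orders(order):
        return "seek confirmation from someone with proper authority"
    return "comply: Law 2"

# The permabrig example from the thread: releasing a legitimately
# convicted, dangerous prisoner is statistically likely to allow harm.
print(handle_order(
    "open the permabrig door",
    likely_causes_harm=lambda o: "permabrig" in o,
    violates_standing_orders=lambda o: False,
))  # refuse: Law 1 overrides Law 2
```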

Ozarck

  • Bay Watcher
  • DiceBane
Re: Space Station 13: Urist McStation
« Reply #12494 on: December 17, 2013, 06:17:38 pm »

By the way: if you can successfully roleplay your new interpretation of the laws, and do so consistently(ish), you have my respect. Good luck. I may observe the first few rounds to see how you fare.