Bay 12 Games Forum


Poll

Gentlemen, I feel that it is time we go to....

PURPLE
- 0 (0%)
ALERT
- 0 (0%)
(I need suggestions is what I'm saying.)
- 0 (0%)

Total Members Voted: 0



Author Topic: Ethical Dilemmas: PURPLE ALERT  (Read 36795 times)

Felius

  • Bay Watcher
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #510 on: July 14, 2011, 10:27:37 am »

The year is 2050.
(...)
With a button press you can destroy all these petty things people are so proud of.

Do you do it?

This dilemma doesn't really work. People will keep hating. Unless you literally make it so that no person differs from any other person in any way, including thoughts, beliefs, where they live, etc., people will find something to hate each other over.

Even in that case... what sort of "anonymous" person are you referring to? Everyone has to become someone anonymous.
So why not make everyone like me, with that button? Wouldn't it be the same? It would be even better, because "we" wouldn't kill ourselves. We might argue about who's the original one, but we'd go with direct democracy, because each of us would think the same thoughts...
and nothing would happen ever again.
Unless you synchronize everyone's thoughts after the equalization, they would develop into distinct, unique, discrete individuals. If you do synchronize the thoughts, you are creating a hive mind, which is a whole other story.
Logged
"Why? We're the Good Guys, aren't we?"
"Yes, but that rather hinges on doing certain things and not doing others." - Paraphrased from Discworld.

shadenight123

  • Bay Watcher
  • Death. To all. Except my dwarves.
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #511 on: July 14, 2011, 10:55:08 am »

Different thoughts, or different memories? The two are similar, but on a whole different scale.
Let's take three subjects: A, B and C. A white European, a black African and a Chinese man.
The European is a fanatical Catholic Nazi, the Chinese man an overzealous nationalist and destroyer of all religions, and the African a fervent Muslim of the hardline sort.
(To put it bluntly: take three completely different people, with three different ideals, faiths and the like.)
Now they all become A1, A2 and A3. They are all physically the same, but their minds are different; their faith, and everything else non-physical, is kept.
The "Catholic Nazi" might be fine with it (Nazism WAS about purity of race, and if all races are the same, only the pure race remains), but Catholicism might clash with Islam over women's rights (this takes the WORST possible reading of ALL the religions, mind you, so I'm speaking only in extremes), and the nationalist might clash not only over ideas but over birthplace: "I was born in great mother China, I am better!"

So the physical side is one thing, but it's not the complete answer.

Now you transform them again: A, B and C, but you copy-paste A's (or B's, or C's) mind-pattern into the others.
What you obtain is best known as multiple personality disorder. A black African Catholic Nazi? (Stressing the Nazi part, not the Catholic part.)
They would start to purge each other, because they'd be an affront to themselves; maybe some might try to change their way of thinking...
(Disney reference, anyone? You hate pandas, you become a panda, you start loving pandas, otherwise you might end up stuffed on a wall?)
But in the end, it wouldn't change much.

Now you transform them all, physically and mentally: A, A and A.
While it might be bothersome not to be able to recognise the original A among them, it's not as if anyone is going to suffer from it.
With "no sexuality" (we shall consider all three A's to be asexual, reproducing through a strict cloning system), the problem of "fighting" over procreation is lessened.
Now, the geographical placement of said individuals might be a problem (being an A in Europe is different from being an A in the Saharan desert), but you adapt; we are all different as races and in our thinking because of our adaptation to the life around us.
Adapting makes you different, at first, but then you just repeat the process through cloning, and not in the "superhuman" Nietzsche sense of the term, where the rest should die.
You simply clone everyone with the genes of the best adapted, creating a new "batch" of clones, all identical, all superior to the preceding generation, and you let the old one die in peace.
You repeat the process until you obtain precisely what was asked: a generation of all-identical beings, of the same skin.
The intellectual part is even easier. Starting from generation one, you teach them everything except religion, of any kind; only science and the like. It might be unfair to the faithful, but hey, if a god truly exists he'll show his face again to these new people who have no idea what a god is.
So generation one grows up without knowing religion, generation two as well, and you proceed like that.
Then you simply place before them a grand plan, not a tiny little one, but one that concentrates their resources to the limit, like "conquer the ENTIRE galaxy" (a longer and harder plan? better!).
That way, provided you give each of them exactly the same things (one spoon each, one bed each, one teddy bear each), they will grow up identical in some but not all experiences (you obviously can't rule out the odd toothache or stomach ache), but you do not need to synchronize their thoughts.
You could still build such a capability and use it as a sort of "corrective" measure, should the need ever arise in a society that is supposed to be "perfect", as the question asks.
That way a criminal (guilty of what? suicide? you just clone him again and he's fine and alive) gets resynchronized with the thoughts of the best worker/soldier/control unit, and is back to work. Obviously his mind is searched to understand what made him stray, and things get corrected.
In this way you achieve peace, tolerance and everything nice: the complete annihilation of everything wrong, where "wrong" means anything different from the "best possible outcome, in the best of all possible worlds", as in Voltaire's Candide.
Logged
“Well,” he said. “We’re in the Forgotten hunting grounds I take it. Your screams just woke them up early. Congratulations, Lyara.”
“Do something!” she whispered, trying to keep her sight on all of them at once.
Basileus clapped his hands once. The Forgotten took a step forward, attracted by the sound.
“There, I did something. I clapped. I like clapping,” he said. -The Investigator And The Case Of The Missing Brain.

Cthulhu

  • Bay Watcher
  • A squid
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #512 on: July 14, 2011, 12:57:35 pm »

A fat man is on a bridge. You can push him off, and he'll land in a basket of orphans. If you don't push him off, he'll push you off, and you'll land in a barrel of puppies. At the same time, the Bugatti that God just made (so expensive even He can't afford it) stalls on a train track. That's irrelevant though.

Also it's the Matrix.

What do you do?
Logged
Shoes...

Glowcat

  • Bay Watcher
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #513 on: July 14, 2011, 01:14:32 pm »

A fat man is on a bridge. You can push him off, and he'll land in a basket of orphans. If you don't push him off, he'll push you off, and you'll land in a barrel of puppies. At the same time, the Bugatti that God just made (so expensive even He can't afford it) stalls on a train track. That's irrelevant though.

Also it's the Matrix.

What do you do?

Get ye flask?

Dilemma ripped from SG-1: Your universe is doomed but there may be a way to save it by traveling to a parallel universe and taking something vital of theirs to save your own. This will doom the parallel universe. Would the responsibility of saving your own world (strong in-group) be worth the sabotage of the other universe (out group you'll probably never ever see again)?

Dilemma 2: Same doomed universe premise as the first, but this time you aren't even sure you can save your own. The only possibility to save your universe would be to create your own universes (assume you can do that), accelerating their development and then violently destroying them in an experiment that will potentially reveal the way to save your universe. This accelerated development process will take long enough that there is a significant chance for intelligent life to have formed in each attempt at a universe and the experiment requires that you fire it while such life is likely present. How many doomed universes would you create to save your own?
Logged
Totally a weretrain. Very much trains!
I'm going to steamroll this house.

Felius

  • Bay Watcher
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #514 on: July 14, 2011, 01:45:03 pm »

A fat man is on a bridge. You can push him off, and he'll land in a basket of orphans. If you don't push him off, he'll push you off, and you'll land in a barrel of puppies. At the same time, the Bugatti that God just made (so expensive even He can't afford it) stalls on a train track. That's irrelevant though.

Also it's the Matrix.

What do you do?

Get ye flask?

Dilemma ripped from SG-1: Your universe is doomed but there may be a way to save it by traveling to a parallel universe and taking something vital of theirs to save your own. This will doom the parallel universe. Would the responsibility of saving your own world (strong in-group) be worth the sabotage of the other universe (out group you'll probably never ever see again)?

Dilemma 2: Same doomed universe premise as the first, but this time you aren't even sure you can save your own. The only possibility to save your universe would be to create your own universes (assume you can do that), accelerating their development and then violently destroying them in an experiment that will potentially reveal the way to save your universe. This accelerated development process will take long enough that there is a significant chance for intelligent life to have formed in each attempt at a universe and the experiment requires that you fire it while such life is likely present. How many doomed universes would you create to save your own?
You can't get ye flask.

Also, about the two dilemmas: I would destroy the other universe, or as many universes as it takes. Ethical? No. In fact, the second one can be read as quite evil. On the other hand, I really don't want the universe I live in, with the people I care about, to be destroyed. If I could migrate myself and my group to a non-doomed universe, the answer might differ.
Logged
"Why? We're the Good Guys, aren't we?"
"Yes, but that rather hinges on doing certain things and not doing others." - Paraphrased from Discworld.

Soadreqm

  • Bay Watcher
  • I'm okay with this. I'm okay with a lot of things.
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #515 on: July 14, 2011, 02:16:05 pm »

I'm assuming the other guy is probably going to demand somewhere over 150$, which doesn't sound unreasonable to me, and I don't really mind losing if I get more than a hundred bucks out of it.

So, you're assuming the other guy will ask for that much. But what if this assumption is unfounded?

Then I will probably lose fifty bucks. This isn't really a rational decision, mind, and any reasoning I give you will be an after-the-fact justification at best. The fun part is that depending on what people are statistically likely to choose, it might give better returns than your entirely logical strategy. I'm betting fifty hypothetical dollars that it will. :)

(Also in the situation that the other player also picks the maximum amount, we will be best friends forever.)
(Also in the situation that I have been trapped by an insane wizard and am actually playing the game against myself, cooperation always wins.)

But a more interesting point: your reasoning for choosing the lowest amount would be equally valid if the lowest amount were significantly lower. Would you still do that? Why or why not? If the range was from 2$ to 100$, would you write down two dollars for the chance to win four? I'm asking because if the range was, say, 100-150 dollars with a 100 dollar bonus/penalty, I'd probably go for the 100 dollars. If it was 100-110 dollars with a 100 dollar bonus/penalty, I'd definitely go for the 100 dollars. I'm kind of curious how small the monetary rewards need to get before winning the game no longer matters to you.
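
If anyone wants to poke at these thresholds directly, here's a minimal sketch of the payoff rule this game uses (it's the classic traveler's dilemma); I'm using the 2$-100$ range with the 2$ bonus/malus as the example numbers, since nobody upthread posted actual code:

Code: [Select]
# Traveler's-dilemma payoff: the lower claim "wins".
# On a tie both players get the common claim; otherwise the
# lower claimant gets (low + bonus) and the higher (low - bonus).
def payoff(mine, theirs, bonus):
    low = min(mine, theirs)
    if mine == theirs:
        return low
    return low + bonus if mine < theirs else low - bonus

print(payoff(99, 100, 2))   # 101: undercutting by one beats honesty
print(payoff(100, 100, 2))  # 100: mutual full claims
print(payoff(2, 100, 2))    # 4:   racing to the bottom barely pays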

About the universe-destroying thing, I don't think my human ethics really work on that. I have absolutely no way of truly understanding the consequences of either option. Also I have a machine that can create a universe. Seriously, what? What happened here? 0_0

Yeah, if there really was no better option, I probably wouldn't lose too much sleep over callously destroying arbitrary numbers of universes to save my own. Isn't there anything else I can use the universe machine for, though? Like jumping to a different universe and surviving the expiration of my own? Using time manipulation to orchestrate an eon-spanning scheme to map and relocate all life in a sacrificial universe into a THIRD universe before scrapping it for universe-stuff? Using time manipulation to spend some extra time in a temporary universe to come up with a better plan? Stealing the required ingredient from the past of my own universe, simultaneously completing and starting a stable time loop?
Logged

Kay12

  • Bay Watcher
  • Fighting for Elite Liberal values since 2009!
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #516 on: July 14, 2011, 02:40:16 pm »

For 2-100$ bets and a 2$ bonus/malus... again, it depends. Mainly on the other player, of course, as (s)he is the one actually deciding how much I should offer for an optimal outcome (for myself).

From the pure competition point of view, 2 dollars would still be best; that guarantees at least a tie. However, this isn't a zero-sum game, so scoring better than the other guy isn't vital. Also, the comparatively small bonus/malus makes mid-range prices more stable: you no longer get tossed 50$ in one direction for choosing a dollar or two too many. Both parties can gather good rewards by placing a rather high bet, as the penalty for betting too high is significantly lower.

I would absolutely not, in any case, bet 100$ unless I was sure the other guy was betting 100$ as well (mainly when trying to cooperate or to maximize collective gain). For maximizing my own profits, 99$ and 98$ are simply better: they yield the same potential (or more, in the case of 99$) with slightly less risk. And, assuming the opponent follows the same line of reasoning, trying to maximize his/her profits at my expense, we'd eventually be back at 2$.

But this is, again, technically a derailment, since it isn't really a problem of an ethical nature.
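
That "back at 2$" step can even be made mechanical. A toy sketch of the unraveling, restating the payoff rule from the sketch upthread with the same assumed 2$-100$ / 2$ settings:

Code: [Select]
# Same payoff rule as the sketch upthread:
def payoff(mine, theirs, bonus):
    low = min(mine, theirs)
    if mine == theirs:
        return low
    return low + bonus if mine < theirs else low - bonus

# Against any fixed claim c > 2 the best reply is c - 1
# (it earns c + 1 instead of c), so iterated best responses
# unravel all the way down to the 2$ floor.
def best_response(theirs, lo=2, hi=100, bonus=2):
    return max(range(lo, hi + 1),
               key=lambda mine: payoff(mine, theirs, bonus))

claim = 100                      # start from full cooperation
while best_response(claim) != claim:
    claim = best_response(claim)
print(claim)                     # 2: the unique equilibrium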
Logged
Try Liberal Crime Squad, an excellent Liberal Crime adventure game by Toady One and the open source community!
LCS in SourceForge - LCS Wiki - Forum thread for 4.04

Virex

  • Bay Watcher
  • Subjects interest attracted. Annalyses pending...
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #517 on: July 14, 2011, 03:26:32 pm »

With a button press you can destroy all these petty things people are so proud of.
And miss my only chance to witness the apocalypse? Hell no!
Logged

counting

  • Bay Watcher
  • Zenist
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #518 on: July 14, 2011, 07:10:01 pm »

For 2-100$ bets and a 2$ bonus/malus... again, it depends. Mainly on the other player, of course, as (s)he is the one actually deciding how much I should offer for an optimal outcome (for myself).

From the pure competition point of view, 2 dollars would still be best; that guarantees at least a tie. However, this isn't a zero-sum game, so scoring better than the other guy isn't vital. Also, the comparatively small bonus/malus makes mid-range prices more stable: you no longer get tossed 50$ in one direction for choosing a dollar or two too many. Both parties can gather good rewards by placing a rather high bet, as the penalty for betting too high is significantly lower.

I would absolutely not, in any case, bet 100$ unless I was sure the other guy was betting 100$ as well (mainly when trying to cooperate or to maximize collective gain). For maximizing my own profits, 99$ and 98$ are simply better: they yield the same potential (or more, in the case of 99$) with slightly less risk. And, assuming the opponent follows the same line of reasoning, trying to maximize his/her profits at my expense, we'd eventually be back at 2$.

But this is, again, technically a derailment, since it isn't really a problem of an ethical nature.

That's finally very close reasoning: people do tend to choose an amount at or just above [max - bonus] when the bonus is small enough and the [max - min] distance is large. But it brings us back to why I chose a larger bonus and a medium [max - min] range as the test settings. Experiments tell us people will not behave the same way; they choose from a very wide range of possibilities. And I wonder whether that is based on ethics or on pure logic.

1. If you don't think of being "greedy" in the first place, then you will choose the potentially harmful $200, since it rewards you not with monetary gain but with a moral gain. (You are a good guy, and if the other player is also a good guy, you both win big.)

2. If you have a little doubt about human nature, and not too much "animal spirit" in you, then you will probably choose a high middle price, for the same reason as choosing a bit above [max - bonus]. But you must resist the urge to retaliate if the other player chooses a slightly lower number; it's only fair, if you imagine yourself in their shoes on the winning side. You may not gain the most every time when others are also skeptical, but if you meet a "good guy" from case 1, you may gain more. This won't last long, though, because sooner or later everyone learns not to trust so easily.

3. If everyone keeps being calculating, things may spiral out of control: in a society without an ethical backbone, everyone decides that "if not friend, then enemy" is the better strategy, so forming a racist or Nazi-party-like society is preferred. Once a label is applied, it's a fair game again: you bet the max price with friends and the min price with everyone else.

Other combinations of strategies can exist between 1 and 3 and beyond, but if every one of them fails and society keeps breaking down into anarchy, with people skeptical of everything but themselves, then it's best to always choose the min price. You will live in fear, calculating, all your life. It's the best way to survive at the minimum, but not a way anyone would want.

Hence you can see it's really not just a game of two people and pure math, but rather a set of different choices based on what kind of society and mindset you have prepared yourself for, and even on whether a government-type or religious structure exists to maintain a "common law of life". If everyone generally thinks "good karma" and plays "dumb", that has the best overall benefit. But if you think too much and get greedy, then ironically everyone ends up poorer and society spirals into chaos. Not enough of a dilemma for you? Or have you already labeled yourself with one of these and expect everyone else to be the same?
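
To make the three mind-sets concrete, a toy round-robin sketch (the payoff rule is restated from upthread; the 100$-200$ range and 50$ bonus are placeholder values, not necessarily the settings from the original puzzle):

Code: [Select]
# Same payoff rule as the sketch upthread:
def payoff(mine, theirs, bonus):
    low = min(mine, theirs)
    if mine == theirs:
        return low
    return low + bonus if mine < theirs else low - bonus

LO, HI, BONUS = 100, 200, 50     # placeholder settings

mindsets = {
    "1. good guy (always max)":     HI,
    "2. skeptic (high-middle bid)": HI - BONUS - 1,
    "3. fearful (always min)":      LO,
}

# Everyone plays everyone (mirror matches included); average payoff:
for name, claim in mindsets.items():
    avg = sum(payoff(claim, other, BONUS)
              for other in mindsets.values()) / len(mindsets)
    print(name, "->", round(avg, 1))

# Pairwise, distrust edges out the others, but a table of nothing
# but good guys clears 200$ each while all-fearful gets 100$:
# exactly the "everyone poorer" spiral described above.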
« Last Edit: July 14, 2011, 07:17:11 pm by counting »
Logged
Currency is not excessive, but a necessity.
The stark assumption:
Individuals trade with each other only through the intermediation of specialist traders called: shops.
Nelson and Winter:
The challenge to an evolutionary formation is this: it must provide an analysis that at least comes close to matching the power of the neoclassical theory to predict and illuminate the macro-economic patterns of growth

Grek

  • Bay Watcher
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #519 on: July 15, 2011, 12:11:10 am »

For those of you that said yes to eating Wilbur, would you also say yes to eating Rubilw, who is a human being that has been genetically modified into wanting to be eaten?

E:
As for the universe-creating situation, my answer depends on whether we merely simulate the universe and extract information from it, or whether we are physically transporting contents between "separate" universes. In the former case it is perfectly fine to "sacrifice" as many non-real simulations as you need to, while in the latter case it is only justifiable if you think they could, in turn, steal one from someone down the line, and that this "steal a universe" chain continues until the missing part is stolen from a universe that can synthesize it internally.
« Last Edit: July 15, 2011, 12:18:50 am by Grek »
Logged

Realmfighter

  • Bay Watcher
  • Yeaah?
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #520 on: July 15, 2011, 12:17:13 am »

Am I allowed to find the asshole genetically modifying things so that they want to be eaten and punch them in the face?
Logged
We may not be as brave as Gryffindor, as willing to get our hands dirty as Hufflepuff, or as devious as Slytherin, but there is nothing, nothing more dangerous than a little too much knowledge and a conscience that is open to debate

Kay12

  • Bay Watcher
  • Fighting for Elite Liberal values since 2009!
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #521 on: July 15, 2011, 12:51:11 am »

For those of you that said yes to eating Wilbur, would you also say yes to eating Rubilw, who is a human being that has been genetically modified into wanting to be eaten?

I'm not really into cannibalism, but I don't see a major ethical problem with that. Really, I don't get the whole fuss about Soylent Green being people either. In a world that's rapidly running out of resources, why not eat dead humans that have been made into nutritious, edible pellets? They were even killed in a very peaceful manner (and voluntarily).


Actually, now that I've mentioned Soylent Green, why not post your opinions on the situation from the movie as well? It goes as follows: the world is overcrowded and food is scarce. The two main food sources are soy/lentil pellets called Soylent. A new brand of Soylent, Soylent Green, is released for sale. It's relatively cheap, tasty and plentiful. Then you happen to learn that Soylent Green is being produced from the bodies of dead humans. No one is killed to manufacture the product; the bodies are collected from euthanasia plants. While the actual movie had a somewhat grimmer scenario (it involved criminal activity, evil corporations and an environment in almost irrecoverable condition), let's stick with this one: what would you do if you learned that Soylent Green was made out of people?
« Last Edit: July 15, 2011, 06:52:07 am by Kay12 »
Logged
Try Liberal Crime Squad, an excellent Liberal Crime adventure game by Toady One and the open source community!
LCS in SourceForge - LCS Wiki - Forum thread for 4.04

counting

  • Bay Watcher
  • Zenist
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #522 on: July 15, 2011, 08:16:59 am »

For those of you that said yes to eating Wilbur, would you also say yes to eating Rubilw, who is a human being that has been genetically modified into wanting to be eaten?

I'm not really into cannibalism, but I don't see a major ethical problem with that. Really, I don't get the whole fuss about Soylent Green being people either. In a world that's rapidly running out of resources, why not eat dead humans that have been made into nutritious, edible pellets? They were even killed in a very peaceful manner (and voluntarily).


Actually, now that I've mentioned Soylent Green, why not post your opinions on the situation from the movie as well? It goes as follows: the world is overcrowded and food is scarce. The two main food sources are soy/lentil pellets called Soylent. A new brand of Soylent, Soylent Green, is released for sale. It's relatively cheap, tasty and plentiful. Then you happen to learn that Soylent Green is being produced from the bodies of dead humans. No one is killed to manufacture the product; the bodies are collected from euthanasia plants. While the actual movie had a somewhat grimmer scenario (it involved criminal activity, evil corporations and an environment in almost irrecoverable condition), let's stick with this one: what would you do if you learned that Soylent Green was made out of people?

This is somewhat reminiscent of the recycling tanks in Sid Meier's Alpha Centauri.

"It is every citizen's final duty to go into the tanks and become one with all the people."

(How nice if it could be used on drones; then there would be no unhappiness :P)
« Last Edit: July 15, 2011, 08:18:44 am by counting »
Logged
Currency is not excessive, but a necessity.
The stark assumption:
Individuals trade with each other only through the intermediation of specialist traders called: shops.
Nelson and Winter:
The challenge to an evolutionary formation is this: it must provide an analysis that at least comes close to matching the power of the neoclassical theory to predict and illuminate the macro-economic patterns of growth

Kay12

  • Bay Watcher
  • Fighting for Elite Liberal values since 2009!
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #523 on: July 15, 2011, 08:26:23 am »

Alpha Centauri was awesome in the field of ethical questions, by the way. While Civilization had to tread carefully to avoid demonizing any civilization, AC jokes about any major political movement and has a plethora of interesting ethical questions.
Logged
Try Liberal Crime Squad, an excellent Liberal Crime adventure game by Toady One and the open source community!
LCS in SourceForge - LCS Wiki - Forum thread for 4.04

counting

  • Bay Watcher
  • Zenist
Re: Ethical Dilemmas: PURPLE ALERT
« Reply #524 on: July 15, 2011, 09:18:36 am »

Alpha Centauri was awesome in the field of ethical questions, by the way. While Civilization had to tread carefully to avoid demonizing any civilization, AC jokes about any major political movement and has a plethora of interesting ethical questions.

It even has a solution for that: ethical calculus.

I wonder what that would look like. Alpha Centauri never bows down before political correctness.

"My gift to industry is the genetically engineered worker, or Genejack. Specially designed for labor, the Genejack's muscles and nerves are ideal for his task, and the cerebral cortex has been atrophied so that he can desire nothing except to perform his duties. Tyranny, you say? How can you tyrannize someone who cannot feel pain?"

  - Chairman Sheng-ji Yang, "Essays on Mind and Matter"
« Last Edit: July 15, 2011, 09:24:34 am by counting »
Logged
Currency is not excessive, but a necessity.
The stark assumption:
Individuals trade with each other only through the intermediation of specialist traders called: shops.
Nelson and Winter:
The challenge to an evolutionary formation is this: it must provide an analysis that at least comes close to matching the power of the neoclassical theory to predict and illuminate the macro-economic patterns of growth