Bay 12 Games Forum


Poll

Gentlemen, I feel that it is time we go to....

PURPLE
- 0 (0%)
ALERT
- 0 (0%)
(I need suggestions is what I'm saying.)
- 0 (0%)

Total Members Voted: 0


Pages: 1 ... 11 12 [13] 14 15 ... 35

Author Topic: Ethical Dilemmas: PURPLE ALERT  (Read 36889 times)

Bauglir

  • Bay Watcher
  • Let us make Good
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #180 on: June 29, 2011, 08:33:15 pm »

Hmm... Strict utilitarianism (in which there is no distinction between action and inaction, and therefore no course is better than the other, so you're free to choose whatever) versus application of the categorical imperative... Well, since a correctly long-sighted utilitarian view includes that the murder sets a precedent that you wouldn't want to exist in society (thus violating the categorical imperative, since you can't endorse that it be the correct response for everyone), I'm gonna have to go with let the loved one die. Short term consequences are identical, long-term I have to temper my instinctive preference with an understanding that I can't support a system whereby murdering a dude is how you get organs.

That said, I'm not sure that this would be my reasoning. Unfortunately, I have problems sticking with one philosophy because I always seem to encounter corner cases that aren't acceptable. So take this with a grain of salt.
Logged
In the days when Sussman was a novice, Minsky once came to him as he sat hacking at the PDP-6.
“What are you doing?”, asked Minsky. “I am training a randomly wired neural net to play Tic-Tac-Toe” Sussman replied. “Why is the net wired randomly?”, asked Minsky. “I do not want it to have any preconceptions of how to play”, Sussman said.
Minsky then shut his eyes. “Why do you close your eyes?”, Sussman asked his teacher.
“So that the room will be empty.”
At that moment, Sussman was enlightened.

Fenrir

  • Bay Watcher
  • The Monstrous Wolf
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #181 on: June 29, 2011, 08:41:11 pm »

If murdering him would be the right choice, should you not murder him? If this act is right, would you not be supporting a system where people do the right thing? If it was not the right thing, and therefore something that others should not be doing, you should not do it at all.

Why would you think that you were supporting anything anyway? Unless you reveal yourself to be the murderer, I very much doubt that your act will make others commit acts of murder to get organs, so I do not see why you would be supporting anything.

ChairmanPoo

  • Bay Watcher
  • Send in the clowns
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #182 on: June 29, 2011, 08:53:29 pm »

Quote
Well, since a correctly long-sighted utilitarian view includes that the murder sets a precedent
Only if you get caught, and the premise establishes that you're considering it precisely because that is unlikely to happen.
Everyone sucks at everything. Until they don't. Not sucking is a product of time invested.

Bauglir

  • Bay Watcher
  • Let us make Good
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #183 on: June 29, 2011, 08:59:31 pm »

If murdering him is the right choice, then obviously I should murder him. But if I look at it from a strictly utilitarian perspective where I ignore whatever emotional attachment I feel to my loved one because the man I murder likely has similar attachments, and I cannot be sure of their worth without asking so many questions as to ruin the plan, then I must come to the conclusion that, to the best of my knowledge, each potential death has the same negative value. Because it doesn't matter whether I act or fail to act, and the outcome is the same either way, I'm free to choose whichever course. Neither is worse than the other, so I can do whatever I feel like (which is save my loved one, instinctively). Essentially, I conclude that the murder is neither right nor wrong and is therefore permissible.

However, committing the murder violates the categorical imperative (a concept that I think was formalized by Kant, if anyone's wondering and wants to check me on this, and I'm sure at least half of you are already familiar with this) that basically goes, "It is immoral to perform an act which you would not recommend everyone perform under similar circumstances." Similar in principle to the Golden Rule, but subtly different in that it elevates it from "How do you want to be treated?" to "How do you want society to function?". Even if it's essentially certain that nobody will ever learn of this particular act, performing the murder is tantamount to accepting a society where such murders are standard procedure, as long as nobody can prove they happened. In retrospect, a theoretical utilitarian wouldn't need to accept this, but in practical terms you are never guaranteed secrecy and must therefore accept at least an approximation of it, since sooner or later at least one murder by at least one utilitarian following this logic would be discovered and the course of action thereby lent some support.

Fenrir

  • Bay Watcher
  • The Monstrous Wolf
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #184 on: June 29, 2011, 09:06:55 pm »

I have a feeling that I should have understood; apologies for making you need to be more verbose.

If murdering him is the right choice, then obviously I should murder him. But if I look at it from a strictly utilitarian perspective where I ignore whatever emotional attachment I feel to my loved one because the man I murder likely has similar attachments, and I cannot be sure of their worth without asking so many questions as to ruin the plan, then I must come to the conclusion that, to the best of my knowledge, each potential death has the same negative value. Because it doesn't matter whether I act or fail to act, and the outcome is the same either way, I'm free to choose whichever course. Neither is worse than the other, so I can do whatever I feel like (which is save my loved one, instinctively). Essentially, I conclude that the murder is neither right nor wrong and is therefore permissible.

The trouble with this approach is that value is a completely relative term. Value does not exist without someone to value it, so you would need to determine whose value you are considering. If it is your own, let us be honest; you value your loved one more than anyone else. If you did not, why would we make a distinction between those we love and those we do not?

However, committing the murder violates the categorical imperative (a concept that I think was formalized by Kant, if anyone's wondering and wants to check me on this, and I'm sure at least half of you are already familiar with this) that basically goes, "It is immoral to perform an act which you would not recommend everyone perform under similar circumstances." Similar in principle to the Golden Rule, but subtly different in that it elevates it from "How do you want to be treated?" to "How do you want society to function?". Even if it's essentially certain that nobody will ever learn of this particular act, performing the murder is tantamount to accepting a society where such murders are standard procedure, as long as nobody can prove they happened. In retrospect, a theoretical utilitarian wouldn't need to accept this, but in practical terms you are never guaranteed secrecy and must therefore accept at least an approximation of it, since sooner or later at least one murder by at least one utilitarian following this logic would be discovered and the course of action thereby lent some support.

Will not those murders still happen whether you commit the murder yourself or not? If your act does not influence what other people do, I see no reason to consider actions that happen to be similar to your own as consequences of your actions.

SalmonGod

  • Bay Watcher
  • Nyarrr
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #185 on: June 29, 2011, 09:21:05 pm »

Quote
Well, since a correctly long-sighted utilitarian view includes that the murder sets a precedent
Only if you get caught, and the premise establishes that you're considering it precisely because that is unlikely to happen.

Even if it's essentially certain that nobody will ever learn of this particular act, performing the murder is tantamount to accepting a society where such murders are standard procedure, as long as nobody can prove they happened. In retrospect, a theoretical utilitarian wouldn't need to accept this, but in practical terms you are never guaranteed secrecy and must therefore accept at least an approximation of it, since sooner or later at least one murder by at least one utilitarian following this logic would be discovered and the course of action thereby lent some support.

It's not really about getting caught, or creating an example that you believe other people will follow. 

It is the knowledge that it is likely other people will encounter similar circumstances, and you cannot realistically expect them to make a better decision than you do.  In this case, if you cannot resist making the choice that benefits you at the cost of another, then it is reasonable to expect that others will behave the same.  Whether or not others are aware of your actions is irrelevant, because you made that decision on your own and it is therefore likely that others faced with the same decision will do the same.  However, if you handle the decision fairly and responsibly then you have proven, even if only to yourself, that people are capable of doing so and it is reasonable to expect them to do so.

You are the human being that you are most intimately familiar with, and therefore the best indicator of what you can expect from other human beings.  If you cannot expect good behavior from yourself, then it's unreasonable to expect good behavior from others.  Sure, all human beings are unique, with varying motivations, thought processes, and limitations... but other people's differences from yourself are unknown and can hardly inform societal expectations.  Your own standards of behavior are known and proof of at least the minimal potential expectation one can have of society, and it is therefore in a person's best interest to set their standards as high as possible.
In the land of twilight, under the moon
We dance for the idiots
As the end will come so soon
In the land of twilight

Maybe people should love for the sake of loving, and not with all of these optimization conditions.

Bauglir

  • Bay Watcher
  • Let us make Good
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #186 on: June 29, 2011, 09:28:46 pm »

No problem about the excess verbosity, it's more helpful for formalizing what I intend to get across.

I may value my loved one more than anyone else, but if I'm going with the utilitarian approach, then I am necessarily trying to be as objective as possible. I have to consider each person's net value to everyone to whom that person has any value. Obviously, I don't have enough information to make that decision with regard to the other person, but I do have reason to assume (unless I know otherwise, but I'm not sure how I can) that both the other person and my own loved one have average value to humanity as a whole (admittedly, that reason is that it seems probable that such value follows a bell curve, and I don't think I can explain why in a concrete way that's actually worth the length of appending to this paragraph).

Now, to be honest, regarding the second paragraph, I'm in an interesting situation. I endorse the categorical imperative for practical reasons, but I apply it even in theoretical situations because that's part of the idea. It doesn't matter whether the particular act you're talking about will ever be discovered, because you would have to endorse every other person in the world taking the same action, if they would also never be discovered. It doesn't matter whether or not I do it, but it does matter if I have a reasoning process that leads to that conclusion; I want other people to adopt a reasoning process that leads to not-murdering, and part of that involves adopting such a reasoning process myself and applying it to all areas of my life, especially public ones, but even the areas that aren't, because part of that process is valuing consistency.

That said, I'm willing to admit that circumstances matter a lot more than these generalities that I'm talking about, at least as far as how I govern my own morality. But since we're explicitly supposed to ignore specifics, these are the default principles that need to be modified as the situation demands.

EDIT: What SalmonGod said is also good.

Fenrir

  • Bay Watcher
  • The Monstrous Wolf
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #187 on: June 29, 2011, 09:33:58 pm »

It's not really about getting caught, or creating an example that you believe other people will follow. 

It is the knowledge that it is likely other people will encounter similar circumstances, and you cannot realistically expect them to make a better decision than you do.  In this case, if you cannot resist making the choice that benefits you at the cost of another, then it is reasonable to expect that others will behave the same.  Whether or not others are aware of your actions is irrelevant, because you made that decision on your own and it is therefore likely that others faced with the same decision will do the same.  However, if you handle the decision fairly and responsibly then you have proven, even if only to yourself, that people are capable of doing so and it is reasonable to expect them to do so.

You are the human being that you are most intimately familiar with, and therefore the best indicator of what you can expect from other human beings.  If you cannot expect good behavior from yourself, then it's unreasonable to expect good behavior from others.  Sure, all human beings are unique, with varying motivations, thought processes, and limitations... but other people's differences from yourself are unknown and can hardly inform societal expectations.  Your own standards of behavior are known and proof of at least the minimal potential expectation one can have of society, and it is therefore in a person's best interest to set their standards as high as possible.

That is like saying “the needle on that dial is your best indicator of how much fuel you have, so make sure you don't let that needle move. Jam a piece of wood in there; that should work.”

Changing the indicator in the expectation that you will change the quantities or qualities indicated, is--

Well, it is absurd.

I would also argue that my actions are hardly any indicator of what the rest of society would do under a given circumstance.

SalmonGod

  • Bay Watcher
  • Nyarrr
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #188 on: June 29, 2011, 09:40:37 pm »

I would also argue that my actions are hardly any indicator of what the rest of society would do under a given circumstance.

No, but it is an indicator of what the rest of society is capable of.  You are a human being.  Society is made of human beings.  If you are capable of something, then the rest of society at least has the potential to be capable of it.  If you have the capability to uphold a really high standard of responsible and compassionate behavior, then the rest of society has the potential to uphold that same standard.  If you don't have that capability, then the rest of society may still have that potential, but you have absolutely no reason to believe that it does and are certainly not participating in the emergence of such.
« Last Edit: June 29, 2011, 09:46:59 pm by SalmonGod »

Fenrir

  • Bay Watcher
  • The Monstrous Wolf
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #189 on: June 29, 2011, 09:49:12 pm »

You ignored the first part of my post; that was where my most important point was.

No, but it is an indicator of what the rest of society is capable of.  You are a human being.  Society is made of human beings.  If you are capable of something, then the rest of society at least has the potential to be capable of it.  If you have the capability to uphold a really high standard of responsible and compassionate behavior, then the rest of society has the potential to uphold that same standard.  If you don't have that capability, then the rest of society may still have that potential, but you have absolutely no reason to believe that it does and are certainly not participating in the emergence of such.

All right, so we know that we all have the capacity. No matter what I do, that does not change my capacity, and it will not change the capacity of anyone else. How does changing my behaviour in secret compel other people to behave in a similar fashion?

EDIT: I would like to apologize for calling your thinking absurd. Debating politely is difficult enough without bluntness.
« Last Edit: June 29, 2011, 10:04:49 pm by Fenrir »

Grek

  • Bay Watcher
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #190 on: June 29, 2011, 10:18:09 pm »

I'd like to go out on a limb here and point out that deontological and consequentialist ethics are mutually incompatible and that you should only pick one. Trying to use both gives you contradictory results and forces you to either pick one and use that, or find some third metric to compare the two and have whatever that is be your actual moral system.

@Fenrir:
In any moral system where action A and the equivalent inaction ~A are equally justifiable, we have no net knowledge about the two. Having more knowledge about morality is better than having less, as having more knowledge will let you make choices that are more moral. And that's a good thing.

If there exists any truth to be had about morality, then moral relativism fails by virtue of the fact that you would have a great deal more knowledge of morality than is had from moral relativism simply by choosing ethical injunctions out of a hat and holding onto them with great conviction, since you've got a 50/50 shot of picking the correct choice between any given set of A and ~A. And you can do even better than that by thinking about ethics and choosing an internally consistent model of morality that has given better-than-random results in the past. And if no truth about morality exists anywhere in the world, you can embrace nihilism and still do better than moral relativism (one better, in fact), since you have one fact's worth of moral knowledge: that no other moral knowledge exists. No matter what objective moral system you choose, and no matter what the actual ought of the world is, you cannot do worse than the moral null that results from accepting all ought statements as equally valid.

Of course, most people don't take moral relativism seriously enough to actually do worse than nihilism or a randomly chosen set of moral injunctions, at least not overall. They only do worse in those specific domains of morality to which they apply moral relativism. And usually, this is because they are aware (or were taught by someone that was aware) on some level that whatever position they are applying the doctrine of moral relativism to is objectionable to the point where it cannot be defended any other way.

The only thing that moral relativism is actually good for is a cognitive stopsign that says "I no longer wish to discuss ethics with you."

SalmonGod

  • Bay Watcher
  • Nyarrr
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #191 on: June 29, 2011, 10:25:34 pm »

You ignored the first part of my post; that was where my most important point was.

Because I don't understand how that analogy is applicable.


All right, so we know that we all have the capacity. No matter what I do, that does not change my capacity, and it will not change the capacity of anyone else. How does changing my behaviour in secret compell other people to behave in a similar fashion?

What you do doesn't change your capacity, but it is a measure.  If your thought process is literally "I know I could behave better, but I don't care because it doesn't change anything" then you have in fact proven that your capacity for good behavior is proportional to your immediate perception of return on that investment.  Apparently, if it doesn't directly make the world a better place, then personal gain is favorable.

Unless you believe that somehow you're incredibly special and very few people have a similar thought process to yours, then it is safe to assume that a considerable portion of the population will operate in the same fashion as you.  If those people behave selfishly any time they're not convinced that their decisions will have broad implications for society, then that's a lot of selfish behavior, the sum of which DOES have broad implications for society.

Or if you don't believe in the above, then I see two directions you can go.  Perhaps you believe that if you hold yourself to a higher standard, that you'll just be better than everyone else, giving up on personal gain while everyone else continues to behave selfishly and it will all be pointless in the end.  What a pretentious view.  Or you could go the other way and assume that enough people hold better standards of behavior than you that your isolated acts of selfishness won't matter.  A world made of good people can support a few bad ones, right?  This one doesn't look so good, either.

The following is my personal perspective, and how I try to live my life. 

Honestly, I am appalled and uplifted by various instances of human behavior every single day.  I don't like the current state of the world, as I think those who behave selfishly end up in positions of greater influence.  However, I desperately cling to a belief that society can reach a higher standard.  I could not go on living in this world unless I did.  I see society as the sum of all human behavior.  I cannot maintain a belief that that sum can come out positive unless I can succeed in maintaining a personal standard at the level I wish to see.  Even more importantly, a state of society defined by a standard of responsible, mature, and compassionate behavior X literally cannot exist until a large enough portion of the population actually maintains X as a personal standard.  The first step to achieving this is to achieve it myself, and the more people realize this, the more capable we are of moving forward.

The best example I know is the earlier one I mentioned regarding pacifism.  It is literally impossible for a world where people do not use violence to exist if I make use of violence.  People say that such idealism is impractical, because it relies on an ideal situation to operate.  They're completely missing the point.  The point is to align oneself with that ideal situation to allow it the potential to exist.

Vector

  • Bay Watcher
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #192 on: June 29, 2011, 11:42:16 pm »


I have my reasons for making my choice, which I realize may not be everyone's reasons for making their choices.  At this point, I cannot even pretend to know better than they do about their own circumstances.

I don't really know why people keep asking me about this.  I never pretended to be a great moralist, or something.  I simply know that under the current definitions of "loved ones" and my current feelings about killing people--particularly ones who have been mean to me--I wouldn't be able to do it.

I don't know what it's like to love someone like one does a husband, or a lover, or a child.  So maybe, in the future, I'd change my mind.  But right now, the decision is very simple.

To ask more of someone who is only twenty-one years old and has hardly lived seems foolhardy.
"The question of the usefulness of poetry arises only in periods of its decline, while in periods of its flowering, no one doubts its total uselessness." - Boris Pasternak

nonbinary/genderfluid/genderqueer renegade mathematician and mafia subforum limpet. please avoid quoting me.

pronouns: prefer neutral ones, others are fine. height: 5'3".

Virex

  • Bay Watcher
  • Subjects interest attracted. Annalyses pending...
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #193 on: June 30, 2011, 07:12:31 am »

I may value my loved one more than anyone else, but if I'm going with the utilitarian approach, then I am necessarily trying to be objective as possible. I have to consider each person's net value to everyone to whom that person has any value. Obviously, I don't have enough information to make that decision with regards the other person, but I do have reason to assume (unless I know otherwise, but I'm not sure how I can) that both the other person and my own loved one have average value to humanity as a whole (admittedly, that reason is because it seems probable that such value is a bell curve, and I don't think I can explain why in a concrete way that's actually worth the length of appending to this paragraph).
I don't know about you, but I would not want to live in a world where people see me as nothing more than a variable in an ethical optimization problem, so I doubt I could, assuming universality of morals holds, make a decision based upon a cold calculation myself.

Grek

  • Bay Watcher
    • View Profile
Re: Ethical Dilemmas: How Far Would You Go To Save Someone You Love?
« Reply #194 on: June 30, 2011, 10:56:50 am »

I don't know about you, but I would not want to live in a world where people see me as nothing more than a variable in an ethical optimization problem, so I doubt I could, assuming universality of morals holds, make a decision based upon a cold calculation myself.

Consider the following situation:
There has been a terrible fire, and a thousand people, your loved ones among them, are trapped inside a burning building. You are miles away and the rescue team has you on the phone, explaining what happened. They say there are two possible ways they could try to save the people in the burning building: one where they can save 850 people with certainty before the building collapses, and one where they can save all 1000 people, but risk a 10% chance of the building collapsing immediately, before anyone at all is saved. They need your permission to go ahead with the second option, which, in their expert opinion, is the best one. Which do you choose?

Many people will protest that it is wrong to gamble with people's lives like that, or that it is too 'cold' to pick by the numbers, but the first solution means saving an average of 850 people and the second means saving an average of 900. If you really really care about them, you'd pick the choice that gives them an extra 50 chances to live, even if it hurts to pick, even if you can't deal with the crushing responsibility. The lives of your loved ones are worth more than a little emotional anguish.
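The averages above are just expected values, and can be sanity-checked in a few lines (a minimal sketch; the 850/1000/10% figures come from the scenario, and `expected_saved` is only an illustrative helper name):

```python
# Expected number of people saved under each rescue plan,
# computed as a probability-weighted average of outcomes.

def expected_saved(outcomes):
    """outcomes: list of (probability, people_saved) pairs summing to probability 1."""
    return sum(p * saved for p, saved in outcomes)

# Plan A: 850 people saved with certainty.
plan_a = expected_saved([(1.0, 850)])

# Plan B: all 1000 saved with 90% probability, nobody saved with 10% probability.
plan_b = expected_saved([(0.9, 1000), (0.1, 0)])

print(plan_a)  # 850.0
print(plan_b)  # 900.0
```

So by the numbers, the riskier plan saves 50 more people on average, which is exactly the gap the argument turns on.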