Bay 12 Games Forum


Poll

Would you ever consent to give a free-thinking AI civil rights (or an equivalent)?

Of course, all sentient beings deserve this.
Sure, so long as they do not slight me.
I'm rather undecided.
No, robots are machines.
Some people already enjoy too many rights as it is.
A limited set of rights should be granted.
Another option leaning towards AI rights.
Another option leaning against AI rights.

Pages: 1 [2] 3 4 ... 12

Author Topic: Would AI qualify for civil rights?  (Read 14211 times)

pisskop

  • Bay Watcher
  • Too old and stubborn to get a new avatar
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #15 on: September 07, 2012, 10:58:29 am »

Poll put up
Logged
Pisskop's Reblancing Mod - A C:DDA Mod to make life a little (lot) more brutal!
drealmerz7 - pk was supreme pick for traitor too I think, and because of how it all is and pk is he is just feeding into the trollfucking so well.
PKs DF Mod!

Levi

  • Bay Watcher
  • Is a fish.
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #16 on: September 07, 2012, 11:02:06 am »

Personally I think we should avoid making an AI that would want rights, but if one accidentally appears we should probably give it rights. 

But seriously guys, make your AI enjoy servitude.  It's better for everyone that way.
Logged
Avid Gamer | Goldfish Enthusiast | Canadian | Professional Layabout

Duke 2.0

  • Bay Watcher
  • [CONQUISTADOR:BIRD]
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #17 on: September 07, 2012, 11:07:17 am »

 It all depends on how 'real' it is. It's trivial to toy with human emotions and get people proclaiming full rights for a box of prerecorded voice clips; it's another thing entirely to determine whether a system has the complexity and capability for thought/emotion/sentience/etc. I would always be cautious about this, simply because a lot of people would push for it even if the AI in question isn't really sentient. And because I wouldn't let my daughter marry some tinbot, no sir.
Logged
Buck up friendo, we're all on the level here.
I would bet money Andrew has edited things retroactively, except I can't prove anything because it was edited retroactively.
MIERDO MILLAS DE VIBORAS FURIOSAS PARA ESTRANGULARTE MUERTO

Telgin

  • Bay Watcher
  • Professional Programmer
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #18 on: September 07, 2012, 11:23:42 am »

Zangi and Levi make good points here actually.  One of the fundamental questions about AI is of course the design.  If we can build an AI to simulate human thought, it should be possible to build one that loves to serve and has no self-preservation instinct or fear of destruction or death.  In that case... it's hard to argue that you're treating it unfairly by making it do slave labor.  In fact, denying it that labor would be a form of mental torture for it.

Now that is going to be something to think about at some point.  How are we going to verify that AIs aren't designed to do immoral things?  Even if there were an "open source" AI project (and I'm sure there will be one day), it would be so immensely complex that any such proof would be difficult to find.

And is it immoral to make a robot like slavery?  Subscribing to that mindset is awfully human-centered.  If it really does like working day in and day out, or enjoys risking its own existence on our behalf, then any rights given to it are going to be flimsy at best and pointless at worst.

The more I think about it, the more I realize that while having human level thought in a computer system would be fascinatingly awesome, it's something that is going to be so fraught with trouble on so many fronts that I'm not sure how we'll cope.

Oh, and I wanted to vote "Of course" on the poll, but then I started thinking about all of this and I don't have a clear opinion anymore.  I'm still of the mindset that any sapient computer system that thinks like a human should have human-like rights, but the poll necessarily can't cover all of the edge cases here.  Even basic sentience-based rights might not apply...
Logged
Through pain, I find wisdom.

palsch

  • Bay Watcher
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #19 on: September 07, 2012, 11:25:16 am »

I think there are two main aspects to this that aren't really being addressed fully.

1) What are our criteria for granting civil rights?

Right now this is tied up with questions of personhood and is very poorly defined. Short of appeals to religion and souls or definitively speaking only about humans and holding us as inherently special, it is very hard to draw a clean bright line that defines who is a person and who isn't.

It's entirely possible that AIs can be defined as non-persons by default. But if we try to construct a more nuanced version of personhood, one that incorporates non-human entities showing the traits for which we grant rights to people, then it becomes an open question.

As an example, at what point do we decide to grant a non-human animal the right to life? Is there a clear difference between such an animal and an AI like that in the story Epoch (chapter 13 of that book) by Cory Doctorow, where the AI is tied to a particular structure of a particular computer, with a well defined physical existence that can be ended by shutting it down? I personally have a hard time separating those questions. If there was an AI and an animal I could interact with in similar ways I would have to offer them the same status.

2) What are we referring to as an AI?

The rule I've heard from AI researchers is that AI is anything we can't do yet. AI problems do get solved, but once they are, they're rarely considered actual AI problems any more. No-one would consider voice recognition core to an AI any more, mostly because we are used to it as part of systems we don't exactly consider 'intelligent'.

So is there a set of aspects that we would require to call a system AI? A few possibilities:

- Passes a Turing test. But under what circumstances?
- Has an actual sense and identity of self. Except how can you tell (reduces to the Turing test)? And is this even desirable (see Charles Stross's 'prosthetic conscience' that projects its identity onto the sociopath it is guiding)?
- Consciousness. But again, how can you tell (another Turing reduction)? And is there even a clear difference between extreme calculation and conscious thought?

Logged

Duke 2.0

  • Bay Watcher
  • [CONQUISTADOR:BIRD]
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #20 on: September 07, 2012, 11:30:42 am »

 This also sorta turns into a weird avenue: should we grant them civil rights? What is the drive to treat them fairly? Their enslavement is not tied, as human slavery is, to a slippery slope where anybody could end up at risk of enslavement, and there are no real emotional ties to their creation or death. They are still cloneable programs we can always make more of. There is no unique spark lost if one dies.

 It's a mighty scary way of looking at things, which is why I am thankful to morality and spirituality for keeping things sorta in check.
Logged
Buck up friendo, we're all on the level here.
I would bet money Andrew has edited things retroactively, except I can't prove anything because it was edited retroactively.
MIERDO MILLAS DE VIBORAS FURIOSAS PARA ESTRANGULARTE MUERTO

Telgin

  • Bay Watcher
  • Professional Programmer
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #21 on: September 07, 2012, 11:34:01 am »

Quote from: palsch
- Has an actual sense and identity of self. Except how can you tell (reduces to the Turing test)? And is this even desirable (see Charles Stross's 'prosthetic conscience' that projects its identity onto the sociopath it is guiding)?
- Consciousness. But again, how can you tell (another Turing reduction)? And is there even a clear difference between extreme calculation and conscious thought?

These are sort of the fundamental components of "Hard" AI, and are probably what we're aiming at.  They're also the things that interest me most, and what I'd considered starting a separate topic on (which might be useful anyway, to help avoid derailing this thread with discussions of whether this is even possible and what it means).

You're right about the way AI research is treated now, which I find kind of strange even as someone in the computer science research community.  AI used to cover stuff like graph searches too, which we now understand well enough that they're just basic algorithms.  I somehow doubt we'll ever understand consciousness at such a simple level that it becomes a lecture in an algorithms course, so the AI field will probably end up steering toward the creation, understanding and manipulation of "Hard" AI once we begin to understand it more.
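
Just to illustrate how mundane that stuff has become, here's the kind of graph search an old AI textbook would have devoted a chapter to, as a toy Python sketch (just breadth-first search over an adjacency list; nothing special):

Code:
from collections import deque

def bfs_path(graph, start, goal):
    """Breadth-first search: returns a shortest path (fewest edges) or None."""
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append(path + [neighbor])
    return None

# Tiny example graph as adjacency lists.
graph = {'a': ['b', 'c'], 'b': ['d'], 'c': ['d'], 'd': []}
print(bfs_path(graph, 'a', 'd'))  # -> ['a', 'b', 'd']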

Quote from: Duke 2.0
They are still cloneable programs we can always make more of. There is no unique spark lost if one dies.

This is something else I've seen said a fair bit and used to believe myself.  I don't anymore though.  An AI individual is not necessarily any more easily duplicated than a human mind.  After all, if I make a copy of you, you're still you, aren't you?  Who then is the copy?  Someone else.  There are now two separate individuals, and destroying the first still very much matters.

An AI with human level thought would be no different.
Logged
Through pain, I find wisdom.

Duke 2.0

  • Bay Watcher
  • [CONQUISTADOR:BIRD]
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #22 on: September 07, 2012, 11:43:48 am »

 We don't have any conceivable way of copying a human mind. Computers, and any intelligence tied to them, will have copyability innately built in due to their medium. Theoretical comparisons between copying humans and copying AIs get weird when one can be copied and the other cannot, leading to some strange moral situations.
Logged
Buck up friendo, we're all on the level here.
I would bet money Andrew has edited things retroactively, except I can't prove anything because it was edited retroactively.
MIERDO MILLAS DE VIBORAS FURIOSAS PARA ESTRANGULARTE MUERTO

palsch

  • Bay Watcher
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #23 on: September 07, 2012, 11:46:38 am »

Quote from: Telgin
There are now two separate individuals, and destroying the first still very much matters.

An AI with human level thought would be no different.
Which makes the genetic algorithm approach to breeding an AI potential genocide, unless you keep every single borderline instance running indefinitely instead of discarding them.


As for 'hard' AI involving consciousness and a self identity: the first of these is poorly understood in humans and is very hard to talk about while its nature is unclear. Is consciousness actually just a function of raw processing power, is it a result of a particular processing structure, or is it something completely different that can't really be described in computational terms? I'm honestly not sure, and I don't have that much trust in popular depictions of neurological questions.

Self identity, as I said, probably isn't desirable, at least not in human-type AIs. If I wanted a helper AI I'd much rather assign it to identify 'itself' with me instead. But is that ethical?

Thing is, there are humans who have their sense of self 'misplaced', either losing it or assigning it to an outside body which they treat as themselves. Would those humans lose their rights in such a case? If not, why could an AI be denied rights and effectively be created as an inherently servile being simply because it doesn't have that same sense of self? Would we have to also ensure it doesn't have a consciousness?
Logged

Telgin

  • Bay Watcher
  • Professional Programmer
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #24 on: September 07, 2012, 12:01:53 pm »

Quote from: Duke 2.0
We don't have any conceivable way of copying a human mind. Computers, and any intelligence tied to them, will have copyability innately built in due to their medium. Theoretical comparisons between copying humans and copying AIs get weird when one can be copied and the other cannot, leading to some strange moral situations.

It's true that we have no current means of copying the information in a human brain, but that's not quite what I was getting at.  The point I was making is that it's not possible to copy either one in a way that makes it okay to destroy the original.

Again I liken it to a human.  If someone is duplicated, there are now two instances of them, but they are separate.  At that point they go their separate ways and are individuals.  If you destroy either, you have destroyed a person, which is not okay.

Now, let's consider the popular sci-fi concept of doing this as a life extension method.  You're about to die, so they copy your brain into a computer, grow you a new body, then copy your brain's information into the new, younger and healthier brain in that body.  A copy of your consciousness lives on in the new body, but it is not the same consciousness: you are still dead.

It's no different with computers.  Having all the copies in the world doesn't change the fact that if an AI is an individual, destroying any copy is destroying a separate individual.

Quote from: palsch
There are now two separate individuals, and destroying the first still very much matters.

An AI with human level thought would be no different.
Which makes the genetic algorithm approach to breeding an AI potential genocide, unless you keep every single borderline instance running indefinitely instead of discarding them.

Yes, it does actually.  I suppose a better solution is to not destroy the unfit entities but refit them, but it's all highly theoretical anyway.  :)
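
To make that discard-versus-refit difference concrete, here's what a toy genetic algorithm loop might look like, as a minimal Python sketch with a made-up bit-string fitness function (obviously nothing like what breeding an actual AI would involve):

Code:
import random

def fitness(candidate):
    # Toy objective: maximize the number of 1s in a bit string.
    return sum(candidate)

def mutate(candidate, rate=0.05):
    # Flip each bit with a small probability.
    return [bit ^ 1 if random.random() < rate else bit for bit in candidate]

def evolve(pop_size=20, length=32, generations=50, discard=True):
    population = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        survivors = ranked[:pop_size // 2]
        if discard:
            # Classic GA: the bottom half is simply thrown away and replaced
            # with mutated copies of the fitter half.
            offspring = [mutate(random.choice(survivors)) for _ in range(pop_size - len(survivors))]
        else:
            # "Refit" variant: keep the unfit candidates and mutate them
            # instead of deleting them.
            offspring = [mutate(c) for c in ranked[pop_size // 2:]]
        population = survivors + offspring
    return max(population, key=fitness)

print(fitness(evolve()))               # discard-based selection
print(fitness(evolve(discard=False)))  # refit-based selection

The selection step is where the whole moral question would live: in the discard branch, every candidate that doesn't make the cut simply stops existing.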

Quote from: palsch
As for 'hard' AI involving consciousness and a self identity: the first of these is poorly understood in humans and is very hard to talk about while its nature is unclear. Is consciousness actually just a function of raw processing power, is it a result of a particular processing structure, or is it something completely different that can't really be described in computational terms? I'm honestly not sure, and I don't have that much trust in popular depictions of neurological questions.

And this is the heart of what I've been fascinated with over the past few weeks, so I think I may still create a thread about it to get some more focused discussion on it.
« Last Edit: September 07, 2012, 12:24:56 pm by Telgin »
Logged
Through pain, I find wisdom.

Graknorke

  • Bay Watcher
  • A bomb's a bad choice for close-range combat.
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #25 on: September 07, 2012, 12:29:34 pm »

Quote from: Duke 2.0
there are no real emotional ties to their creation or death. They are still cloneable programs we can always make more of. There is no unique spark lost if one dies.

Actually, did you read that link Japa posted? Basically, soldiers can get pretty upset when the mine detonation robots get badly beaten up. There might be no emotional tie to the AI itself, but there certainly will be for people who worked on it.
Logged
Cultural status:
Depleted          ☐
Enriched          ☑

MorleyDev

  • Bay Watcher
  • "It is not enough for it to just work."
    • View Profile
    • MorleyDev
Re: Would AI qualify for civil rights?
« Reply #26 on: September 07, 2012, 12:38:15 pm »

Given the nature of an AL (Artificial Life), it's arguable that human concepts of individuality and emotion simply wouldn't apply (and to expect them to is arguably "sentientist").

I'd argue the drive to defend oneself from what one perceives as harm is the defining characteristic of "living", and "not going to kill us" is what makes it deserving of our protection. Both must be achieved as a non-primary by-product of its existence, so creating a program that just asks nicely to be treated with civil rights wouldn't count. So yeah, if it reaches the point where it can ask for rights, doesn't just try to kill us and take them by force, and wasn't created specifically to seek those rights, then it should get those rights.
« Last Edit: September 07, 2012, 12:40:36 pm by MorleyDev »
Logged

Zangi

  • Bay Watcher
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #27 on: September 07, 2012, 12:45:03 pm »

Quote from: Graknorke
Quote from: Duke 2.0
there are no real emotional ties to their creation or death. They are still cloneable programs we can always make more of. There is no unique spark lost if one dies.

Actually, did you read that link Japa posted? Basically, soldiers can get pretty upset when the mine detonation robots get badly beaten up. There might be no emotional tie to the AI itself, but there certainly will be for people who worked on it.
Isn't that more in line with a pet that gets bashed in its duty to protect ya? 
EDIT: I derped somewhere... I think we agree.
« Last Edit: September 07, 2012, 12:48:54 pm by Zangi »
Logged
All life begins with Nu and ends with Nu...  This is the truth! This is my belief! ... At least for now...
FMA/FMA:B Recommendation

Adequate Swimmer

  • Bay Watcher
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #28 on: September 07, 2012, 12:59:59 pm »

Let's just wait for the robots to wake up and ask them :D
Logged
[VALUE:PEACE:0]

Grek

  • Bay Watcher
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #29 on: September 07, 2012, 04:24:46 pm »

No AI deserving of civil rights should ever be created.
Logged