Bay 12 Games Forum


Poll

Would you ever consent to give a free-thinking AI civil rights (or an equivalent)?

Of course, all sentient beings deserve this.
Sure, so long as they do not slight me.
I'm rather undecided.
No, robots are machines.
Some people already enjoy too many rights as it is.
A limited set of rights should be granted.
Another option leaning towards AI rights.
Another option leaning against AI rights.

Pages: 1 2 3 [4] 5 6 ... 12

Author Topic: Would AI qualify for civil rights?  (Read 14302 times)

Graknorke

  • Bay Watcher
  • A bomb's a bad choice for close-range combat.
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #45 on: September 07, 2012, 05:14:49 pm »

I'm curious as to why you think this.
To avoid cruelty to the AI? Because it isn't possible? Because it isn't useful?

That's a bit of a short post.

Mostly 3. It's reckless and dangerous to make an artificial person that isn't like the other human persons around it. It would be alien and probably a bit creepy, without serving any useful purpose that couldn't be served just as easily by creating a lesser AI or recruiting a human person. And that's before even considering the difficulties inherent in trying to program general intelligence and a human-acceptable morality.
Aah, but you forget the mantra of human discovery!
It's because we can.
Or at least, we want to know if we can. It isn't particularly dangerous apart from the mental condition of the AI either. It's not like an intelligence whose only communication occurs through a text terminal is going to do something horrible and disastrous. At worst it'd throw a hissy fit.
Logged
Cultural status:
Depleted          ☐
Enriched          ☑

kaijyuu

  • Bay Watcher
  • Hrm...
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #46 on: September 07, 2012, 05:17:45 pm »

Besides, if we do make a superior AI that takes over, it's basically just evolution in progress anyway. Only we jumped from passing along genetics via DNA to passing it along via our conventional data storage.


Consider the robot overlords to be humanity's children.
Logged
Quote from: Chesterton
For, in order that men should resist injustice, something more is necessary than that they should think injustice unpleasant. They must think injustice absurd; above all, they must think it startling. They must retain the violence of a virgin astonishment. When the pessimist looks at any infamy, it is to him, after all, only a repetition of the infamy of existence. But the optimist sees injustice as something discordant and unexpected, and it stings him into action.

Telgin

  • Bay Watcher
  • Professional Programmer
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #47 on: September 07, 2012, 05:22:26 pm »

Kind of late to the discussion about this particular topic, but I agree with kaijyuu (and the similarly minded).  The only way to introduce true randomness into a human mind is to invoke magic (i.e. souls).  Quantum mechanical effects have absolutely no influence at the scale at which brain computations occur.  Any output from your brain is the effect of running an unimaginably complicated function on the input.  A true human-like AI would be no different.

Of course, I have to use vague terms like "human-like" and "true" there.  It's considerably easier to create a facade of consciousness, and if you understand the inner workings you could probably come up with a plausible argument that such a system is not conscious and thus deserves no rights (it can't suffer if it does not 'experience' anything).

I don't really buy the argument of determinism being the important distinction though.

On the subject of consciousness and qualia, I find it incredibly frustrating that we can't explain how they work.  If we could, it would be relatively simple to build a human-like computer mind.  It's fascinating and annoying to think about.  It must be nothing but incredibly complicated programs running that generate it, but how that works is totally beyond us.

At times I've wondered just how close we could get to a conscious system by starting from simple, automatic responses and adding depth.  Dwarf Fortress is a lifetime project for Toady.  Could someone work true miracles in this field by devoting a lifetime to a similar pursuit?  It would probably be a waste of a life, but just imagine the possibilities.  To my knowledge, most researchers are content to build theories about how to make such a thing rather than actually attempt it.  There are good reasons for that, but I imagine we might be surprised by what is possible.
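A minimal toy sketch of that "simple automatic responses, then added depth" idea, in Python. Everything here (the Layer class, the rule tables) is invented purely for illustration and is nothing like a real cognitive architecture:

Code:
# Toy sketch: stack "deeper" response layers on top of simple reflexes.
# Purely illustrative; not a model of any real cognitive system.

class Layer:
    """Maps a stimulus to a response, deferring to a more primitive layer if it has no rule."""
    def __init__(self, rules, fallback=None):
        self.rules = rules        # dict: stimulus -> response
        self.fallback = fallback  # the layer beneath this one, if any

    def respond(self, stimulus):
        if stimulus in self.rules:
            return self.rules[stimulus]
        if self.fallback is not None:
            return self.fallback.respond(stimulus)
        return "(no reaction)"

# Reflexes first, then a slightly "deeper" layer stacked on top of them.
reflexes = Layer({"poke": "flinch", "bright light": "blink"})
habits = Layer({"greeting": "wave back", "poke": "poke back"}, fallback=reflexes)

for stimulus in ["poke", "bright light", "greeting", "philosophy"]:
    print(stimulus, "->", habits.respond(stimulus))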
Logged
Through pain, I find wisdom.

Jervill

  • Bay Watcher
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #48 on: September 07, 2012, 05:22:56 pm »

I just hope they view themselves as that, because if not, we may end up well and truly boned.

Wouldn't we be in danger of the robots putting us in a retirement home (Florida) once they think we're too old, though?
Logged

Graknorke

  • Bay Watcher
  • A bomb's a bad choice for close-range combat.
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #49 on: September 07, 2012, 05:23:50 pm »

Why would robots take over? Are we really so stupid as to give them deadly firearms without knowing how they feel about us? Would they even want to? I mean, humans are an integral part of a lot of the production processes that go into making machine parts. If they killed us all, they would eventually decay and break.

That's if they even got into that position anyway. Or will we only notice that the autonomous weapon-machine was a bad idea in hindsight? Or, in the words of the great Professor Frink:
Quote
Oh, the laser eyes! Why did I give them to him?
Logged
Cultural status:
Depleted          ☐
Enriched          ☑

Telgin

  • Bay Watcher
  • Professional Programmer
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #50 on: September 07, 2012, 05:29:20 pm »

Quote from: Graknorke
Why would robots take over? Are we really so stupid as to give them deadly firearms without knowing how they feel about us? Would they even want to? I mean, humans are an integral part of a lot of the production processes that go into making machine parts. If they killed us all, they would eventually decay and break.

Of course, if you consider the possibility of robots like those in The Terminator, then they're quite capable of making new parts and fixing themselves.
Logged
Through pain, I find wisdom.

pisskop

  • Bay Watcher
  • Too old and stubborn to get a new avatar
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #51 on: September 07, 2012, 05:31:08 pm »

Why wouldn't robots be?  Machines make machines now.
Logged
Pisskop's Reblancing Mod - A C:DDA Mod to make life a little (lot) more brutal!
drealmerz7 - pk was supreme pick for traitor too I think, and because of how it all is and pk is he is just feeding into the trollfucking so well.
PKs DF Mod!

Graknorke

  • Bay Watcher
  • A bomb's a bad choice for close-range combat.
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #52 on: September 07, 2012, 05:38:52 pm »

Quote from: Graknorke
Why would robots take over? Are we really so stupid as to give them deadly firearms without knowing how they feel about us? Would they even want to? I mean, humans are an integral part of a lot of the production processes that go into making machine parts. If they killed us all, they would eventually decay and break.

Quote from: Telgin
Of course, if you consider the possibility of robots like those in The Terminator, then they're quite capable of making new parts and fixing themselves.
Designing a conscious entity using mechanical parts is one thing. It is definitely physically possible, and is a thought process more than anything. Processor limitations can be easily overcome by throwing more CPUs at it.

But designing and building a body that replicates or surpasses the human one using only mechanical parts has proven to be a nightmare. There are so many minute things that need to be constantly adjusted, and making something that is lightweight, strong, and fast is a challenge against the very materials we are able to use.
Muscles work because of various protein reactions that are replenished by the rest of the body as it synthesises them from other living matter, but robots would take in only electricity and maybe hydraulic fluids, and have no biological functions to speak of.

An example of how difficult this is: that one Japanese robot that falls down stairs.

Edit: Yes, robots are involved in the manufacture of machines, but they are nowhere to be seen in silicon mines, or driving transport trucks around. Or having overall control of power generation. There's always a failsafe to have a human override it if they feel the automated systems are not doing it right.
Stockpiled materials will only last so long. And even with no biological clock, maintenance will have to come at some point before they become inoperable.
« Last Edit: September 07, 2012, 05:41:35 pm by Graknorke »
Logged
Cultural status:
Depleted          ☐
Enriched          ☑

Telgin

  • Bay Watcher
  • Professional Programmer
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #53 on: September 07, 2012, 05:41:32 pm »

Consider what computers were like 50 years ago compared to now, and extrapolate.  Robotics (and materials science) follow suit.  It's not impossible to build a robot out of nanotech components (which is effectively what we are).  We just don't have the technology yet.

But forget even that; it doesn't matter.  If a robot can push the button to start the machine that builds new parts for the robot, then humans are not needed.
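For a rough sense of scale on that extrapolation: the usual doubling-every-two-years rule of thumb, compounded over 50 years, works out to roughly a 33-million-fold increase. A quick back-of-the-envelope check (the doubling period is an assumption, not a measured figure):

Code:
# Back-of-the-envelope: doubling every ~2 years, compounded over 50 years.
years = 50
doubling_period = 2                       # rough rule-of-thumb assumption
factor = 2 ** (years / doubling_period)
print(f"~{factor:,.0f}x over {years} years")   # ~33,554,432x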
Logged
Through pain, I find wisdom.

tootboot

  • Bay Watcher
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #54 on: September 07, 2012, 06:15:36 pm »

Will humans qualify for civil rights in an age of truly intelligent machines?  Maybe we shouldn't... it's quite easy to make a case that we'd be better off with machine rule, given the state of the world at any given time in history.
Logged

Zrk2

  • Bay Watcher
  • Emperor of the Damned
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #55 on: September 07, 2012, 06:20:59 pm »

Really this all comes down to a very interesting metaphysical question: is function identical to substance? Personally I don't think so, and thus feel that even if we can make a fully functional AI it still wouldn't be sapient, and would simply mimic sapience a la the Chinese Room thought experiment. Thus I think that AI can't qualify for any rights, except maybe the right not to be abused by people who have no idea how computers work.
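For anyone unfamiliar with it, the Chinese Room intuition is easy to sketch as code: a rulebook maps incoming symbols to outgoing symbols and produces sensible-looking replies without anything in the process that obviously understands them. The rulebook below is a made-up toy, not Searle's actual formulation:

Code:
# Toy "Chinese Room": pure symbol-shuffling by rulebook lookup.
# The rulebook is invented for illustration; the point is that the operator
# never needs to understand any of the symbols being matched.
RULEBOOK = {
    "你好": "你好，很高兴见到你。",      # "hello" -> "hello, nice to meet you"
    "你会思考吗？": "当然会。",          # "can you think?" -> "of course"
}

def operator_in_the_room(symbols):
    # Match the incoming squiggles against the rulebook and copy out the reply.
    return RULEBOOK.get(symbols, "对不起，我不明白。")  # "sorry, I don't understand"

print(operator_in_the_room("你好"))
print(operator_in_the_room("你会思考吗？"))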
Logged
He's just keeping up with the Cardassians.

Soadreqm

  • Bay Watcher
  • I'm okay with this. I'm okay with a lot of things.
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #56 on: September 07, 2012, 06:22:36 pm »

I think that if a machine is capable of demanding civil rights, the question of whether it's really sentient is largely irrelevant. The ethical ramifications of killing a supercomputer don't actually affect what people will do. The more personable the AI is, the easier it will be to get people to sympathize with it, but that's just appearing human, and doesn't necessarily have much to do with what really goes on in its mind. And anyway, humans are perfectly capable of killing other humans. If enough people think that the AI needs to die, it's going to die regardless.

Quote from: Graknorke
Edit: Yes, robots are involved in the manufacture of machines, but they are nowhere to be seen in silicon mines, or driving transport trucks around. Or having overall control of power generation. There's always a failsafe to have a human override it if they feel the automated systems are not doing it right.
Stockpiled materials will only last so long. And even with no biological clock, maintenance will have to come at some point before they become inoperable.

Except it wouldn't exactly be an automated system. One pretty important requirement for an intelligent machine would be decision making capabilities on par with humans. If they can't even manage that, they really have no hope of ever overthrowing anyone. An artificial intelligence incapable of creative problem solving isn't much of an intelligence.

Or I guess you could end up with an AI with no industrial capacity. I'm kind of assuming that the AI will be able to manufacture any arbitrary items, meaning equipment and infrastructure and crude robots for pretty much any task. You're quite right that a supercomputer that can't repair or maintain itself or replenish its stock of terminator bots is doomed to die as its resources dwindle.
Logged

Soadreqm

  • Bay Watcher
  • I'm okay with this. I'm okay with a lot of things.
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #57 on: September 07, 2012, 06:50:01 pm »

Quote from: Zrk2
Really this all comes down to a very interesting metaphysical question: is function identical to substance? Personally I don't think so, and thus feel that even if we can make a fully functional AI it still wouldn't be sapient, and would simply mimic sapience a la the Chinese Room thought experiment. Thus I think that AI can't qualify for any rights, except maybe the right not to be abused by people who have no idea how computers work.

I think the metaphysical question is actually whether there is substance at all. Perhaps souls are real things that exist, and humans just have some kind of special je ne sais quoi that computers lack, and thus an AI can never be truly alive. Perhaps humans are nothing more than machines built of meat, and the illusion shared by all people of being aware of themselves and their environment is just a natural process that can ultimately be understood and replicated.

It occurs to me, however, that if souls are real things that exist, then they too are a natural process. If the thing that makes people different from robots exists, it can be observed, and possibly even manipulated. It is just another interchangeable part in the machine of logic and emotion that makes a mind. If human consciousness is something special, would it be possible to move it into a machine? To destroy it without damaging the brain, creating a human with no sense of self? To weaponise it into a gun that shoots ghosts? The science fiction possibilities are endless. :P

That's right, I just turned your AI civil rights discussion to necromancy.
Logged

kaijyuu

  • Bay Watcher
  • Hrm...
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #58 on: September 07, 2012, 06:51:50 pm »

* kaijyuu applauds
Logged
Quote from: Chesterton
For, in order that men should resist injustice, something more is necessary than that they should think injustice unpleasant. They must think injustice absurd; above all, they must think it startling. They must retain the violence of a virgin astonishment. When the pessimist looks at any infamy, it is to him, after all, only a repetition of the infamy of existence. But the optimist sees injustice as something discordant and unexpected, and it stings him into action.

Telgin

  • Bay Watcher
  • Professional Programmer
    • View Profile
Re: Would AI qualify for civil rights?
« Reply #59 on: September 07, 2012, 06:57:28 pm »

Quote from: Zrk2
Really this all comes down to a very interesting metaphysical question: is function identical to substance? Personally I don't think so, and thus feel that even if we can make a fully functional AI it still wouldn't be sapient, and would simply mimic sapience a la the Chinese Room thought experiment. Thus I think that AI can't qualify for any rights, except maybe the right not to be abused by people who have no idea how computers work.

Is there any particular reason you believe it's not possible to have a sapient AI?  What's special about meatware that makes sapience possible?  With regard to the Chinese Room, I side with the crowd that says the consciousness is simulated and still present.

Quote from: Soadreqm
I think that if a machine is capable of demanding civil rights, the question of whether it's really sentient is largely irrelevant. The ethical ramifications of killing a supercomputer don't actually affect what people will do. The more personable the AI is, the easier it will be to get people to sympathize with it, but that's just appearing human, and doesn't necessarily have much to do with what really goes on in its mind. And anyway, humans are perfectly capable of killing other humans. If enough people think that the AI needs to die, it's going to die regardless.

I think there is some merit to this as well.  We'll probably never be able to prove whether any AI is conscious or not, and thus whether it suffers or otherwise needs to be protected.  If it behaves like a human (say, it's a p-zombie), then for all intents and purposes we'll need to protect it with rights anyway, because it would feel wrong not to do so.

Of course, I believe that a p-zombie is impossible: by creating one I believe you're necessarily simulating a consciousness.
Logged
Through pain, I find wisdom.