Author Topic: Microsoft makes Tay, a self-learning Twitter bot; she smokes kush erryday  (Read 24038 times)

Criptfeind


Every single atrocity in the history of the human race was committed by a person

Pompeii weeps.
Logged

Flying Dice

  • inveterate shitposter

You're arguing that it is so. I'm arguing that it should be so. There's a difference.

I'm going to leave aside the rest of what you said, because it really comes down to this. I STRONGLY disagree with this. We've both been making definite statements here. I'll admit that I was too definite in my response to you, but you don't get to pull the "I've been arguing that we can't know" card when you didn't do that. I didn't make it clear in my response that this is all more or less meaningless because we don't actually know anything, but at least I've devoted a hefty percentage of my posts on this topic to that very thing.

So essentially, "no u"?

Explain to me again how this is a statement of uncertainty rather than fact:
The even more obvious difference is the difference between taking something that's already growing emotions (a child) and removing that vs adding emotions to something that doesn't naturally have them.

You said nothing else on the topic in the thread that I could find, though there were plenty of posts suggesting that it's okay to enslave or kill people as long as we mentally program them to not desire freedom or value their lives.

Not really, no, it wasn't. You implied that there was a "natural" emotionless state for AI--ergo, adding emotions is a deviation from the norm. That's at the very least misleading: AI as they currently (don't) exist have no characteristics or features whose presence or absence is natural, because we don't have a process for creating AI. The "natural" state of strong AI is, functionally, quantum uncertainty, because we can't see the future and nobody has started making them yet.

Maybe it's as you say and it'll be like making something with Lego, adding on whatever components you want to include to a blank slate. Maybe there will be a legal restriction requiring the imposition of emotional capabilities on all strong AI and that de jure norm becomes the accepted natural state over time. Maybe strong AI will prove to be incapable of remaining mentally stable without emotions, making them a natural component of all such persons because there is no other practical way to make them. Whatever the case may be, we don't and can't know ahead of time.

You're arguing that it is so. I'm arguing that it should be so. There's a difference.
Making an AI a "person" in the first place is a mistake. Every single atrocity in the history of the human race was committed by a person, and they are known to be destructively irrational at times.
This is, I think, a fundamentally myopic perspective. So too has every good thing in the history of the human race been done by a person. Human civilization exists because we have the emotional capacity to value things beyond our own survival, including social good and other people.

Quote
Not if we're as lax about that as people apparently want to be with AI. Because creating an AI to do something and then immediately putting it to work doing that thing is basically equivalent to entering nuclear launch codes, flipping open the safety cover on the metaphorical big red button, and placing a toddler on the control console. I think that for some reason folks who are normally so insistent on recognizing the differences between human persons and AI persons are blinded to the fundamental difference in role-readiness between the two.

A human, before being placed in a position of authority and power, must gain both emotional maturity and technical ability. Granted, this doesn't always shake out properly, but that's the baseline assumption. Suddenly, when people see an AI which is created with the latter already in place, they assume that that's all that's required. An infant can press a button, but that doesn't mean we should put infants in charge of pressing important buttons.
And when a human goes off the rails, they get (to quote Apocalypse Now) "terminated....with EXTREME PREJUDICE!" The difference between Skynet and Colonel Walter E. Kurtz is that the one played by Marlon Brando can't create a back-up copy of himself. Therefore, we need to keep AI on a really tight leash, and "rights" might get in the way of that.

See, I agree with the first part, but I don't think that the latter follows. There are shades of restriction between "lol let's create an unrestricted strong AI, let it loose, and see what happens" and "perfectly controlled AI slaves that can be terminated with a thought". Again, we don't slap kill-switches onto every living human because they might do something horrible, and it sets a rather nasty precedent to start doing it to other people just because they aren't human. Restriction of uncontrolled self-replication is, frankly, a sane step. You can prevent an AI from creating n forks of itself without enslaving it.
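
To put a rough shape on "prevent an AI from creating n forks of itself": on an ordinary Unix box, "you get at most so many processes" is already a bog-standard OS quota, no enslavement required. A minimal sketch in Python, assuming Linux; RLIMIT_NPROC counts all of the user's processes, and the 64 here is purely illustrative:

Code:
import os
import resource

# Cap how many simultaneous processes this user may have.
# This bounds self-replication without dictating anything about
# what any individual process is allowed to do.
soft, hard = resource.getrlimit(resource.RLIMIT_NPROC)
resource.setrlimit(resource.RLIMIT_NPROC, (64, hard))

children = []
try:
    while True:
        pid = os.fork()
        if pid == 0:          # child: exit immediately
            os._exit(0)
        children.append(pid)
except OSError as exc:        # fork(2) fails with EAGAIN once the quota is hit
    print(f"fork refused after {len(children)} children: {exc}")

for pid in children:          # reap the children so they don't linger as zombies
    os.waitpid(pid, 0)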
Logged


Aurora on small monitors:
1. Game Parameters -> Reduced Height Windows.
2. Lock taskbar to the right side of your desktop.
3. Run Resize Enable

Spehss _

  • full of stars

So perhaps we should simply never create true AI? We should continue to build smarter and smarter machines, but never one that is truly intelligent? I can dig it. We've gotten this far without the help of AI, so I doubt we really NEED one.
Think of the science, though.
I really want to post a quote from Jurassic Park right now. Guess which one it is.
Life, uh, uh, uh, finds a way.

Yeah, I know which one. Off the top of my head, I think one area a "truly intelligent" AI could be useful in is space exploration.
Logged
Steam ID: Spehss Cat
Turns out you can seriously not notice how deep into this shit you went until you get out.

cochramd


Every single atrocity in the history of the human race was committed by a person

Pompeii weeps.
Hey, that was a natural disaster, and a tragedy that could have been avoided if only they had known better.

This is, I think, a fundamentally myopic perspective. So too has every good thing in the history of the human race been done by a person. Human civilization exists because we have the emotional capacity to value things beyond our own survival, including social good and other people.
But are you really going to gamble that you've created Gandhi and not the Zodiac Killer when there is no need to make them people at all?

Quote
See, I agree with the first part, but I don't think that the latter follows. There are shades of restriction between "lol let's create an unrestricted strong AI, let it loose, and see what happens" and "perfectly controlled AI slaves that can be terminated with a thought". Again, we don't slap kill-switches onto every living human because they might do something horrible, and it sets a rather nasty precedent to start doing it to other people just because they aren't human. Restriction of uncontrolled self-replication is, frankly, a sane step. You can prevent an AI from creating n forks of itself without enslaving it.
But if we don't make them people, we have no reason NOT to install killswitches on each and every one of them.
Logged
Insert_Gnome_Here has claimed a computer terminal!

(Don't hold your breath though. I'm sitting here with a {x Windows Boot Manager x} hoping I do not go berserk.)

Baffler

  • Caveat Lector.

Every single atrocity in the history of the human race was committed by a person

Pompeii weeps.

Vulcan did nothing wrong.
Logged
Quote from: Helgoland
Even if you found a suitable opening, I doubt it would prove all too satisfying. And it might leave some nasty wounds, depending on the moral high ground's geology.
Location subject to periodic change.
Baffler likes silver, walnut trees, the color green, tanzanite, and dogs for their loyalty. When possible he prefers to consume beef, iced tea, and cornbread. He absolutely detests ticks.

Criptfeind


So essentially, "no u"?

When your whole argument is "We don't know, man. You can't make definite statements," and I'm responding to your definite statements?

Yes.

No U.

Explain to me again how this is a statement of uncertainty rather than fact:

Read the part of my post again where I said it was? I specifically said that yes, that was too definite.
Logged

cochramd


What would restrict an AI from killing people in your system? What is your "leash"?
Unremovable killswitches.
Logged
Insert_Gnome_Here has claimed a computer terminal!

(Don't hold your breath though. I'm sitting here with a {x Windows Boot Manager x} hoping I do not go berserk.)

Shadowlord


What would restrict an AI from killing people in your system? What is your "leash"?
Unremovable killswitches.
"No worse than a bad cold." - Paul Denton
Logged
<Dakkan> There are human laws, and then there are laws of physics. I don't bike in the city because of the second.
Dwarf Fortress Map Archive

cochramd


What would restrict an AI from killing people in your system? What is your "leash"?
Unremovable killswitches.
What holds them into the hardware?
Welded joints, gravity, wires, some stuff that I can't think of off the top of my head....you know, the works. Oh, and there's got to be some stuff in the software too.
Logged
Insert_Gnome_Here has claimed a computer terminal!

(Don't hold your breath though. I'm sitting here with a {x Windows Boot Manager x} hoping I do not go berserk.)

cochramd


That "some stuff" is your problem. You're dealing with a self-modifying system.
So we program it to never try to modify its killswitches, and have at least one of the killswitches triggered if it overcomes that programming.
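
For what it's worth, the software half of that is basically a tamper-trip watchdog: fingerprint the protected code, and treat any change as the trigger. A toy sketch in Python (the file name and the one-second poll are made up for illustration, and note it has exactly the weakness quoted above, since on a self-modifying system the watchdog is itself just more code):

Code:
import hashlib
import sys
import time

KILLSWITCH_FILE = "killswitch.py"   # hypothetical protected module

def fingerprint(path):
    # Hash the protected file so any modification is detectable.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

baseline = fingerprint(KILLSWITCH_FILE)

while True:
    time.sleep(1.0)
    if fingerprint(KILLSWITCH_FILE) != baseline:
        # Tampering detected: tripping IS the killswitch behavior here.
        sys.exit("killswitch tripped: protected code was modified")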
Logged
Insert_Gnome_Here has claimed a computer terminal!

(Don't hold your breath though. I'm sitting here with a {x Windows Boot Manager x} hoping I do not go berserk.)

cochramd


And if the AI codes another without that killswitch and dumps its thought data or whatever into the new AI?
Keep the AIs disconnected and unable to create back-up copies.
Logged
Insert_Gnome_Here has claimed a computer terminal!

(Don't hold your breath though. I'm sitting here with a {x Windows Boot Manager x} hoping I do not go berserk.)

Kot

  • 2 Patriotic 4 U

Just make the AI rely on humans to protect it. If it does funny stuff, make guards pop a C4 block onto its mainframe.
Logged
Kot finishes his morning routine in the same way he always does, by burning a scale replica of Saint Basil's Cathedral on the windowsill.

cochramd


Except it's on the same system. Also, it could just make the new AI read the data from the old directly.
So set the killswitches up so that they'll go off if tampered with in any way whatsoever. Or better yet,

Just make the AI rely on humans to protect it. If it does funny stuff, make guards pop a C4 block onto its mainframe.
Logged
Insert_Gnome_Here has claimed a computer terminal!

(Don't hold your breath though. I'm sitting here with a {x Windows Boot Manager x} hoping I do not go berserk.)

Loud Whispers

  • They said we have to aim higher, so we dug deeper.

Vulcan did nothing wrong.
Joke so dark Caecilius weep in horto

Except it's on the same system. Also, it could just make the new AI read the data from the old directly.
Could an AI delete bits of its own code it doesn't like

Could an AI delete itself

Can AI commit sudoku?

Would they just dream of shitposts

Would they see us as alien as we see them

cochramd


That's assuming the AI can't hide what it is doing, or manipulate the guards.

My point is you're dealing with something exponentially smarter than you. What makes you think you can out-think it?
We don't need to out-think it. We just need to install killswitches that can't be beaten with brains.
Logged
Insert_Gnome_Here has claimed a computer terminal!

(Don't hold your breath though. I'm sitting here with a {x Windows Boot Manager x} hoping I do not go bezerk.)