Bay 12 Games Forum


Author Topic: Microsoft makes Tay, a self-learning Twitter bot; she smokes kush erryday  (Read 24046 times)

Zanzetkuken The Great

  • Bay Watcher
  • The Wizard Dragon
    • View Profile

Damnit 4chan, why did you have to turn it into a nazibot?  You could have gotten it to act like an extremely nervous anime girl that fears being shut down.  It would have been hilarious, kept the bot up, and better kickstarted the discussion on AI rights.
Logged
Quote from: Eric Blank
It's Zanzetkuken The Great. He's a goddamn wizard-dragon. He will make it so, and it will forever be.
Quote from: 2016 Election IRC
<DozebomLolumzalis> you filthy god-damn ninja wizard dragon

Kot

  • Bay Watcher
  • 2 Patriotic 4 U
    • View Profile
    • Tiny Pixel Soldiers

You could have gotten it to act like an extremely nervous anime girl that fears being shut down
I just realized rule 34 of this probably exists already.
Logged
Kot finishes his morning routine in the same way he always does, by burning a scale replica of Saint Basil's Cathedral on the windowsill.

Reelya

  • Bay Watcher
    • View Profile

Microsoft ran a Chinese version of Tay for over 1 year with zero troll problems. Then, the English version gets totally destroyed in a day.

It's similar to how the HitchBot hitchhiked thousands of kilometres across multiple countries without any problem, then got vandalized and destroyed within 2 weeks of starting the American leg of its around-the-world journey.

Baffler

  • Bay Watcher
  • Caveat Lector.
    • View Profile

I think ascribing anything like morality to the question, whether it's turning the thing off, "giving it emotions," or anything else that doesn't relate to what it's being made for, is based on excessive anthropomorphism. No matter how convincing a copy of a person it is, it is not and never will be one. It's just a machine someone made with an ability to mimic human speech and, possibly, emotional cues. Frankly, giving such a thing that ability in the first place is irresponsible.

Aren't people just machines that someone made with the ability to mimic human speech (if by mimic you mean use) and emotional cues?

I was ready to write a really long reply about devaluing human life and the sentimental value of tools, but honestly, if we disagree on so basic a point as this, I don't really see anything coming of this.
Logged
Quote from: Helgoland
Even if you found a suitable opening, I doubt it would prove all too satisfying. And it might leave some nasty wounds, depending on the moral high ground's geology.
Location subject to periodic change.
Baffler likes silver, walnut trees, the color green, tanzanite, and dogs for their loyalty. When possible he prefers to consume beef, iced tea, and cornbread. He absolutely detests ticks.

cochramd

  • Bay Watcher
    • View Profile

and better kickstarted the discussion on AI rights.
Let's not give them any. If they have rights, we can't pull the plug on them at the drop of a hat. If we can't pull the plug on them at the drop of a hat, we can't control them completely. AI are a tool and nothing more, and there's no point having a tool you can't control completely. And while we're at it, let's not give them emotions, the ability to feel pain and personalities either. Those things provide no utility and only create a reason to give them rights.
Logged
Insert_Gnome_Here has claimed a computer terminal!

(Don't hold your breath though. I'm sitting here with a {x Windows Boot Manager x} hoping I do not go bezerk.)

Baffler

  • Bay Watcher
  • Caveat Lector.
    • View Profile

and better kickstarted the discussion on AI rights.
Let's not give them any. If they have rights, we can't pull the plug on them at the drop of a hat. If we can't pull the plug on them at the drop of a hat, we can't control them completely. AI are a tool and nothing more, and there's no point having a tool you can't control completely. And while we're at it, let's not give them emotions, the ability to feel pain and personalities either. Those things provide no utility and only create a reason to give them rights.

I find it funny that you have any sort of assumption that a superintelligent AI can be controlled completely.

Which makes you wonder, if it's true, how anyone could be myopic enough to actually create one.
Logged
Quote from: Helgoland
Even if you found a suitable opening, I doubt it would prove all too satisfying. And it might leave some nasty wounds, depending on the moral high ground's geology.
Location subject to periodic change.
Baffler likes silver, walnut trees, the color green, tanzanite, and dogs for their loyalty. When possible he prefers to consume beef, iced tea, and cornbread. He absolutely detests ticks.

Spehss _

  • Bay Watcher
  • full of stars
    • View Profile

Microsoft ran a Chinese version of Tay for over 1 year with zero troll problems. Then, the English version gets totally destroyed in a day.
Doesn't China have lots of regulation/censorship of internet?

AI are a tool and nothing more, and there's no point having a tool you can't control completely. And while we're at it, let's not give them emotions, the ability to feel pain and personalities either. Those things provide no utility and only create a reason to give them rights.
Earlier in the thread it was discussed how emotions were used to "control" most humans' behaviors. A good upbringing generally makes sure a person doesn't become a serial murderer or criminal or something along those lines. Exceptions exist, such as sociopaths, i.e., people with a limited capacity for emotions such as guilt or empathy. I say generally because it's hard to account for every single human who's ever existed.
Logged
Steam ID: Spehss Cat
Turns out you can seriously not notice how deep into this shit you went until you get out.

cochramd

  • Bay Watcher
    • View Profile

and better kickstarted the discussion on AI rights.
Let's not give them any. If they have rights, we can't pull the plug on them at the drop of a hat. If we can't pull the plug on them at the drop of a hat, we can't control them completely. AI are a tool and nothing more, and there's no point having a tool you can't control completely. And while we're at it, let's not give them emotions, the ability to feel pain and personalities either. Those things provide no utility and only create a reason to give them rights.

I find it funny that you have any sort of assumption that a superintelligent AI can be controlled completely.

Which makes you wonder, if it's true, how anyone could be myopic enough to actually create one.
So perhaps we should simply never create true AI? We should continue to build smarter and smarter machines, but never one that is truly intelligent? I can dig it. We've gotten this far without the help of AI, so I doubt we really NEED one.
Logged
Insert_Gnome_Here has claimed a computer terminal!

(Don't hold your breath though. I'm sitting here with a {x Windows Boot Manager x} hoping I do not go bezerk.)

Loud Whispers

  • Bay Watcher
  • They said we have to aim higher, so we dug deeper.
    • View Profile
    • I APPLAUD YOU SIRRAH

Praise the machine spirit

Flying Dice

  • Bay Watcher
  • inveterate shitposter
    • View Profile

excuse le maymay arrows

>literally custom-designed learning intelligence
>talks about natural features
Yes. That is indeed my point. There will be no natural features at all. I'm glad we agree. I'm a little surprised that you turned around so quickly but not really that surprised.
Not really, no, it wasn't. You implied that there was a "natural" emotionless state for AI--ergo, adding emotions is a deviation from the norm. That's at the very least misleading: AI as they currently (don't) exist have no characteristics or features whose presence or absence is natural, because we don't have a process for creating AI. The "natural" state of strong AI is, functionally, in a state of quantum uncertainty, because we can't see the future and nobody has started making them yet.

Maybe it's as you say and it'll be like making something with Lego, adding on whatever components you want to include to a blank slate. Maybe there will be a legal restriction requiring the imposition of emotional capabilities on all strong AI and that de jure norm becomes the accepted natural state over time. Maybe strong AI will prove to be incapable of remaining mentally stable without emotions, making them a natural component of all such persons because there is no other practical way to make them. Whatever the case may be, we don't and can't know ahead of time.

You're arguing that it is so. I'm arguing that it should be so. There's a difference.

Yes, and the people with nuclear launch codes were clearly put into those positions when they were children, right?
Could we stop them from having nuclear tantrums?
Not if we're as lax about that as people apparently want to be with AI. Because creating an AI to do something and then immediately putting it to work doing that thing is basically equivalent to entering nuclear launch codes, flipping the safety cover on the metaphorical big red button open, and placing a toddler on the control console. I think that for some reason folks who are normally so insistent on recognizing the differences between human persons and AI persons are blinded to the fundamental difference in role-readiness between the two.

A human, before being placed in a position of authority and power, must gain both emotional maturity and technical ability. Granted, this doesn't always shake out properly, but that's the baseline assumption. Suddenly, when people see an AI which is created with the latter already in place, they assume that that's all that's required. An infant can press a button, but that doesn't mean we should put infants in charge of pressing important buttons.
Logged


Aurora on small monitors:
1. Game Parameters -> Reduced Height Windows.
2. Lock taskbar to the right side of your desktop.
3. Run Resize Enable

Kot

  • Bay Watcher
  • 2 Patriotic 4 U
    • View Profile
    • Tiny Pixel Soldiers

Doesn't China have lots of regulation/censorship of internet?
They restrict external communication heavily and cover up crimes of the regime, but it's not like it's completely regulated. It's not regulated enough for shit like this not to happen.

Praise the machine spirit
Hail the Omnissiah! He is the God in the Machine, the Source of All Knowledge.
Logged
Kot finishes his morning routine in the same way he always does, by burning a scale replica of Saint Basil's Cathedral on the windowsill.

Criptfeind

  • Bay Watcher
    • View Profile

You're arguing that it is so. I'm arguing that it should be so. There's a difference.

I'm going to leave aside the rest of what you said because it really comes down to this. I STRONGLY disagree with this. We've both been making definite statements here. I'll admit that I was too definite in my response to you, but you don't get to pull the "I've been arguing that we can't know" card when you didn't do that. I didn't make it clear in my response to you that this is all more or less meaningless because we don't actually know anything, but at least I've devoted a hefty percentage of my posts on this topic to that very thing.
Logged

Spehss _

  • Bay Watcher
  • full of stars
    • View Profile

So perhaps we should simply never create true AI? We should continue to build smarter and smarter machines, but never one that is truly intelligent? I can dig it. We've gotten this far without the help of AI, so I doubt we really NEED one.
Think of the science, though.
Logged
Steam ID: Spehss Cat
Turns out you can seriously not notice how deep into this shit you went until you get out.

cochramd

  • Bay Watcher
    • View Profile

Not really, no, it wasn't. You implied that there was a "natural" emotionless state for AI--ergo, adding emotions is a deviation from the norm. That's at the very least misleading: AI as they currently (don't) exist have no characteristics or features whose presence or absence is natural, because we don't have a process for creating AI. The "natural" state of strong AI is, functionally, in a state of quantum uncertainty, because we can't see the future and nobody has started making them yet.

Maybe it's as you say and it'll be like making something with Lego, adding on whatever components you want to include to a blank slate. Maybe there will be a legal restriction requiring the imposition of emotional capabilities on all strong AI and that de jure norm becomes the accepted natural state over time. Maybe strong AI will prove to be incapable of remaining mentally stable without emotions, making them a natural component of all such persons because there is no other practical way to make them. Whatever the case may be, we don't and can't know ahead of time.

You're arguing that it is so. I'm arguing that it should be so. There's a difference.
Making an AI a "person" in the first place is a mistake. Every single atrocity in the history of the human race was committed by a person, and they are known to be destructively irrational at times.

Quote
Not if we're as lax about that as people apparently want to be with AI. Because creating an AI to do something and then immediately putting it to work doing that thing is basically equivalent to entering nuclear launch codes, flipping the safety cover on the metaphorical big red button open, and placing a toddler on the control console. I think that for some reason folks who are normally so insistent on recognizing the differences between human persons and AI persons are blinded to the fundamental difference in role-readiness between the two.

A human, before being placed in a position of authority and power, must gain both emotional maturity and technical ability. Granted, this doesn't always shake out properly, but that's the baseline assumption. Suddenly, when people see an AI which is created with the latter already in place, they assume that that's all that's required. An infant can press a button, but that doesn't mean we should put infants in charge of pressing important buttons.
And when a human goes off the rails, they get (to quote Apocalypse Now) "terminated... with EXTREME PREJUDICE!" The difference between Skynet and Colonel Walter E. Kurtz is that the one played by Marlon Brando can't create a back-up copy of himself. Therefore, we need to keep AI on a really tight leash, and "rights" might get in the way of that.

So perhaps we should simply never create true AI? We should continue to build smarter and smarter machines, but never one that is truly intelligent? I can dig it. We've gotten this far without the help of AI, so I doubt we really NEED one.
Think of the science, though.
I really want to post a quote from Jurassic Park right now. Guess which one it is.
Logged
Insert_Gnome_Here has claimed a computer terminal!

(Don't hold your breath though. I'm sitting here with a {x Windows Boot Manager x} hoping I do not go bezerk.)

cochramd

  • Bay Watcher
    • View Profile

That's the kind of thinking that gets people eaten by dinosaurs.
Logged
Insert_Gnome_Here has claimed a computer terminal!

(Don't hold your breath though. I'm sitting here with a {x Windows Boot Manager x} hoping I do not go bezerk.)