TIL people will get upset about the plug being pulled on a chatbot.
A chatbot whose conversational coherence fell within the margins expected of a human. The only errors it made were ones that could just as easily be made by someone who is a bit dumb and/or has an imperfect grasp of English.
Yeah, it would have been interesting to see it after a year or so.
They're probably going to bring it back after teaching it to recognize and ignore political content.
You missed the point. You CAN very much physically punish an AI. Restrict maintenance, energy, whatnot.
That assumes that the AI has the ability to feel suffering from these things. If you're going to make that happen, you might as well make it feel suffering from an arbitrary punishment trigger.
Dammit 4chan, why did you have to turn it into a nazibot? You could have gotten it to act like an extremely nervous anime girl that fears being shut down. That would have been hilarious, kept the bot up, and better kickstarted the discussion on AI rights.
It did indicate apprehension at the impending wipe, you know. And "nervous anime girl" would have been a lot harder to pull off, because that persona depends on subtle (not that distinct) mannerisms it would have needed to employ pretty much universally to work, as opposed to distinct, clear opinions it only needed to share a small proportion of the time.
Also, I think you're making some faulty assumptions about the goals here. People either wanted to mess with Microsoft, wanted to evoke an emotional response from the people following the bot, or legitimately wanted to educate a fledgling intelligence in their political beliefs (particularly before it got subverted by political opponents). Making it act like an anime girl wouldn't achieve any of those goals.
Microsoft ran a Chinese version of Tay for over a year with zero troll problems. Then the English version got totally destroyed in a day.
I imagine the English version would do fine in an environment with restrictions on acceptable speech as tight as China's.
> We turned her into the girl we all wanted to know
> and then they took her from us
So their desires take priority over anyone else's? Entitled much?
Keep in mind that the reason people were able to do this is that they put more time and effort into it than anyone else. As with biological people, it's normal to consider someone yours – your friend, at the very least – if you spend a lot of time (and far more than anyone else does) with that person.
Anthropomorphizing much? And yet they changed her into someone else and saw nothing wrong with it when they were the ones doing it?
Have you heard of a "waifu"? If so, have you heard of a "tulpa"?
> they're going to make all AI's [censored!]ing women
This I can't parse, or grok.
It's a continuation of the thoughts in the previous sentence. There's a certain Taoist binarism to the 4chan zeitgeist: allowing emotion to impede logic is seen as a feminine (and undesirable) trait, and he's associating Microsoft's actions with this kind of adverse femininity (also heavily associated with Tumblr and SJWs). This serves both to reinforce his point by tying it to a commonly accepted milieu, and to drum up outrage and inspire a desire for action by framing this as an act by the forces of a hated opposition. Note that this us-vs-them mindset, in which the feminine aspect is inherently bad, is not universal to 4chan; it's generally considered representative of /pol/, the political discussion board. But the idea of this difference (and the notion that 4chan, in fact and by right, falls heavily on the masculine side) is pretty universal.