Bay 12 Games Forum

Pages: 1 ... 5 6 [7] 8 9 ... 14

Author Topic: Peter Molyneux, you cheeky little rascal.  (Read 14481 times)

Enzo

Re: Peter Molyneux, you cheeky little rascal.
« Reply #90 on: June 06, 2009, 01:57:47 pm »

That's because it is essentially humans you are speaking to. It uses the technology behind Jabberwacky, and Jabberwacky works using a sort of feedback loop. When you say something to it, it remembers what you say and repeats it to someone else later; it then remembers the response and uses it the next time someone asks the same question. It remembers the context using a tree. It's quite basic, actually.

Yeah, that was what I figured. Explains why it always tries to call me cleverbot, for one thing. And I got it caught in the same loop once or twice, due to my tendency to pretend to be Samuel L. Jackson over the internet.

Clever : What is your name?
Me : Samuel L. Jackson.
Clever : Where is him?
Me: Where is grammar?
Clever : In west africa.

But occasionally it does something I can't explain. Once I told Cleverbot my name was Flash Gordon, and much later in the conversation (I was bored) I asked it simply "Why" to which it responded "Because you are Flash Gordon." prompting me to repeatedly type "SAY MY NAME. WHAT IS MY NAME." trying to get it to repeat it, as I've never actually had a chatbot remember my name from one line to the next. This is what makes me think it is actually, at least occasionally, just a room full of Indians fucking with stupid Americans.
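The learned-response loop described in the quoted explanation can be sketched roughly like this. This is a minimal toy version keying only on the previous line rather than a full context tree, and all class and method names here are invented; the real Jabberwacky/Cleverbot internals are not public.

```python
# Toy sketch of a feedback-loop chatbot: it answers by replaying what past
# users said in the same spot in a conversation. Names are hypothetical.
from collections import defaultdict
import random

class FeedbackBot:
    def __init__(self):
        # maps a line of conversation to user replies previously seen after it
        self.responses = defaultdict(list)
        self.last_line = ""

    def reply(self, user_line):
        # remember what the user said in response to our last line
        self.responses[self.last_line].append(user_line)
        # answer by replaying something a past user said after this same line
        candidates = self.responses.get(user_line)
        if candidates:
            answer = random.choice(candidates)
        else:
            answer = user_line  # nothing learned yet: just echo the user
        self.last_line = answer
        return answer
```

This also explains the "it always tries to call me Cleverbot" effect: the bot is echoing back things other humans said to it, including humans addressing the bot by name.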

cowofdoom78963

Re: Peter Molyneux, you cheeky little rascal.
« Reply #91 on: June 06, 2009, 02:52:36 pm »

IN SOVIET RUSSIA, BOT IS YOU!

Jakkarra

Re: Peter Molyneux, you cheeky little rascal.
« Reply #92 on: June 06, 2009, 05:43:23 pm »

Speaking hypothetically for a moment,

if Mr. Steve (or anyone else, for that matter) does just so happen to create an AI intelligent enough to have actual conversations and possibly even feelings, is no one else worried about the possible moral and ethical implications of this?

I myself will be very worried when this technology comes along... I'm all for AI that works properly for the games industry, but I'm not so sure about this type of goal.

Hopefully you will understand my meaning; PLEASE put down what you think I mean, so I don't come off as saying something that has no connection with my intended response...

woose1

Re: Peter Molyneux, you cheeky little rascal.
« Reply #93 on: June 06, 2009, 06:30:07 pm »

Even if it does have 'feelings', it won't. Because deep down, it's just a line of C++ (or whatever programming language we have then). But I do know plenty of people who would argue with me on that, saying that 'humans are basically computers too'.

No... we never make the 'best' choice.

Virroken

Re: Peter Molyneux, you cheeky little rascal.
« Reply #94 on: June 06, 2009, 06:34:30 pm »

The AI is being programmed to simulate emotion. There's a large difference between simulation and actual possession.

Tigershark13

Re: Peter Molyneux, you cheeky little rascal.
« Reply #95 on: June 06, 2009, 06:34:56 pm »

...but then we'll have computer hippies... who try to give AI rights... and make violent games illegal :O

Servant Corps

Re: Peter Molyneux, you cheeky little rascal.
« Reply #96 on: June 06, 2009, 06:36:26 pm »

The AI is being programmed to simulate emotion, but so is our brain. If it can do a good job of simulating it, it has just as much emotion as a human being does, unless you can prove that a human being has free will and is thus not limited by its own internal programming. If you can't prove that, then the AI is limited only by its programming in the same way we are.
I have left Bay12Games to pursue a life of non-Bay12Games. If you need to talk to me, please email at me at igorhorst at gmail dot com.

Maric

Re: Peter Molyneux, you cheeky little rascal.
« Reply #97 on: June 06, 2009, 06:38:20 pm »

I have free will  ;D.

Moving my arm up and down, and there's nothing any of you guys can do to stop me!


Cleverbot is also pretty funny.

Jakkarra

Re: Peter Molyneux, you cheeky little rascal.
« Reply #98 on: June 06, 2009, 06:40:58 pm »

You do have a point there, woose... and also Virroken.

Still, I meant in hypothetical terms. ACTUAL emotive responses are beyond current artificial means. Though, we never know what could be down the line; remember before the first PC? No one would ever have thought computers would be able to do what they can today. Hell, formats are changing all the time.

woose1

Re: Peter Molyneux, you cheeky little rascal.
« Reply #99 on: June 06, 2009, 07:09:55 pm »

Quote from: Jakkarra
You do have a point there, woose... and also Virroken.

Still, I meant in hypothetical terms. ACTUAL emotive responses are beyond current artificial means. Though, we never know what could be down the line; remember before the first PC? No one would ever have thought computers would be able to do what they can today. Hell, formats are changing all the time.

I'm sorry, I just don't think we can ever imitate Mother Nature perfectly.

Enzo

Re: Peter Molyneux, you cheeky little rascal.
« Reply #100 on: June 06, 2009, 07:14:43 pm »

Simulating emotion is one thing, and certainly creepy. It doesn't scare me nearly as much as machines breaking the evolution barrier, though; when we program them to program themselves, make abstract deductions, and learn. Combine that with self-replicating nanobots and we're worse off than Skynet.

Can I play the Asimov card? He wrote a lot about the ethical concerns of handling advanced AIs. If I recall correctly, Bicentennial Man was about a robot that developed the capacity for creativity, and his struggle to be recognized as a living creature (despite the fact that he won't age, can't feel pain, etc.). It's been a while since I cracked any Asimov, actually, but it's my favourite sci-fi.

But 'Positronic Brains' are still totally, completely and utterly in the realm of science fiction. Right now we can almost make a robot understand and respond to basic conversation. For a robot to feel we would have to completely replicate the functions of a living brain in 1s and 0s. The mind-boggling complexity of the neuropsychology and programming involved makes me think this will never come to pass, personally, but I don't expect you to agree with me there.

What a great topic this is. Bashing Peter Molyneux and discussing the ethical ramifications of advanced AI? Delicious. HUNDRED REPLY GET.

woose1

Re: Peter Molyneux, you cheeky little rascal.
« Reply #101 on: June 06, 2009, 07:24:10 pm »

Quote from: Enzo
But 'Positronic Brains' are still totally, completely and utterly in the realm of science fiction. Right now we can almost make a robot understand and respond to basic conversation. For a robot to feel we would have to completely replicate the functions of a living brain in 1s and 0s. The mind-boggling complexity of the neuropsychology and programming involved makes me think this will never come to pass, personally, but I don't expect you to agree with me there.

Precisely. We are more likely to destroy ourselves before that ever comes to pass.
Besides, the people who think otherwise also believe in 'Dyson Robotic Trees' and crazy shit like that.

EDIT: I just realized that the reference I just made is so obscure that no one is likely to get it.
http://en.wikipedia.org/wiki/Dyson_tree
« Last Edit: June 06, 2009, 07:26:44 pm by woose1 »

sonerohi

Re: Peter Molyneux, you cheeky little rascal.
« Reply #102 on: June 06, 2009, 07:46:19 pm »

Quote from: woose1
But 'Positronic Brains' are still totally, completely and utterly in the realm of science fiction. Right now we can almost make a robot understand and respond to basic conversation. For a robot to feel we would have to completely replicate the functions of a living brain in 1s and 0s. The mind-boggling complexity of the neuropsychology and programming involved makes me think this will never come to pass, personally, but I don't expect you to agree with me there.
Precisely. We are more likely to destroy ourselves before that ever comes to pass.
Besides, the people who think otherwise also believe in 'Dyson Robotic Trees' and crazy shit like that.

EDIT: I just realized that the reference I just made is so obscure that no one is likely to get it.
http://en.wikipedia.org/wiki/Dyson_tree

Slightly off-topic: I'd heard of the Dyson sphere before and I love the concept, but the Dyson tree sounds odd. You'd need a rather large asteroid, with a large tree and only a couple of people. Otherwise you run into two problems: an atmosphere so shallow that standing up puts you outside of it, and too little oxygen.
I picked up the stone and carved my name into the wind.

Ampersand

Re: Peter Molyneux, you cheeky little rascal.
« Reply #103 on: June 06, 2009, 08:07:05 pm »

The problem with simulation is that it's hard to tell when a simulation stops being a simulation. A perfect simulation is exactly the same as the real thing, right? But where's the line drawn? How many imperfections does it take before a simulation of a mind stops being a 'mind'? I'm not sure there is a set number. Is a dog's mind not a mind because it is lesser than a human's?
!!&!!

cowofdoom78963

Re: Peter Molyneux, you cheeky little rascal.
« Reply #104 on: June 06, 2009, 08:16:36 pm »

It's not a simulation if it is used in a practical situation. For example, a flight simulator that actually pilots a plane is not a flight simulator at all, now is it?

Obviously the bot doesn't have "simulated" emotions; artificial and simple, maybe, but not simulated.

After all, human emotions are pretty much code, except made of DNA instead of C++.