Bay 12 Games Forum

Pages: 1 ... 6 7 [8] 9 10 ... 14

Author Topic: Peter Molyneux, you cheeky little rascal.  (Read 14530 times)

Enzo

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #105 on: June 06, 2009, 08:29:33 pm »

Is a dog's mind not a mind because it is lesser than human?

The problem with this analogy is that if you kick a dog it will express pain. If you kick a robot it won't do anything unless you've made sure in advance that the receiveKick function calls expressPain, or whatever. Any response that has been specifically programmed by a human being isn't an emotional response; it's just a hollow mathematical routine that mimics a legitimate emotional response. We are so incredibly far away from the kind of simulations you're talking about, Amp, or from even knowing if they're truly possible, that I'm of the mind the planet will be long scorched and barren before it ever happens.
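That pre-wired kind of response can be made concrete with a toy sketch (the class, method, and response strings here are made up for illustration, riffing on the hypothetical function names above):

```python
class Robot:
    """A robot whose 'pain' is nothing but a programmer's lookup table."""

    RESPONSES = {
        "kick": "Ouch!",   # hand-written in advance, not felt
        "pat": "Thanks.",
    }

    def receive_stimulus(self, stimulus):
        # The robot 'expresses pain' only because someone anticipated
        # the stimulus; anything unanticipated produces no response at all.
        return self.RESPONSES.get(stimulus)

robot = Robot()
print(robot.receive_stimulus("kick"))   # Ouch!
print(robot.receive_stimulus("punch"))  # None
```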

Also, I don't think Flight Simulators can pilot planes. You would need a different program (similar in a lot of ways) for that. The new program wouldn't be called a Flight Simulator, because it wouldn't be simulating a flight, but you could certainly call it a Pilot Simulator without using the word incorrectly. And god, let's not start a philosophical discussion on the nature of experience, shall we? I've read enough about that already to bore me in advance for years.
Logged

Ampersand

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #106 on: June 06, 2009, 08:45:07 pm »

Is a dog's mind not a mind because it is lesser than human?

The problem with this analogy is that if you kick a dog it will express pain. If you kick a robot it won't do anything unless you've made sure in advance that the receiveKick function calls expressPain, or whatever. Any response that has been specifically programmed by a human being isn't an emotional response; it's just a hollow mathematical routine that mimics a legitimate emotional response. We are so incredibly far away from the kind of simulations you're talking about, Amp, or from even knowing if they're truly possible, that I'm of the mind the planet will be long scorched and barren before it ever happens.

Also, I don't think Flight Simulators can pilot planes. You would need a different program (similar in a lot of ways) for that. The new program wouldn't be called a Flight Simulator, because it wouldn't be simulating a flight, but you could certainly call it a Pilot Simulator without using the word incorrectly. And god, let's not start a philosophical discussion on the nature of experience, shall we? I've read enough about that already to bore me in advance for years.

But how is that different from any biological reaction? You do not react to pain until the signal travels through your nervous system to your brain to notify you that you have received a painful stimulus. At that point your brain responds by instructing the afflicted limb to pull away from the stimulus (automatically, I might add), floods itself with endorphins to dull the pain, and releases another chemical, whose name I don't recall, that inhibits the pleasure centers so that the stimulus is associated with a negative response.
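For what it's worth, the reflex chain just described can be written down as exactly the kind of stimulus-response routine a robot would run (a loose illustration only, not accurate physiology):

```python
def on_painful_stimulus(limb):
    """The biological pain response, phrased as a stimulus-response
    routine (a loose illustration, not accurate physiology)."""
    actions = []
    actions.append(f"withdraw {limb}")          # automatic reflex
    actions.append("release endorphins")        # dull the pain
    actions.append("inhibit pleasure centers")  # tag stimulus as negative
    return actions

print(on_painful_stimulus("hand"))
```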

And also, we already let simulations fly planes, and they're often better than humans.
« Last Edit: June 06, 2009, 08:47:10 pm by Ampersand »
Logged
!!&!!

cowofdoom78963

  • Bay Watcher
  • check
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #107 on: June 06, 2009, 08:47:12 pm »

The problem with this analogy is that if you kick a dog it will express pain. If you kick a robot it won't do anything unless you've made sure in advance that the receiveKick function calls expressPain, or whatever. Any response that has been specifically programmed by a human being isn't an emotional response; it's just a hollow mathematical routine that mimics a legitimate emotional response.
So do you; what makes you so special? Also, would this make it okay to kick a dog if it doesn't express pain?

Quote
Also, I don't think Flight Simulators can pilot planes. You would need a different program (similar in a lot of ways) for that. The new program wouldn't be called a Flight Simulator, because it wouldn't be simulating a flight, but you could certainly call it a Pilot Simulator without using the word incorrectly. And god, let's not start a philosophical discussion on the nature of experience, shall we? I've read enough about that already to bore me in advance for years.
Err, I don't think you got my point. If the "pilot" simulator was designed to pilot a plane, it wouldn't be called a pilot simulator, just like if a robot was designed to express emotions in response to real situations, they wouldn't be "simulated" emotions.
Logged

Tigershark13

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #108 on: June 06, 2009, 09:18:16 pm »

the answer... recreate the Terminator films IRL :D
Logged

Enzo

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #109 on: June 06, 2009, 11:10:19 pm »

But, how is that different from any biological reaction?
Programmed by a human in simple mathematical terms.

Edit : To clarify, emotion is not mathematical. Ask a machine to detect pain-causing stimuli, sure. Ask a machine to describe pain? Without a preprogrammed response? I don't see it happening.

Err, I don't think you got my point. If the "pilot" simulator was designed to pilot a plane, it wouldn't be called a pilot simulator, just like if a robot was designed to express emotions in response to real situations, they wouldn't be "simulated" emotions.
Apparently I don't get your point. Whether or not we refer to them as "simulated", they still are, as they are an imitation of the real thing.

And yeah, I realize a robot can fly a plane. I'm just saying, it's still a simulation of a pilot. It doesn't have human reasoning, or human error (so in this case, it's usually superior).
« Last Edit: June 06, 2009, 11:35:17 pm by kinseti »
Logged

Ampersand

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #110 on: June 06, 2009, 11:45:00 pm »

I think that I take issue most with you saying that any simulation made by human programming will never be able to conceptualize emotion. What if I created a perfect simulation of a human brain, as is currently being done with various supercomputers?

Mathematics can be used to describe everything that occurs within the universe. Every physical process. Yes, what occurs in your brain is also a physical process. The mathematics of emotions are the mathematics of chemical reactions and electrical impulses and charges. It is something complex but still something that can be described.
Logged
!!&!!

cowofdoom78963

  • Bay Watcher
  • check
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #111 on: June 06, 2009, 11:48:51 pm »

Alright, regarding machines describing pain: the same goes for humans. Have a human describe ultraviolet light to you. They can't do it.
Logged

Enzo

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #112 on: June 07, 2009, 12:27:04 am »

Alright, this is an argument as old as time.

Or, well, computers. I'm not saying that it is mathematically impossible to simulate the complex chemical reactions taking place inside a brain. What I am saying is, the absolute understanding of numerous different sciences would have to be balanced in perfect harmony to do so. I'm not saying it's impossible. I'm saying we are nowhere close. (I'm also saying, in my personal opinion, man's ability to destroy will outstrip his ability to create, and we'll see a big ball of dust before the positronic brain. But hey, I'm an optimist.)

All this talk of what defines a "simulation" is sort of just semantics.

Also, I can describe ultraviolet light. Invisible. Heh.
« Last Edit: June 07, 2009, 12:34:12 am by kinseti »
Logged

Smitehappy

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #113 on: June 07, 2009, 01:24:16 am »

Yoink!
« Last Edit: June 07, 2009, 01:33:03 am by Smitehappy »
Logged
Interestingly, Armok's name actually originates from arm_ok, a variable in one of Toady's earlier games that kept track of how many of your arms weren't missing.

Vodalian

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #114 on: June 07, 2009, 01:53:52 am »

That's because it is essentially humans you are speaking to. It uses the technology behind Jabberwacky, and Jabberwacky works using a sort of feedback loop. When you say something to it, it remembers what you said and then repeats it to someone else later; it then remembers the response and uses it the next time someone asks the same question. It remembers the context using a tree. It's quite basic, actually.

Yeah, that was what I figured. Explains why it always tries to call me cleverbot, for one thing. And I got it caught in the same loop once or twice, due to my tendency to pretend to be Samuel L. Jackson over the internet.

Clever : What is your name?
Me : Samuel L. Jackson.
Clever : Where is him?
Me: Where is grammar?
Clever : In west africa.

But occasionally it does something I can't explain. Once I told Cleverbot my name was Flash Gordon, and much later in the conversation (I was bored) I asked it simply "Why", to which it responded "Because you are Flash Gordon", prompting me to repeatedly type "SAY MY NAME. WHAT IS MY NAME." trying to get it to repeat it, as I've never actually had a chatbot remember my name from one line to the next. This is what makes me think it is actually, at least occasionally, just a room full of Indians fucking with stupid Americans.

It uses a context tree; it also has some hard-coded rules, which are easy to fool by mixing things up a bit. For instance:

Quote
Me: My name is andy
Bot: My name is cleverbot (hard coded)
Me: What is my name?
Bot: Andy (hard coded)
Me: who am i
Bot: Bye Honey
Me: What name do I go by?
Bot: You are Krista.
Me: what did my parents name me
Bot: You named me Jesus. (lol)
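Put together, the feedback loop and the literal-match hard-coded rules described above could look something like this toy sketch (nothing here is Cleverbot's or Jabberwacky's actual code; the class name, rules, and canned lines are made up for illustration):

```python
import random

class ToyChatterbot:
    """Toy sketch of the mechanism described above: a few hard-coded
    rules sitting in front of replies learned from previous users."""

    HARD_CODED = {
        "what is your name?": "My name is cleverbot.",
    }

    def __init__(self):
        self.learned = {}       # prompt -> replies seen from users
        self.last_line = None   # what the bot was last responding to

    def talk(self, user_line):
        key = user_line.strip().lower()
        # Feedback loop: the user's line is stored as a plausible reply
        # to the previous prompt, for reuse on the next person.
        if self.last_line is not None:
            self.learned.setdefault(self.last_line, []).append(user_line)
        if key in self.HARD_CODED:          # literal match only --
            reply = self.HARD_CODED[key]    # rephrasings fall through
        elif key in self.learned:
            reply = random.choice(self.learned[key])
        else:
            reply = "Bye Honey."            # canned fallback
        self.last_line = key
        return reply
```

Because the rules fire only on exact phrasings, "Who am I?" or "What name do I go by?" miss them and fall back to stale lines learned from earlier users, which is consistent with the exchange quoted above.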
« Last Edit: June 07, 2009, 01:56:08 am by Vodalian »
Logged

Virroken

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #115 on: June 07, 2009, 02:48:05 am »

I think that I take issue most with you saying that any simulation made by human programming will never be able to conceptualize emotion. What if I created a perfect simulation of a human brain, as is currently being done with various supercomputers?
Can you send/link me some info on that? Even the names of lead researchers would be great. I am intensely interested in this, though doubtful. Modern cognitive science isn't even close to understanding the brain, much less coding a perfect reproduction.

Supercomputers, of course, solve everything.
Logged

Sordid

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #116 on: June 07, 2009, 03:25:17 am »

Or, well, computers. I'm not saying that it is mathematically impossible to simulate the complex chemical reactions taking place inside a brain. What I am saying is, the absolute understanding of numerous different sciences would have to be balanced in perfect harmony to do so.

No it wouldn't. Firstly, there are a lot of unbalanced, deranged people. Just because they're not in perfect harmony doesn't automatically mean they don't have a mind, now does it? So no, perfect harmony isn't necessary.
As for complexity, that also isn't necessary. Our brains are products of evolution with many vestiges of their ancestry, essentially analog computers. As you say, an artificial brain would have none of that; it would be concise mathematical formulas. How exactly does that make it the lesser mind, though? Because from where I'm standing, achieving the same effect with none of the overcomplexity of a biological brain would make the artificial mind the superior one.

Quote
Also, I can describe ultraviolet light. Invisible. Heh.

That's not a description, that's admitting that you can't describe it.
Logged

ThtblovesDF

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #117 on: June 07, 2009, 03:38:58 am »

In other news, I <3 reading this thread.

Please continue.
Logged

umiman

  • Bay Watcher
  • Voice Fetishist
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #118 on: June 07, 2009, 03:43:54 am »

Wow, how many times has Peter given his "interactivity" speech?

dreiche2

  • Bay Watcher
    • View Profile
Re: Peter Molyneux, you cheeky little rascal.
« Reply #119 on: June 07, 2009, 05:49:48 am »

I think that I take issue most with you saying that any simulation made by human programming will never be able to conceptualize emotion. What if I created a perfect simulation of a human brain, as is currently being done with various supercomputers?
Can you send/link me some info on that? Even the names of lead researchers would be great. I am intensely interested in this, though doubtful. Modern cognitive science isn't even close to understanding the brain, much less coding a perfect reproduction.

Supercomputers, of course, solve everything.

Well, for example there are the people around Henry Markram in Lausanne who are trying to simulate cortical networks in great detail on a Blue Gene supercomputer. However, afaik they can simulate maybe hundreds of thousands of neurons, whereas the human brain has a hundred *billion* neurons.

That's partly because they use such detailed simulations of neuronal morphology. However, many neuroscientists think that approach is getting nowhere anyway: the brain is so complicated, so full of details that might not actually matter for the "essential" processing, and so much is unknown, that "we simulate everything from the bottom up" might just be futile.
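For a sense of scale, even the crudest point-neuron model (far simpler than the morphologically detailed ones mentioned above) gives a flavor of what "simulating a neuron" means. Here is a standard leaky integrate-and-fire unit as a rough sketch; the parameter values are illustrative, not taken from any particular study:

```python
def simulate_lif(input_current, steps=1000, dt=0.1,
                 tau=10.0, v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0):
    """Leaky integrate-and-fire point neuron: the membrane voltage
    decays toward rest, integrates the input current, and 'spikes'
    (then resets) when it crosses threshold. Illustrative ms/mV units."""
    v = v_rest
    spikes = 0
    for _ in range(steps):
        # Euler step of dv/dt = (-(v - v_rest) + input_current) / tau
        v += dt * (-(v - v_rest) + input_current) / tau
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes

print(simulate_lif(20.0))  # strong drive: repeated spiking
print(simulate_lif(5.0))   # weak drive: never reaches threshold
```

Detailed morphological models replace this single voltage variable with thousands of coupled compartments per neuron, which is a large part of why even a supercomputer tops out at hundreds of thousands of cells.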

What is more likely IMO is that we will develop AI that shares capabilities with humans but is implemented in different ways, and that is exactly what will make it so difficult to tell whether it has consciousness or emotions. Personally I haven't thought much about emotions, but for consciousness I am starting to think that it might actually come about simply from certain types of information processing and representation, and I'm not so sure anymore that my laptop might not have consciousness in some way :)

Edit: I can give an example of a model with potential for consciousness if anyone is interested, but this is getting somewhat off topic here...
« Last Edit: June 07, 2009, 05:54:37 am by dreiche2 »
Logged