Bay 12 Games Forum


Author Topic: The Technological Singularity thread: an explanation of concepts  (Read 6692 times)

alway

  • Bay Watcher
  • 🏳️‍⚧️

I have noticed there are a few transhumanists/singularitarians on these forums, and since most people don't have any idea what those are, I have decided to create a thread to sum up the general concepts of said ideas. It's a bit long (800+ words), and you may actually learn something; YOU HAVE BEEN WARNED!  :D

   The main concept behind the Technological Singularity is that once artificial intelligences surpass human intelligence, they will be able to improve upon their own code at an ever-faster rate. This will lead to an ever more rapid increase in the intelligence of these AI.
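That feedback loop can be sketched as a toy model (the starting value, rate, and update rule here are invented purely for illustration; they come from no actual forecast):

```python
# Toy model of recursive self-improvement. All numbers are made-up
# assumptions chosen only to show the shape of the feedback loop.
def self_improvement(initial=1.0, rate=0.1, generations=10):
    """Each generation the AI improves its own code in proportion to how
    intelligent it already is, so the gains themselves keep accelerating."""
    intelligence = initial
    history = [intelligence]
    for _ in range(generations):
        # smarter AI -> bigger improvement step -> even smarter AI
        intelligence += rate * intelligence * intelligence
        history.append(intelligence)
    return history

print(self_improvement())
```

Because each improvement step grows with the intelligence it produces, every generation's relative gain is larger than the last, which is the "ever more rapid increase" described above.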

Assumption 1: The human mind can be emulated within computer software. This occurs either through outright simulation (much less likely) or using programming techniques which are inspired by the brain's structure (very likely).
Evidence: There is, as yet, no theoretical reason why it would not be possible to emulate the human brain. In fact, many AI systems have been created which emulate various parts of the brain to perform specific tasks.

Assumption 2: Technology will continue to advance at a rate which is, at the very least, only slightly below linear for the foreseeable future.
Evidence: While the exact curve of the growth of technology can be debated (with estimates ranging from slightly diminishing returns all the way up to above-exponential growth, depending on the estimator), there is no reason to think technological progress will slow considerably for at least another century.

Assumption 3: The most advanced general AI will be relatively similar to humans in their thought patterns, ethics, morals, and emotions.
Evidence: If we create AI based on reverse engineering of the structure of the human brain, the AI itself would have a mind similar to that of humans.

   These three assumptions are really all it takes for a Singularity event to occur. The first assumption ensures the ability to create general AI. The variations within the second determine the timing of the Singularity event. If, as some suggest, technological growth is exponential, we may see a Singularity event within the next 30 to 40 years. If it is linear, it would likely occur before the end of this century. This is something to keep in mind: the Singularity, if indeed one is to occur, will happen within the lifetimes of some of us who are alive today. The third ensures a Singularity event would actually take off. An advanced, general AI alone will not spark a Singularity. That AI must also have the same curiosity about discovery and intelligence that humans do in order to be motivated to improve upon its own code. For example, a general AI meant for military purposes as a drone would almost certainly not start a Singularity. As such, Skynet scenarios are relatively unlikely due to the vastly different goals of such an AI.
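The timing difference between the exponential and linear cases is just arithmetic. As a back-of-envelope sketch (every number here, the 1000x capability gap, the 40% annual growth, and the linear rate, is a hypothetical chosen only to show how much the growth curve dominates the estimate):

```python
import math

# Hypothetical assumptions for illustration only: say human-level AI needs
# 1000x today's capability, exponential growth runs at 40% per year, and
# "linear" growth adds 10x today's capability each year.
def years_exponential(target=1000.0, annual_growth=0.40):
    # capability * (1 + g)**t = target  =>  t = log(target) / log(1 + g)
    return math.log(target) / math.log(1.0 + annual_growth)

def years_linear(target=1000.0, added_per_year=10.0):
    # capability + r * t = target  =>  t = (target - 1) / r
    return (target - 1.0) / added_per_year

print(round(years_exponential()))  # a couple of decades
print(round(years_linear()))       # about a century
```

Under these invented numbers the exponential case closes the gap in roughly two decades while the linear case takes about a century, which mirrors the 30-to-40-years versus end-of-century spread in the paragraph above.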

The Hardware:
   Moore's Law pretty much sums up this section. The power of computers doubles approximately once every 2 years, with even shorter doubling times for some components. Estimates of the computational power of the human brain vary, but even anti-singularity websites give estimates which are well within amounts achievable through Moore's Law within 2 decades on a supercomputer. Singularitarian websites usually have lower estimates, which predict that present-day supercomputers are already capable of running human-level AI and that within a decade, so will $1000 consumer computers. In either case, the difference is a relatively small amount of time.
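The Moore's Law arithmetic above is easy to check directly (the two-year doubling period is the commonly quoted figure; the 20-year horizon is the one used in the paragraph):

```python
# Moore's Law as plain arithmetic: one doubling every `doubling_period_years`
# means a growth factor of 2**(years / doubling_period_years).
def moores_law_factor(years, doubling_period_years=2.0):
    return 2.0 ** (years / doubling_period_years)

print(moores_law_factor(20))  # ten doublings in two decades -> 1024.0x
```

So two decades of two-year doublings buys roughly three orders of magnitude of computing power, which is why the various brain-power estimates, despite disagreeing widely, all land within a decade or two of each other.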

The Software:
   The hardware required for a Singularity event will be around for several decades before the neurological basis on which to create a general AI becomes available. The brain is somewhat mysterious because it is incredibly difficult to study, especially if one is attempting to study it while it is still running. However, the technology to study the brain at the level of individual neurons is becoming available; projects to map brains have already begun. The time frame in which these efforts are completed is probably going to be one of the largest determinants of when a Singularity occurs.

Transhumanism 101:
   At this point, I will shift away from explaining the Singularity and zoom out a bit from Singularitarianism to the more general philosophy of transhumanism. Transhumanism, as best I can tell, is a diverse subset of secular humanism which holds that the benefits of technology far outweigh the risks and that technology should be used to better the human condition. In addition to Singularitarianism, there are many other subsets of transhumanism, many of which partially overlap. I won't go into them all here, since the Wikipedia pages describe them so much better, but the views of these subsets range in scope from reducing poverty by giving technology to poor regions to things as lofty as achieving immortality through technological means.

The Effects:
   A Technological Singularity would be a massively world-altering event. Technological progress would be made at a rate the likes of which has never before been seen. Within several years of the start of the Singularity, society would be radically different. Advances in technology, nanotechnology in particular, would likely lead to a post-scarcity economy powered by swarms of resource-gathering nanobots. Aside from those results, the effects of a Singularity are unclear. Many guesses can be and have been made, but I have already typed a long enough essay for now.

I probably could have gone on for several thousand words, but I decided against it to avoid over-clarification. I tried to include various speculations without getting too deep into any one in particular and tried to avoid describing future events which may not actually occur. As such, this is merely a summary of the idea of a Technological Singularity and not an actual in-depth paper on it. For more info, starting with wiki pages is always a good idea.
http://en.wikipedia.org/wiki/Artificial_intelligence
http://en.wikipedia.org/wiki/Transhumanism
http://en.wikipedia.org/wiki/Singularitarianism
Or for an optimistic view, you could try Ray Kurzweil's book "The Singularity is Near."

LeoLeonardoIII

  • Bay Watcher
  • Plump Helmet McWhiskey
Re: The Technological Singularity thread: an explanation of concepts
« Reply #1 on: May 26, 2010, 12:18:06 pm »

I thought a singularity was a point beyond which we could not predict. That is, our current models don't provide us with enough to go on. That's why they use the same terminology for this technological singularity that they use for a black hole.

Likewise, we could have a nuclear war singularity, where we assume we'll see a nuclear winter and such, but the predictions people make for the social and technological future beyond that aren't based on enough solid data.

Or, an energy singularity where we run out of stored energy on Earth (no more nuclear, no more fossil, no useful geothermal, all we have is solar and the direct results of solar: wind, tidal, reservoir-hydro, biomass). Will we come up with something else that takes care of our needs? What might that thing be?

Etc.

I guess I'm just saying, the whole point of a technological singularity is we don't know what it'll look like a decade or a century later because we have absolutely nothing to go on. "Futurists" who talk about it are really talking about science fiction (which may become a reality, or may not, but it's not anything like a sober analysis).
« Last Edit: May 26, 2010, 12:20:21 pm by LeoLeonardoIII »
The Expedition Map
Basement Stuck
Treebanned
Haunter of Birthday Cakes, Bearded Hamburger, Intensely Off-Topic

smigenboger

  • Bay Watcher
Re: The Technological Singularity thread: an explanation of concepts
« Reply #2 on: May 26, 2010, 12:20:41 pm »

I think the timing is off; look at 2001: A Space Odyssey. Technology jumps at different intervals, depending on what's 'in style'.

I thought the Singularity was when enough human minds meld with technology to create one mass consciousness.
While talking to AJ:
Quote
In college I studied the teachings of Socrates and Aeropostale

Cthulhu

  • Bay Watcher
  • A squid
Re: The Technological Singularity thread: an explanation of concepts
« Reply #3 on: May 26, 2010, 12:23:19 pm »

No, a singularity is a point where technological advancement becomes exponential and near infinite, for example a self-replicating, self-improving AI.
Shoes...

LeoLeonardoIII

  • Bay Watcher
  • Plump Helmet McWhiskey
Re: The Technological Singularity thread: an explanation of concepts
« Reply #4 on: May 26, 2010, 12:23:34 pm »

Oh also, I think the Skynet problem is overrated. You just need to keep your AI cerebral, completely disconnected from the outside, without access to the physical tools needed to escape. If you put your AI in a self-sufficient drone with a weapon and tools attached, you're just asking for trouble.
The Expedition Map
Basement Stuck
Treebanned
Haunter of Birthday Cakes, Bearded Hamburger, Intensely Off-Topic

alway

  • Bay Watcher
  • 🏳️‍⚧️
Re: The Technological Singularity thread: an explanation of concepts
« Reply #5 on: May 26, 2010, 12:35:12 pm »

I thought a singularity was a point beyond which we could not predict. That is, our current models don't provide us with enough to go on. That's why they use the same terminology for this technological singularity that they use for a black hole.
Which is why the "Effects" category is so short. It would have been as long as the rest of the post combined if I had included all the various hypothesized effects from various authors. While we can't say much about a post-Singularity period, there are a few things which would be likely.

As for Skynets, they wouldn't occur unless the programming for the emotional parts was seriously borked. It would probably take a huge number of malevolent re-writes of an AI's code to make it turn on humans in a military way; and that's assuming the emotions themselves aren't designed to be emergent from other subsystems, as they are in humans, in which case it wouldn't even function correctly.
« Last Edit: May 26, 2010, 12:40:11 pm by alway »

PTTG??

  • Bay Watcher
  • Kringrus! Babak crulurg tingra!
Re: The Technological Singularity thread: an explanation of concepts
« Reply #6 on: May 26, 2010, 12:48:27 pm »

This, I have always thought and will almost certainly continue to maintain, is simply Geek Jesus. Perhaps the most enduring trope in all of mythology is this: You are the last generation before the great struggle that will result in all being right in the world, or at least all being resolved.
 
The fact that there is obviously some human bias toward that sort of thinking (totally isolated cultures have believed this, as well as the mainstream Abrahamic religions) should attach the highest level of skepticism to any of these claims.
 
At the very least, consider that evolution does not favor world conquerors or super-geniuses, and a self-modifying, splitting AI would certainly be evolving. More likely, some entirely alien thought process, one that is entirely unconcerned with the physical world outside of ensuring that its systems are operational, would be the most successful. Humanlike thought is entirely unnecessary when it comes to being as successful as possible in "cyberspace"; after all, only one species we know of found it useful in the real world.
 
So, stop building your singularity bunker, stop betting that the world is going to end, and start acting like the world is going to keep working. After all, what makes you think that out of all of human history, you'd be the first generation to be right?
A thousand million pool balls made from precious metals, covered in beef stock.

Armok

  • Bay Watcher
  • God of Blood
Re: The Technological Singularity thread: an explanation of concepts
« Reply #7 on: May 26, 2010, 12:50:50 pm »

The reason for this confusion is that there IS no agreed-upon definition of the Singularity, but several competing and very different meanings for the same word.

Also, so far almost everything said has been more or less incorrect. For better explanations see:
http://singinst.org/overview/whatisthesingularity
http://yudkowsky.net/singularity
So says Armok, God of blood.
Sszsszssoo...
Sszsszssaaayysss...
III...

Il Palazzo

  • Bay Watcher
  • And lo, the Dude did abide. And it was good.
Re: The Technological Singularity thread: an explanation of concepts
« Reply #8 on: May 26, 2010, 12:59:14 pm »

So, stop building your singularity bunker, stop betting that the world is going to end, and start acting like the world is going to keep working. After all, what makes you think that out of all of human history, you'd be the first generation to be right?
I wholeheartedly agree.
Singularity is like Santa Claus or Pamela Anderson's tits. Believing that they're real can make you feel nice, but you should know better than to bet any money on it.

cerapa

  • Bay Watcher
  • It wont bite....unless you are the sun.
Re: The Technological Singularity thread: an explanation of concepts
« Reply #9 on: May 26, 2010, 01:11:33 pm »

I think life is a singularity. We are simply a link in a chain: think about it, we are much faster and better at doing things than the life that came before us, and we are still improving our abilities.
My own opinion is that the machine singularity would just be a natural progression of the life singularity, simply another link in the evolutionary chain after humans.

If the progression does not come from a direct rogue AI, then it would simply come from us slowly merging with computers, until finally our minds are completely inside them, and it will keep going even from there.

Tick, tick, tick the time goes by,
tick, tick, tick the clock blows up.

LeoLeonardoIII

  • Bay Watcher
  • Plump Helmet McWhiskey
Re: The Technological Singularity thread: an explanation of concepts
« Reply #10 on: May 26, 2010, 01:18:19 pm »

I think we'll see several charlatans who claim to have created this self-improving AI. There was this guy who came through here claiming to have a machine that turned landfill garbage (newspapers, banana peels, used condoms, etc) into diesel fuel. And we constantly see the old "hey man they invented a car that runs on water!" chestnut.

As for the computer AI thing: I'd like to see them make an AI follower in a video game that can stop jumping in front of my gun or getting stuck on every office chair and fallen sandwich. Let's see them get that done first, and maybe then we can talk about working on human-level AI.
The Expedition Map
Basement Stuck
Treebanned
Haunter of Birthday Cakes, Bearded Hamburger, Intensely Off-Topic

dragnar

  • Bay Watcher
  • [Glub]
Re: The Technological Singularity thread: an explanation of concepts
« Reply #11 on: May 26, 2010, 01:29:29 pm »

...There are factories that turn garbage into diesel fuel. They are stupidly expensive and require a huge amount of energy to run, but they do exist. The only practical use for them is to allow cars and the like to be fueled by power from some other energy source, just converted into fuel.
From this thread, I learned that video cameras have a dangerosity of 60 kiloswords per second.  Thanks again, Mad Max.

smigenboger

  • Bay Watcher
Re: The Technological Singularity thread: an explanation of concepts
« Reply #12 on: May 26, 2010, 01:47:33 pm »

Besides, the singularity may happen simply because there won't be any human life to compare AI to after 2012...

I just have to set up a betting system on whether 2012 will happen or not; there's no way I could lose.
While talking to AJ:
Quote
In college I studied the teachings of Socrates and Aeropostale

LeoLeonardoIII

  • Bay Watcher
  • Plump Helmet McWhiskey
Re: The Technological Singularity thread: an explanation of concepts
« Reply #13 on: May 26, 2010, 02:02:01 pm »

...There are factories to turn garbage into diesel fuel. They are stupidly expensive, and require a huge amount of energy to run, but they do exist. The only practical use for them is to allow cars and things to be fueled by power from some other energy source, just changed into gasoline.
His machine was the size of a large RV and didn't use much power. Plus, he's been accused of various frauds all over the place, so nobody gave him the money he wanted.

Do you have a link to your factory thing? And I'm not talking about corn/etc biodiesel. He meant regular diesel fuel from junk like cardboard and soggy diapers.

Besides, the singularity may happen simply because there won't be any human life to compare AI to after 2012...

I just have to set up a betting system on if 2012 will happen or not, there's no way I could lose
You have to bet people that it will not happen. If it does happen, they win. Heck, give them 10:1 odds in their favor.
The Expedition Map
Basement Stuck
Treebanned
Haunter of Birthday Cakes, Bearded Hamburger, Intensely Off-Topic

smigenboger

  • Bay Watcher
Re: The Technological Singularity thread: an explanation of concepts
« Reply #14 on: May 26, 2010, 02:22:46 pm »

Haha yeah that's the idea, unless it's a pot kind of deal, then I win no matter what
While talking to AJ:
Quote
In college I studied the teachings of Socrates and Aeropostale