Bay 12 Games Forum


Author Topic: Thoughts on Transhumanism  (Read 22048 times)

wierd

  • Bay Watcher
  • I like to eat small children.
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #45 on: October 29, 2015, 09:49:22 pm »

This is why I asked the 7 questions earlier. They STRONGLY color the reality in which I would be living/running after the upload.

Could I then be free to see the stars, in my shiny new interstellar body-- given that my consciousness can now run for millions and millions of years?  Even at slow sub-light speeds, I could fly from star to star.  The perception of time is mutable, but the experiences I could potentially have change significantly.  (I could alter the rate of processing on my rig, so that it FEELS like I am going at superluminal speed, for instance. That way my human-copied impatience does not make me crazy en route. To me, it feels like just a short 10 minute trip, even though I have actually been cruising at 10% lightspeed for several thousand years.)
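
A toy sketch of that arithmetic, assuming a 400-light-year hop at the 10% lightspeed mentioned above (the distance and the "felt" duration are illustrative assumptions only):

Code: [Select]
# Toy estimate of the "perceived 10-minute trip" idea. All figures are
# illustrative assumptions, not anything established in this thread.
distance_ly = 400.0   # assumed distance to the destination star, in light-years
speed_c = 0.10        # cruise speed as a fraction of lightspeed

trip_years = distance_ly / speed_c              # wall-clock duration: 4,000 years
trip_minutes = trip_years * 365.25 * 24 * 60

felt_minutes = 10.0                             # how long the trip should *feel*
slowdown = trip_minutes / felt_minutes          # required reduction in clock rate

print(f"Wall-clock trip: {trip_years:,.0f} years")
print(f"Required slowdown factor: {slowdown:.1e}x")

Run as-is it prints a slowdown factor around 2e8, which is the sort of gap between subjective and wall-clock time being described.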
« Last Edit: October 29, 2015, 09:52:12 pm by wierd »
Logged

Frumple

  • Bay Watcher
  • The Prettiest Kyuuki
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #46 on: October 29, 2015, 09:57:38 pm »

If a method was invented that could allow for the uploading of your consciousness to a digital format and had been proven to work, but the process would result in the destruction of the organic brain as a consequence of getting all the required information, at what point, if ever, would you take advantage of this technology?
Pretty much as soon as it was sufficiently accessible that I could get at it. Fuck human biology. Give me an out that guarantees something vaguely resembling me gets left behind and I'll take it, right here, right now.

I'd take brain in a machine pretty well, too, methinks. Whatever it takes to trade up from this goddamn flesh sack and its genetic fuckery.
Logged
Ask not!
What your country can hump for you.
Ask!
What you can hump for your country.

Urist McScoopbeard

  • Bay Watcher
  • Damnit Scoopz!
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #47 on: October 29, 2015, 10:05:21 pm »

I would, then I would take over the world, then I would kill everyone, and then I would play god. Scary.

EDIT: Just so we all know what's up, I'd also kill whoever Satan told me to for immortal life, so there's that. Death is pretty terrifying, and I'll do pretty much anything to live forever.
« Last Edit: October 29, 2015, 10:09:05 pm by Urist McScoopbeard »
Logged
This conversation is getting disturbing fast, disturbingly erotic.

jaked122

  • Bay Watcher
  • [PREFSTRING:Lurker tendancies]
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #48 on: October 29, 2015, 10:07:57 pm »

Sorry for not giving a more careful reading of your questions; I tend to react to things I disagree with, namely the idea that neurogenesis would be an issue in the scenario proposed.


To number 7: Well, being transmitted from one computer to another would take thousandths of the energy required to move a person from one location to another. That also holds when transmitting through space, even using a relatively expensive Reed-Solomon error-coding scheme at a safe boundary (I'd give it a very high factor of redundancy there).
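
A back-of-envelope sketch of that comparison (the data volume, redundancy factor, per-bit transmission cost, and body mass below are all assumed numbers, not figures from this thread):

Code: [Select]
# Back-of-envelope: energy to beam an upload vs. energy to accelerate a body.
# Every constant here is an illustrative assumption.
brain_state_bytes = 2.5e15          # assumed size of the scanned brain state (~2.5 PB)
redundancy = 100                    # assumed very high error-coding overhead
bits_to_send = brain_state_bytes * 8 * redundancy

energy_per_bit_j = 1e-9             # assumed end-to-end cost per transmitted bit, in joules
transmit_energy_j = bits_to_send * energy_per_bit_j

body_mass_kg = 70.0                 # the meat body alone, no ship
cruise_speed_m_s = 0.10 * 299_792_458
kinetic_energy_j = 0.5 * body_mass_kg * cruise_speed_m_s ** 2  # non-relativistic estimate

print(f"Transmission energy: {transmit_energy_j:.1e} J")
print(f"Kinetic energy of the body at 10% lightspeed: {kinetic_energy_j:.1e} J")
print(f"Ratio: {kinetic_energy_j / transmit_energy_j:.1e}")

With these (generous) assumptions the physical trip costs millions of times more energy than the transmission, which is the direction of the claim above, even though the exact ratio depends entirely on the assumed constants.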


The amount of energy required would also be more easily acquired for a simulated person: solar power would very easily run the simulation at some speed, and I'm sure it would not be impossible to make that a comfortable speed compared to physical reality's timestep.


To answer number 2: As for voluntary control over neural synapse formation and pruning, that's playing with fire. I like playing with fire, but with that kind of control I can see myself being badly burned, in the metaphorical sense. Though to be serious, that might actually be the sensation a poorly executed change would bring about (a burning sensation)...


Ultimately I think that sort of control would be for the "Power-users" who want to work as expediently as possible.


To answer number 4: I hope I wouldn't need a physical embodiment for anything that I couldn't do myself with a robotic remote. Crews are messy, difficult, and impatient. People in meat bodies aren't very good for interstellar travel, unless you have a lot of antimatter on hand to get them where they're going very quickly. Though if the answer to number 6 is yes, then they aren't necessary, because a self-replicating system almost certainly has the required degree of sophistication to effect repairs of very significant complexity.


Number 6... I want to say yes, but replicating too easily makes them likely to be used in grey-goo-type scenarios. And while I like a good mass uploading, I wouldn't want anyone I know subjected to one. Nor would I want them used to construct the processor I'm running on. That'd be like eating a person and building a tent out of their bones.


That's pretty much all the answers I have for you. Though neural networks don't tend to scale well: the least efficient naive implementation takes O(N^3) for N neurons. Not terribly great, and with (10^12)^3 we have a lot to work towards.


Fortunately that operation can be parallelized, so we could do it in something like reasonable time on an Nvidia Titan, preferably with about twelve of them hooked together for memory capacity, but even then it wouldn't be realtime.
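
For a rough sense of scale, here is a toy sketch of one naive dense update step. Even at O(N^2) per step (cheaper than the O(N^3) figure above), the numbers are hopeless in realtime; the neuron count and GPU throughput are assumptions:

Code: [Select]
import numpy as np

# One naive update of a fully connected network: every neuron sums input
# from every other neuron, i.e. a dense matrix-vector product, O(N^2) per step.
def step(weights: np.ndarray, state: np.ndarray) -> np.ndarray:
    return np.tanh(weights @ state)

# Tiny demo so the code actually runs:
rng = np.random.default_rng(0)
n_demo = 1_000
w = rng.standard_normal((n_demo, n_demo)) / np.sqrt(n_demo)
s = rng.standard_normal(n_demo)
s = step(w, s)

# The same arithmetic scaled to a brain-sized N (all assumed figures):
n_brain = 1e11                       # order of magnitude of neurons in a human brain
flops_per_step = 2 * n_brain ** 2    # one multiply-add per connection
gpu_flops = 1e13                     # assumed sustained throughput of one high-end GPU
print(f"~{flops_per_step / gpu_flops:.1e} seconds of GPU time per simulated step")

That comes out to on the order of 10^9 GPU-seconds per timestep before memory is even considered, which is why stacking cards helps with capacity but still doesn't buy realtime.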

wierd

  • Bay Watcher
  • I like to eat small children.
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #49 on: October 29, 2015, 10:12:32 pm »

Never. To steal and reinterpret from Hemingway, you can live as much of a life in seventy hours as in seventy years. More time will not make me love life more, but by the end, I am certain it would make me hate it.
Hence why I'd still want the ability to commit suicide if I was to become immortal. Unless I found a way to 'restart' my memories from scratch, I'd eventually become bored out of my skull. At that point, I'd like to end my existence, I think.

Also, even WITH the reset, I'd probably reach an iteration of myself where I'd not want to do that, which would result in me choosing suicide in all likelihood.

Bear in mind that a full-on digital simulation of wetware neurons will operate similarly to wetware neurons.

E.g., you would still get horny, but would then have no wedding tackle.

Food for thought.  Maybe you need Zack Weinersmith's instant orgasm button as an option for that.
Logged

jaked122

  • Bay Watcher
  • [PREFSTRING:Lurker tendancies]
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #50 on: October 29, 2015, 10:14:53 pm »

Well, I'd want the simulation of my brain to be hooked up to some kind of virtual reality so that I could experience things. Hopefully a fairly detailed simulation, but if all else fails, I could deal with runescape.

Runescape, because it will still be around to upload into.

Urist McScoopbeard

  • Bay Watcher
  • Damnit Scoopz!
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #51 on: October 29, 2015, 10:19:40 pm »

Well, I'd want the simulation of my brain to be hooked up to some kind of virtual reality so that I could experience things. Hopefully a fairly detailed simulation, but if all else fails, I could deal with runescape.

Runescape, because it will still be around to upload into.

I was going to say Dark Souls, and then I realized how hellish a reality that would be; I know I'd go crazy. Probably Minecraft, I guess, or maybe World of Warcraft.
Logged
This conversation is getting disturbing fast, disturbingly erotic.

LordBucket

  • Bay Watcher
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #52 on: October 29, 2015, 10:21:43 pm »

If a method was invented that could allow for the uploading of your consciousness to a digital format and had been proven to work,

How could it possibly be proven to work? You can't prove to me that a flesh and blood human being talking to me is conscious. How do you intend to prove to me that a software copy of that human being is conscious?

Or do you mean that, after it happened, the behavior of the software entity would be indistinguishable from mine to a third party? Umm, no. I recall reading a few weeks ago about a subscription service that scans the email and social media of a dead person, and then continues to produce online content based on that history. Is a piece of software churning out pattern data "you"? Even if the content it produced was "indistinguishable from" what you produce, would it be you?

I don't think so.



Unbroken chain of consciousness is a dumb idea. For real. There is no reason to think that matters.
It's all in the memories m8.

A video tape contains memory. Is it therefore conscious?



In that case, a mechanical process that slowly replaces the entirety of your organic brain with a machine one would be indistinguishable.

Imagine going into a doctor's office. The doctor saws off your foot and replaces it with a cybernetic replacement foot. You're still ok. Then he saws off your hand and replaces it with a cybernetic replacement hand. You're still ok. Next he removes your heart and replaces it with a mechanical heart. Yep, still fine. Next he cuts off your head and... oh, now you're dead.

Apply this general concept to your brain. Replacing parts might be perfectly fine...up until it's not.

In response to slow nanite replacement and a theseus ship sort of thing, sure, that'd work, but I don't believe that the thing we're presuming to preserve here actually exists.  I guess the best way to explain what I'm trying to say is that I don't think Theseus's ship is preserved if you replace its boards one by one.

The Ship of Theseus is a bad analogy for this topic. Yes, if you replace one board at a time, most people would agree it's still the same ship. But that's not what we're discussing here.

Imagine if, instead of replacing the Ship of Theseus one board at a time, you dismantled it one board at a time, and every time you removed a board, you wrote down a description of where it was and how it connected to the other boards, and then burned it. And then once the entire ship was a pile of dust, you threw the dust away and handed the description to somebody else and asked them to make a CAD file of it on a computer.

Would that CAD file be the same ship?

Quote
I don't think the question matters.  I think it's just a certain arrangement of boards and that the designator "ship" has no immanent value, so asking if the "ship" is preserved isn't really asking anything.

Whether you are or aren't a zombie, I can't know. I'm not you. But I do know that I'm not a zombie. I may be observing a "certain arrangement" in the form of memories and behavior patterns, but observation is occurring. I empirically know this, because "I" am observing it. I might be fuzzy on what the "I" is who is doing the observing, but the experience of observation is definitely being observed.

Suggesting that reproducing the brain state being observed would necessarily re-create the observation of that brain state, and that the observing entity would therefore be me, makes about as much sense as saying that making a digital copy of a painting and sticking it on a computer would result in the same subjective experience as a human being looking at the original painting in a gallery.



Even if it works, there are still a bunch of issues. Who controls the computer you're on? What happens when they die or their company is bought out? What if somebody decides to make copies of the 'you' and use them as slave labor? If you really think it's you in there, that's potentially a concern, isn't it? What if the people at the computer decide to alter your memories, or feed you pleasure/pain data until you submit to doing whatever they want? Or save an original copy and use you until you finally break, and then restore from the save and use you again, forever?


Great big can of worms here.


Flying Dice

  • Bay Watcher
  • inveterate shitposter
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #53 on: October 29, 2015, 10:22:53 pm »

Never. To steal and reinterpret from Hemingway, you can live as much of a life in seventy hours as in seventy years. More time will not make me love life more, but by the end, I am certain it would make me hate it.
Hence why I'd still want the ability to commit suicide if I was to become immortal. Unless I found a way to 'restart' my memories from scratch, I'd eventually become bored out of my skull. At that point, I'd like to end my existence, I think.

Also, even WITH the reset, I'd probably reach an iteration of myself where I'd not want to do that, which would result in me choosing suicide in all likelihood.

Bear in mind that a full-on digital simulation of wetware neurons will operate similarly to wetware neurons.

E.g., you would still get horny, but would then have no wedding tackle.

Food for thought.  Maybe you need Zack Weinersmith's instant orgasm button as an option for that.

You know, the whole thing about quantity not equating to quality is often tossed around when discussing immortality, but I'd say: why wouldn't you want the choice?

Yeah, maybe you'd live a few centuries and say, "Right, this is bloody miserable, time to off myself!" Grand, you've done it! Maybe you're stuck in total sensory deprivation, in which case yeah, move on and die. If it's the sort of immortality where your mind and body continue to decay without actually ending, boo fuckin' hoo, that would have happened anyways, and a lot sooner to boot. If it's something like the original situation, in which your meat body dies and you're concerned about continuity, wait until you start showing signs of fatal illness or mental deterioration. So about twenty-five years old or so.

The only situation in which immortality is not a matter of "say yes now, decide to die when you're ready for it" is a perfect and uncontrollable immortality in which you cannot self-terminate and which simultaneously results in you being forcibly and irreversibly thrust into a situation you detest. Me? I'd happily live until the end of the universe, even if only to find out what happens to it. Then I'd enjoy the new one, or drift forever in a cold and empty void, daydreaming about all the shitty self-insert situations I can imagine with memories of billions of years of entertainment media.
Logged


Aurora on small monitors:
1. Game Parameters -> Reduced Height Windows.
2. Lock taskbar to the right side of your desktop.
3. Run Resize Enable

Urist McScoopbeard

  • Bay Watcher
  • Damnit Scoopz!
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #54 on: October 29, 2015, 10:25:28 pm »



This sums up how this would play out if it happened to me.
Logged
This conversation is getting disturbing fast, disturbingly erotic.

jaked122

  • Bay Watcher
  • [PREFSTRING:Lurker tendancies]
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #55 on: October 29, 2015, 10:26:20 pm »

The best solution to ownership is to live in a post-scarcity society with some belief in ethics.


"Well this computer has a person on it. We better find this person a different computer that they can have some kind of existence in."


Bender did his best.

Cthulhu

  • Bay Watcher
  • A squid
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #56 on: October 29, 2015, 10:31:11 pm »


In response to slow nanite replacement and a theseus ship sort of thing, sure, that'd work, but I don't believe that the thing we're presuming to preserve here actually exists.  I guess the best way to explain what I'm trying to say is that I don't think Theseus's ship is preserved if you replace its boards one by one.

The Ship of Theseus is a bad analogy for this topic. Yes, if you replace one board at a time, most people would agree it's still the same ship. But that's not what we're discussing here.

Imagine if, instead of replacing the Ship of Theseus one board at a time, you dismantled it one board at a time, and every time you removed a board, you wrote down a description of where it was and how it connected to the other boards, and then burned it. And then once the entire ship was a pile of dust, you threw the dust away and handed the description to somebody else and asked them to make a CAD file of it on a computer.

Would that CAD file be the same ship?

Quote
I don't think the question matters.  I think it's just a certain arrangement of boards and that the designator "ship" has no immanent value, so asking if the "ship" is preserved isn't really asking anything.

Whether you are or aren't a zombie, I can't know. I'm not you. But I do know that I'm not a zombie. I may be observing a "certain arrangement" in the form of memories and behavior patterns, but observation is occurring. I empirically know this, because "I" am observing it. I might be fuzzy on what the "I" is who is doing the observing, but the experience of observation is definitely being observed.

Suggesting that reproducing the brain state being observed would necessarily re-create the observation of that brain state, and that the observing entity would therefore be me, makes about as much sense as saying that making a digital copy of a painting and sticking it on a computer would result in the same subjective experience as a human being looking at the original painting in a gallery.

1.  I'm referring to individual replacement of neurons inside the brain: each neuron is replaced individually with a synthetic copy, still inside the skull in its original position.  I'm assuming that's what Ispil meant when I said what I said.  In that case the Theseus ship analogy stands.  Your response to Ispil's comment doesn't make sense to me unless you think there are individual "load-bearing" neurons that will cease consciousness (whatever that means; remember, I don't believe in a continuity of consciousness at all) if they're removed, which is also a very strange concept.

2.  For people without access to telescopes and other celestial observation equipment it's empirically sound to say that the sun revolves around the Earth.  What's your point?

-------

@jaked122

I suppose, but that doesn't actually answer the question, considering it's not the only possibility.
« Last Edit: October 29, 2015, 10:32:42 pm by Cthulhu »
Logged
Shoes...

MetalSlimeHunt

  • Bay Watcher
  • Gerrymander Commander
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #57 on: October 29, 2015, 10:32:15 pm »

Consider: If you choose not to care about the continuity problem, you and I can scour the universe as immortal posthuman gods purging the stars of all alien life.
Logged
Quote from: Thomas Paine
To argue with a man who has renounced the use and authority of reason, and whose philosophy consists in holding humanity in contempt, is like administering medicine to the dead, or endeavoring to convert an atheist by scripture.
Quote
No Gods, No Masters.

Flying Dice

  • Bay Watcher
  • inveterate shitposter
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #58 on: October 29, 2015, 10:34:50 pm »

Consider: If you choose not to care about the continuity problem, you and I can scour the universe as immortal posthuman gods purging the stars of all alien life.
MSH, don't you get it? It's not about getting to live forever as technological monstrosities, it's about "Oh no-ooooo, it's totally different from going to sleep, falling unconscious, or briefly dying before being resuscitated!".
Logged


Aurora on small monitors:
1. Game Parameters -> Reduced Height Windows.
2. Lock taskbar to the right side of your desktop.
3. Run Resize Enable

LordBucket

  • Bay Watcher
    • View Profile
Re: Thoughts on Technological Immortality
« Reply #59 on: October 29, 2015, 10:35:12 pm »

Consider: If you choose not to care about the continuity problem, you and I can scour the universe as immortal posthuman gods purging the stars of all alien life.

If you choose to accept Jesus as your lord and savior you can live joyfully forever in heaven.

Oh, wait...unless that's wrong.

Trying to bribe people with the benefits if it works does not constitute evidence that it works.