Transferring consciousness

Started by Backdraft, December 10, 2017, 11:22:11 AM

Previous topic - Next topic

0 Members and 1 Guest are viewing this topic.

Backdraft

So, this is something that I've been thinking about ever since I first saw this transhumanism stuff popping up everywhere. Most people I've found are eager to "transfer" their consciousness to a computer—essentially turning themselves into androids under the assumption that doing so would increase their longevity indefinitely. I have an issue with this, and it's not some pseudo-ethical bull, it's a legitimate question:

First, allow me to draw a parallel to frame the question. If we created a human clone of a person—let's call her "Jane"—there would be two Janes, but they wouldn't share a single consciousness. Jane 2 would be her own person, not an extension of Jane 1. If we then, hypothetically, shot Jane 1, she would no longer be alive, even if Jane 2 were still alive. There would be just one Jane left. That Jane would be an exact copy of Jane 1, but she wouldn't be her. Jane 1's consciousness would have ceased to exist.

Taking that logic and applying it to transhumanism: if we copied a person's consciousness (I highly doubt there is a way to "transfer" consciousness; that seems rather abstract to me), then we would again have two Janes. One Jane would be human; the other would be an exact digital copy of Jane's thoughts and memories. It would, essentially, be its own being. If organic Jane dropped dead right there, the people around her wouldn't recognize it as Jane dying—they'd simply see digital Jane as her replacement. Jane would have effectively beaten death, at least as far as the people around her are concerned. The issue is that digital Jane is not Jane; she's a copy of Jane. Jane is dead. She didn't beat death, she just got cloned, and the clone is still alive.

(This is all disregarding the inevitable hacking and viruses that would eventually become commonplace in a world where people are digitally stored in android bodies.)

So, all that said, would you do it? Would you "transfer your consciousness" by effectively killing yourself, so that an exact copy of you with all of your thoughts and memories would live on for an undetermined amount of time?

Lustful Bride

mm......I might. But only if I was already near death. At least that way it will be as accurate a version of me as possible with as much data as could be used and memories to help make sure it is like me and can carry on a legacy with my values and etc etc. :P

Also this is really reminding me of Soma.

Click here for the game in question, and an analysis of this very topic along with a heaping of existential horror

midnightblack

Soma was mindboggling for me. I was blank-eyed for days after finishing that game, just gazing outward and pondering.

As for the topic itself, it's often brought up for discussion at work, albeit in a more joking, tongue-in-cheek manner—in ideas like medicine going out of business and being replaced by hardware and software maintenance. In any case, I find that any and all ethical dilemmas that humankind has faced up to this point will simply pale in comparison to the problems raised precisely by the points Backdraft suggests: having several copies of the same "self" floating about, each as an individual entity. I'm guessing that humanity's future may in some sense turn out in the spirit of Mark Twain's words at the end of "The Mysterious Stranger": the "individual" will be an electromagnetic wave propagating through the vacuum, and upon the discovery of a suitable world, machinery will simply take a bucketload of dust, filter the atoms, and weave them into a suitable lifeform, which will then be loaded up with the aforementioned "self".

As far as I'm concerned, I probably wouldn't be able to say 'no' to the option of loading my awareness into the memory of some satellite and enjoying a blissful existence in virtual reality. I'm not so sure that perpetuating my existence in the "real" world indefinitely (unless this one too is a clever illusion that we haven't figured out yet  ::) ) would be as fun. I'd need something to keep me busy for a good long while.
The Midnight Lodge (O2 thread & completed tales compendium)
Thy Nightly Chambers (requests) updated!
Zerzura (albeit short, the best collaborative story I've ever completed here)

Backdraft

Quote from: Lustful Bride on December 10, 2017, 12:04:40 PM
mm......I might. But only if I was already near death. At least that way it will be as accurate a version of me as possible with as much data as could be used and memories to help make sure it is like me and can carry on a legacy with my values and etc etc. :P

Also this is really reminding me of Soma.

Click here for the game in question, and an analysis of this very topic along with a heaping of existential horror

I loved SOMA, but I never finished it. I got to the sunken submarine and got stuck in a room with the...thing hovering right outside the door just as I hit a checkpoint. I got spawn-killed every time I reloaded, so I haven't gotten around to replaying it yet. It was really good while it lasted, though.

Missy

I'm pretty sure there's something in quantum mechanics which means it's actually impossible to make an absolute duplicate of anything, ever. A copied individual would be its own being; you can only measure one aspect of any one thing at a time. I wish I remembered where I saw that.

Missy

Although I would still download myself into a machine if it bore the complexity necessary to carry me. There is likely no machine today with the data storage and processing power to effectively simulate organic consciousness; whether that can or will happen remains to be seen.


Also, on the topic of assuming we're not already in a simulation: you forgot to ask whether you are the simulation's inhabitant—which of us are the inhabitants, and which of the 'people' we interact with are actually part of the simulation. There's also the question of the simulation's depth: if a tree falls in the forest when no one is around, does it make a sound? In a simulation, the answer would be 'no'. Although that assumes our simulation exists within the confines of the most advanced computer we could theoretically construct (contained within a Dyson Sphere), built with technology of a similar design and limitations to our own. It is of course entirely possible that our simulation uses technology vastly superior to our conception of what is possible—meaning yes, your heart continues to beat even when the doctor isn't using a stethoscope.

midnightblack

Quote from: Missy on December 10, 2017, 07:59:39 PM
I'm pretty sure there's something in quantum mechanics which means it's actually impossible to make an absolute duplicate of anything, ever. A copied individual would be its own being; you can only measure one aspect of any one thing at a time. I wish I remembered where I saw that.

What you mean by that is probably the 'no-cloning theorem', which states that it is impossible to make an identical copy of an unknown quantum state. Still, the brain is a macroscopic object: at least some of what it does can be understood in a classical (i.e., non-quantum) sense, and consciousness/awareness clearly have some classical significance, even if they are ultimately generated by quantum phenomena. So maybe there is a way around things, in the sense that you probably don't need a mapping of the evolution of every single quantum state in your head in order to understand (or duplicate) its behavior.
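
For reference, the standard argument behind the theorem is a short linearity-and-unitarity proof. A sketch in LaTeX-style notation (standard textbook material, not specific to anything in this thread):

```latex
% Suppose a single unitary U could clone an arbitrary unknown state:
%   U(|\psi\rangle \otimes |0\rangle) = |\psi\rangle \otimes |\psi\rangle .
% Unitarity preserves inner products, so for any two states \psi, \phi:
\langle\phi|\psi\rangle
  = \bigl(\langle\phi| \otimes \langle 0|\bigr)\, U^{\dagger} U \,\bigl(|\psi\rangle \otimes |0\rangle\bigr)
  = \bigl(\langle\phi| \otimes \langle\phi|\bigr)\,\bigl(|\psi\rangle \otimes |\psi\rangle\bigr)
  = \langle\phi|\psi\rangle^{2} .
% The only solutions are \langle\phi|\psi\rangle \in \{0, 1\}: the two states
% must be orthogonal or identical, so no one U can clone *every* state.
```

Which is exactly why the loophole above matters: the theorem forbids copying arbitrary unknown quantum states, not copying whatever classical structure the brain's behavior reduces to.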
The Midnight Lodge (O2 thread & completed tales compendium)
Thy Nightly Chambers (requests) updated!
Zerzura (albeit short, the best collaborative story I've ever completed here)

Trieste

The idea of being transferred into a machine creeps me the fuck out. I'm like, we don't even necessarily know where the consciousness is in the body - western thought has it centered in the head, other civs have had it in places like the chest and the stomach - plus there's the whole gut-brain theory. (It may be junk science; I haven't read enough about it to know.) Like, are we sure we would get it all?

... probably going to check out that SOMA game now, though. I hadn't heard of it.

Remiel

This subject also reminds me vaguely of the game the Talos Principle, although that had to do more with Artificial Intelligences taking on the role of humanity's successors and intellectual progeny.

But it also brings to mind the teleporters on Star Trek.  I've often thought, "wait--isn't Captain Kirk essentially being destroyed and an exact clone created every time he uses the teleporter?"

Valerian

Quote from: Remiel on December 11, 2017, 10:01:07 AM
This subject also reminds me vaguely of the game the Talos Principle, although that had to do more with Artificial Intelligences taking on the role of humanity's successors and intellectual progeny.

But it also brings to mind the teleporters on Star Trek.  I've often thought, "wait--isn't Captain Kirk essentially being destroyed and an exact clone created every time he uses the teleporter?"

I've often wondered that myself, and it's always struck me as seriously creepy and mind-bending.  As much as I joke about wanting transporters to be real, I don't think I'd ever have the nerve to use one.

I may need to watch The Prestige again now.  :P
"To live honorably, to harm no one, to give to each his due."
~ Ulpian, c. 530 CE

Wheeler97

The complexity of either "transferring" or "copying" a person's consciousness depends very significantly on exactly how memories are stored, and on whether it's anything more than our memories and experiences that make up our personalities and identities. I believe the primary theory is that memory is a product of physical storage created by the synapses (connections) between neurons in the brain, though there is a relatively recent proposal that the neurons themselves hold the memories. I would appreciate it if anyone can provide additional insight, but if memory, and thus consciousness, is determined by the physical structure of the brain, then it would be easier, relatively speaking, to copy a person's consciousness by creating a duplicate of the brain. I say relatively easier because there are almost as many neurons (roughly 100 billion) in each of our brains as there are stars in the whole Milky Way galaxy (100–400 billion). Imagine precisely mapping an exact copy of the Milky Way—the relative position of every star—and then also copying the precise pathways from each star to many of the stars around it. That's your brain.
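
To put a rough number on that, here's a back-of-envelope sketch in Python. Every figure in it is an assumed round number for illustration (neuron count, average synapses per neuron, bytes needed per synapse), not a measurement:

```python
# Back-of-envelope estimate of raw storage for a whole-brain "wiring map".
# All figures below are assumed round numbers for illustration only.
NEURONS = 100e9             # ~100 billion neurons (rough textbook figure)
SYNAPSES_PER_NEURON = 7000  # assumed average number of connections per neuron
BYTES_PER_SYNAPSE = 8       # assumed: enough for source/target IDs + a weight

synapses = NEURONS * SYNAPSES_PER_NEURON
petabytes = synapses * BYTES_PER_SYNAPSE / 1e15

print(f"{synapses:.1e} synapses, roughly {petabytes:.0f} PB just for the wiring")
```

Even with those generously small assumptions, the wiring diagram alone lands in the petabyte range, before you store anything about what each neuron or synapse actually does.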

I also want to reemphasize that if our memories are made up by the physical pathways of the brain, it creates an additional question of whether it's even possible to create a "digital" copy of our "consciousness" that is determined by a physical structure.

Back to the dilemma: this whole discussion also greatly depends on how you choose to define an individual. There is a commonly quoted myth that by the end of seven years, all the cells in your body have died and been replaced by copies. That's not quite true, since each type of cell has a different lifespan: some cells live only a few months, while some, like the neurons of your cerebral cortex, are never replaced when they die. So if all the physical cells of your body have been copied and replaced, except in the brain that specifically holds your consciousness, are you still the same person? Does it make a difference whether this process of copy-and-replace is "natural" or "manufactured"?

I really liked the use of this concept for long-distance space travel in Syfy's Dark Matter. Since travel across the galaxy took weeks or months, at least one company offered a service where a copy of your body was created at another pod/terminal and your consciousness was allowed to control it. Anything a character did in the "copy" was temporarily stored in the copy's brain. To keep those experiences, you needed to return to a terminal and end your "session" to have those synapses mapped back to your own brain. If the copy died, the body shut off and collapsed into a pile of cellular material, the connection was broken, and the character would wake up in their original body remembering nothing beyond getting into the first pod.

Aiden

If I do that, will I be good at video games? If not, fuck that.

Lustful Bride

Quote from: Aiden on December 11, 2017, 12:57:35 PM
If I do that, will I be good at video games? If not, fuck that.

Ya gotta git gud.   8-)

Backdraft

Speaking of which, SOMA is 65% off on Humble right now.

Remiel

Quote from: Backdraft on December 11, 2017, 02:06:21 PM
Speaking of which, SOMA is 65% off on Humble right now.

ARGH!  Where were you yesterday?  >:(

...kidding.  It was only $30 on Steam.  Not too bad.

Trieste

Quote from: Backdraft on December 11, 2017, 02:06:21 PM
Speaking of which, SOMA is 65% off on Humble right now.

... puts it in my budget. Thank you so much! For anyone else, their counter says there's like 17 hours left on the sale.

Sethala

I think one of the most important things on this topic is to figure out just what is "you".  There's a handful of videos that go over the idea in... way more detail than I could say without just repeating them, so I'll toss out a few links to ponder.

https://www.youtube.com/watch?v=nQHBAdShgYI

https://www.youtube.com/watch?v=wfYbgdo8e-8

https://www.youtube.com/watch?v=JQVmkDUkZT4

Atarn

A clone is an imperfect copy where the mind is concerned. It is a blank slate—a physical but not a mental copy. A transhuman copy of a sentience, by contrast, is you up to a point in time.
So if you copied yourself yesterday, that copy is you as of yesterday. It is a far more valid copy than a clone, which might look like you but would most likely be, essentially, an infant.
I feel that the notion of "self" is given far too much spiritual value in these debates. If I copy a key, the copy might not be the original, but it works just as well, doesn't it? Also, transhumanism as a concept and transhumanism the literary genre are quite different. In the latter, copying your consciousness provides a simple path to immortality. In the former... we have a muddled field of philosophy, wishful thinking, creepy fringe stuff, etc.

Transhuman sci-fi is awesome. Go read Richard Morgan's Altered Carbon.
A sudden storm in
    summer, the brightest
    star at night; an
    opportunist rogue,
    confessor of sins
    a master of hearts
    a dominant lover

RedRose

There's an X-Files episode about this!
O/O and ideas - write if you'd be a good Aaron Warner (Juliette) [Shatter me], Tarkin (Leia), Wilkins (Faith) [Buffy the VS]
[what she reading: 50 TALES A YEAR]



Callie Del Noire

Watching Altered Carbon...Very interesting implications

TheGlyphstone

Quote from: Callie Del Noire on February 02, 2018, 12:11:08 PM
Watching Altered Carbon...Very interesting implications

Is it good? I might re-up my Netflix account just to watch it, if it is. I loved that book.

Callie Del Noire

Quote from: TheGlyphstone on February 03, 2018, 04:04:58 PM
Is it good? I might re-up my Netflix account just to watch it, if it is. I loved that book.

I like it so far. I'm all but one episode from done, and I won't say more because, well, spoilers.

TheGlyphstone

Quote from: Callie Del Noire on February 03, 2018, 08:55:32 PM
I like it so far. I'm all but one episode from done, and I won't say more because, well, spoilers.

Have you read the book? Avoiding spoilers is appreciated, but knowing whether they did a good job of adapting it would be nice.

Callie Del Noire

Quote from: TheGlyphstone on February 03, 2018, 09:53:45 PM
Have you read the book? Avoiding spoilers is nice, but knowing if they did a good job of adapting it would be nice.

It’s been several years so I’m going to reread it after I finish the series

Saria

Quote from: midnightblack on December 10, 2017, 11:23:35 PM
Quote from: Missy on December 10, 2017, 07:59:39 PM
I'm pretty sure there's something in quantum mechanics which means it's actually impossible to make an absolute duplicate of anything, ever. A copied individual would be its own being; you can only measure one aspect of any one thing at a time. I wish I remembered where I saw that.
What you mean by that is probably the 'no-cloning theorem', which states that it is impossible to make an identical copy of an unknown quantum state. Still, the brain is a macroscopic object: at least some of what it does can be understood in a classical (i.e., non-quantum) sense, and consciousness/awareness clearly have some classical significance, even if they are ultimately generated by quantum phenomena. So maybe there is a way around things, in the sense that you probably don't need a mapping of the evolution of every single quantum state in your head in order to understand (or duplicate) its behavior.
Although if consciousness is ultimately based on quantum states, while copying it exactly would be impossible, transferring it would be (relatively) easy... which would completely sidestep the problem Backdraft is worried about: you would be able to transfer your consciousness to a machine (or anything else you please) without dying in the process.

Of course, consciousness is probably not quantum states. So...




Has anyone here seen the film John Dies at the End? It has a hilarious opening bit about an axe. If you haven't seen it, the joke is based on the classic philosophical problem called the "ship of Theseus", which goes something like this:

Spoiler: Click to Show/Hide
In order to preserve the historic ship of Theseus, the Athenians put it in a museum and cared for it meticulously. But the wood, as wood does, started to rot over time. As each plank rotted, the Athenians replaced it with a new, identical plank, so the ship itself stayed in good condition. But after a long enough time, it happened that every single plank had, at some point, rotted and been replaced with a new one.

So... was the ship in the museum the ship of Theseus?

And the bonus question added years later by Hobbes: Suppose each time a plank rotted and was replaced, a fan recovered the original plank, used some technique to "un-rot" it, then used all those discarded planks to build another ship. Is the ship in the museum the ship of Theseus, or is the one the fan has the ship of Theseus... or both or neither?

So why am I bringing it up? Well, however consciousness may "work", it seems indisputable that it is rooted in our brains somehow. Muck around with someone's brain—damaging it, or even just temporarily changing its chemistry—and you can change what a person experiences, how they think, and even who they are (personality-wise). And our brains—however they work—are molecules, which are made up of atoms. It turns out that our bodies replace the atoms that make them up at an astonishing rate. Not all of the atoms in our body get replaced... but over a long enough period—say, a couple of years—so many of the molecules in your brain may have been replaced that 99% of its atoms are different.
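
You can get a feel for how fast that compounds with a toy exponential-replacement model. The daily replacement rate below is an assumed illustrative parameter, not biological data:

```python
# Toy model: if a fixed fraction of a tissue's atoms is replaced each day,
# the fraction of *original* atoms remaining decays exponentially.
# The rate here is an assumed illustrative value, not a measured one.
DAILY_REPLACEMENT_RATE = 0.005  # assume 0.5% of the atoms swapped per day

def original_fraction_left(days, rate=DAILY_REPLACEMENT_RATE):
    """Fraction of the original atoms still present after `days` days."""
    return (1 - rate) ** days

two_years = original_fraction_left(2 * 365)
print(f"after two years: {two_years:.1%} of the original atoms remain")
```

Even a modest half-percent daily turnover leaves only a few percent of the original material after two years, which is the flavor of the "99% different" claim.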

So because your brain's molecules may have been replaced in their entirety (or very near to it) multiple times over the course of your life... does that mean you've died multiple times? If so... when? How many times?

But if you haven't died, and you're still the same person continued on from the person you were, say, 10 years ago, then you believe it is possible to remain you and not die even if pretty much all of your brain is replaced. Which brings up an interesting puzzle.

Suppose you have a car that you really like and you won't let me drive it. While you're distracted, I sneak in and change one of the spark plugs. "Aha!" I gloat. "Now this car is no longer your car! So now I can drive it!"

But no, you'd surely say, it's still your car. I don't think anyone would argue otherwise; in fact, it would be silly if you took your car to a mechanic, they changed a spark plug, and they told you that it's clearly not the same car you drove in, and thus not your car.

So this time, while you're distracted, I replace a spark plug and one of the tires. "Now this isn't your car!" I say.

But no, you'd still say it's your car.

So now I replace all the spark plugs, all four tires, and one of the doors (with an identical replacement, of course).

Is it still your car? I think everyone would agree so.

So I replace all the spark plugs, the crankshaft, the pistons, all four tires, all four doors, and the windshield.

Still your car?

At some point, surely, I will have replaced enough parts of your car that it's no longer your car, right? But... exactly when is that line crossed? 50%? And if it is 50%, then suppose I replace 50% of your car with new parts... but take the old parts and use them, with other parts, to create another car? Which car is yours? Both? Neither?

What does all this have to do with the thread question? Well, here we go:

Suppose instead of doing a wholesale "copy" of your consciousness from your brain to a computer, I replace a single neuron in your brain with a computer chip. Have I killed you? Are you no longer "you"? I don't think anyone would say yes to either of those questions.

So the next day, I replace another neuron. Dead yet? No longer you? Again, I don't think anyone would think so.

So if I keep doing this, eventually (assuming you live long enough for me to swap out all ~100 billion neurons at one neuron per day) I will have replaced all the neurons in your brain with computer chips. At what point did you "die"? At what point did the person in front of me stop being "you" and become someone else—a copy of you?
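
(For scale, that "assuming you live long enough" is doing a lot of work. A quick sketch, using the same round 100 billion neuron figure:)

```python
# How long would one-neuron-per-day replacement actually take?
# Uses the same round ~100 billion figure as the thought experiment.
NEURONS = 100e9
DAYS_PER_YEAR = 365.25

years = NEURONS / DAYS_PER_YEAR
print(f"~{years / 1e6:.0f} million years to swap every neuron")
```

So take the one-per-day pacing as a philosophical device, not a schedule; the puzzle works the same at any gradual rate.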

And if you never die and remain "you" even though all of the atoms or neurons in your brain are replaced (by other atoms or by computer chips) over a period of time... then how long does that period of time have to be? Do you survive if your neurons are all replaced over the span of ten years but die if that happens in a year? A day? A nanosecond?




Here's how I figure it. Whatever consciousness is, it's in my brain. However my brain works, it's made up of cells, molecules, and atoms. Those cells, molecules, and atoms are replaced over and over... yet I'm still me. Even though the substrate of my consciousness is continually changing, somehow I persist.

Therefore, even if the substrate of my consciousness is changed to a computer system, somehow I will be able to persist. So if I am copied to a computer, it will still be me - it won't be that "I died and this is someone else". At the moment of copying, there will be two "me"s, one in the brain, one in the machine. Neither one will be more "me" than the other. We'll both be me. If I owned a car before the procedure, then the car would belong to both of those people after. And then those two people will start to diverge and become two different people, both of which will be continuations of "me" but will be independent of the other. When a tree trunk diverges into branches, it's silly to ask which branch is the real tree.

So after all that, I can finally answer the question.

Hell yeah, I'd do it.

But I don't see it as "killing" myself while someone else—a copy of me—lives on. If I get in the machine, go unconscious, my mind-stuff is copied into a computer, and then the copied mind awakens in the system... then no one has died. "Me" was a meat-brain one moment, blinked, and "me" was a cyber-brain the next. At no point did it stop being me. Everything I knew, thought, felt, or loved persists in the cyber-brain. In what sense would that not be "me"?

(Alternatively, if meat-Saria went into the machine and had her mind copied so that the result was a meat-Saria plus a cyber-Saria, both would be "me". No one dies in that scenario either.)

Here's another thought experiment to consider. Suppose you went into the machine, had your consciousness copied to the computer... then copied back into the brain. Wouldn't that still be you? Did you die during that procedure and the person who walked out is someone else? And, if that happened... how would you know? You got in the machine, there was a flash, then you got out and the operator told you it didn't work. How would you know whether you'd been transferred at all or not? What would be different?
Saria is no longer on Elliquiy, and no longer available for games