The Road to Mind Uploading

The mainstream press here has a go at summarizing some of the neurobiology and technologies that would lead to whole brain emulation, a copy of a human mind running in software. Many futurists believe that a copy of the mind running as an emulation is an acceptable continuation of the individual. Thus when they advocate brain preservation via cryonics or plastination it is for the purpose of recording the data for later use, not maintaining the actual tissue for later repair and restoration as a biological brain.

To me this seems a strange viewpoint; a copy of you is not you. Good for the copy and best of luck to him or her, but you yourself remain preserved and inactive. The essence of identity is physical continuity of both the pattern and the material that expresses that pattern, under a slow pace of change. If someone swapped out half of your brain all at once, you would stop being you; you as an entity would die in the initial removal, and a copy would be created by the replacement operation. If half of the neurons in your brain are exchanged for machinery, one at a time, over a decade of active life, then you are still you - each replacement is incorporated into a working pattern, and the change in data is little greater than that occurring due to the ongoing process of being alive. These are important differences, the two examples standing on either side of a large grey area.

The futurist faction that advocates mind uploading as continuation of the individual is large enough that anyone undergoing cryopreservation would be wise to take with them some expression of their desires on the matter, perhaps an inscribed metal plate under the tongue or similar: "Please restore the original; do not copy, do not emulate."

Some neuroscientists believe it may be possible, within a century or so, for our minds to continue to function after death - in a computer or some other kind of simulation. Others say it's theoretically impossible, or impossibly far off in the future. A lot of pieces have to fall into place before we can even begin thinking about testing the idea. But new high-tech efforts to understand the brain are also generating methods that make those pieces seem, if not exactly imminent, then at least a bit more plausible. Here's a look at how close, and far, we are to some requirements for this version of "mind uploading."

The hope of mind uploading rests on the premise that much of the key information about who we are is stored in the unique pattern of connections between our neurons, the cells that carry electrical and chemical signals through living brains. You wouldn't know it from the outside, but there are more of those connections - individually called synapses, collectively known as the connectome - in a cubic centimeter of the human brain than there are stars in the Milky Way galaxy. The basic blueprint is dictated by our genes, but everything we do and experience alters it, creating a physical record of all the things that make us US - our habits, tastes, memories, and so on. It is exceedingly tricky to transition that pattern of connections into a state where it is both safe from decay and verifiable as intact. But in recent months, two sets of scientists said they had devised separate ways to do that for the brains of smaller mammals. If either is scaled up to work for human brains - still a big if - then theoretically your brain could sit on a shelf or in a freezer for centuries while scientists work on the rest of these steps.
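As a rough sanity check of that synapses-versus-stars comparison, here is a back-of-envelope sketch using my own commonly cited ballpark figures, not numbers taken from the article: roughly a billion synapses per cubic millimeter of cortex, and roughly 100 to 400 billion stars in the Milky Way.

```python
# Back-of-envelope check of the "more synapses per cubic centimeter than
# stars in the Milky Way" claim. Both figures below are rough, commonly
# cited estimates, not values from the article.

synapses_per_mm3 = 1e9                       # ~1 billion synapses per mm^3 of cortex
synapses_per_cm3 = synapses_per_mm3 * 1000   # 1 cm^3 = 1000 mm^3, so ~1e12

stars_in_milky_way = 2e11                    # midpoint of the ~1e11 to ~4e11 range

print(synapses_per_cm3 > stars_in_milky_way)  # True: ~1e12 vs ~2e11
```

With these rough figures the claim holds with room to spare, roughly a factor of five.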

The real challenge for aspiring mind uploaders will be figuring out how to create a fully functioning model of a human brain from a static snapshot of its connectome. To work, that model would have to include the molecular information in its neurons and synapses. Many neuroscientists think extracting that information would require another major step; others say structural details visible under the electron microscope might allow them to infer it. But some progress is being made - enough, anyway, so that the Obama administration signed off last year on a request by the National Institutes of Health for $4.5 billion to deliver a "comprehensive, mechanistic understanding of mental function" by 2025. Private foundations, like the Allen Institute for Brain Science and the Howard Hughes Medical Institute, have also announced major investments in basic brain research in recent years. And this summer, the blue-sky research arm of the United States intelligence agencies, Iarpa, distributed some $50 million in five-year grants to map the connectome in a cubic millimeter of mouse brain linked to learning behavior, record the corresponding neurons in live mouse brains and simulate the circuits in a computer.

Link: http://www.nytimes.com/interactive/2015/09/03/us/13immortality-explainer.html

Comments

I'd consider myself to be a futurist, but mind uploading is one of those things I'm iffy about, especially with all the arguing over whether or not it would really be you, or just a copy. And it's definitely one of the things I wouldn't want to be an early adopter of. In theory it sounds pretty cool, but I think I'd rather keep my body as long as I can instead.

Posted by: Ham at September 14th, 2015 9:30 AM

That is exactly how I came to feel about mind uploading; although I don't entirely rule it out, there is an inherent paradox that cannot be surmounted for the foreseeable future.

I also have some reservations about the cryonics approach, in terms of uncertainty regarding the preservation of personality. (I haven't dug into that issue, though.)

That's why I am in favour of SENS out of all current possible directions. I do note that they may be complementary, which is why Aubrey de Grey is also a member of Alcor.

Posted by: Nico at September 14th, 2015 11:23 AM

There are still two words that stymie the mind uploaders: substrate and consciousness. Until I know that it's my consciousness being moved to a substrate that can support it, I have zero interest here, because I'll be dead.

"We don't know the nature of consciousness" Then that's something you'll have to figure out before this kind of thing will actually do what you hope it to do.

Posted by: Slicer at September 14th, 2015 11:26 AM

I'm also firmly in the "it's just a copy" camp, and I'm also far from certain we can create sentience in a computer at all, though a perfect simulation of sentience (including a copy of an existing biological person) is practically inevitable.

And it's not just uploads. If somehow a second instance of me was created (a "Hollywood clone", i.e. a clone which isn't just genetically identical but also shares the same memories, biological age, and so on), that too would be just a copy. Even if it's 100% identical to me, down to the smallest atom and beyond, it's still a 100% different, separate entity. Whatever this new entity experiences -- seeing a cat, falling and breaking an arm, eating -- I won't experience or even know about unless I'm told or observe it myself. Because it's not me, even if no one else would notice if we were switched. I find it completely nonsensical to argue otherwise, yet some very smart people do.

Posted by: Northus at September 14th, 2015 3:10 PM

Also, I think the idea of uploading might distract from caring about biological rejuvenation. Some pro-uploading futurists think that uploading will arrive before biological rejuvenation, so why even care about the latter except as a contingency plan (which naturally won't get the same amount of attention or resources)?

Posted by: Northus at September 14th, 2015 3:34 PM

I believe “a copy of me” is me. I think it is a delusion that there is something like “me.”

The idea "a copy of me is me" might be a good delusion for the sake of one's sanity or for the good of the society one lives in, but it nevertheless is a delusion.

I think this delusion is a base of one's humanity, and very difficult to separate oneself from.

BTW, would you agree to use a technology like the Star Trek transporter, where you beam yourself to another location? It would apparently amount to killing the original here and creating a copy somewhere else.

Posted by: veriti at September 14th, 2015 8:18 PM

Agree with Northus that this is probably a distraction from biological rejuvenation. I think it falls into the same area as those people thinking that if a strong general AI can be created it will solve all problems.

Mind uploading is in the very far future, if possible at all, as you'd have to map the position and relationships of every neuron. Even then there is the "it's a copy/twin" problem.

The "AI Singularity" is also a bit silly. We are already collectively a human uber intelligence that I don't think any AI will match for a very very long time, if ever.

Posted by: Jim at September 14th, 2015 9:12 PM

The idea of an AI Singularity comes from the theory that you could create an AI that is better at creating AI than you are. That AI then goes on to improve itself or build another one (there is no real difference), and the improved AI is even better at creating AI than its predecessor. Repeat until the limits of physics are reached.
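Purely as an illustration of that argument (not something from the comment itself), the claim reduces to a simple loop in which each generation's design ability feeds the next; the growth factor and the "physical_limit" cap below are invented stand-ins, not estimates.

```python
# Toy model of the recursive self-improvement loop behind the
# "AI Singularity" argument. All numbers are made up for illustration;
# "physical_limit" is an arbitrary stand-in for whatever ceiling
# physics ultimately imposes.

def improve(capability):
    # Assume each generation designs a successor somewhat better than itself.
    return capability * 1.5

capability = 1.0          # the first AI, taken as the baseline
physical_limit = 1e6      # arbitrary cap representing "the limits of physics"
generation = 0

while capability < physical_limit:
    capability = improve(capability)
    generation += 1

print(f"Ceiling reached after {generation} generations")  # ~35 with these numbers
```

The point of the sketch is only that compounding improvement reaches any fixed ceiling quickly; it says nothing about whether the premise (that such an AI can be built) is true.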

Posted by: Slicer at September 15th, 2015 12:11 AM

Hopefully the incredibly vocal "copy is me" camp has shrunk over the last 15 years. I could never comprehend why anyone would defend this view. Of course survival requires continuity, though I'd suggest that what is essential to preserve is not memories or any acquired mental content, but rather the experience of being aware itself. I'm not sure cryonics can do this. Some would say it wouldn't even be necessary.

Posted by: Heartland at September 15th, 2015 2:37 AM

There's no reason to require the same physical stuff, even with slow replacement. Pattern identity, computationalism, and functionalism are all well-established theories that have no use for specific material collections, only patterns of structural arrangement and functional behavior.

The dramatically overstated concerns about "copies" are best handled by branching identity theory, as presented in Michael Cerullo's paper Uploading and Branching Identity and my own book A Taxonomy and Metaphysics of Mind-Uploading.

http://link.springer.com/article/10.1007%2Fs11023-014-9352-8

http://www.amazon.com/dp/0692279849

Cheers!

Posted by: Keith Wiley at September 15th, 2015 9:16 PM

"These theories totally explain everything, but I won't explain them here, so buy my book" No.

Pattern identity/branching identity is nonsense on its face. From the perspective of the individual, the existence of a copy means absolutely nothing. The existence of someone who thinks exactly like you does not affect you.

Given what we know about how brain structures work, I'm much more inclined to believe that consciousness flows into them like water. Pouring another, identical cup of water means diddly for the original cup. Oh, and because they run on quite mortal neurons, our cups are slowly shrinking, and if we really want to improve our minds, we have no choice but to make the cups bigger and let more water in...!

Posted by: Slicer at September 16th, 2015 8:31 AM

For the same reason that a genome is not a human but the information to build a human, a connectome is not a brain, but the information to build a brain. With that information, you can build an actual brain or simulate it on a computer. You can use the information to build many copies of the same brain, and they will work the same. A simulation of me will talk like me (maybe slower, but the content will be the same). We can consider a connectome an abstract form of all these brains, whatever their physical substrate. Similarly, a genome can be considered an abstract human, one that can be instantiated in a biological human, stored in a computer database, written in a (very large) book, etc.

Now, an important point to be aware of is that consciousness is NOT a property of abstract brains, it's a property that only an actual, physical brain can have. This makes all copies different from me, since they don't have my consciousness.

Is identity a property of a connectome or of a physical brain? Well, nobody has stated a clear definition of identity, but I think the case is similar to consciousness: it applies only to actual brains, not abstract brains. So an uploaded copy of my mind is not my mind, by the same reasoning that a twin of mine (with the same genome) is not me but another person.

Posted by: Antonio at September 18th, 2015 8:05 AM

It seems like you don't know what real futurists do.

"Many futurists believe that a copy of the mind running as an emulation is an acceptable continuation of the individual."

Then you should ask for your money back.

Posted by: Actual Futurist at September 22nd, 2015 7:09 PM

Slicer wrote:
> "These theories totally explain everything, but I won't explain them here, so buy my book" No.

That's an absurd accusation given that I gave you two references, one of which is completely free (Cerullo's paper). I will now accept your retraction and apology for such rudeness. There's certainly nothing wrong with pointing out that I wrote a book on the topic.

> Pattern identity/branching identity is nonsense on its face. From the perspective of the individual...

It is in the more careful analysis that we see that your preferred, biased perspective makes very little sense. You attempt to characterize personal identity (mistaking it for the notion of consciousness, which is a different concept) as something like water being poured. But there is no evidence at all that identity works like this. You can propose a naked hypothesis that personal identity comes from some ethereal realm and flows through brains like water, but that requires validation. In contrast, it is easy to describe thought experiments and hypothetical scenarios where any identity theory other than branching identity runs into paradoxes and inconsistencies. For example, when a brain is split in two, with which half does the identity ride?

Branching identity is by far the best theory of identity because it withstands the risk of paradox that confounds all other theories, such as the body theory you seem to prefer, in which identity is tied to particular physical things (such as brains).

Posted by: Keith Wiley at October 9th, 2015 2:39 PM

I'm pleasantly surprised at the number of comments agreeing with the proposition that a duplicate of you is not you (consider identical twins), but this isn't the main problem with mind-uploading.

At best, all you're doing is measuring brain activity and building a program whose equations generate the same data. This program does not instantiate mind - there are no electrical fluctuations - it's just a generator of numbers in the form of the binary state of registers in the computer.

Posted by: Dr Richard Wilson at November 7th, 2015 12:49 PM

What is the goal? A longer, happier life full of bliss (heaven on earth)? Continuity of identity? Awareness of identity comes through consciousness and memory. When you are unconscious and then awaken, identity is reset. Memories help to maintain identity, but consider the case of amnesiacs or someone waking from a coma. If your connectome could be replicated with sufficient resolution (say, unknowingly) in a replicated body (for argument's sake), would you not feel like yourself? How could you tell without any other information? When you fall asleep and wake up you are not the same, yet you do have continuity through memory. Is not downloading essentially the same, just differing in physical substrate?

Posted by: george at April 28th, 2016 10:48 PM

Seems to me that quality is more important than quantity. If identity is an emergent phenomenon, then what does it matter, as long as there is quality?

Posted by: george at April 28th, 2016 11:02 PM