A Little Philosophy of Mind Uploading
Posted by Reason

A great deal of philosophical and metaphysical thought is devoted to the topic of mind uploading. We are moving into an age in which the emulation of human brains in software will be possible, and strong artificial intelligence will clearly result from that work, even if it is not first achieved through other means.

There is considerable overlap between supporters of longevity science and supporters of work on strong AI. A large contingent of people view mind uploading - making a copy of their mind and then running it in software - as a perfectly valid approach to achieving radical life extension. Look at the 2045 Initiative, for example, as a determined outgrowth of this community. This appears fine if you believe that a copy of you is you, but the problem is that this is not the case. A copy is a copy, its own entity. There are also other rather important existential issues inherent in existing as software rather than hardware: are you a continuous being, or are you just a sequence of disconnected, momentary separate beings, each destroyed an instant after its creation? A shadow of life and an ongoing atrocity of continual murder, not actual life.

So the details of implementation matter. Replace your neurons gradually, as they die, with long-lasting machinery that serves the same purpose in hardware, and you are still you: nothing is different as you transition continuously from flesh to machine. But to copy the brain and throw it away, replacing it instantly with that same end result, is death. So far as I can see, there is no near-future technology of gradual machine replacement that is likely to provide radical life extension on the same timeframe as work in rejuvenation medicine. Artificial neurons suitable for gradual replacement are a long way away in comparison to implementation of the SENS vision for the reversal of human aging.

In any case, here is a little philosophical reading on mind uploading, with links to much more in the way of thought on the subject. It might not be terribly relevant to our future, but that doesn't stop it from being interesting:

A couple of years ago I wrote a series of posts about Nicholas Agar's book Humanity's End: Why we should reject radical enhancement. The book critiques the arguments of four pro-enhancement writers. One of the more interesting aspects of this critique was Agar's treatment of mind-uploading. Many transhumanists are enamoured with the notion of mind-uploading, but Agar argued that mind-uploading would be irrational due to the non-zero risk that it would lead to your death. The argument for this was called Searle's Wager, as it relied on ideas drawn from the work of John Searle.

This argument has been discussed online in the intervening years. But it has recently been drawn to my attention that Agar and Neil Levy debated the argument in the pages of the journal AI and Society back in 2011-12. Over the next few posts, I want to cover that debate. I start by looking at Neil Levy's critique of Agar's Searlian Wager argument.

The major thrust of this critique is that Searle's Wager, like the Pascalian Wager upon which it is based, fails to present a serious case against the rationality of mind-uploading. This is not because mind-uploading would in fact be a rational thing to do - Levy remains agnostic about this issue - but because the principle of rational choice Agar uses to guide his argument fails to be properly action-guiding. In addition to this, Agar ignores considerations that affect the strength of his argument, and is inconsistent about certain other considerations.

Link: http://philosophicaldisquisitions.blogspot.com/2014/01/is-mind-uploading-existentially-risky.html

Comments

I'm a sucker for these questions. I struggle with these thoughts too. I thought that the 2045 Initiative was a totally crazy idea, having nothing to do with life extension. Restating in some way what you already wrote, here is the line of thinking that I cannot resist:

(1) Assume that "me" is not an entity outside the physical brain (like a soul) but a consciousness arising from a complex adaptive system of neurons/glia (as in emergence/chaos theory) plus a memory (a concept similar to a computer's ongoing calculations and its hard drive/RAM). Then (2) if we create an exact copy of my brain (at whatever level of detail proves to be required) in a substrate like silicon, it is as much me as the original, because it feels so to that new brain.

If you disagree, then we can ask: why would you need continuity of the brain/consciousness (during the neuron-by-neuron transformation to a silicon brain)? Is your brain, i.e. you, waking up from a coma still you, or just some kind of copy/version of you?

The above idea does not feel right to me either, but maybe that is simply because we have never performed this experiment of brain uploading (like the 2045 Initiative), or brain copying, or Star Trek-like transporters.

The idea is alien to Homo sapiens. It takes away the uniqueness of "me." It undermines sane thinking about an extended lifetime. It is too radical. It makes the brain upload only a scientific experiment; it strips it of its human side. But maybe this "does not feel right" feeling is just another form of religious thinking?

Posted by: nanotech_republika_pl at January 1, 2014 2:42 PM

Riddle me this: Ray Kurzweil identified the law of accelerating returns, under which we presumably approach near-vertical growth of technology each year. Well, that technology is hardware... and we are saying that mind uploading will take 'software' of a sort and 'install' it onto a more 'durable' and 'better' hardware/substrate.

But what about those accelerating returns? Heck, people need to buy a new cell phone and laptop every year just to stay current... and that's merely at the rate of Moore's Law, never mind the exponential growth of the future...

i.e. if hardware advances at an accelerating pace, and you've uploaded an actual mind, then how in the world are you going to keep up with that pace?

I say try to enhance the human body and brain first...there's a lot to be done for it.

Posted by: Eugene at January 1, 2014 4:08 PM