As I opened the week with a reprint of the Million Year Life Span, it seems fitting to end the week with a short article that focuses on one important aspect of much the same topic. After aging is conquered, the pursuit of exceptional longevity will require us to move beyond biology. Given present accident rates, ageless humans will only live for a few thousand years. Even with vastly reduced accident rates, sooner or later the vulnerable human physiology will succumb to misfortune. To live for very much longer, for tens and hundreds of thousands of years, we must transcend our biological origins in some way. The operation of our minds must move to a far more robust and easily maintained machine infrastructure, each neuron a device. But as those who favor uploading and emulation of the mind in software would point out, physical existence based on machines taking the place of neurons is only one of the many options for transcendence that will open up with progress in technology. To my eyes, the vital, the only question to ask at each stage of the process of becoming greater and vaster than before is whether you will still be you afterwards: is your continuity as an individual preserved?
Let's say you replace a single neuron in your brain with one that functions thousands of times faster than its biological counterpart. Are you still you? You'd probably argue that you are, and even a dramatic speedup in a single neuron is likely to go largely unnoticed by your conscious mind. Now, you replace a second neuron. Are you still you? Again, yes. You still feel like yourself. You still have the continuity of experience that typically defines individuality. You probably still don't notice a thing, and indeed, with only a couple of overachieving neurons, there wouldn't be much to notice. So, let's ramp it up. You replace a million neurons in your brain with these new, speedy versions, gradually over the course of several months. Sounds like a bunch, right? Not really; by most estimates, you've still only replaced about 0.001% of your brain's natural neurons. Are you still you?
You may find you're reading books a teensy bit faster now, and comprehending them more easily. An abstract math concept that once confused you now begins to make some sense. You're still very much human, though. But why stop there? You're feeling pretty good. You feel the tug of something greater calling you. Is it curiosity, the siren call of improving one's own intelligence? You embark on a neurological enhancement regimen of two billion fancy new neurons every month for a year. After this time, you have on the order of 24 billion artificial neurons in your head, or about a quarter of your brain.
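As a sanity check on the figures above, here is a quick back-of-envelope sketch. It assumes the commonly cited estimate of roughly 86 billion neurons in a human brain; the exact count is uncertain, so these percentages are approximate:

```python
# Back-of-envelope check of the replacement figures in the text.
# Assumes ~86 billion neurons in a human brain (a common estimate).
TOTAL_NEURONS = 86e9

# A million replaced neurons is a vanishing fraction of the whole.
million_pct = 1e6 / TOTAL_NEURONS * 100
print(f"1 million neurons ≈ {million_pct:.4f}% of the brain")  # ≈ 0.0012%

# Two billion new neurons a month, for twelve months:
replaced = 2e9 * 12
print(f"After a year: {replaced / 1e9:.0f} billion artificial neurons, "
      f"≈ {replaced / TOTAL_NEURONS:.0%} of the brain")  # ≈ 28%, about a quarter
```

So a million neurons is indeed a rounding error, while two billion a month for a year lands close to the "quarter of your brain" mentioned above.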
Are you still you? Your feelings and emotions are still intact, as the new neurons don't somehow erase them; they just process them faster. Or they don't, depending upon your preference. About halfway through this year, you begin noticing profound perceptual changes. You've developed a partially eidetic memory. Your head is awash in curiosity and wonder about the world, and you autodidactically devour articles at a rapid clip. Within weeks you've attained a PhD-level knowledge of twenty subjects, effortlessly. All art becomes not just a moving experience, but an experience embedded in a transcendental web of associations with other, far-removed concepts. Synesthesia doesn't begin to cover what you're experiencing. But here's the thing: it's not overwhelming, not to your enhanced, composite brain and supercharged mind.
You reason (extraordinarily quickly at this point) that since you don't seem to have lost any of your internal experience, you should seek out the limit, if one exists, and replace the rest of your brain. After all, at this point, everyone else is doing the same. It's getting harder to find work for someone who's only a quarter-upgraded. Over the next three years you continually add new digital neurons as your biological ones age, change, and die out. Are you still you? Following this, you are a genius by all traditional measures. Only the most advanced frontiers of mathematics and philosophy give you pause. Everything you've ever experienced, every thought that was ever recorded in your brain (biological or otherwise), is available for easy access in an instant.
Years pass. The same medical technology that allowed your neurons to be seamlessly replaced, aided and accelerated by a planet full of supersavants, has replaced much of your biological body as well. You're virtually immortal. Only virtually, of course, because speeding toward Earth at a ludicrous velocity is a comet the size of Greenland. There is general displeasure that the Earth will be destroyed (and just after we got smart and finally cleaned her up!), but there's a distinct lack of existential terror. Everyone will be safe, because they are leaving. How does a civilization, even a very clever one, evacuate billions of people from a planet in the space of years? It builds some very large machines that circle the Sun, and it uploads everyone to these machines. Uploads? People? Why sure, by now everyone has 100% electronic minds. These minds are simply software; in fact, they always were. Only now, they're eminently accessible, and more importantly, duplicable.
The minds of billions of people are beamed, bit by bit, across the solar system to where the computers and their enormous solar panels float, awaiting their guests. Of course, just as with your neuronal replacements all those years ago, this is a gradual process. As neurons are transferred, their counterparts in your skull are disabled. The only difference you feel is a significant lag, sometimes on the order of minutes, due to the millions of miles of distance between one half of your consciousness and the other. Eventually, the transfer is complete, and you wake up in a place looking very familiar. Are you still you?
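The "minutes of lag" figure is consistent with simple light-speed delay. As an illustration (a sketch only: the one-astronomical-unit distance is an assumption, not something the scenario specifies):

```python
# Light-speed signal lag across interplanetary distances, a rough illustration.
# Assumes the receiving machines orbit about 1 AU away (~93 million miles);
# the actual distance in the scenario is unstated.
SPEED_OF_LIGHT_MPS = 186_282  # miles per second, approximate
distance_miles = 93e6

one_way_seconds = distance_miles / SPEED_OF_LIGHT_MPS
print(f"One-way signal delay: {one_way_seconds / 60:.1f} minutes")  # ≈ 8.3 minutes
```

At tens of millions of miles the delay drops to a minute or two, so "sometimes on the order of minutes" is about right for a split consciousness spanning the inner solar system.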
What is the difference between a process of gradual replacement that resembles the neurogenesis and cell death that already happens in the brain and a process that kills you in order to create a duplicate? Where does the Ship of Theseus, in which it seems sensible to argue it is the same ship if a single plank is replaced, become the grandfather's ax, where we start to have doubts about continuity when replacing the head or handle? Further, how fast and how transformative can a replacement of neurons be before we call it death rather than transcendence? The essential nature of slow replacement that argues for continuity is that (a) the new replacement is a small part of the whole, and (b) the replacement comes to equilibrium with the existing structure before the next replacement takes place. The neuron integrates into neural circuits, the plank is taken voyaging. Large replacements and rapid replacements are both problematic. Few people would be comfortable swapping out a quarter of the brain at once, or running a process of neuron-by-neuron replacement for the entire brain in an hour. Nor should they be. What use is it if a pattern survives when you, yourself, do not?
In the decades ahead, it will become possible to copy and dramatically alter the mind, as well as to replace neurons with machinery. That doesn't mean that every such implementation is a sound path to exceptional longevity, a continuity of the self far beyond the limits of biological human agelessness. Many of them will be expensive ways to achieve a subtle form of suicide. A copy of you is not you, and for that matter it is far from clear that an emulated mind running in software is in fact a discrete and continuous entity rather than an ongoing flicker of partial, immediately destroyed shades. That depends entirely on the computational architecture. Unless data is tied to physical structure in the same way as occurs in neurons and the human mind, it is hard to argue for an emulation to be alive, a discrete individual in the way we are. No mainstream computational architecture is heading in that direction, and it seems likely that the first emulations will run many layers of abstraction removed from questions of physical storage models. This is a horrible tragedy, but those who disagree with my metaphysics will no doubt go ahead and do it anyway.
I believe that an important existential challenge will arise in the next phase of human longevity, after aging is cured, driven by the economics of mind emulation and other neurotechnologies yet to be developed. Emulations, and other people willing to break continuity of the self by altering the data of the mind in similar drastic ways, such as by running multiple copies with periodic reintegration through overwriting data, will have a considerable economic advantage over those who strive to be certain in retaining continuity of identity. They can adapt themselves to circumstances, and undertake far more activities per unit time. If present opinions and trends are any guide, there is the risk of humanity dwindling to only a handful of long-lived entities, lost amidst a sea of transient and ever-changing ghosts that pretend to continuous existence but in fact destroy themselves over and over, more rapidly than they form thoughts. It will be the death of identity, and the death of all of those who once lived, but then chose to transform themselves into the basis for such a computational wilderness, where there is only oblivion writ large and repeated, not life as we understand it.