Personal Survival and Swimming Against the Cultural Currents

I'm a first things first sort of a person. A tremendous amount of work remains on any path that leads to the creation of rejuvenation biotechnology capable of reversing aging - especially if we want it to arrive before we die of old age - so expending a lot of effort on thinking about what happens afterwards doesn't strike me as helpful. That said, it can't hurt to glance ahead here and there in order to anticipate the next array of possible challenges and endeavors.

So this is one of those short glances, focused on the narrow issue of swimming upstream against the culture of our time - which is a great deal more work and a great deal slower than going with the flow.

As I'm sure you've all noticed, the currents of opinion and conversation that underpin our society are largely opposed to initiatives aimed at abolishing aging through medical science so as to live far longer healthy lives. To a certain extent this is because human societies are reflexively conservative, even in times of great change, and everything new is resisted, ridiculed, or ignored in the hope that it will go away, even if beneficial. The ape inside all of us lusts for stability and stasis. Another factor is that many people seem genuinely uninterested in ensuring more healthy time spent alive at some point decades from now - the psychology of time preference at work, deeply discounting the value of anything likely to happen a long time from now. Further, the greater the level of regulation and government intervention in a field, the slower it progresses and the more strongly all change is opposed - just look at medicine in the US, for example, where the primary regulatory body does pretty much all it can to sabotage any form of progress in medicine.

One day, the change that was hard-won will be the new normal and completely accepted. That will be the case for longevity science and the defeat of aging as well, and people of the future will wonder how we could have been such barbarians, resisting the obvious benefits of not suffering, decaying, and dying. Until that time, supporting rationality and faster development in engineered human longevity will continue to be harder than it should be.

This additional cost in time and resources imposed by the nature of our present culture is an existential threat. It threatens to kill us by ensuring that the development of effective ways to reverse aging in the old arrives too late. Given that progress in this field of science and technology is a matter of persuading funding sources and raising money to accomplish known goals, it could be argued that this is a fight to change the prevailing culture rather than a matter of research. If we want to live, it's a fight we have to win - or at least convince a few tens of millions to become supporters of longevity science in the same way that most people are supporters of cancer research.

But let us look to the future, at what I see as a loosely analogous cultural battle that will start to arrive at around the same time as the means to reverse aging - one that will also present an existential threat to personal survival.

Consider that at some point in the next few decades it will become possible to simulate and then emulate a human brain. That will enable such related technological achievements as reverse engineering of memory, a wide range of brain-machine interfaces, and strong artificial intelligence. It will be possible to copy and alter an individual's mind: we are at root just data and operations on that data. It will be possible for a mind to run on computing hardware rather than in our present biology, for minds to be copied from a biological brain, and for arbitrary alterations of memory to be made near-immediately. This opens up all of the possibilities that have occupied science fiction writers for the past couple of decades: forking individuals, merging in memories from other forks, making backups, extending a human mind through commodity processing modules that provide skills or personality shards, and so on and so forth.

There is already a population of folk who would cheerfully take on any or all of these options. I believe that this population will only grow: the economic advantages for someone who can edit, backup, and fork their own mind are enormous - let alone the ability to consistently take advantage of a marketplace of commodity products such as skills, personalities, or other fragments of the mind.

But you'll notice I used what I regard as a malformed phrase there: "someone who can edit, backup, and fork their own mind." There are several sorts of people in the world; the first sort adhere to some form of pattern theory of identity, defining the self as a pattern, wherever that pattern may exist. Thus for these folk it makes sense to say that "my backup is me," or "my fork is me." The second sort, and I am in this camp, associate identity with the continuity of a slowly changing arrangement of mass and energy: I am this lump of flesh here, the one slowly shedding and rebuilding its cells and cellular components as it progresses. If you copy my mind and run it in software, that copy is not me. So in my view you cannot assign a single identity to forks and backups: every copy is an individual, large changes to the mind are equivalent to death, and it makes no sense to say something like "someone who can edit, backup, and fork their own mind."

A copy of you is not you, but there is worse to consider: if the hardware that supports a running brain simulation is anything like present day computers, that copy isn't even particularly continuous. It is more like an ongoing set of individuals, each instantiated for a few milliseconds or less and then destroyed, to be replaced by yet another copy. If the self is data associated with particular processing structures, such as an arrangement of neurons and their connections, then by comparison a simulation is absolutely different: inside a modern computer or virtual machine that same data would be destroyed, changed, and copied at arbitrary times between physical structures - it is the illusion of a continuous entity, not the reality.
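That checkpoint-and-restore pattern can be made concrete with a toy sketch. This is an illustration only, not a claim about any real emulation system: the "mind" here is an invented dictionary, and serialization stands in for whatever a migrating virtual machine or scheduler would do. The point is that each restore cycle preserves the contents exactly while producing a physically new object in fresh memory.

```python
import pickle

# Invented stand-in for a mind's state: just data, per the pattern view.
mind = {"memories": ["first day of school"], "tick": 0}

instances = [mind]  # keep every instantiation alive so identities are comparable
for step in range(3):
    mind["tick"] += 1
    # Checkpoint: serialize the running state, as a migrating VM might.
    blob = pickle.dumps(mind)
    # "Destroy" the old instance and continue from the snapshot instead.
    mind = pickle.loads(blob)
    instances.append(mind)

# The contents carry forward seamlessly at every step...
assert mind["tick"] == 3
# ...but each instantiation is a distinct object at a distinct address:
# one apparent continuous individual, four physical instantiations.
assert len({id(m) for m in instances}) == len(instances)
```

From the inside, nothing in the data records the breaks; from the outside, the sequence of distinct objects is plainly visible, which is exactly the gap between the two identity camps described above.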

That should inspire a certain sense of horror among folk in the continuity of identity camp, not just because it is an ugly thing to think about, but because it will almost certainly happen to many, many, many people before this century ends - and it will largely be by their own choice, or worse, inflicted upon them by the choice of the original from whom the copy was made.

This is not even to think about the smaller third group of people who are fine with large, arbitrary changes to their state of mind: rewriting memories, changing the processing algorithms of the self, and so on. At the logical end of that road lie hives of software derived from human minds in which identity has given way to ever-changing assemblies of modules for specific tasks, things that transiently appear to be people but which are a different sort of entity altogether - one that has nothing we'd recognize as continuity of identity. Yet it would probably be very efficient and economically competitive.

The existential threat here is that the economically better path to artificial minds, the one that involves lots of copying and next to no concern for continuity of identity, will be the one that dominates research and development. If successful and embedded in the cultural mainstream, it may squeeze out other roads that would lead to more robust agelessness for us biological humans - or more expensive and less efficient ways to build artificial brains that do have a continuity of structure and identity, such as a collection of artificial neurons that perform the same functions as natural ones.

This would be a terrible, terrible tragedy: a culture whose tides are in favor of virtual, copied, altered, backed up and restored minds is to my eyes little different from the present culture that accepts and encourages death by aging. In both cases, personal survival requires research and development that goes against the mainstream, and thus proceeds more slowly.

Sadly, given the inclinations of today's futurists - and, more importantly, the economic incentives involved - I see this future as far more likely than the alternatives. Given a way to copy, backup, and alter their own minds, people will use it and justify its use to themselves by adopting philosophies that state they are not in fact killing themselves over and again. I'd argue that they should be free to do so if they choose, just the same as I'd argue that anyone today should be free to determine the end of his or her life. Nonetheless, I suspect that this form of future culture may pose a sizable set of hurdles for those folk who emerge fresh from the decades in which the first early victories over degenerative aging take place.


Quite right. I love the possibility of living forever, but this thankfully still hypothetical technology scares the shit out of me!

Posted by: James at September 4th, 2012 7:30 PM

I think this future in which mind-uploading is common will not come about. As you alluded to yourself by drawing a distinction between computer systems and collections of "artificial neurons," the architecture of the human brain is vastly different from that of a computer, even if they perform somewhat similar data processing functions. The degree to which the host architecture differs from the emulated architecture determines the efficiency of emulation. In the case of emulating a human brain it would be profoundly inefficient.

Even so, one might say that computers will improve so quickly and so vastly that the inefficiency penalty becomes tolerable for emulating a human brain. *Granting for the sake of argument that this is the case* I think brain emulation still loses in the end, because biology will become a moving target itself. The brain can be improved and enhanced in many ways, including provision of an alternate avenue of metabolism to lift the primary limit that biological architecture has run up against — feeding such a demanding tissue as the brain without loosing a firestorm of free radicals and other toxins.

In the end, I think the vision of the uploadists can be likened to a computer manufacturer saying "We won't need graphics units in the future, or even floating-point units, because the speed of the core will rise so fast that emulating these functions will render the specialized circuits obsolete." It's yet another example of the improve-one-technology-while-holding-everything-else-about-society-constant error of thought that anyone who thinks seriously about any aspect of the future will confront over and over again.

Posted by: José at September 4th, 2012 11:36 PM

Reason wrote:

If you copy my mind and run it in software, that copy is not me. So in my view you cannot assign a single identity to forks and backups: every copy is an individual, large changes to the mind are equivalent to death, and it makes no sense to say something like "someone who can edit, backup, and fork their own mind."

There was a time, quite a few years ago, when I struggled with the idea of multiple copies of myself; but I honestly have to say, the idea that "that copy is not me" never crossed my mind. Quite frankly, I literally cannot understand how reasonable minds can hold this view. At best, I can only emulate various cognitive or language-based defects that allow it.

The clearest issue I see is with regard to language, specifically the words 'copy' and 'me'. To see why, consider this standard philosophical question:

"If a boat crosses the ocean, and on its trip every plank is replaced one at a time, is it the same boat when it reaches the other side?"

Is the boat the same? There's a lot of unspecified definition in those words. But no matter how you define the words in your statement, you're always left with what actually happened: every plank in the boat was replaced, one at a time, on the voyage. We don't have any words in the English language to describe that process; so we use things like 'boat' and 'same' to try to construct a flawed definition upon which we can argue semantics.

This is exactly what you're doing with your definitions of 'me', 'copy', and the other parts of your identity claims - trying to jam words you know onto concepts that encompass far more than those words can express. The reality is that uploading and running a mindstate, never mind making multiple functional copies of it, is going to be a lengthy, involved, and complicated process for which words like "me" and "copy" may not even be relevant.

There is also the important question of, "who cares?" If freezing your process and moving your mental state to another substrate destroys you, but you don't actually feel any different, does it matter? Isn't it then merely another pointless semantic issue that reflects only the failure of our language?

Posted by: Dennis Towne at September 5th, 2012 10:10 PM

I don't pay much attention to the "uploading" stuff. I think whole brain emulation will prove quite difficult and will take many decades to yield any success. Anyone who knows software knows this is not going to happen soon. Also, much of neurobiology is still not understood. SENS and other biotechnologies can be developed in less time than whole brain emulation.

Assuming that I make it, I expect to remain biological for at least the next 2-3 centuries.

Posted by: Abelard Lindsey at September 6th, 2012 10:01 AM

If the computational functionalism theory of consciousness were true then "we are at root just data and operations on that (sic) data". There is no evidence for this speculation, however, and IMO strong arguments against it, not least those of John Searle.

Simulation (or emulation) of the brain on a computer is just a bunch of equations describing some aspects of the physical quantities we can measure. In what way would this enable mind uploading?

Posted by: Richard Wilson at September 19th, 2012 7:19 AM

What is personal identity but memories and consciousness? Where is it when and if we can radically alter our bodies, i.e. prostheses/cyborgs? Is the brain then not our identity but a process (unless you believe in a soul)? Seems to me identity is a pattern that is in constant flux. If a cell were to divide into equal parts, then which is the original? An atom of hydrogen is the same as another atom of hydrogen. As far as continuity goes, how would you know if you became unconscious and woke up as a 'different' person?

Posted by: george at April 27th, 2016 11:45 PM
