Personal Survival and Swimming Against the Cultural Currents

I'm a first things first sort of person. A tremendous amount of work remains on any path that leads to the creation of rejuvenation biotechnology capable of reversing aging - especially if we want it to arrive before we die of old age - so expending a lot of effort on thinking about what happens afterwards doesn't strike me as helpful. That said, it can't hurt to glance ahead here and there in order to anticipate the next array of possible challenges and endeavors.

So this is one of those short glances, focused on the narrow issue of swimming upstream against the culture of our time - which is a great deal more work and a great deal slower than going with the flow.

As I'm sure you've all noticed, the currents of opinion and conversation that underpin our society are largely opposed to initiatives aimed at abolishing aging through medical science so as to live far longer healthy lives. To a certain extent this is because human societies are reflexively conservative, even in times of great change, and everything new is resisted, ridiculed, or ignored in the hope that it will go away, even if beneficial. The ape inside all of us lusts for stability and stasis. Another factor is that many people seem genuinely uninterested in ensuring more healthy time spent alive at some point decades from now - the psychology of time preference at work, deeply discounting the value of anything far in the future. Further, the greater the level of regulation and government intervention in a field, the slower its progress and the stronger the opposition to any change - just look at medicine in the US, for example, where the primary regulatory body does pretty much all it can to sabotage any form of progress.

One day, the change that was hard-won will be the new normal and completely accepted. That will be the case for longevity science and the defeat of aging as well, and people of the future will wonder how we could have been such barbarians, resisting the obvious benefits of not suffering, decaying, and dying. Until that time, supporting rationality and faster development in engineered human longevity will continue to be harder than it should be.

This additional cost in time and resources imposed by the nature of our present culture is an existential threat. It threatens to kill us by ensuring that the development of effective ways to reverse aging in the old arrives too late. Given that progress in this field of science and technology is a matter of persuading funding sources and raising money to accomplish known goals, it could be argued that this is a fight to change the prevailing culture rather than a matter of research. If we want to live, it's a fight we have to win - or at least convince a few tens of millions to become supporters of longevity science in the same way that most people are supporters of cancer research.

But let us look to the future, at what I see as a loosely analogous cultural battle that will start to arrive at around the same time as the means to reverse aging - one that will also present an existential threat to personal survival.

Consider that at some point in the next few decades it will become possible to simulate and then emulate a human brain. That will enable such related technological achievements as the reverse engineering of memory, a wide range of brain-machine interfaces, and strong artificial intelligence. It will be possible to copy and alter an individual's mind: we are at root just data and operations on that data. It will be possible for a mind to run on computing hardware rather than in our present biology, for minds to be copied from a biological brain, and for arbitrary alterations of memory to be made near-immediately. This opens up all of the possibilities that have occupied science fiction writers for the past couple of decades: forking individuals, merging in memories from other forks, making backups, extending a human mind through commodity processing modules that provide skills or personality shards, and so on and so forth.

There is already a population of folk who would cheerfully take on any or all of these options. I believe that this population will only grow: the economic advantages for someone who can edit, backup, and fork their own mind are enormous - let alone the ability to consistently take advantage of a marketplace of commodity products such as skills, personalities, or other fragments of the mind.

But you'll notice I used what I regard as a malformed phrase there: "someone who can edit, backup, and fork their own mind." There are several sorts of people in the world; the first sort adheres to some form of pattern theory of identity, defining the self as a pattern, wherever that pattern may exist. Thus for these folk it makes sense to say that "my backup is me," or "my fork is me." The second sort, and I am in this camp, associates identity with the continuity of a slowly changing arrangement of mass and energy: I am this lump of flesh here, the one slowly shedding and rebuilding its cells and cellular components as it progresses. If you copy my mind and run it in software, that copy is not me. So in my view you cannot assign a single identity to forks and backups: every copy is an individual, large changes to the mind are equivalent to death, and it makes no sense to say something like "someone who can edit, backup, and fork their own mind."

A copy of you is not you, but there is worse to consider: if the hardware that supports a running brain simulation is anything like present day computers, that copy isn't even particularly continuous. It is more like an ongoing series of individuals, each instantiated for a few milliseconds or less and then destroyed, to be replaced by yet another copy. If the self is data associated with particular processing structures, such as an arrangement of neurons and their connections, then a simulation is absolutely different: inside a modern computer or virtual machine that same data would be destroyed, changed, and copied between physical structures at arbitrary times - it is the illusion of a continuous entity, not the reality.
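To make the point concrete, here is a minimal toy sketch in Python - all names here are hypothetical illustrations, not any real emulation software - of how a "continuous" simulation on conventional hardware amounts to a chain of checkpoint-and-restore steps: at each scheduling quantum the state is copied to a new location and the prior instance is discarded.

```python
import copy

class MindState:
    """Toy stand-in for the complete state of a simulated mind (hypothetical)."""
    def __init__(self):
        self.memories = []
        self.tick = 0

def run_slice(state, inputs):
    """Advance the simulation by one scheduling quantum."""
    state.memories.append(inputs)
    state.tick += 1
    return state

# The apparently continuous mind is really a sequence of instances:
# each step, the state is checkpointed to a fresh copy, the old
# instance ceases to exist, and a new one resumes from the snapshot -
# loosely analogous to how an OS or VM moves process memory around.
state = MindState()
for inputs in ["sunrise", "coffee", "work"]:
    snapshot = copy.deepcopy(state)      # serialize / checkpoint the state
    del state                            # the prior instance is destroyed
    state = run_slice(snapshot, inputs)  # a new instance resumes from the copy

print(state.tick, state.memories)  # reports an unbroken history regardless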

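```

From the outside, the final state reports an unbroken history; from the inside, each instance began and ended within a single step. That is the illusion of continuity at work.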
That should inspire a certain sense of horror among folk in the continuity of identity camp, not just because it is an ugly thing to think about, but because it will almost certainly happen to many, many, many people before this century ends - and it will largely be by their own choice, or worse, inflicted upon them by the choice of the original from whom the copy was made.

This is not even to think about the smaller third group of people who are fine with large, arbitrary changes to their state of mind: rewriting memories, changing the processing algorithms of the self, and so on. At the logical end of that road lie hives of software derived from human minds in which identity has given way to ever-changing assemblies of modules for specific tasks, things that transiently appear to be people but which are a different sort of entity altogether - one that has nothing we'd recognize as continuity of identity. Yet it would probably be very efficient and economically competitive.

The existential threat here is that the economically better path to artificial minds, the one that involves lots of copying and next to no concern for continuity of identity, will be the one that dominates research and development. If successful and embedded in the cultural mainstream, it may squeeze out other roads that would lead to more robust agelessness for us biological humans - or to more expensive and less efficient ways to build artificial brains that do preserve continuity of structure and identity, such as a collection of artificial neurons that perform the same functions as natural ones.

This would be a terrible, terrible tragedy: a culture whose tides are in favor of virtual, copied, altered, backed up and restored minds is to my eyes little different from the present culture that accepts and encourages death by aging. In both cases, personal survival requires research and development that goes against the mainstream, and thus proceeds more slowly.

Sadly, given the inclinations of today's futurists - and, more importantly, the economic incentives involved - I see this future as far more likely than the alternatives. Given a way to copy, backup, and alter their own minds, people will use it and justify its use to themselves by adopting philosophies that state they are not in fact killing themselves over and again. I'd argue that they should be free to do so if they choose, just the same as I'd argue that anyone today should be free to determine the end of his or her life. Nonetheless, I suspect that this form of future culture may pose a sizable set of hurdles for those folk who emerge fresh from the decades in which the first early victories over degenerative aging take place.