An Interesting Perspective on Neurodegeneration

This open access paper outlines an interesting perspective on the origins of many common forms of age-related neurodegeneration, such as Alzheimer's disease:

The present article examines several lines of converging evidence suggesting that the slow and insidious brain changes that accumulate over the lifespan, resulting in both natural cognitive aging and Alzheimer's disease (AD), represent a metabolism reduction program. A number of such adaptive programs are known to accompany aging and are thought to have decreased energy requirements for ancestral hunter-gatherers in their 30s, 40s and 50s. Foraging ability in modern hunter-gatherers declines rapidly, more than a decade before the average terminal age of 55 years. Given this, the human brain would have been a tremendous metabolic liability that must have been advantageously tempered by the early cellular and molecular changes of AD which begin to accumulate in all humans during early adulthood. Before the recent lengthening of life span, individuals in the ancestral environment died well before this metabolism reduction program resulted in clinical AD, thus there was never any selective pressure to keep adaptive changes from progressing to a maladaptive extent.

I can't say I'm completely sold on the high level conclusion, which seems to depend on relatively unsupported assumptions about the dominant selective pressures acting during specific decades of primitive hunter-gatherer life. Some thought-provoking points are made along the way, however, so I encourage you to read the whole thing.

All humans begin to develop the neurological markers of AD during their early 20s and continue to do so throughout life, most towards clinically irrelevant degrees. But why would these markers present in everyone? How could natural selection have allowed them to become so invasive and ubiquitous if they did not hold some sort of evolutionary significance?

This paper was very interesting but also quite frustrating, as the authors kept referring to "design" in terms of genetics, as well as implying that traits useful to 45-year-old hunter-gatherers were passed to their children. Since it is very rare for anyone over the age of 30 in such societies to successfully have children, there is no obvious way for such traits to be passed down.

However, the notion of dementia as a runaway adaptation to food scarcity is interesting. It makes sense, in extreme conditions, for the body to reduce blood flow to parts of the brain not immediately needed for survival; that trait would have survival benefits and could be expected to be passed down.

Posted by: name at March 13th, 2009 9:16 AM

It seems that the following quote from the article addresses the concerns brought up by the previous post.

"Some theorists assume tacitly that AD presents too late in reproductive life to have been exposed to negative selection [12]. They realize that diseases that arise after individuals become infertile cannot limit the total number of offspring produced, and that evolution has little way of excising such diseases. However, many facts about preclinical AD indicate that the genes responsible for it must have been exposed to blatant selective pressure, well before their bearers reached reproductive senescence. Particularly, neuropathological changes begin in the early 20s and usually constitute a "heavy load" 10–20 years before the first behavioral symptoms of marked cognitive decline surface [13]. Several forms of neurophysiological and intellectual decline have been shown to begin in the first few decades in individuals that will develop AD [14]. In fact, well controlled studies have shown that individuals that carry susceptibility genes for AD exhibit lower levels of intellectual functioning throughout life and are more likely to drop out of high school by age 15 when compared to their matched peers [15]. In other words, because the genes that cause AD create conspicuous neurological and behavioral characteristics that present during reproductive age, they could not have been invisible to evolutionary forces."

Posted by: george at March 13th, 2009 9:45 AM

A passage that I found interesting:

"The sharp diminishment in cerebral metabolism in young adulthood is currently conceptualized in the literature as an evolutionarily mediated response to changes in life-history dynamics [46], but modern AD researchers appear surprised that further reductions occur with advancing age. These reductions, even in late life, should be seen as part of a natural process of continuing development. Young children are small and their metabolic demands are met by their parents, yet they need to learn rapidly in order to become ecologically competent [48]. They can afford a high cerebral metabolism because they benefit so greatly from the incessant thinking and learning that accompanies it. Once the individual becomes an adult though, they need not expend quite so much energy actively learning and analyzing. The adaptive value of extracting large amounts of information from their environment and carrying it with them through time has decreased. This is because, in adulthood, individuals should have already internalized much of the cultural and ecological information that they will need."

Posted by: thoman at March 13th, 2009 12:50 PM
