The present article examines several converging lines of evidence suggesting that the slow, insidious brain changes that accumulate over the lifespan, producing both natural cognitive aging and Alzheimer's disease (AD), represent a metabolism reduction program. A number of such adaptive programs are known to accompany aging and are thought to have decreased energy requirements for ancestral hunter-gatherers in their 30s, 40s, and 50s. Foraging ability in modern hunter-gatherers declines rapidly, more than a decade before the average terminal age of 55 years. Given this, the human brain would have been a tremendous metabolic liability, one advantageously tempered by the early cellular and molecular changes of AD that begin to accumulate in all humans during early adulthood. Before the recent lengthening of the lifespan, individuals in the ancestral environment died well before this metabolism reduction program resulted in clinical AD; thus there was never any selective pressure to keep the adaptive changes from progressing to a maladaptive extent.
I can't say I'm completely sold on the high-level conclusion, which seems to rest on relatively unsupported assumptions about the dominant selective pressures acting during specific decades of primitive hunter-gatherer life. Some thought-provoking points are made along the way, however, so I encourage you to read the whole thing.
All humans begin to develop the neurological markers of AD during their early 20s and continue to do so throughout life, in most cases to clinically irrelevant degrees. But why would these markers be present in everyone? How could natural selection have allowed them to become so invasive and ubiquitous if they did not hold some sort of evolutionary significance?