Arguing for More Computational Modeling to Aid the Transition to Clinical Trials that Target Aging

At some point in the future, clinical trials for therapies that target mechanisms of aging must start to assess outcomes on aging itself, rather than the present situation in which regulators force potentially broad rejuvenation therapies - such as senolytics - to address only one specific age-related condition at a time. The authors of this paper argue that this will be a challenging transition for present regulatory and research institutions, and that far greater use of computational modeling of aging and the effects of interventions will be needed to smooth the way. I agree that the regulatory system is a roadblock on the paths that should be taken; I'm not sure that I agree with the specific recommendations made in this paper. Greater effective use of computational modeling should, in principle, allow cost reductions across the board in the development of therapies, but I don't know that this really changes the nature of the problem beyond reducing the expense of efforts made to solve it.

The conventional paradigm of "one disease, one drug" should be updated to achieve the vision of targeting aging as a common component of human diseases. The current deterministic genetic paradigm of diagnosing and treating each separate age-related disease fails to fit the broader anti-aging strategies aimed at addressing the closely related concepts of healthspan, resilience, and lifespan, which should be therapeutically managed in the absence of discrete, targetable genetic drivers of aging progression. Perhaps more importantly, current frameworks cannot capture the stochastic aspects that drive the shared trade-offs of the emerging strategies for organismal healthspan and rejuvenation, namely tissue-repair/wound-healing impairment and tumorigenesis.

Successful clinical trials with new families of candidate interventions targeting the biologic machinery of aging per se would be groundbreaking; delaying, preventing (or even reversing) the aging process would result in tremendous cost savings for healthcare systems while increasing the productive contributions that could be made by the older members of our societies. By modeling and predicting the behavior of interventions that target the aging hallmarks in both long-term and acute settings, defined by extension of healthspan/lifespan and enhanced resilience to acute stressors (i.e., reduced frailty), respectively, robust and standardized approaches such as stochastic biomathematical platforms would have the ability to sidestep most of the current challenges in aging-targeting clinical trials, to accelerate the achievement of optimum health and life quality in aging populations.

Link: https://doi.org/10.18632/aging.102180

Comments

Simulations are good and useful, but come with their own limitations, especially for extremely complex systems like living organisms. Decades ago Dijkstra said, "In the good old days physicists repeated each other's experiments, just to be sure. Today they stick to FORTRAN, so that they can share each other's programs, bugs included." And that was about conceptually simple models with programs a few thousand lines long.

Posted by: Cuberat at March 27th, 2020 7:24 AM

Could Artificial Intelligence-based computational modelling with the right data (i.e., cellular-damage-based input and aging-marker-based output) not only help reduce costs, but also help with target prioritisation of available rejuvenation therapies?

Posted by: David Luck at March 28th, 2020 4:32 AM

Imagine a future, for example, where the SENS project for target prioritisation of tissue crosslinking is automated: by being able to identify/label glucosepane and all other types of cross-links, and using AI, we could get the answer as to which cross-links do the most aging damage, e.g. by the AI also imaging skin wrinkles or tracking systolic blood pressure?

Posted by: David Luck at March 28th, 2020 4:49 AM

@David Luck
The current state-of-the-art artificial intelligence is very artificial and not particularly intelligent. The newest approach, deep learning with gradient descent, requires enormous amounts of data to be trained. It can be useful as a tool to identify (classify) dependencies and statistical areas of interest. It can be used to automate some manual work, like counting things in an image, or to automate laboratory testing (though suddenly our meatbag hands turn out to be surprisingly agile, versatile, available, and cheap, so the robot actuator part might be the limiting factor).

In the end we don't have a super intelligence, but rather specialized idiot savants.
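For readers unfamiliar with the term, the gradient-descent training loop mentioned above can be sketched in a few lines. This is a toy illustration with a single parameter and invented data, not a real deep learning system; real networks repeat the same idea over millions of parameters, which is why they need so much data.

```python
# Toy gradient descent: fit y = w * x to invented data by repeatedly
# stepping the parameter w against the gradient of the mean squared error.
def train(xs, ys, lr=0.01, steps=1000):
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        # gradient of mean((w*x - y)^2) with respect to w
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        w -= lr * grad
    return w

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # underlying relationship: y = 2x
w = train(xs, ys)
# w converges toward 2.0
```

With one parameter and four clean data points this converges almost immediately; the "enormous amounts of data" problem appears when the model has vastly more parameters than this.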

Posted by: Cuberat at March 28th, 2020 6:46 AM

I read in Diamandis' book "The Future Is Faster Than You Think" that Insilico is going to simulate clinical trials, so that going from lab to market takes only weeks or months instead of 10 years. This is only a matter of time.

Posted by: Jonathan Weaver at March 28th, 2020 9:18 AM

@Jonathan Weaver
For sure, extra computing power and better simulations help. The problem is that, in its current state, our simulation of biology is quite underwhelming. That's because our understanding, and by extension our models, are very limited. The fact that we keep discovering new pathways and features only supports that statement. Extra computational capacity and AI tools will help with secondary tasks, but we still have to do the dirty biological and human experiments. That takes time and money, even if all other overhead is reduced to zero.

Posted by: Cuberat at March 28th, 2020 3:50 PM

Cuberat, you are certainly right that AI currently needs large amounts of data. Consequently, we can't yet computationally model with AI how far each type of cellular damage causes aging. This is particularly the case because we can't really label and image the cellular damage, and we do not know which biomarkers define 'age', which Reason often writes about. AI uses linear algebra and calculus, so it detects patterns in data using maths. Is this artificial? Perhaps not; perhaps our brains do similar kinds of maths just by their nature. After all, neural networks are, at a high level, based on the brain... anyway, I'm going off topic. My response to your point would be: once we can get data on any of the SENS damage categories for a large population of people, and train AI using appropriate biomarker(s) of their age to predict their age, we can start to better computationally model the process.
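To make the proposal concrete, the simplest version of "train on damage data to predict age" is an ordinary regression. The sketch below is purely hypothetical: the marker values, ages, and the idea of using one crosslink level as the input are all invented for illustration, not real SENS data.

```python
# Hypothetical sketch: fit a linear model mapping a single damage marker
# (e.g. a crosslink level) to chronological age. A subject whose marker
# reads higher than expected for their age would look "biologically older"
# under this crude model. All numbers below are invented.
def fit_linear(damage, ages):
    n = len(damage)
    mx = sum(damage) / n
    my = sum(ages) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(damage, ages))
    var = sum((x - mx) ** 2 for x in damage)
    slope = cov / var
    intercept = my - slope * mx
    return slope, intercept

# invented training data: marker level vs. chronological age
damage = [0.10, 0.25, 0.40, 0.55, 0.70]
ages = [25.0, 40.0, 55.0, 70.0, 85.0]
slope, intercept = fit_linear(damage, ages)

def predicted_age(x):
    return slope * x + intercept
```

A real version would use many markers and a nonlinear model, but the principle is the same: the quality of the age prediction is bounded by the quality of the damage measurements and the choice of biomarkers, which is exactly the gap discussed above.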

Posted by: David Luck at March 30th, 2020 11:21 AM