The 2010s in Biotechnology Reflect the 1960s in Computing

Drawing on historical analogies is a common practice when trying to figure out where we are and where we're going. We're all still human, and development today proceeds according to human nature first and foremost, just as in the past: great progress has taken place, but when it comes down to the basic organization of research, development, and commercialization of products, there are still far more similarities than differences in comparison with any given yesteryear. We can recognize the elements of our present work in the way the Victorians and the Romans did business - so a mere few decades back into the last century seems a safe era to mine for examples.

Biotechnology is the application of the life sciences, and the foundation of medicine. At present progress in biotechnology is rapid and revolutionary. The groundwork for a series of disruptive, factor-of-ten improvements in a variety of medical technologies has already been accomplished: think of the attention given to gene sequencing over past years, for example, the world watching as costs plummeted even while capabilities increased year over year. That is just one of many, many applications in biotechnology that are improving in similar ways.

I've pointed out in the past that this present stage bears considerable resemblance to the dawn of the age of powered flight: decades in which the necessary technologies for aircraft as we recognize them were developed in isolation, for other uses, or assembled into noble failures. Then, suddenly, the leap was made, and in just a few decades following that the whole nature of travel underwent a revolution - a disruptive advance in the speed of and opportunity for moving from place to place.

A different sort of leap occurred over the 1960s and 1970s in the development of modern computing: the move from expensive, large computing devices to cheap, small computing devices. A change in price is far from prosaic: it drives sweeping changes in adoption and an expansion in the forms of application for any given technology. The lower the barrier to entry - price in this case - the more experimentation and thinking takes place, leading to greater application of a given technology for the benefit of all. In the past I've pointed out the parallels between the early personal computing societies of the 1970s and the present diybio community: enthusiasts, hobbyists, and professionals merging their efforts as costs fall to the point at which anyone can join in and build.

If you look at the Computer History Museum's entries for 1960, 1965, and 1970, it's not unreasonable to suggest that we're somewhere in the middle there when it comes to biotechnology today. The present trend I have in mind is the move towards small, portable, low-cost laboratory equipment that can accomplish most of what was possible in a large lab ten years ago. A lab in a box by that metric is years away still, but numerous groups are making significant inroads towards that goal.

But why care about this? In truth it isn't all that important to me whether or not 2010 in biotechnology is 1960 in computing, but it is important to me to have some model for the next few decades of work in medical science, and in particular the ability to make progress towards treatment and reversal of aging. How much progress is likely in the foundations, the underlying capabilities of biotechnology? (Other people have looked at that exhaustively from a different angle.) Should we expect and plan for disruption of the research process to the point at which the barrier between trained professional and effective self-educated contributor blurs to nothing, as has happened for software development?

We can ask these and other questions so as to have some idea as to whether matters are proceeding as fast as they might, and what new directions in effective advocacy and funding will arise in the near future. Clearly, at some point, if the institutions are not funding the research and development work we feel is important, it will be possible to do it ourselves. Given smart, motivated collaborators and crowdfunding, complex problems can be solved in software development today. Creating applications of new knowledge in molecular biology and medical science is no more complex than constructing large software projects. The differences between these two types of undertaking are the degree to which regulation obstructs change, raises costs, and prohibits participation, and the anemic, expensive nature of the presently available tools for life science work. The latter will change, rapidly, and the former will simply mean that development occurs most readily in regions outside the US.

But these are things to think on as we watch and support our favored research into aging and longevity. In past decades we could only have watched - the amounts of money involved were too large for any ordinary individual to help with. But the cost of life science research is plummeting, and here and now crowdfunding of research projects that advance the state of the art is a very real thing. Medical research for your favored causes isn't just a spectator sport anymore: the times are changing.
