Evolution, life, and computation

A recent issue of New Scientist featured an article by Kate Douglas with the provocative title Nature’s brain: A radical new view of evolution. The limits of our current understanding of evolution, and the alternative view discussed in the article, are summarized in this excerpt:

Any process built purely on random changes has a lot of potential changes to try. So how does natural selection come up with such good solutions to the problem of survival so quickly, given population sizes and the number of generations available?…It seems that, added together, evolution’s simple processes form an intricate learning machine that draws lessons from past successes to improve future performance.

Evolution, as it has been understood, relies on the ideas of variation, selection, and inheritance. But learning uses the past to anticipate the future, whereas random mutations are selected only by current circumstances. Yet the proposal is that natural selection somehow reuses successful variants from the past. This idea has gained the room to develop, in large part, through the increasing use and broadened development of iterative learning algorithms.

Leslie Valiant, a computational theorist at Harvard University, approached the possibility in his 2013 book, Probably Approximately Correct. There he equated evolution’s action to the learning algorithm known as Bayesian updating.
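To see what that analogy invokes, here is a minimal sketch of Bayesian updating: a belief distribution over competing hypotheses is reweighted by how well each one predicted the latest observation, so evidence accumulated in the past shapes expectations about the future. The coin-bias setup is purely illustrative and is not taken from Valiant's book.

```python
# Bayesian updating: multiply prior belief by likelihood, then renormalize.
# Illustrative coin-bias example, not an example from Valiant's book.

def bayes_update(prior, likelihoods):
    """One round of Bayesian updating over a discrete set of hypotheses."""
    posterior = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(posterior)
    return [p / total for p in posterior]

# Three hypotheses about a coin's bias toward heads.
biases = [0.2, 0.5, 0.8]
belief = [1/3, 1/3, 1/3]            # uniform prior: no past experience yet

for flip in ["H", "H", "T", "H"]:   # a short run of observed flips
    likelihood = [b if flip == "H" else 1 - b for b in biases]
    belief = bayes_update(belief, likelihood)

# Belief has shifted toward the heads-biased hypothesis.
print([round(b, 3) for b in belief])  # → [0.037, 0.365, 0.598]
```

Each update reuses everything learned so far (the prior) rather than judging the newest observation in isolation, which is the feature the evolutionary analogy leans on.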

Richard Watson of the University of Southampton, UK, has added a new observation, and this is the subject of the New Scientist article. It is that genes do not work independently; they work in concert, creating networks of connections. And a network’s organization is a product of past evolution, since natural selection rewards gene associations that increase fitness. What Watson realized is that the making of connections among genes in evolution, forged in order to produce a fit phenotype, parallels the making of neural networks, or networks of associations built in the human brain for problem solving. Watson and his colleagues have gone as far as creating a learning model demonstrating that a gene network can make use of generalization when grappling with a problem under the pressure of natural selection.
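The parallel rests on Hebbian learning, where connections between units that are repeatedly active together grow stronger, so the network comes to store past patterns as retrievable memories. The toy below stores two patterns, standing in for past successful phenotypes, in a small Hopfield-style associative network and then recovers one from a corrupted version. It is a sketch of the general principle only, not Watson's published model.

```python
# Toy Hopfield-style associative memory, sketching the principle Watson
# draws on: associations strengthened by past success let a network
# recover a stored pattern from a partial or noisy prompt.
# The patterns are arbitrary stand-ins, not data from Watson's work.

def hebbian_weights(patterns):
    """Hebb's rule: W[i][j] accumulates the correlation of units i and j."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, sweeps=20):
    """Repeatedly align each unit with the weighted sum of the others."""
    n = len(state)
    s = list(state)
    for _ in range(sweeps):
        for i in range(n):
            total = sum(w[i][j] * s[j] for j in range(n))
            s[i] = 1 if total >= 0 else -1
    return s

# Two stored patterns, standing in for past "successful phenotypes".
stored = [[1, 1, -1, -1, 1, -1], [-1, 1, 1, -1, -1, 1]]
w = hebbian_weights(stored)

# A corrupted copy of the first pattern (last component flipped)...
noisy = [1, 1, -1, -1, 1, 1]
# ...settles back onto the stored original: a "memory" of past success.
print(recall(w, noisy))  # → [1, 1, -1, -1, 1, -1]
```

The network generalizes in the weak sense illustrated here: inputs it has never seen are pulled toward the nearest remembered success, rather than being evaluated from scratch.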

I can’t help but think of Gregory Chaitin’s random walk through software space, his metabiology, where life is considered evolving software (Proving Darwin).  Chiara Marletto’s application of David Deutsch’s constructor theory to biology also comes to mind. Chaitin’s idea is characterized by an algorithmic evolution, Marletto’s by digitally coded information that can act as a constructor, which has what she calls causal power and resiliency.

What I find striking about all of these ideas is the jumping around that mathematics seems to be doing – it’s here, there and everywhere.  And, it should be pointed out that these efforts are not just the application of mathematics to a difficult problem.  Rather, mathematics is providing a new conceptualization of the problem.  It’s reframing the questions as well as the answers.   For Chaitin, mathematical creativity is equated with biological creativity.  For Deutsch, information is the only independent substrate of everything and, for Marletto, this information-based theory brings biology into fundamental physics.

A NY Times review of Valiant’s book, by Edward Frenkel, says this:

The importance of these algorithms in the modern world is common knowledge, of course. But in his insightful new book “Probably Approximately Correct,” the Harvard computer scientist Leslie Valiant goes much further: computation, he says, is and has always been “the dominating force on earth within all its life forms.” Nature speaks in algorithms.

…This is an ambitious proposal, sure to ignite controversy. But what I find so appealing about this discussion, and the book in general, is that Dr. Valiant fearlessly goes to the heart of the “BIG” questions.

That’s what’s going on here.  Mathematics is providing the way to precisely explore conceptual analogies to get to the heart of big questions.

I’ll wrap this up with an excerpt from an article by Arturo Carsetti that appeared in the November 2014 issue of Cognitive Processing with the title Life, cognition and metabiology.

Chaitin (2013) is perfectly right to bring the phenomenon of evolution in its natural place which is a place characterized in a mathematical sense: Nature “speaks” by means of mathematical forms. Life is born from a compromise between creativity and meaning, on the one hand, and, on the other hand, is carried out along the ridges of a specific canalization process that develops in accordance with computational schemes…Hence, the emergence of those particular forms…that are instantiated, for example, by the Fibonacci numbers, by the fractal-like structures etc. that are ubiquitous in Nature. As observers we see these forms but they are at the same time inside us, they pave of themselves our very organs of cognition. (emphasis added)
