Illustration: life as a computation that efficiently stores and uses predictive information. (Olena Shmahalo/Quanta Magazine)


What’s the difference between physics and biology? Take a golf ball and a cannonball and drop them off the Tower of Pisa. The laws of physics allow you to predict their trajectories pretty much as accurately as you could wish for.

Now do the same experiment again, but replace the cannonball with a pigeon.

Biological systems don’t defy physical laws, of course — but neither do they seem to be predicted by them. Instead, they are goal-directed: survive and reproduce. We can say that they have a purpose — or what philosophers have traditionally called a teleology — that guides their behavior.

By the same token, physics now lets us predict, starting from the state of the universe a billionth of a second after the Big Bang, what it looks like today. But no one imagines that the appearance of the first primitive cells on Earth led predictably to the human race. Laws do not, it seems, dictate the course of evolution.

The teleology and historical contingency of biology, said the evolutionary biologist Ernst Mayr, make it unique among the sciences. Both of these features stem from perhaps biology’s only general guiding principle: evolution. It depends on chance and randomness, but natural selection gives it the appearance of intention and purpose. Animals are drawn to water not by some magnetic attraction, but because of their instinct, their intention, to survive. Legs serve the purpose of, among other things, taking us to the water.

Mayr claimed that these features make biology exceptional — a law unto itself. But recent developments in nonequilibrium physics, complex systems science and information theory are challenging that view.

Once we regard living things as agents performing a computation — collecting and storing information about an unpredictable environment — capacities and considerations such as replication, adaptation, agency, purpose and meaning can be understood as arising not from evolutionary improvisation, but as inevitable corollaries of physical laws. In other words, there appears to be a kind of physics of things doing stuff, and evolving to do stuff. Meaning and intention — thought to be the defining characteristics of living systems — may then emerge naturally through the laws of thermodynamics and statistical mechanics.

This past November, physicists, mathematicians and computer scientists came together with evolutionary and molecular biologists to talk — and sometimes argue — about these ideas at a workshop at the Santa Fe Institute in New Mexico, the mecca for the science of “complex systems.” They asked: Just how special (or not) is biology?

It’s hardly surprising that there was no consensus. But one message that emerged very clearly was that, if there’s a kind of physics behind biological teleology and agency, it has something to do with the same concept that seems to have become installed at the heart of fundamental physics itself: information.

Disorder and Demons

The first attempt to bring information and intention into the laws of thermodynamics came in the middle of the 19th century, when statistical mechanics was being invented by the Scottish scientist James Clerk Maxwell. Maxwell showed how introducing these two ingredients seemed to make it possible to do things that thermodynamics proclaimed impossible.

Maxwell had already shown how the predictable and reliable mathematical relationships between the properties of a gas — pressure, volume and temperature — could be derived from the random and unknowable motions of countless molecules jiggling frantically with thermal energy. In other words, thermodynamics — the new science of heat flow, which united large-scale properties of matter like pressure and temperature — was the outcome of statistical mechanics on the microscopic scale of molecules and atoms.

According to thermodynamics, the capacity to extract useful work from the energy resources of the universe is always diminishing. Pockets of energy are declining, concentrations of heat are being smoothed away. In every physical process, some energy is inevitably dissipated as useless heat, lost among the random motions of molecules. This randomness is equated with the thermodynamic quantity called entropy — a measurement of disorder — which is always increasing. That is the second law of thermodynamics. Eventually all the universe will be reduced to a uniform, boring jumble: a state of equilibrium, wherein entropy is maximized and nothing meaningful will ever happen again.

Are we really doomed to that dreary fate? Maxwell was reluctant to believe it, and in 1867 he set out to, as he put it, “pick a hole” in the second law. His aim was to start with a disordered box of randomly jiggling molecules, then separate the fast molecules from the slow ones, reducing entropy in the process.

Imagine some little creature — the physicist William Thomson later called it, rather to Maxwell’s dismay, a demon — that can see each individual molecule in the box. The demon separates the box into two compartments, with a sliding door in the wall between them. Every time he sees a particularly energetic molecule approaching the door from the right-hand compartment, he opens it to let it through. And every time a slow, “cold” molecule approaches from the left, he lets that through, too. Eventually, he has a compartment of cold gas on the right and hot gas on the left: a heat reservoir that can be tapped to do work.
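The demon's sorting rule is simple enough to simulate. The sketch below is a toy model of my own, not the original thought experiment in full: molecular speeds are drawn from an exponential distribution as a stand-in for a thermal one, the `cutoff` separating "fast" from "slow" is arbitrary, and collisions are ignored. It only illustrates that the sorting rule by itself manufactures a hot and a cold compartment.

```python
import random

random.seed(1)
# two compartments of "molecules", each represented only by its kinetic energy;
# both start with the same mixed, thermal-ish distribution
left = [random.expovariate(1.0) for _ in range(5000)]
right = [random.expovariate(1.0) for _ in range(5000)]
cutoff = 1.0  # arbitrary dividing line between "fast" and "slow"

for _ in range(200_000):
    # a random molecule approaches the door from one side or the other
    side, other = (left, right) if random.random() < 0.5 else (right, left)
    i = random.randrange(len(side))
    # the demon's rule: fast molecules may pass right -> left,
    # slow molecules may pass left -> right; everything else bounces back
    if (side is right) == (side[i] > cutoff):
        other.append(side.pop(i))

hot = sum(left) / len(left)     # mean energy on the left climbs toward ~2
cold = sum(right) / len(right)  # mean energy on the right falls toward ~0.4
```

With these numbers the left compartment ends up several times more energetic than the right: a temperature difference conjured out of a uniformly mixed gas, using nothing but information about individual molecules.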

This is only possible for two reasons. First, the demon has more information than we do: It can see all of the molecules individually, rather than just statistical averages. And second, it has intention: a plan to separate the hot from the cold. By exploiting its knowledge with intent, it can defy the laws of thermodynamics.

At least, so it seemed. It took a hundred years to understand why Maxwell’s demon can’t in fact defeat the second law and avert the inexorable slide toward deathly, universal equilibrium. And the reason shows that there is a deep connection between thermodynamics and the processing of information — or in other words, computation. The German-American physicist Rolf Landauer showed that even if the demon can gather information and move the (frictionless) door at no energy cost, a penalty must eventually be paid. Because it can’t have unlimited memory of every molecular motion, it must occasionally wipe its memory clean — forget what it has seen and start again — before it can continue harvesting energy. This act of information erasure has an unavoidable price: It dissipates energy, and therefore increases entropy. All the gains against the second law made by the demon’s nifty handiwork are canceled by “Landauer’s limit”: the finite cost of information erasure (or more generally, of converting information from one form to another).

Living organisms seem rather like Maxwell’s demon. Whereas a beaker full of reacting chemicals will eventually expend its energy and fall into boring stasis and equilibrium, living systems have collectively been avoiding the lifeless equilibrium state since the origin of life about three and a half billion years ago. They harvest energy from their surroundings to sustain this nonequilibrium state, and they do it with “intention.” Even simple bacteria move with “purpose” toward sources of heat and nutrition. In his 1944 book What is Life?, the physicist Erwin Schrödinger expressed this by saying that living organisms feed on “negative entropy.”

They achieve it, Schrödinger said, by capturing and storing information. Some of that information is encoded in their genes and passed on from one generation to the next: a set of instructions for reaping negative entropy. Schrödinger didn’t know where the information is kept or how it is encoded, but his intuition that it is written into what he called an “aperiodic crystal” inspired Francis Crick, himself trained as a physicist, and James Watson when in 1953 they figured out how genetic information can be encoded in the molecular structure of DNA.

A genome, then, is at least in part a record of the useful knowledge that has enabled an organism’s ancestors — right back to the distant past — to survive on our planet. According to David Wolpert, a mathematician and physicist at the Santa Fe Institute who convened the recent workshop, and his colleague Artemy Kolchinsky, the key point is that well-adapted organisms are correlated with their environment. If a bacterium swims dependably toward the left or the right when there is a food source in that direction, it is better adapted, and will flourish more, than one that swims in random directions and so only finds the food by chance. A correlation between the state of the organism and that of its environment implies that they share information in common. Wolpert and Kolchinsky say that it’s this information that helps the organism stay out of equilibrium — because, like Maxwell’s demon, it can then tailor its behavior to extract work from fluctuations in its surroundings. If it did not acquire this information, the organism would gradually revert to equilibrium: It would die.
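The correlation Wolpert and Kolchinsky describe can be made quantitative with mutual information, the standard measure of shared information. The sketch below uses invented accuracies purely for illustration: a bacterium that swims toward the food 90 percent of the time shares about half a bit with its environment, while one that swims at random shares none.

```python
import math

def mutual_information(joint):
    # I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def bacterium_info(accuracy):
    # food appears to the left or right with equal probability;
    # the bacterium swims toward it with the given accuracy
    joint = {}
    for food in ("L", "R"):
        for swim in ("L", "R"):
            p_swim = accuracy if swim == food else 1.0 - accuracy
            joint[(food, swim)] = 0.5 * p_swim
    return mutual_information(joint)

adapted = bacterium_info(0.9)  # ~0.53 bits shared with the environment
random_ = bacterium_info(0.5)  # 0 bits: uncorrelated, finds food only by luck
```

More shared bits mean the organism's state tracks the environment's state more reliably, which is exactly what lets it exploit fluctuations the way the demon does.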

Looked at this way, life can be considered as a computation that aims to optimize the storage and use of meaningful information. And life turns out to be extremely good at it. Landauer’s resolution of the conundrum of Maxwell’s demon set an absolute lower limit on the amount of energy a finite-memory computation requires: namely, the energetic cost of forgetting. The best computers today are far, far more wasteful of energy than that, typically consuming and dissipating more than a million times more. But according to Wolpert, “a very conservative estimate of the thermodynamic efficiency of the total computation done by a cell is that it is only 10 or so times more than the Landauer limit.”
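The numbers in that comparison are easy to reproduce. The sketch below computes Landauer's limit at room temperature and scales it by the multipliers quoted above; both multipliers are order-of-magnitude figures from the text, not precise measurements.

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0          # room temperature, K

# Landauer's limit: the minimum energy dissipated to erase one bit
landauer = kB * T * math.log(2)   # about 2.9e-21 joules

# order-of-magnitude multipliers from the text
computer_per_bit = 1e6 * landauer  # today's computers: ~a million times the limit
cell_per_bit = 10 * landauer       # Wolpert's conservative estimate for a cell
```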

The implication, he said, is that “natural selection has been hugely concerned with minimizing the thermodynamic cost of computation. It will do all it can to reduce the total amount of computation a cell must perform.” In other words, biology (possibly excepting ourselves) seems to take great care not to overthink the problem of survival. This issue of the costs and benefits of computing one’s way through life, he said, has been largely overlooked in biology so far.

Inanimate Darwinism

So living organisms can be regarded as entities that attune to their environment by using information to harvest energy and evade equilibrium. Sure, it’s a bit of a mouthful. But notice that it said nothing about genes and evolution, on which Mayr, like many biologists, assumed that biological intention and purpose depend.

How far can this picture then take us? Genes honed by natural selection are undoubtedly central to biology. But could it be that evolution by natural selection is itself just a particular case of a more general imperative toward function and apparent purpose that exists in the purely physical universe? It is starting to look that way.

Adaptation has long been seen as the hallmark of Darwinian evolution. But Jeremy England at the Massachusetts Institute of Technology has argued that adaptation to the environment can happen even in complex nonliving systems.

Adaptation here has a more specific meaning than the usual Darwinian picture of an organism well-equipped for survival. One difficulty with the Darwinian view is that there’s no way of defining a well-adapted organism except in retrospect. The “fittest” are those that turned out to be better at survival and replication, but you can’t predict what fitness entails. Whales and plankton are well-adapted to marine life, but in ways that bear little obvious relation to one another.

England’s definition of “adaptation” is closer to Schrödinger’s, and indeed to Maxwell’s: A well-adapted entity can absorb energy efficiently from an unpredictable, fluctuating environment. It is like the person who keeps her footing on a pitching ship while others fall over because she’s better at adjusting to the fluctuations of the deck. Using the concepts and methods of statistical mechanics in a nonequilibrium setting, England and his colleagues argue that these well-adapted systems are the ones that absorb and dissipate the energy of the environment, generating entropy in the process.

Complex systems tend to settle into these well-adapted states with surprising ease, said England: “Thermally fluctuating matter often gets spontaneously beaten into shapes that are good at absorbing work from the time-varying environment.”

There is nothing in this process that involves the gradual accommodation to the surroundings through the Darwinian mechanisms of replication, mutation and inheritance of traits. There’s no replication at all. “What is exciting about this is that it means that when we give a physical account of the origins of some of the adapted-looking structures we see, they don’t necessarily have to have had parents in the usual biological sense,” said England. “You can explain evolutionary adaptation using thermodynamics, even in intriguing cases where there are no self-replicators and Darwinian logic breaks down” — so long as the system in question is complex, versatile and sensitive enough to respond to fluctuations in its environment.

But neither is there any conflict between physical and Darwinian adaptation. In fact, the latter can be seen as a particular case of the former. If replication is present, then natural selection becomes the route by which systems acquire the ability to absorb work — Schrödinger’s negative entropy — from the environment. Self-replication is, in fact, an especially good mechanism for stabilizing complex systems, and so it’s no surprise that this is what biology uses. But in the nonliving world where replication doesn’t usually happen, the well-adapted dissipative structures tend to be ones that are highly organized, like sand ripples and dunes crystallizing from the random dance of windblown sand. Looked at this way, Darwinian evolution can be regarded as a specific instance of a more general physical principle governing nonequilibrium systems.

Prediction Machines

This picture of complex structures adapting to a fluctuating environment allows us also to deduce something about how these structures store information. In short, so long as such structures — whether living or not — are compelled to use the available energy efficiently, they are likely to become “prediction machines.”

It’s almost a defining characteristic of life that biological systems change their state in response to some driving signal from the environment. Something happens; you respond. Plants grow toward the light; they produce toxins in response to pathogens. These environmental signals are typically unpredictable, but living systems learn from experience, storing up information about their environment and using it to guide future behavior. (Genes, in this picture, just give you the basic, general-purpose essentials.)

Prediction isn’t optional, though. According to the work of Susanne Still at the University of Hawaii, Gavin Crooks, formerly at the Lawrence Berkeley National Laboratory in California, and their colleagues, predicting the future seems to be essential for any energy-efficient system in a random, fluctuating environment.

There’s a thermodynamic cost to storing information about the past that has no predictive value for the future, Still and colleagues show. To be maximally efficient, a system has to be selective. If it indiscriminately remembers everything that happened, it incurs a large energy cost. On the other hand, if it doesn’t bother storing any information about its environment at all, it will be constantly struggling to cope with the unexpected. “A thermodynamically optimal machine must balance memory against prediction by minimizing its nostalgia — the useless information about the past,’’ said a co-author, David Sivak, now at Simon Fraser University in Burnaby, British Columbia. In short, it must become good at harvesting meaningful information — that which is likely to be useful for future survival.
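The trade-off Sivak describes shows up in a toy calculation. Assume a two-state environment that stays in its current state with probability 0.9 at each tick (an illustrative number, not from the researchers' work). The latest observation carries about half a bit of predictive information about the next state; an observation from two ticks back carries noticeably less on its own, and for a Markov environment it adds nothing at all once the latest observation is stored, so keeping it is pure nostalgia in Sivak's sense.

```python
import math

def H2(p):
    # binary entropy in bits
    return -sum(q * math.log2(q) for q in (p, 1.0 - p) if q > 0)

stay = 0.9  # chance the two-state environment stays put each tick (illustrative)

# predictive information the latest observation carries about the next state
info_recent = 1.0 - H2(stay)

# an observation from two ticks back predicts the next state only through
# the two-step transition probability stay^2 + (1 - stay)^2
stay2 = stay * stay + (1.0 - stay) * (1.0 - stay)
info_old = 1.0 - H2(stay2)
```

Each remembered observation costs a full bit of memory, but the older one buys far less prediction (about 0.32 bits versus 0.53), so an efficient system discards it.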

You’d expect natural selection to favor organisms that use energy efficiently. But even individual biomolecular devices like the pumps and motors in our cells should, in some important way, learn from the past to anticipate the future. To acquire their remarkable efficiency, Still said, these devices must “implicitly construct concise representations of the world they have encountered so far, enabling them to anticipate what’s to come.”

The Thermodynamics of Death

Even if some of these basic information-processing features of living systems are already prompted, in the absence of evolution or replication, by nonequilibrium thermodynamics, you might imagine that more complex traits — tool use, say, or social cooperation — must be supplied by evolution.

Well, don’t count on it. These behaviors, commonly thought to be the exclusive domain of the highly advanced evolutionary niche that includes primates and birds, can be mimicked in a simple model consisting of a system of interacting particles. The trick is that the system is guided by a constraint: It acts in a way that maximizes the amount of entropy (in this case, defined in terms of the different possible paths the particles could take) it generates within a given timespan.

Entropy maximization has long been thought to be a trait of nonequilibrium systems. But the system in this model obeys a rule that lets it maximize entropy over a fixed time window that stretches into the future. In other words, it has foresight. In effect, the model looks at all the paths the particles could take and compels them to adopt the path that produces the greatest entropy. Crudely speaking, this tends to be the path that keeps open the largest number of options for how the particles might move subsequently.

You might say that the system of particles experiences a kind of urge to preserve freedom of future action, and that this urge guides its behavior at any moment. The researchers who developed the model — Alexander Wissner-Gross at Harvard University and Cameron Freer, a mathematician at the Massachusetts Institute of Technology — call this a “causal entropic force.” In computer simulations of configurations of disk-shaped particles moving around in particular settings, this force creates outcomes that are eerily suggestive of intelligence.

In one case, a large disk was able to “use” a small disk to extract a second small disk from a narrow tube — a process that looked like tool use. Freeing the disk increased the entropy of the system. In another example, two disks in separate compartments synchronized their behavior to pull a larger disk down so that they could interact with it, giving the appearance of social cooperation.

Of course, these simple interacting agents get the benefit of a glimpse into the future. Life, as a general rule, does not. So how relevant is this for biology? That’s not clear, although Wissner-Gross said that he is now working to establish “a practical, biologically plausible, mechanism for causal entropic forces.” In the meantime, he thinks that the approach could have practical spinoffs, offering a shortcut to artificial intelligence. “I predict that a faster way to achieve it will be to discover such behavior first and then work backward from the physical principles and constraints, rather than working forward from particular calculation or prediction techniques,” he said. In other words, first find a system that does what you want it to do and then figure out how it does it.

Aging, too, has conventionally been seen as a trait dictated by evolution. Organisms have a lifespan that creates opportunities to reproduce, the story goes, without inhibiting the survival prospects of offspring by the parents sticking around too long and competing for resources. That seems surely to be part of the story, but Hildegard Meyer-Ortmanns, a physicist at Jacobs University in Bremen, Germany, thinks that ultimately aging is a physical process, not a biological one, governed by the thermodynamics of information.

Video: David Kaplan explains how the law of increasing entropy could drive random bits of matter into the stable, orderly structures of life. (Filmed by David Kaplan, Tom Hurwitz, Richard Fleming and Tom McNamara for Quanta Magazine; music by Podington Bear.)

It’s certainly not simply a matter of things wearing out. “Most of the soft material we are made of is renewed before it has the chance to age,” Meyer-Ortmanns said. But this renewal process isn’t perfect. The thermodynamics of information copying dictates that there must be a trade-off between precision and energy. An organism has a finite supply of energy, so errors necessarily accumulate over time. The organism then has to spend an increasingly large amount of energy to repair these errors. The renewal process eventually yields copies too flawed to function properly; death follows.
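A toy version of this error accumulation, with invented numbers purely for illustration: if each renewal cycle retains 99.9 percent of function, and a copy below 95 percent of function can no longer work, the lineage of copies fails after a few dozen cycles.

```python
import math

retain = 0.999   # fraction of function kept per renewal cycle (invented number)
viable = 0.95    # minimum functional fraction for a working copy (invented number)

fraction, cycles = 1.0, 0
while fraction >= viable:
    fraction *= retain
    cycles += 1

# closed form: roughly log(viable) / log(retain) cycles, about 52 here
```

The exact count depends entirely on the two invented parameters; the point is only that any fixed per-cycle error rate plus a viability threshold yields a finite number of renewals.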

Empirical evidence seems to bear that out. It has long been known that cultured human cells can divide no more than 40 to 60 times (the Hayflick limit) before they stop and become senescent. And recent observations of human longevity have suggested that there may be some fundamental reason why humans can’t survive much beyond age 100.

There’s a corollary to this apparent urge for energy-efficient, organized, predictive systems to appear in a fluctuating nonequilibrium environment. We ourselves are such a system, as are all our ancestors back to the first primitive cell. And nonequilibrium thermodynamics seems to be telling us that this is just what matter does under such circumstances. In other words, the appearance of life on a planet like the early Earth, imbued with energy sources such as sunlight and volcanic activity that keep things churning out of equilibrium, starts to seem not an extremely unlikely event, as many scientists have assumed, but virtually inevitable. In 2006, Eric Smith and the late Harold Morowitz at the Santa Fe Institute argued that the thermodynamics of nonequilibrium systems makes the emergence of organized, complex systems much more likely on a prebiotic Earth far from equilibrium than it would be if the raw chemical ingredients were just sitting in a “warm little pond” (as Charles Darwin put it) stewing gently.

In the decade since that argument was first made, researchers have added detail and insight to the analysis. Those qualities that Ernst Mayr thought essential to biology — meaning and intention — may emerge as a natural consequence of statistics and thermodynamics. And those general properties may in turn lead naturally to something like life.

At the same time, astronomers have shown us just how many worlds there are — by some estimates stretching into the billions — orbiting other stars in our galaxy. Many are far from equilibrium, and at least a few are Earth-like. And the same rules are surely playing out there, too.




Reader Comments

  • " … physics now lets us predict, starting from the state of the universe a billionth of a second after the Big Bang, what it looks like today."

    I think that might be news to an awful lot of physicists and cosmologists. There are significant features of the Big Bang and the universe as it appears to be today that are either still hypothetical or simply remain unexplained. We have no theory that provides a solid explanation for gravity; we don't know how many dimensions there really are; dark matter may or may not exist, and some MOND-type theories are respectable enough not to be completely dismissed; dark energy may or may not exist, and very recently we've *again* come a cropper with inconsistent figures for the current rate of expansion of the universe; the cosmological constant remains ridiculously inconsistent with vacuum energy (pace renormalisation); no one knows why the universe appears perfectly tuned for life, so folks resort to anthropic arguments; proton decay has still to be observed; … there's no point listing all the things we either don't know or haven't been able to prove: there are plenty.

    A grail of cosmology would indeed be a set of rules which would allow us to predict how and why the universe is as it is, starting as the singularity or whatever was instantiated and the very first 'tick' of the Planck clock began running: but I don't think anyone claims to have done that yet, do they? (And I'm talking about a detailed understanding, not just "A colossal initial energy density made everything get bigger really fast, and it's still going".)

    One might also consider that the mathematical techniques used to try to figure this stuff out have taken many odd twists and turns. The 'silos' of different branches of math periodically reveal amazingly elegant connections between each other, often, strangely enough, showing us that what we thought to be complex or even impossible viewed through one mathematical prism turns out to be simpler, or at least tractable, when approached differently.

    I hope the author will turn out to be right, one day. I hold out hopes for a paper explaining it all from that first 10^-44sec moment, perhaps titled "In the Beginning was the Word, and the Word was Geometry"—but we're not there yet.

  • Is this not a form of jiujitsu? If the purpose is to maximize entropy, which is what the universe wants anyway, then life is essentially tricking the universe by overhelping, overgoing with the flow, i.e. increasing the entropic flow beyond "normal" for life's own benefit. Way to go, life. I bow in your direction.

  • I think it suffices to say that life's only regulating factor is 'to persist' or in more common terms "to survive". It appears to me that evolution is the result of the process of survival.

    What good is served for modifications in life to evolve if persistence isn't present? The end result is non-survival, or death. First, the structure or whatever it was had to persist, or to duplicate. Therefore it seems survival or the urge to survive must have been present. One may claim that survival is an evolution, and yet, if persistence hadn't existed as an original functional requirement for life, any "organism" in the environment wouldn't have persisted long enough to evolve. Therefore it is not evolution which drives life, but persistence, or survival.
    I think we see this in some forms of disease where the dominating action against a living host which ultimately leads to death is an overactive persistence of duplication. Therefore even the concept of persistence must begin with duplication, and yet mere duplication isn't enough, as no single entity produced would persist long enough to evolve. And so we have persistence. The first act is either duplication, or duplication with persistence; then evolution has a framework upon which to evolve.

  • "It’s almost a defining characteristic of life that biological systems change" – indeed, life could be defined as change.

  • This can help explain how an individual organism operates to survive and live as long as possible. But the other defining characteristic of life is reproduction, which has no benefit for the parent organism, particularly those that have little or no contact with their offspring after reproduction. How would thermodynamics explain that?

  • The article mentions life as a "prediction machine," an idea which I am very fond of lately. But I didn't see a mention of Adami's novel ideas concerning the re-definition of information as the ability to make a prediction better than chance. A while back your magazine wrote a very nice article on this:
    and I think it's very relevant to this discussion.

  • The second law of thermodynamics is a myth or, more precisely, a money-spinner. Violations are very easy to demonstrate but scientists would not react. Here are perpetual-motion machines of the second kind published in prestigious journals and no reaction at all from the scientific community:
    Electricity generated from ambient heat across a silicon surface, Guoan Tai, Zihan Xu, and Jinsong Liu, Appl. Phys. Lett. 103, 163902 (2013): "We report generation of electricity from the limitless thermal motion of ions across a two-dimensional (2D) silicon (Si) surface at room temperature. […] …limitless ambient heat, which is universally present in the form of kinetic energy from molecular, particle, and ion sources, has not yet been reported to generate electricity. […] This study provides insights into the development of self-charging technologies to harvest energy from ambient heat, and the power output is comparable to several environmental energy harvesting techniques such as ZnO nanogenerator, liquid and gas flow-induced electricity generation across carbon nanotube thin films and graphene, although this remains a challenge to the second law of thermodynamics…"
    D. P. Sheehan et al, Foundations of Physics, March 2014, Volume 44, Issue 3, pp 235-247: "…there arise between the vane faces permanent pressure and temperature differences, either of which can be harnessed to perform work, in apparent conflict with the second law of thermodynamics. Here we report on the first experimental realization of this paradox, involving the dissociation of low-pressure hydrogen gas on high-temperature refractory metals (tungsten and rhenium) under blackbody cavity conditions. The results, corroborated by other laboratory studies and supported by theory, confirm the paradoxical temperature difference and point to physics beyond the traditional understanding of the second law."

    Pentcho Valev

  • The conference and this article arising from it I'm sure are pushing in the right direction. I've been thinking along these lines for years. To my mind this kind of thinking does explain the "Anthropocene era" and why we will drive all other life forms that do not serve our needs to extinction. It also implies we or something evolved from our genome will survive any climate change fluctuations short of a biologically destructive one. And it also explains why we are chewing through the earth's surface so dramatically, grinding away mountains for a few grains of gallium, selenium, or whatever other quantumly useful material technology requires.
    It must also explain – though as yet I'm not sure how – how the rich get richer and the poor can only be more numerous; that 1% or so of the world's population own 50% of the wealth. It's a Black Hole syndrome of a kind and has to be accounted for from within the envelope of thermodynamics.
    In short any biological dynamic we as humans have thought we were the instigators of can and should be explained from basic physics principles. Free will anyone?

  • Things are either organic or inorganic. Organic things can evolve, inorganic things do not evolve. Organic things can go extinct, inorganic things do not go extinct. Organic things can evolve consciousness, inorganic things won't evolve consciousness. Inorganic things are here forever, organic things come and go. Organic things are dependent on other stuff, inorganic things are not dependent on anything. Please excuse me for avoiding things that do and don't have carbon hydrogen bonding.

  • If the theory looks so promising and is built on simple physics principles, why not try making basic organic molecules of carbon from a mixture of elements from the periodic table? If it works so nicely with computer programs running on similar principles, it's time to try those principles with real matter.

  • Living systems both (1) archive information regarding past trials at persisting (essentially the genome), and (2) catalyze the conversion of Free Energy to heat (enzymes/nano-machines performing work).

    (2) is obviously in accord with the Second Law of Thermodynamics, but (1) seems otherwise.

    However, the storage of information about persisting (1) is based upon undirected and unintentional sorting out of random information passively performed by the environment (natural selection).

    Life is entirely consistent with the Second Law. It's merely physics.

  • Occasionally, science is accused of applying "magic" (fudging) to gain a presupposed outcome. This article should help those who do so.

  • From this interesting article (and other discussions in Quanta) it appears that the 2nd Law of Thermodynamics is responsible for the formation of galaxies and stars, and for both the de novo creation of life and its evolution. That seems a lot to ask of one law stating that the drive toward entropy is the reason complex systems arise. Though I do see the connection, one aspect of this 2nd Law has always bothered me. If we are heading full speed toward entropy, disorder and randomness, where is the ultimate origin of the negentropy and order that must have been in place first? Is it the universe compressed that exploded in the Big Bang? If that is the case, then isn't it reasonable to say that prior to the 2nd Law driving us to entropy there must be another law that first "says" matter is ordered, non-random and low in entropy? The universe has a long way to go to become entirely random, and prior to this there must have been order. There is something missing in our current description of the way the universe works, and the 2nd Law doesn't provide the answer as far as I can reason.

  • The problem with computers today is that we are overly focused on instances, and neglect to understand the whole. Big data. All the answers are in big data, if we can just get the right algorithms and the power to process it all. Wrong!

    Life is a concept. It's all in our heads. And we don't agree at all, or we wouldn't have such large issues over the "right to life." Life exists as a theory and a model, in our heads. And we confuse maps with territory. What is a "concept"? How many atoms does it occupy, and how is "energy" passed and thermodynamics involved when we share a concept? How many atoms are involved in a "relationship" between two concepts? I understand the "chemistry" and brain neurons firing and thermodynamics in terms of the processing that we as humans do, but what about the concept itself? Those things we "share" with Aristotle, Plato, Newton; an "understanding" of science and philosophy. Is that sharing related to some physical property, or is it something else?

    This is good and necessary research, but there is a long way to go, and some of the speculative associations seem presumptive.

    Is "matter" as we currently understand it all there is? How sure are we of that? My impression, based on the latest results from CERN and other places is that we are really sure we do not understand it all, and still have questions to ask. I'm assuming that will ALWAYS be the case. But, research and investigating mysteries is a good thing and a noble endeavor, and I applaud the investigation.

  • Isn't the title, "How Life (and Death) Spring From Disorder," ironic, when everything in the article, references and comments discusses order and nothing but order? Researchers as well as educators, when engaging in thought experiments, choose a starting point that to them seems random and chaotic, and wind through logic that ends at a convenient point supporting their conclusions, when in fact both the physical multiverse and everything in it, including matter, life and languages, are a testament to order and the inevitable consequences of orderliness. Even the Big Bang is an inevitable consequence, and not disorderly.

  • I strongly disagree with the notion that biological aging is in any way a result of some fundamental principle of physics, primarily because it is demonstrably wrong. Turritopsis dohrnii, bdelloids, hexactinellids, and many other animals are biologically immortal. They do not age; they die only by accident or misadventure. Aging and death, therefore, cannot be a principle woven into physics.

    The only rational explanation for aging is that offered by an evolutionarily advantageous benefit to culling generations in exchange for increased adaptational mutation. Aging is not physics, it is programming – preprogrammed senescence and death, woven into the genetics of many (even most), but by no means all, animals and plants.

    I am weary of any failure to properly quantify aging as an evolutionary adaptation, rather than some mysterious inescapable and innate property of the universe. That just is not so. Aging is an engineering problem, and a solvable one, because it is now very clear that some animal and plant species simply lack it altogether. Senescence has to be added. It does not come standard.

  • The article opens a new window of science where a nonequilibrium state of an overall system can result in the scenario of "things doing stuff" with intent and information. Scientists can work out a definite link between the non-biotic and biotic (conscious) worlds with this direction of research, although the examples of sand dune patterns, crystal shapes and small discs attracting larger discs with intent still have to be explained in a mathematical, statistical and scientific manner to prove the theory. Lately mathematicians are also considering effort, will, wish and imagination as parameters in several complex mathematical concepts.
    The space agencies busy finding intelligent life in interstellar systems could also focus on the intelligence of matter with some intent and information, to detect the primordial shape of life to come, which could be considered an interpolation of organic evolution.

  • It will be interesting to see if any of these ideas will supplement known evolutionary mechanisms. But aside from the science in the latter part of the article, which seems remote from biology, the science in the earlier part can, as has already been noted in the comments, better be explained by gene evolution in cellular populations. Reproduction, drift and selection predict survival, homeostasis, thermodynamic efficiency, environmental information use, gene information, biologically meaningful information (alleles and gene regulation resulting in trait expression) and death.

    Specifically though, if I put on the bioinformatician hat that I have not fully earned yet, I note that not only do eukaryotes especially have vast "nostalgic" memories of genetic parasites, pseudogenes (junk genes) and random survivable genetic changes, but those unused memories are explicitly useful in fitness-increasing instances of co-option. That is just the relevant biology, but it should be accounted for to avoid mistaken claims. And of course we know from Crick [the original central dogma] that whatever proteins in motor or sensor biomolecules "learn," it cannot be put back into genes without selection on the cellular population.

    Now I will put on my old physics hat and discuss the non-biological parts. The article's notion of "intentionality" in Maxwell's demon is strained beyond credulity. What it does can be described as an algorithm based on observation. In essence it is no different from the "algorithm" guiding a played billiard ball's travel.

    On the other hand I like the observation that the thermodynamics of geological non-equilibrium systems is much more likely to be responsible for emergence of life than equilibrium soups. Though to be fair to soup cooks, they use solar irradiation, wet/dry cycles and hydrothermal systems when they propose emergence recipes.

  • At the moment I am reading "Holism and Evolution" by J. C. Smuts (ca 1927), and even allowing for its date of publication, I find it so deeply disappointing as to think I see why it never really survived as required reading for scientists. The opening paragraphs of this article recall some of the weaknesses in "Holism" (not to be confused with some modern versions of the application of the term).

    Consider: "… same experiment again, but replace the cannonball with a pigeon…"

    Oh yes? As the diagnostic paradigms of two disparate classes? To make the analogy meaningful rather than misleading, try: "… same experiment again, but replace the cannonball successively with a fresh, fertile egg of a pigeon, a handful of dust, a kilogram of ice, a litre of water, a thin, rigid tile, a screw-twisted strip of paper, a narrow hoop of paper, a kiting spider, a functional drone, and an experimentalist…"

    Smuts made similar blunders in his book and developed them into what he thought was a basis for a philosophy and a great empirical and explanatory truth and insight, but within a few chapters his theme had deteriorated into meaningless assertions based on arbitrary assumptions adopted within the starting pages.

    The whole question of the nature of Life the Universe and Everything reduces to the nature of emergent consequences and attributes of systems at varying levels of complexity (always assuming that these and other principles are not misleading us concerning our undoubtedly simplistically flawed, basically Ockhamist, assumptions, such as that we are not the toys of some god's selfish jest).

    Thus, a water molecule in isolation tells us little about the nature of a droplet; that in turn tells us little about the nature of a bucketful, and less about lakes and oceans. The nature of say, plastics and a few metals would be a poor basis for predicting the nature of cars, aircraft, firearms and railways, and the nature of organic molecules as a class would tell us little about the nature of pre-biotic slimes and subsequently about the emergent nature of the creatures that might elect to use such emergent items.

    But every step along the dimension of increasing complexity leads to its own emergent effects, not all initially obvious. And there are many such levels between the earliest macromolecules and life-as-we-know-it, Jim.

    Consider again: "Biological systems don’t defy physical laws, of course — but neither do they seem to be predicted by them…" No? Less predicted in the light of atomic behaviour, than the nature and behaviour of oceans or weather patterns or the paths of eroding rivulets?

    "In contrast, they are goal-directed…" empirically, an ocean wave, a mousetrap, an autopilot, an arrow in flight, and a man with a maid are equally goal-directed.

    "We can say that they have a purpose — or what philosophers have traditionally called a teleology…" Really? Have you asked a flu virus, a lichen, a sponge, a monkey, a baby, a voter, to explain its teleology in any way that helps define life?

    And thermodynamics? How did thermodynamics get into this? That mouldy old chestnut has been laboured to death for generation after generation during the last century plus, as the thermodynamicists repeatedly explain, hoarser with boredom as the ages roll, just what the elementary principles of thermodynamics entail.

    Anyone raising that question in connection with how life and evolution violate thermodynamics rather than religiously following thermodynamic principles, ipso facto demonstrates that he is incompetent to participate in any ensuing debate: back to the elementary textbooks first!

  • @Peter: Processes have no 'purpose', and I think the increase of entropy – if that is what is meant by 'maximizing' it – is a good example. Any engine, like a cell, increases entropy faster when it operates than when it does not; it is forced to do so by thermodynamics if it does work. If nothing else a cell is a refrigerator: it places and bonds ("freezes") molecules in out-of-equilibrium configurations in order to grow and replicate, and specifically motor molecules in order to maintain the through-flow of free energy.

    @Milton: Cosmology was not the purpose of the article. But yes, it is known that given the known laws of cosmology (the LambdaCDM model, which of course includes gravitation) you can model the universe with good faithfulness, to within an order of magnitude for galaxy counts et cetera, if I remember correctly. A recent success is that if supernova outflow feedback is included you do a better job at all scales, from individual dwarf galaxies and their numbers to spiral galaxies – detailed scales on which early models did worse than specific ad hoc models (such as MOND) [ , , ].

    Side note: Your comment has the – avoidable – problem discussed here: . Your "cropper" of inconsistent figures is likely two recent data sets, none of which makes significant prediction (the earlier observation is just shy of 3 sigma, the more recent is based on microlensing by ~ 5 galaxies I think) and the real inconsistency found – an earlier unknown generic dust component that has occluded Planck observations – will likely align all near-outliers when accounting for it does so with the bulk of the data for the umpteenth time. Fair warning: not that it will silence all protests against expansive and robust knowledge of course. But eventually such protests will be better explained by psychology than by some fragile basis of physics.

    Finally you ask why the universe appears perfectly tuned for life when it is known that evolution makes populations of organisms appear tuned for life (sufficient fitness based on evolved traits). But I admit that part of the general appearance of tuning could be due to selection bias as Quanta Magazine has described earlier:

    "Between string theory and cosmology, the idea of an infinite landscape of possible universes became not just acceptable, but even taken for granted by a large number of physicists. The selection effect, Silverstein said, would be one quite natural explanation for why our world is the way it is: In a very different universe, we wouldn’t be here to tell the story.

    This effect could be one answer to a big problem string theory was supposed to solve. As Gross put it: “What picks out this particular theory” — the Standard Model — from the “plethora of infinite possibilities?”"

    [ ].

  • Some of the topics raised in that article (and if we are to take it seriously, at the workshop) simply are incoherent. Consider:

    "Once we regard living things as agents performing a computation — collecting and storing information about an unpredictable environment — capacities and considerations such as replication, adaptation, agency, purpose and meaning can be understood as arising not from evolutionary improvisation, but as inevitable corollaries of physical laws. In other words, there appears to be a kind of physics of things doing stuff, and evolving to do stuff. Meaning and intention — thought to be the defining characteristics of living systems — may then emerge naturally through the laws of thermodynamics and statistical mechanics."

    This is one of the most thoroughly invalid paragraphs I ever have read.
    …collecting and storing information… could as well apply to a homing missile.
    … about an unpredictable environment… makes no sense; the whole point about collecting the information is that the environment *is* predictable enough to make the information profitable.

    … capacities and considerations such as replication, adaptation, agency, purpose and meaning can be understood as arising not from evolutionary improvisation, but as inevitable corollaries of physical laws… They never could be understood as violating the laws even in the most naively conceived of empirical schemata, whereas any scheme based on the routine violation of empirical prediction and explanation simply makes no realistic sense, scientific or otherwise.

    … physics of things doing stuff, and evolving to do stuff…
    Yes. That kind of physics is a class of examples of what we call physics. Physics in such matters deals with the nature of situations and consequences. This remains true even at quantum levels, where the limits of precision become dominant. So what else is new? Are we supposed to assume that a slide rule or computer violates physics in producing the response to input? Or that the nervous system of a spider dodging an attack, or of the mantis lunging at it, is violating physics? Bear in mind that information theory and formal mathematics ultimately are consequences of physics; they cannot materially exist even in the mind of a theoretician, not without physical collection, storage, and processing.

    … Meaning and intention — thought to be the defining characteristics of living systems …
    Oh really???? I must ask the next Escherichia, Poriferan, Caenorhabditis or Aplysia what its meaning or intention are. Or if they prove too dismissive, perhaps I could ask the denizens of a local bar. Those are all supposed to be living systems. And a good deal of software has been written that has persuaded terminal operators that there was a human behind the screen. And what about the reproductive behaviour of pepsin, a molecule? Accordingly, which of those classes of system are the most life-like?

    …may then emerge naturally through the laws of thermodynamics and statistical mechanics… By what standard were they ever independent of those laws?

    Must try harder…

  • The whole debate seems to be full of non-problems: "One difficulty with the Darwinian view is that there’s no way of defining a well-adapted organism except in retrospect." Even allowing for hyperbole, that is not a difficulty, any more than the limitations of predicting the path of a lightning stroke or a down feather from a passing swift to the ground represent a difficulty to physical theory.

    How about: "You’d expect natural selection to favor organisms that use energy efficiently."
    That makes no sense. What you SHOULD expect is for selection to favour organisms that deal most effectively with the limiting factors in their environment. If that happens to be a shortage of energy, then they are energy efficient. Hemiptera that feed on sap on the other hand, commonly get rid of honeydew in quantities that dwarf their entire metabolic energy budgets — sometimes by factors of hundreds or more.

    For them energy is not a limiting factor; why should they conserve it? This is basic evolutionary logic.
    Wasn't there an evolutionist at the workshop?

  • My lighting and acoustics teacher said that a mathematical formula was a poem that explained the nature of a process. So perhaps life begins with poetry?

  • Interesting article. Just a comment on the video. In my opinion, the most convincing theory for the origin of life is that of deep-sea alkaline vents creating a natural proton gradient flowing through porous rock. If that is the case, then a natural non-evolving system of greater-than-random order fostered life.

  • Your article makes me ponder ecosystems and food chains. Your point that nonliving systems also show elements of 'adaptation', settling into stable processes consistent with their thermodynamic setting, was striking.

    Ecosystems and food chains supposedly form after millennia and depend on all members to be self-sustaining systems. I ponder if food chains and wider ecosystems can come and go, and establish themselves and become stable in short time frames and involve any number of replacement 'parts' of the chain. From deep sea vents to near space, living and nonliving systems seem to form complex interactions and stabilize much sooner and easier than the old models would have us believe.

  • The fatal flaw with this theory is that, to the best of our knowledge, life only occurred once. Life is not constantly forming out of organic materials. If Earth had missed its one-time happening of life, it would be just another rock in space.

    Life is neither common nor inevitable. So far as we know, we may be the only living things in this universe.

  • I have a problem with the Landauer limit. What if instead of a demon we used an angel dragging a crescent curtain behind him to form a closed sphere after a period of time? If a box of gas were released in the center, then the hotter particles would escape much faster, and the angel would wrap up the colder particles in the center. I assume the solution is that the energy to move the curtain outweighs the decrease in entropy, but maybe not.
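    For scale, the Landauer bound at issue here is straightforward to evaluate; a minimal Python sketch, assuming room temperature (300 K):

```python
from math import log

# Landauer's bound: erasing one bit of information dissipates
# at least k_B * T * ln(2) of heat into the environment.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K
e_min = k_B * T * log(2)
print(e_min)        # minimum energy cost per erased bit, in joules
```

    Any demon (or angel) that records which particles it has sorted must eventually erase that record, paying at least this much heat per bit, which is what rescues the Second Law in the standard resolution.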

    A microstate of AGCT can be written as zeros and ones, since the bases pair together and both strands can encode proteins. A microstate of ACAC would be 1010; we would say its macrostate is "two A's," i.e. it contains two zeros. Similarly, the microstate 0101 differs from our first microstate but is in the same macrostate. Entropy is highest where the macrostate has the largest number of microstates. If you take the human genome, then the highest-entropy macrostate is a sequence containing about half 1s. That allows evolution to encode the most genetic information and leave room for adaptation.
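    The microstate counting in this paragraph can be checked directly; a toy sketch for binary sequences of length 10 (the length is arbitrary):

```python
from math import comb

# For binary sequences of length n, take the "macrostate" to be the
# number of 1s; its microstate count is the binomial coefficient C(n, k).
n = 10
counts = {k: comb(n, k) for k in range(n + 1)}

# The macrostate with the most microstates (highest entropy) is k = n/2.
peak = max(counts, key=counts.get)
print(peak, counts[peak])  # half-1s macrostate, 252 microstates for n=10
```

    The same counting argument is why a near-half mix of 0s and 1s maximizes the number of distinguishable sequences of a given length.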

    Fitness is a bit of a misnomer (pun intended), because a fit gene is one spread throughout the population. Two people have 99.9% of the same (fit) genes; however, a hypothetical randomly different 0.1% of someone's genes gives them a slight edge in producing offspring. Both people will still reproduce, but eventually more of the children with the slight advantage will have more offspring, until everyone who pairs off has the gene for more offspring. No one died of loneliness or went extinct, but the better gene is now a fit gene in a population where everyone is 99.901% identical. If two such populations were separated, then over time they would diverge into different species as random mutations and natural selection (the spreading of fit genes) helped each group adapt to its environment.
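    That spread can be sketched with the standard deterministic haploid selection recursion; the selection coefficient and starting frequency below are invented for illustration:

```python
# Deterministic haploid selection: an allele with relative fitness 1+s
# updates in frequency as p' = p(1+s) / (1 + p*s) each generation.
def next_freq(p, s):
    return p * (1 + s) / (1 + p * s)

p, s = 0.001, 0.05   # hypothetical: rare allele, 5% reproductive edge
history = [p]
for _ in range(400):
    p = next_freq(p, s)
    history.append(p)

print(round(history[-1], 3))  # the once-rare allele is now nearly fixed
```

    Nothing dies of loneliness here either: the advantaged variant simply out-replicates the rest until it dominates the population.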

    As far as humans are concerned we are able to think and adapt to our environment in fewer generations by using tools and gaining knowledge through longevity and language.

    The copying of information is accurate but still imprecise, which leads to a breakdown in which individual cells start replicating on their own (cancer) and ruin everything else. A single-celled organism can replicate indefinitely, but a colony of organisms needs some rules to prevent a hostile takeover. Each cell can replicate x times, with the original cell and daughter cells dividing x-1 times onward and counting their generation through the length of their telomeres. When a cell is the last of its line, it is likely full of errors, and replicating further might jeopardize the entire organism.

    With CRISPR, abundant sources of energy and huge computing power, there is no reason we should not be able to extend the lifetime of cells without dangerous trade-offs like cancer.

    Now how do I get grant funding for common sense?

  • As far as I can see, evolution does not require purpose or intentionality – which are conclusions interpreted by an observer – it simply requires a bundle of matter that has the ability to replicate itself with and without errors (mutations). In a stable environment, the error-free reproductions will continue to survive and thrive. Most errors will lead to changes that reduce the chance of survival, so they and their offspring are unlikely to survive for long when competition for resources is fierce. Occasionally a reproductive error may turn out to be advantageous, leading to a new strain gradually expanding amongst the general population. If the environment changes, causing the faithfully replicated offspring to struggle to survive, one of the many random mutations may prove advantageous, and so that strain may rapidly rise to dominance. None of this requires purpose – it's just mechanical reproduction with errors (with those errors providing potential flexibility).

    Sexual reproduction adds an additional layer of flexibility by allowing the traits from two strains to be shuffled in several ways, and very rapidly: some combinations may remove harmful traits from offspring (helpful in a stable environment), whereas others may deliver a bunch of traits that are a disadvantage now but may turn out to be helpful at times of environmental change.

    As far as the origin of life is concerned, I have no problem imagining a scenario where a bunch of simple molecules slamming together for millions of years may create more complex molecules – one of which may happen to have an architecture that allows it to act as a template for constructing a replica, sometimes with errors.
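    This "mechanical reproduction with errors" picture is easy to caricature in code; a toy sketch, with genome length, error rate and selection scheme all invented for illustration:

```python
import random

random.seed(1)
ENV = [1] * 20                       # hypothetical fixed "environment" target

def fitness(genome):
    # How many bits of the genome match the environment.
    return sum(b == e for b, e in zip(genome, ENV))

def copy_with_errors(genome, err=0.01):
    # Faithful copy, except each bit flips with small probability err.
    return [b ^ (random.random() < err) for b in genome]

pop = [[0] * 20 for _ in range(50)]  # start maximally ill-adapted
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    # The better-matching half out-replicates the rest (two copies each).
    pop = [copy_with_errors(g) for g in pop[:25] for _ in range(2)]

best = max(fitness(g) for g in pop)
print(best)  # adaptation emerges from blind copying errors plus culling
```

    No agent in the loop has a purpose; differential copying of error-prone replicators is enough to produce the appearance of adaptation.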

    "…cultured human cells seem able to replicate no more than 40 to 60 times… before they stop and become senescent."
    But the egg and sperm cells forming a new embryo must have replicated thousands of times across thousands of generations, isn't that so?

  • Jon R.: "And there are many such levels between the earliest macromolecules and life-as-we-know-it, Jim."

    Is that "Jim," as in Dr. McCoy to Captain James T. Kirk of the Starship Enterprise?

    Or perhaps just a generic, "Jim"?


  • Torbjörn Larsson—Thank you for an informative and thoughtful response. "Cosmology was not the purpose of the article" is absolutely right and I did pause a long while before pressing Submit, because I hate off-topic rants as much as anyone. I went ahead anyway because I also hate woolly assumptions that then become the basis, or at least part of the rationale, for major claims. The article makes some controversial points, not least the debatable and (I would argue) poorly evidenced claims about ageing, and shouldn't be allowed to adduce over-confident statements about physics generally.

    Thanks particularly for the note on dark energy etc. Perhaps I should have read more deeply before commenting on that. I certainly don't want to fall into the "… protests against expansive and robust knowledge …" category.

  • Very interesting article. We often see that there are many points of view on the same phenomenon, and each of these points of view can be true. This view of life reconciles the living world with the physical world. Sometimes we forget that living beings are complex systems interacting with the environment, which involves the manipulation of information. But these complex systems are also ruled by chemistry, and chemistry is an approximation of physics. So physics and information both have their place in explaining life. In addition, information can be transformed into energy, so information is part of physics. In a nutshell, it does not seem very weird that we can understand life thanks to physics (thermodynamics/information).

  • We need "Thumbs Up" / "Thumbs Down" and "100% Crank" clickable elements on these commentariat postings. I find the lack of interactivity here disturbing.

  • Agreeing with Jennifer Reitz's observations. Aging in this article is very naively explained as an accumulation of errors, a phenomenon described by M. Eigen (among others) as an "error catastrophe". The tolerable error limit, beyond which living organisms cannot sustain their information and therefore their lives, is evolutionarily preserved by the design of DNA repair enzymes, which are tuned to have specific error rates. Too little error and evolution ceases, too much and organisms die. Certainly biological aging can be slowed and life spans extended indefinitely. Biology used physical laws to evolve nanomachines, which obey those physical laws. These nanomachines can persist indefinitely as long as they can transduce energy and preserve their information.

  • For those who are following these sub areas of interest, they'll be sure to be interested in a few other resources/workshops in the recent past, some with available video (<a href="">BIRS: Biology and Information Theory</a> and <a href="">NIMBioS: Information and Entropy</a>) as well as the one that is happening this week at Arizona State University on <a href="">Quantifying Biological Complexity</a>.

    Thanks Philip for your continued best-in-class coverage of these areas of science.

  • "The teleology and historical contingency of biology … depends on chance and randomness, but natural selection gives it the appearance of intention and purpose."
    – How do we know it is just the "appearance" of intention?

    "there’s no way of defining a well-adapted organism except in retrospect. The “fittest” are those that turned out to be better at survival and replication, but you can’t predict what fitness entails"
    – Well said, but how do you reconcile this with so much conflicting literature out there?

    "A well-adapted entity can absorb energy efficiently from an unpredictable, fluctuating environment. It is like the person who keeps his footing on a pitching ship while others fall over because she’s better at adjusting to the fluctuations of the deck."
    – Why does it matter if something falls or not? And why on Earth but not on any other planet?

    "Entropy maximization has long been thought to be a trait of nonequilibrium systems. But the system in this model obeys a rule that lets it maximize entropy over a fixed time window that stretches into the future. In other words, it has foresight."
    – Is this how Information comes about? Is Information just maximizing entropy into the future?

    "Aging, too, has conventionally been seen as a trait dictated by evolution. Organisms have a lifespan that creates opportunities to reproduce, the story goes, without inhibiting the survival prospects of offspring by the parents sticking around too long and competing for resources. … "
    – Not all organisms age, which shows that aging is not absolutely necessary. We can all think of alternative methods to aging that would work just as well.

  • Excellent compilation of views; thank you, Philip, for this and for your books! However, some of the views presented here are not very convincing (even to a 15-year-old reader!).

    Can physics explain all aspects of biological life (and intelligence)? Unlikely, because higher levels of physical reality, with their new emergent properties, cannot always be reduced to, and adequately described in terms of, lower levels.

    Is modern science advanced enough to explain all observed physical phenomena? No. Modern physical laws cannot explain the Big Bang singularity, and explanations of the subsequent evolution of the universe deal only with low levels of physical reality (from elementary particles to molecules, with neither higher constructs nor a lower substrate).

    It's obvious that all living organisms are information-processing systems – this helps them predict reality and function in their habitat. But is the information processed by a living organism the same as the information at the particle or molecular level? No. These are different levels of organisation of information. Living organisms deal with information at a much more macroscopic level.

    Information storage/processing by our brains indeed comes at the cost of consuming "ordered energy" (e.g. trapped in food), and the cost/benefit ratio is critical. The strategic benefit is that practically useful/actionable knowledge (not just "meaningful" knowledge) is generated, empowering us to deal with and control physical reality (so life isn't just about gobbling up "negative entropy" simply to continue; an informational arsenal accumulates in the meantime). Perhaps by the time the hypothesised "heat death" of the universe approaches, civilisation (i.e. intelligence that has moved beyond the biological life stage) will have accumulated sufficient knowledge to manipulate its own physical basis (all the way down to the basic levels of reality, whatever they are).

    There is no contradiction between "energy processing" and "information processing" views of life. These refer to different processes. Life relies on physical levels – energy/particles/atoms/molecules. Brain's information processing is a much higher level process (and possibly gives rise to yet higher level, purely informational/virtual construct – the mind).

    I don't think a swarming particles based simulation can be equated with "intelligence". Development of true intelligence is impossible by cutting corners, and will require "internal" things such as the memory capable of encoding representation, the ability to focus attention, to learn, etc, as well as "external" dynamics (which form the raison d'etre of intelligence).

    The claim that the ageing process is purely physical is based on certain assumptions (about the energy available to an organism, etc.). One needs to check whether those assumptions don't themselves stem from evolutionary/biological reasons.

    Lastly and sadly, the most interesting idea of the article – that the "telos" of biological life emerges out of underlying physical reality – is not demonstrated by the research mentioned.

  • There are quite a few comments here related to information, information processing and the manipulation of information, but as yet I find no mention of what makes up the information we discuss. I put forth (for someone else to research, as I am not a physicist) that information is a particle, a wave, or perhaps a field: an "information particle" ("inforp"), an information wave-particle, or simply an information field. Information is critical and perhaps necessary in bringing order from disorder, and I propose that an information particle/wave/field has influence and exists as something concrete.

Comments are closed.