Evolution’s Random Paths Lead to One Place

A massive statistical study suggests that the final evolutionary outcome — fitness — is predictable.

Different strains of yeast grown under identical conditions develop different mutations but ultimately arrive at similar evolutionary endpoints.

Daniel Hertzberg for Quanta Magazine

In his fourth-floor lab at Harvard University, Michael Desai has created hundreds of identical worlds in order to watch evolution at work. Each of his meticulously controlled environments is home to a separate strain of baker’s yeast. Every 12 hours, Desai’s robot assistants pluck out the fastest-growing yeast in each world — selecting the fittest to live on — and discard the rest. Desai then monitors the strains as they evolve over the course of 500 generations. His experiment, which other scientists say is unprecedented in scale, seeks to gain insight into a question that has long bedeviled biologists: If we could start the world over again, would life evolve the same way?

Sergey Kryazhimskiy

Michael Desai, a biologist at Harvard University, uses statistical methods to study basic questions in evolution.

Many biologists argue that it would not, that chance mutations early in the evolutionary journey of a species will profoundly influence its fate. “If you replay the tape of life, you might have one initial mutation that takes you in a totally different direction,” Desai said, paraphrasing an idea first put forth by the biologist Stephen Jay Gould in the 1980s.

Desai’s yeast cells call this belief into question. According to results published in Science in June, all of Desai’s yeast varieties arrived at roughly the same evolutionary endpoint (as measured by their ability to grow under specific lab conditions) regardless of which precise genetic path each strain took. It’s as if 100 New York City taxis agreed to take separate highways in a race to the Pacific Ocean, and 50 hours later they all converged at the Santa Monica pier.

The findings also suggest a disconnect between evolution at the genetic level and at the level of the whole organism. Genetic mutations occur mostly at random, yet the sum of these aimless changes somehow creates a predictable pattern. The distinction could prove valuable, as much genetics research has focused on the impact of mutations in individual genes. For example, researchers often ask how a single mutation might affect a microbe’s tolerance for toxins, or a human’s risk for a disease. But if Desai’s findings hold true in other organisms, they could suggest that it’s equally important to examine how large numbers of individual genetic changes work in concert over time.

“There’s a kind of tension in evolutionary biology between thinking about individual genes and the potential for evolution to change the whole organism,” said Michael Travisano, a biologist at the University of Minnesota. “All of biology has been focused on the importance of individual genes for the last 30 years, but the big take-home message of this study is that’s not necessarily important.”

Sergey Kryazhimskiy

To efficiently analyze many strains of yeast simultaneously, scientists grow them on plates like this one, which has 96 individual wells.

The key strength of Desai’s experiment is its unprecedented size, which has been described by others in the field as “audacious.” The experiment’s design is rooted in its creator’s background: Desai trained as a physicist, and from the time he launched his lab four years ago, he has applied a statistical perspective to biology. He devised ways to use robots to precisely manipulate hundreds of lines of yeast, allowing him to run large-scale evolutionary experiments in a quantitative way. Scientists have long studied the genetic evolution of microbes, but until recently it was possible to examine only a few strains at a time. Desai’s team, in contrast, analyzed 640 lines of yeast that had all evolved from a single parent cell, an approach that made it possible to analyze evolution statistically.

“This is the physicist’s approach to evolution, stripping down everything to the simplest possible conditions,” said Joshua Plotkin, an evolutionary biologist at the University of Pennsylvania who was not involved in the research but has worked with one of the authors. “They could partition how much of evolution is attributable to chance, how much to the starting point, and how much to measurement noise.”

Courtesy of Sergey Kryazhimskiy

Fluid-handling robots like this one make it possible to study hundreds of lines of yeast over many generations.

Desai’s plan was to track the yeast strains as they grew under identical conditions and then compare their final fitness levels, which were determined by how quickly they grew in comparison to their original ancestral strain. The team employed specially designed robot arms to transfer yeast colonies to a new home every 12 hours. The colonies that had grown the most in that period advanced to the next round, and the process repeated for 500 generations. Sergey Kryazhimskiy, a postdoctoral researcher in Desai’s lab, sometimes spent the night in the lab, analyzing the fitness of each of the 640 strains at three different points in time. The researchers could then compare how much fitness varied among strains, and find out whether a strain’s initial capabilities affected its final standing. They also sequenced the genomes of 104 of the strains to figure out whether early mutations changed the ultimate performance.

Previous studies have indicated that small changes early in the evolutionary journey can lead to big differences later on, an idea known as historical contingency. Long-term evolution studies in E. coli bacteria, for example, found that the microbes can sometimes evolve to eat a new type of food, but that such substantial changes only happen when certain enabling mutations happen first. These early mutations don’t have a big effect on their own, but they lay the necessary groundwork for later mutations that do.
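
The enabling-mutation pattern described above can be sketched as a toy simulation. Everything below (the rates, the effect sizes, the function name `contingent_walk`) is invented for illustration and is not the model used in the E. coli studies:

```python
import random

def contingent_walk(seed, generations=1000):
    """Toy model of historical contingency: mutation B is beneficial
    only after enabling mutation A has occurred. A itself has no
    fitness effect. All rates and effect sizes are invented."""
    rng = random.Random(seed)
    has_a = has_b = False
    fitness = 1.0
    for _ in range(generations):
        if not has_a and rng.random() < 0.01:
            has_a = True                  # groundwork laid, no benefit yet
        elif has_a and not has_b and rng.random() < 0.01:
            has_b = True
            fitness += 0.5                # big payoff, contingent on A
    return has_a, has_b, fitness

print(contingent_walk(0))  # B, and its fitness gain, can only follow A
```

Lineages that happen not to draw A can never gain B’s payoff, which is the sense in which an early, individually neutral mutation shapes what comes later.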

Diminishing Returns

Desai’s study isn’t the first to suggest that the law of diminishing returns applies to evolution. A famous decades-long experiment from Richard Lenski’s lab at Michigan State University, which has tracked E. coli for thousands of generations, found that fitness converged over time. But because of limitations in genomics technology in the 1990s, that study didn’t identify the mutations underlying those changes. “The 36 populations we had then would have been much more expensive to sequence than the hundred they did here,” said Travisano, who worked on the Michigan State study.

More recently, two papers published in Science in 2011 mixed and matched a handful of beneficial mutations in different types of bacteria. When the researchers engineered those mutations into different strains of bacteria, they found that the fitter strains enjoyed a smaller benefit. Desai’s study examined a much broader combination of possible mutations, showing that the rule is much more general.

But because of the small scale of such studies, it wasn’t clear to Desai whether these cases were the exception or the rule. “Do you typically get big differences in evolutionary potential that arise in the natural course of evolution, or for the most part is evolution predictable?” he said. “To answer this we needed the large scale of our experiment.”

As in previous studies, Desai found that early mutations influence future evolution, shaping the path the yeast takes. But in Desai’s experiment, that path didn’t affect the final destination. “This particular kind of contingency actually makes fitness evolution more predictable, not less,” Desai said.

Desai found that just as a single trip to the gym benefits a couch potato more than an athlete, microbes that started off growing slowly gained a lot more from beneficial mutations than their fitter counterparts that shot out of the gate. “If you lag behind at the beginning because of bad luck, you’ll tend to do better in the future,” Desai said. He compares this phenomenon to the economic principle of diminishing returns — after a certain point, each added unit of effort helps less and less.
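
Desai’s gym analogy can be sketched in a few lines. This toy simulation is an assumption-laden illustration, not the study’s actual model: each beneficial mutation closes a fixed fraction of the gap to an arbitrary fitness ceiling, so a slow starter gains more per mutation and largely catches up.

```python
import random

def simulate(initial_fitness, generations=500, seed=None):
    """Toy model of diminishing returns: each beneficial mutation
    adds less as fitness approaches a ceiling. All numbers here are
    invented for illustration, not taken from the study."""
    rng = random.Random(seed)
    fitness = initial_fitness
    ceiling = 2.0
    for _ in range(generations):
        if rng.random() < 0.1:  # occasional beneficial mutation
            fitness += 0.1 * (ceiling - fitness)  # smaller gain when fitter
    return fitness

slow = simulate(1.0, seed=1)  # lineage that starts out unfit
fit = simulate(1.5, seed=2)   # lineage with a head start
print(round(slow, 3), round(fit, 3))  # the initial 0.5 gap nearly vanishes
```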

Scientists don’t know why all genetic roads in yeast seem to arrive at the same endpoint, a question that Desai and others in the field find particularly intriguing. The yeast developed mutations in many different genes, and scientists found no obvious link among them, so it’s unclear how these genes interact in the cell, if they do at all. “Perhaps there is another layer of metabolism that no one has a handle on,” said Vaughn Cooper, a biologist at the University of New Hampshire who was not involved in the study.

It’s also not yet clear whether Desai’s carefully controlled results are applicable to more complex organisms or to the chaotic real world, where both the organism and its environment are constantly changing. “In the real world, organisms get good at different things, partitioning the environment,” Travisano said. He predicts that populations within those ecological niches would still be subject to diminishing returns, particularly as they undergo adaptation. But it remains an open question, he said.

Nevertheless, there are hints that complex organisms can also quickly evolve to become more alike. A study published in May analyzed groups of genetically distinct fruit flies as they adapted to a new environment. Despite traveling along different evolutionary trajectories, the groups developed similarities in attributes such as fecundity and body size after just 22 generations. “I think many people think about one gene for one trait, a deterministic way of evolution solving problems,” said David Reznick, a biologist at the University of California, Riverside. “This says that’s not true; you can evolve to be better suited to the environment in many ways.”

This article was reprinted on Wired.com.

Reader Comments

  • I notice that the *external* characteristics (fecundity etc.) are said to match, whether yeast or fruit flies.

    However, the next important question is: Just how much and where do the genomes of each line that reaches the finish differ?

  • Dr. Desai’s study is impressive in the deployment of technology to the study of evolution and in the outcomes reported. I would have liked to see more emphasis on the concept of shared “selective pressure” and its role in convergent evolution (of organismal function) within the discourse of your article. I should like to thank you for brief comments on a question that has occupied me for some time, which is defining the contrast between evolution for fitness defined by increasing specialisation versus fitness defined by increasing flexibility/adaptability/evolvability; the former occurring in stable environments that favour rule-based approaches, and the latter occurring in changeable and unstable environments that demand regular invention and innovation.

  • There is a huge flaw in Desai’s design. He holds the environment constant. In a real evolutionary environment the life forms are themselves recreating the environment so the whole system is changed by the organisms in the system. That creates a divergent pressure that greatly magnifies differences over time. Hence the relative fitness of initially similar organisms in isolated environments (for example, finches isolated on islands for thousands of years) will diverge over time. Desai has unwittingly decided what the outcome must be and invalidated his own results.

  • @Lee Jamison, Michael Desai says:
    Basically I agree that in natural systems organisms will alter the environment, and that this could in principle lead to divergent selection. There are many other forces that could in principle lead to divergent selection as well. Our experiment was not designed to address all of these possibilities. Instead, we are focusing on one specific possible source of divergent pressures: the effects of initial mutations in shaping future evolutionary trajectories. We show that this specific aspect of evolutionary contingency leads to convergence rather than divergence at the fitness level.

    For what it’s worth, it’s also true that in our system the yeast presumably do change the environment as they evolve, though one could argue that there isn’t as much opportunity for them to do so as might be present in a natural system.

  • If there is a global optimal solution that maximizes the final fitness of the system, it is not so surprising that the organism evolves toward that solution after many generations, pretty much independent of the path taken. We see this all the time when using Genetic Algorithms (GAs) in optimization analyses. One can even start with a random set of initial genetic characteristics, and sure enough, given enough generations, one ends up at the same final place (unless there are other, almost-as-good solutions available to the system, particularly if they’re nearby in parameter space). This seems to be inherent in the nonlinear character of the system’s evolution and in the global nature of the optimization procedure. The investigators might want to use GA simulations in their analyses (if they are not already doing so) to get some insight into the evolutionary forces driving the organisms’ development.
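
A minimal sketch of the commenter’s point, with an invented one-peak fitness function f(x) = -(x - 3)² and arbitrary parameters: independent GA runs started from different random populations converge on the same optimum.

```python
import random

def run_ga(seed, generations=200, pop_size=20):
    """Minimal genetic algorithm on a one-peak fitness landscape
    f(x) = -(x - 3)**2. An illustrative sketch, not the study's model."""
    rng = random.Random(seed)
    pop = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # sort fittest first, keep the best half, refill with mutated copies
        pop.sort(key=lambda x: -(x - 3) ** 2, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [x + rng.gauss(0, 0.1) for x in survivors]
    return pop[0]  # best individual found

# independent runs from different random starting populations
results = [run_ga(seed) for seed in range(5)]
print([round(x, 2) for x in results])  # all cluster near the optimum x = 3
```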

  • Good stuff… A reasonable guess for the broad success of different strategies for survival in effectively the same environment, by apparently many different varieties of yeast cells, is to look at the role of biological degeneracies, which occur throughout biology. See “Degeneracy: a link between evolvability, robustness and complexity in biological systems” by James M Whitacre in Theoretical Biology and Medical Modelling, at http://www.tbiomed.com/content/pdf/1742-4682-7-6.pdf. This is possibly a breakout field which will transform our understanding of Complex Adaptive Systems in general.

    Also, it seems likely that all Prof. Desai’s yeasts are participating in an overall Genetic Programming pathway, busy ‘inventing’ their own separate ways to arrive at the same criterion: survival in a certain environment. See the Wikipedia entry on genetic programming, and see http://www.eecs.harvard.edu/~rad/courses/cs266/papers/koza-sciam03.pdf for a Scientific American article titled “Evolving Inventions” by Koza, Keane and Streeter.

  • It is curious how the yeast batches eventually arrive at roughly the same point. This also happens for many simple algorithms with inherent randomization. For example, randomized QuickSort chooses its pivot elements at random, so the famous sorting algorithm takes an entirely different path each time it runs on the same data set. Yet it arrives at the same cleanly sorted table on every run.

    A possible explanation is that the whole yeast experiment works like a very large optimization algorithm trying to optimize THE SAME cost function: the fitness of the species in its current environment (which is kept constant). Since the whole batch of yeast comes from a single initial condition, it becomes less surprising that they eventually arrive at the same point:

    If we represent the fitness of a species in a particular environment by a function mapping from the “species space” into the “fitness space”, the evolutionary process is searching to maximize this function. Will this function have a single peak? If yes, it is no surprise that all runs arrive at the same point. If not, the function might have multiple local optima, some fitter to the environment than others. In the yeast experiment, the optimization starts at the same point for each yeast sample. Each sample moves through the search space via very few mutations at each step, which can be roughly interpreted as minor movements in the search space. After some time, all successful yeast samples will find a local optimum, THE SAME local optimum, probably the one nearest their common starting point.

    The curious question is: if the experiment were redesigned such that the yeast samples do not all come from the same initial condition, would they again arrive at the same point? And what happens if they are cultured in different environments (which is equivalent to giving each sample a different cost function)? Would they again arrive at the same point?
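
The commenter’s picture, replicate populations founded from the same clone finding the same nearby local optimum, can be sketched as a simple hill climb. The two-peak landscape and all parameters below are invented for illustration, not taken from the study:

```python
import math
import random

def fitness(x):
    """Invented two-peak landscape: a low peak near x = 1 and a
    higher one near x = 6."""
    return math.exp(-(x - 1) ** 2) + 2 * math.exp(-((x - 6) ** 2) / 4)

def hill_climb(seed, start=0.0, steps=2000, step_size=0.05):
    """Accept a small random step only if it raises fitness
    (a crude stand-in for strong selection on new mutations)."""
    rng = random.Random(seed)
    x = start
    for _ in range(steps):
        candidate = x + rng.uniform(-step_size, step_size)
        if fitness(candidate) > fitness(x):
            x = candidate
    return x

# replicate "populations" founded from the same clone (start = 0.0)
ends = [hill_climb(seed) for seed in range(6)]
print([round(x, 2) for x in ends])  # all reach the peak nearest the start
```

Started instead near x = 5, the same climbers would find the higher peak near x = 6; which optimum is reached depends on the founding point, which is exactly the commenter’s open question.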

  • @Tivay: think you are onto something there – it is the cost function that is “driving” fitness and thus convergence of all the specific changes into one “vessel”.

  • What this actually tends to show is that mutations are not random but purposefully chosen.
    It’s the adaptive mutation process at work.

  • I am not a scientist. However, the statement “Desai’s robot assistants pluck out the fastest-growing yeast in each world — selecting the fittest to live on” seems problematic. Of course, this may only be a construct introduced in the article and not part of the researchers’ view. However, why is it that the fastest-growing yeast is the fittest? Surely somewhere in history there are counterexamples. I suppose one could correctly argue that in this system the fastest-growing is the fittest because that is the selection criterion. But does it hold true in general?

  • @ Randy Clayton. It’s true that the fastest-growing organism isn’t always the fittest; the precise characteristics that make up fitness depend on the environment the organism is adapting to. In this case, the microbes adapt by producing as many offspring as possible by the end of the growing cycle. Those are the organisms that get picked for the next round. Growing quickly is one way for the population to produce lots of offspring.

  • @ Roy Niles: I wouldn’t say that mutations are purposefully chosen. The term “adaptive mutation” is sometimes used to refer to situations where some organisms increase their mutation rates under stress.

  • I only recently discovered your magazine and so may have some unnatural need to connect a number of articles I’ve read. However, it seems to me that Martin Hairer’s (In Noisy Equations, One Who Heard Music) work on “regularity structures” within stochastic partial differential equations (SPDE) has a lot in common with the convergence Desai found in evolution’s random selection.

  • Would it be accurate to say the following?
    This experiment provides a controlled environment that is able to resolve the systems biology level of detail necessary to observe and study the speciation mechanism’s nonrandom granularity.

  • By stringently selecting only the fastest growing strains in each environment, the authors essentially rapidly depleted the available variation in the individual populations over time. This will have a negative effect on the rate of increase in fitness as the genetic variance in the population drops, would it not? Wouldn’t this explain the decreasing returns in fitness over time?

    Secondly, the selection seems to have been uniform for one trait only—relative speed of growth, across all of the individual identical environments. One has to wonder how much variation existed in the gene(s) responsible for this at the start, and what severely depleting that variation by such stringent selection would do. Is it that surprising, then, that the populations would converge on a similar adaptive “solution”? If so, where in the population genetics literature does it say so?

  • @ Dave Wisker. Thanks for your questions – I asked Desai for his thoughts, posted below:

    Q: By stringently selecting only the fastest growing strains in each environment, the authors essentially rapidly depleted the available variation in the individual populations over time. This will have a negative effect on the rate of increase in fitness as the genetic variance in the population drops, would it not? Wouldn’t this explain the decreasing returns in fitness over time?

    Desai: This is an interesting point, but the experiment was designed to avoid this complication. Most importantly, all of these populations start as clones, so there is no genetic variation to begin with. Selection therefore cannot act on and deplete existing variation. Instead, new mutations arise in each of the populations, and actually increase genetic variation over time faster than selection can deplete it. This leads to an acceleration in the rate of increase in fitness, at least in the short term. A few of our earlier papers have analyzed these effects in some detail (see e.g. Desai and Fisher, Genetics 176: 1759-98 (2007); Desai, Fisher, and Murray, Current Biology 17: 385-94 (2007); and Good, Rouzine, Balick, Hallatschek, and Desai, PNAS 109: 4950-4955 (2012)). In addition, it’s important to note that any effects of this sort could not explain the *difference* in the adaptation rates in populations started from different founding clones. These must arise from differences in the mutations available to different founders.

    Q: Secondly, the selection seems to have been uniform for one trait only — relative speed of growth — across all of the individual identical environments. One has to wonder how much variation existed in the gene(s) responsible for this at the start, and what severely depleting that variation by such stringent selection would do. Is it that surprising, then, that the populations would converge on a similar adaptive “solution”? If so, where in the population genetics literature does it say so?

    Desai: The selection pressure is actually probably rather complex (including effects from the saturation of cultures and other complications). But regardless, the experiment was designed to ensure that there is no variation for anything in the genes to begin with — each population started with a single clone. If instead we had started each population with a common pool of existing variation, then indeed it would be unsurprising if they converged on a similar adaptive solution. But we know that this didn’t happen in our experiment, both because of the way we set it up and because the mutations observed in different populations were entirely distinct (demonstrating that, as expected, adaptation comes from new mutations that arose independently in each line, not from common variation).

  • Great study!

    1- The experiment appears to show that mutations over successive generations create a small “drift” toward greater survival/fitness in the environment of the lab. However, it is impossible to distinguish whether the drift is due to the mutations or to differences between the lab environment and the environment where the strain of yeast originally evolved.

    2- Because the lab environment was not designed to stress the yeast, it is unclear whether selecting the “most fit” for the next generation produces results that differ significantly from an “average” or even “less fit” samples. It is not clear why mutations in the lab environment would have pointed the evolutionary process in any particular direction.

    3- To extract useful information about evolution from future experiments, it seems useful to modify the environment of samples with higher/lower temp, more/less CO2, and +/- magnetic charges.

    4- “Diminishing returns” is an economics/engineering term that may not apply in evolutionary settings, although it seems unlikely to cause any significant misunderstanding. It seems that you mean there is a diminishing, nonlinear effect of these evolutionary forces as they approach some end-phase of the process. With diminishing returns, more of one resource is applied to a system and output rises at a diminishing rate because of an imbalance between that added resource and the other resources used by the system. If that is indeed the phenomenon identified in the study, it suggests that other resources important to evolution have been omitted — e.g., temp, CO2, etc.
