‘Digital Alchemist’ Seeks Rules of Emergence

Computational physicist Sharon Glotzer is uncovering the rules by which complex collective phenomena emerge from simple building blocks.

By Natalie Wolchover

Sharon Glotzer has made a number of career-shifting discoveries, each one the kind “that completely changes the way you look at the world,” she said, “and causes you to say, ‘Wow, I need to follow this.’”

A theoretical soft condensed matter physicist by training who now heads a thriving 33-person research group spanning three departments at the University of Michigan in Ann Arbor, Glotzer uses computer simulations to study emergence — the phenomenon whereby simple objects give rise to surprising collective behaviors. “When flocks of starlings make these incredible patterns in the sky that look like they’re not even real, the way they’re changing constantly — people have been seeing those patterns since people were on the planet,” she said. “But only recently have scientists started to ask the question, how do they do that? How are the birds communicating so that it seems like they’re all following a blueprint?”

Glotzer is searching for the fundamental principles that govern how macroscopic properties emerge from microscopic interactions and arrangements. One big breakthrough came in the late 1990s, when she was a young researcher at the National Institute of Standards and Technology in Gaithersburg, Maryland. She and her team developed some of the earliest and best computer simulations of liquids approaching the transition into glass, a common yet mysterious phase of matter in which atoms are stuck in place, but not crystallized. The simulations revealed strings of fast-moving atoms that glide through the otherwise frustrated material like a conga line. Similar flow patterns were later also observed in granular systems, crowds and traffic jams. The findings demonstrated the ability of simulations to illuminate emergent phenomena.

A more recent “wow” moment occurred in 2009, when Glotzer and her group at Michigan discovered that entropy, a concept commonly conflated with disorder, can actually organize things. Their simulations showed that entropy drives simple pyramidal shapes called tetrahedra to spontaneously assemble into a quasicrystal — a spatial pattern so complex that it never exactly repeats. The discovery was the first indication of the powerful, paradoxical role that entropy plays in the emergence of complexity and order.

Lately, Glotzer and company have been engaged in what she calls “digital alchemy.” Let’s say a materials scientist wants to create a specific structure or material. Glotzer’s team can reverse-engineer the shape of the microscopic building blocks that will assemble themselves into the desired form. It’s like whipping up gold from scratch — only in modern times, the coveted substance might be a colloidal crystal or macromolecular assembly.

Glotzer ultimately seeks the rules that govern emergence in general: a single framework for describing self-assembling quasicrystals, crystallizing proteins, or living cells that spontaneously arise from simple precursors. She discussed her eureka-studded path with Quanta Magazine in February; a condensed and edited version of the interview follows.

Reprinted by permission from Macmillan Publishers Ltd: Nature 462, 773-777, copyright (2009)

QUANTA MAGAZINE: Tell me about your famous 2009 Nature paper that linked self-assembly with entropy.

SHARON GLOTZER: Imagine if you had baseballs in a pool of water, and imagine that they had exactly the same density as the pool, so they didn’t sink, they didn’t float, they were just suspended, jostling about. Then you try to confine them all together. Self-assembly is what happens when the baseballs spontaneously organize themselves into a recognizable pattern. And if the particles are perfectly hard and have no other interactions, they will organize themselves to have the highest entropy possible.

So we were studying these tetrahedra — the simplest Platonic solid, the simplest three-dimensional shape, right? These are Dungeons & Dragons dice. I had an inkling that it would be interesting to look at how they like to arrange with one another based solely on entropy, meaning they had no direct interactions between them — they didn’t want to stick together; there’s no charges; there’s no nothing; there’s just entropy. But I had no idea how interesting. I had no inkling that they would form the kind of structures that they did.

You showed that tetrahedra organize into a quasicrystal — this really complex, ordered structure. People normally understand the law of increasing entropy as the tendency of things to get messier, but you’re saying entropy leads to order. Why is that not a paradox?

You’re absolutely right that it’s completely counterintuitive. We typically think entropy means disorder, and so a disordered structure would have more entropy than an ordered structure. That can be true under certain circumstances, but it’s not always true, and in these cases, it’s not. I prefer to think of entropy as related to options: The more options a system of particles has to arrange itself, the higher the entropy. In certain circumstances, it’s possible for a system to have more options — more possible arrangements — of its building blocks if the system is ordered.

What happens is the particles try to maximize the amount of space that they have to wiggle around in. If you can wiggle, you can rearrange your position and orientation. The more positions, the more options, and thus the more entropy. So you imagine these baseballs in water. They are moving around — translating, rotating. They’re jiggling, because of the thermal motion of the water molecules. And what these systems want to do is space out the particles enough so that it maximizes the amount of wiggle room available to all the particles. Depending on the particle shape, that can lead to extremely complicated arrangements.
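The hard-particle picture Glotzer describes — no attractions, no charges, only excluded volume — can be sketched in a few lines of code. Below is a minimal, hypothetical two-dimensional hard-disk Monte Carlo (a toy stand-in for the group’s actual hard-particle simulations, not their production code): trial moves that would overlap another disk are simply rejected, so everything the system does, including any ordering at high density, comes from entropy alone.

```python
import math
import random

def run_hard_disk_mc(n=16, density=0.5, steps=2000, seed=1):
    """Minimal Monte Carlo for hard disks in a periodic square box.

    The only 'interaction' is excluded volume: trial moves that would
    overlap another disk are rejected. Any structure that emerges is
    driven purely by entropy. (Toy sketch, not a production simulation.)
    """
    rng = random.Random(seed)
    sigma = 1.0                                  # disk diameter
    area = n * math.pi * (sigma / 2) ** 2 / density
    box = math.sqrt(area)                        # periodic box side length

    def dist2(a, b):
        # minimum-image distance squared under periodic boundaries
        dx = (a[0] - b[0] + box / 2) % box - box / 2
        dy = (a[1] - b[1] + box / 2) % box - box / 2
        return dx * dx + dy * dy

    # start from a square lattice so the initial state has no overlaps
    side = math.ceil(math.sqrt(n))
    pos = [((i % side + 0.5) * box / side, (i // side + 0.5) * box / side)
           for i in range(n)]

    accepted = 0
    for _ in range(steps):
        i = rng.randrange(n)
        trial = ((pos[i][0] + rng.uniform(-0.2, 0.2)) % box,
                 (pos[i][1] + rng.uniform(-0.2, 0.2)) % box)
        # reject only if the move would create an overlap (hard core)
        if all(dist2(trial, pos[j]) >= sigma ** 2
               for j in range(n) if j != i):
            pos[i] = trial
            accepted += 1
    return pos, box, accepted / steps
```

At high density, rejection of overlapping moves is the only organizing force — runs of simulations like this (with spheres, rods, or polyhedra instead of disks) are how entropy-driven ordering is observed.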

So particles like tetrahedra and baseballs evolve to states that allow them to wiggle in more ways and therefore have higher entropy. Did people know before that you could get order from entropy?

It’s been known that entropy alone can cause platelets and rodlike particles and spherical particles to align, but those ordered phases were pretty simple. It wasn’t really thought of as being such an important driving force for organization. When we did this tetrahedra computer experiment and got out what is still today the most complicated entropically stabilized structure that anyone has ever seen, that really changed the way people looked at this.

So then my group started studying every shape under the sun. We just started throwing all kinds of convex shapes onto the computer, and we kept getting one crystal structure after another, some of them very complicated. In 2012 we published a paper in Science where we studied 145 different shapes and showed that 101 of them self-assembled into some kind of complicated crystal. Since then, my group has done tens of thousands of different shapes. We published one paper with 50,000 shapes in it.

What are some of the things you’re figuring out?

The kinds of questions I’m after now are: There’s this whole database of all the crystal structures that are known. And all these “space groups,” meaning structures that can obey all these different symmetry operations [rotations and translations that leave the structures unchanged]. There’s a couple hundred of those. Can I get every one of them just with entropy? With colloidal particles [like what you find in gels], even without interactions we’ve already been able to get as many as 50 of the known space groups. Are there any that aren’t possible just with entropy? And if so, why? We’ve also started looking at mixtures of shapes. We haven’t even talked about complicated crazy shapes, and concave shapes. So how far can you go with just entropy? And what does it mean that I can form the same structure in a whole bunch of different ways? There’s something much more fundamental to understand about the organization of matter, and by focusing on shape and entropy, we’re getting to the core of that.

One of the things we’ve noticed is that there are some design rules. For example, when your polyhedra have big, flat facets, they want to align so that their facets are facing each other — because this gives more wiggle room, more ways of arranging the particles. But if you have lots of facets that are all differently sized, then it’s harder to predict. You might end up with a glassy system or a jammed system instead of an ordered structure.

In the past couple of years, you’ve started working backward.

We’re basically doing alchemy in the computer. The ancient alchemists wanted to transmute the elements and turn lead into gold. But imagine that you had a particular structure and wanted to know what shape is the best shape to get the structure. That’s what many materials scientists are doing now — trying to turn the problem on its head. This “inverse design” approach is different from the way you might screen for compounds, for example, or find protein crystals. In that case you do simulation after simulation after simulation, where you’re just running tons of different molecules and saying: Which one gives me what I want?

Inverse design is more strategic. We start with a target structure, and use statistical thermodynamics to find the particle that solves the design problem. What we did is, we extended the way that these kinds of simulations are typically done to include shape as a variable. We can now do a single simulation where we let the shape of the building blocks change on the fly in the simulation and let the system tell us what the best one is. So instead of running thousands of simulations, I can run one and have the system tell me: What’s the best building block for the desired structure? So I call it digital alchemy.
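The core move — letting shape change on the fly, sampled with the same Metropolis rule as the particle coordinates — can be illustrated with a stripped-down, hypothetical sketch. Here `shape_score(alpha)` is a made-up stand-in for the expensive measurement of how well particles of shape `alpha` assemble the target structure; the annealing loop below is a schematic of the idea, not Glotzer's actual method.

```python
import math
import random

def design_shape(shape_score, steps=5000, seed=0):
    """Toy sketch of 'digital alchemy': a shape parameter is treated as
    a Monte Carlo degree of freedom, sampled with a Metropolis rule, so
    the simulation itself drifts toward the shape whose assembled
    structure best matches the target.

    `shape_score(alpha)` is a placeholder objective (higher = better),
    standing in for a full assembly simulation at shape `alpha`.
    """
    rng = random.Random(seed)
    alpha = rng.uniform(0.0, 1.0)    # e.g. a vertex-truncation parameter
    score = shape_score(alpha)
    best_alpha, best_score = alpha, score
    temp = 1.0
    for _ in range(steps):
        trial = min(1.0, max(0.0, alpha + rng.uniform(-0.05, 0.05)))
        trial_score = shape_score(trial)
        # Metropolis: always accept improvements, sometimes accept worse
        if trial_score >= score or rng.random() < math.exp((trial_score - score) / temp):
            alpha, score = trial, trial_score
            if score > best_score:
                best_alpha, best_score = alpha, score
        temp *= 0.999                # cool toward the optimum
    return best_alpha, best_score
```

With a toy objective peaked at some target shape, e.g. `design_shape(lambda a: -(a - 0.37) ** 2)`, the loop homes in on that shape in a single run — the point being that one simulation, not thousands, reports back the best building block.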

You’ve also thought about how entropy might have played a role in the origin of life.

Most scientists think that to have order you need chemical bonds — you need interactions. And we’ve shown that you don’t. You can just have objects that, if you just confine them enough, can self-organize. So if you go to the question of: What was the first self-organizing of stuff, and how did it happen? You could imagine that you had these tiny microscopic crevices in rocks with water, and there were molecules in there that could self-organize just due to entropy, for exactly the reasons that I was just describing. So it’s a completely different way to think about life and increasing complexity. They’re compatible with each other, but this is just saying: I know, because I’ve done this, that I can take a bunch of objects and put them in a little droplet and shrink the droplet a little, and these objects will spontaneously organize. So maybe that phenomenon is important in the origin of life, and I don’t think that’s been considered.

When did you first become fascinated with emergence?

When I went to graduate school at Boston University, I joined an experimental lab. I spent the whole year basically designing a flange for a sputtering chamber. I was not inspired. I like puzzles, I like computers, I like math. One day the vacuum pump blew up on me and I was covered in pump oil. And I came walking out of the lab, and a professor, Gene Stanley, saw me and said, “You look like a theorist; come talk to me.” By the end of the day I had switched and joined his group and it was one of the most life-changing decisions I ever made. With Stanley, I was studying kinetics of phase-separating polymer mixtures. I was looking at, for example, what happens if the polymers are stuck to each other, or linked. What kind of structures could you get when there’s competing driving forces — one that wants polymers to separate and one that wants them to mix? What emergent phenomena come from that? Back then I didn’t use that language to describe it, but that’s when I figured out that I like this idea of unpredictable emergent complexity coming out of simple things.

You oversee (at last count) 27 graduate students, half a dozen postdocs and support staff. That’s rather a lot.

I started with two. Then I had four. Over time it got bigger and bigger because, well, I love working with students! When a student comes to me and they’re so excited to join the group, they’ve read our papers and they think it’s awesome, and there’s something about them that makes it obvious that they should be in the group — they’re nerds like us — I have a really hard time saying no, and so I try to find a way to support them.

Once you get beyond a certain size your group naturally develops a structure, and it almost becomes self-sustaining in that new people come in the group and the more senior students take them under their wing. The postdocs are working with graduate students, and you end up having teams. And I just love it because all the time, new stuff is coming out; it’s just magic.

Is that emergence?

That’s emergence! It is emergence! When the group gets big enough all of a sudden, and you have the right mix of people, it’s amazing some of the directions that are coming out that I never would have anticipated before.

 

Editor’s Note: Glotzer was named a Simons Investigator in 2012.

This article was reprinted on Wired.com.