
To Decode the Brain, Scientists Automate the Study of Behavior

Machine learning and deep neural networks can capture and analyze the “language” of animal behavior in ways that go beyond what’s humanly possible.

To track the movements of animals engaged in natural behaviors, scientists are increasingly turning to machine learning methods. In this video, the DeepPoseKit algorithm automatically tracks the movements and orientations of desert locusts in a container to provide researchers with data about their collective behavior.

Graving et al. DOI: 10.7554/eLife.47994

Introduction

The quest to understand what’s happening inside the minds and brains of animals has taken neuroscientists down many surprising paths: from peering directly into living brains, to controlling neurons with bursts of light, to building intricate contraptions and virtual reality environments.

In 2013, it took the neurobiologist Bob Datta and his colleagues at Harvard Medical School to a Best Buy down the street from their lab.

At the electronics store, they found what they needed: an Xbox Kinect, a gaming device that senses a player’s motions. The scientists wanted to monitor in exhaustive detail the body movements of the mice they were studying, but none of the usual laboratory techniques seemed up to the task. So Datta’s group turned to the toy, using it to collect three-dimensional motor information from the animals as they explored their environment. The device essentially rendered them as clouds of points in space, and the team then analyzed the rhythmic movement of those points.

Datta’s solution might have been unorthodox at the time, but it’s now emblematic of a wave of automated approaches that are transforming the science of behavior. By studying animals’ behaviors more rigorously and quantitatively, researchers are hoping for deeper insights into the unobservable “drives,” or internal states, responsible for them. “We don’t know the possible states an animal can even be in,” wrote Adam Calhoun, a postdoctoral fellow who studies animal behavior at Princeton University.

Tracing those internal states back to specific activity in the brain’s complex neural circuitry presents a further hurdle. Although sophisticated tools can record from thousands of neurons at once, “we don’t understand the output of the brain,” Datta said. “Making sense of these dense neural codes is going to require access to a richer understanding of behavior.”

That richer understanding may not remain out of reach much longer. Capitalizing on advances in machine learning, scientists are building algorithms that automatically track animals’ movements, down to tiny changes in the angle of a fly’s wing or the arch of a mouse’s back. They’re also creating pattern-finding tools that automatically analyze and classify this data for clues about animals’ internal states.

A key advantage of these methods is that they can pick up on patterns that humans can’t see. In a paper published last month in Nature Neuroscience, Calhoun, with the Princeton neuroscientists Mala Murthy and Jonathan Pillow, built a machine learning model that used behavioral observations alone to identify three internal states underlying the courtship behavior of fruit flies. By manipulating the flies’ brain activity, the researchers were then able to pinpoint a set of neurons that controlled those states.

The work on motion tracking and behavioral analysis that made these findings possible represents a technological revolution in the study of behavior. It also indicates that this success is just one of many to come. Scientists are now applying these methods to tackle questions in neuroscience, genetics, evolution and medicine that seemed unsolvable until now.

Logs and Inventories

Armed with pen, paper and stopwatch, scientists have been quantifying animal behavior in the wild (and in their labs) for decades, watching their subjects sleep and play and forage and mate. They’ve tallied observations and delineated patterns and come up with organizational frameworks to systematize and explain those trends. (The biologists Nikolaas Tinbergen, Konrad Lorenz and Karl von Frisch won a Nobel Prize in 1973 for independently performing these kinds of experiments with fish, birds and insects.)

The inventories of behaviors arising from this work could get extremely detailed: A description of a mouse’s grooming in a 1973 Nature article involved a “flurry of forelimbs below face” and “large synchronous but asymmetric strokes of forelimbs over top of head,” with estimates of how likely such gestures might be under different circumstances. Researchers needed to capture all that detail because they couldn’t know which aspects of the observed behaviors might turn out to be important.

Some scientists have taken the opposite tack, reducing animals’ behavioral variability to its bare bones by putting them in controlled laboratory settings and allowing them to make only simple binary decisions, like whether to turn left or right in a maze. Such simplifications have sometimes been useful and informative, but artificial restrictions also compromise researchers’ understanding of natural behaviors and can cause them to overlook important signals. “Having a good grasp on the behavior is really the limiting factor for this research,” said Ann Kennedy, a postdoctoral researcher in theoretical neuroscience at the California Institute of Technology.

That’s why scientists have set out to modernize the field by “thinking about behavior more quantitatively,” according to Talmo Pereira, a graduate student in the labs of Murthy and Joshua Shaevitz at Princeton. Instrumental in that makeover has been the automation of both data collection and data analysis.

Tracking Snouts, Spines and Tails

Image-capture technology has always been crucial for tracking the poses of animals in motion. In the 1800s, Eadweard Muybridge used stop-motion photography to tease apart the mechanics of horses running and people dancing. The photos made it easier to mark accurately, say, where an animal’s legs were frame by frame, or how its head was oriented. When video technology arrived, researchers were able to take more precise measurements — but these still tended to be based on coarse quantities, such as an animal’s speed or its average position. Tracking every movement through three dimensions was impossible. And all the video annotations still had to be laboriously logged into a computer by hand, a process that wasn’t much of an improvement on the older method of drawing in notebooks.

In the 1980s, researchers started adapting computer vision algorithms, which were already being used to find edges and contours in images, for animal behavior problems like tracing the outlines of flies on a surface. Over the next few decades, systems were developed to label the location of an animal in each frame of a video, to differentiate among multiple organisms, and even to start identifying certain body parts and orientations.

Still, these programs weren’t nearly as efficient as scientists needed them to be. “There were a few glimmers of what the future was to hold,” said Iain Couzin, director of the Max Planck Institute of Animal Behavior in Germany. “But nothing really sophisticated could happen until very, very recently, until the advent of deep learning.”

With deep learning, researchers have started to train neural networks to track the joints and major body parts of almost any animal — insects, mice, bats, fish — in every frame of a video. All that’s needed is a handful of labeled frames (for some algorithms, as few as 10 will do). The output appears as colored points transposed over the animal’s body, identifying its nose, tail, ears, legs, feet, wings, spine and so on.

The number of programs that do this has exploded in the past couple of years, fueled not only by progress in machine learning, but by parallel work on mapping human motion by moviemakers, animators and the gaming industry.

Algorithms label and track the body parts of two flies (left) and two mice (right).

New methods can track the postures of diverse animals as they interact. In these videos, the algorithm SLEAP automatically labels and follows the body parts of a pair of courting flies (left) and of two mice exploring their environment.

(Flies) Junyu Li, Mala Murthy lab, Princeton University; (mice) John D’Uva and Mikhail Kislin, Samuel S.-H. Wang lab, Princeton University

Of course, for the kinds of motion capture relevant to Hollywood and Silicon Valley, it’s easy for people to wear bodysuits studded with markers that the systems can readily spot and follow. That data can then be used to build detailed models of poses and movements. But bodysuit solutions weren’t really an option in the world of animal studies.

Five years ago, Jonathan Whitlock, a neuroscientist at the Norwegian University of Science and Technology, started hunting for another way to mark the mice he studied. He tried anything he could think of: He and his colleagues shaved the animals’ fur and tagged them with infrared reflective ink. They dabbed a suspension of glass beads, usually used in reflective road paint, onto the animals’ backs. They daubed glowing ink and polish on the animals’ joints. The list goes on, but to no avail: Sometimes the markers simply weren’t bright enough to be tracked, and sometimes they made the mice anxious, disrupting their behavior.

Eventually, Whitlock’s team settled on using tiny pieces of reflective tape stuck to three points along the animal’s back to reconstruct the movements of the spine, and a tiny helmet with four additional pieces of tape to track head movements. “That alone already was sufficient to open up a whole new world for us,” Whitlock said.

Composite video that shows a fly walking on a spherical treadmill, with annotations of the joint angles in its legs; a graph of the changing joint angles as the fly walks; and an isolated view of the joint-angle models in three-dimensional space.

By activating certain neurons in a fly, researchers made the insect walk backward on a spherical treadmill (lower left). A deep learning method measured the changing angles of joints in the fly’s legs over time (top), and projected the movements of the legs, abdomen and antennae into three-dimensional space (lower right).

Semih Günel and Pavan Ramdya, EPFL; DOI: 10.7554/eLife.48571

But many researchers wanted to move past using markers at all, and they wanted to track more than seven points on their animals. So, by combining insights from previous work on both animals and humans, multiple labs have created easy-to-use systems that are now seeing widespread application.

The first of these systems came online last year. DeepLabCut was developed by the Harvard neuroscientists Mackenzie Mathis and Alexander Mathis, who repurposed a neural network that was already trained to classify thousands of objects. Other methods followed in rapid succession: LEAP (Leap Estimates Animal Pose), developed by Pereira and others in the labs of Murthy and Shaevitz; SLEAP, the same team’s forthcoming software for tracking the body-part locations of multiple interacting animals at once; and the Couzin group’s DeepPoseKit, published a few months ago.
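
For a sense of what using one of these tools involves, below is a minimal sketch that loosely follows DeepLabCut’s documented Python workflow; the exact function signatures vary across versions, and the project name, video path and labeled body parts are placeholders.

```python
# A rough sketch of a markerless pose-tracking workflow, loosely following
# DeepLabCut's documented Python API. Exact function names and arguments vary
# by version; the project name and video path here are hypothetical.
import deeplabcut

# Create a project around one example video.
config_path = deeplabcut.create_new_project(
    "mouse-openfield", "lab", ["videos/mouse_session1.mp4"]
)

# Extract a small set of frames and hand-label body parts in the GUI; for
# some tools, a few dozen (or even around 10) labeled frames can be enough.
deeplabcut.extract_frames(config_path)
deeplabcut.label_frames(config_path)

# Train the network on the labeled frames, then run it over whole videos.
deeplabcut.create_training_dataset(config_path)
deeplabcut.train_network(config_path)
deeplabcut.analyze_videos(config_path, ["videos/mouse_session1.mp4"])

# The result is a table of (x, y, confidence) coordinates for every tracked
# body part in every frame: the "colored points" overlaid on the animal.
```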

“It can learn really fast,” Murthy said of LEAP. “Within 10 or 15 minutes, it can be trained to run automatically on all of your videos.” Other groups are working on modeling poses in three dimensions rather than two, by calibrating similar models using multiple cameras.
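
Those 3D extensions rest on classic multi-view geometry: once two or more calibrated cameras detect the same body part, its position in space can be triangulated. The sketch below shows that step using OpenCV; the camera matrices and pixel detections are invented and would, in practice, come from calibration and a pose tracker.

```python
# A minimal sketch of the geometric core of multi-camera 3D pose estimation:
# triangulate one body part seen by two calibrated cameras. The intrinsics,
# camera placement and pixel detections are made-up placeholders.
import numpy as np
import cv2

# Shared intrinsics (hypothetical): focal length 600 px, principal point (320, 240).
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])

# Camera 1 at the origin; camera 2 shifted 100 mm along x.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])

# The same body part (say, a paw) detected in each view, in pixels.
pt_cam1 = np.array([[320.0], [240.0]])
pt_cam2 = np.array([[200.0], [240.0]])

# Triangulate to homogeneous coordinates, then normalize to 3D.
X_h = cv2.triangulatePoints(P1, P2, pt_cam1, pt_cam2)
X = (X_h[:3] / X_h[3]).ravel()
print("estimated 3D position (mm):", np.round(X, 1))   # roughly (0, 0, 500)
```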

“Under the hood, these technologies can be incredibly sophisticated,” Couzin said, “but now they’re actually amazingly easy to apply to a very broad range of problems, from how a mouse’s whiskers move to ant behavior to fish schooling.”

Whitlock has found that in the mice he studies, particular movements and positions are encoded throughout regions of the cortex involved in coordinated movement — and perhaps more widely. “These parts of the brain really care a lot about how the animal is holding its head,” he said. “This is an aspect of cortical processing that we just simply haven’t appreciated before” because researchers hadn’t been able to track freely moving animals.

By delineating posture, the algorithms open a window into a deeper understanding of behavior. Essentially, all measurable behaviors are “changes in posture through time,” Whitlock said. “And we’ve got posture. We’ve nailed that.”

Because pose-tracking software has simplified data collection, “now we can think about other problems,” said Benjamin de Bivort, a behavioral biologist at Harvard University. Starting with: How do we define the building blocks of behavior, and how do we interpret them?

A Hidden Language

Attempts to answer these questions have long relied on the observer’s intuition — “immaculate perception,” as ethologists (animal behaviorists) jokingly call it. But intuition is hobbled by inherent biases, a lack of reproducibility, and difficulty in generalizing.

The zoologist Ilan Golani at Tel Aviv University has spent much of the past six decades in search of a less arbitrary way to describe and analyze behavior — one involving a fundamental unit of behavior akin to the atom in chemistry. He didn’t want behaviors to be tagged simply as courting or feeding. He wanted those characterizations to arise “naturally,” from a common set of rules grounded in an animal’s anatomy. Golani has his own model of what those units and rules should look like, but he thinks the field is still far from arriving at a consensus about it.

Other researchers take the opposite position, that machine learning and deep learning could bring the field to a consensus sooner. But while DeepLabCut, LEAP and the other cutting-edge pose-tracking algorithms rely on supervised learning — they’re trained to infer the locations of body parts from hand-labeled data — scientists hope to find and analyze the building blocks of behavior with unsupervised learning techniques. An unsupervised approach holds the promise of revealing the hidden structure of behaviors on its own, without humans dictating every step and introducing biases.

An intriguing example of this appeared in 2008, when researchers identified four building blocks of worm movement that could be added together to capture almost all the motions in the animal’s repertoire. Dubbed the “eigenworm,” this compact representation offered a quantitative way to think about behavioral dynamics.
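
At its heart, the eigenworm analysis is a low-dimensional decomposition of posture: describe each frame by the bending angles along the body, then find a few basis shapes whose weighted sum reconstructs nearly any observed posture. The toy sketch below illustrates the idea with principal component analysis on synthetic data.

```python
# A toy illustration of the "eigenworm" idea: describe each video frame by
# the bending angles along the animal's body, then find a handful of basis
# shapes whose weighted sum reconstructs almost any observed posture.
# The data here are synthetic; the real analysis used worm centerline angles.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_frames, n_segments = 5000, 48

# Fake postures: mixtures of a few sinusoidal bending modes plus noise.
s = np.linspace(0, 2 * np.pi, n_segments)
modes = np.stack([np.sin(s), np.cos(s), np.sin(2 * s), np.cos(2 * s)])
weights = rng.standard_normal((n_frames, 4))
angles = weights @ modes + 0.05 * rng.standard_normal((n_frames, n_segments))

# PCA finds the dominant posture modes and how much variance each explains.
pca = PCA(n_components=6).fit(angles)
print("variance explained by the first four components:",
      round(float(pca.explained_variance_ratio_[:4].sum()), 3))

# Each frame can then be summarized by just four numbers (its projection
# onto those components) instead of 48 raw angles.
coords = pca.transform(angles)[:, :4]
print("first frame in eigen-coordinates:", np.round(coords[0], 2))
```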

Datta took this approach to a whole new level with his Xbox Kinect hack in 2013, and he was quickly rewarded for it. When he and his colleagues looked at the data describing the movements of the mice, they were surprised to immediately see an overarching structure within it. The dynamics of the animals’ three-dimensional behavior seemed to segment naturally into small chunks that lasted for 300 milliseconds on average. “This is just in the data. I’m showing you raw data,” Datta said. “It’s just a fundamental feature of the mouse’s behavior.”

Those chunks, he thought, looked an awful lot like what you might expect a unit of behavior to look like — like syllables, strung together through a set of rules, or grammar. He and his team built a deep neural network that identified those syllables by dividing up the animal’s activity in a way that led to the best predictions of future behavior. The algorithm, called Motion Sequencing (MoSeq), spat out syllables that the researchers would later name “run forward” or “down and dart” or “get out!” In a typical experiment, a mouse would use 40 to 50 of them, only some of which corresponded to behaviors for which humans have names.
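
Whatever the specifics of the model, the core move is to carve a continuous stream of pose data into a small vocabulary of recurring, briefly held states. The sketch below illustrates that with a plain hidden Markov model on fabricated pose features; it is a simplified stand-in, not MoSeq itself.

```python
# A simplified stand-in for syllable discovery: fit a hidden Markov model to
# a continuous stream of pose features and read off a discrete state label
# for every frame. The pose features below are fabricated; this is meant to
# show the segment-into-recurring-states idea, not to reproduce MoSeq.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(2)

# Fake pose features: the animal alternates among three "motifs",
# each producing two features centered on a different mean.
means = np.array([[0.0, 0.0], [3.0, 1.0], [-2.0, 2.5]])
true_states = np.repeat(rng.integers(0, 3, size=200), 30)   # ~30-frame chunks
X = means[true_states] + 0.5 * rng.standard_normal((len(true_states), 2))

# Fit an HMM with more states than we expect and let it decide which to use.
model = hmm.GaussianHMM(n_components=5, covariance_type="full",
                        n_iter=100, random_state=0)
model.fit(X)
labels = model.predict(X)

# Run lengths of the state sequence ~ how long each "syllable" lasts, in frames.
changes = np.flatnonzero(np.r_[True, labels[1:] != labels[:-1], True])
runs = np.diff(changes)
print("states actually used:", len(np.unique(labels)))
print("mean syllable duration (frames):", round(float(runs.mean()), 1))
```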

“Their algorithms can pull out behaviors that we don’t have words for,” Whitlock said.

Now researchers are trying to determine the biological or ecological significance of these previously overlooked behaviors. They’re studying how the behaviors vary between individuals or sexes or species, how behavior breaks down with age or disease, and how it develops during learning or in the course of evolution. They’re using these automatic classifications to discern the behavioral effects of different gene mutations and medical treatments, and to characterize social interactions.

And they’re starting to make the first connections to the brain and its internal states.

Predicting Brain States and Behaviors

Datta and his colleagues discovered that in the striatum, a brain region responsible for motor planning and other functions, different sets of neurons fire to represent the different syllables identified by MoSeq. So “we know that this grammar is directly regulated by the brain,” Datta said. “It’s not just an epiphenomenon, it’s an actual thing the brain controls.”

Intriguingly, the neural representation of a given syllable wasn’t always the same. It instead changed to reflect the sequence in which the syllable was embedded. By looking at the activity of the neurons, for instance, Datta could tell whether a certain syllable was part of a very fixed or very variable sequence. “At the highest level,” he said, “what that tells you is that the striatum isn’t just encoding what behavior gets expressed. It’s also telling you something about the context in which it’s expressed.”

He supported this hypothesis further by testing what happened when the striatum no longer worked properly. The syllables themselves remained intact, but the grammar became scrambled, the sequences of actions seemingly more random and less adaptive.

Other researchers are looking at what’s going on in the brain on longer timescales. Gordon Berman, a theoretical biophysicist at Emory University, uses an unsupervised analysis technique called Motion Mapper to model behavior. The model, which places behaviors within a hierarchy, can predict hierarchical neural activity in the brain, as demonstrated in a paper published by a team of researchers at the University of Vienna two weeks ago. (Berman says that “an aspirational goal” would be to someday use Motion Mapper to predict social interactions among animals as well.)
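
Methods in this family build an unsupervised map of behavior: pose dynamics are converted into spectral features and embedded in a low-dimensional space, where recurring movements pile up into dense islands. The toy sketch below gestures at that general pipeline; it is not the published Motion Mapper code, and all of its data are fabricated.

```python
# A toy sketch in the spirit of unsupervised behavior-mapping methods:
# turn pose time series into short-time spectral features, embed them in two
# dimensions, and treat dense regions of the map as candidate behavior modes.
# This is an illustration on random data, not the published pipeline.
import numpy as np
from scipy.signal import spectrogram
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
angles = rng.standard_normal((12, 10_000))   # 12 joint angles over time (fake data)

# Short-time spectral features for each joint angle, stacked per time bin.
feats = []
for a in angles:
    f, t, Sxx = spectrogram(a, fs=100, nperseg=64, noverlap=48)
    feats.append(np.log1p(Sxx))
X = np.concatenate(feats, axis=0).T          # (time bins, joints x frequencies)

# Nonlinear 2D embedding; recurring movements pile up into dense islands.
embedding = TSNE(n_components=2, perplexity=30).fit_transform(X)
density, _, _ = np.histogram2d(embedding[:, 0], embedding[:, 1], bins=60)
print("occupancy of the densest bin in the behavior map:", int(density.max()))
```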

And then there’s Murthy and her team, and their search for hidden internal states. They had previously created a model that used measurements of the flies’ movements to predict when, how and what the male fly would sing. They discovered, for example, that as the distance between the male and female flies decreased, the male was likelier to produce a particular type of song.

In the work recently published in Nature Neuroscience, the scientists extended this model to include potential hidden internal states in the male flies that might improve predictions about which songs the flies would produce. The team uncovered three states, which they dubbed “Close,” “Chasing” and “Whatever.” By activating various neurons and examining the results with their model, they discovered that a set of neurons that had been thought to control song production instead controlled the fly’s state. “It’s a different interpretation of what the neuron is doing in the service of the fly’s behavior,” Murthy said.
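
The underlying logic is that the same sensory cue can map onto different behavior depending on a latent internal state, and that the state sequence can be inferred from behavior alone with standard hidden-Markov machinery. The toy sketch below illustrates only that general idea; its transition matrix, cue model and data are all invented, and the published model is considerably richer.

```python
# A toy version of the hidden-internal-state idea: the same sensory cue
# (distance to the female) maps onto different song probabilities depending
# on which latent state the fly occupies, and the state sequence is inferred
# from behavior with a standard HMM filtering recursion. Every number, the
# logistic cue model and the fake data are invented for illustration.
import numpy as np

states = ["Close", "Chasing", "Whatever"]        # state names from the study
A = np.array([[0.90, 0.08, 0.02],                # made-up transition matrix
              [0.10, 0.85, 0.05],
              [0.05, 0.05, 0.90]])
bias = np.array([1.0, -0.5, -3.0])               # how song-prone each state is
gain = np.array([-0.8, -0.2, 0.0])               # how strongly distance matters

def p_sing(state, distance_mm):
    """Probability of producing song given the latent state and the cue."""
    return 1.0 / (1.0 + np.exp(-(bias[state] + gain[state] * distance_mm)))

# Observed data (fake): per-frame distance to the female and whether he sang.
rng = np.random.default_rng(3)
distance = rng.uniform(1, 8, size=500)
sang = rng.integers(0, 2, size=500)

# Forward filtering: posterior over the hidden state at each frame.
alpha = np.full(3, 1.0 / 3)
posteriors = []
for d, y in zip(distance, sang):
    like = np.array([p_sing(k, d) if y else 1 - p_sing(k, d) for k in range(3)])
    alpha = like * (alpha @ A)
    alpha /= alpha.sum()
    posteriors.append(alpha)

occupancy = (np.array(posteriors).argmax(axis=1)[:, None] == np.arange(3)).mean(axis=0)
print("fraction of frames assigned to each state:",
      dict(zip(states, np.round(occupancy, 2))))
```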

They’re now building on these findings with SLEAP. “It’ll be really exciting to see what kind of hidden states this type of model is able to tease out when we incorporate higher-resolution pose tracking,” Pereira said.

The scientists are careful to note that these techniques should enhance and complement traditional behavioral studies, not replace them. They also agree that much work needs to be done before core universal principles of behavior will start to emerge. Additional machine learning models will be needed, for example, to correlate the behavioral data with other complex types of information.

“This is very much a first step in terms of thinking about this problem,” Datta said. He has no doubt that “some kid is going to come up with a much better way of doing this.” Still, “what’s nice about this is that we’re getting away from the place where ethologists were, where people were arguing with each other and yelling at each other over whether my description is better than yours. Now we have a yardstick.”

“We are getting to a point where the methods are keeping up with our questions,” Murthy said. “That roadblock has just been lifted. So I think that the sky’s the limit. People can do what they want.”

Editor’s note: The work by Bob Datta, Jonathan Pillow and Adam Calhoun is funded in part by the Simons Foundation, which also funds this editorially independent magazine.

Animated pose-model of a walking fly courtesy of Pierre Karashchuk, Tuthill/Brunton labs, University of Washington; anipose.org
