A 3D Model of Earth
climate models

How Climate Scientists Saw the Future Before It Arrived

Over the past 60 years, scientists have largely succeeded in building a computer model of Earth to see what the future holds. One of the most ambitious projects humankind has ever undertaken has now reached a critical moment.

Mark Belan/Quanta Magazine

Introduction

A gust of wind strikes the Sahara Desert, launching a speck of dust into the air. For a week, the grain rides atmospheric currents halfway around the globe, reflecting sparkles of sunlight back to space along its journey. Other dust particles floating in the same breeze catch drafts up to Greenland, where they pepper glaciers and accelerate their melting, and down to the Amazon, where they fertilize the rainforest soil. But this particular speck hovers in the sky off the east coast of Florida, serving as a seed for water vapor to condense around and form a cloud. The dust then falls to Earth inside a raindrop, plopping down in the Atlantic Ocean, where it feeds iron to phytoplankton. The floating creatures bloom in fluorescent green swirls that soak up carbon dioxide emitted by factories and power plants all over the world.

In this way, a dust grain’s path traces the interwoven processes linking all parts of Earth across distances and scales. “It shows how the world is really connected,” said Martina Klose, an aerosol scientist at the Karlsruhe Institute of Technology in Germany. “It just demonstrates the beautiful complexity that we’re located in.”

For centuries, humans have sought to understand the intricate workings of our planet. As vulnerable critters, we crave some control over nature, or at least a handle on coming shifts in the weather and climate. But Earth is a chaotic beast, sensitive to innumerable tiny details; we can’t possibly keep track of every speck of dust. Therein lies the challenge of climate modeling: building a computer model of Earth’s surface and atmosphere that captures the gist of its behavior simply but effectively. “The goal of climate modeling is really to build a fake version of the Earth,” said Isla Simpson, an atmospheric scientist at the National Center for Atmospheric Research — a coarse-grained copy of the planet that’s stripped down to “the processes we think are relevant.”

Over the past 60 years, this effort has come to fruition. Generations of scientists have dedicated their careers to sculpting increasingly sophisticated planetary replicas. Computer models of Earth have helped us reconstruct past epochs, forecast long-term weather trends and, above all, understand how human activities are changing the climate.

From the very first computer simulations, climate models have shown that carbon dioxide released by the burning of fossil fuels warms the planet considerably. In the decades since, more advanced simulations have shown how a warming planet could trigger all sorts of calamities, from heat waves and superstorms to desertification and ecosystem collapse. According to modeling results compiled by the United Nations, Earth is on track to warm between 2.6 and 3.1 degrees Celsius over the course of this century. The last time Earth was that warm was around 3 million years ago, during the Pliocene epoch, when fires ravaged the Arctic and sea levels were some 50 feet higher than they are today.

A wall of dust sweeps across a desert landscape.

A dust storm rolls across Erg Chebbi, a section of the Sahara Desert in Morocco. Several billion tons of dust are lofted into the atmosphere each year, affecting the climate in complicated ways.

Pavliha/iStock

As the frightening futures foretold grow nearer, the details are also growing more precise. Climate scientists have reached a pivotal moment in which their predictions are being borne out, allowing them to recalibrate and hone their models. “We were essentially predicting worlds we couldn’t see for a very long time,” said Tiffany Shaw, a climate dynamicist and geophysicist at the University of Chicago. Watching the ramifications of climate change play out in the real world has both validated the models and highlighted their shortcomings. Now, modelers are exploring new approaches that could usher in the next generation of fine-grained models that make better regional predictions.

As climate modeling enters this critical phase of refinement, the effort faces its greatest challenge yet. Since taking office, the Trump administration has laid siege to the U.S. research ecosystem, with a particular focus on undermining the quest to track Earth’s climate. Decades of work are on the line as the administration strips funding, guts agencies, scrubs resources and buries datasets. “It’s a whole-scale destruction and not something that will be undone,” said Bjorn Stevens, a climate scientist and the director of the Max Planck Institute for Meteorology in Germany. “It’s a completely existential threat.”

Humans have largely succeeded in digitally reconstructing the Earth in order to ask what the future holds. Now, not liking the answer, some are reaching to unplug the machine — while others fight to perfect it. 

Prophesying the Skies

For thousands of years, would-be weather forecasters struggled to pin down the relevant factors. The ancient Egyptians, for instance, meticulously tracked the star Sirius, believing it to be a widowed goddess whose tears caused the flooding of the Nile.

Eventually we came to understand the true drivers of natural phenomena. The British polymath Lewis Fry Richardson was the first to try his hand at using the laws of physics to model the weather system. During World War I, between shifts as an emergency ambulance driver in France, Richardson calculated how the local weather would evolve over six hours, starting with the atmospheric conditions provided by weather balloon observations on a particular morning in 1910. He spent weeks completing the pencil-and-paper calculations, complaining that the atmosphere resembled London, in that both have far more going on than anyone could properly attend to. His results were inaccurate due to the poor quality of the observations, but they got the ball rolling. Richardson expressed the hope that “perhaps someday in the dim future it will be possible to advance the calculations faster than the weather advances.”

That day came as a result of the next world war. With funding from the U.S. military, the mathematician John von Neumann helped develop the first general-purpose digital computer, called ENIAC. Among its first applications was weather forecasting. In 1950, von Neumann and collaborators built a simple model of the North American atmosphere that came to the brink of realizing Richardson’s dream: a 24-hour forecast that took 24 hours to calculate.

Norman Phillips, von Neumann’s colleague at the Institute for Advanced Study in Princeton, New Jersey, then took numerical weather prediction to the next level. Phillips had been inspired by recent “dishpan” experiments, in which scientists would heat the outer rim of a dishpan filled with liquid while cooling the pan’s center to simulate the temperature difference between Earth’s equator and poles. Remarkably, this simple experiment could effectively capture global wind patterns. “One is almost forced to the conclusion that at least the gross features of the general circulation of the atmosphere can be predicted without having to specify the heating and cooling in great detail,” Phillips wrote. He constructed a bare-bones computer model of a cylindrical atmosphere that similarly recreated realistic wind circulation patterns.

A black-and-white photo of an army corporal setting switches on a room-size computer.

The Electronic Numerical Integrator and Computer, better known as ENIAC, is pictured in 1946, the year it was unveiled at the University of Pennsylvania. The machine was the world’s first electronic, general-purpose computer and could complete 5,000 additions per second. It was soon put to work on numerical weather simulations.

Wikimedia Commons

A colleague, Joseph Smagorinsky, went on to establish a federal institute, later known as the Geophysical Fluid Dynamics Laboratory (GFDL), dedicated to developing Phillips’ approach into a full-fledged global computer model of Earth’s atmosphere. Smagorinsky noticed the work of a graduate student at the University of Tokyo named Syukuro Manabe and recruited him to join his lab.

Manabe had a knack for simplifying Earth processes in ways that captured the essential ingredients. To start out, he simulated how radiation moves up and down along a column between the sun and Earth’s surface: The sun’s rays deliver energy to Earth, Earth radiates some heat back to space, and the leftover energy sets Earth’s atmosphere into motion. He focused on balancing this energy spreadsheet. “Manabe realized that this is the key for understanding the climate,” said Tapio Schneider, a climate scientist at the California Institute of Technology who as a graduate student overlapped with Manabe at GFDL. By 1965, Manabe had built this premise out into a 3D computer model of the atmosphere. The planet sitting below was dramatically oversimplified — just a smooth sphere with no geography or oceans. But if one squinted, the model behaved vaguely as our planet does, capturing, for instance, how warm air rises at the equator and creates the windless “doldrums” zones where sailors often get stranded.
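Manabe’s central premise — balancing the planet’s energy spreadsheet — can be illustrated with a standard textbook toy calculation (not his actual radiative-convective model): set the sunlight a planet absorbs equal to the heat it radiates back, and solve for the equilibrium temperature. The emissivity values below are conventional round numbers chosen for illustration.

```python
# Zero-dimensional energy-balance model: find the surface temperature at
# which absorbed sunlight exactly balances outgoing thermal radiation.
# All numbers are standard textbook values, not Manabe's actual code.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALBEDO = 0.30      # fraction of sunlight reflected straight back to space

def equilibrium_temp(emissivity):
    """Solve S0*(1-albedo)/4 = emissivity * sigma * T^4 for T (kelvin)."""
    absorbed = S0 * (1 - ALBEDO) / 4          # averaged over the sphere
    return (absorbed / (emissivity * SIGMA)) ** 0.25

t_bare = equilibrium_temp(1.0)    # no greenhouse effect: about 255 K (-18 C)
t_real = equilibrium_temp(0.612)  # effective emissivity with greenhouse: about 288 K
print(f"Without greenhouse effect: {t_bare:.0f} K")
print(f"With greenhouse effect:    {t_real:.0f} K")
```

The roughly 33-kelvin gap between the two answers is the greenhouse effect in its crudest possible form; Manabe’s contribution was to resolve how that energy balance plays out layer by layer through a full column of atmosphere.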

A picture of a young man in a striped shirt pointing a pencil at a map of Earth.

Syukuro Manabe at the Geophysical Fluid Dynamics Laboratory in 1972.

GFDL NOAA

“Many people think you have to build a model that mimics nature as realistically as possible,” Manabe said decades later. But “the more complicated you make a model, the harder it is to find out why it malfunctions.”

His pioneering approach relied on a key assumption: that large-scale climate processes can be inferred without precise knowledge of all small-scale weather events. Following Phillips’ scheme, Manabe broke his computerized, 3D atmosphere into a coarse grid of boxes. Inside each box, he assigned a statistical estimate of properties such as temperature or pressure. He then used the equations governing fluid flow to calculate how fluids and energy migrate between boxes. The model could capture global properties remarkably well.
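The grid-box scheme itself can be sketched in a few lines. The toy below is purely illustrative — a one-dimensional ring of coarse boxes, each holding a single averaged temperature, with a uniform wind carrying heat between neighbors via a first-order upwind rule — whereas real models solve the full three-dimensional fluid equations.

```python
# Minimal sketch of the grid-box idea: averaged temperatures on a ring of
# coarse boxes, advected by a constant wind with upwind finite differences.
# An illustration of the discretization only, not any real model's scheme.

N = 36                 # number of grid boxes around the ring
DX = 1_000_000.0       # box width in meters (~1000 km, very coarse)
U = 10.0               # constant eastward wind, m/s
DT = 3600.0            # one-hour time step, s (keeps U*DT < DX, so stable)

# Initial state: one warm box on an otherwise uniform ring.
temps = [280.0] * N
temps[0] = 300.0

def step(t):
    """Advance temperatures one time step with upwind differencing."""
    c = U * DT / DX    # Courant number, must stay below 1 for stability
    return [t[i] - c * (t[i] - t[(i - 1) % N]) for i in range(N)]

for _ in range(240):   # ten simulated days
    temps = step(temps)

# The warm anomaly drifts downwind and smears out; total heat is conserved.
print(f"Warmest box: {temps.index(max(temps))}, peak {max(temps):.1f} K")
```

The instabilities Arakawa tamed arise from exactly this kind of discretization: naive difference schemes can spuriously create or destroy energy at the grid points, and his schemes were built to conserve it.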

Manabe’s model offered a way to test the greenhouse effect — the century-old idea that certain gases trap heat that would otherwise be radiated from Earth back to space.

Varying the composition of his model atmosphere, Manabe noticed that the global temperature responded dramatically to carbon dioxide. “As it turned out, I changed the right variable and hit the jackpot,” he later recalled. In a landmark 1967 paper, he and Richard Wetherald calculated how much warmer Earth would become if the amount of carbon dioxide in the atmosphere doubled, an idea dating back almost a century to the Swedish physicist Svante Arrhenius. Carbon dioxide makes up only a minuscule fraction of the gas in the atmosphere, yet Manabe and Wetherald estimated that doubling its concentration would warm Earth by roughly 2.3 degrees Celsius. That’s impressively close to today’s estimates of around 3 degrees of expected warming, which scientists think could be reached by 2100. Manabe and Wetherald also predicted a telling sign of this CO2-driven global warming: the lowest layer of the atmosphere (where greenhouse gases pile up) should get hotter, while the layer above it should cool.
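The spirit of that doubling estimate can be reproduced with the widely used logarithmic fit for CO2 radiative forcing, scaled by a sensitivity parameter. This is a back-of-envelope sketch, not Manabe and Wetherald’s radiative-convective calculation, and the sensitivity value is an assumed round number within the modern range.

```python
import math

# Back-of-envelope CO2-doubling estimate using the common logarithmic
# forcing fit. Illustrative only; the sensitivity parameter LAMBDA is an
# assumed round number, not a value from Manabe and Wetherald's paper.

def co2_forcing(c, c0=280.0):
    """Radiative forcing in W/m^2 from raising CO2 from c0 to c ppm."""
    return 5.35 * math.log(c / c0)

LAMBDA = 0.8   # assumed climate sensitivity, kelvin per W/m^2

for ppm in (280, 420, 560):   # preindustrial, present-day, doubled
    dT = LAMBDA * co2_forcing(ppm)
    print(f"{ppm} ppm: forcing {co2_forcing(ppm):+.2f} W/m^2, warming {dT:+.1f} K")
```

With these assumed numbers, doubling CO2 yields roughly 3 kelvin of warming — in the same ballpark as both Manabe and Wetherald’s 1967 answer and today’s estimates. The logarithm also explains why a gas making up a minuscule fraction of the atmosphere matters so much: each doubling, from whatever starting concentration, adds about the same forcing.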

A blond man in a blue shirt looks down at the camera from above.

Tapio Schneider is one of many climate scientists who are trying to improve the modeling of clouds and other processes below the grid scale. At the California Institute of Technology, he has developed a new climate model that automatically calibrates itself with real-world data much more efficiently.

Sabrina Pirzada

Manabe’s initial model ignored the ocean, which makes up about 70% of Earth’s surface, so he worked to marry his swirling skies to the churning ocean models of his colleague Kirk Bryan. In the summer of 1969 — weeks before the first humans walked on the moon — Manabe and Bryan recreated the Earth. On their simulated planet, clouds released rain that froze into icebergs that melted and flowed down rivers, soaking soil and evaporating to rejoin the atmosphere.

The model got a lot wrong about our planet, and the joint ocean-atmosphere system never settled into a steady equilibrium state. Nevertheless, “that’s the first time you can say you have something like a real climate model,” said David Randall, an atmospheric scientist at Colorado State University who led a review of the last century of efforts to model the Earth system. The simulation was a major milestone in one of the most ambitious projects humankind has ever undertaken, later earning Manabe a share of the 2021 Nobel Prize in Physics. “And then,” Randall said, it was “off to the races.”

Eyes on the Earth

By the time Manabe and Bryan ran their coupled model at GFDL in Princeton (where the laboratory moved from Washington, D.C. in 1968), similar efforts were popping up across the country. On the other coast, Yale Mintz at the University of California, Los Angeles, had recruited Akio Arakawa, another graduate student from Tokyo. While Manabe’s mind was up in the clouds, Arakawa’s was down in the dirt. He focused on developing sophisticated ways to deal with small-scale effects inside grid boxes — working so diligently that he once failed to notice his wastebasket had caught fire. At the time, many simulations would derail after running for a few weeks, as rounding errors at the grid points artificially amplified atmospheric waves. Arakawa managed to mathematically tame these instabilities, and his scheme remains a bedrock of modern models.

These parallel efforts allowed simulations to grow progressively more detailed — and in so doing, called attention to the holes that needed patching. “When the models started improving in resolution, they found they were becoming less realistic,” said Isaac Held, an atmospheric and oceanic scientist at Princeton University who worked under Manabe’s supervision at GFDL in graduate school. In one instance, at finer resolution, the jet stream — a fast current of air that circles the Earth — migrated to the wrong location in the simulation. Researchers helped to balance this effect by accounting for the roughness of Earth’s surface, correcting “an accidental cancellation of errors in the original models,” Held said.

A photo of the GFDL building with its sign in the foreground.

The Trump administration’s proposed NOAA budget would shutter the Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey.

GFDL NOAA

From around 1979, NASA began systematically observing Earth with satellites, offering “a big step change in our ability to look at the planet,” said Simpson, the NCAR scientist. Orbiting observatories provided real-time observations of Earth’s surface, oceans, ice caps and atmosphere that improved models. Scientists could monitor the motion of heat and moisture in the atmosphere and directly measure how much radiation Earth returned to space.

Not long after, climate modeling escaped the lab and entered the public arena. Since Manabe’s CO2-doubling paper, scientists had gradually come to appreciate the extent to which the greenhouse effect would warm the planet and exacerbate weather fluctuations. (Because warm air holds more water, a hotter planet means both more droughts and more intense storms.) Meanwhile, researchers in Europe were learning to tease out warming signals from the random fluctuations of weather. Klaus Hasselmann, an oceanographer at the Max Planck Institute for Meteorology, developed a statistical method for separating the “fingerprints” of different climatic drivers, such as volcanic eruptions and the burning of fossil fuels — work for which he would eventually share the Nobel Prize with Manabe. The message was reinforced when observations reflected Manabe’s prediction that human-driven warming would cool the upper atmosphere.

In 1988, a catastrophic heat wave and drought plagued the United States, killing thousands of people and triggering over $80 billion in damage. Congress turned to scientists for answers. Manabe testified before a Senate committee, along with James Hansen, then the director of the NASA Goddard Institute for Space Studies. Hansen explained that 1988 was on track to be the warmest year on record and that he could say with 99% certainty that global warming, which spurred extreme weather events, was a reality. “The greenhouse effect has been detected, and it is changing our climate now,” Hansen said.

Photo of a seated man in a suit speaking into a microphone.

On June 23, 1988, James Hansen, then the director of NASA’s Goddard Institute for Space Studies, testified to Congress about the role of the greenhouse effect in causing what was then the warmest year on record. “Global Warming Has Begun, Expert Tells Senate,” ran the New York Times headline. This photo was taken when Hansen returned to Capitol Hill the following year.

Dennis Cook/Associated Press

Later that year, the United Nations established the Intergovernmental Panel on Climate Change (IPCC). Since then, climate scientists from around the world have regularly convened to compare models and to advise on global policy. The IPCC reinforced a push for modelers to collaborate and iterate. Researchers across institutions began contributing to community models that hundreds of other climate scientists could use to run experiments. Scientists worked to systematically probe and compare their predictions, running ensembles of many models together while slightly varying the input conditions or settings to assess the range of climate outcomes.
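The motivation for running ensembles — tiny differences in starting conditions ballooning into large differences in outcome — can be seen in miniature with the chaotic Lorenz-63 system, a classic toy that is vastly simpler than any climate model.

```python
# Sketch of why modelers run ensembles: integrate the chaotic Lorenz-63
# system from five nearly identical starting points and watch the
# trajectories fly apart. Real ensembles perturb full Earth-system
# models in the same spirit to map out the range of possible outcomes.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz-63 equations by one forward-Euler step."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

# Five ensemble members differing only in the sixth decimal place of x.
ensemble = [(1.0 + 1e-6 * i, 1.0, 1.0) for i in range(5)]

max_spread = 0.0
for _ in range(2000):                     # 20 time units of "weather"
    ensemble = [lorenz_step(s) for s in ensemble]
    xs = [s[0] for s in ensemble]
    max_spread = max(max_spread, max(xs) - min(xs))

print(f"Largest spread in x across members: {max_spread:.2f}")
```

Perturbations a millionth of a unit wide grow until the members disagree by whole units, which is why no single run can be trusted on its own: the ensemble's spread, not any one trajectory, is the forecast.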

At the core of all these simulations sat Manabe’s paradigm. “We’ve gotten better at doing pretty much the same thing,” Randall said. Steadily, climate modelers shrank the grid boxes and incorporated more complex effects, such as those of atmospheric dust.

But as climate scientists came together to pool their efforts, and as their models grew increasingly detailed, a funny thing happened. These digital worlds began to diverge in subtle but important ways from the behavior of our actual planet.

Questioning the Oracle

Climate models have yielded solid predictions about broad properties of Earth’s climate, such as rates of Arctic warming and the rise in global mean temperatures. However, “very few people live in the Arctic, and nobody lives in the global mean,” as Gavin Schmidt, a climatologist and director of the NASA Goddard Institute for Space Studies, put it. “The impacts of climate change are becoming felt at the local level.” The current generation of climate models can answer specific questions about local or regional phenomena, but the problem is, many of those more detailed predictions are being proven wrong. “We’re seeing various things starting to be apparent in trends that are different from what the models are predicting,” Held said.

One of the most notable discrepancies takes place in the tropical Pacific Ocean, which stretches from Indonesia to Ecuador. Contrary to model predictions, ocean temperatures on the western side of this band have warmed relative to the eastern side. The slipup matters because this stretch of ocean generates atmospheric waves that have been connected to droughts from California to Africa. Models’ predictions in this region therefore affect climate policies worldwide.

Additionally, in sharp contrast to all climate simulations, scientists have lately seen humidity levels dropping unexpectedly in southern Africa and the southwestern United States. The jet stream is strengthening more than expected, and heat extremes in Western Europe are increasing faster than anticipated. Models also failed to predict just how much higher average global temperatures would climb in 2023 than ever before.

In a paper published this year, Shaw of the University of Chicago and Stevens of Max Planck argued that these discrepancies have culminated in “the other climate crisis”: a breakdown in the standard schema of modeling. “The normal way of doing it has sort of lost its explanatory power,” Stevens said.

All climate models to date have relied on Manabe’s assumption, which Shaw and Stevens refer to as “large-scale determinism”: the idea that fine-scale processes can be approximated to match large-scale climate features. But they and other researchers feel the time has come to revisit this fundamental principle of climate modeling. To answer the more detailed, local questions we’re now asking of models, “we need to rethink how we deal with processes below the grid scale,” Schneider said.

Stevens has been leading an effort to shrink grid boxes to around one kilometer wide. (The current norm for global models is around 100 kilometers.) He thinks this scale represents a critical threshold that can resolve important “mesoscale” processes, from thunderclouds to ocean eddies, that were previously represented in terms of their average effect. And to capture the nuances of the Earth system, Stevens advocates cutting out as many estimations as possible and attempting to use physics all the way down. Recently, his group at Max Planck managed to run a one-kilometer model, one that included complex processes like carbon cycles and the effects of aerosols, that could simulate 90 days in 24 hours.

“Being able to resolve the mesoscale is transformational,” Randall said. “For 60 years, we haven’t been able to represent those things even though they’re among the most important weather systems out there.”

A high-resolution computer simulation of precipitating cumulus clouds, created by Tapio Schneider and colleagues at the California Institute of Technology.

Kyle Pressel

But researchers caution that ultra-high-resolution modeling is not a panacea. For one thing, we’re still “a long way off” from having the computational power to run such detailed models on the long timescales and large number of iterations needed, Simpson said. “This is a new frontier. I don’t think that’s the only path we should be following.”

One tool that appears poised to genuinely reshape climate modeling is artificial intelligence. While AI has not yet transformed climate simulations the way it has weather forecasting, it is beginning to make existing climate models more efficient by improving statistical representations and by automating model tunings. Some industry efforts, including one at the Allen Institute for Artificial Intelligence in Seattle, are now attempting to emulate the climate entirely with AI. “There’s nothing in there that is recognizably connected to the paradigm that we had,” Shaw said. But though these models seem to capture the atmosphere fairly well, she said, researchers have not yet managed to couple them to oceans or to outperform human-made models.

In the meantime, Shaw is concentrating her effort on understanding why current models are going wrong — even where they all agree with one another. “Model agreement became more of a gold standard than really understanding why the models agree,” she said. She’s focusing on hierarchical modeling, which involves blurring or turning off certain features in the models to reveal the essential processes underneath. “We need to be able to explain why we’re wrong just as well as why we’re right.”

The focus on addressing how models are formulated represents a fundamental shift in perspective within the field. “Climate scientists as a rule don’t like to talk about what they don’t know, because that’s so often been manipulated to cast doubt on what they do know,” Stevens said.

And the perfection of climate models matters less than the societal response to their central message, he argues. To that end, he helped found a European Union initiative called Destination Earth to bring climate models down from the ivory tower and into the hands of policymakers and the public. “A much higher-resolution model will not give you a much better climate policy decision,” said Wilco Hazeleger, a climate scientist at Utrecht University who also helped establish Destination Earth.

Hazeleger had grown frustrated by the slow pipeline for conveying climate model forecasts to government policymakers — a process that often takes more than a decade, he said. Destination Earth is developing a series of operational “digital twins” of the planet — global climate simulations with kilometer-scale resolution that downstream users, such as wind farm operators and city planners, can interact with directly and hopefully use to strategize.

Climate scientists have long been vexed by the disconnect between the dire warnings blared by their models and the restrained policies enacted by world leaders. After his 1988 testimony, Hansen was arrested multiple times while participating in climate protests. “The science of the situation is clear — it’s time for the politics to follow,” he wrote in a 2012 op-ed column criticizing the Obama administration for its hesitancy to curb carbon emissions. “We can’t wait any longer.” Today, three presidential terms later, the tension has escalated dramatically.

Unplugging the Machine

Clare Singer is the scion of a fabled academic family. She recently completed her Ph.D. at Caltech under Schneider, who studied under Held, who studied under Manabe. “My scientific upbringing was full of stories of the legends of GFDL,” she said. A year after finishing her Ph.D., she landed her dream job at the lab.

A woman smiles at the camera beneath some cherry blossoms.

Clare Singer landed her dream job at the Geophysical Fluid Dynamics Laboratory last year and worked to improve simulations of clouds. Then in February, her employment was terminated as part of cost-cutting efforts by the Department of Government Efficiency.

Courtesy of Clare Singer

She was tasked with helping future models better simulate clouds, which remain one of their biggest sources of uncertainty. Clouds have a large impact on the climate. But the amounts of sunlight they reflect and precipitation they drop depend on tiny particles of pollen, salt, soot, microbes or Sahara Desert dust at the center of every droplet — something global models could never hope to capture. Singer is pioneering a new technique to incorporate small-scale simulations that can track individual particles to better ascertain the competing warming and cooling effects of clouds.

On February 27, just four months into her position at GFDL, Singer received an email terminating her employment. Several of her colleagues around the office were reading the same message. Across the National Oceanic and Atmospheric Administration (GFDL’s parent agency), nearly 800 employees were fired that day. “It was total chaos,” said Zachary Labe, a climate scientist who was among the lab’s fallen.

When Labe joined GFDL full time last year to help predict extreme weather events, “this was one of the most secure research positions that was possible in the entire ecosystem.” But the situation quickly changed when the Department of Government Efficiency, led at the time by Elon Musk, began its demolition in the name of reducing waste in federal bureaucracy. Dismissed NOAA researchers were temporarily reinstated in mid-March after a federal court issued a restraining order to halt the terminations — only to be re-fired weeks later when that order was repealed.

For researchers, the dismantling of the birthplace of climate modeling stings. “That lab I see as part of our world cultural heritage,” Stevens said. “What you see is a conscious effort to destroy institutions which are in some ways the foundation of modern society.” But GFDL is only a microcosm of the Trump administration’s larger assault on climate modeling — and science writ large. Since January, the administration has cancelled billions of dollars in research grants and fired thousands of federal scientists. It has axed two seminal climate reports, moved to repeal the Environmental Protection Agency’s finding that greenhouse gases endanger public health, and evicted Schmidt’s entire team at the NASA Goddard Institute for Space Studies from its office (even though the lease will likely still be paid out until 2031). “There’s nothing rational about what’s happening right now,” Held said. “I think it’s a tragedy.”

NASA merged observations of the atmosphere with advanced Earth system models to depict aerosols circulating in Earth’s atmosphere over a six-week period in 2024. 

NASA’s Global Modeling and Assimilation Office and NASA’s Scientific Visualization Studio.

In May, the Trump administration released its fiscal 2026 budget request, which called for cutting National Science Foundation and NASA science budgets by more than half. The administration’s NOAA budget proposal, released a few weeks later, calls for eliminating the agency’s scientific research arm altogether, terminating over 1,000 additional employees and shuttering around a dozen institutes, including GFDL. It includes the line: “With this termination, NOAA will no longer support climate research grants.”

“The proposed budget is a disaster for science,” said one senior federal scientist who asked to remain anonymous out of fear of retaliation. “It’s existential for almost everything that any of the agencies are doing.” If scientists’ pleas fall short and the budget passes through Congress, the official warned, “The sky will go dark.”

Some climate researchers are pivoting to different fields, while others are seeking employment abroad. Efforts overseas will pick up some of the slack, but losing the continued observations and federal funding for the world’s leading climate research ecosystem would handicap the global collaborative effort to monitor the planet. “The United States has been very important in the past, and it’s just taking itself off the map,” Stevens said. “It’ll be a setback for everyone.” Even a course correction in the 2028 elections might not make up for the disruption of momentum. “It’s quicker to tear the building down than it is to build it up,” Randall said.

The biggest impacts will likely be felt by early-career researchers. The GFDL scientists dismissed in February re-entered the job market to find that many universities and federal labs had stopped hiring. “It’s a collapse of support for the next generation of scientists,” Labe said. Beyond the lack of employment opportunities, the blatant attack on climate science leaves some early-career researchers with “a deep existential crisis,” said one fired federal scientist who also requested anonymity. Modeling the climate “is an important thing that we do as a society,” the researcher added. “What does it mean if the country I live in no longer values that?”

In May, a handful of early-career meteorologists and climatologists organized a livestreamed virtual rally. For 100 consecutive hours, more than 200 scientists presented research and fielded questions from the public. Over those four days, viewers placed over 7,000 calls to their congressional representatives, urging them to prioritize funding for weather and climate science. The livestream closed with a message from one of the organizers, Jonah Bloch-Johnson, a climate scientist at Tufts University, who called the funding cuts “our own unnatural disaster in the making.” He encouraged listeners to continue marveling at the complexity of the Earth system — to appreciate how the clouds dance in the sky and how the waters ebb and flow. “This science belongs to you,” he said. “It’s the science of the world we all live in.”

The Dust Lingers

In 2014, a gust of wind struck the Sahara, launching a dust cloud into the atmosphere. After a few days’ travel, some of the specks landed on a buoy floating in the North Atlantic off the coast of French Guiana. When scientists collected this sample and analyzed it in the lab, they noticed that some of the grains were huge — 15 times bigger than the largest particles they thought could be swept overseas.

“We were all wondering, how can it be possible that they actually stay suspended in the air for so long?” said Klose, the Karlsruhe Institute aerosol scientist. Over the last few years, she and her colleagues have realized that these extra-coarse grains account for around 85% of the total dust mass in the atmosphere.

While they’re still not sure how these giant grains travel so far, they’re confident that they represent an overlooked climate variable. Dust was thought to mainly reflect sunlight, but larger grains primarily absorb it. In a new paper now under review, Klose and colleagues report that current models underestimate the impact of these particles on Earth’s energy balance by a factor of two, calling into question whether dust has an overall cooling effect on the climate, as previously suspected, or whether it’s actually amplifying warming. This uncertainty is critical, as over 5 billion tons of dust — around 1,000 times the weight of the Great Pyramid of Giza — are lofted into the atmosphere annually. And thanks to agriculture and other land-use changes, dust emission is only rising, having roughly doubled since the Industrial Revolution.

Scientists have been working to better track the journey of dust and more realistically simulate its climatic effects. NASA’s Earth Observing System operates three satellites that track properties of dust in the atmosphere. But in Trump’s proposed budget, all three are slated for cancellation.

Still, Klose is determined to keep an eye on the dust. Every few years, she brings tiny shovels and giant air-sucking machines to deserts across the world to collect samples. Then she transports those samples back to her lab in southern Germany and other labs, where her colleagues blow them inside a metal chamber to study how they stimulate cloud formation. Those results get fed directly into climate models to better represent how variations in tiny grains influence the nature of the entire planet.

“Obviously we can never, ever represent this in all its wonderful beauty in detail,” Klose said. Nevertheless, she said, she aims to learn as much as possible about the invisible intricacy of Earth before the dust settles. “We don’t have any plans to give up any time soon.”
