Managed Agriculture Hinders Predictability of Critical Zone Features

1 May 2026 at 13:32
Sunrise over a crop field and a small lake.
Editors’ Highlights are summaries of recent papers by AGU’s journal editors.
Source: AGU Advances

The critical zone (CZ) refers to the layer of Earth extending from the bedrock up to the vegetation canopy, including interconnected systems such as river and floodplain corridors, the active soil and root zone, and the near-surface environment where plants interact with the atmosphere. The conservation of the CZ requires a detailed understanding of how it evolves under anthropogenic impacts, such as intensive agriculture.

Goodwell et al. [2026] use a data-driven approach to relate shifts in the critical zone to indicators of human impact. Their findings deliver new insights into transitions, drivers, and predictability in many contexts, and support better prediction and management of the critical zone under environmental change.

In particular, the authors find evidence of abrupt shifts in the variability of key features, such as stream and soil chemistry and land–atmosphere interactions, that can be attributed to intensive management, for instance mechanized planting and harvesting. These human-impacted and naturally occurring regimes in critical zone dynamics have implications for understanding processes and making predictions of the status of the critical zone under environmental change.

Data-driven methods include clustering of time-series data to detect regimes and dimensionality reduction to simplify system dynamics and identify main sources of variability. Credit: Goodwell et al. [2026], Figure 1
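As a rough illustration of this kind of framework (a minimal sketch on synthetic data, not the authors’ actual pipeline or dataset), dimensionality reduction followed by clustering can flag an abrupt regime shift in a multivariate series:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic multivariate series: a hypothetical stand-in for critical zone
# signals (e.g., stream chemistry, soil moisture), with two regimes.
n, d = 400, 5
X = rng.normal(0.0, 1.0, size=(n, d))
X[200:] += 5.0  # abrupt mean shift at t = 200 mimics a regime change

# Dimensionality reduction: project onto the leading principal component.
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ vt[0]

# Two-means clustering of the 1-D scores to label regimes.
c = np.array([pc1.min(), pc1.max()])
for _ in range(20):
    labels = (np.abs(pc1 - c[0]) > np.abs(pc1 - c[1])).astype(int)
    c = np.array([pc1[labels == 0].mean(), pc1[labels == 1].mean()])

# The detected change point is where the label sequence first flips.
change_point = int(np.argmax(labels != labels[0]))
print(change_point)  # → 200
```

Here the synthetic series, the PCA projection, and the two-means step are all stand-ins; the published framework applies more sophisticated versions of these methods to real critical zone observations.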

Citation: Goodwell, A. E., Saccardi, B., Dere, A., Druhan, J., Wang, J., Welp, L. R., et al. (2026). Detecting regimes of critical zone processes, drivers and predictability with a data-driven framework. AGU Advances, 7, e2025AV002098. https://doi.org/10.1029/2025AV002098

—Alberto Montanari, Editor-in-Chief, AGU Advances

The logo for the United Nations Sustainable Development Goal 15 is at left. To its right is the following text: The research reported here supports Sustainable Development Goal 15. AGU is committed to supporting the United Nations 2030 Agenda for Sustainable Development, which provides a shared blueprint for peace and prosperity for people and the planet, now and into the future.
Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The Persistence of PFAS

1 May 2026 at 03:55
A person’s right arm extends into the frame from the right over a running stream. The gloved hand holds a test tube partially filled with water that’s just been collected; a partial droplet of water is collecting at the bottom of the tube.

This month, Eos is taking a long look at “forever chemicals.” Per- and polyfluoroalkyl substances (PFAS) have been percolating through our industrial environment since the 1940s. They help make products nonstick, waterproof, and stain resistant. They also make their way into air, soil, and water, as well as our bodies, where they have been linked to impaired immune systems, developmental delays in children, and some cancers.

Since the discovery that PFAS might be harmful to human and environmental health, researchers and industries have reformulated the chemicals into novel substances. The behaviors of these novel PFAS are proving difficult to pin down, as Grace van Deelen explores in her feature “Chemical Companies Are Churning Out New PFAS. Where in the World Are They Ending Up?”

From the deep ocean to alpine glaciers, scientists are being forced to play “chemical Whac-A-Mole” to study novel PFAS, one scientist told van Deelen. Researchers are also searching for—and finding—PFAS in the isolated interior of the White Continent, as described in Rebecca Owen’s “Tracing the Path of PFAS Across Antarctica.”

Once PFAS have been identified, scientists work to disarm them with filtration, heat, and even sunshine. In an innovative approach, “A Peculiar Polymer Paired with Sunlight Could Remove PFAS,” writes Emily Gardner.

Another option is to put PFAS to work. Read about how scientists are using trifluoroacetic acid, a less toxic PFAS, to gain a rough idea of how recently an aquifer has been recharged in Saima May Sidik’s “Pollution Is Rampant. We Might As Well Make Use of It.”

As PFAS permeate our environment in different ways, scientists are taking the lead in developing proactive approaches to search for, study, and maybe take the “forever” out of “forever chemicals.”

—Caryl-Sue Micalizio, Editor in Chief

Citation: Micalizio, C.-S. (2026), The persistence of PFAS, Eos, 107, https://doi.org/10.1029/2026EO260135. Published on 30 April 2026.
Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Chemical Companies Are Churning Out New PFAS. Where in the World Are They Ending Up?

1 May 2026 at 03:55
A large fjord with rocky, snow-covered mountains in the background

On a rocky archipelago in the North Atlantic Ocean, staff at the Faroese Environment Agency and the Faroe Marine Research Institute regularly sample tissues from the North Atlantic long-finned pilot whales that roam the waters around the islands. The archive of these samples stretches back to the 1980s and has helped researchers determine the reach of human-made contaminants in the remote marine environment.

Jennifer Sun is one of those researchers. Sun studies PFAS—per- and polyfluoroalkyl substances, commonly known as “forever chemicals”—at Harvard University and is the lead author of a recently published study that analyzed how these toxic chemicals have accumulated in pilot whale tissue over the past 2 decades.

Using samples of whale tissue collected between 2001 and 2023, Sun and her colleagues measured a parameter called bulk extractable organofluorine, which shows the overall amount of organofluorine-containing chemicals (including PFAS) in the tissue. They then used a more targeted analysis able to confirm the identity of 28 specific chemicals out of thousands of possible PFAS formulations.

Three chunks of red and pink whale tissue on a white plastic surface
The pilot whale tissue showed an expected decrease in the concentrations of older PFAS but an unexpected scarcity of newer PFAS chemicals. Credit: Jennifer Sun

The study’s results showed an expected decrease in the concentrations of older PFAS but an unexpected absence of newer PFAS chemicals. This anomaly could be indicative of an emerging question in PFAS research: Where are the newest PFAS going?

Prolific PFAS

There are two general categories of PFAS. The first includes legacy PFAS such as perfluorooctanoic acid (PFOA) and perfluorooctane sulfonic acid (PFOS). Chemical manufacturers produced these compounds in the 1970s, 1980s, and 1990s for products including nonstick cookware and food packaging and in industries such as fabric waterproofing, industrial manufacturing, and firefighting.

Legacy PFAS were phased out in the early 2000s, and novel PFAS were made to replace them. The term “novel” is independent of chemical properties and instead refers to when the chemicals’ production began, though novel PFAS typically have formulations meant to reduce their persistence in the environment. For example, many novel PFAS molecules have shorter chains of fluorinated carbons than their legacy counterparts.

Novel PFAS include possibly millions of different chemical structures, and their production and use are increasing globally.

A diagram shows the general structure of a PFAS molecule, which includes a “head group” connected to a chain of fluorinated carbon atoms.
A generic PFAS molecule includes a compound head connected to a tail of fluorinated carbons. Older PFAS generally have longer tails (seven or eight carbons) than newer ones. Credit: Mary Heinrichs/AGU, after https://bit.ly/pennstate-ext-pfas

In the United States and elsewhere, regulatory structures that limit PFAS production target specific chemicals, such that every new formulation by a company must be tested individually before restrictions are put in place. With companies continually conjuring new PFAS formulations—which environmental advocates often call “regrettable substitutions” for their sometimes harmful effects—understanding the fate and transport of novel PFAS is difficult and time-consuming. Research on the behavior of specific PFAS may be a drop in the bucket when millions of potential PFAS, with millions of potential behaviors, pose current and future risks to people and the environment.

Scientists like Sun are determined to untangle how the fate of these new chemicals differs from their predecessors. As Sun expected, the phaseout of legacy PFAS was reflected in the pilot whale tissue she tested. These results are good news; they clearly show that the bans on legacy PFAS are working.

“We’re still finding [older] compounds, but clearly, they are no longer as abundant in the environment as they used to be, which is a positive,” said Bridger Ruyle, an environmental engineer at New York University who studies PFAS and assisted Sun and her coauthors in deciding which methods to use for the new study.

But Sun and her colleagues also expected an overall increase in concentrations of novel PFAS—after all, production of these chemicals is higher than ever, and researchers finally had the analytical tools to catch them.

“The inference is, if it’s not in the whales, and it’s not in the ocean…where is it?”

That wasn’t what they found. Instead, all but two of the emerging PFAS they tested for were virtually nowhere to be seen in the whale tissue, leaving the scientists leading the study to wonder where novel PFAS were accumulating or if instrumentation was limiting their detection.

“We do know that the novel PFAS are being produced, which means they’re going somewhere. Where they are, and how exposed people and other wildlife are, is not as clear,” Sun said.

“The inference is, if it’s not in the whales, and it’s not in the ocean…where is it?” asked Elsie Sunderland, an environmental scientist at Harvard University and coauthor of the new study.

Sun and Sunderland’s question—asking where novel PFAS are going—is one scientists are probing from multiple angles. Those who study particle transport are asking how novel PFAS might travel through Earth’s water and air. Those on the chemistry side of the investigation are deducing how novel PFAS might break down. And those who monitor environments are looking for traces of novel PFAS in various corners of Earth.

The answers to their questions have direct, practical implications for human and environmental health and could indicate whether a growing proportion of harmful PFAS may be ending up in close proximity to humans—where we work and eat and breathe.

A Toxic Legacy

The chemical properties of PFAS have made the chemicals useful since the 1940s. These same properties also make them highly persistent—the most durable types may not break down in the environment for several thousand years.

PFAS are linked to certain cancers and other human health harms. Much of the available data linking PFAS to poor health come from analyses of legacy PFOA and PFOS. They show an association between increased exposure to these chemicals and altered immune and thyroid function, liver and kidney disease, reproductive system disruptions, and more.

Chemical manufacturers phased out production of legacy PFAS after scientific evidence emerged associating PFAS and human health harms, businesses began to lose money in massive lawsuits, and regulations tightened. Novel PFAS were intended to show properties similar to legacy PFAS but were meant to break down more easily in the environment (lower persistence) and accumulate less easily in living tissue (lower bioaccumulative ability), though studies have shown mixed results about whether novel PFAS are actually safer for humans or break down more easily.

Because PFAS production data are often proprietary, scientists who study PFAS, like Sun, must rely on partial inventories of PFAS production or reverse-engineer those numbers from observations in the environment.

“We call it chemical Whac-A-Mole.”

Without a clear list of the chemical structures of novel PFAS, scientists don’t always have the analytical standards necessary for routine detection. And once scientists do understand the behavior of a PFAS chemical, it may be quickly replaced by another, unknown alternative. “We call it chemical Whac-A-Mole,” Sunderland said.

Legacy PFAS tend to have a high affinity for water and typically end up in the ocean, the place scientists refer to as the chemicals’ “terminal sink.” Many legacy PFAS also entered the ocean through atmospheric transport such as rain or snow. But because of the sheer number of chemical formulas and the chemical differences between legacy and novel PFAS, the pathways that novel PFAS take through the environment are less clear.

Tracking the movement and accumulation of novel PFAS in the environment is crucial for understanding how these chemicals may affect ecological and human health.

Still, the science is inconclusive about whether novel PFAS are moving or accumulating differently than their legacy counterparts, whether they have a different terminal sink, and where that terminal sink may be.

Close to Home

One possible answer to the question of the missing novel PFAS may have to do with geography. The chemicals may not have reached pilot whales in the Faroes because something about the new chemistry has led them elsewhere in the environment. To Sun, evidence suggests “that a lot of these novel PFAS, which we know are being produced, may not be transporting out into this more remote environment either at all or as quickly.”

Novel PFAS might be accumulating closer to their sources—and closer to us. “It may simply be that some of the replacement PFAS don’t make it all the way out into the open ocean. But if they are still in the terrestrial environment and the near-coastal environment, then wildlife and people who live close to the sources can be exposed,” said Frank Wania, an environmental chemist at the University of Toronto Scarborough.

For example, one study monitored PFAS in coastal beluga whales in Canada’s St. Lawrence Estuary, relatively close to human communities and PFAS manufacturing sources. The study showed increasing concentrations of unregulated novel PFAS in whale tissue from 2000 to 2017, while concentrations of legacy PFAS declined.

The suggestion that novel PFAS are accumulating close to human communities is supported by measurements of PFAS in human tissue, too. Studies show that a high proportion of detectable organofluorine chemicals in human tissue are increasingly unidentifiable, suggesting that some of the novel PFAS production “is in us,” Sunderland said.

Far and Away

Though there are some indications that novel PFAS may be retained closer to human communities, there are also reasons to think some novel PFAS chemistries have resulted in substances that can actually travel farther and more easily than their legacy counterparts.

Anna Kärrman, an environmental chemist at Örebro University in Sweden, said that some novel PFAS may be more easily transported in the environment: “The more novel chemistries are increasing the properties of being very mobile in water, very mobile in the atmosphere, and not necessarily very bioaccumulative.”

The mobility of novel PFAS was on full display in a 2020 study that Sunderland coauthored, in which researchers reported detecting hexafluoropropylene oxide-dimer acid, a novel PFAS chemical more commonly known as GenX, in the Arctic for the first time. GenX, produced by chemical manufacturer Chemours, was meant to replace the legacy compound PFOA. The 2020 study suggested GenX “has already moved quite a bit,” said Rainer Lohmann, a marine geochemist who leads the STEEP (Sources, Transport, Exposure and Effects of PFAS) Center at the University of Rhode Island.

A pulley system mounted on a red beam pulls a small envelope filled with water along a string. Credit: Thomas Soltwedel

The 2020 study also found higher concentrations of PFAS in the Arctic Ocean’s surface water, suggesting that the atmosphere was a particularly important pathway for chemical transport. This idea is supported by studies of High Arctic ice caps, which experience contamination only from atmospheric sources, and of polar bear tissue. Atmospheric transport of novel PFAS is a subject “at the edge” of PFAS research, Sunderland said.

Wherever researchers look, they’re finding that atmospheric transport is an important pathway by which some PFAS, especially PFAS precursors—chemicals that break down in the environment and become PFAS (either novel or legacy)—move. One idea called the PAART (precursor atmospheric and reaction transport) theory was developed by Scott Mabury, an environmental chemist at the University of Toronto, and others. The PAART theory proposes that many of the harmful PFAS that end up in the most remote parts of Earth are the result of the breakdown of volatile precursor PFAS that have traveled in the atmosphere.

According to Lohmann, atmospheric transport means the ocean remains a terminal sink because many novel PFAS transported in rain or snow will ultimately be deposited in the ocean.

In this scenario, the question of why novel PFAS are not bioaccumulating in Faroese pilot whales remains a mystery. While Lohmann suggests the novel compounds simply don’t accumulate in living tissue, Sunderland isn’t sure that’s the whole story: “As apex predators, the whales are sentinels for what is available and being taken up from the ocean,” she wrote in an email. “Since we don’t see [novel PFAS], it seems unlikely there are large quantities of these chemicals present.”

Profuse PFAS

Another possible explanation for the surprising results of Sun’s whale study could be that there’s just a lag; that is, novel PFAS will end up in Faroe Island pilot whales someday but haven’t yet. Chemicals that could eventually end up in the ocean may be temporarily trapped in soils or recycled back into terrestrial ecosystems via sea spray aerosols, for example.

“The delay we are seeing in the ocean response may in fact be [PFAS] precursors being retained in source zones,” Sunderland wrote in an email. These chemicals may be “taking a really long time to be transformed into more mobile compounds.”

In their pilot whale study, Sun and her colleagues modeled the transport of PFAS to the subarctic and found a 10- to 20-year lag existed between the production of a legacy PFAS compound and its detection in whale tissue. We’re still within that range for many novel PFAS. Sun said she would have expected them to show up in pilot whale tissue by now if they behaved like their legacy counterparts, though it’s possible that it has taken time for the volume of novel PFAS production to ramp up, increasing the time it would take for the substances to be detected in tissues.

A group of whales’ fins breach the surface of the water.
The anomaly documented in the pilot whale study has led researchers to call for more investigation (and perhaps greater regulation) of novel PFAS. Credit: Bjarni Mikkelson

Still, the number of possible novel PFAS chemistries—again, there could be several million different compounds—makes it difficult to generalize how these new substances are, as a group, moving through the environment. “Because the exact structures of all [novel] PFAS remain unknown, some compounds may simply not be captured by the methods used,” Heidi Pickard, an environmental engineer at the consulting firm Ramboll and coauthor on the new whale study, wrote in an email.

Another reason novel PFAS are harder to study is that companies release lower concentrations of more kinds of the chemicals, rather than the “monstrously high” emissions of some legacy PFAS in the 1970s–1990s, noted Mabury, who was not involved in the new pilot whale study.

A New Regulatory Approach

According to Sun and Sunderland, cataloging differences between novel and legacy PFAS misses the broader point: We simply need to produce less PFAS. We’ve known for decades that PFAS harm human health, and some scientists have even argued that humans’ continual production and release of novel chemical compounds could drive Earth beyond a “safe operating space.”

“Researchers are critical for exposing the problem. But that, to me, is not the central issue here. The central issue here is a societal issue.”

Where scientists probe next may be less urgent than how policymakers decide to tackle the PFAS problem, Sunderland said: “Researchers are critical for exposing the problem. But that, to me, is not the central issue here. The central issue here is a societal issue.”

Chemical manufacturers are actively creating novel PFAS all the time. Kärrman, for example, has noticed patent applications for PFAS compounds with chemistries that “are nothing like we have seen before” that may start entering our environment in 5 or 10 years.

To Kärrman, that’s a reason for governments to push for chemical regulation based on properties such as persistence and bioaccumulation, rather than the chemical-by-chemical formula used in most countries, including the United States.

Such an approach has gotten traction in Europe via a proposal by the European Chemicals Agency to restrict the entire class of PFAS chemicals. The proposal is still under evaluation, and a final decision is expected by the end of the year.

In the United States, PFAS regulation and remediation are a key aspect of the Trump administration’s Make America Healthy Again movement, according to the EPA, and the federal government and some states already limit the concentrations of individual PFAS in drinking water. However, the EPA said last year that it planned to weaken some of those limits.

“We’re in a cycle of picking these regrettable alternatives [to legacy PFAS] and then figuring out that it was regrettable decades later,” Sunderland said. “We’re never going to catch up using this chemical-by-chemical approach.”

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Citation: van Deelen, G. (2026), Chemical companies are churning out new PFAS. Where in the world are they ending up?, Eos, 107, https://doi.org/10.1029/2026EO260136. Published on 30 April 2026.
Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

As the Coal Industry Fades, Life Expectancies in Coal Country Shift

30 April 2026 at 12:56
A foggy mountain scene at sunset. In the right-hand corner, a railroad leading to a small building can be seen.

The coal industry can damage human health in myriad ways via dangerous working conditions and harmful pollution. But the income opportunities offered by the industry can also provide much-needed stability for certain communities, such as those in Appalachia’s coal country.

“Being employed is good for your health, but environmental pollution is bad for your health, and these two things are operating at the same time in some communities,” said Mary Willis, an epidemiologist at Boston University.

The industry, though, is changing. Total coal production in the United States peaked in 2008, and the number of miners has steadily dropped since then.

A graph shows total, underground, and surface production of coal in millions of short tons alongside the number of coal miners from 1949 to 2023.
Total coal production peaked in the United States in 2008, after which the number of coal miners declined, too. Credit: Thombs et al., 2026, https://doi.org/10.1111/ruso.70034, CC BY 4.0

A new study coauthored by Willis and published in Rural Sociology delves into the effects of this decline on life expectancies across the United States and in Appalachia in particular. The results show that a disappearing coal mining industry has mixed effects on health, highlighting the importance of a “just transition”—a shift away from coal mining and toward clean energy that also prioritizes decent work opportunities for those left without a job.

“How do we balance these two conflicting priorities?” Willis said.

Delving into the Decline

Coal production and consumption are linked to many human health harms, including heart disease, asthma, lung cancer, mental illness, and more. But how those health impacts intersect with the broader economic effects of mining has not been well studied.

In the new study, the research team analyzed the effects of the declining industry through the lens of the social determinants of health, or how social structures influence health outcomes.

A table shows the life expectancy outcomes of the effects of three pathways by which coal mining impacts health.
Researchers analyzed how coal mining impacts life expectancies via three pathways: production, mining labor time, and employment. Credit: Thombs et al., 2026, https://doi.org/10.1111/ruso.70034, CC BY 4.0

To study these effects, the team compared coal mining data from the U.S. Energy Information Administration to life expectancy data from the Institute for Health Metrics and Evaluation at the University of Washington from 2012 to 2019. Life expectancy is a metric that can be responsive to subtle changes in the environment, Willis explained. For example, the decommissioning of a coal-fired power plant a few miles from a community may not noticeably change residents’ day-to-day lives but can still show up in life expectancy at the population scale.

In coal-producing counties across the United States, the average life expectancy was 1.6 years lower than that in non-coal-producing counties. But the declining coal industry had more nuanced impacts on health in Appalachian communities, the researchers found. As coal production fell and miner labor hours decreased, life expectancy increased. But as the number of jobs available decreased, life expectancy decreased, too.

The findings suggest that the employment and associated economic impacts of a waning coal industry harm health. Previous studies documented similar increases in mortality in other regions where the fossil fuel industry has declined. Such research has indicated that these increased mortality rates may be partially driven by “deaths of despair” from drug and alcohol use and suicide related to economic distress. The association of these factors with mortality rates in coal country, the authors suggest, may be an area for future study.

Understanding that coal mining is associated with some positive economic and health effects is “an important perspective for understanding the sector as a whole,” said Lucas Henneman, an environmental engineer at George Mason University who was not involved in the new study. “It’s a really interesting piece of work.”

“This is just a really complex story that hasn’t been told yet—putting health into the context of these just energy transitions,” Willis said.

The complex reality of the coal industry extends beyond Appalachia. Most of the pollution related to the coal industry consists of toxins released when coal is burned, meaning those who bear the brunt of coal’s health impacts may not be located where coal is mined, Henneman said.

In fact, a 2023 study by Henneman and others found that before 2009, a quarter of all air pollution–related deaths of people on Medicare were attributable to coal burning. From 2013 to 2020, that number dropped to 7%, alongside a drop in coal consumption. A complete picture of how the coal industry affects health should also consider how pollution travels beyond coal country—where it’s burned, how it’s transported in the air, and who ultimately breathes it in, he said.

A Just Transition

“The question is how to provide [jobs] in a way that provides the same level of stability, same kind of income benefits, and isn’t too much of a shock to [communities’] way of life or sense of identity.”

The economic activity of a mine, through direct employment as well as businesses reliant on the mine and miners, “chases away other opportunities,” making the mine the economic backbone of the area, said Jonathan Buonocore, an environmental health scientist at Boston University and a coauthor of the new study. The concept of a just transition aims to ensure that employment opportunities in the wake of the coal industry’s decline reach these communities.

“The question is how to provide [jobs] in a way that provides the same level of stability, same kind of income benefits, and isn’t too much of a shock to [communities’] way of life or sense of identity,” Buonocore said.

—Grace van Deelen (@gvd.bsky.social), Staff Writer

Citation: van Deelen, G. (2026), As the coal industry fades, life expectancies in coal country shift, Eos, 107, https://doi.org/10.1029/2026EO260134. Published on 30 April 2026.
Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

How Wildfires Worsen Flood Risk

30 April 2026 at 12:54
A rocky stream flows through a landscape of burned trees. A mountain is visible in the background.
Source: Water Resources Research

Wildfires can increase flooding risks in and downstream of burned areas by removing vegetation and disturbing hydrologic processes. As the climate changes, the severity of both wildfires and heavy rainfall events is increasing, meaning flooding is likely to become more severe in the near future. Better understanding how, and by how much, wildfires change flood risk is important for disaster and infrastructure planning for communities around the country.

Canham and Lane used streamflow data from the U.S. Geological Survey’s National Water Information System and precipitation data from the NOAA Analysis of Record for Calibration product to identify storms and quantify their effects across seven burned watersheds in the western United States.

To make the most of the limited data on flooding in the years following wildfires, the researchers created a paired-storms framework: They identified postfire peak flows (PFPFs), defined as the five highest peak flows within 3 years of a wildfire across seven watersheds. Then, for each precipitation event causing a PFPF, they looked for storms with similar characteristics (or paired storms) that occurred before the wildfire. Storm characteristics used for pairing included the season in which the storm occurred, recent precipitation, and precipitation depth, duration, and peak intensity.
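The pairing step lends itself to a simple similarity search over storm characteristics. The sketch below is an illustrative reconstruction, not the authors' code: the field names and the 25% relative tolerance are assumptions, and matching on antecedent precipitation is omitted for brevity.

```python
# Hypothetical paired-storms search: for each postfire peak-flow (PFPF)
# storm, find prefire storms with similar characteristics. Field names
# and the tolerance are illustrative assumptions, not the study's criteria.

def find_paired_storms(pfpf_storm, prefire_storms,
                       rel_tol=0.25, season_must_match=True):
    """Return prefire storms whose depth, duration, and peak intensity
    fall within a relative tolerance of the postfire storm's values."""
    def similar(a, b):
        return abs(a - b) <= rel_tol * max(abs(b), 1e-9)

    pairs = []
    for s in prefire_storms:
        if season_must_match and s["season"] != pfpf_storm["season"]:
            continue
        if (similar(s["depth_mm"], pfpf_storm["depth_mm"])
                and similar(s["duration_hr"], pfpf_storm["duration_hr"])
                and similar(s["peak_intensity_mm_hr"],
                            pfpf_storm["peak_intensity_mm_hr"])):
            pairs.append(s)
    return pairs

# Toy example: one postfire storm and two prefire candidates.
postfire = {"season": "summer", "depth_mm": 40.0,
            "duration_hr": 6.0, "peak_intensity_mm_hr": 20.0}
candidates = [
    {"season": "summer", "depth_mm": 44.0, "duration_hr": 5.5,
     "peak_intensity_mm_hr": 22.0, "peak_flow_cms": 3.0},
    {"season": "winter", "depth_mm": 41.0, "duration_hr": 6.0,
     "peak_intensity_mm_hr": 19.0, "peak_flow_cms": 2.0},
]
matches = find_paired_storms(postfire, candidates)
print(len(matches))  # the winter storm is excluded by season
```

Each match would then be compared on peak flow against the postfire event to quantify how much the fire amplified the flood response.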

The researchers found significantly elevated peak flows after wildfires in many cases, underlining the risks to communities following wildfires and validating their approach for use elsewhere.

Altogether, the authors found 26 PFPF events, including 20 with paired storms occurring before wildfires. For 75% of the postfire storms, their peak flows were 2 or more times greater than prefire peak flows. PFPFs were most likely to happen in the first year after a wildfire and typically occurred following storms that were centered upstream of the watershed centroid, were uniform in shape, and fully covered the watershed and burned area, the authors reported. They also found some evidence that the first storm in the year immediately following a fire has a higher-than-expected chance of producing a PFPF.

Future work could look more deeply at the characteristics of storms occurring over burned areas, such as storm direction and watershed recovery, and could apply the automated methods to more burned watersheds and storm events to enhance the robustness of the work, the authors say. (Water Resources Research, https://doi.org/10.1029/2025WR040693, 2026)

—Nathaniel Scharping (@nathanielscharp), Science Writer

A photo of a telescope array appears in a circle over a field of blue along with the Eos logo and the following text: Support Eos’s mission to broadly share science news and research. Below the text is a darker blue button that reads “donate today.”
Citation: Scharping, N. (2026), How wildfires worsen flood risk, Eos, 107, https://doi.org/10.1029/2026EO260133. Published on 30 April 2026.
Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
Toward Marine Cloud Brightening at Scale: A Science Agenda

30 April 2026 at 12:00
Clouds above a body of water.
Editors’ Highlights are summaries of recent papers by AGU’s journal editors.
Source: AGU Advances

The albedo of marine clouds can be changed by targeted additions of aerosols, in particular sea salt. Assessing the viability of this approach, known as Marine Cloud Brightening (MCB), requires a fundamental understanding of the impact of aerosols on cloud evolution and properties, and on the cloud environment.

Doherty et al. [2026] propose a framework for studying MCB across scales. It calls for small- to large-scale studies aimed at systematically characterizing the life cycle of aerosols and the diurnal cycle of cloud processes, determining how these change with the magnitude, duration, and type of aerosol applied, and monitoring potentially harmful direct or indirect consequences of aerosol injection, such as regional changes in temperature or precipitation.

Possible configuration for a Stage III study for measuring local-scale cloud responses to a single plume of generated sea salt aerosol sized for marine cloud brightening. Credit: Doherty et al. [2026], Figure 4

Citation: Doherty, S. J., Diamond, M. S., Wood, R., & Hirasawa, H. (2026). Defining scales of field studies and experiments to assess marine cloud brightening. AGU Advances, 7, e2025AV001939. https://doi.org/10.1029/2025AV001939

—Ana P. Barros, Editor, AGU Advances

The logo for the United Nations Sustainable Development Goal 13 is at left. To its right is the following text: The research reported here supports Sustainable Development Goal 13. AGU is committed to supporting the United Nations 2030 Agenda for Sustainable Development, which provides a shared blueprint for peace and prosperity for people and the planet, now and into the future.
Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
New USGS Tool Fills in the Gaps on U.S. Water Supply

29 April 2026 at 16:27
A bridge crosses a river beneath a relatively short waterfall. A city skyline is on the other side of the river.

Research & Developments is a blog for brief updates that provide context for the flurry of news that impacts science and scientists today.

In the contiguous United States, crop irrigation, municipal water supplies, and thermoelectric power generation use more than 224 billion gallons of fresh water every day. Until now, conducting water research or making decisions about water use often required referencing datasets across various agencies. The U.S. Geological Survey (USGS) National Water Availability Assessment Data Companion (NWDC), announced this week, aims to streamline this process. In part, the tool is designed to help decisionmakers better understand how high demand and limited supply affect water availability in their communities.

“While the United States has abundant water nationally, regional imbalances between supply and demand may create water challenges affecting millions of Americans,” said lead scientist Shirley Leung in a USGS press release. “What once required significant resources and time can now be done in minutes, giving communities of all sizes the same foundation for water planning.”

The lower 48 states are home to about 80,000 sub-watersheds, from those in the arid southwest to the Great Lakes Basin, where about 84% of North America’s surface fresh water is located. According to the USGS, the NWDC is the first tool that integrates information about water availability in individual watersheds at a national scale.

The tool is designed to complement Water Data for the Nation (WDFN), another USGS product that consolidates observational data from the agency’s thousands of local monitoring stations gathering data on streams, lakes, reservoirs, precipitation, water quality, and groundwater. The new tool uses modeling to fill in spatial and temporal gaps between the observations made at these stations.

Water managers, researchers, agricultural experts, and others can use the NWDC to compare watershed conditions, identify seasonal patterns in water use, or create data visualizations of statewide water use, for example. Though the tool currently covers only the contiguous United States, it will soon be extended to Alaska, Hawaii, and Puerto Rico, according to the USGS.

David Tarboton, a professor of civil engineering at the Utah Water Research Laboratory, said he was “intrigued” by the new tool, and is interested in examining the data its model produces. 

While Tarboton was disappointed that the tool’s most recent available data are from 2020, “having a sort of integrated, wall-to-wall dataset that’s consistently produced is very valuable,” he said. He works, in part, in the areas of hydroinformatics and data sharing, and noted that the modern methods the agency is using to share the data could be useful in developing automated tools.

—Emily Gardner (@emfurd.bsky.social), Associate Editor

These updates are made possible through information from the scientific community. Do you have a story about science or scientists? Send us a tip at eos@agu.org.

A photo of a hand holding a copy of an issue of Eos appears in a circle over a field of blue along with the Eos logo and the following text: Support Eos’s mission to broadly share science news and research. Below the text is a darker blue button that reads “donate today.”
Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
Antibiotic Resistance Might Get a Boost from Droughts

29 April 2026 at 13:19
A forest on a mountainside has mostly green trees, with sprinkles of autumn red and yellow. A brown mountain is in the distance.

The spread of antibiotic resistance, a growing threat to global health that causes millions of deaths annually, is typically blamed on the overuse of drugs in hospitals and in the food industry. However, a new study published in Nature Microbiology suggests that normal geological processes could be accelerating the development of new resistances.

Soil microorganisms naturally produce antibiotics as a form of chemical warfare to compete with each other. When soils dry out, these natural compounds become more concentrated because there is less water to dilute them. Like a dosage increase, this concentration can create a harsher environment, killing sensitive microbes and sparing those with the capacity to resist. This phenomenon, in turn, acts as an evolutionary driver that favors the emergence of new and more effective resistance genes.

“If you have more antibiotics in your environment, only the organisms that can withstand it…can resist it.”

To test whether this mechanism is having real genetic effects, Xiaoyu Shan, a microbial ecologist and postdoctoral researcher at the California Institute of Technology (Caltech), and colleagues looked at soil samples under controlled conditions as the samples transitioned from a wet state to a desiccated one. They found that as the soil dried, the presence of genes related to antibiotic production and resistance spiked, suggesting that drought leads to a rapid escalation in the subterranean biological arms race. Importantly, they did not look for pathogenic bacteria specifically, only for resistance genes, which can be present in a variety of microbes, whether those microbes are pathogenic or not.

“Drought leads to this elevation of antibiotic producers and bacteria that are resistant,” said team member Dianne Newman, a professor of biology and geobiology also at Caltech. “It’s a pretty simple idea: If you have more antibiotics in your environment, only the organisms that can withstand it…can resist it.”

Alternative Explanations

However, there could be other explanations for the observed increase in antibiotic production and resistance genes, according to Enrique Monte, a microbiologist at the Universidad de Salamanca in Spain who wasn’t involved with the new study. For instance, arid soils are naturally more diverse than humid soils, making it common to find a more diverse gene pool in the ground, Monte said. In addition, the mere presence of antibiotic genes might not result in an actual release to the environment, or a release could happen in dosages too small to cause noticeable effects. “There are antibiotics that are volatile; they escape into the air, so they never reach a therapeutic concentration to kill others,” Monte said.

The authors, however, took some precautions to show that the increase in antibiotic resistance genes was actually a biological response to environmental stress. For instance, they also tracked other genes that should remain unaffected or decline under desiccation. As expected, genes that are needed for basic survival remained stable, while genes responsible for bacterial movement declined in dry soil, where mobility is restricted. Even some species that were not favored by desiccation saw an increase in resistance-related genes, “which is even stronger evidence,” Shan said.

Geographic Limitations

As the researchers combed through publicly available metagenomic data libraries, they had to select collections with strict control of all variables and in which the only changing factor was water content. That limited the analysis to five locations: two grasslands and a sorghum field in California; a forest in Valais, Switzerland; and a wetland in Nanchang, China.

The scarcity of locations might limit how extrapolable these results are, said Fiona Walsh, a microbiologist at Maynooth University in Ireland who was not involved with the work. “There are thousands of high-quality metagenomes available online with excellent metadata. I would really like to see a comparison where they apply their analysis to a broader map of global metagenomic data to see if they reach the same conclusions,” she said.

From the Soil to the Hospital

Drier regions consistently showed a higher number of resistant bacteria cases in hospitals, even after adjusting for confounding factors such as local income.

The study also suggests that dry soils might be a hidden driver of clinical cases of antibiotic resistance worldwide. The authors combined hospital data on the number of cases of resistant infections from 116 countries with the local aridity index, which measures temperature and precipitation, for each location. They found a strong correlation: Drier regions consistently showed a higher number of resistant bacteria cases in hospitals, even after adjusting for confounding factors such as local income.
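The kind of adjustment described, testing whether the aridity–resistance link survives after accounting for income, can be sketched as a partial correlation: regress the confounder out of both variables and correlate the residuals. The data below are synthetic and the approach is a generic stand-in; the study's actual statistical model may differ.

```python
import numpy as np

# Partial correlation on synthetic data: does aridity correlate with
# resistance prevalence once the linear effect of income is removed?
# All data and effect sizes here are invented for illustration.

rng = np.random.default_rng(0)
n = 116                                  # one row per country, as in the study
income = rng.normal(size=n)
aridity = 0.3 * income + rng.normal(size=n)
resistance = 0.5 * aridity - 0.2 * income + rng.normal(size=n)

def residuals(y, x):
    """Residuals of y after ordinary least squares on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Correlate what is left of each variable after removing income's effect.
r = np.corrcoef(residuals(aridity, income),
                residuals(resistance, income))[0, 1]
print(round(r, 2))
```

Even a strong partial correlation like this one says nothing about causation, which is exactly the caveat the authors raise.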

However, the authors admitted that this is only a correlation and doesn’t prove causation. “It motivates follow-up research to see how environmental concentration weighs against human overuse and poor stewardship,” Newman said.

Even this correlation could be a stretch, according to microbiologist Sara Soto, head of the Global Viral and Bacterial Infections Programme at the Instituto de Salud Global de Barcelona. At the end of the day, she said, the authors have soil data from only five locations in three countries, and they are not tracking the specific bacterial varieties that make people sick, only resistance genes.

For the thesis to be solid, Soto said, the ideal approach would have been to contrast hospital strains from a specific area with soil data from that same region during the same drought episode. “Making such a vast inference—that what happens in the soil of one location affects what happens in a hospital elsewhere—is a big leap,” she said.

The authors, however, point out that resistance genes from soils can eventually make their way into human pathogens. Microbes have the capacity to share genetic material across species—a process known as horizontal gene transfer. In their analysis, the team identified specific resistance sequences that appeared to have been transferred between soil bacteria relatively recently, perhaps within the past decade. How they are reaching hospitals remains a matter for a future study, they said.

As droughts increase in numerous regions in the face of climate change, this selective pressure within soil ecosystems is expected to intensify. Though these findings do not show that drought directly puts drug-resistant pathogens in hospitals, they still suggest that a drying climate could set the scene for an increase in antibiotic resistance, the researchers report.

—Javier Barbuzano (@javibar.bsky.social), Science Writer

Citation: Barbuzano, J. (2026), Antibiotic resistance might get a boost from droughts, Eos, 107, https://doi.org/10.1029/2026EO260132. Published on 29 April 2026.
Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
Hydrothermal Heat Flow as a Window into Subsurface Arc Magmas

Three scientists working on the side of a mountain.
Editors’ Vox is a blog from AGU’s Publications Department.

The supply of magma from the Earth’s mantle is a primary source of heat to volcanic arc crust, where the heat is then dissipated through various processes. Much of this magmatic heat is dissipated as heated water, or aqueous fluid.

A new article in Reviews of Geophysics compares 11 different volcanic-arc segments where heat discharge via aqueous fluid has been well-inventoried to better understand the factors that influence this process. Here, we asked the authors to give an overview of heat discharge from volcanic arcs, how scientists measure it, and what questions remain.

Why is it important to study how heat is dissipated from volcanic arcs?

The heat from these magmas matters for geothermal energy, patterns of groundwater flow, and the patterns of volcanic activity at the surface.

Volcanic arcs are the chains of volcanoes on top of subduction zones. They can produce some of Earth’s most explosive and hazardous eruptions. But much of the magma beneath the surface never erupts. Nevertheless, the heat from these magmas—and the simple fact of their existence and abundance—still matters for geothermal energy, patterns of groundwater flow, and the patterns of volcanic activity at the surface.

What are the main modes in which heat is discharged from volcanic arcs?

Heat at volcanic arcs can be carried by magmas, transmitted through the crust conductively, and carried by water seeping slowly through the crust. At the base of the crust, magmas are probably most important, with conduction coming in second. But as magmas move upwards through the crust, some of them solidify and impart their heat to their surroundings where it is transferred by conduction. Within a few kilometers of the surface, fluids seeping through the crust begin to take up all that heat, and so if we can quantify the heat carried by those fluids, we can retrace it to its origins in magmas.

How do scientists measure these different forms of heat loss?

Scientists measure the heat carried by erupting magmas using satellites, or by adding up the erupted masses and making an estimate of how much energy was released by cooling from eruption temperatures to solid igneous rocks at Earth’s surface. Conductive heat flow is measured by drilling holes in the Earth’s crust to see how quickly it gets hotter with depth.

Measuring the heat carried by aqueous fluids in the crust is in some ways the trickiest. One approach is to find all the springs where hot or even slightly warm water is trickling out and measure the temperature and discharge to estimate how much extra heat that water is carrying.
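That estimate is essentially an advective heat budget: heat flux equals water density times specific heat times discharge times the temperature anomaly. A back-of-the-envelope sketch, with hypothetical spring values:

```python
# Back-of-the-envelope advective heat flux of a warm spring:
# flux = density * specific heat * discharge * temperature anomaly.
# The spring values below are hypothetical.

RHO = 1000.0     # water density, kg/m^3
C_P = 4186.0     # specific heat of water, J/(kg K)

def spring_heat_flux_mw(discharge_l_s, spring_temp_c, background_temp_c):
    """Advective heat flux of a spring in megawatts."""
    q_m3_s = discharge_l_s / 1000.0          # L/s -> m^3/s
    delta_t = spring_temp_c - background_temp_c
    return RHO * C_P * q_m3_s * delta_t / 1e6

# A hypothetical 50 L/s spring discharging at 30 degrees C where background
# groundwater is 10 degrees C carries roughly 4 MW of heat:
print(spring_heat_flux_mw(50.0, 30.0, 10.0), "MW")
```

Summing such estimates over every warm spring in an arc segment gives the hydrothermal heat discharge that can then be traced back to magmas cooling at depth.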

What are the challenges and uncertainties in measuring hydrothermal heat discharge?

One challenge is that many springs are only slightly warmer than you’d expect. There are good data for many hot springs, but data tracking these ‘slightly warm’ springs exist for only a subset of arcs. Another challenge is that warm underground fluids can flow laterally, so you have to try to account for that. A further big uncertainty for our study, not in hydrothermal discharge itself but in our attempt to quantify the proportion of magmas that freeze underground versus erupt, lies in the estimates of how much magma has actually erupted through time.

What are some of the factors that influence hydrothermal heat loss?

A major goal of our paper is to try to quantify these hidden magmas.

A main factor that influences hydrothermal heat loss is the magmas that solidify underground. This link is the key motivation for our study. A major goal of our paper is to try to quantify these hidden magmas—how much magma needs to intrude the crust beneath the surface to supply the hydrothermal heat fluxes that we observe? The composition of magmas influences how much heat they can release. The depth at which magmas are emplaced also matters, because magmas that intrude the shallow crust eventually cool to lower temperatures than magmas emplaced in the lower crust and therefore release more heat.

What are the remaining questions or knowledge gaps where additional research efforts are needed?

A big outstanding challenge is combining estimates from hydrothermal data of how much magma is coming into the crust – like ours – with other approaches to estimating the same thing. The magmatic systems beneath volcanoes span the crust. At the base of the crust, you have magma supply, sort of like the water main feeding your plumbing system. Despite how fundamental magma supply is, we know remarkably little about it. It’s exciting to think about how the rates of magma supply could vary through time and space and why. Applying a range of techniques—including geophysical imaging, hydrothermal budgets, gas and igneous geochemistry, and petrology—to understand these questions could really be a game changer.

—Benjamin A. Black (bblack@eps.rutgers.edu; 0000-0003-4585-6438), Rutgers University, United States; S. E. Ingebritsen (steve.ingebritsen@gmail.com; 0000-0001-6917-9369), Kyoto University, Japan; and Kazuki Sawayama (sawayama@bep.vgs.kyoto-u.ac.jp; 0000-0001-7988-3739), Kyoto University, Japan

Editor’s Note: It is the policy of AGU Publications to invite the authors of articles published in Reviews of Geophysics to write a summary for Eos Editors’ Vox.

Citation: Black, B. A., S. E. Ingebritsen, and K. Sawayama (2026), Hydrothermal heat flow as a window into subsurface arc magmas, Eos, 107, https://doi.org/10.1029/2026EO265017. Published on 28 April 2026.
This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s).
Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.
  • The Genesis Mission Needs Hydrology: Here’s How to Incorporate It Amobichukwu C. Amanambu and Jonathan Frame
The Genesis Mission Needs Hydrology: Here’s How to Incorporate It

28 April 2026 at 13:09
Satellite image of The Dalles Google data center and the adjacent Columbia River.

Every chip fabricated in a semiconductor plant needs ultrapure water. Most nuclear reactors need water as a coolant and neutron moderator. Every artificial intelligence (AI) data center drinks between 1 million and 5 million gallons of water a day, with thirst often peaking during drought.

Water runs through every technology priority the United States has named, yet the word does not appear once in “Launching the Genesis Mission,” an executive order (EO) released in November 2025. As described in the EO, the Genesis Mission is a “dedicated, coordinated national effort to unleash a new age of AI-accelerated innovation and discovery that can solve the most challenging problems of this century.”

Led by the Department of Energy (DOE), the initiative aims to build an integrated AI framework that would harness federal scientific datasets to accelerate breakthroughs in advanced manufacturing, biotechnology, critical materials, nuclear fission and fusion energy, quantum information science, and semiconductor development. The scope of the mission is comparable to that of the Manhattan Project.

Since the announcement, the DOE has listed “Predicting U.S. Water for Energy” among its 26 Genesis Mission Science and Technology Challenges. The project is also soliciting proposals in three water-related focus areas.

This framework provides a foothold for hydrology in the Genesis Mission, but it is scoped narrowly around water as a supply variable for energy production.

In reality, water is a crosscutting constraint that will help determine whether the mission’s priorities translate into deployable outcomes. The hydrology community now has a seat at the table, and if it moves first and positions water security as one of the “most challenging problems of this century,” the Genesis Mission can become the sandbox in which AI reshapes how the country measures, models, and manages water.

Making this happen will require that the DOE and the Office of Science and Technology Policy charter a hydrology workstream inside the Genesis Mission, with interagency delivery involving the U.S. Geological Survey (USGS), NOAA, the Bureau of Reclamation, the EPA, and partners at state, regional, and community levels. Here is what we think that workstream should look like:

Illustration with “Genesis AI Platform” as a hub and seven mission-related components as spokes.
A water-centric Genesis Mission architecture supports seven hydrological components that both feed into and receive decisions from the Genesis AI platform. Each component maps to a section of this article. Credit: Amobichukwu C. Amanambu. Click image for larger version.

While the existing challenges reflect some of these components, others will require coordinated effort from the hydrology community to bring into the Genesis Mission’s scope.

Build the Water Corpus Genesis Will Need

The Genesis Mission EO instructs the DOE to create an American Science and Security Platform to provide the public, scientists, agencies, and policymakers access to crucial scientific datasets.

The good news is that accessible water data systems already exist across several federal agencies and academic research centers. The USGS National Water Information System tracks real-time and historical water quality and use across the country. NASA’s Earth Science Data Systems Program provides open access to Earth science observations. NOAA’s National Water Center, the first federal facility dedicated to national water resource forecasting, operates the National Water Model, which continuously forecasts flows on 2.7 million stream reaches across the continental United States. The Catchment Attributes and Meteorology for Large-Sample Studies (CAMELS) dataset, currently hosted by the National Center for Atmospheric Research, provides data tailored for hydrological research on hundreds of river basins, and the Caravan framework pulls together multiple large-sample meteorological and hydrological datasets at a global scale.

What is missing is a unified, AI-ready repository that brings federal, state, and community data together.

What is missing is a unified, AI-ready repository that brings federal, state, and community data together. Building one is hard. Water data are fragmented, inconsistent, and often entirely absent. Consistent, reliable data for groundwater, withdrawals, reservoir operations, and water quality are especially difficult to obtain.

Local resistance to sharing data is real. In Texas, for example, landowners hold private property rights over groundwater and have opposed metering and reporting requirements imposed by groundwater conservation districts. In California, agricultural well owners fought metering mandates for years before the Sustainable Groundwater Management Act compelled local agencies to begin tracking withdrawals. Tribal nations face a different concern: Water data collected on Indigenous lands have been misrepresented in federal datasets that were modeled without accounting for Indian country, leading many nations to restrict access to their data as an exercise of sovereignty.

Practical steps toward building a unified AI-ready repository include tiered access and licensing for different stakeholders, clear provenance tracking for all data reported, financial and educational incentives for stakeholders for reporting, and targeted gap filling. Where measurements are missing, AI can fuse remote sensing with gauged records and operational logs—but only if the results carry honest uncertainty estimates tied to real decisions.

Get the corpus right, and it will outlive any single program name. It becomes infrastructure the country can lean on.

Develop Shared Hydrologic Foundation Models

The Genesis Mission EO directs the DOE to provide “domain-specific foundation models across the range of scientific domains covered.”

Hydrology has a head start. Long short-term memory (LSTM) networks are a type of neural network designed to retain information across thousands of time steps. Hydrology LSTMs trained on CAMELS data have already matched traditional conceptual models for daily streamflow prediction. Open-source NeuralHydrology tools serve as baselines for regional runoff prediction. These tools may serve as precursors to the foundation models the Genesis Mission envisions and as building blocks from which those models could be developed.
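The appeal of LSTMs for hydrology is their gated "memory" that can persist across long forcing sequences, much like snowpack or soil moisture storage persists in a catchment. Below is a minimal numpy sketch of a single LSTM cell step; the weights are random and the forcing names are placeholders, so this illustrates only the mechanism, not a trained hydrologic model.

```python
import numpy as np

# One LSTM cell step in plain numpy. The forget gate lets the cell
# state c (the "memory") carry information across many time steps.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One time step. x: input, h: hidden state, c: cell state.
    W, U, b hold the stacked weights for the four gates."""
    z = W @ x + U @ h + b
    i, f, g, o = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)   # input/forget/output gates
    g = np.tanh(g)                                 # candidate cell update
    c_new = f * c + i * g                          # memory carried forward
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid = 3, 8          # e.g., 3 daily forcings (precip, temp, radiation)
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h = c = np.zeros(n_hid)
for t in range(365):        # a year of daily forcings
    x = rng.normal(size=n_in)
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)              # hidden state summarizing the whole sequence
```

When training drives the forget gate toward 1, the cell state persists nearly unchanged across many steps, which is what lets these models track slow stores such as snowpack.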

The process of scaling up these tools is not straightforward, however. A hydrologic investigation of snowmelt-driven streams in Colorado will not require the same spatiotemporal data as one of tile-drained fields in Iowa, for example. A hydrology-specific foundation model must take such nuanced requirements into consideration and provide a clear path for managing and exploiting a variety of datasets.

Google’s Flood Hub shows what is possible: Its AI-enabled flood forecasts now cover more than 80 countries. However, Flood Hub’s core model code and trained weights remain proprietary, meaning researchers can use the forecasts but cannot rebuild or adapt the underlying models. Genesis, if well positioned, can fill that accessibility gap by producing foundation models for water that are reusable, reliable, and openly governed.

Build a National Water Digital Twin

The EO prescribes an integrated AI platform combining foundation models with simulation tools to stimulate AI-enabled innovations.

That architecture is exactly what a digital twin requires. Europe’s Destination Earth initiative is already building digital twins for weather extremes and nonstationary conditions on the Large Unified Modern Infrastructure (LUMI) supercomputer. The United Nations–led AI for Good initiative has prioritized water applications, warning that fragmented national efforts risk duplicating work.

If the United States aims for global strategic leadership in AI-accelerated science, water infrastructure cannot be an afterthought.

Rather than starting from scratch, a water-centric Genesis Mission would unite existing federal models—the National Water Model, reservoir simulators, and groundwater codes—in a single digital twin. AI can become the thread that stitches them together, correcting biases between components and enforcing mass and energy balance across them.
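As a toy illustration of that stitching role, and not of any federal model's actual interface, the sketch below bias-corrects one component's flow estimate and then nudges inflow and outflow so the reservoir water balance closes. Function names and numbers are invented.

```python
def bias_correct(pred, bias):
    """Remove a known systematic offset from a component model's output."""
    return pred - bias

def close_water_balance(inflow, outflow, d_storage):
    """Split the mass balance residual equally between inflow and outflow
    so that inflow - outflow - d_storage = 0."""
    residual = inflow - outflow - d_storage
    return inflow - residual / 2.0, outflow + residual / 2.0

# Invented numbers: streamflow model and reservoir simulator disagree.
inflow = bias_correct(132.0, bias=7.0)     # national-scale streamflow model
outflow = 95.0                             # reservoir simulator release
d_storage = 24.0                           # observed storage change

inflow, outflow = close_water_balance(inflow, outflow, d_storage)
print(inflow - outflow - d_storage)        # residual is now exactly zero
```

Real twins use far more sophisticated constrained solvers, but the principle is the same: the AI layer reconciles disagreeing components so the combined system still conserves water.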

What should this twin actually do? Help a dam operator decide whether to release water ahead of a storm. Tell planners where a new data center can draw cooling water without drying up a stream. Flag which coastal defenses will fail first under rising seas.

A water digital twin earns its keep when it makes the consequences of choices visible, in terms of flows, levels, temperatures, and risks to people and ecosystems.

Turn Basins into AI Test Beds

The Genesis Mission promotes AI-directed experimentation and directs the DOE to keep a record of robotic laboratories and production facilities in which such experimentation could be conducted. Hydrological field sites belong in that inventory. The National Ecological Observatory Network already operates 81 sites with standardized measurements of meteorology, surface water, groundwater, and biodiversity. The Critical Zone Collaborative Network instruments catchments to track water-soil-vegetation interactions over decades.

Formalizing these networks as AI test beds would link field observations back into the water digital twin so that experiments and models continually sharpen each other. Imagine mobile sensors steered by AI agents during a storm or aquifer recharge experiments designed by algorithms and verified in real time. That feedback loop is what separates a useful model from a decorative one.

Expand Water Challenges on the Genesis Mission List

Earlier this year, the DOE released 26 Genesis Mission Science and Technology Challenges, and “Predicting U.S. Water for Energy” was among them. The accompanying funding call (DE-FOA-0003612) solicits proposals on cloud microphysics, coupled surface water–groundwater modeling, and seasonal to multiyear prediction, all framed around energy needs and flood resilience.

These inclusions are a significant win for hydrology’s place in Genesis, but several urgent challenges sit outside their scope. Can AI close the gap between a flood forecast issued 12 hours out and the 48 hours emergency managers actually need? Can it map compound extremes, in which drought, heat, and infrastructure failure collide in the same week? Can it redesign monitoring networks so that coverage follows risk rather than where gauges happened to be installed a century ago? Integrating energy and water systems is equally urgent: Floods have caused 80% of major U.S. grid outages since 2000, while drought-driven water stress curtails cooling at thermoelectric plants and reduces hydropower output, exposing how deeply energy infrastructure depends on hydrologic extremes.

The water footprint of new AI infrastructure deserves a place on that list. A separate executive order (14318, “Accelerating Federal Permitting of Data Center Infrastructure”) is already fast-tracking data center construction, and a single hyperscale facility can consume 1 million to 5 million gallons of water daily. Emerging research shows how withdrawals at that scale can push streams below ecological thresholds during low flows.

The Exchange and What’s at Stake

Allowing water security to flow through the diverse components of the Genesis Mission would benefit both the policies championed by the mission itself and the hydrology community.

The Genesis Mission gets real-world, noisy test beds where AI proves value beyond benchmarks, a domain to stress test climate and infrastructure investments, and scientists trained in both AI and the stubborn realities of rivers, aquifers, and pipes.

Hydrology gets resources for shared data infrastructure, foundation models and instrumented basins no single lab can support, a seat when rules for AI and national scientific infrastructure are negotiated, and a chance to reset practices around openness, collaboration, and equity.

Make Hydrology the Conscience of AI Governance

The EO directs the DOE to set data access rules and clarify policies for ownership, licensing, trade secret protections, and commercialization of products and tools associated with the mission.

Three principles should anchor such policies for AI use in water security.

First, Indigenous and community data rights must be embedded in every major AI water security effort, in line with the collective benefit, authority to control, responsibility, and ethics (CARE) principles for Indigenous data governance.

Second, AI’s own water footprint, through electricity generation and cooling, must be treated as a design constraint. Transparent reporting, stress-based siting, and efficiency targets will prevent hydrology in Genesis from being self-defeating.

Third, the DOE should define what failure looks like. Missing a flood crest can cost lives and livelihoods and breach treaties. Accountability standards must be measurable, and they must ask not just how accurate the forecast was on average, but who bore the cost when it was wrong.

A single executive order will not solve the country’s water security problems, and a single challenge topic will not either.

But the Genesis Mission has provided a seat at a table that did not exist 6 months ago. Whether the hydrology community treats it as a ceiling or a foundation depends on what happens next. Europe’s Destination Earth and the United Nations’ AI for Good water initiatives are already moving.

American hydrology now has a seat at the table. We should take it.

Recommended Resources

Carroll, S. R., et al. (2020), The CARE principles for Indigenous data governance, Data Sci. J., 19, 43, https://doi.org/10.5334/dsj-2020-043.

European Commission (2023), Destination Earth: Digital Twins and the Digital Twin Engine, Publ. Off. of the Eur. Union, Luxembourg, destination-earth.eu/destination-earth/destines-components/digital-twins-digital-twin-engine/.

Google Research (2024), Flood forecasting and Flood Hub, Google Research Technical Overview, sites.research.google/gr/floodforecasting/.

International Telecommunication Union (2024), AI for Good: Water and sanitation, aiforgood.itu.int/aifg-course/harnessing-ai-for-sustainable-innovation-sdg6-advancing-clean-water-and-sanitation/.

Kratzert, F., et al. (2019), Toward improved predictions in ungauged basins: Exploiting the power of machine learning, Water Resour. Res., 55, 11,344–11,354, https://doi.org/10.1029/2019WR026065.

Kratzert, F., et al. (2023), Caravan: A global community dataset for large-sample hydrology, Sci. Data, 10, 61, https://doi.org/10.1038/s41597-023-01975-w.

Li, P., et al. (2023), Making AI less “thirsty”: Uncovering and addressing the secret water footprint of AI models, Commun. ACM, 66, 28–31, cacm.acm.org/sustainability-and-computing/making-ai-less-thirsty/.

The White House (2025a), Accelerating Federal Permitting of Data Center Infrastructure, Executive Order 14318, Washington, D.C., www.whitehouse.gov/presidential-actions/2025/07/accelerating-federal-permitting-of-data-center-infrastructure.

The White House (2025b), Launching the Genesis Mission, Executive Order 14363, Washington, D.C., www.whitehouse.gov/presidential-actions/2025/11/launching-the-genesis-mission.

Xiao, T., et al. (2025), Environmental impact and net-zero pathways for sustainable artificial intelligence servers in the USA, Nat. Sustainability, 8, 1,541–1,553, https://doi.org/10.1038/s41893-025-01681-y.

Zhang, L., et al. (2025), Foundation models as assistive tools in hydrometeorology: Opportunities, challenges, and perspectives, Water Resour. Res., 61, e2024WR039553, https://doi.org/10.1029/2024WR039553.

Author Information

Amobichukwu C. Amanambu (acamanambu@ua.edu), Department of Geography and the Environment, The University of Alabama, Tuscaloosa; and Jonathan Frame (jmframe@ua.edu), Department of Geological Sciences, The University of Alabama, Tuscaloosa

Citation: Amanambu, A. C., and J. Frame (2026), The Genesis Mission needs hydrology: Here’s how to incorporate it, Eos, 107, https://doi.org/10.1029/2026EO260131. Published on 28 April 2026.
This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s).
Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Trump Terminates Entire National Science Board

27 April 2026 at 14:44
Silhouettes of people in lavender and periwinkle stand, some overlapping, on an aubergine-colored background. Overlying the image at the bottom is the text “R&D Research and Developments.”

Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

The Trump Administration has terminated the positions of every member of an independent board meant to govern the National Science Foundation (NSF).

The National Science Board directs and approves large funding decisions for NSF’s approximately $9 billion basic science research budget. It is meant to function independently from the federal administration to keep science funding insulated from political pressure and budget cycles.

In a 24 April notice from the Presidential Personnel Office, all the scientists serving on the board were informed their positions had been eliminated. The emails dismissing board members provided no reason for the termination.

“I am deeply disappointed, though I cannot say I am entirely surprised,” Willie E. May, one of the terminated board members and vice president of research and economic development at Morgan State University in Maryland, told The New York Times.

“I have watched the systematic dismantling of the scientific advisory infrastructure of this government with growing alarm, and the National Science Board is simply the latest casualty,” he said. 

Ranking member of the House Committee on Science, Space, and Technology Zoe Lofgren (D-CA) called the terminations “the latest stupid move made by a president who continues to harm science and American innovation.”  

The terminations come after a year of upheaval for higher education and research budgets. Last year, NSF granted 51% less funding to scientists than the 2015–2024 average and terminated hundreds of active grants. Last May, the Trump administration proposed cutting $5 billion from NSF’s budget, though the proposal was rejected. The president’s budget request for fiscal year 2027 once again proposes to reduce the foundation’s budget by more than half. In a February 2026 meeting of the National Science Board, NSF leadership said the foundation was seeking to reduce grant solicitations.

The Trump administration has also restructured scientific advisory groups elsewhere in the federal government, eliminating 152 federal advisory committees at science agencies, merging all of the Department of Energy’s advisory committees into one, and dismantling the Environmental Protection Agency’s research office.

“Without a functional National Science Board in the near term, the agency is left without the guidance and oversight of independent experts, and the public is left without information on how NSF is carrying out its mission,” Gretchen Goldman, president and CEO of the Union of Concerned Scientists, wrote in a blog post about the terminations. 

—Grace van Deelen (@gvd.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org.

Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Tracing the Path of PFAS Across Antarctica

27 April 2026 at 13:16
An iceberg sits in a rough, partially frozen sea near Antarctica.

Per- and polyfluoroalkyl substances (or PFAS) have been widely used in thousands of common nonstick, waterproof, or stain-resistant products since the 1950s. These “forever chemicals” do not break down easily: PFAS make their way into the air, soil, and water, as well as into human and animal bloodstreams and to some of Earth’s most pristine environments. They have been detected even in Antarctica, despite its reputation as a relatively untouched landscape far from the types of products—fast-food wrappers, firefighting foam, nonstick cookware—that contain PFAS.

Research into how PFAS arrive in Antarctica is limited, and most tends to focus on the continent’s coasts rather than its interior. A new study published in Science Advances aimed to fill some of these gaps by examining PFAS accumulation across a 1,200-kilometer stretch of Antarctica, from the snow pits of Zhongshan Station in East Antarctica to the 4,093-meter peak of Dome A. By analyzing layers of snow collected from the coast to the interior, researchers sought to better track how PFAS levels vary by location and to understand how these forever chemicals travel long distances through the upper atmosphere to be deposited in remote regions.

“For substances to get there, they have to be able to transport long distances,” said Ian Cousins, a chemist at Stockholm University and one of the study’s authors. “We know PFAS are very persistent, so that helps. By looking at the patterns of the PFAS contamination in the samples, it gives us clues as to how they’re transported.”

PFAS Arrive by Air and by Sea

Along the 1,200-kilometer route, researchers from the Chinese Academy of Sciences collected 39 snow samples at 30-kilometer intervals, scraping the first few centimeters of snow from the surface to analyze for traces of PFAS.

Zhongshan Station sits near Prydz Bay, and there, researchers collected snow from a 1-meter-deep pit, with samples taken every 5 centimeters. At Dome A, the summit of the East Antarctic Ice Sheet, samples were collected at 10-centimeter intervals from another snow pit; this one was 3 meters deep, providing information about the past several decades of PFAS use.

“It’s quite interesting that we see the historical production record of PFAS in this pit on the top of this mountain in Antarctica,” said Cousins.

PFAS pollution arrives in Antarctica in two ways: via upper atmospheric transport and sea spray. Some PFAS are formed in the atmosphere when volatile precursor chemicals like fluorotelomer alcohols used in textile and paper products break down through reactions with sunlight and oxidants into more stable compounds. The resulting PFAS are later deposited into the snow and ice through precipitation.

Storm winds near the coast create sea spray. “When you have waves, it makes bubbles in the ocean. When the bubbles burst, these sea spray aerosols can be super enriched with PFAS. This has been shown to be a very important transport route,” Cousins said.

To distinguish between sources, researchers measured sodium in the snow to trace the ocean’s salty influence. Sodium levels decreased farther inland, reflecting the fading influence of sea spray toward the interior of the continent. But surprisingly, PFAS concentrations actually increased moving from the coast into the interior.

“That was kind of a bit counterintuitive to me,” explained Cousins, who said he expected PFAS levels to be highest near the coast. “You see the opposite, actually.”

The inland increase is likely explained by higher snowfall totals in the coastal regions, which lead to PFAS concentrations becoming diluted. Inland, where snowfall is lower, even small amounts of PFAS can result in relatively higher concentrations within snow samples.
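The dilution argument is simple arithmetic: for a fixed deposition flux, the concentration in snow scales inversely with snowfall. The numbers below are invented for illustration and are not the study's measurements.

```python
def snow_concentration(deposition_flux, snowfall):
    """Concentration in snow (ng/L) for a fixed PFAS deposition flux
    (ng per m^2 per year) diluted into snowfall (L water equivalent
    per m^2 per year)."""
    return deposition_flux / snowfall

flux = 50.0  # ng m^-2 yr^-1; same invented flux at coast and interior
coastal = snow_concentration(flux, snowfall=500.0)  # high accumulation
inland = snow_concentration(flux, snowfall=50.0)    # low accumulation
print(coastal, inland)  # tenfold less snowfall, tenfold higher concentration
```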

Additional factors shape PFAS distribution. PFAS levels in snow are higher at the onset of precipitation events, when the compounds are rapidly scavenged from the air. Temperature inversions, too, can trap chemicals. In coastal areas, PFAS are more influenced by sea spray in the winter, whereas stronger sunlight drives the degradation of atmospheric precursors into PFAS in the summer months.

PFAS Presence at Both Poles

This new study also offers implications for the way that PFAS circulate globally. Though industrial activity in the Northern Hemisphere contributes most heavily to PFAS emissions, large-scale atmospheric circulation allows these compounds to reach polar regions. Rapid transport in the upper troposphere may act as an efficient pathway to shuttle PFAS across both hemispheres before they are deposited in the cold, remote regions at both ends of Earth.

Even though PFAS levels are higher in the Arctic, both polar regions show similar trends in PFAS concentrations since the 1990s. “It really matches decades of the same records that have been reported from the Arctic,” said Cora Young, an atmospheric chemist at York University, who was not involved in the new study.

“This completes the global picture with agreeing measurements at both poles, solidifying our understanding of the global distribution and drivers of PFAS contamination. The role of CFC [chlorofluorocarbon] replacements, changes in regulation, all of these things are important in the Northern Hemisphere and also the Southern Hemisphere,” said Young.

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2026), Tracing the path of PFAS across Antarctica, Eos, 107, https://doi.org/10.1029/2026EO260129. Published on 27 April 2026.
Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.