Research & Developments is a blog for brief updates that provide context for the flurry of news that impacts science and scientists today.
In the contiguous United States, crop irrigation, municipal water supplies, and thermoelectric power generation use more than 224 billion gallons of fresh water every day. Until now, conducting water research or making decisions about water use often required referencing datasets across various agencies. The U.S. Geological Survey (USGS) National Water Availability Assessment Data Companion (NWDC), announced this week, aims to streamline this process. In part, the tool is designed to help decisionmakers better understand how the balance between high demand and limited supply affects water availability in their communities.
“While the United States has abundant water nationally, regional imbalances between supply and demand may create water challenges affecting millions of Americans,” said lead scientist Shirley Leung in a USGS press release. “What once required significant resources and time can now be done in minutes, giving communities of all sizes the same foundation for water planning.”
The lower 48 states are home to about 80,000 sub-watersheds, from those in the arid southwest to the Great Lakes Basin, where about 84% of North America’s surface fresh water is located. According to the USGS, the NWDC is the first tool that integrates information about water availability in individual watersheds at a national scale.
The tool is designed to complement Water Data for the Nation (WDFN), another USGS product that consolidates observational data from the agency’s thousands of local monitoring stations gathering data on streams, lakes, reservoirs, precipitation, water quality, and groundwater. The new tool uses modeling to fill in spatial and temporal gaps between the observations made at these stations.
Water managers, researchers, agricultural experts, and others can use the NWDC to compare watershed conditions, identify seasonal patterns in water use, or to create data visualizations of statewide water use, for example. Though the tool currently covers only the contiguous United States, it will soon be extended to Alaska, Hawaii, and Puerto Rico, according to the USGS.
David Tarboton, a professor of civil engineering at the Utah Water Research Laboratory, said he was “intrigued” by the new tool, and is interested in examining the data its model produces.
While Tarboton was disappointed that the tool’s most recent available data are from 2020, “having a sort of integrated, wall-to-wall dataset that’s consistently produced is very valuable,” he said. He works, in part, in the areas of hydroinformatics and data sharing, and noted that the modern methods the agency is using to share the data could be useful in developing automated tools.
These updates are made possible through information from the scientific community. Do you have a story about science or scientists? Send us a tip at eos@agu.org.
The spread of antibiotic resistance, a growing threat to global health that causes millions of deaths annually, is typically blamed on the overuse of drugs in hospitals and in the food industry. However, a new study published in Nature Microbiology suggests that normal geological processes could be accelerating the development of new resistances.
Soil microorganisms naturally produce antibiotics as a form of chemical warfare to compete with each other. When soils dry out, these natural compounds become more concentrated because there is less water to dilute them. Like a dosage increase, this concentration can create a harsher environment, killing sensitive microbes and sparing those with the capacity to resist. This phenomenon, in turn, is an evolutionary driver that favors the emergence of new and more effective resistance genes.
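As a rough numerical illustration of this dilution effect (the figures below are hypothetical, not from the study), the concentration of a fixed mass of antibiotic rises in inverse proportion to the remaining pore water:

```python
# Hypothetical illustration of the dilution effect described above:
# a fixed mass of antibiotic dissolved in shrinking soil-pore water.
def concentration(mass_ug: float, water_ml: float) -> float:
    """Concentration in micrograms per milliliter of pore water."""
    return mass_ug / water_ml

wet = concentration(2.0, 10.0)  # moist soil: 2 ug in 10 mL of pore water
dry = concentration(2.0, 0.5)   # desiccated soil: same 2 ug in 0.5 mL
print(f"concentration rises {dry / wet:.0f}x as the soil dries")
```

The same mass of compound in a twentieth of the water yields twenty times the concentration, which is the "dosage increase" the researchers describe.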
“If you have more antibiotics in your environment, only the organisms that can withstand it…can resist it.”
To test whether this mechanism is having real genetic effects, Xiaoyu Shan, a microbial ecologist and postdoctoral researcher at the California Institute of Technology (Caltech), and colleagues looked at soil samples under controlled conditions as the samples transitioned from a wet state to a desiccated one. They found that as the soil dried, the presence of genes related to antibiotic production and resistance spiked, suggesting that drought leads to a rapid escalation in the subterranean biological arms race. Importantly, they did not look for pathogenic bacteria specifically, only for resistance genes, which can be present in a variety of microbes, whether those microbes are pathogenic or not.
“Drought leads to this elevation of antibiotic producers and bacteria that are resistant,” said team member Dianne Newman, a professor of biology and geobiology also at Caltech. “It’s a pretty simple idea: If you have more antibiotics in your environment, only the organisms that can withstand it…can resist it.”
Alternative Explanations
However, there could be other potential explanations for the observed increase in antibiotic production and antibiotic resistance genes, according to Enrique Monte, a microbiologist at the Universidad de Salamanca in Spain who wasn’t involved with the new study. For instance, arid soils are naturally more diverse than humid soils, making it common to find a more diverse gene pool in the ground, Monte said. In addition, the mere presence of antibiotic genes might not result in an actual release to the environment, or a release could happen in dosages that are too small to cause noticeable effects. “There are antibiotics that are volatile; they escape into the air, so they never reach a therapeutic concentration to kill others,” Monte said.
The authors, however, took some precautions to show that the increase in antibiotic resistance genes was actually a biological response to environmental stress. For instance, they also tracked other genes that should remain unaffected or decline under desiccation. As expected, genes that are needed for basic survival remained stable, while genes responsible for bacterial movement declined in dry soil, where mobility is restricted. Even some species that were not favored by desiccation saw an increase in resistance-related genes, “which is even stronger evidence,” Shan said.
Geographic Limitations
As the researchers combed through publicly available metagenomic data libraries, they had to select collections with strict control of all variables and in which the only changing factor was water content. That limited the analysis to five locations: two grasslands and a sorghum field in California; a forest in Valais, Switzerland; and a wetland in Nanchang, China.
The scarcity of locations might limit how generalizable these results are, said Fiona Walsh, a microbiologist at Maynooth University in Ireland who was not involved with the work. “There are thousands of high-quality metagenomes available online with excellent metadata. I would really like to see a comparison where they apply their analysis to a broader map of global metagenomic data to see if they reach the same conclusions,” she said.
From the Soil to the Hospital
Drier regions consistently showed a higher number of resistant bacteria cases in hospitals, even after adjusting for confounding factors such as local income.
The study also suggests that dry soils might be a hidden driver of clinical cases of antibiotic resistance worldwide. The authors combined hospital data on the number of cases of resistant infections from 116 countries with the local aridity index, which measures temperature and precipitation, for each location. They found a strong correlation: Drier regions consistently showed a higher number of resistant bacteria cases in hospitals, even after adjusting for confounding factors such as local income.
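The kind of association the authors report can be illustrated with a rank correlation on made-up numbers. This is a sketch of the general method, not the study's actual analysis or data:

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical country-level data: higher aridity score means drier climate.
aridity = np.array([0.1, 0.3, 0.5, 0.7, 0.9, 1.2])
resistance = np.array([4.0, 6.5, 7.0, 11.0, 12.5, 15.0])  # cases per 100k
rho = spearman(aridity, resistance)
print(f"Spearman rho = {rho:.2f}")  # near +1: drier tracks with more cases
```

A rank correlation like this captures a monotonic relationship without assuming linearity, but as the next paragraph notes, it cannot by itself establish causation.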
However, the authors admitted that this is only a correlation and doesn’t prove causation. “It motivates follow-up research to see how environmental concentration weighs against human overuse and poor stewardship,” Newman said.
Even this correlation could be a stretch, according to microbiologist Sara Soto, head of the Global Viral and Bacterial Infections Programme at the Instituto de Salud Global de Barcelona. At the end of the day, she said, the authors have soil data from only five locations in three countries, and they are not tracking the specific bacterial varieties that make people sick, only resistance genes.
For the thesis to be solid, Soto said, the ideal approach would have been to contrast hospital strains from a specific area with soil data from that same region during the same drought episode. “Making such a vast inference—that what happens in the soil of one location affects what happens in a hospital elsewhere—is a big leap,” she said.
The authors, however, point out that resistance genes from soils can eventually make their way into human pathogens. Microbes have the capacity to share genetic material across species—a process known as horizontal gene transfer. In their analysis, the team identified specific resistance sequences that appeared to have been transferred between soil bacteria relatively recently, perhaps within the past decade. How they are reaching hospitals remains a matter for a future study, they said.
As droughts increase in numerous regions in the face of climate change, this selective pressure within soil ecosystems is expected to intensify. Though these findings do not show that drought directly puts drug-resistant pathogens in hospitals, they still suggest that a drying climate could set the scene for an increase in antibiotic resistance, the researchers report.
Citation: Barbuzano, J. (2026), Antibiotic resistance might get a boost from droughts, Eos, 107, https://doi.org/10.1029/2026EO260132. Published on 29 April 2026.
Editors’ Vox is a blog from AGU’s Publications Department.
The supply of magma from the Earth’s mantle is a primary source of heat to volcanic arc crust, where the heat is then dissipated through various processes. Much of this magmatic heat is dissipated as heated water, or aqueous fluid.
A new article in Reviews of Geophysics compares 11 different volcanic-arc segments where heat discharge via aqueous fluid has been well-inventoried to better understand the factors that influence this process. Here, we asked the authors to give an overview of heat discharge from volcanic arcs, how scientists measure it, and what questions remain.
Why is it important to study how heat is dissipated from volcanic arcs?
The heat from these magmas matters for geothermal energy, patterns of groundwater flow, and the patterns of volcanic activity at the surface.
Volcanic arcs are the chains of volcanoes on top of subduction zones. They can produce some of Earth’s most explosive and hazardous eruptions. But much of the magma beneath the surface never erupts. Nevertheless, the heat from these magmas—and the simple fact of their existence and abundance—still matters for geothermal energy, patterns of groundwater flow, and the patterns of volcanic activity at the surface.
What are the main modes in which heat is discharged from volcanic arcs?
Heat at volcanic arcs can be carried by magmas, transmitted through the crust conductively, and carried by water seeping slowly through the crust. At the base of the crust, magmas are probably most important, with conduction coming in second. But as magmas move upwards through the crust, some of them solidify and impart their heat to their surroundings where it is transferred by conduction. Within a few kilometers of the surface, fluids seeping through the crust begin to take up all that heat, and so if we can quantify the heat carried by those fluids, we can retrace it to its origins in magmas.
How do scientists measure these different forms of heat loss?
Scientists measure the heat carried by erupting magmas using satellites, or by adding up the erupted masses and making an estimate of how much energy was released by cooling from eruption temperatures to solid igneous rocks at Earth’s surface. Conductive heat flow is measured by drilling holes in the Earth’s crust to see how quickly it gets hotter with depth.
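For the erupted-mass approach, the arithmetic is straightforward. A minimal sketch, using typical textbook values for magma properties rather than numbers from the review, might look like this:

```python
# Back-of-the-envelope energy released as erupted magma cools to surface
# temperature, following the mass-based approach described above.
# Property values are typical textbook numbers, not from the review.
CP_MAGMA = 1000.0  # specific heat of magma, J/(kg K)
LATENT = 4.0e5     # latent heat of crystallization, J/kg

def cooling_energy_joules(mass_kg, t_eruption_c, t_surface_c=25.0):
    """Sensible plus latent heat released as magma cools and solidifies."""
    return mass_kg * (CP_MAGMA * (t_eruption_c - t_surface_c) + LATENT)

# Roughly one cubic kilometer of basalt (~2.7e12 kg) erupted at ~1100 C:
e = cooling_energy_joules(2.7e12, 1100.0)
print(f"~{e:.1e} J released")  # on the order of 10^18 J
```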
Measuring the heat carried by aqueous fluids in the crust is in some ways the trickiest. One approach is to find all the springs where hot or even slightly warm water is trickling out and measure the temperature and discharge to estimate how much extra heat that water is carrying.
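That spring-inventory calculation can also be sketched in a few lines. The water properties below are standard, but the example spring is hypothetical:

```python
# Advective heat carried by a spring, estimated from its discharge and
# temperature anomaly as described above. The example spring is made up.
RHO_WATER = 1000.0  # density of water, kg/m^3
CP_WATER = 4186.0   # specific heat of water, J/(kg K)

def spring_heat_watts(discharge_m3_s, t_spring_c, t_background_c):
    """Excess heat flux (W) relative to local background temperature."""
    return RHO_WATER * CP_WATER * discharge_m3_s * (t_spring_c - t_background_c)

# A 10 L/s spring at 35 C where background groundwater is 10 C:
q = spring_heat_watts(0.010, 35.0, 10.0)
print(f"{q / 1e3:.0f} kW")  # roughly 1 MW of advected heat
```

Summing such estimates over every warm spring in an arc segment gives the hydrothermal heat discharge that the review compares across arcs.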
What are the challenges and uncertainties in measuring hydrothermal heat discharge?
One challenge is that many springs are only slightly warmer than you’d expect. There are good data for many hot springs, but data tracking these ‘slightly warm’ springs exist for only a subset of arcs. Another challenge is that warm underground fluids can flow laterally, so you have to try to account for that. A further big uncertainty for our study, though not an uncertainty in hydrothermal discharge itself, lies in the estimates of how much magma has actually erupted through time, which we needed in order to quantify the proportion of magmas that freeze underground versus erupting.
What are some of the factors that influence hydrothermal heat loss?
A major goal of our paper is to try to quantify these hidden magmas.
A main factor that influences hydrothermal heat loss is the magmas that solidify underground. This link is the key motivation for our study. A major goal of our paper is to try to quantify these hidden magmas—how much magma needs to intrude the crust beneath the surface to supply the hydrothermal heat fluxes that we observe? The composition of magmas influences how much heat they can release. The depth at which magmas are emplaced also matters, because magmas that intrude the shallow crust eventually cool to lower temperatures than magmas emplaced in the lower crust and therefore release more heat.
What are the remaining questions or knowledge gaps where additional research efforts are needed?
A big outstanding challenge is combining estimates from hydrothermal data of how much magma is coming into the crust – like ours – with other approaches to estimating the same thing. The magmatic systems beneath volcanoes span the crust. At the base of the crust, you have magma supply, sort of like the water main feeding your plumbing system. Despite how fundamental magma supply is, we know remarkably little about it. It’s exciting to think about how the rates of magma supply could vary through time and space and why. Applying a range of techniques—including geophysical imaging, hydrothermal budgets, gas and igneous geochemistry, and petrology—to understand these questions could really be a game changer.
Editor’s Note: It is the policy of AGU Publications to invite the authors of articles published in Reviews of Geophysics to write a summary for Eos Editors’ Vox.
Citation: Black, B. A., S. E. Ingebritsen, and K. Sawayama (2026), Hydrothermal heat flow as a window into subsurface arc magmas, Eos, 107, https://doi.org/10.1029/2026EO265017. Published on 28 April 2026.
This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s).
Every chip fabricated in a semiconductor plant needs ultrapure water. Most nuclear reactors need water as a coolant and neutron moderator. Every artificial intelligence (AI) data center drinks between 1 million and 5 million gallons of water a day, with thirst often peaking during drought.
Water runs through every technology priority the United States has named, yet the word does not appear once in “Launching the Genesis Mission,” an executive order (EO) released in November 2025. As described in the EO, the Genesis Mission is a “dedicated, coordinated national effort to unleash a new age of AI-accelerated innovation and discovery that can solve the most challenging problems of this century.”
Led by the Department of Energy (DOE), the initiative aims to build an integrated AI framework that would harness federal scientific datasets to accelerate breakthroughs in advanced manufacturing, biotechnology, critical materials, nuclear fission and fusion energy, quantum information science, and semiconductor development. The scope of the mission is comparable to that of the Manhattan Project.
Since the announcement, the DOE has listed “Predicting U.S. Water for Energy” among its 26 Genesis Mission Science and Technology Challenges. The project is also soliciting proposals in three water-related focus areas.
This framework provides a foothold for hydrology in the Genesis Mission, but it is scoped narrowly around water as a supply variable for energy production.
In reality, water is a crosscutting constraint that will help determine whether the mission’s priorities translate into deployable outcomes. The hydrology community now has a seat at the table, and if it moves first and positions water security as one of the “most challenging problems of this century,” the Genesis Mission can become the sandbox in which AI reshapes how the country measures, models, and manages water.
Making this happen will require that the DOE and the Office of Science and Technology Policy charter a hydrology workstream inside the Genesis Mission, with interagency delivery involving the U.S. Geological Survey (USGS), NOAA, the Bureau of Reclamation, the EPA, and partners at state, regional, and community levels. Here is what we think that workstream should look like:
A water-centric Genesis Mission architecture supports seven hydrological components that both feed into and receive decisions from the Genesis AI platform. Each component maps to a section of this article. Credit: Amobichukwu C. Amanambu.
While the existing challenges reflect some of these components, others will require coordinated effort from the hydrology community to bring into the Genesis Mission’s scope.
Build the Water Corpus Genesis Will Need
The Genesis Mission EO instructs the DOE to create an American Science and Security Platform to provide the public, scientists, agencies, and policymakers access to crucial scientific datasets.
The good news is that accessible water data systems already exist across several federal agencies and academic research centers. The USGS National Water Information System tracks real-time and historical water quality and use across the country. NASA’s Earth Science Data Systems Program provides open access to Earth science observations. NOAA’s National Water Center, the first federal facility dedicated to national water resource forecasting, operates the National Water Model, which continuously forecasts flows on 2.7 million stream reaches across the continental United States. The Catchment Attributes and Meteorology for Large-Sample Studies (CAMELS) dataset, currently hosted by the National Center for Atmospheric Research, provides data tailored for hydrological research on hundreds of river basins, and the Caravan framework pulls together multiple large-sample meteorological and hydrological datasets at a global scale.
What is missing is a unified, AI-ready repository that brings federal, state, and community data together.
What is missing is a unified, AI-ready repository that brings federal, state, and community data together. Building one is hard. Water data are fragmented, inconsistent, and often entirely absent. Consistent, reliable data for groundwater, withdrawals, reservoir operations, and water quality are especially difficult to obtain.
Local resistance to sharing data is real. In Texas, for example, landowners hold private property rights over groundwater and have opposed metering and reporting requirements imposed by groundwater conservation districts. In California, agricultural well owners fought metering mandates for years before the Sustainable Groundwater Management Act compelled local agencies to begin tracking withdrawals. Tribal nations face a different concern: Water data collected on Indigenous lands have been misrepresented in federal datasets that were modeled without accounting for Indian country, leading many nations to restrict access to their data as an exercise of sovereignty.
Practical steps toward building a unified AI-ready repository include tiered access and licensing for different stakeholders, clear provenance tracking for all data reported, financial and educational incentives for stakeholders for reporting, and targeted gap filling. Where measurements are missing, AI can fuse remote sensing with gauged records and operational logs—but only if the results carry honest uncertainty estimates tied to real decisions.
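A minimal sketch of the gap-filling-with-uncertainty idea, using synthetic data and a simple linear fit in place of an actual AI fusion model, might look like this:

```python
import numpy as np

# Sketch of gap filling with an honest uncertainty estimate: use an
# auxiliary series (standing in for a remote sensing proxy) to predict
# missing gauge values, and report the fit's residual spread alongside
# each estimate. All data here are synthetic.
rng = np.random.default_rng(0)
proxy = rng.uniform(1.0, 10.0, 50)                   # proxy observations
gauge = 2.0 * proxy + 1.0 + rng.normal(0, 0.3, 50)   # in-situ gauge record

# Fit gauge ~ a * proxy + b on the days both series are available.
a, b = np.polyfit(proxy, gauge, 1)
residual_sd = np.std(gauge - (a * proxy + b))

def fill_gap(proxy_value):
    """Return (estimate, 1-sigma uncertainty) for a day with no gauge data."""
    return a * proxy_value + b, residual_sd

est, sd = fill_gap(5.0)
print(f"estimate {est:.1f} +/- {sd:.1f}")
```

The point of the sketch is the second return value: a filled gap without an attached uncertainty is indistinguishable from a measurement, which is exactly the trap an AI-ready repository needs to avoid.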
Get the corpus right, and it will outlive any single program name. It becomes infrastructure the country can lean on.
Develop Shared Hydrologic Foundation Models
The Genesis Mission EO directs the DOE to provide “domain-specific foundation models across the range of scientific domains covered.”
Hydrology has a head start. Long short-term memory (LSTM) networks are a type of neural network designed to retain information across thousands of time steps. Hydrology LSTMs trained on CAMELS data have already matched traditional conceptual models for daily streamflow prediction. Open-source NeuralHydrology tools serve as baselines for regional runoff prediction. These tools may serve as precursors to the foundation models the Genesis Mission envisions and as building blocks from which those models could be developed.
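To illustrate the gating mechanism that lets LSTMs carry information across long sequences, here is a single LSTM cell forward pass written from scratch in NumPy. The weights are random and purely illustrative, unlike the trained models described above:

```python
import numpy as np

# Minimal LSTM cell forward pass. The forget gate decides how much of the
# running memory (the cell state) to keep each day, which is what lets
# the network track slow stores like snowpack across a long record.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One time step. W, U, b stack the input/forget/output/candidate parts."""
    z = W @ x + U @ h + b          # all four gate pre-activations at once
    n = h.size
    i = sigmoid(z[0:n])            # input gate: how much new info to write
    f = sigmoid(z[n:2 * n])        # forget gate: how much memory to keep
    o = sigmoid(z[2 * n:3 * n])    # output gate: how much memory to expose
    g = np.tanh(z[3 * n:4 * n])    # candidate cell state
    c_new = f * c + i * g          # memory update
    h_new = o * np.tanh(c_new)     # hidden state (the cell's output)
    return h_new, c_new

rng = np.random.default_rng(42)
n_in, n_hid = 3, 4                 # e.g., 3 daily forcings, 4 hidden units
W = rng.normal(0, 0.1, (4 * n_hid, n_in))
U = rng.normal(0, 0.1, (4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(100):               # run a 100-step synthetic sequence
    x = rng.normal(size=n_in)      # stand-in for precip, temperature, etc.
    h, c = lstm_step(x, h, c, W, U, b)
print("hidden state after 100 steps:", np.round(h, 3))
```

A production hydrology LSTM adds trained weights, stacked layers, and a regression head mapping the hidden state to streamflow, but the recurrence above is the core of the architecture.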
The process of scaling up these tools is not straightforward, however. A hydrologic investigation of snowmelt-driven streams in Colorado will not require the same spatiotemporal data as tile-drained fields in Iowa, for example. A hydrology-specific foundation model must take nuanced requirements into consideration and provide a clear path for managing and exploiting a variety of datasets.
Google’s Flood Hub shows what is possible: Its AI-enabled flood forecasts now cover more than 80 countries. However, Flood Hub’s core model code and trained weights remain proprietary, meaning researchers can use the forecasts but cannot rebuild or adapt the underlying models. Genesis, if well positioned, can fill that accessibility gap by producing foundation models for water that are reusable, reliable, and openly governed.
Build a National Water Digital Twin
The EO prescribes an integrated AI platform combining foundation models with simulation tools to stimulate AI-enabled innovations.
That architecture is exactly what a digital twin requires. Europe’s Destination Earth initiative is already building digital twins for weather extremes and nonstationary conditions on the Large Unified Modern Infrastructure (LUMI) supercomputer. The United Nations–led AI for Good initiative has prioritized water applications, warning that fragmented national efforts risk duplicating work.
If the United States aims for global strategic leadership in AI-accelerated science, water infrastructure cannot be an afterthought.
A water digital twin earns its keep when it makes the consequences of choices visible, in terms of flows, levels, temperatures, and risks to people and ecosystems.
Rather than starting from scratch, a water-centric Genesis Mission would unite existing federal models—the National Water Model, reservoir simulators, and groundwater codes—in a single digital twin. AI can become the thread that stitches them together, correcting biases and providing numerical solvers to enforce mass and energy balance.
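The mass-balance enforcement idea can be sketched in miniature. This toy example simply nudges a modeled outflow so that a reservoir water balance closes; the numbers are illustrative and come from no federal model:

```python
# Toy sketch of mass-balance enforcement when stitching models together:
# nudge the modeled outflow so inflow - outflow - storage change = 0.
def close_balance(inflow, outflow, storage_change):
    """Return a corrected outflow that exactly closes the water balance."""
    residual = inflow - outflow - storage_change  # imbalance the models left
    return outflow + residual

# Two upstream models disagree slightly, so the raw terms don't balance.
inflow, outflow, d_storage = 120.0, 95.0, 20.0  # m^3/s equivalents
corrected = close_balance(inflow, outflow, d_storage)
print(f"raw residual: {inflow - outflow - d_storage:+.1f}, "
      f"corrected outflow: {corrected:.1f}")
```

A real digital twin would distribute such corrections across many terms with physically informed weights rather than dumping the whole residual into one flux, but the constraint being enforced is the same.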
What should this twin actually do? Help a dam operator decide whether to release water ahead of a storm. Tell planners where a new data center can draw cooling water without drying up a stream. Flag which coastal defenses will fail first under rising seas.
A water digital twin earns its keep when it makes the consequences of choices visible, in terms of flows, levels, temperatures, and risks to people and ecosystems.
Turn Basins into AI Test Beds
The Genesis Mission promotes AI-directed experimentation and directs the DOE to keep a record of robotic laboratories and production facilities in which such experimentation could be conducted. Hydrological field sites belong in that inventory. The National Ecological Observatory Network already operates 81 sites with standardized measurements of meteorology, surface water, groundwater, and biodiversity. The Critical Zone Collaborative Network instruments catchments to track water-soil-vegetation interactions over decades.
Formalizing these networks as AI test beds would link field observations back into the water digital twin so that experiments and models continually sharpen each other. Imagine mobile sensors steered by AI agents during a storm or aquifer recharge experiments designed by algorithms and verified in real time. That feedback loop is what separates a useful model from a decorative one.
The Exchange and What’s at Stake

Allowing water security to flow through the diverse components of the Genesis Mission would benefit both the policies championed by the mission itself and the hydrology community.

The Genesis Mission gets real-world, noisy test beds where AI proves value beyond benchmarks, a domain to stress test climate and infrastructure investments, and scientists trained in both AI and the stubborn realities of rivers, aquifers, and pipes.

Hydrology gets resources for shared data infrastructure, foundation models and instrumented basins no single lab can support, a seat when rules for AI and national scientific infrastructure are negotiated, and a chance to reset practices around openness, collaboration, and equity.

Expand Water Challenges on the Genesis Mission List

Earlier this year, the DOE released 26 Genesis Mission Science and Technology Challenges, and “Predicting U.S. Water for Energy” was among them. The accompanying funding call (DE-FOA-0003612) solicits proposals on cloud microphysics, coupled surface water–groundwater modeling, and seasonal to multiyear prediction, all framed around energy needs and flood resilience.
These inclusions are a significant win for a hydrology component to Genesis, but several urgent challenges sit outside their scope. Can AI close the gap between a flood forecast issued 12 hours out and the 48 hours emergency managers actually need? Can it map compound extremes, in which drought, heat, and infrastructure failure collide in the same week? Can it redesign monitoring networks so that coverage follows risk rather than where gauges happened to be installed a century ago? Integrating energy and water systems is equally urgent: Floods have caused 80% of major U.S. grid outages since 2000, while drought-driven water stress curtails cooling at thermoelectric plants and reduces hydropower output, exposing how deeply energy infrastructure depends on hydrologic extremes.
The water footprint of new AI infrastructure deserves a place on that list. A separate executive order (14318, “Accelerating Federal Permitting of Data Center Infrastructure”) is already fast-tracking expansion of data center construction, and a single hyperscale facility can consume 1 million to 5 million gallons of water daily. Emerging research shows how withdrawals at that scale can push streams below ecological thresholds during low flows.
Make Hydrology the Conscience of AI Governance
The EO directs the DOE to set data access rules and clarify policies for ownership, licensing, trade secret protections, and commercialization of products and tools associated with it.
Three principles should anchor such policies for AI use in water security.
First, Indigenous and community data rights must be embedded in every major AI water security effort, in line with the collective benefit, authority to control, responsibility, and ethics (CARE) principles for Indigenous data governance.
Second, AI’s own water footprint, through electricity generation and cooling, must be treated as a design constraint. Transparent reporting, stress-based siting, and efficiency targets will prevent hydrology in Genesis from being self-defeating.
Third, the DOE should define what failure looks like. Missing a flood crest can mean lost lives, lost livelihoods, and breached treaties. Accountability standards must be measurable, and they must ask not just how accurate the forecast was on average, but who bore the cost when it was wrong.
A single executive order will not solve the country’s water security problems, and a single challenge topic will not either.
But the Genesis Mission has provided a seat at a table that did not exist 6 months ago. Whether the hydrology community treats it as a ceiling or a foundation depends on what happens next. Europe’s Destination Earth and the United Nations’ AI for Good water initiatives are already moving.
American hydrology now has a seat at the table. We should take it.
Kratzert, F., et al. (2019), Toward improved predictions in ungauged basins: Exploiting the power of machine learning, Water Resour. Res., 55, 11,344–11,354, https://doi.org/10.1029/2019WR026065.
Xiao, T., et al. (2025), Environmental impact and net-zero pathways for sustainable artificial intelligence servers in the USA, Nat. Sustainability, 8, 1,541–1,553, https://doi.org/10.1038/s41893-025-01681-y.
Zhang, L., et al. (2025), Foundation models as assistive tools in hydrometeorology: Opportunities, challenges, and perspectives, Water Resour. Res., 61, e2024WR039553, https://doi.org/10.1029/2024WR039553.
Author Information
Amobichukwu C. Amanambu (acamanambu@ua.edu), Department of Geography and the Environment, The University of Alabama, Tuscaloosa; and Jonathan Frame (jmframe@ua.edu), Department of Geological Sciences, The University of Alabama, Tuscaloosa
Citation: Amanambu, A. C., and J. Frame (2026), The Genesis Mission needs hydrology: Here’s how to incorporate it, Eos, 107, https://doi.org/10.1029/2026EO260131. Published on 28 April 2026.
This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s).
Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.
The Trump Administration has terminated the positions of every member of an independent board meant to govern the National Science Foundation (NSF).
The National Science Board directs and approves large funding decisions for NSF’s approximately $9 billion basic science research budget. It is meant to function independently from the federal administration to keep science funding insulated from political pressure and budget cycles.
“I have watched the systematic dismantling of the scientific advisory infrastructure of this government with growing alarm, and the National Science Board is simply the latest casualty.”
In a 24 April notice from the Presidential Personnel Office, all the scientists serving on the board were informed their positions had been eliminated. The emails dismissing board members provided no reason for the termination.
“I am deeply disappointed, though I cannot say I am entirely surprised,” Willie E. May, one of the terminated board members and vice president of research and economic development at Morgan State University in Maryland, told The New York Times.
“I have watched the systematic dismantling of the scientific advisory infrastructure of this government with growing alarm, and the National Science Board is simply the latest casualty,” he said.
Ranking member of the House Committee on Science, Space, and Technology Zoe Lofgren (D-CA) called the terminations “the latest stupid move made by a president who continues to harm science and American innovation.”
“Without a functional National Science Board in the near term, the agency is left without the guidance and oversight of independent experts, and the public is left without information on how NSF is carrying out its mission,” Gretchen Goldman, president and CEO of the Union of Concerned Scientists, wrote in a blog post about the terminations.
These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org.
Per- and polyfluoroalkyl substances (or PFAS) have been widely used in thousands of common nonstick, waterproof, or stain-resistant products since the 1950s. These “forever chemicals” do not break down easily: PFAS make their way into the air, soil, and water, as well as into human and animal bloodstreams and to some of Earth’s most pristine environments. They have been detected even in Antarctica, despite its reputation as a relatively untouched landscape far from the types of products—fast-food wrappers, firefighting foam, nonstick cookware—that contain PFAS.
“We know PFAS are very persistent, so that helps. By looking at the patterns of the PFAS contamination in [Antarctic snow] samples, it gives us clues as to how they’re transported.”
Research into how PFAS arrive in Antarctica is limited, and most tends to focus on the continent’s coasts, rather than its interior. A new study published in Science Advances aimed to fill some of these gaps by examining PFAS accumulation across a 1,200-kilometer stretch of Antarctica, from the snow pits of Zhongshan Station in East Antarctica to the 4,093-meter peak of Dome A. By examining layers of snow collected from the coast to the interior, researchers sought to better track and understand how PFAS levels vary by location and how these forever chemicals have been able to travel long distances through the upper atmosphere to be deposited in remote regions.
“For substances to get there, they have to be able to transport long distances,” said Ian Cousins, a chemist at Stockholm University and one of the study’s authors. “We know PFAS are very persistent, so that helps. By looking at the patterns of the PFAS contamination in the samples, it gives us clues as to how they’re transported.”
PFAS Arrive by Air and by Sea
Along the 1,200-kilometer route, researchers from the Chinese Academy of Sciences collected 39 snow samples at 30-kilometer intervals, scraping the first few centimeters of snow from the surface to analyze for traces of PFAS.
Zhongshan Station sits near Prydz Bay, and there, researchers collected snow from a 1-meter-deep pit, with samples taken every 5 centimeters. At Dome A, the summit of the East Antarctic Ice Sheet, samples were collected at 10-centimeter intervals from another snow pit; this one was 3 meters deep, providing information about the past several decades of PFAS use.
“It’s quite interesting that we see the historical production record of PFAS in this pit on the top of this mountain in Antarctica,” said Cousins.
PFAS pollution arrives in Antarctica in two ways: via upper atmospheric transport and sea spray. Some PFAS are formed in the atmosphere when volatile precursor chemicals like fluorotelomer alcohols used in textile and paper products break down through reactions with sunlight and oxidants into more stable compounds. The resulting PFAS are later deposited into the snow and ice through precipitation.
Storm winds near the coast create sea spray. “When you have waves, it makes bubbles in the ocean. When the bubbles burst, these sea spray aerosols can be super enriched with PFAS. This has been shown to be a very important transport route,” Cousins said.
To distinguish between sources, researchers measured sodium in the snow to trace the ocean’s salty influence. Sodium levels decreased farther inland, reflecting the fading influence of sea spray toward the interior of the continent. But surprisingly, PFAS concentrations actually increased moving from the coast into the interior.
“That was kind of a bit counterintuitive to me,” explained Cousins, who said he expected PFAS levels to be highest near the coast. “You see the opposite, actually.”
The inland increase is likely explained by higher snowfall totals in the coastal regions, which dilute the PFAS deposited there. Inland, where snowfall is lower, even small amounts of PFAS result in relatively high concentrations in snow samples.
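The dilution effect comes down to simple arithmetic: the concentration measured in snow is roughly the deposition flux divided by the snow accumulation. A toy calculation (hypothetical numbers, not values from the study) shows how a uniform PFAS deposition flux yields far higher concentrations where snowfall is sparse:

```python
def pfas_concentration(deposition_ng_per_m2, snowfall_kg_per_m2):
    """Concentration in snow (ng/kg) = deposition flux / snow accumulation."""
    return deposition_ng_per_m2 / snowfall_kg_per_m2

# Assume the same annual PFAS deposition everywhere (illustrative value only).
deposition = 50.0  # ng per square meter per year

coastal = pfas_concentration(deposition, snowfall_kg_per_m2=500.0)  # heavy coastal snowfall
interior = pfas_concentration(deposition, snowfall_kg_per_m2=50.0)  # sparse interior snowfall

print(coastal)   # 0.1 ng/kg
print(interior)  # 1.0 ng/kg: tenfold higher despite identical deposition
```

With deposition held constant, a tenfold drop in snowfall produces a tenfold rise in concentration, matching the pattern the researchers observed moving inland.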
Additional factors shape PFAS distribution. PFAS levels in snow are higher at the onset of precipitation events, when the compounds are rapidly scavenged from the air. Temperature inversions, too, can trap the chemicals. In coastal areas, PFAS are more influenced by sea spray in winter, whereas stronger sunlight drives the degradation of atmospheric precursors into PFAS during the summer months.
PFAS Presence at Both Poles
This new study also offers implications for the way that PFAS circulate globally. Though industrial activity in the Northern Hemisphere contributes most heavily to PFAS emissions, large-scale atmospheric circulation allows these compounds to reach polar regions. Rapid transport in the upper troposphere may act as an efficient pathway to shuttle PFAS across both hemispheres before they are deposited in the cold, remote regions at both ends of Earth.
“This completes the global picture with agreeing measurements at both poles, solidifying our understanding of the global distribution and drivers of PFAS contamination.”
Even though PFAS levels are higher in the Arctic, both polar regions show similar trends in PFAS concentrations since the 1990s. “It really matches decades of the same records that have been reported from the Arctic,” said Cora Young, an atmospheric chemist at York University, who was not involved in the new study.
“This completes the global picture with agreeing measurements at both poles, solidifying our understanding of the global distribution and drivers of PFAS contamination. The role of CFC [chlorofluorocarbon] replacements, changes in regulation, all of these things are important in the Northern Hemisphere and also the Southern Hemisphere,” said Young.
A critical ocean current that regulates Antarctica’s climate may have formed only once continents separated and winds aligned with new ocean passageways, according to a new study published in the Proceedings of the National Academy of Sciences of the United States of America.
Today, the Antarctic Circumpolar Current transports more than 100 times as much water as all of Earth’s rivers combined and, critically, insulates the Antarctic Ice Sheet from heat at lower latitudes. A clear picture of the origins of this current can help scientists further understand the relationships between contemporary ocean dynamics, the global climate, and ice formation in Antarctica.
“It’s very interesting to learn more about this current, how it developed, and what role it played in the climate change that was happening at that time,” said Hanna Knahl, a paleoclimatologist and doctoral student at the Alfred-Wegener-Institut in Germany and lead author of the new study.
The Birth of a Current
About 34 million years ago, Earth was undergoing a climatic shift, now known as the Eocene-Oligocene transition, during which atmospheric carbon dioxide decreased and the planet cooled.
Earth’s tectonic plates in the Southern Ocean moved away from each other, opening and deepening bodies of water such as the Tasmanian Gateway and the Drake Passage, which separate Antarctica, Australia, and South America.
For years, scientists hypothesized that the alignment of these newly formed waterways, along with westerly winds, could have channeled ocean water and spurred the formation of the Antarctic Circumpolar Current.
“The exact position of the westerly winds and their relative position to the [ocean] gateways have to click together.”
To test that hypothesis, Knahl and her colleagues simulated conditions of the early Oligocene Southern Ocean with a coupled model that included ocean dynamics, atmosphere and wind patterns, temperatures, ice sheet growth, and precipitation. The research team compared these simulations to data from actual Antarctic sediment cores and scans of the ocean floor.
Results confirmed that westerly winds were necessary for the Antarctic Circumpolar Current to form.
“The exact position of the westerly winds and their relative position to the [ocean] gateways have to click together,” Knahl said.
Joanne Whittaker, a marine geophysicist at the University of Tasmania who was not involved in the new study, was a coauthor of a 2015 study that proposed westerly wind alignment played a role in the formation of the current. Knahl’s study presents a more sophisticated model of the early Oligocene Southern Ocean and is a great next step in the investigation of the current’s origins, Whittaker said.
“They did a really nice job of taking a range of different people’s work and linking it all together,” she said.
Oligocene Understandings
“If you can have a model that works in the past, it’s going to give you confidence that it’s going to work for the future, as well.”
Scientists often use Earth’s past behavior to better understand how Earth systems may behave in the present or future. “If you can have a model that works in the past,” Whittaker explained, “it’s going to give you confidence that it’s going to work for the future, as well.”
The Eocene-Oligocene transition is a key to understanding the relationship between atmospheric carbon, ocean dynamics, and the glaciation of Antarctica, Whittaker said. Knowing how the current’s behavior affected carbon uptake millions of years ago helps scientists model how the present current’s behavior might also affect atmospheric carbon.
In addition to carbon uptake, the new research hints at how changes in westerly winds may influence the advance and retreat of the Antarctic Ice Sheet. Some modeling and proxy data indicate the westerly winds that spurred the Antarctic Circumpolar Current’s formation 34 million years ago have shifted in the past century and may continue to shift in the future. Understanding the role these winds initially played in the current’s development may shed light on the current’s present ability to guard the Antarctic Ice Sheet from warmer air masses.
There are still Oligocene patterns that require more research to sort out, though. For example, modeling in the new study showed interesting asymmetries in the timing of the development of different parts of the Antarctic Circumpolar Current, Knahl said. Scientists know from proxy data and modeling that a similar asymmetry exists in the history of the Antarctic Ice Sheet: The East Antarctic Ice Sheet began to form about 7 million years before the West Antarctic Ice Sheet did.
“It could be interesting to see if there’s a connection between the asymmetries that we see here,” Knahl said. “Are they linked, or were they more or less independent?”
Citation: van Deelen, G. (2026), Widening channels and westerly winds together formed Earth’s strongest current, Eos, 107, https://doi.org/10.1029/2026EO260126. Published on 24 April 2026.
Editors’ Highlights are summaries of recent papers by AGU’s journal editors.
Source: Space Weather
TianQin is a proposed geocentric space-borne gravitational wave detector that would detect gravitational waves by measuring tiny displacements using intersatellite laser interferometry. However, the space surrounding TianQin’s orbit and laser links is not a vacuum but is filled with plasma, which can bend the laser links and induce pointing accuracy noise in gravitational wave detection.
Based on a global magnetohydrodynamic model, Zhou et al. [2026] use a ray-tracing method to quantify the laser deflection caused by propagation through plasma and to evaluate the resulting pointing accuracy noise. The results show that laser deflection caused by large-scale space plasma under quiet to moderate space weather conditions does not pose a fundamental risk to the TianQin mission. During severe space weather events, however, the laser propagation effect could become a considerable noise source in gravitational wave detection.
This work establishes a connection between space weather and gravitational wave detection. Furthermore, this work raises awareness of the impact of space weather on other high-precision electromagnetic wave measurements in space.
Citation: Zhou, S. W., Su, W., Zhou, S. Y., Li, C. F., & Zhang, J. X. (2026). The pointing error due to laser propagation in space plasma for TianQin gravitational wave detection. Space Weather, 24, e2025SW004784. https://doi.org/10.1029/2025SW004784
This story was produced by Grist and the Food & Environment Reporting Network, a nonprofit news organization. Sign up for Grist’s weekly newsletter here.
Will Runion’s 736-acre cattle and hay farm is tucked into a horseshoe bend of the Nolichucky River in northeast Tennessee. On the morning of Friday, September 27, 2024, he was in the middle of two big projects: building a riverfront campground on his land to bring in tourists and income, and cutting the last of the season’s hay. Hurricane Helene had been arcing up from Florida toward the Appalachian Mountains, carrying heavy rain, and the river was high. Even though the banks seemed to be holding, he decided to move some of his cows and equipment to higher ground.
But the river kept rising. At about 11 a.m., the brown water topped its banks. He and his fiancée, his son-in-law’s parents, and neighbors scrambled to salvage what farm equipment they could, but they were nearly trapped when the quickly expanding river flowed into a low-lying area behind where they were working, cutting them off from dry land.
By afternoon, the river had swollen to some 1,200 feet wide—nearly 10 times its usual size. It “looked just like a lake,” Runion said. Trees snapped in the swift current and neighbors’ barns, roofs, hay bales, and household debris swirled by. The water swallowed Runion’s hay equipment and sent the little white house he’d planned to use as the new campground’s office sailing across a field.
At around 8 p.m., the Nolichucky finally crested and started to recede. Runion found a third of his fields covered in debris, dead fish, and tomatoes from upstream vegetable growers. The flood had gouged two holes the size of football fields in his hay pastures, down to a depth of 12 feet. Other sections of the farm were buried in up to 8 feet of sand or silt.
Flooding from Hurricane Helene brought massive damage to Will Runion’s farm, eroding the land in some places and washing up feet of sand on agricultural fields in other sections. Courtesy of Bryan LeBarre, via Grist
Helene dropped up to 30 inches of rain on southern Appalachia, causing historic flooding and landslides in parts of North Carolina, South Carolina, Tennessee, Georgia, Kentucky, and Virginia—a largely rural region where agriculture is a vital economic driver and cultural cornerstone. The mountains make it hard to spread out here, so farms tend to be small, and many growers use flood-prone bottomland because it is flat and fertile. But floods of this magnitude hadn’t hit here in generations. In North Carolina alone, Helene caused an estimated $4.9 billion in damage to the state’s agriculture sector. In Tennessee, agricultural losses were estimated at $1.3 billion. Thousands of farmers lost crops, tools, machinery, barns, buildings, animals, and fences.
“When you see 4 feet of sandy soils on top of your topsoil, you know that’s going to be a challenge. That was overwhelming.”
More than a year later, growers are also contending with the loss of something more vital, and more difficult to replace: their soil.
Runion knew immediately that his livelihood was ravaged. Without good soil, a farmer can’t farm. “When you see 4 feet of sandy soils on top of your topsoil, you know that’s going to be a challenge,” he said. “That was overwhelming.”
He sent drone footage of the damage to Forbes Walker, an environmental soil specialist with University of Tennessee Extension. “How do you fix this?” he asked.
“I don’t know,” Walker recalled thinking when he got Runion’s email. “How do we fix this?”
Over millennia, floods helped build the fertile land that farmers depend on. But today, climate change is driving more powerful and unpredictable storms. One study found that rainfall associated with Helene was 10 percent heavier due to man-made climate change. Research by the U.S. National Science Foundation suggests that what scientists call “100-year storms” will become three times more likely, and 20 percent more severe, over the next 50 years. What’s more, there’s little solid information about what happens to soil during a flood, or what to do when a farm’s soil is eroded or covered with material from elsewhere—its nutrients washed away and microbial communities disrupted. It’s a blind spot that is becoming more of a liability as storms like Helene become more common.
“None of us had ever seen anything like this before or responded to an emergency like that,” said Stephanie Kulesza, a nutrient and soil scientist at North Carolina State University. “And so we weren’t really prepared for recommendations to provide to producers.”
Soil can take thousands of years to form. Rock weathers and slowly breaks down into smaller and smaller pieces. As dead leaves, animals, trees, and other plants decompose, they add organic matter and nutrients to that mineral material. Microorganisms establish themselves in the mix, driving nutrient cycling, aiding decomposition, and stimulating plant growth; then worms and insects, like beetles and ants, burrow through the mixture, aerating it. For soils to work well for agriculture, they need the right structure—airy enough to allow water to enter and move through, but not too quickly or too slowly—and sufficient biological and chemical richness, including nutrients like nitrogen, phosphorus, and potassium, to nourish crops.
Farmers use synthetic or natural fertilizers to ensure their soil has enough nutrients. They can also introduce practices like no-till—farming without plowing up the ground—to maintain the physical properties of their dirt. Topsoil, the rich, uppermost layer with the most available nutrients for crops, tends to make up less than a foot of the entire soil profile, but it’s crucial for agriculture.
Soil scientist Forbes Walker visits Will Runion’s farm in 2025, examining the deep sandy deposits left behind by Hurricane Helene. Credit: Raffe Lazarian/University of Tennessee Institute of Agriculture via Grist
Helene’s floodwaters either washed away significant topsoil or deposited new sediment on top of it on thousands of farms. Some, including one of Runion’s neighbors, saw their fields stripped down to bedrock, or river rock. Runion and others woke to pastures blanketed by feet of sand or stone.
When topsoil is washed away, the necessary nutrients for growing go with it. And when topsoil is covered with sand, farmers can’t get to it. Both scenarios can significantly alter the land’s usability. Topsoil can take decades or centuries to develop, and sand lacks both organic matter and the physical structure to hold water and nutrients. “These aren’t soils yet,” said Kulesza of what Helene left on Runion’s and other farmers’ land. “They are in their infancy now. The clock has been reset.”
Runion had cared for his soils, working to eliminate weeds, adding fertilizer to keep nutrient levels ideal, and lime to control pH. “They were our way of life,” Runion said. “They were our income.”
After the storm, from October to April, he removed debris, bulldozed sand off his fields to get closer to the topsoil, filled holes, and graded uneven land. Crews from the Federal Emergency Management Agency removed and shredded downed trees. He applied for government relief and received close to $1 million in state and federal aid. Runion said he could have easily used all of that money replacing equipment and paying for cleanup labor, fertilizer, and fuel, but he’s trying to stretch the money as much as possible.
By June, it was time to mow the fields that hadn’t flooded. He managed to put up enough bales of hay to feed his herd of 125 cattle, but not enough to sell. In a normal year, hay sales made up about a third of the farm’s income. With months of work behind him and his flooded land still too sandy and generally depleted, he realized the recovery would be a slog.
Runion returned to work on the campground, which he hoped would diversify the family’s earnings. The longer-term plan included a music venue, hiking trails, and hosting weddings and corporate events. After the storm, finishing it took on new urgency. He chose a new spot, about 450 feet upland from the river, and began clearing enough land for 45 camping sites.
One environmental soil specialist described the academic literature on flood-damaged soils as “thin.”
Runion also prepared a parcel of land for Walker, the extension soil specialist, to run tests that could guide his recovery. Last November, soon after the one-year anniversary of Helene, Walker showed me around Runion’s farm.
Working with students, Walker established four experiments over about 300 test plots. He’s looking at how different soil amendments—hay, wood chips, poultry litter, and a charcoal called biochar, to help the soil hold water and fertilizer; and Triple 19, a common plant food with equal parts nitrogen, phosphorus, and potassium—affect the growth of wheat and fescue grasses.
When I visited, some of the plots remained mostly bare while, in others, tufts of green had sprouted. “We actually got some stuff to grow,” Walker said.
He described the academic literature on flood-damaged soils as “thin.” While some research and case studies exist on how agricultural soil recovers after a flood, there are few systematic investigations like the one Walker is conducting—on what works, and what does not—particularly in Appalachia, where floods of this magnitude have been historically rare.
When so-called atmospheric rivers spawned devastating floods in the Pacific Northwest and southwestern British Columbia in 2021, Aimé Messiga, a Canadian soil research scientist at the Agassiz Research and Development Centre, found a similar “scarcity of data.” He conducted a detailed review of the existing research and concluded that there was limited long-term monitoring, little understanding of how floods affect nutrients and microorganism communities in the soil, and uncertainties about what the actual impacts of floods on agriculture and crops are. Complicating everything is the variability between different farms, soils, and crops.
“You need decades of accumulated data in order to be able to predict what will happen. We don’t have those data.”
“You need decades of accumulated data in order to be able to predict what will happen,” Messiga said. “We don’t have those data.”
Today, some researchers are attempting to replicate flood conditions in labs to better understand their effects, but fieldwork is rare, Messiga said. There’s little money for it—and in the U.S., the Trump administration has cut funding for climate-related research. In addition, “many among us still look at these events as random,” Messiga said. “They’re not random. They will keep occurring.”
Since 1980, 45 flooding events have each caused more than $1 billion in damage in the U.S., with more than half of those occurring in the past 15 years. In 2024, flooding in the upper Midwest drowned crops. Repeat events in central California damaged agricultural operations from winter 2022 to spring 2023. Flooding along the Mississippi River in 2019 reduced crop planting by millions of acres. There have also been numerous smaller or more localized floods: One study found nearly 75,000 flash floods in the contiguous U.S. from 1996 to 2017, with frequency increasing over that period. Flooding frequency and strength are predicted to rise in the years to come due to climate change—a warmer atmosphere holds more moisture and leads to stronger rain events—and poor land-use management.
Scientists are also starting to study a new type of event, called “weather whiplash,” when sudden changes occur from one extreme to another, amplifying the effects of the disaster. In Texas in 2025, a flood came after prolonged drought, causing widespread destruction.
For farmers, the effects of flooding on soil may linger for years after the disaster. In 2011, the Missouri River flooded parts of the Upper Midwest, inundating thousands of acres of farmland. Fields were swamped for months under up to 20 feet of water. When the water finally receded, those fields were covered with anywhere from 2 to 20 feet of sand; other fields had washed-out holes up to 70 feet deep. It looked like the surface of the moon, said John Wilson, a now-retired educator and agricultural expert who served Burt County, Nebraska, which was particularly hard-hit. “It was just bare soil,” he said. “There was no crop residue whatsoever.”
Wilson led teams that sampled the soil and helped farmers build back. He found that levels of nitrogen and organic matter were low in flooded soils, and fertility suffered when farmers planted their crops. Over about five years, fertility generally improved, but not everywhere. “If you went out today and did a yield map, you could still tell exactly where the erosion was because those areas are not as productive,” Wilson said.
Yield is money for farmers, who already navigate thin margins and, often, years without any profit at all. North Carolina’s strategic plan for agriculture recently enumerated just how thin: Of the state’s “42,500 farms, only 8,000 produce annual gross sales that exceed $100,000 annually. The overwhelming majority … some 23,400, gross less than $10,000 in sales, with only around 40 percent of the farms in the state having a positive net income in 2022.”
As floods increasingly wreck farmland, more researchers are starting to focus on understanding the effects of the floods and how to address them. Most of that work is happening in Asia, Messiga said. But a study in coastal North Carolina, where hurricanes regularly land, found that after a storm there was less organic matter in the soil, including carbon, and a disruption of microbial activity and nutrient cycling. The ground also absorbed water less readily.
Coastal flooding is also driving saltwater into the soil of farmland, making it more saline and unable to sustain crops. A North Carolina State University team has been developing test kits for farmers to sample the salinity of their soils, as well as a set of recommendations for keeping their soil viable. Such local work is important because soils vary greatly from place to place, and findings are not often easily transferable.
Nicole DelCogliano’s farm near Asheville, North Carolina, was wiped out almost entirely by floods from Hurricane Helene in 2024. Courtesy of Nicole DelCogliano via Grist
For now, in the wake of Helene, farmers are relying largely on trial and error to build back what was lost. Nicole DelCogliano has been farming vegetables, flowers, and livestock with her husband on 50 acres on the South Toe River, near Asheville, North Carolina, for 25 years. Helene washed away her barn, tractor, and other infrastructure. Of her 6 acres of vegetable fields, one was covered with several feet of sand, another got a foot, and a third field suffered extensive erosion.
“Our entire operation was wiped out, essentially,” she said.
“It’s not something that can be fixed overnight. This is a long process.”
With the help of some friends with tractors, DelCogliano cleared her main field and spread compost and lime on everything. “There was a mix of guidance about what you should do, like should you disturb the soil, should you not?” she said. “At an instinctual level, we just felt like we got to get the soil covered, we got to get something in the ground.” They sowed rye, a dependable cool season grass, as a cover crop, to protect the soil from erosion and add nutrients.
Karen Blaedow, an agricultural educator in Henderson County, North Carolina, said farmers should expect to put in at least three years of cover cropping before they see results in their soil. “It’s not something that can be fixed overnight,” she said. “This is a long process.”
In the spring following the flood, DelCogliano spread various amendments on her least-damaged field, including compost, lime, biochar, and blood and bone meal, which provide nitrogen and phosphorus, respectively. After all that, she and her husband seeded crops.
Their new vegetables came in about two weeks later than normal, but the season was more productive than ever, even though they grew on just 4 instead of 6 acres—“which is pretty amazing,” she said. “When we first started harvesting crops [after Helene], we didn’t yet have power at the farm. I had to dig one of our sinks out of a bank and bleach it and clean it and drag it up to the new barn—that we barely got a roof on—to wash and pack for that first [farmers] market.”
She doesn’t really know what made the year so productive. They planted more intensively to account for the smaller acreage and were able to harness their years of expertise to restart their operation basically from scratch. She also attributes the relative health of her soil to years of organic practices. “We’re dirt farmers,” she said. “Our primary job is to tend the dirt. Because that’s the basis of everything.”
Some farmers who’ve seen good harvests may have gotten a little lucky: In some places, the floods dumped silt rather than sand. Even Runion got silt deposits in one section of his farm. Unlike sand, the silty layers carry nutrients and create a favorable growing environment. “We have a producer we work with and he said it’s the most fertile soil that he’s had in decades,” said Emine Fidan, a biosystems engineering and soil science researcher at the University of Tennessee, who’s also working on Runion’s farm. “And he said it grew the sweetest corn he’s ever had. It was growing just beautifully.”
Runion didn’t plant anything until this past fall. He prepared about 65 acres of the 220 that were underwater. It was slow going; he used a disking machine to till his land but had to stop often to clear sticks and trash and to grade out low spots. He mixed in mulch and planted oats, wheat, and fescue. Walker drove me past one of the fields and it still looked sandy, the grasses just a pale green shadow on the tan land. Runion said the greenery was “struggling to have any vigor about it.” He won’t know for sure how well or poorly the grasses do until spring, their peak growing season.
He considered planting more acreage but decided to wait and see what he learned from Walker’s trials. “It’s a process, and the knowledge we’re gaining there will help on the whole rest of it, too,” Runion said.
This spring, Walker’s team will measure the biomass in each plot as well as the quality of the crop, including how much protein it has and its digestibility. They’ll also be evaluating the soil itself, including its ability to hold water, to determine if any of the treatments improved the structure of the sandy dirt.
Preliminary results suggest that, in plots where they put down mulch, the grasses are growing better than in plots with other amendments. The woody debris is reducing erosion and seeds are germinating well and standing up in the rough matrix. Spreading this kind of mulch isn’t an obvious solution, Walker said: Wood chips are a carbon-rich material, but as they break down in the soil they consume nitrogen, which can lead to a deficiency for the crops. But this mulch had sat in piles and started to decompose before it was applied to Runion’s fields, which made it less likely to cause these problems.
Runion had asked FEMA to leave the piles of wood chips on his farm rather than remove them like they normally would. Walker is looking for solutions to the soil problem that not only work but are also accessible. Have a mountain of mulch? Put it to work. Have nearby chicken houses? Maybe their nitrogen-rich manure can help revive flooded fields. His hope is that his team’s research can provide some guidance to farmers who find themselves in similar situations in the future. “I think it will have broad implications for a number of different crops,” including vegetables, Walker said.
Meanwhile, Runion is coming to terms with his situation. He thinks the hay he’ll get in the coming years will be lower-yielding, lower-quality, and will cost more to produce due to the extra prep time, new seeds, and fertilizers. He used to sell a lot of square bales, which tend to contain high-quality grasses and fetch a higher price, but he doesn’t expect to be doing that for a while. He’d initially hoped to have his land back in shape in a year or two. “Now it’s a four- to five-year [plan], I think,” Runion said. “It has been frustrating, and exhausting, too.”
He’s still optimistic, though. On my visit, I watched him grade out the new campground in a large dump truck. Freshly exposed red soil lay open to the sky. He thinks he can get the campground open by late summer or early fall. Over time, he hopes, it will be a more lucrative, and more sustainable, source of income. “The farm is really beautiful,” Runion said. “It still has a lot to offer.”
Editors’ Highlights are summaries of recent papers by AGU’s journal editors.
Source: Journal of Geophysical Research: Earth Surface
Turbidity currents are underwater currents that transport sediment on the sea floor. They were first observed in the late 1800s in Lake Geneva, Switzerland. The cable break following the 1929 Grand Banks earthquake offshore Canada revealed how massive and destructive these fluxes can be.
Turbidity currents move downslope because the sediment they carry in suspension makes them denser than the surrounding water. Critically, suspended sediment concentrations in these flows are low, meaning the fluid remains Newtonian and the flow is turbulent.
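As a back-of-the-envelope illustration (not from the study), the density excess that drives a dilute turbidity current downslope can be estimated as a volume-weighted mixture of water and sediment. The densities and the 1% concentration below are assumed typical values:

```python
# Sketch of the density excess of a dilute suspension; values are
# assumed typical figures, not measurements from the study.
RHO_WATER = 1000.0   # kg/m^3, fresh water
RHO_QUARTZ = 2650.0  # kg/m^3, typical quartz sediment

def bulk_density(conc_by_volume: float) -> float:
    """Density of a water-sediment mixture at a given volumetric concentration."""
    return (1.0 - conc_by_volume) * RHO_WATER + conc_by_volume * RHO_QUARTZ

# Even a dilute 1%-by-volume suspension is measurably denser than the
# ambient water, which is what drives the current downslope.
excess = bulk_density(0.01) - RHO_WATER  # ~16.5 kg/m^3
```

Even this small excess density, acting over a thick flow on a long slope, can sustain the fast, erosive currents described above.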
Notwithstanding recent advances in field monitoring, measuring turbidity current thickness, velocity, suspended sediment concentration, and grain size distribution remains difficult, not only because of the great water depths involved and the destructive nature of some events, but also because these flows are infrequent. Laboratory experiments and mathematical modeling have been used extensively to understand aspects of these flows, but questions remain about, for example, whether and how turbidity currents interact with ocean waves.
Daniller-Varghese et al. [2026] performed laboratory experiments to determine if and how turbidity currents interact with ocean gravity waves. Experimental flows were released in an 11-meter-long, 1.2-meter-deep, and 0.61-meter-wide flume in the Experimental Sedimentation Laboratory of the Jackson School of Geosciences at the University of Texas. A motorized wave maker was installed at the downstream end of the facility to generate the wave field. During the experiments, detailed velocity measurements were conducted to characterize the flow field and the fine details of the turbulent fluctuations. At the end of each experiment, high-resolution measurements of changes in bed elevation allowed quantification of the net depositional fluxes.
The results show that, in the presence of a superimposed wave field, the center of deposition volume shifted downstream compared to experiments conducted with the same inflow but in the absence of waves. In addition, velocity measurements indicate that the wave signal is stronger in the presence of turbidity currents compared to the “clear water” case. In other words, current velocity was larger when waves were present, enhancing downslope sediment transport and causing the observed downstream shift of the center of deposition.
Although the physical mechanism responsible for the observed increase in sediment transport rates in the presence of a superimposed wave field still needs to be resolved, these results provide novel insight for the interpretation of storm and turbidity current deposits in the rock record. They also highlight the importance of considering wave-turbidity current interactions when constraining sediment budgets on continental shelves, which are essential for preserving and managing coastlines worldwide.
Citation: Daniller-Varghese, M., Smith, E., Mohrig, D., & Myrow, P. (2026). Wave-signal entrainment into combined flows: Consequences for sediment transport, signal dislocation, and turbulence. Journal of Geophysical Research: Earth Surface, 131, e2025JF008497. https://doi.org/10.1029/2025JF008497
Research & Developments is a blog for brief updates that provide context for the flurry of news that impacts science and scientists today.
After more than half a century of Earth Days, one planetary challenge—climate change—threatens our planet more than ever.
In 1970, the year Sen. Gaylord Nelson (D-Wisc.) organized the first Earth Day events, the annual average concentration of carbon dioxide in the atmosphere was 326 parts per million. In 2025, it was 31% higher, at 427 parts per million.
Changes in average annual temperatures in U.S. cities and states show the powerful effects of this increase in heat-trapping carbon dioxide. A new analysis, published today by climate research and communications nonprofit Climate Central, found that since 1970, all 50 states and 99% of major U.S. cities have warmed, with an average city-level increase of 1.6°C (2.9°F).
“It may sound small, but it’s reshaping daily life,” Shel Winkley, a meteorologist at Climate Central, said in a video released alongside the report.
On average, the 49 U.S. states analyzed in the report have warmed by 1.7°C (3.0°F) since 1970. The six states that have warmed the fastest since the first Earth Day are Alaska with a 2.4°C (4.4°F) increase, New Jersey and New Mexico with a 2.1°C (3.7°F) increase, and Delaware, Massachusetts, and Vermont with a 2°C (3.6°F) increase. Trends for Hawaii, which were analyzed separately and not included in the national average, also showed statewide warming.
In 2025, the United States was on average 1.4°C (2.6°F) warmer than the 20th century average. The Paris Agreement, a legally binding global treaty, sets a goal to limit warming to 1.5°C (2.7°F) above preindustrial levels, though some scientists expect that the world has already entered the period of time during which this limit will be breached.
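As a quick arithmetic check on the paired values above: temperature changes convert from Celsius to Fahrenheit by the factor 9/5 alone, since the +32 offset applies only to absolute readings and cancels out of any difference. A minimal sketch follows; note that the report's published pairs are rounded from unrounded underlying data, so directly converting a rounded °C value can differ by a tenth of a degree:

```python
def anomaly_c_to_f(delta_c: float) -> float:
    """Convert a temperature *change* (anomaly) from °C to °F.

    Unlike absolute temperatures, anomalies need only the 9/5 scale
    factor; the +32 offset cancels when taking a difference.
    """
    return delta_c * 9.0 / 5.0

# The Paris Agreement threshold of 1.5°C corresponds to 2.7°F,
# and a 2.0°C change corresponds to 3.6°F.
```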
Warming is occurring much faster in some cities than in their respective states, or than the United States as a whole. Check out your city’s data in the Climate Central report. Credit: Climate Central, CC BY 4.0
Warming trends in the United States are most pronounced in the Southwest, where cities have warmed an average of 1.9°C (3.5°F) since 1970. And in some cases, cities are warming much faster than whole states. Three of the five cities that have warmed the fastest since 1970 are in the Southwest: Reno, Nev., with an increase of 4.4°C (7.9°F), Las Vegas, with an increase of 3.3°C (6.0°F), and El Paso, Texas, with an increase of 3.3°C (5.9°F).
The effects are evident at the national, state, and local levels. Temperatures have warmed in 240 of the 242 cities analyzed by Climate Central; Harrisonburg, Va., and Monterey, Calif., were the only two cities analyzed that have not warmed since 1970.
The report highlights some good Earth Day news, however, and points out that solar and wind power generation is at an all-time high in the United States, accounting for 19% of the electricity generated in the country in 2025 despite those industries facing recent headwinds from the federal administration.
“Every fraction of a degree [of warming] that we prevent does matter, for our health, for our communities, and for the world that we’re passing on to the next generations,” Winkley said.
These updates are made possible through information from the scientific community. Do you have a story about science or scientists? Send us a tip at eos@agu.org.
Planting more trees will decelerate climate change only if those trees are placed in optimal locations—primarily the tropics and subtropics—suggests new research published in Communications Earth and Environment. However, planting trees in locations like Alaska, Siberia, and large parts of the United States could actually lead to warming, said lead author and doctoral student at ETH Zurich Nora Fahrenbach.
Much of the current thinking in nature-based solutions, Fahrenbach said, is based on the idea that “more is better.”
As in, “we’ll plant a trillion trees, or we’ll plant more than a trillion trees, and we are going to get more cooling, right?” Fahrenbach said. “That’s something we show is just not the case.”
Fahrenbach researches reforestation potentials, or global maps that identify areas where trees could be planted to mitigate climate change. In this work, she and her colleagues compared three prominent reforestation potentials to determine the effect of tree placement on local and global temperatures.
One scenario involved reforesting about 926 million hectares, mostly in the tropics, and resulted in about 0.25°C of cooling by 2100. Another called for reforesting 894 million hectares, including large areas in northern temperate and polar latitudes, and resulted in 0.13°C of cooling by 2100.
The third scenario involved planting forests strategically over only 440 million hectares of mostly tropical and subtropical land (less than half of the area covered in the other scenarios) but also resulted in 0.13°C of cooling. Geography, the findings suggest, may matter more than quantity when it comes to the cooling benefits of reforestation efforts.
Let’s Get (Biogeo)physical
The researchers modeled all three scenarios using the same parameters: Trees were planted from 2015 to 2070, and the forested area then remained constant until 2100.
All three models identified reforestation opportunities in regions such as the eastern United States, Amazonia, the Congo rainforest, and eastern China, as well as regions for which reforestation would not be as impactful, such as polar regions in the Northern Hemisphere. The researchers also found significant temperature changes across the Atlantic and Indian oceans as a result of atmospheric changes induced by reforestation, demonstrating an interconnected reality: Planting trees in one area doesn’t just change the local temperature but has effects across the world.
These local and nonlocal effects can be explained by a combination of biogeochemical and biogeophysical effects.
A biogeochemical effect relates to the movement of chemicals or chemical elements, such as trees absorbing carbon from the atmosphere.
A biogeophysical effect relates to the physical results of changing the land’s surface: Placing a tree in a snowy region, for instance, decreases the land’s albedo, meaning it causes the land surface to become darker and absorb more light, leading to more local heat. This rise in surface temperature also raises air temperature, creating cascading effects on wind patterns and oceanic currents.
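The albedo effect described above can be sketched with a one-line surface energy estimate. The numbers here are illustrative assumptions (a 200 W/m² incoming shortwave flux, a snow albedo of 0.8, and a dark conifer canopy albedo of 0.1), not values from the study:

```python
SOLAR_IN = 200.0  # W/m^2, assumed example incoming shortwave flux

def absorbed_shortwave(albedo: float, incoming: float = SOLAR_IN) -> float:
    """A surface absorbs the fraction of incoming sunlight it does not reflect."""
    return (1.0 - albedo) * incoming

snow = absorbed_shortwave(0.8)    # bright snow reflects most sunlight: ~40 W/m^2
forest = absorbed_shortwave(0.1)  # a dark canopy absorbs most of it: ~180 W/m^2
# Replacing snow with forest in this sketch more than quadruples the
# absorbed energy, warming the surface locally despite the carbon uptake.
```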
Considering both processes together is essential for understanding whether there is a net cooling or net warming effect, but most policies focus only on biogeochemical effects, seeing trees solely for their ability to absorb carbon from the atmosphere, Fahrenbach said. These include prominent international policies such as the Paris Agreement and the United Nations’ REDD+ framework.
“Really, we would also need to consider the biogeophysical effects,” Fahrenbach said. “That’s harder to do, right, considering those nonlocal effects, because just imagine, some country is going to plant a lot of trees, and that’s going to lead to warming somewhere else.”
A Call to Policymakers
Emilio Vilanova, a forest ecologist at the climate action nonprofit Verra, wrote by email, “The most important message for me is that this study emphasizes something that is often not well addressed in reforestation projects: Reforestation is not just about planting trees—it’s about designing where new forests go to maximize benefits and avoid unintended consequences.”
Vilanova also said the study puts the potential for reforestation efforts to address climate change in perspective. “Even very large reforestation efforts would only reduce global temperatures by about 0.13–0.25°C by the end of the century,” he said. “While meaningful, this finding also reinforces that reforestation is a helpful tool, not a stand-alone solution to climate change.”
Though the limited potential for change is sobering, the authors and Vilanova pointed out that this change does matter, as does how the approach is designed. They advocate for policies that base reforestation strategies on location and that acknowledge both the local and nonlocal effects of reforestation.
“We really need to make sure that where we plant first, it has benefits locally, it has benefits globally,” Fahrenbach said.
22 April 2026: This article was updated to correct Nora Fahrenbach’s position at ETH Zurich.
This news article is included in our ENGAGE resource for educators seeking science news for their classroom lessons. Browse all ENGAGE articles, and share with your fellow educators how you integrated the article into an activity in the comments section below.
Citation: Meissen, A. (2026), Location, location, location: The “where” of reforestation may matter more than the extent, Eos, 107, https://doi.org/10.1029/2026EO260125. Published on 22 April 2026.