Hydrothermal Heat Flow as a Window into Subsurface Arc Magmas

28 April 2026 at 18:47
Three scientists working on the side of a mountain.
Editors’ Vox is a blog from AGU’s Publications Department.

The supply of magma from the Earth’s mantle is a primary source of heat to volcanic arc crust, where the heat is then dissipated through various processes. Much of this magmatic heat is dissipated as heated water, or aqueous fluid.

A new article in Reviews of Geophysics compares 11 different volcanic-arc segments where heat discharge via aqueous fluid has been well-inventoried to better understand the factors that influence this process. Here, we asked the authors to give an overview of heat discharge from volcanic arcs, how scientists measure it, and what questions remain.

Why is it important to study how heat is dissipated from volcanic arcs?

Volcanic arcs are the chains of volcanoes on top of subduction zones. They can produce some of Earth’s most explosive and hazardous eruptions. But much of the magma beneath the surface never erupts. Nevertheless, the heat from these magmas—and the simple fact of their existence and abundance—still matters for geothermal energy, patterns of groundwater flow, and the patterns of volcanic activity at the surface.

What are the main modes in which heat is discharged from volcanic arcs?

Heat at volcanic arcs can be carried by magmas, transmitted through the crust conductively, and carried by water seeping slowly through the crust. At the base of the crust, magmas are probably most important, with conduction coming in second. But as magmas move upwards through the crust, some of them solidify and impart their heat to their surroundings where it is transferred by conduction. Within a few kilometers of the surface, fluids seeping through the crust begin to take up all that heat, and so if we can quantify the heat carried by those fluids, we can retrace it to its origins in magmas.

How do scientists measure these different forms of heat loss?

Scientists measure the heat carried by erupting magmas using satellites, or by adding up the erupted masses and estimating how much energy was released as those magmas cooled from eruption temperatures to solid igneous rock at Earth’s surface. Conductive heat flow is measured by drilling holes in the Earth’s crust to see how quickly temperature increases with depth.

Measuring the heat carried by aqueous fluids in the crust is in some ways the trickiest. One approach is to find all the springs where hot or even slightly warm water is trickling out and measure the temperature and discharge to estimate how much extra heat that water is carrying.
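
The spring-inventory arithmetic reduces to a simple energy balance: advected heat equals water density times discharge times specific heat times the temperature anomaly above background. A minimal sketch in Python (the spring values below are hypothetical, chosen only to illustrate the bookkeeping, not drawn from the article):

```python
# Advected heat carried by a thermal spring, relative to background:
#   H = rho * Q * c_p * (T_spring - T_background)

RHO = 1000.0   # kg/m^3, density of water (approximate)
C_P = 4186.0   # J/(kg K), specific heat of liquid water

def spring_heat_watts(discharge_m3_s, t_spring_c, t_background_c):
    """Heat (W) advected by one spring above local background temperature."""
    return RHO * C_P * discharge_m3_s * (t_spring_c - t_background_c)

# Hypothetical inventory: (discharge m^3/s, spring temp C, background C)
springs = [
    (0.010, 45.0, 10.0),  # a classic hot spring
    (0.100, 14.0, 10.0),  # a high-discharge, only slightly warm spring
]

total_mw = sum(spring_heat_watts(*s) for s in springs) / 1e6
print(f"total advected heat: {total_mw:.2f} MW")
```

In this toy inventory the high-discharge, barely warm spring carries about as much heat as the hot spring, which is why tracking “slightly warm” springs matters.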

What are the challenges and uncertainties in measuring hydrothermal heat discharge?

One challenge is that many springs are only slightly warmer than you’d expect. There are good data for many hot springs, but data tracking these ‘slightly warm’ springs exist for only a subset of arcs. Another challenge is that warm underground fluids can flow laterally, so you have to try to account for that. Beyond the hydrothermal discharge itself, one additional big uncertainty for our study, which tried to quantify the proportion of magmas that freeze underground versus erupting, lies in the estimates of how much magma has actually erupted through time.

What are some of the factors that influence hydrothermal heat loss?

The main factor that influences hydrothermal heat loss is the amount of magma that solidifies underground. This link is the key motivation for our study. A major goal of our paper is to quantify these hidden magmas: How much magma needs to intrude the crust beneath the surface to supply the hydrothermal heat fluxes that we observe? The composition of magmas influences how much heat they can release. The depth at which magmas are emplaced also matters: Magmas that intrude the shallow crust eventually cool to lower temperatures than magmas emplaced in the lower crust and therefore release more heat.
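
As a back-of-the-envelope version of that inversion, the heat released per kilogram of solidifying magma is its latent heat of crystallization plus the sensible heat of cooling, and dividing an observed hydrothermal heat output by that quantity yields an order-of-magnitude magma supply rate. The values below are generic textbook numbers, not the paper’s:

```python
# Heat released per kilogram of magma that intrudes, solidifies, and cools:
#   q = L + c_p * (T_emplacement - T_final)
# Dividing a hydrothermal heat output by q gives a magma supply rate.

LATENT = 4.0e5     # J/kg, latent heat of crystallization (typical)
C_P_ROCK = 1.0e3   # J/(kg K), specific heat of magma/rock (typical)

def magma_supply_kg_s(heat_output_w, t_magma_c, t_final_c):
    """Magma supply (kg/s) needed to sustain a given heat output."""
    q_per_kg = LATENT + C_P_ROCK * (t_magma_c - t_final_c)
    return heat_output_w / q_per_kg

# Hypothetical arc segment discharging 1 GW of hydrothermal heat,
# fed by magma cooling from 900 C to 300 C in the shallow crust:
rate = magma_supply_kg_s(1.0e9, 900.0, 300.0)
km3_per_yr = rate * 3.15e7 / 2700.0 / 1e9  # ~s per year; rock density 2700 kg/m^3
print(f"{rate:.0f} kg/s, about {km3_per_yr:.3f} km^3 of magma per year")
```

Note how the final cooling temperature controls the answer: the same 1 GW implies less intruded magma when each kilogram releases more heat by cooling further, which is the depth-of-emplacement effect described above.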

What are the remaining questions or knowledge gaps where additional research efforts are needed?

A big outstanding challenge is combining estimates like ours, which use hydrothermal data to gauge how much magma is coming into the crust, with other approaches to estimating the same thing. The magmatic systems beneath volcanoes span the crust. At the base of the crust, you have magma supply, sort of like the water main feeding your plumbing system. Despite how fundamental magma supply is, we know remarkably little about it. It’s exciting to think about how the rates of magma supply could vary through time and space and why. Applying a range of techniques—including geophysical imaging, hydrothermal budgets, gas and igneous geochemistry, and petrology—to understand these questions could really be a game changer.

—Benjamin A. Black (bblack@eps.rutgers.edu; 0000-0003-4585-6438), Rutgers University, United States; S. E. Ingebritsen (steve.ingebritsen@gmail.com; 0000-0001-6917-9369), Kyoto University, Japan; and Kazuki Sawayama (sawayama@bep.vgs.kyoto-u.ac.jp; 0000-0001-7988-3739), Kyoto University, Japan

Editor’s Note: It is the policy of AGU Publications to invite the authors of articles published in Reviews of Geophysics to write a summary for Eos Editors’ Vox.

Citation: Black, B. A., S. E. Ingebritsen, and K. Sawayama (2026), Hydrothermal heat flow as a window into subsurface arc magmas, Eos, 107, https://doi.org/10.1029/2026EO265017. Published on 28 April 2026.
This article does not represent the opinion of AGU, Eos, or any of its affiliates. It is solely the opinion of the author(s).
Text © 2026. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

The Genesis Mission Needs Hydrology: Here’s How to Incorporate It

28 April 2026 at 13:09
Satellite image of The Dalles Google data center and the adjacent Columbia River.

Every chip fabricated in a semiconductor plant needs ultrapure water. Most nuclear reactors need water as a coolant and neutron moderator. Every artificial intelligence (AI) data center drinks between 1 million and 5 million gallons of water a day, with thirst often peaking during drought.

Water runs through every technology priority the United States has named, yet the word does not appear once in “Launching the Genesis Mission,” an executive order (EO) released in November 2025. As described in the EO, the Genesis Mission is a “dedicated, coordinated national effort to unleash a new age of AI-accelerated innovation and discovery that can solve the most challenging problems of this century.”

Led by the Department of Energy (DOE), the initiative aims to build an integrated AI framework that would harness federal scientific datasets to accelerate breakthroughs in advanced manufacturing, biotechnology, critical materials, nuclear fission and fusion energy, quantum information science, and semiconductor development. The scope of the mission is comparable to that of the Manhattan Project.

Since the announcement, the DOE has listed “Predicting U.S. Water for Energy” among its 26 Genesis Mission Science and Technology Challenges. The project is also soliciting proposals in three water-related focus areas.

This framework provides a foothold for hydrology in the Genesis Mission, but it is scoped narrowly around water as a supply variable for energy production.

In reality, water is a crosscutting constraint that will help determine whether the mission’s priorities translate into deployable outcomes. The hydrology community now has a seat at the table, and if it moves first and positions water security as one of the “most challenging problems of this century,” the Genesis Mission can become the sandbox in which AI reshapes how the country measures, models, and manages water.

Making this happen will require that the DOE and the Office of Science and Technology Policy charter a hydrology workstream inside the Genesis Mission, with interagency delivery involving the U.S. Geological Survey (USGS), NOAA, the Bureau of Reclamation, the EPA, and partners at state, regional, and community levels. Here is what we think that workstream should look like:

Illustration with “Genesis AI Platform” as a hub and seven mission-related components as spokes.
A water-centric Genesis Mission architecture supports seven hydrological components that both feed into and receive decisions from the Genesis AI platform. Each component maps to a section of this article. Credit: Amobichukwu C. Amanambu. Click image for larger version.

While the existing challenges reflect some of these components, others will require coordinated effort from the hydrology community to bring into the Genesis Mission’s scope.

Build the Water Corpus Genesis Will Need

The Genesis Mission EO instructs the DOE to create an American Science and Security Platform to provide the public, scientists, agencies, and policymakers access to crucial scientific datasets.

The good news is that accessible water data systems already exist across several federal agencies and academic research centers. The USGS National Water Information System tracks real-time and historical water quality and use across the country. NASA’s Earth Science Data Systems Program provides open access to Earth science observations. NOAA’s National Water Center, the first federal facility dedicated to national water resource forecasting, operates the National Water Model, which continuously forecasts flows on 2.7 million stream reaches across the continental United States. The Catchment Attributes and Meteorology for Large-Sample Studies (CAMELS) dataset, currently hosted by the National Center for Atmospheric Research, provides data tailored for hydrological research on hundreds of river basins, and the Caravan framework pulls together multiple large-sample meteorological and hydrological datasets at a global scale.

What is missing is a unified, AI-ready repository that brings federal, state, and community data together. Building one is hard. Water data are fragmented, inconsistent, and often entirely absent. Consistent, reliable data for groundwater, withdrawals, reservoir operations, and water quality are especially difficult to obtain.

Local resistance to sharing data is real. In Texas, for example, landowners hold private property rights over groundwater and have opposed metering and reporting requirements imposed by groundwater conservation districts. In California, agricultural well owners fought metering mandates for years before the Sustainable Groundwater Management Act compelled local agencies to begin tracking withdrawals. Tribal nations face a different concern: Water data collected on Indigenous lands have been misrepresented in federal datasets that were modeled without accounting for Indian country, leading many nations to restrict access to their data as an exercise of sovereignty.

Practical steps toward building a unified AI-ready repository include tiered access and licensing for different stakeholders, clear provenance tracking for all data reported, financial and educational incentives for stakeholders for reporting, and targeted gap filling. Where measurements are missing, AI can fuse remote sensing with gauged records and operational logs—but only if the results carry honest uncertainty estimates tied to real decisions.

Get the corpus right, and it will outlive any single program name. It becomes infrastructure the country can lean on.

Develop Shared Hydrologic Foundation Models

The Genesis Mission EO directs the DOE to provide “domain-specific foundation models across the range of scientific domains covered.”

Hydrology has a head start. Long short-term memory (LSTM) networks are a type of neural network designed to retain information across thousands of time steps. Hydrology LSTMs trained on CAMELS data have already matched traditional conceptual models for daily streamflow prediction. Open-source NeuralHydrology tools serve as baselines for regional runoff prediction. These tools may serve as precursors to the foundation models the Genesis Mission envisions and building blocks from which they could be developed.
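
For readers unfamiliar with the architecture, the gating that lets an LSTM carry information across long sequences can be sketched in a few lines. This is a toy, scalar cell in plain Python with hand-picked weights, not the NeuralHydrology implementation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, w):
    """One scalar LSTM update. w maps gate name -> (input, hidden, bias) weights."""
    f = sigmoid(w["f"][0] * x + w["f"][1] * h + w["f"][2])    # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h + w["i"][2])    # input gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h + w["g"][2])  # candidate value
    o = sigmoid(w["o"][0] * x + w["o"][1] * h + w["o"][2])    # output gate
    c_new = f * c + i * g          # additive cell-state update
    h_new = o * math.tanh(c_new)   # hidden state passed to the next step
    return h_new, c_new

# With the forget gate biased open and the input gate biased shut, the
# cell state persists almost unchanged over many steps -- the mechanism
# that lets an LSTM carry, say, early-season snowpack information into
# late-season streamflow predictions.
w = {"f": (0.0, 0.0, 10.0), "i": (1.0, 0.0, -10.0),
     "g": (1.0, 0.0, 0.0), "o": (0.0, 0.0, 10.0)}
h, c = 0.0, 1.0
for _ in range(1000):
    h, c = lstm_step(0.0, h, c, w)
print(f"cell state after 1000 steps: {c:.3f}")
```

In a trained model the gate weights are learned from data, and the state is a vector rather than a scalar, but the memory mechanism is the same.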

The process of scaling up these tools is not straightforward, however. A hydrologic investigation of snowmelt-driven streams in Colorado will not require the same spatiotemporal data as tile-drained fields in Iowa, for example. A hydrology-specific foundation model must take nuanced requirements into consideration and provide a clear path for managing and exploiting a variety of datasets.

Google’s Flood Hub shows what is possible: Its AI-enabled flood forecasts now cover more than 80 countries. However, Flood Hub’s core model code and trained weights remain proprietary, meaning researchers can use the forecasts but cannot rebuild or adapt the underlying models. Genesis, if well positioned, can fill that accessibility gap by producing foundation models for water that are reusable, reliable, and openly governed.

Build a National Water Digital Twin

The EO prescribes an integrated AI platform combining foundation models with simulation tools to stimulate AI-enabled innovations.

That architecture is exactly what a digital twin requires. Europe’s Destination Earth initiative is already building digital twins for weather extremes and nonstationary conditions on the Large Unified Modern Infrastructure (LUMI) supercomputer. The United Nations–led AI for Good initiative has prioritized water applications, warning that fragmented national efforts risk duplicating work.

If the United States aims for global strategic leadership in AI-accelerated science, water infrastructure cannot be an afterthought.

Rather than starting from scratch, a water-centric Genesis Mission would unite existing federal models—the National Water Model, reservoir simulators, and groundwater codes—in a single digital twin. AI can become the thread that stitches them together, correcting biases and providing numerical solvers to enforce mass and energy balance.

What should this twin actually do? Help a dam operator decide whether to release water ahead of a storm. Tell planners where a new data center can draw cooling water without drying up a stream. Flag which coastal defenses will fail first under rising seas.

A water digital twin earns its keep when it makes the consequences of choices visible, in terms of flows, levels, temperatures, and risks to people and ecosystems.

Turn Basins into AI Test Beds

The Genesis Mission promotes AI-directed experimentation and directs the DOE to keep a record of robotic laboratories and production facilities in which such experimentation could be conducted. Hydrological field sites belong in that inventory. The National Ecological Observatory Network already operates 81 sites with standardized measurements of meteorology, surface water, groundwater, and biodiversity. The Critical Zone Collaborative Network instruments catchments to track water-soil-vegetation interactions over decades.

Formalizing these networks as AI test beds would link field observations back into the water digital twin so that experiments and models continually sharpen each other. Imagine mobile sensors steered by AI agents during a storm or aquifer recharge experiments designed by algorithms and verified in real time. That feedback loop is what separates a useful model from a decorative one.

Expand Water Challenges on the Genesis Mission List

The DOE’s “Predicting U.S. Water for Energy” challenge and its accompanying funding call (DE-FOA-0003612) solicit proposals on cloud microphysics, coupled surface water–groundwater modeling, and seasonal to multiyear prediction, all framed around energy needs and flood resilience.

These inclusions are a significant win for a hydrology component to Genesis, but several urgent challenges sit outside their scope. Can AI close the gap between a flood forecast issued 12 hours out and the 48 hours emergency managers actually need? Can it map compound extremes, in which drought, heat, and infrastructure failure collide in the same week? Can it redesign monitoring networks so that coverage follows risk rather than where gauges happened to be installed a century ago? Integrating energy and water systems is equally urgent: Floods have caused 80% of major U.S. grid outages since 2000, while drought-driven water stress curtails cooling at thermoelectric plants and reduces hydropower output, exposing how deeply energy infrastructure depends on hydrologic extremes.

The water footprint of new AI infrastructure deserves a place on that list. A separate executive order (14318, “Accelerating Federal Permitting of Data Center Infrastructure”) is already fast-tracking expansion of data center construction, and a single hyperscale facility can consume 1 million to 5 million gallons of water daily. Emerging research shows how withdrawals at that scale can push streams below ecological thresholds during low flows.

The Exchange and What’s at Stake

Allowing water security to flow through the diverse components of the Genesis Mission would benefit both the policies championed by the mission itself and the hydrology community.

The Genesis Mission gets real-world, noisy test beds where AI proves value beyond benchmarks, a domain to stress test climate and infrastructure investments, and scientists trained in both AI and the stubborn realities of rivers, aquifers, and pipes.

Hydrology gets resources for shared data infrastructure, foundation models, and instrumented basins that no single lab can support; a seat at the table when rules for AI and national scientific infrastructure are negotiated; and a chance to reset practices around openness, collaboration, and equity.

Make Hydrology the Conscience of AI Governance

The EO directs the DOE to set data access rules and clarify policies for ownership, licensing, trade secret protections, and commercialization of products and tools associated with it.

Three principles should anchor such policies for AI use in water security.

First, Indigenous and community data rights must be embedded in every major AI water security effort, in line with the collective benefit, authority to control, responsibility, and ethics (CARE) principles for Indigenous data governance.

Second, AI’s own water footprint, through electricity generation and cooling, must be treated as a design constraint. Transparent reporting, stress-based siting, and efficiency targets will prevent hydrology in Genesis from being self-defeating.

Third, the DOE should define what failure looks like. Missing a flood crest can mean lost lives and livelihoods and breached treaties. Accountability standards must be measurable, and they must ask not just how accurate the forecast was on average, but who bore the cost when it was wrong.

A single executive order will not solve the country’s water security problems, and a single challenge topic will not either.

But the Genesis Mission has provided a seat at a table that did not exist 6 months ago. Whether the hydrology community treats it as a ceiling or a foundation depends on what happens next. Europe’s Destination Earth and the United Nations’ AI for Good water initiatives are already moving.

American hydrology now has a seat at the table. We should take it.

Recommended Resources

Carroll, S. R., et al. (2020), The CARE principles for Indigenous data governance, Data Sci. J., 19, 43, https://doi.org/10.5334/dsj-2020-043.

European Commission (2023), Destination Earth: Digital Twins and the Digital Twin Engine, Publ. Off. of the Eur. Union, Luxembourg, destination-earth.eu/destination-earth/destines-components/digital-twins-digital-twin-engine/.

Google Research (2024), Flood forecasting and Flood Hub, Google Research Technical Overview, sites.research.google/gr/floodforecasting/.

International Telecommunication Union (2024), AI for Good: Water and sanitation, aiforgood.itu.int/aifg-course/harnessing-ai-for-sustainable-innovation-sdg6-advancing-clean-water-and-sanitation/.

Kratzert, F., et al. (2019), Toward improved predictions in ungauged basins: Exploiting the power of machine learning, Water Resour. Res., 55, 11,344–11,354, https://doi.org/10.1029/2019WR026065.

Kratzert, F., et al. (2023), Caravan: A global community dataset for large-sample hydrology, Sci. Data, 10, 61, https://doi.org/10.1038/s41597-023-01975-w.

Li, P., et al. (2023), Making AI less “thirsty”: Uncovering and addressing the secret water footprint of AI models, Commun. ACM, 66, 28–31, cacm.acm.org/sustainability-and-computing/making-ai-less-thirsty/.

The White House (2025a), Accelerating Federal Permitting of Data Center Infrastructure, Executive Order 14318, Washington, D.C., www.whitehouse.gov/presidential-actions/2025/07/accelerating-federal-permitting-of-data-center-infrastructure.

The White House (2025b), Launching the Genesis Mission, Executive Order 14363, Washington, D.C., www.whitehouse.gov/presidential-actions/2025/11/launching-the-genesis-mission.

Xiao, T., et al. (2025), Environmental impact and net-zero pathways for sustainable artificial intelligence servers in the USA, Nat. Sustainability, 8, 1,541–1,553, https://doi.org/10.1038/s41893-025-01681-y.

Zhang, L., et al. (2025), Foundation models as assistive tools in hydrometeorology: Opportunities, challenges, and perspectives, Water Resour. Res., 61, e2024WR039553, https://doi.org/10.1029/2024WR039553.

Author Information

Amobichukwu C. Amanambu (acamanambu@ua.edu), Department of Geography and the Environment, The University of Alabama, Tuscaloosa; and Jonathan Frame (jmframe@ua.edu), Department of Geological Sciences, The University of Alabama, Tuscaloosa

Citation: Amanambu, A. C., and J. Frame (2026), The Genesis Mission needs hydrology: Here’s how to incorporate it, Eos, 107, https://doi.org/10.1029/2026EO260131. Published on 28 April 2026.



About That $3,000 Bag of Groceries in Your Trash

28 April 2026 at 11:00

Editor’s Note: This is the first article in a new Earth911 series, Where Waste Comes From, examining the largest sources of waste in the typical American household, what each category costs the family, what it costs the country, and what it costs the climate. We begin with food because food is the biggest category, because every household touches it every day, and because the lever any one family can pull on it is unusually large.

A family of four in the United States throws out more than $3,000 worth of food a year. Not “wastes” in the vague sense of eating too much or buying the wrong brand. We mean “throws out” — into the trash, into the disposal, or scraped off a plate into the bin, according to the 2026 ReFED U.S. Food Waste Report, the most current accounting of the problem.

Between uneaten groceries at home and plate waste at restaurants, American consumers discard roughly 35 million tons of food every year, about $259 billion in purchased calories, or $762 per person. Households pay for all of it, and bear most of it at home: residential food waste is the single largest slice of the consumer total.

The climate bill is equally devastating. All of that uneaten food carries an annual greenhouse gas footprint of 154 million metric tons of CO₂-equivalent, the same as driving 36 million passenger vehicles for a year. That food also required about 9 trillion gallons of water to grow — water that was never consumed by a human being. None of these resources made it to a table.

The waste stream inside the house

Food is the single largest component of landfilled material in the United States by weight, based on the EPA’s most recent sustainable materials accounting. EPA discontinued the comprehensive series after that December 2020 release; budget and staffing cuts under the current Trump administration have kept the report from being revived.

State waste studies provide continuing proof of the food waste epidemic, and the potential for progress. Washington’s 2020-2021 Statewide Waste Characterization Study found food waste accounted for nearly 20% of residential garbage. California’s 2021 Disposal Facility-Based Waste Characterization Study found organics, which includes food and yard waste, made up 28.4% of landfilled material, down from 34.1% in 2018, with the reduction credited largely to SB 1383, a state law that requires curbside organics collection for composting.

Where does food waste come from inside the home? ReFED’s consumer-behavior research, published in July 2025, breaks it down into four dominant habits:

Produce that spoiled before it was used. Fresh fruits and vegetables lose freshness quickly, cost less per pound than animal proteins, and tend to be bought in larger quantities than households consume.

Prepared food left over. The restaurant-style portion has migrated into the home kitchen. Leftovers are forgotten, buried, or mentally written off the moment a newer meal enters the fridge.

Confusion over date labels. “Sell by,” “best by,” and “use by” mean different things, are not federally regulated except for infant formula, and are frequently treated by consumers as expiration warnings when they are shelf-life guidance.

Over-purchasing against oversize packaging. The family-size bag of spinach and the 48-ounce jug of milk typically carry the lowest per-unit price, and the highest risk of spoilage, for small households.

ReFED revised its residential-waste estimate downward in its 2024 report by roughly 40 percent, or 17 million tons — not because household behavior improved, but because earlier estimates double-counted some flows. The overall residential waste picture is still enormous. It is also not shrinking. Consumer waste rates rose in the most recent data year even as overall U.S. food waste edged down, driven by retail and manufacturing progress that the home has not yet matched.

Burning a hole in your family budget

Let’s break down the national number to look inside a single household. A U.S. family of four spending roughly $12,000 to $15,000 a year on groceries throws away, on average, somewhere between 20 and 25 percent of it. The equivalent dollar number — $3,000 a year lost in the kitchen — is larger than the average American household’s annual spending on home energy, larger than most families’ annual clothing budget, and comparable to an annual car insurance premium. It is, in most households, the biggest single lever the family has on its grocery budget, climate footprint, and water footprint simultaneously. Very few household sustainability choices compound this cleanly.
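
The per-household figure follows from simple arithmetic on the ranges given. A quick check in Python using the article’s numbers (the ~340 million population used to back out the per-person figure is an assumption, not stated above):

```python
# Household food-waste cost: grocery spend times the share thrown out.
def wasted_dollars(annual_grocery_spend, waste_fraction):
    return annual_grocery_spend * waste_fraction

low = wasted_dollars(12_000, 0.20)
high = wasted_dollars(15_000, 0.25)
print(f"family of four: ${low:,.0f} to ${high:,.0f} lost per year")

# Cross-check the national per-person figure: $259 billion spread over
# an assumed ~340 million U.S. residents.
per_person = 259e9 / 340e6
print(f"about ${per_person:.0f} per person")
```

The range brackets the $3,000 headline figure, and the per-person cross-check lands on the $762 cited earlier.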

Beyond the grocery-bill number, food waste generates costs the household pays for through taxes, utility fees, and environmental damage whether it knows it or not:

  • Landfill tipping fees: The 2024 Environmental Research and Education Foundation’s national tipping-fee survey put the weighted-average U.S. landfill tipping fee at $62.63 per ton, which is up 10 percent year over year — the largest annual increase since 2022. Every ton of food scraps sent to landfill is a ton charged against the municipal solid-waste budget that residents fund through utility bills and property taxes.
  • Landfill methane: Food waste is the single largest contributor to the methane emissions from U.S. landfills, which are the third-largest source of anthropogenic methane in the country.
  • Food insecurity: The 35 million tons of consumer food waste translate to nearly 58 billion meals that could have gone to people in need, while roughly 14 percent of Americans (1 in 7) experience food insecurity. The waste is not just squandered resources; it is a distribution failure with a public-health cost downstream.
  • Water: Nine trillion gallons is an abstract number. It is roughly the volume of Lake Okeechobee. Every drop required an energy input for pumping, treatment, and, in the western third of the country, an increasingly scarce supply.

Where the infrastructure works, and where it doesn’t

Curbside organics collection, the municipal programs that pick up food scraps along with yard waste for industrial composting or anaerobic digestion, is available in parts of California, Oregon, Washington, Massachusetts, Vermont, Colorado, Minnesota, and a growing number of metro areas in other states. Where it runs, organics collection materially shifts the numbers. San Francisco’s mandatory program, the oldest and most cited, diverts the majority of residential organic material from landfill and produces commercial-grade compost that returns to regional farms.

Outside those states, most households have no curbside pathway. Backyard composting is the most widely available option. For households without the space or the desire to compost at home, a small ecosystem of digital services has grown up to fill the gap municipal programs don’t cover. MakeSoil and Peels operate peer-matching platforms that connect people who have food scraps with neighbors who already run a compost pile, worm bin, or chicken coop. CompostNow runs paid curbside pickup in a growing list of cities, including Atlanta, Asheville, Cincinnati, and the Raleigh-Durham area, and partners with municipalities on drop-off programs elsewhere. ShareWaste, the original neighbor-matching service and the one most commonly cited in earlier reporting, was unfortunately shuttered at the end of 2024.

The biggest household lever on food waste is not composting. It is prevention. Composting turns discarded food into a lower-impact product, but that food still represents calories, dollars, and upstream water and energy that never delivered their purpose. The first line of defense is buying, storing, and planning to match the family’s actual consumption. The second line is composting what remains.

Take Action

At the individual and household level, some simple steps can make a difference:

  1. Audit one week of your kitchen trash. Actually weigh or photograph a week of food-bin contents. Families who do this consistently identify their top three loss categories (usually produce, leftovers, and bread) within a single week, and those become the behavior targets.
  2. Shop the fridge, then the pantry, then the store. Before writing a grocery list, list what’s already on hand. Plan at least one “use it up” meal per week built around what is about to spoil.
  3. Learn date labels. “Use by” is the only label where food should not be eaten after the date, and only for a short list of products (infant formula, some deli meats). “Sell by” is inventory guidance for the retailer. “Best by” is quality guidance, not safety.
  4. Freeze aggressively. Bread, cheese, cooked grains, leftovers, and most produce (with minimal prep) all freeze well. Most household waste is time-based; the freezer pauses the clock.
  5. Start composting where collection exists, or set up a backyard or countertop system. Earth911’s recycling search tool lists local organics programs by ZIP code.

At the community and policy level, a little cooperation and activism can go a long way:

  1. Support mandatory organics collection where your state or city is considering it, then use the services when available. Organics bans have now passed in California (SB 1383, mentioned above), Vermont, Connecticut, Maryland, New Jersey, New York, Rhode Island, and Washington. The programs work only when households participate.
  2. Push for a unified federal date-label standard. Legislation has been introduced in every recent Congress. It has not passed.
  3. Work on food insecurity in the same room as food waste. The two issues belong on the same municipal agenda. Rescue organizations — Feeding America, City Harvest, community food-pantry networks — need volunteers and advocacy as much as they need donations.

The post About That $3,000 Bag of Groceries in Your Trash appeared first on Earth911.


The 2026 World Cup Will Be the Most Polluting Ever

28 April 2026 at 11:00

Nine million tons of carbon dioxide equivalent. That is the projected climate cost of the 48-team, three-country, 16-city soccer tournament that kicks off June 11 in Mexico City — nearly double the average emissions of every World Cup held between 2010 and 2022.

The figure comes from a peer-reviewed analysis published by Scientists for Global Responsibility, the Environmental Defense Fund, Cool Down, the Sport for Climate Action Network, and the New Weather Institute. Their conclusion: FIFA’s decision to expand the tournament and spread it across a continent has locked in a climate footprint that no amount of host-city recycling or LED lighting can offset.

That makes the question of which host cities are doing serious sustainability work more important, not less. Their practices will outlast the tournament.

The Problem Is Structural

World Cup-related team air travel will account for roughly 7.7 million tons of CO2-equivalent — about 85% of the total, according to the SGR analysis. That is the direct consequence of two FIFA decisions. First, the tournament grew from 32 to 48 teams and from 64 to 104 matches. Second, FIFA chose to hold those matches across Canada, Mexico, and the United States rather than concentrate them in a single region.

The contrast with the previous tournament is stark. Qatar 2022 kept its eight stadiums within 34 miles of each other. The shortest distance between 2026 stadiums, from MetLife in New Jersey to Lincoln Financial Field in Philadelphia, is 95.5 miles. Most teams’ itineraries cover thousands of miles. One UEFA playoff winner, according to a Fossil Free Football analysis, could travel Toronto to Los Angeles (2,175 miles), then Los Angeles to Seattle (932 miles), then, in the knockout rounds, another 2,500 miles to Boston.

FIFA does not set binding emissions limits for host cities, and it did not address the underlying decision to spread the tournament across North America. SGR’s researchers urged FIFA to reverse the team expansion, set mandatory environmental standards, and end sponsorship deals with high-emitting companies, including the Saudi oil company Aramco, whose sponsorship is estimated to result in an additional 30 million tons of CO2e due to energy sales linked to the tournament’s promotion.

The Heat Risk Nobody Planned For

Climate change is not just an abstraction measured in tournament emissions. It is a condition players and fans will experience in real time. The SGR/EDF report assessed heat, flooding, and extreme weather risk at all 16 stadiums. Six face extreme heat stress due to Wet Bulb Globe Temperatures above 80°F, the threshold where exertion becomes dangerous. Eight of the 16 cities require what the researchers called immediate environmental intervention. Four need critical intervention, according to the report.

AT&T Stadium in Arlington, Texas, which will host nine World Cup matches — more than any other venue — experiences 37 days per year above 95°F, with July wet bulb readings that exceed FIFA safety thresholds.

Houston’s NRG Stadium faces simultaneous heat, flooding, and wildfire risk; Los Angeles contends with wildfire smoke; Miami faces hurricanes.

Where Host Cities Lead, and Where They Lag

A sustainability ranking published by World Sports Network in April 2026 attempts to score the 16 host cities across transit access, electric vehicle infrastructure, waste, air pollution, urban greening, and greenhouse gas emissions. The methodology has limits — it weights all factors equally, uses stadium-specific data alongside city-wide data, and includes some questionable proxies — but its directional finding is consistent with what urban sustainability researchers have long documented about North American cities.

Vancouver tops the rankings. British Columbia generates roughly 95% of its electricity from renewable sources, largely hydropower. BC Place sits in the center of Vancouver, with 26 public transit stops within a 10-minute walk. Fans can reach it by SkyTrain or bus. That single design decision eliminates most of the vehicle trips and parking-lot sprawl that define a typical U.S. stadium day.

Boston ranked second, the highest-scoring U.S. city. That is less about inherent greenness than about what severe flooding has forced the city to prepare for. Boston experienced 19 days of flooding in 2024, and sea levels around the city are projected to rise 20 centimeters by 2030 relative to 2000. The city’s Building Emissions Reduction and Disclosure Ordinance requires large buildings to cut emissions to net zero by 2050, with interim targets that have already tightened performance at Gillette Stadium’s surrounding infrastructure.

Mexico City placed third, Toronto fourth, Monterrey fifth. Four of the top five cities are outside the United States, even though 11 of the 16 host cities are American. Mexico City’s transformation from one of the most polluted major cities in the world into one of the Americas’ most active urban reforesters, with over 27 million trees and plants added between 2018 and 2021, is the kind of long-horizon work that does not fit inside a tournament timeline but shapes what that timeline makes possible.

The American Transit Problem

Every U.S. host city except Boston falls in the bottom half of the WSN ranking, and the reason is almost always the same: transit.

AT&T Stadium in Arlington has no public transit stops within a 10-minute walk. Hard Rock Stadium in Miami, which will host seven matches, sits 17 miles north of downtown Miami with no rail connection. SoFi Stadium in Inglewood, MetLife in East Rutherford, and NRG in Houston all require a car, a shuttle, or a rideshare for most attendees.

Dallas-Fort Worth is ranked third in the world for transportation-related greenhouse gas emissions, a structural problem no single event can fix. The Dallas organizing committee has built a sustainability plan in collaboration with the University of Texas at Arlington’s chief sustainability officer, Meghna Tare. It addresses waste management, single-use plastic reduction, composting, and community legacy. The North Central Texas Council of Governments has designed a charter bus system to fill the transit gap for the nine matches AT&T Stadium will host. These are real efforts. They also show that when infrastructure is car-dependent, event-specific workarounds can reduce harm but don’t substitute for the public transit that does not exist.

What This Means Beyond the Tournament

The 2026 World Cup will be a 34-day event watched by a projected 5 million in-person fans and up to 6 billion viewers worldwide. The emissions it generates will dissipate into an atmosphere that cannot tell tournament carbon from commuting carbon. What will persist are the infrastructure choices each host city makes now: whether transit lines are extended, whether stadium renovations meet LEED standards, and whether food recovery programs continue operating after the final match or get packed away with the branded signage.

These are not reasons to hate world football. It’s the Beautiful Game, and its governing body, FIFA, can make changes to reduce the tournament’s impact and protect players from heat-related injuries.

The post The 2026 World Cup Will Be the Most Polluting Ever appeared first on Earth911.


Trump Terminates Entire National Science Board

27 April 2026 at 14:44

Research & Developments is a blog for brief updates that provide context for the flurry of news regarding law and policy changes that impact science and scientists today.

The Trump Administration has terminated the positions of every member of an independent board meant to govern the National Science Foundation (NSF).

The National Science Board directs and approves large funding decisions for NSF’s approximately $9 billion basic science research budget. It is meant to function independently from the federal administration to keep science funding insulated from political pressure and budget cycles.

“I have watched the systematic dismantling of the scientific advisory infrastructure of this government with growing alarm, and the National Science Board is simply the latest casualty.”

In a 24 April notice from the Presidential Personnel Office, all the scientists serving on the board were informed their positions had been eliminated. The emails dismissing board members provided no reason for the termination.

“I am deeply disappointed, though I cannot say I am entirely surprised,” Willie E. May, one of the terminated board members and vice president of research and economic development at Morgan State University in Maryland, told The New York Times.

“I have watched the systematic dismantling of the scientific advisory infrastructure of this government with growing alarm, and the National Science Board is simply the latest casualty,” he said. 

Ranking member of the House Committee on Science, Space, and Technology Zoe Lofgren (D-CA) called the terminations “the latest stupid move made by a president who continues to harm science and American innovation.”  

The terminations come after a year of upheaval for higher education and research budgets. Last year, NSF granted 51% less funding to scientists than the 2015-2024 average and terminated hundreds of active grants. Last May, the Trump administration proposed cutting $5 billion from NSF’s budget, though the proposal was rejected. The president’s budget request for fiscal year 2027 once again proposes to reduce the foundation’s budget by more than half. In a February 2026 meeting of the National Science Board, NSF leadership said the foundation was seeking to reduce grant solicitations.

The Trump administration has also restructured scientific advisory groups elsewhere in the federal government, eliminating 152 federal advisory committees at science agencies, merging all of the Department of Energy’s advisory committees into one, and dismantling the Environmental Protection Agency’s research office.

“Without a functional National Science Board in the near term, the agency is left without the guidance and oversight of independent experts, and the public is left without information on how NSF is carrying out its mission,” Gretchen Goldman, president and CEO of the Union of Concerned Scientists, wrote in a blog post about the terminations. 

—Grace van Deelen (@gvd.bsky.social), Staff Writer

These updates are made possible through information from the scientific community. Do you have a story about how changes in law or policy are affecting scientists or research? Send us a tip at eos@agu.org.

Text © 2026. AGU. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.

Tracing the Path of PFAS Across Antarctica

27 April 2026 at 13:16
An iceberg sits in a rough, partially frozen sea near Antarctica.

Per- and polyfluoroalkyl substances (or PFAS) have been widely used in thousands of common nonstick, waterproof, or stain-resistant products since the 1950s. These “forever chemicals” do not break down easily: PFAS make their way into the air, soil, and water, as well as into human and animal bloodstreams and to some of Earth’s most pristine environments. They have been detected even in Antarctica, despite its reputation as a relatively untouched landscape far from the types of products—fast-food wrappers, firefighting foam, nonstick cookware—that contain PFAS.

“We know PFAS are very persistent, so that helps. By looking at the patterns of the PFAS contamination in [Antarctic snow] samples, it gives us clues as to how they’re transported.”

Research into how PFAS arrive in Antarctica is limited, and most tends to focus on the continent’s coasts, rather than its interior. A new study published in Science Advances aimed to fill some of these gaps by examining PFAS accumulation across a 1,200-kilometer stretch of Antarctica, from the snow pits of Zhongshan Station in East Antarctica to the 4,093-meter peak of Dome A. By examining layers of snow collected from the coast to the interior, researchers sought to better track and understand how PFAS levels vary by location and how these forever chemicals have been able to travel long distances through the upper atmosphere to be deposited in remote regions.

“For substances to get there, they have to be able to transport long distances,” said Ian Cousins, a chemist at Stockholm University and one of the study’s authors. “We know PFAS are very persistent, so that helps. By looking at the patterns of the PFAS contamination in the samples, it gives us clues as to how they’re transported.”

PFAS Arrive by Air and by Sea

Along the 1,200-kilometer route, researchers from the Chinese Academy of Sciences collected 39 snow samples at 30-kilometer intervals, scraping the first few centimeters of snow from the surface to analyze for traces of PFAS.

Zhongshan Station sits near Prydz Bay, and there, researchers collected snow from a 1-meter-deep pit, with samples taken every 5 centimeters. At Dome A, the summit of the East Antarctic Ice Sheet, samples were collected at 10-centimeter intervals from another snow pit; this one was 3 meters deep, providing information about the past several decades of PFAS use.

“It’s quite interesting that we see the historical production record of PFAS in this pit on the top of this mountain in Antarctica,” said Cousins.

PFAS pollution arrives in Antarctica in two ways: via upper atmospheric transport and via sea spray. Some PFAS form in the atmosphere when volatile precursor chemicals, such as the fluorotelomer alcohols used in textile and paper products, react with sunlight and oxidants and break down into more stable compounds. The resulting PFAS are later deposited into the snow and ice through precipitation.

Storm winds near the coast create sea spray. “When you have waves, it makes bubbles in the ocean. When the bubbles burst, these sea spray aerosols can be super enriched with PFAS. This has been shown to be a very important transport route,” Cousins said.

To distinguish between sources, researchers measured sodium in the snow to trace the ocean’s salty influence. Sodium levels decreased farther inland, reflecting the fading influence of sea spray toward the interior of the continent. But surprisingly, PFAS concentrations actually increased moving from the coast into the interior.

“That was kind of a bit counterintuitive to me,” explained Cousins, who said he expected PFAS levels to be highest near the coast. “You see the opposite, actually.”

The inland increase is likely explained by higher snowfall totals in the coastal regions, which lead to PFAS concentrations becoming diluted. Inland, where snowfall is lower, even small amounts of PFAS can result in relatively higher concentrations within snow samples.

Additional factors shape PFAS distribution. PFAS levels are higher at the onset of precipitation events when they are rapidly removed from the air. Temperature inversions, too, can trap chemicals. In coastal areas, PFAS are more influenced by sea spray in the winter, whereas stronger sunlight drives the degradation of atmospheric precursors into PFAS in the summer months.

PFAS Presence at Both Poles

This new study also offers implications for the way that PFAS circulate globally. Though industrial activity in the Northern Hemisphere contributes most heavily to PFAS emissions, large-scale atmospheric circulation allows these compounds to reach polar regions. Rapid transport in the upper troposphere may act as an efficient pathway to shuttle PFAS across both hemispheres before they are deposited in the cold, remote regions at both ends of Earth.

“This completes the global picture with agreeing measurements at both poles, solidifying our understanding of the global distribution and drivers of PFAS contamination.”

Even though PFAS levels are higher in the Arctic, both polar regions show similar trends in PFAS concentrations since the 1990s. “It really matches decades of the same records that have been reported from the Arctic,” said Cora Young, an atmospheric chemist at York University, who was not involved in the new study.

“This completes the global picture with agreeing measurements at both poles, solidifying our understanding of the global distribution and drivers of PFAS contamination. The role of CFC [chlorofluorocarbon] replacements, changes in regulation, all of these things are important in the Northern Hemisphere and also the Southern Hemisphere,” said Young.

—Rebecca Owen (@beccapox.bsky.social), Science Writer

Citation: Owen, R. (2026), Tracing the path of PFAS across Antarctica, Eos, 107, https://doi.org/10.1029/2026EO260129. Published on 27 April 2026.
Text © 2026. The authors. CC BY-NC-ND 3.0

Seeing Climate Change Through a New Lens

By: Guest
27 April 2026 at 15:52
M.A. in Climate and Society student Erin Frank shoots film around New York City. She says her camera and climate coursework have more in common than she expected.


The Price Tag on a Ton of Carbon: What It Is, Why It Keeps Changing, and What It Means for Your Future

27 April 2026 at 11:00

If you took one long-haul flight each year for the past decade, the world would eventually pay about $25,000 for it. You won’t see this charge on your credit card, but the cost shows up somewhere—maybe as a hotter field with less rice, a stronger hurricane, or a factory forced to close on days that are too hot to work. This estimate comes from a Nature study published in March 2026 by researchers at Stanford and the University of California, Berkeley. They created a new way to link damage from specific emissions to certain places and years.

That $25,000 figure is based on the social cost of carbon, a dollar estimate of the harm caused by releasing one ton of carbon dioxide into the air. While it might seem abstract, it is one of the most important numbers in American policy. It helps decide if a fuel-economy rule is worth it and influences permits for pipelines and power plants. Over the last four presidential administrations, this number has been raised, lowered, removed, and brought back. What we think a ton of carbon costs today affects how much the country is willing to do about climate change in the future.

What Is the Social Cost of Carbon?

Think of the cost of carbon like a garbage bill, the metaphor the authors of the Nature study use. When you put trash on the curb, someone has to pick it up, haul it away, and store it somewhere. You pay for that service. Carbon dioxide works the same way, except no one sends an invoice—it’s more like using a credit card, the bill for which your children or great-grandchildren will eventually pay.

Carbon dioxide stays in the atmosphere for centuries, quietly heating the planet, damaging crops, intensifying storms, and wearing down economies. Somebody, somewhere, eventually pays. The social cost of carbon is an attempt to figure out how much.

The number comes from combining climate science with economics. Researchers model how one extra ton of CO₂ affects global temperatures over the next century or two, then estimate how those temperature changes damage human health, farm yields, labor productivity, property, and economic growth. They add up the losses and express them in today’s dollars.

Two technical choices drive almost every disagreement about the final number:

  • Global versus domestic damages. Should the United States count the damage that occurs in India, Brazil, or Bangladesh from American emissions? Carbon mixes in the atmosphere — a ton released in Ohio warms the planet the same as a ton released in Mumbai — so the economic case for global accounting is strong. The political case for domestic-only accounting is that the US government works for Americans.
  • The discount rate. This is the trickiest piece. Economists “discount” future damages to express them in present-day dollars. A higher discount rate makes future harm look cheap today; a lower one makes it look expensive. Using a 7% discount rate, $1 trillion in climate damage in 2100 is worth only about $4 billion today. Using 3%, the same damage is worth about $86 billion. Same science, same damage, twenty times the present value.
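The discount arithmetic in the second bullet is easy to reproduce. The sketch below is a minimal illustration, assuming an 82-year horizon to 2100 (a choice that roughly recovers the article’s figures; the actual federal analyses fix their own base years and conventions):

```python
# Present-value sketch of the discount-rate dispute. The 82-year horizon
# is an assumption chosen for illustration, not a figure from the article.

def present_value(future_damage: float, rate: float, years: int) -> float:
    """Discount a future dollar amount back to today at a constant annual rate."""
    return future_damage / (1 + rate) ** years

damage_2100 = 1_000_000_000_000  # $1 trillion of climate damage in 2100
horizon = 82                     # assumed years between today and 2100

pv_high = present_value(damage_2100, 0.07, horizon)  # ~$4 billion today
pv_low = present_value(damage_2100, 0.03, horizon)   # ~$89 billion today

print(f"7% rate: ${pv_high / 1e9:.0f}B today")
print(f"3% rate: ${pv_low / 1e9:.0f}B today")
print(f"ratio: {pv_low / pv_high:.0f}x")
```

Under these assumptions the two rates differ by a factor of roughly twenty, which is the whole fight in miniature: the science is identical, and only the weight given to future people changes.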

That second choice, how much weight to give your grandchildren’s losses compared to your own savings, is where climate economics becomes a moral question.

A Short History of a Disputed Number

2008: A Court Forces the Issue

Federal agencies ignored carbon pricing for most of the modern regulatory era. That changed after the Center for Biological Diversity sued the Bush administration over weak fuel-economy standards for light trucks and SUVs. In 2008, the Ninth Circuit Court of Appeals ruled that assigning zero value to carbon emissions in cost-benefit analyses was “arbitrary and capricious.” The court stated: “the value of carbon emissions reduction is certainly not zero.”

That decision created a legal obligation. If federal agencies wanted to write rules that survived court review, they had to put a price on carbon. They just did not yet have one they could agree on.

2009–2016: The Obama Administration Sets the Framework

In 2009, President Obama convened an Interagency Working Group of federal economists and scientists. In 2010, the group published its first official estimate of the social cost of carbon: $21 per ton of CO₂.

In the following years, as climate models were updated, the estimate rose, reaching about $50 per ton (2020 dollars) by the end of the Obama years. This value was based on a 3% discount rate and global damages.

That framework, which involved interagency process and peer-reviewed models with global scope, was used in more than 65 federal rules and 81 subrules between 2008 and 2016. It shaped appliance efficiency standards, power plant emission limits, fuel-economy requirements, and rules governing methane leaks from oil and gas infrastructure. A higher social cost of carbon justified stricter rules. A lower one did not.

2017–2020: The First Trump Administration Rewrites the Math

Within months of taking office, President Trump signed Executive Order 13783, disbanding the Interagency Working Group and withdrawing its estimates. The Trump EPA recalculated the social cost of carbon by counting only US damages and raising the discount rate to 3%-7%. As a result, Obama’s $52 per ton estimate fell to between $1 and $7 per ton.

That lower number was, as Resources for the Future explained, “too low to make climate policies economically justifiable.” Analyses that had justified strict emissions rules under Obama suddenly no longer did. The Clean Power Plan, the centerpiece of Obama’s climate policy, was repealed partly on the grounds that the climate benefits, recalculated with the lower number, no longer exceeded the costs. According to Scientific American, the change in the social cost of carbon was “determinative” in at least half a dozen petroleum-sector rollbacks during the first Trump term. Simply put, it gave emitters an easy out.

2021–2024: Biden Restores, Then Sharply Raises, the Price

Biden reinstated the working group and set an interim value of about $51 per ton, adjusted for inflation. Legal challenges from some states were dismissed.

In November 2023, EPA set a new central estimate for the social cost of carbon: $190 per ton for 2020 emissions, rising to $230 by 2030 and $308 by 2050. This increase drew on updated climate science, new economic models, a lower discount rate of 2%, and two decades of scientific progress clarifying warming’s impact on economic growth, climate-driven mortality, and previously understated risks.

Other governments took note. Canada adopted the updated EPA number in 2023. Germany adapted the underlying model for its own analyses in 2024.

2025: The Second Trump Administration Tries to Erase It

On his first day back in office, January 20, 2025, President Trump signed Executive Order 14154, “Unleashing American Energy,” which disbanded the Interagency Working Group, withdrew its estimates, and directed EPA to consider eliminating the social cost of carbon from federal permitting and regulatory decisions entirely. The order called the metric “marked by logical deficiencies, a poor basis in empirical science, politicization, and the absence of a foundation in legislation.”

In March 2025, EPA Administrator Lee Zeldin announced the agency would “overhaul” the social cost of carbon. In May 2025, a follow-up executive memorandum directed federal agencies to stop factoring climate-related economic damage into their regulations and permitting decisions, except where statute requires it.

Where agencies are still legally obligated to put a number on it, the administration has settled on an interim estimate of as little as $1 per ton of CO₂, a return to the first Trump administration’s methodology, with domestic-only damages and higher discount rates. The companion social cost of methane dropped from $1,470 per ton to $58. In July 2025, the White House guidance went further, instructing agencies that any required analysis should be limited to “the minimum consideration required to meet a statutory requirement” and, where possible, should not be monetized at all. The practical effect: $1 per ton on paper, $0 in most decisions.

The cycle is now in its third full reversal since 2008. Each time the number changes, so does the federal government’s willingness to regulate emissions.

What the New Research Adds

The new study in Nature does something the federal estimates have never done well: it separates past damage from future damage, and it assigns both to specific emitters. The study’s framework treats every ton of CO₂ as an asset that pays out negative returns; it’s a garbage bill that keeps accruing interest. Using that framework, the authors found three things that reshape the conversation.

A ton of CO₂ emitted in 1990 has already caused about $180 in global damages by 2020. That same ton will cause an additional $1,840 in damages between now and 2100 — 10 times more. Using the authors’ conservative assumptions (a 2% discount rate, with damages capped at 2100), the social cost of carbon for a ton emitted today is approximately $1,013. That is more than five times the Biden EPA’s $190 estimate, and higher estimates are possible under longer time horizons or lower discount rates.
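The asset framing can be sketched in a few lines. This toy model assumes a constant annual damage payout chosen purely for illustration (the real study derives year-by-year damages from coupled climate-economy models), discounted at the paper’s 2% rate and capped at 2100:

```python
# Toy sketch of the study's framing: a ton of CO2 as an "asset" that pays
# out damages every year until 2100, discounted back to the present.
# The flat $26/year damage stream is invented for illustration only.

def discounted_damages(annual_damage: float, rate: float, years: int) -> float:
    """Sum a constant annual damage stream, discounted back to today."""
    return sum(annual_damage / (1 + rate) ** t for t in range(1, years + 1))

# 74 years between now and 2100 (an assumed cutoff), at the 2% rate.
scc = discounted_damages(annual_damage=26.0, rate=0.02, years=74)
print(f"toy social cost of carbon: ${scc:,.0f} per ton")
```

The point of the exercise is structural, not numerical: capping damages at 2100 and raising the discount rate both shrink the sum, which is why the headline number swings so widely between administrations.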

Settling the bill for climate damage that has already happened would only cover a small fraction of the damage still to come from the same emissions. Past payments do not clear past debts.

Individuals and Corporations Run Up the Carbon Bill

The study also puts numbers on the kinds of choices that fill everyday life.

  • Taking one extra long-haul flight per year for a decade produces roughly $25,000 in discounted future damages by 2100.
  • Switching from a meat-heavy to a vegetarian diet for a decade avoids about $6,000 in future damages.
  • Installing and using a heat pump for a decade avoids another $6,000.
  • Cutting driving by 10% avoids a further $6,000 in future costs.

At the corporate scale, the numbers are staggering. Emissions from Saudi Aramco's fossil fuel production between 1988 and 2015 are estimated to cause $64 trillion in cumulative discounted damages through 2100. ExxonMobil's comparable share: $29 trillion. Both figures exceed the annual GDP of most countries.

Today’s Cost, Tomorrow’s Reality

The social cost of carbon can feel like a number on a page in a regulatory document. It is not. It is a bridge between the world you are living in now and the world you will inherit.

When the federal government uses a low social cost of carbon, or no number at all, it writes rules that allow more emissions. More emissions mean a hotter atmosphere, which means stronger storms, longer fire seasons, lower crop yields, higher air conditioning bills, and more days when outdoor work becomes dangerous. Those consequences do not arrive as a lump sum in 2100.

They arrive gradually, starting now, and compounding in the form of flood and wildfire damage, biodiversity loss, and even defense spending aimed at deterring climate-driven migration. The Nature researchers emphasize that their estimates are almost certainly too low because GDP damage functions do not capture losses of biodiversity, loss of cultural homelands, harm to mental health, or many slow-moving impacts such as sea level rise.

When the federal government uses a high social cost of carbon, it writes rules that prevent emissions. Those rules have real costs today, paid by workers in fossil fuel industries, by consumers adjusting to new standards, and by companies retooling their operations. The social cost of carbon does not eliminate those costs. It weighs them against costs that will otherwise fall on other people, in other places, at other times. That weighing is a choice about who counts.

The history traced here is, in that sense, a history of that choice, and none of those decisions are final. Courts have repeatedly ruled that federal agencies cannot treat the value of carbon-emissions reductions as zero. The 2008 ruling that gave rise to this framework is still on the books. Whatever the current administration does, the legal obligation to account for climate damages in cost-benefit analysis remains, and the science underpinning the newer, higher estimates continues to strengthen.

The post The Price Tag on a Ton of Carbon: What It Is, Why It Keeps Changing, and What It Means for Your Future appeared first on Earth911.
