This story was originally published by the Guardian and is reproduced here as
part of the Climate Desk collaboration.
One year on from the Eaton Fire, long after the vicious winds that sent embers
cascading from the San Gabriel mountains and the flames that swallowed entire
streets, a shadow still hangs over Altadena.
Construction on new properties is under way, and families whose homes survived
the fire have begun to return. But many are grappling with an urgent question:
is it safe to be here?
The fire upended life in this part of Los Angeles county. By the time
firefighters brought it under control, 19 people were dead, tens of thousands
displaced and nearly 9,500 structures destroyed, primarily in Altadena but also
in Pasadena and Sierra Madre.
Guardian graphic. Fire extent source: Cal Fire. Building damage source: analysis
of Copernicus Sentinel-1 satellite data by Corey Scher of CUNY Graduate Center
and Jamon Van Den Hoek of Oregon State University. Using building data from Oak
Ridge National Laboratory and fire perimeters from NIFC/FIRIS. Note: all times
are local
The flames incinerated many older homes and businesses filled with lead paint
and asbestos. They showered the community with toxins, leaving tall piles of ash
and unseen traces of heavy metals in the soil and along and inside standing
structures. Research has indicated some hazards remain even after properties
have undergone remediation, the clean-up process that is supposed to restore
homes and ensure they are safe to occupy.
As Altadena fights to return, residents—some eager to stay in the community and
others who simply can’t afford to go anywhere else—are facing immense challenges
while trying to rebuild their lives and come back home.
Official information about the health risks was limited early on, and those
returning often learned about the dangers only as they went. Some people have
developed health problems such as migraines and respiratory issues. Many are
still battling their insurance companies to fully cover their costs and make
certain their homes are habitable.
Their predicament highlights the increased dangers that come with urban fires,
and shows how Altadena has come to serve as a sort of living laboratory with
scientists and residents learning in real time.
Nicole Maccalla, a data scientist, and her family moved back into their Altadena
home over the summer after their property underwent an extensive cleanup. But
their air purifiers still register high levels of particulate matter, heavy
sediment appears when they vacuum, and when it rains the distinctive smell of
the fire returns.
“The toll of displacement was really high on my family. And I just had to move
home and try [to] mitigate risk and keep fighting the good fight,” she said.
“There’s always that back-of-your-mind concern: Did I make the right choice? But
I also don’t have other choices.”
Early on in those first careening hours of the fire, as thick smoke and ash fell
like snow over her yard, Dawn Fanning was sure her home would not be spared. The
wind was blowing from the fire straight to the Spanish bungalow the producer
shared with her adult son, and it seemed there was no way to stop it.
Dawn Fanning outside her home in Pasadena, California, on December 28 2025. The
interior of her home has been found to have lead and asbestos after the Eaton
Fire. Stella Kalinina/The Guardian
Fanning’s home, miraculously, escaped the flames. But while the stucco
structure was intact—clothes still hanging undisturbed in her closet and her
son’s baby photos packed carefully in bins in the garage—it had not emerged
unscathed. Virtually nothing in Altadena had.
“It’s dusty and there’s piles of ash in the windowsills and on the floor. At
first glance, it doesn’t look any different,” Fanning said. “Your house looks
the same—but it’s not. There’s toxicity in your attic and in your crawlspace and
on your mattresses and on all the things.”
Confused and frustrated with the local government’s handling of health concerns,
Maccalla and Fanning joined other fire survivors to form Eaton Fire Residents
United in hopes of ensuring the affected areas recover safely. The community
group is developing testing and remediation guidelines, has gathered testing
reports from hundreds of homes, and advocates for fire survivors and workers.
> “When she awoke at 3 a.m., the blaze had formed a horseshoe shape around her
> house, and smoke filled the room.”
“There [have been] huge threats to the health and safety of residents, children
in schools, elderly and immunocompromised, workers that are coming into this
area that are being exposed to hazards in the workplace,” Maccalla said. “We’re
still trying to work on that and get the protections people need.”
Barely 15 miles north-east of downtown Los Angeles, Altadena at the start of
last year was home to some 43,000 people, many lured by the affordable home
prices, proximity to the mountains and bucolic feel. It has long been one of the
most diverse areas in the region, with a thriving Black community that began to
grow during the Great Migration.
In the early evening on January 7 2025, Fanning, who had lived in her home in
the area for two decades, had a feeling she couldn’t shake that something could
go very wrong. There were treacherous winds that forecasters warned posed a
serious fire risk. Already, a fire was spreading rapidly on the other side of
the county in the Pacific Palisades, where frantic residents were trying to
evacuate and firefighters were clearing the area.
Some 35 miles away, Fanning and her son were watching coverage of the unfolding
fire while readying their property. Then came an alert—not from officials—but
from a local meteorologist who was telling his followers to get out now. Fanning
spotted flames several blocks away and she and her son decided it was time to
leave.
A few miles to the east, Rosa Robles was evacuating with her grandchild in tow,
leaving her husband and adult children. She wanted them to go—but they were
protecting the home. Armed with garden hoses, they tried to save the residence
and the other houses on their block. Sometimes the wind was so strong it blew
the water back in their faces, Robles said.
Maccalla’s power had gone out that morning, and she and her children were
sitting around watching the TV drama Fire Country on an iPad in the dark when
they got the call about the fire. It seemed far away at the time, Maccalla
recalled, and she felt prepared as a member of a community emergency response
team.
They got out lamps and began packing in case they needed to leave. She set
alarms hourly to monitor the progress of the fire while her children slept.
When she awoke at three, the blaze had formed a horseshoe shape around her
house, and smoke filled the room. The family evacuated with their two dogs and
two cats.
Tamara Artin had returned from work to see chaos on the street, with fierce
winds and billowing smoke all around the house she rented with her husband.
Artin, who is Armenian by way of Iran and has lived in Los Angeles for about six
years, always loved the area. She enjoyed the history and sprawling green parks,
and had been excited to live here.
Now the pair was quickly abandoning the home they had moved into just three
months earlier, heading toward a friend’s house with their bags and passports.
Fanning and her son had gone to a friend’s home too. As they stayed up late
listening to the police scanner, they heard emergency responders call out
addresses where flames were spreading. These were friends’ homes. She waited to
hear her own.
> “We were worried, of course, because we were inhaling all those chemicals
> without knowing what it is.”
In the first days after the fire began, the risk remained and there was little
help available with firefighting resources spread across Los Angeles. Maccalla
and her son soon returned to their property to try to protect their home and
those of their neighbors.
“I was working on removing a bunch of debris that had flown into the yard and
all these dry leaves. I didn’t know at the time that I shouldn’t touch any of
that,” she said.
The devastation in Altadena, as in the Palisades, was staggering. Many of the 19
people who died were older adults who received evacuation warnings hours after
people in other areas of town, if they received them at all.
Physically, parts of Altadena were almost unrecognizable. In the immediate
aftermath of the fire, bright red flame retardant streaked the hillsides. Off
Woodbury Road, not far from where Robles and Artin lived, seemingly unblemished
homes stood next to blackened lots where nothing remained but fireplaces and
charred rubble—scorched bicycles, collapsed beds and warped ovens. The pungent
smell of smoke seemed to embed itself in the nose.
Robles would sometimes get lost in the place she had lived her whole life as she
tried to navigate streets that had been stripped of any identifiable landmarks.
Fire scorched the beloved community garden, the country club, an 80-year-old
hardware store, the Bunny Museum and numerous schools and houses of worship.
Artin and her husband returned to their home, which still stood, after a single
night. They had no family in the area and nowhere else to go—hotels were packed
across the county. For nearly two weeks they lived without water or power as
they tried to clean up, throwing away most of their furniture and belongings,
even shoes, and all of the food in the fridge and freezer.
“We were worried, of course, because we were inhaling all those chemicals
without knowing what it is, but we didn’t have a choice,” Artin recalled.
As fires burn through communities, they spread particulate matter far and wide,
cause intense smoke damage in standing structures and cars, and release
chemicals even miles beyond the burned area.
> After one round of remediation, “six out of 10 homes were still coming back
> with lead and/or asbestos levels that exceeded EPA safety thresholds.”
When Fanning saw her home for the first time, thick piles of ash covered the
floors. She was eager to return, but as she tried to figure out her next steps,
reading scientific articles and guides, and joining Zoom calls with other
concerned residents, it was clear she needed to learn more about precisely what
was in the ash. Asbestos was found in her home, meaning all porous items,
clothing and furniture, were completely ruined.
“You can’t wash lead and asbestos out of your clothing. I was like, OK, this is
real and I need to gather as much evidence [as I can] to find out what’s in my
house.”
In Altadena, more than 90 percent of homes had been built before 1975 and likely
had lead-based paint and toxic asbestos, both of which the EPA has since banned,
according to a report from the California Institute of Technology. All sorts of
things burned along with the houses, Fanning said: plastic, electric cars,
lithium batteries. “The winds were shoving this into our homes,” she said.
The roof on Maccalla’s home had to be rebuilt, and significant cleanup was
required for the smoke damage and layers of ash that blanketed curtains and
beds.
Despite these concerns, residents grew increasingly frustrated about what they
viewed as a lack of official information about the safety of returning to their
homes. Many also encountered pushback from their insurance providers that said
additional testing for hazards, or more intensive remediation efforts
recommended by experts, were unnecessary and not covered under their policies.
So earlier this year a group of residents, including Fanning and Maccalla,
formed Eaton Fire Residents United (EFRU). The group includes scientists and
people dedicated to educating and supporting the community, ensuring there is
data collection to support legislation, and assembling an expert panel to
establish protocols for future fires, Fanning said. They’ve published research
based on testing reports from hundreds of properties across the affected area,
and advocated that homes should receive a comprehensive clearance before
residents return.
Research released by EFRU and headed by Maccalla, who has a doctorate in
education and specializes in research methodology, found that more than half of
homes that had been remediated still had levels of lead and/or asbestos that
rendered them uninhabitable.
“There’s still widespread contamination, and that one round of remediation was
not sufficient the majority of the time. Six out of 10 homes were still coming
back with lead and/or asbestos levels that exceeded EPA safety thresholds,” said
Maccalla, who serves as EFRU’s director of data science and educational
outreach.
The interior of Dawn Fanning’s home has been found to have lead and asbestos
after the Eaton Fire. Stella Kalinina/The Guardian
Maccalla moved back home in June after what she viewed as a decent remediation
process. But she hasn’t been able to get insurance coverage for additional
testing, and worries about how many people are having similar experiences.
“We’re putting people back in homes without confirming that they’re free of
contamination,” she said. “It feels very unethical and a very dangerous game to
be playing.”
She couldn’t afford not to come home, and the family couldn’t keep commuting two
hours each way from their temporary residence to work and school or the
Altadena property where Maccalla was overseeing construction. But she’s
experienced headaches, her daughter’s asthma is more severe, and her pets have
become sick.
“I don’t think anybody that hasn’t gone through it can really comprehend what
[that is like],” she said. “For everything in your environment that was so
beloved to now become a threat is mentally a really hard switch.”
Robles settled back into the home she’s lived in for years with a few new
additions. Seven of her relatives lost their homes, including her daughter who
now lives with her. “I thank God there’s a place for them. That’s all that
matters to me.”
Nicole Maccalla with her dog, Cami, outside her home in Altadena. Stella
Kalinina/The Guardian
After the fire, she threw away clothes, bed sheets and pillows. The family
mopped and washed the walls. Her insurance was helpful, she said, and covered
the cleanup work. Robles tries not to think about the toxic contamination and
chemicals that spread during the fire. “You know that saying, what you don’t
know?” she said, her voice trailing off.
Artin said she received some assistance from her renter’s insurance, but that
her landlord hadn’t yet undertaken more thorough remediation. She’s still trying
to replace some of the furniture she had to throw away. The fire had come after
an already difficult year in which her husband had been laid off, and their
finances were stretched.
> “I don’t know if I’ll ever feel safe again.”
She shudders when she recalls the early aftermath of the fire, a morning sky as
dark as night. “It was hell, honestly.”
Her rent was set to increase in the new year, and while she fears exposure to
unseen dangers, moving isn’t an option. “We don’t have anywhere else to go. We
can’t do anything,” Artin said.
Fanning has been battling her insurance company to cover the work that is
necessary to ensure her house is safely habitable, she said. Her provider is
underplaying the amount of work that needs to be done and underbidding the
costs, Fanning said. She and her son have been living in a short-term rental
since late summer, and she expects they won’t be able to return home before the
fall.
Sometimes she wonders if she’ll be up to returning at all. Even now, when
Fanning drives through the area to get her mail or check on the house, she
gets headaches. “I don’t know if I’ll ever feel safe, no matter all the things
that I know and all the things that I’m gonna do. I don’t know if I’ll ever feel
safe again.”
In between trying to restore her home, she’s focused on advocacy with EFRU,
which has become her primary job, albeit unpaid. “There are so many people that
don’t have enough insurance coverage, that don’t speak English, that are
renters, that don’t have access like I do … I feel it’s my duty as a human.”
There’s much work to do, Fanning said, and it has to be done at every single
property.
“It’s a long road to recovery. And if we don’t do it right, safely, it’s never
gonna be what it was before.”
This story was originally published by the Guardian and is reproduced here as
part of the Climate Desk collaboration.
Donald Trump, by dramatically seizing Nicolás Maduro and claiming dominion
over Venezuela’s vast oil reserves, has taken his “drill, baby, drill” mantra
global. Achieving the president’s dream of supercharging the country’s oil
production would be financially challenging—and if fulfilled, would be “terrible
for the climate”, experts say.
Trump has aggressively sought to boost oil and gas production within the US.
Now, after the capture and arrest of Maduro and his wife, Cilia Flores, he is
seeking to orchestrate a ramp-up of drilling in Venezuela, which has the largest
known reserves of oil in the world—equivalent to about 300bn barrels, according
to research firm the Energy Institute.
“The oil companies are going to go in, they are going to spend money, we are
going to take back the oil, frankly, we should’ve taken back a long time ago,”
the US president said after Maduro’s extraction from Caracas. “A lot of money is
coming out of the ground, we are going to be reimbursed for everything we
spend.”
Source: The Oil & Gas Journal. Note: China and Taiwan, and Sudan and South
Sudan, are combined in the data. *Estimates for the Saudi Arabian-Kuwaiti
Neutral Zone are divided equally between the two countries. Guardian graphic
US oil companies will “spend billions of dollars, fix the badly broken
infrastructure… and start making money for the country,” Trump added, with his
administration pressing Venezuela’s interim government to repeal a law requiring
oil projects to be half-owned by the state.
> A 50 percent boost in Venezuelan oil production would result in more carbon
> pollution than major economies like the UK and Brazil emit.
Leading US oil businesses such as Exxon and Chevron have so far remained
silent on whether they would spend the huge sums required to enact the
president’s vision for Venezuela. But should Venezuela ramp up output to near
its 1970s peak of 3.7 million barrels a day—more than triple current levels—it
would further undermine the already faltering global effort to limit dangerous
global heating.
Even raising production to 1.5 million barrels of oil a day from current levels
of around 1 million barrels would produce around 550 million tons of carbon
dioxide a year when the fuel is burned, according to Paasha Mahdavi, an
associate professor of political science at the University of California, Santa
Barbara. This is more carbon pollution than what is emitted annually by major
economies such as the UK and Brazil.
“If there are millions of barrels a day of new oil, that will add quite a lot of
carbon dioxide to the atmosphere and the people of Earth can’t afford that,”
said John Sterman, an expert in climate and economics at the Massachusetts
Institute of Technology.
The climate costs would be especially high because Venezuela produces some of
the world’s most carbon-intensive oil. Its vast reserves of
extra-heavy crude are particularly dirty, and its other reserves are “also quite
carbon- and methane-intensive,” Mahdavi said.
The world is close to breaching agreed limits on temperature rise, and is
already suffering more severe heatwaves, storms and droughts as a result.
Increased Venezuelan drilling would further lower global oil prices and slow the
needed momentum towards renewable energy and electric cars, Sterman added.
“If oil production goes up, climate change will get worse sooner, and everybody
loses, including the people of Venezuela,” he said. “The climate damages
suffered by Venezuela, along with other countries, will almost certainly
outweigh any short-term economic benefit of selling a bit more oil.”
During his first year back in the White House, Trump has demanded the world
remain running on fossil fuels rather than “scam” renewables and has threatened
the annexation of Canada, a major oil-producing country, and Greenland, an
Arctic island rich with mineral resources.
Critics have accused Trump of a fossil fuel-driven “imperialism” that threatens
to further destabilize the world’s climate, as well as upend international
politics. “The US must stop treating Latin America as a resource colony,” said
Elizabeth Bast, the executive director of Oil Change International. “The
Venezuelan people, not US oil executives, must shape their country’s future.”
Patrick Galey, head of fossil fuel investigations at the climate and justice NGO
Global Witness, said Trump’s aggression in Venezuela is “yet another conflict
fuelled by fossil fuels, which are overwhelmingly controlled by some of the
world’s most despotic regimes.”
“So long as governments continue to rely on fossil fuels in energy systems,
their constituents will be hostage to the whims of autocrats,” he said.
Oil rigs at Maracaibo Lake in Venezuela’s Zulia state. Leslie Mazoch/AP
Though the president’s stated vision is for US-based oil companies to tap
Venezuela’s oil reserves for profit, making good on that promise may be
complicated by economic, historical and geological factors, experts say.
Oil companies may not be “eager to invest what’s needed because it will take a
lot longer than the three years of President Trump’s term”, said Sterman.
“That’s a lot of risk—political risk, project risk,” he said. “It seems very
tricky.”
Upping production is “also just a bad bet generally”, said Galey. “Any
meaningful increase in current production would require tens of billions of
investment in things like repairs, upgrades and replacing creaking
infrastructure,” he said. “That’s not even taking into account the dire security
situation.”
> “The heavy Venezuelan crude that could be refined in US Gulf coast
> installations is likely going to undercut domestic producers.”
Venezuela’s oil production has fallen dramatically from its historical highs—a
decline experts blame on both mismanagement and US sanctions imposed by Barack
Obama and escalated by Trump. By 2018, the country was producing just 1.3
million barrels a day—roughly half of what it produced when Maduro took office
in 2013,
just over a third of what it produced in the 1990s, and about a third of its
peak production in the 1970s.
Trump has said US companies will revive production levels and be “reimbursed”
for the costs of doing so. But the economics of that expansion may not entice
energy majors, and even if they choose to play along, it would take years to
meaningfully boost extraction, experts say.
Boosting Venezuela’s oil output by 500,000 barrels a day would cost about $10bn
and take roughly two years, according to Energy Aspects. Production could reach
between 2 million and 2.5 million barrels a day within a decade by tapping
medium crude reserves, Mahdavi said. But returning to peak output would require
developing the Orinoco Belt, whose heavy, sulfur-rich crude is far more costly
and difficult to extract, transport and refine.
Returning to 2 million barrels per day by the early 2030s would require about
$110 billion in investment, according to Rystad Energy, an industry consultancy.
“That is going to take much more time and much more money, to be able to get at
or close to maybe 3, 4 or 5 million barrels a day of production,” said Mahdavi.
Increasing Venezuelan extraction amid booming US production may also be a hard
sell. “The heavy Venezuelan crude that could be refined in US Gulf coast
installations is likely going to undercut domestic producers, who until Trump
kidnapped Maduro had been vocally supportive of sanctions on Venezuelan oil,”
said Galey.
Some firms may be willing to “eat that uncertainty” because the US plans to
provide companies with financial support to drill in Venezuela, said Mahdavi.
“If you’re willing to deal with the challenges…you are looking still at
relatively cheap crude that will get you a higher profit margin than what you
can do in the United States,” he said. “That’s why they’re still interested:
It’s way more expensive to drill in, say, the US’s Permian Basin.”
Some US oil majors may be more receptive to Trump’s Venezuela strategy. Chevron,
the only US company operating in the country, may be poised to scale up production
faster than its rivals. And ExxonMobil, which has invested heavily in oil
production within neighboring Guyana, could benefit from the removal of Maduro,
who staunchly opposes that expansion.
Overall, however, it remains unclear how US oil majors will respond to Trump’s
plans of regime change and increased oil extraction in Venezuela. What is much
clearer is that any expansion would be “terrible for the climate, terrible for
the environment,” said Mahdavi.
This story was originally published by Yale e360 and is reproduced here as part
of the Climate Desk collaboration.
On a mid-November evening, at precisely 7:12 p.m., a SpaceX Falcon 9 rocket
lifted off from Cape Canaveral Space Force Station on the Florida coast. It
appeared to be a perfect launch. At an altitude of about 40 miles, the rocket’s
first stage separated and fell back to Earth, eventually alighting in a gentle,
controlled landing on a SpaceX ship idling in the Atlantic Ocean.
Attention then turned to the rocket’s payload: 29 Starlink communication
satellites that were to be deployed in low-Earth orbit, about 340 miles above
the planet’s surface. With this new fleet of machines, Starlink was
expanding its existing mega-constellation so that it numbered over 9,000
satellites, all circling Earth at about 17,000 miles per hour.
Launches like this have become commonplace. As of late November, SpaceX had sent
up 152 Falcon 9 missions in 2025—an annual record for the company. And while
SpaceX is the undisputed leader in rocket launches, the space economy now ranges
beyond American endeavors to involve orbital missions—military, scientific, and
corporate—originating from Europe, China, Russia, India, Israel, Japan, and
South Korea. This year the global total of orbital launches will near 300 for
the first time, and there seems little doubt it will continue to climb.
> “We are now in this regime where we are doing something new to the atmosphere
> that hasn’t been done before.”
Starlink has sought permission from the Federal Communications Commission to
expand its swarm, which at this point comprises the vast majority of Earth’s
active satellites, so that it might within a few years have as many as 42,000
units in orbit. Blue Origin, the rocket company led by Jeff Bezos, is in the
early stages of helping to deploy a satellite network for Amazon, a
constellation of about 3,000 units known as Amazon Leo. European companies, such
as France’s Eutelsat, plan to expand space-based networks, too.
“We’re now at 12,000 active satellites, and it was 1,200 a decade ago, so it’s
just incredible,” Jonathan MacDowell, a scientist at Harvard and the Smithsonian
who has been tracking space launches for several decades, told me recently.
MacDowell notes that based on applications to communications agencies, as well
as on corporate projections, the satellite business will continue to grow at an
extraordinary rate. By 2040, it’s conceivable that more than 100,000 active
satellites would be circling Earth.
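MacDowell's numbers imply roughly tenfold growth per decade, and the 100,000 figure follows from simply extending that trend. A minimal sketch, assuming (as the article does not claim) that the growth rate holds constant:

```python
# Active satellites grew from ~1,200 to ~12,000 over the past decade,
# i.e. roughly tenfold per decade. Assumption (ours, not MacDowell's):
# that growth rate holds constant going forward.

def projected_satellites(count_now: float, growth_per_decade: float,
                         years: float) -> float:
    """Project the satellite count forward under constant exponential growth."""
    return count_now * growth_per_decade ** (years / 10)

# From 12,000 today, tenfold-per-decade growth passes the article's
# 100,000 figure well before 2040:
print(round(projected_satellites(12_000, 10, 10)))  # 120000 a decade out
```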
But counting launches and satellites has so far proven easier than measuring
their impacts. For the past decade, astronomers have warned that so much
activity high above could compromise their opportunities to study distant
objects in the night sky. At the same time, other
scientists have concentrated on the physical dangers. Several studies project a
growing likelihood of collisions and space debris—debris that could rain down on
Earth or, in rare cases, on cruising airplanes.
More recently, however, scientists have become alarmed by two other potential
problems: the emissions from rocket fuels, and the emissions from satellites and
rocket stages that mostly ablate (that is, burn up) on reentry. “Both of these
processes are producing pollutants that are being injected into just about every
layer of the atmosphere,” explains Eloise Marais, an atmospheric scientist at
University College London, who compiles emissions data on launches and
reentries.
As Marais told me, it’s crucial to understand that Starlink’s satellites, as
well as those of other commercial ventures, don’t stay up indefinitely. With a
lifetime usefulness of about five years, they are regularly deorbited and
replaced by others. The new satellite business thus has a cyclical quality:
launch, deploy, deorbit, destroy. And then repeat.
The cycle suggests we are using Earth’s mesosphere and stratosphere—the layers
above the surface-hugging troposphere—as an incinerator dump for space
machinery. Or as Jonathan MacDowell puts it: “We are now in this regime where we
are doing something new to the atmosphere that hasn’t been done before.”
MacDowell and some of his colleagues seem to agree that we don’t yet understand
how—or how much—the reentries and launches will alter the air. As a result,
we’re unsure what the impacts may be to Earth’s weather, climate, and
(ultimately) its inhabitants.
To consider low-Earth orbit within an emerging environmental framework, it helps
to see it as an interrelated system of cause and effect. As with any system,
trying to address one problematic issue might lead to another. A long-held idea,
for instance, has been to “design for demise,” in the argot of aerospace
engineers, which means constructing a satellite with the intention it should not
survive the heat of reentry.
“But there’s an unforeseen consequence of your solution unless you have a grasp
of how things are connected,” according to Hugh Lewis, a professor of
astronautics at the University of Birmingham in the United Kingdom. In reducing
“the population of debris” with incineration, Lewis told me—and thus, with rare
exceptions, saving us from encounters with falling chunks of satellites or
rocket stages—we seem to have chosen “probably the most harmful solution you
could get from a perspective of the atmosphere.”
We don’t understand the material composition of everything that’s burning up.
Yet scientists have traced a variety of elements that are vaporizing in the
mesosphere during the deorbits of satellites and derelict rocket stages; and
they’ve concluded these vaporized materials—as a recent study in the Proceedings
of the National Academy of Sciences put it—“condense into aerosol particles that
descend into the stratosphere.” The PNAS study, based on high-altitude air
sampling rather than modeling, showed that these tiny particles contained
aluminum, silicon, copper, lead, lithium, and more exotic elements like niobium.
> “Emission plumes from the first few minutes of a mission, which disperse into
> the stratosphere, may…have a significant effect on the ozone layer.”
The large presence of aluminum, signaling the formation of aluminum oxide
nanoparticles, may be especially worrisome, since it can harm Earth’s protective
ozone layer and may undo our progress in halting damage done by
chlorofluorocarbons, or CFCs. A recent academic study in the journal Geophysical
Research Letters concluded that the ablation of a single 550-pound satellite (a
new Starlink unit is larger, at about 1,800 pounds) can generate around 70
pounds of aluminum oxide nanoparticles. This floating metallic pollution may
stay aloft for decades.
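The Geophysical Research Letters figure works out to an alumina yield of roughly 13 percent of satellite mass. A rough scaling to the heavier new Starlink units, assuming (our assumption, not the study's) that the yield grows linearly with mass:

```python
# Linear scaling of the GRL study's figure: a 550-pound satellite yields
# about 70 pounds of aluminum oxide nanoparticles on reentry (~13% of its
# mass). Assumption (ours): the yield scales linearly with satellite mass.
ALOX_LB_PER_SAT_LB = 70 / 550  # ~0.127

def alumina_yield_lb(satellite_lb: float) -> float:
    """Estimated aluminum oxide (lb) from ablating a satellite of given mass."""
    return satellite_lb * ALOX_LB_PER_SAT_LB

# A newer ~1,800-pound Starlink unit, under this linear assumption:
print(round(alumina_yield_lb(1_800)))  # roughly 230 pounds
```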
The PNAS study and others, moreover, suggest the human footprint on the upper
atmosphere will expand, especially as the total mass of machinery being
incinerated ratchets up. Several scientists I spoke with noted that they have
revised their previous belief that the effects of ablating satellites would not
exceed those of meteorites that naturally burn up in the atmosphere and leave
metallic traces in the stratosphere. “You might have more mass from the
meteoroids,” Aaron Boley, an astronomer at the University of British Columbia,
said, but “these satellites can still have a huge effect because they’re so
vastly different [in composition].”
Last year, a group of researchers affiliated with NASA formulated a course of
research that could be followed to fill large “knowledge gaps” relating to these
atmospheric effects. The team proposed a program of modeling that would be
complemented by data gleaned from in situ measurements. While some of this
information could be gathered through high-altitude airplane flights, sampling
the highest-ranging air might require “sounding” rockets doing tests with
suborbital flights. Such work is viewed as challenging and not inexpensive—but
also necessary. “Unless you have the data from the field, you cannot trust your
simulations too much,” Columbia University’s Kostas Tsigaridis, one of the
scientists on the NASA team, told me.
Tsigaridis explains that lingering uncertainty about NASA’s future expenditures
on science has slowed US momentum for such research. One bright spot, however,
has been overseas, where ESA, the European Space Agency, held an
international workshop in September to address some of the knowledge gaps,
particularly those relating to satellite ablations. The ESA meeting resulted in
a commitment to begin field measurement campaigns over the next 24 months, Adam
Mitchell, an engineer with the agency, said. The effort suggests a sense of
urgency, in Europe, at least, that the space industry’s growth is outpacing our
ability to grasp its implications.
A SpaceX Falcon 9 rocket takes off. SpaceX now has more than 9,000 Starlink satellites orbiting the Earth. SpaceX
The atmospheric pollution problem is not only about what’s raining down from
above, however; it also relates to what happens as rockets go up. According to
the calculations of Marais’ UCL team, the quantity of heat-trapping gases like CO2 produced during liftoffs is still tiny in comparison to, say, that of commercial airliners. On the other hand, it seems increasingly clear that rocket
emission plumes from the first few minutes of a mission, which disperse into the
stratosphere, may, like reentries, have a significant effect on the ozone
layer.
The most common rocket fuel right now is a highly refined kerosene known as
RP-1, which is used by vehicles such as SpaceX’s Falcon 9. When RP-1 is burned
in conjunction with liquid oxygen, the process releases black carbon
particulates into the stratosphere. A recent study led by Christopher Maloney of
the University of Colorado used computer models to assess how the black carbon
absorbs solar radiation and whether it can warm the upper atmosphere
significantly. Based on space industry growth projections a few decades into the
future, these researchers concluded that the warming effect of black carbon
would raise temperatures in the stratosphere by as much as 1.5 degrees C,
leading to significant ozone reductions in the Northern Hemisphere.
> When satellite companies talk about sustainability, “what they mean is, we
> want to sustain this rate of growth.”
It may be the case that a different propellant could alleviate potential
problems. But a fix isn’t as straightforward as it seems. Solid fuels, for
instance, which are often used in rocket boosters to provide additional thrust,
emit chlorine—another ozone-destroying element. Meanwhile, the propellant of the
future looks to be formulations of liquefied natural gas (LNG), often referred
to as liquid methane. Liquid methane will be used to power SpaceX’s massive
Starship, a new vehicle that’s intended to be used for satellite deployments,
moon missions, and, possibly someday, treks to Mars.
The amount of black carbon emissions from burning LNG may be 75 percent less
than from RP-1. “But the issue is that the Starship rocket is so much bigger,”
UCL’s Marais says. “There’s so much more mass that’s being launched.” Thus,
while liquid methane might burn cleaner, using immense quantities of it—and
using it for more frequent launches—could undermine its advantages. Recently,
executives at SpaceX’s Texas factory have said they would like to build a new
Starship every day, readying the company for a near-constant cycle of launches.
One worry among scientists is that if new research shows space pollution is causing serious harm, the problem may come to resemble an airborne version of plastics in the ocean. A more optimistic view is that
these are the early days of the space business, and there is still time for
solutions. Some of the recent work at ESA, for instance, focuses on changing the
“design for demise” paradigm for satellites to what some scientists are calling
“design to survive.”
Already, several firms are testing satellites that can get through a reentry
without burning up; a company called Atmos, for instance, is working on an
inflatable “atmospheric decelerator” that serves as a heat shield and parachute
to bring cargo to Earth. Satellites might be built from safer materials, such as
one tested in 2024 by Japan’s space agency, JAXA, made mostly from wood.
More ambitious plans are being discussed: Former NASA engineer Moriba Jah
has outlined a design for an orbital “circular economy” that calls for “the
development and operation of reusable and recyclable satellites, spacecraft, and
space infrastructure.” In Jah’s vision, machines used in the space economy
should be built in a modular way, so that parts can be disassembled, conserved,
and reused. Anything of negligible worth would be disposed of responsibly.
Most scientists I spoke with believe that a deeper recognition of environmental
responsibilities could rattle the developing structure of the space business.
“Regulations often translate into additional costs,” says UCL’s Marais, “and
that’s an issue, especially when you’re privatizing space.” A shift to building
satellites that can survive reentry, for instance, could change the economics of
an industry that, as astronomer Aaron Boley notes, has been created
to resemble the disposable nature of the consumer electronics business.
Boley also warns that technical solutions are likely only one aspect of avoiding
dangers and will not address all the complexities of overseeing low-Earth orbit
as a shared and delicate system. It seems possible to Boley that in addition to
new fuels, satellite designs, and reentry schemes, we may need to look toward
quotas that require international management agreements. He acknowledges that
this may seem “pie in the sky”; while there are treaties for outer space, as
well as United Nations guidelines, they don’t address such governance issues.
Moreover, the emphasis in most countries is on accelerating the space economy,
not limiting it. And yet, Boley argues that without collective-action policy
responses we may end up with orbital shells so crowded that they exceed a safe
carrying capacity.
That wouldn’t be good for the environment or society—but it wouldn’t be good for
the space business, either. Such concerns may be why those in the industry
increasingly discuss a set of principles, supported by NASA, that are often
grouped around the idea of “space sustainability.” University of Edinburgh
astronomer Andrew Lawrence told me that the phrase can be used in a way that
makes it unclear what we’re sustaining: “If you look at the mission statements
that companies make, what they mean is, we want to sustain this rate of
growth.”
But he doesn’t think we can. As one of the more eloquent academics arguing for
space environmentalism, Lawrence perceives an element of unreality in the belief
that in accelerating space activity we can “magically not screw everything up.”
He thinks a goal in space for zero emissions, or zero impact, would be more
sensible. And with recent private-sector startups suggesting that we should use
space to build big data centers or increase sunlight on surface areas of Earth,
he worries we are not entering an era of sustainability but a period of crisis.
Lawrence considers debates around orbital satellites a high-altitude variation
on climate change and threats to biodiversity—an instance, again, of trying to
seek a balance between capitalism and conservation, between growth and
restraint. “Of course, it affects me and other professional astronomers and
amateur astronomers particularly badly,” he concedes. “But it’s really that it
just wakes you up and you think, ‘Oh, God, it’s another thing. I thought, you
know—I thought we were safe.’” After a pause, he adds, “But no, we’re not.”
A landmark study on the safety of glyphosate, the active ingredient in the
controversial herbicide Roundup, has been formally retracted by its publisher,
raising new concerns about the chemical’s potential dangers.
Federal regulators relied heavily on the study, published in 2000 by the science
journal Regulatory Toxicology and Pharmacology, in their assessment that the
herbicide is safe and does not cause cancer. Indeed, the paper, which concluded that “Roundup herbicide does not pose a health risk to humans,” was the most cited study in some government reports.
But the journal’s editor-in-chief, Martin van den Berg, said he no longer
trusted the study, which appears to have been secretly ghostwritten by employees
of Monsanto, the company that introduced Roundup in 1974. Officially, the
paper’s authors, including a doctor from New York Medical College, were listed
as independent scientists.
Van den Berg, a professor of toxicology in the Netherlands, concluded that the
paper relied entirely on Monsanto’s internal studies and ignored other evidence
suggesting that Roundup might be harmful.
> “The MAHA world is losing their minds right now. They keep getting thrown
> under the bus.”
In 2015, the World Health Organization’s International Agency for Research on
Cancer determined that glyphosate probably causes cancer. Since then, Roundup’s
manufacturer, Bayer, which bought Monsanto in 2018, has agreed to pay more than
$12 billion in legal settlements to people who claim it gave them cancer.
In 2020, the US Environmental Protection Agency released an updated safety
assessment on glyphosate that again determined that it was safe and did not
cause cancer. This EPA report is often cited in news reports that contend
glyphosate is “fine” and important for modern food production.
But those reports failed to mention that the 2020 EPA health assessment was
overturned in 2022 by the 9th US Circuit Court of Appeals. The “EPA’s errors in
assessing human-health risk are serious,” the judges wrote, and “most studies
EPA examined indicated that human exposure to glyphosate is associated with an
at least somewhat increased risk of developing non-Hodgkin’s lymphoma”—a type of
cancer.
The court told the EPA it needed to redo its human health assessment, meaning
the agency now has no official stance on glyphosate’s risk to people. It is
expected to release an updated safety report next year.
During the first Trump administration, Monsanto executives were told they “need
not fear any additional regulation from this administration,” according to an
internal Monsanto email cited in a Roundup lawsuit in 2019. Monsanto had hired a
consultant, according to court documents, who reported back that “a domestic
policy adviser at the White House had said, for instance: ‘We have Monsanto’s
back on pesticides regulation.’”
On Tuesday, the US Solicitor General asked the Supreme Court to consider a case
that could help shield Bayer from further lawsuits. The company’s stock soared
by as much as 14 percent on news of the Trump administration’s help in the case.
Two states—North Dakota and Georgia—have passed laws this year that help shield
Bayer from some cancer lawsuits arising from Roundup use. There is a push to
enact similar laws in other states and on the federal level.
In July, New Jersey Sen. Cory Booker introduced the Pesticide Injury
Accountability Act to push back against these new laws, and ensure that “these
chemical companies can be held accountable in federal court for the harm caused
by their toxic products.” Zen Honeycutt, a key voice in the Make America Healthy
Again coalition, has endorsed the legislation.
Nathan Donley, environmental health science director at the Center for
Biological Diversity, said the glyphosate debate has become a key sticking point
between President Trump and his MAHA base. “The MAHA world is losing their minds
right now. They keep getting thrown under the bus by this administration,”
Donley said. “He’s alienating a crucial voting bloc.”
On Wednesday, the Supreme Court will hear oral arguments over President Donald
Trump’s decision to impose tariffs on almost every nation on earth, in
ever-changing amounts, whenever he feels like it. Legally, this is a case about
any number of complicated questions and legal doctrines, including the
president’s ability to declare emergencies under the International Economic
Emergency Powers Act, the court’s novel major questions doctrine, its dormant
non-delegation doctrine, the proper venue for challenging the tariffs, and the
proper statutory interpretation of IEEPA.
> “This is not just a battle over tariffs.” It’s a battle over just who is in
> charge of the GOP.
But these questions will almost certainly be window-dressing on a decision
driven by how Chief Justice John Roberts and the other five Republican
appointees navigate between the two stakeholders in this case: the powerful
billionaires and business interests behind the challenge to the tariffs and
Trump’s desire to transform the economy into an arm of his personalist rule.
“This is not just a battle over tariffs,” explains Evan Bernick of the Northern
Illinois University College of Law. “It is a battle between competing political
economies within the American right. And how it works out will speak to just who
ultimately has hegemony, who… is shaping the law of the United States.” While
Bernick expects the businesses and states challenging the tariffs to prevail,
“if they do not,” he says, “that tells me things about the relative power of
these competing factions that I did not previously know.”
In February and again in April, Trump cited IEEPA when imposing his sweeping—and
sometimes very high—tariffs, some of which he went on to pause. While the
Constitution grants Congress the power to impose tariffs, Trump claimed his
actions were a legitimate use of that 1977 law, which gives presidents power to
respond to “any unusual and extraordinary threat” from abroad, even though IEEPA
doesn’t specifically name tariffs as an available tool. The court is hearing two
consolidated cases brought by multiple small businesses. Some of the companies’
challenges were brought with support from ideologically conservative and
libertarian nonprofits funded by wealthy Republican-allied donors, most notably
the Koch network.
For decades, the Kochs and their fellow-traveling tycoons, along with the
religious right, channeled millions of dollars into a project to capture the
Supreme Court, successfully creating a loyal 6-3 conservative majority.
Beginning in 2005 with Roberts’ nomination, the Federalist Society vetted Republican nominees, and their allies spent heavily to win their confirmations. As Lisa Graves, who leads the judicial watchdog group True North Research
and has published a new book on Roberts, recently told me, “Roberts is really
the beneficiary of the first billionaire-backed campaign to capture the US
Supreme Court.” He’s spent the last 20 years implementing their agenda.
The Roberts Court consistently rules for the interests of this small set of
billionaire political donors, whose money flowed to the Federalist Society and
other activist groups that helped each of the Republican-appointed justices
reach the high court. Further, under Roberts, these members of the court have
increased the political power of the GOP and its wealthiest patrons. For
example, the court has been dismantling the Voting Rights Act to the benefit of
the GOP, a project they will likely finish in the next few months. It has also
cut the power of labor unions, and, by overturning the long-held practice of
courts deferring to agency expertise, declared open season on federal
regulations that industry dislikes. In its stead, the justices invented the
major questions doctrine to justify striking down executive regulations the
court decides are “major” and that don’t have clear authorization from Congress,
and created increasingly radical interpretations of the unitary executive theory
that have weakened agency independence so that partisan politics can destroy
industry regulation.
This clear preference for moneyed interests was detailed by employment lawyer
Scott Budow in a 2021 law review article on how the Roberts Court has changed
labor and employment law. He discussed 15 cases in which the justices cast a
collective 134 votes. “There is no unifying judicial philosophy—such as
originalism or textualism—that neatly explains why conservative justices would
reliably vote in one manner and liberal justices in the opposite manner for
these cases,” he concluded. “Yet, if all one knew was that conservative justices
favor employers and liberal justices favor workers, that person would have
correctly predicted 132 of the 134 votes cast.” That is 98.5 percent of the
time.
“Trying to interpret or anticipate what’s going to happen in cases involving
Trump inside the four corners of legal reasoning will fail, and hasn’t really
explained almost anything the Roberts court [has done] for the last 20 years,”
says Michael Podhorzer, the former political director of the AFL-CIO. “Instead,
if you step back and think about the interests that elevated the six of them to
the court, then that is really very clarifying.”
> This case has big business going up against the president.
In their 2022 book The Scheme: How the Right Wing Used Dark Money to Capture the
Supreme Court, Sen. Sheldon Whitehouse (D-R.I.) and attorney Jennifer Mueller
break down not only the story of how a small handful of rightwing families and
groups channeled millions to put allies on the court, but how they also fund an
array of legal outfits to bring cases and file amicus briefs—filings that help
to signal to the justices which way their benefactors hope they will rule. As
Whitehouse and Mueller write, between 2014 and 2020, 16 rightwing foundations
gave nearly $69 million to 11 groups that filed amicus curiae briefs urging the
court to hobble the Consumer Financial Protection Bureau, which guards against
predatory financial industry practices, as well as more than $33 million to the
Federalist Society. These groups include the Washington Legal Foundation, the Pacific
Legal Foundation, the New Civil Liberties Alliance, and the Liberty Justice
Center—all of which have used Koch money to challenge labor unions and weaken
government regulations. Repeatedly, the GOP wing of the court has handed these
organizations, and their donors, major victories.
Those same four legal groups that worked so hard to disempower unions and
destroy the regulatory state are now before the court with a new request: stop
Trump’s arbitrary tariffs. They have a strong case, at least under the Roberts
court’s precedents—after all, the justices have created a brand new doctrine,
the major questions doctrine, and used it to strike down regulations without
clear statutory authorization that industry doesn’t like. Tariffs on nearly
every nation are by every measure “major” actions that can make or break
businesses and reshape both the US and world economies.
But unlike in other major questions doctrine cases, when industry was pitted
against Democratic priorities like environmental regulations or student debt
relief programs that the six conservative justices struck down, this case has
the business community going up against the president.
Trump, too, has been on a winning streak before the six GOP justices, who have
repeatedly used their emergency or shadow docket to greenlight the president’s
agenda, from slashing the federal bureaucracy to detaining suspected immigrants
based on the color of their skin. As of last month, Trump had won some 21
emergency appeals to the court. The Republican wing even restricted lower
courts’ authority to grant relief from Trump’s policies. The logical conclusion
is that the justices are either on board with Trump’s authoritarian project,
protective of his political coalition, or possibly also afraid to cross him for
fear he disobeys their orders. Perhaps it is a combination of these factors, but
the result is a court that contorts itself—or remains completely silent—in order
to repeatedly rule in Trump’s favor. As Justice Ketanji Brown Jackson wrote in a
dissent in August, analogizing her colleagues’ jurisprudence to a make-believe
game from Calvin and Hobbes: “Calvinball has only one rule: There are no fixed
rules. We seem to have two: that one, and this Administration always wins.”
But this time, the administration is up against the court’s other preferred
client, and one of their winning streaks must come to an end. One view of what’s
coming starts with the solid premise that while ultrawealthy business interests
don’t agree with all of Trump’s agenda, they prefer him to a Democrat. If we
presume that Roberts and the court’s other Federalist Society recruits similarly
view Trump as an essential—even if often misguided—element of their project,
then they will try to limit his tariffs without strongly rebuking him. “I think
the calculus that they’re going through is basically, ‘Would trying to stop him
there lead to electoral defeat, or not? Is it too damaging to them?’” says
Podhorzer, who also expects the court “at a minimum” will “do something that
trims or constrains” Trump’s claimed tariff powers.
“It’s important to look at whatever they end up doing as a reflection of where
that business community is right now,” he adds. A decisive victory for Trump
might signal that big business will tolerate a tariff regime in which they write
multi-million dollar checks to Trump’s ballroom project in exchange for
waivers—although they don’t seem to be there yet because, after all, they did
help bring this challenge in the first place. A big Trump win could also signal
that the justices themselves sense a fundamental shift in where power lies on
the right, from the moneyed interests that created the court to the openly
authoritarian MAGA movement.
Legally, there are a lot of ways the justices could resolve this case. But it
will be more illuminating to think of the Republican wing not as judges weighing
arguments but as mediators seeking a compromise between two competing factions
of the same team.
This story was originally published by the Guardian and is reproduced here as
part of the Climate Desk collaboration.
For decades, Khoji Wesselius has noticed the oily scent of pesticides during
spraying periods when the wind has blown through his tiny farming village in a
rural corner of the Netherlands.
Now, after volunteering in an experiment to count how many such substances
people are subjected to, Wesselius and his wife are one step closer to
understanding the consequences of living among chemical-sprayed fields of seed
potato, sugar beet, wheat, rye and onion.
“We were shocked,” said Wesselius, a retired provincial government worker, who
had exposure to eight different pesticides through his skin, with even more
chemicals found through tests of his blood, urine and stool. “I was contaminated
by 11 sorts of pesticides. My wife, who is more strict in her organic
nourishment, had seven sorts of pesticides.”
Regulators closely monitor dietary intake of pesticides when deciding whether
they are safe enough for the market, but little attention has been paid to the
effects of breathing them in or absorbing them through the skin. According to a
new study, even people who live far from farms are exposed to several different
types of pesticides from non-dietary sources—including banned substances.
“What’s most surprising is that we cannot avoid exposure to pesticides: they are
in our direct environment and our study indicates direct contact,” said Paul
Scheepers, a molecular epidemiologist at Radboud University and co-author of the
study. “The real question is how much is taken up [by the body] and that’s not
so easy to answer.”
> “The conclusions…are highly significant: Pesticides are ubiquitous, not only
> in agricultural areas but also in environments far from crop fields.”
The researchers got 641 participants in 10 European countries to wear silicone
wristbands continuously for one week to capture external exposure to 193
pesticides. In laboratory tests, they detected 173 of the substances they tested
for, with pesticides found in every wristband and an average of 20 substances
for every person who took part.
Non-organic farmers had the highest number of pesticides in their wristbands,
with a median of 36, followed by organic farmers and people who live near farms,
such as Wesselius and his wife. Consumers living far from farms had the fewest,
with a median of 17 pesticides.
“I’ve asked myself, was it worth it to know all this?” said Wesselius, who says
some contractors for the farmers near his village do not seem to consider the
wind direction when applying pesticides such as glyphosate and neonicotinoids.
“It’s lingering in the back of my mind. Every time I see a tractor [with a
spraying installation] there’s this kind of eerie feeling that I’m being
poisoned.”
Pesticides have helped the world produce more food using less space—fouling the
regions in which they are sprayed while reducing the area of land that needs to
be exploited for food—but have worried doctors who point to a growing body of
evidence linking them to disease. The EU scrapped a proposed target last year to halve pesticide use and risk by 2030 after pressure from agricultural lobbies and some member states, which argued the cuts were too deep.
Bartosz Wielgomas, the head of the toxicology department at the Medical
University of Gdańsk, who was not involved in the study, said the results were
of “great value” but may even underestimate exposure to pesticides. The silicone
wristbands do not absorb all substances to the same degree, he said, and the
researchers tested for fewer than half of the pesticides approved in the EU.
“The conclusions of this study are highly significant: Pesticides are
ubiquitous, not only in agricultural areas but also in environments far from
crop fields,” he said.
The researchers found participants in the study were also exposed to pesticides
that have been taken off the market, with breakdown products of DDT
(dichlorodiphenyltrichloroethane), which was banned decades ago on health
grounds, commonly found in the wristbands. They also detected some banned
insecticides, such as dieldrin and propoxur.
While the presence of pesticides in the wristbands does not indicate direct
health effects, the authors voiced concern about the number of different types.
Researchers have suggested that some mixtures of different chemicals amplify
their effects on the human body beyond what studies of isolated exposure find.
Wesselius, whose results have motivated him to eat more organic food, said:
“It’s not a nice thing to know. But it’s even worse to continue this practice.”
This story was originally published by the Guardian and is reproduced here as
part of the Climate Desk collaboration.
Gavin Newsom vetoed a California bill that was set to ban the sale of cookware
and other consumer goods manufactured with PFAS, also known as “forever
chemicals,” human-made compounds linked to a range of health issues.
The governor’s decision on Monday followed months of debate and advocacy,
including from high-profile celebrity chefs such as Thomas Keller and Rachael
Ray, who argued that nonstick cookware made with PFAS, when manufactured
responsibly, can be safe and effective and urged lawmakers to vote against the
proposal.
Newsom said in a statement that the legislation was “well-intentioned” but would
affect too broad a swath of products and would result in a “sizable and rapid
shift” of cooking products available in the state.
“I am deeply concerned about the impact this bill would have on the availability
of affordable options in cooking products,” Newsom wrote, adding that the state
“must carefully consider” the consequences of a dramatic shift in available
products.
Concerns over the use of PFAS, chemicals used to make cookware and other items
non-stick and water-resistant, have grown significantly in recent years. Called
“forever chemicals,” because they do not break down naturally, PFAS are used in
non-stick cookware, waterproof mascara and dental floss, among other items.
They have been linked to a range of health issues, including high cholesterol, reproductive problems and cancer. A United States Geological Survey study in 2023 detected the chemicals in almost half the country’s tap water.
Under the bill approved by California’s legislature, the state by 2030 would
have banned the sale or distribution of goods, including cleaning products,
cookware, floss, food packaging and ski wax, with “intentionally added” PFAS.
The bill had the support of major environmental groups but drew opposition from influential figures such as Ray, Keller and other high-profile chefs, who argued it would place an unfair burden on restaurants. Ray argued the focus should be on educating consumers rather than eliminating the products.
“Removing access to these products without providing fact-based context could
hurt the very people we’re trying to protect,” Ray said.
Ben Allen, the state senator who introduced the legislation, told the Los
Angeles Times he planned to keep working on the issue.
“We are obviously disappointed,” he told the newspaper. “We know there are safer
alternatives—[but] I understand there were strong voices on both sides on this
topic.”
Over the weekend, Newsom also vetoed legislation focused on racial justice, including a bill that would have allowed universities to give the descendants of enslaved people preference in admissions, while approving funding for a reparations study. He also signed a bill allowing a wide range of family
members to care for children if the federal government deports their parents.
This story was originally published by Canary Media and is reproduced here as
part of the Climate Desk collaboration.
Michael Gillogly, manager of the Pepperwood Preserve, understands the wildfire
risk that power lines pose firsthand. The 3,200-acre nature reserve in Sonoma
County, California, burned in 2017 when a privately owned electrical
system sparked a fire. It burned again in 2019 during a conflagration started by
power lines operated by utility Pacific Gas & Electric.
So when PG&E approached Gillogly about installing a solar- and battery-powered
microgrid to replace the single power line serving a guest house on the
property, he was relieved. “We do a lot of wildfire research here,” he noted.
Getting rid of “the line up to the Bechtel House is part of PG&E’s work on
eliminating the risk of fire.”
PG&E covered the costs of building the microgrid, and so far, the solar and
batteries have kept the light and heat on at the guest house, even when a dozen
or so researchers spent several cloudy days there, Gillogly said.
Over the past few years, PG&E has increasingly opted for these “remote grids”
as the costs of maintaining long power lines in wildfire-prone terrain skyrocket
and the price of solar panels, batteries, and backup generators continues to
decline. The utility has installed about a dozen systems in the Sierra Nevada
high country, with the Pepperwood Preserve microgrid the first to be powered 100
percent by solar and batteries. The utility plans to complete more
than 30 remote grids by the end of next year.
Until recently, utilities have rarely promoted solar-and-battery alternatives to
power lines, particularly if they don’t own the solar and batteries in question.
After all, utilities earn guaranteed profits on the money they spend on
their grids.
But PG&E’s remote-grid initiative, launched with regulator approval in 2023,
allows it to earn a rate of return on these projects that’s similar to what it
would earn on the grid upgrades required to provide those customers with
reliable power. The catch is that the costs of installing and operating the
solar panels and batteries and maintaining and fueling the generators must be
lower than what the utility would have spent on power lines.
“It all depends on what the alternative is,” said Abigail Tinker, senior manager
of grid innovation delivery at PG&E. For the communities the utility has
targeted, power lines can be quite expensive, largely due to the cost of
ensuring that they won’t cause wildfires.
PG&E was forced into bankruptcy in 2019 after its power lines
sparked California’s deadliest-ever wildfire, and the company is under state
mandate to prevent more such disasters. PG&E and California’s other major
utilities are spending tens of billions of dollars on burying key power lines,
clearing trees and underbrush, and protecting overhead lines with hardened
coverings, hair-trigger shutoff switches, and other equipment.
But these wildfire-prevention investments are driving up utility
expenditures and customer rates. Solar and batteries are an increasingly
cost-effective alternative, Tinker said, often cheaper than hardening even a
single mile of power line.
PG&E saves money either by getting rid of grid connections altogether or by
delaying the construction of new lines. Microgrids can also improve reliability
for customers when utilities must intentionally de-energize the lines that serve
them during windstorms and other times of high wildfire risk—an increasingly
common contingency in fire-prone areas.
Angelo Campus, CEO of BoxPower, which built most of PG&E’s remote microgrids,
sees the strategy penciling out for more and more utilities for these same
reasons. “We’re working with about a dozen utilities across the country on
similar but distinct flavors of this,” he said. “Wildfire mitigation is a huge
issue across the West,” and climate change is increasing the frequency and
severity of the threat.
Utilities are responsible for about 10 percent of wildfires. But they’re bearing
outsized financial risks from those they do cause. Portland, Oregon-based
PacifiCorp is facing billions of dollars in costs and $30 billion in claims for
wildfires sparked by its grid in 2020, and potentially more for another fire
in 2022. Hawaiian Electric paid a $2 billion settlement to cover damages from
the deadly 2023 Maui fires caused by its grid.
Microgrids can’t replace the majority of a utility’s system, of course. But they
are being considered for increasingly large communities, Campus said.
Nevada utility NV Energy has proposed a solar and battery microgrid to replace
a diesel generator system now providing backup power to customers in the
mountain town of Mt. Charleston. Combining solar and batteries with
“ruggedized” overhead lines should save about $21 million compared to burying
the lines, while limiting the impact of wildfire-prevention power outages,
according to the utility.
Some larger projects have already been built. San Diego Gas & Electric has
been running a microgrid for the rural California town of Borrego Springs
since 2013, offering about 3,000 residents backup solar, battery, and generator
power to bolster the single line that connects them to the larger grid, which is
susceptible to being shut off due to wildfire risk. Duke Energy built
a microgrid in Hot Springs, North Carolina, a town of about 535 residents served
by a single 10-mile power line prone to outages, on the grounds that it was
cheaper than building a second line to improve reliability.
In each of these cases, utilities must weigh the costs of the alternatives,
Tinker said. “It’s complicated and nuanced in terms of dollars per mile,
because you have to be able to do the evaluation of individual circuits, and
what can be done to mitigate the risk for each circuit,” she said.
Whether microgrids are connected to the larger grid or not, utilities need to
maintain communications links with them to ensure the systems are operating
reliably and safely. PG&E is working with New Sun Road, a company that provides
remote monitoring and control technology, to keep its far-flung grids in working
order.
It’s important to distinguish remote microgrids built and operated by utilities
from other types of microgrids. Solar, batteries, backup generators, and on-site
power controls are also being used by electric-truck charging
depots and industrial facilities that don’t want to wait for utilities to expand
their grids to serve them. Microgrids are also providing college campuses,
military bases, municipal buildings, and churches and community centers with
backup power when the grid goes down and with self-supplied power to offset
utility bills when the grid is up and running.
Utilities have been far less friendly to customer-owned microgrids in general,
however, seeing them as a threat to their core business model. Since 2018,
California law has required the state Public Utilities Commission to develop
rules to allow customers to build their own microgrids. But progress has been
painfully slow, and only a handful of grant-funded projects have been completed.
Microgrid developers and advocates complain that the commission has put too many
restrictions on how customers who own microgrids can earn money for the energy
they generate when the grid remains up and running. Utilities contend that they
need to maintain control over the portions of their grid that connect to
microgrids to avoid creating more hazards.
“It is a very difficult balance that PG&E is constantly trying to strike, with
the oversight of [utility regulators] and other stakeholders, between safety and
reliability and affordability,” Tinker said. “That’s something we’re trying to
thread the needle on.”
But as the costs of expanding and maintaining utility grids continue to climb,
and solar and batteries become more affordable, utilities and their customers
are likely to see more opportunities to make microgrids work, Campus said.
“The cost of building poles and wires and maintaining distribution
infrastructure has grown substantially over the past 20 years,” he said. “Look
at the cost of distributed generation and battery—it’s an inverse cost curve.”
This story was originally published by WIRED and is reproduced here as part of
the Climate Desk collaboration.
A new study finds that technologies installed to remove “forever chemicals” from
drinking water are also doing double duty by removing other harmful
contaminants—including some substances that have been linked to certain types
of cancer.
The study, published Thursday in the journal ACS ES&T Water, comes as the Trump
administration is overhauling a rule mandating that water systems take action to
clean up forever chemicals in drinking water.
Per- and polyfluoroalkyl substances (PFAS), colloquially referred to as forever
chemicals, are a class of thousands of chemicals that do not degrade in the
environment and have been linked to a slew of worrying health outcomes,
including various cancers, hormonal disorders, and developmental delays. Because
they do not degrade, they are uniquely pervasive: a 2023 study from the US
Geological Survey estimated that 45 percent of tap water in the US could contain
at least one PFAS chemical.
Last year, the Biden administration finalized a rule establishing the first-ever
legal limits of PFAS in drinking water, setting strict limits for six kinds of
PFAS chemicals and requiring water utilities to bring drinking water within
those limits by 2029. But in May, the Environmental Protection Agency said it
would reconsider regulations on four of the six chemicals in the original rule
and extend the deadline by two years. The changes come
after widespread outcry from water utilities, who say that the costs of
installing PFAS filtration systems would be far beyond what the agency
originally estimated.
“Building on the historic actions to address PFAS during the first Trump
Administration, EPA is tackling PFAS from all of our program offices, advancing
research and testing, stopping PFAS from getting into drinking water systems,
holding polluters accountable, and more,” Brigit Hirsch, EPA press secretary,
told WIRED in a statement. “This is just a fraction of the work the agency is
doing on PFAS during President Trump’s second term to ensure Americans have the
cleanest air, land, and water.”
Hirsch also emphasized that as EPA reconsiders standards for the four chemicals
in question, “it is possible that the result could be more stringent
requirements.”
Experts say the costs of cleaning up PFAS could have other benefits beyond just
getting forever chemicals out of Americans’ water supply. The authors of the new
study—all employees of the Environmental Working Group (EWG), a nonprofit that
does research on chemical safety—say that technology that gets rid of PFAS can
also filter out a number of other harmful substances, including some that are
created as byproducts of the water treatment process itself.
The study looks at three types of water filtration technologies that have been
proven to remove PFAS. These technologies “are really widespread, they’ve been
in use for a really long time, and they’re well-documented to remove a large
number of contaminants,” says Sydney Evans, a senior analyst at EWG and coauthor
of the report.
Most routine water disinfection processes in the US entail adding a
chemical—usually chlorine—to the water. While this process kills harmful
pathogens, it does not remove PFAS or other contaminants such as heavy metals
and arsenic.
This method of disinfection can also, paradoxically, create some harmful
byproducts as chlorine reacts with organic compounds present in water or in
infrastructure like pipes. Long-term exposure to some of these byproducts has
been linked to specific types of cancer. While there are some federal guidelines
for water utilities to follow, experts say that a growing body of research
illustrates that there’s a gap between what is legal and what is safe. (It’s
also not uncommon for utilities to find water samples that exceed legal limits:
Officials in Springfield, Massachusetts, and Akron, Ohio, have notified
residents this year that their water was polluted with disinfection byproducts.)
“There’s this gray area in between what is safe and what is legal where there’s
still some risk, which is why we’re so concerned about all of these
contaminants,” says Evans, some of whose past work has focused on the links
between disinfection byproducts and cancer.
“It’s really an interesting first effort to try to diagnose ancillary
benefits—and perhaps unintended benefits—from installing advanced water
treatment systems intended to remove PFAS,” says P. Lee Ferguson, a professor of
civil and environmental engineering at Duke University. “This gets at a question
many of us have asked, and that I’ve thought about quite a bit: [with] the very
act of installing advanced treatment intended to remove really recalcitrant
contaminants like PFAS, you really do have the potential to get a lot of other
benefits.”
While putting together the study’s methodology, the researchers also
demonstrated how large the gap in advanced technology is between smaller water
systems and bigger ones. Just 7 percent of water systems serving fewer than 500
customers had some kind of advanced water filtration system, as opposed to
nearly 30 percent of water systems serving more than 100,000 people. These
smaller systems, the EWG researchers say, overwhelmingly serve rural and
under-resourced populations. Cost explains a lot here: These types of
technologies are much more expensive than treating water with chlorine. (In May,
the EPA said it would launch an initiative called PFAS OUT, which will connect
with water utilities that need to make upgrades and provide “tools, funding, and
technical assistance.”)
The relatively small sample size of 19 water systems, and the lack of detail in
the data, means there are some wide discrepancies in the results, says Bridger
Ruyle, an assistant professor of environmental engineering at NYU who studies
PFAS and water systems. Some of the systems in the study saw a nearly complete
reduction in disinfection byproducts after they installed advanced filtration;
at the other extreme, some water systems actually showed a gain in byproducts
after they installed the filtration systems.
This, Ruyle says, doesn’t mean that the technology isn’t effective. Rather, it
calls for more research into how variables like new exposure sources and
seasonality might be affecting specific plants.
“In the lab, you can do all of these controlled studies, and you can say, ‘Oh
yes, we eliminate all of the PFAS, and that also takes care of some other
contaminant issues of concern,’” he says. “But when you’re talking about the
real operation of a water facility, the environmental behavior of PFAS and these
other chemicals are not the same. You could have different seasonal patterns,
you could have different sources, you could have climate change impacting
different components. And so, just because we’re treating a certain inflow of
PFAS, a lot of other things could be happening to these other chemicals kind of
independently.”
The question of cost comes back to who, exactly, needs to be on the hook to pay
to clean up water. In communities across the country, water utilities are
folding new PFAS testing and remediation measures into other needed upgrades,
and some consumers are seeing their bills skyrocket. But understanding the full
benefits of some of these fixes can help scientists and policymakers better
grasp the path forward.
“This is an enormous financial challenge,” Ruyle says. “And at the same time,
it’s a financial need. There’s a big focus now in the Trump administration from
the MAHA movement [around] what are these causes of all of these health and
well-being ills. If you’re not willing to put up the money to upgrade
infrastructure, to actually address proven causes of environmental harm, then
what are we going to do?”
This story was originally published by Grist and is reproduced here as part of
the Climate Desk collaboration.
The United States is home to dozens of active mines. Some extract copper, while
others dig for iron. Whatever the resource, however, it usually makes up a small
fraction of the rock pulled from the ground. The rest is typically ignored.
Wasted.
“We’re only producing a few commodities,” said Elizabeth Holley, a professor of
mining engineering at the Colorado School of Mines. “The question is: What else
is in those rocks?”
The answer: a lot.
In a study published today by the journal Science, Holley and her colleagues
aimed to quantify what else is in those rocks. They found that, across 70
critical elements at 54 active mines, the potential for recovery is enormous.
There is enough lithium in one year of US mine waste, for example, to support 10
million electric vehicles. For manganese, it’s enough for 99 million. Those
figures far surpass both US import levels of those elements and current demand
for them.
Critical minerals are essential to the production of lithium-ion batteries,
solar panels, and other low- or zero-carbon technologies powering the clean
energy transition. Where the US gets those minerals has long been a politically
fraught topic.
The vast majority of lithium comes from Australia, Chile, and China, for
example, while cobalt predominantly comes from the Democratic Republic of the
Congo. While securing a domestic supply of rare or critical materials has been a
US policy goal for decades, the push has intensified in recent years. Former
president Joe Biden’s landmark climate legislation, the 2022 Inflation Reduction
Act, included incentives for domestic critical mineral production, and this
year, President Donald Trump signed an executive order invoking wartime
powers that would allow more leasing and extraction on federal lands.
“Our national and economic security are now acutely threatened by our reliance
upon hostile foreign powers’ mineral production,” the order read. “It is
imperative for our national security that the United States take immediate
action to facilitate domestic mineral production to the maximum possible
extent.”
Trump also made critical minerals a cornerstone of continued support to Ukraine.
Meanwhile, China recently expanded export controls on rare earth metals,
underscoring the precarious nature of the global market.
Holley’s research indicates that increased domestic byproduct recovery could
address this instability. Even a 1 percent recovery rate, it found, would
“substantially reduce” import reliance for most elements. Recovering 4 percent
of lithium would completely offset current imports.
“We could focus on mines that are already operating and simply add additional
circuits to their process,” said Holley. “It would be a really quick way of
bringing a needed mineral into production.”
This latest research is “very valuable,” said Hamidreza Samouei, a professor of
petroleum engineering at Texas A&M University who wasn’t involved in the study.
He sees it as a great starting point for a multipronged approach to tackling the
byproduct problem and moving toward a zero-waste system. Other areas that will
need attention, he said, include looking beyond discarded rock to the “huge”
amounts of water that a mine uses. He also believes that the government should
play a more aggressive policy and regulatory role in pushing for critical
mineral recovery.
“Mining is a very old-fashioned industry,” said Samouei. “Who is going to take
the risk?”
The Department of Energy recently announced a byproduct recovery pilot program,
and the Pentagon took a $400 million stake in the operator of the country’s only
rare-earth metal mine. At the same time, Congress recently repealed large chunks
of the Inflation Reduction Act, which would have driven demand for critical
minerals, and has slashed federal funding to the US Geological Survey and
the Department of Energy’s Office of Science, among other research arms.
The general thrust of the Science study is “not new,” said Isabel Barton, a
professor of geological engineering at the University of Arizona. “It is a very
hot topic in mining these days.”
The attention is contributing to a burgeoning shift in thinking, from an intense
focus on the target mineral to consideration of what else could be produced,
including critical minerals. “There are some that are probably relatively
simple. There are others that are heinously difficult to get to,” said Barton,
and whether a mineral is recovered will ultimately come down to cost. “Mining
companies are there to make a profit.”
Figuring out the most economically viable way forward is exactly the next step
Holley hopes this research will inform. Byproduct potential varies considerably
by mine, and the analysis, she said, can help pinpoint where to potentially find
which minerals. For instance, the Red Dog mine in Alaska appears to have the
largest germanium potential in the country, while nickel could be found at the
Stillwater and East Boulder mines in Montana.
“The [research and development] funding on critical minerals has been a little
bit of a scattershot,” she said. “Our paper allows the development of a
strategy.”