Science


The $4.3 billion space telescope Trump tried to cancel is now complete


“We’re going to be making 3D movies of what is going on in the Milky Way galaxy.”

Artist’s concept of the Nancy Grace Roman Space Telescope. Credit: NASA Goddard Space Flight Center Scientific Visualization Studio

A few weeks ago, technicians inside a cavernous clean room in Maryland made the final connection to complete assembly of NASA’s Nancy Grace Roman Space Telescope.

Parts of this new observatory, named for NASA’s first chief astronomer, recently completed a spate of tests to ensure it can survive the shaking and intense sound of a rocket launch. Engineers placed the core of the telescope inside a thermal vacuum chamber, where it withstood the airless conditions and extreme temperature swings it will see in space.

Then, on November 25, teams at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, joined the inner and outer portions of the Roman Space Telescope. With this milestone, NASA declared the observatory complete and on track for launch as soon as fall 2026.

“The team is ecstatic,” said Jackie Townsend, the observatory’s deputy project manager at Goddard, in a recent interview with Ars. “It has been a long road, but filled with lots of successes and an ordinary amount of challenges, I would say. It’s just so rewarding to get to this spot.”

An ordinary amount of challenges is not something you usually hear a NASA official say about a one-of-a-kind space mission. NASA does hard things, and they usually take more time than originally predicted. Astronomers endured more than 10 years of delays, fixes, and setbacks before the James Webb Space Telescope finally launched in 2021.

Webb is the largest telescope ever put into space. After launch, Webb had to perform a sequence of more than 50 major deployment steps, with 178 release mechanisms that had to work perfectly. Any one of the more than 300 single points of failure could have doomed the mission. In the end, Webb unfolded its giant segmented mirror and delicate sunshield without issue. After a quarter-century of development and more than $11 billion spent, the observatory is finally delivering images and science results. And they’re undeniably spectacular.

The completed Nancy Grace Roman Space Telescope, seen here with its solar panels deployed inside a clean room at NASA’s Goddard Space Flight Center in Maryland. Credit: NASA/Jolearra Tshiteya

Seeing far and wide

Roman is far less complex, with a 7.9-foot (2.4-meter) primary mirror about one-third the diameter of Webb's. While it lacks Webb's deep vision, Roman will see wider swaths of the sky, enabling a cosmic census of billions of stars and galaxies both near and far across the Universe. This broad vision will support research into dark matter and dark energy, which together are thought to make up about 95 percent of the Universe; the rest is ordinary matter, the atoms and molecules we can see and touch.

It is also illustrative to compare Roman with the Hubble Space Telescope, whose primary mirror is the same size. This means Roman will produce images with resolution similar to Hubble's. The distinction lies deep inside Roman, where technicians have delicately laid an array of detectors to register the faint infrared light coming through the telescope's aperture.

“Things like night vision goggles will use the same basic detector device, just tuned to a different wavelength,” Townsend said.

These detectors are located in Roman's Wide Field Instrument, the mission's primary imaging camera. There are 18 of them, each measuring 4,096×4,096 pixels, combining to form a roughly 300-megapixel camera sensitive to visible and near-infrared light. Teledyne, the company that produced the detectors, says this is the largest infrared focal plane ever made.
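Those detector figures imply the headline megapixel count. A quick back-of-the-envelope check, using only the numbers quoted above (18 detectors at 4,096×4,096 pixels):

```python
# Sanity check of Roman's Wide Field Instrument pixel count,
# based on the figures in the article: 18 detectors, each 4,096 x 4,096 pixels.
detectors = 18
pixels_per_side = 4_096

total_pixels = detectors * pixels_per_side ** 2
megapixels = total_pixels / 1_000_000

print(f"{total_pixels:,} pixels ~ {megapixels:.0f} megapixels")
# prints "301,989,888 pixels ~ 302 megapixels"
```

The result, about 302 megapixels, matches the "roughly 300-megapixel" figure in the text.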

The near-infrared channel on Hubble's Wide Field Camera 3, which covers much the same part of the spectrum as Roman, has a single 1,024×1,024-pixel detector.

“That’s how you get to a much higher field-of-view for the Roman Space Telescope, and it was one of the key enabling technologies,” Townsend told Ars. “That was one place where Roman invested significant dollars, even before we started as a mission, to mature that technology so that it was ready to infuse into this mission.”

With these detectors in its bag, Roman will cover much more cosmic real estate than Hubble. For example, Roman will be able to re-create Hubble’s famous Ultra Deep Field image with the same sharpness, but expand it to show countless stars and galaxies over an area of the sky at least 100 times larger.

This infographic illustrates the differences between the sizes of the primary mirrors and detectors on the Hubble, Roman, and Webb telescopes. Credit: NASA

Roman has a second instrument, the Roman Coronagraph, with masks, filters, and adaptive optics to block out the glare from stars and reveal the faint glow from objects around them. It is designed to photograph planets 100 million times fainter than their stars, or 100 to 1,000 times better than similar instruments on Webb and Hubble. Roman can also detect exoplanets using the tried-and-true transit method, but scientists expect the new telescope will find a lot more than past space missions, thanks to its wider vision.

“With Roman’s construction complete, we are poised at the brink of unfathomable scientific discovery,” said Julie McEnery, Roman’s senior project scientist at NASA Goddard, in a press release. “In the mission’s first five years, it’s expected to unveil more than 100,000 distant worlds, hundreds of millions of stars, and billions of galaxies. We stand to learn a tremendous amount of new information about the universe very rapidly after Roman launches.”

Big numbers are crucial for learning how the Universe works, and Roman will feed vast volumes of data down to astronomers on Earth. “So much of what physics is trying to understand about the nature of the Universe today needs large number statistics in order to understand,” Townsend said.

In one of Roman’s planned sky surveys, the telescope will cover in nine months what would take Hubble between 1,000 and 2,000 years. In another survey, Roman will cover an area equivalent to 3,455 full moons in about three weeks, then go back and observe a smaller portion of that area repeatedly over five-and-a-half days—jobs that Hubble and Webb can’t do.

“We will do fundamentally different science,” Townsend said. “In some subset of our observations, we’re going to be making 3D movies of what is going on in the Milky Way galaxy and in distant galaxies. That is just something that’s never happened before.”

Getting here and getting there

Roman’s promised scientific bounty will come at a cost of $4.3 billion, including expenses for development, manufacturing, launch, and five years of operations.

This is about $300 million more than NASA expected when it formally approved Roman for development in 2020, an overrun the agency blamed on complications related to the coronavirus pandemic. Otherwise, Roman’s budget has been stable since NASA officials finalized the mission’s architecture in 2017, when it was still known by a bulky acronym: WFIRST, the Wide Field InfraRed Survey Telescope.

At that time, the agency reclassified the Roman Coronagraph as a technology demonstration, allowing managers to relax their requirements for the instrument and stave off concerns about cost growth.

Roman survived multiple attempts by the first Trump administration to cancel the mission. Each time, Congress restored funding to keep the observatory on track for launch in the mid-2020s. With Donald Trump back in the White House, the administration’s budget office earlier this year again wanted to cancel Roman. Eventually, the Trump administration released its fiscal year 2026 budget request in May, calling for a drastic cut to Roman, but not total cancellation.

Once again, both houses of Congress signaled their opposition to the cuts, and the mission remains on track for launch next year, perhaps as soon as September. This is eight months ahead of the schedule NASA has publicized for Roman for the last few years.

Townsend told Ars the mission escaped the kind of crippling cost overruns and delays that afflicted Webb through careful planning and execution. “Roman was under a cost cap, and we operated to that,” she said. “We went through reasonable efforts to preclude those kinds of highly complex deployments that lead you to having trouble in integration and test.”

The outer barrel section of the Roman Space Telescope inside a thermal vacuum chamber at NASA’s Goddard Space Flight Center, Maryland. Credit: NASA/Sydney Rohde

There are only a handful of mechanisms that must work after Roman’s launch. They include a deployable cover designed to shield the telescope’s mirror during launch and solar array wings that will unfold once Roman is in space. The observatory will head to an observing post about a million miles (1.5 million kilometers) from Earth.

“We don’t have moments of terror for the deployment,” Townsend said. “Obviously, launch is always a risk, the tip-off rates that you have when you separate from the launch vehicle… Then, obviously, getting the aperture door open so that it’s deployed is another one. But these feel like normal aerospace risks, not unusual, harrowing moments for Roman.”

It also helps that Roman will use a primary mirror gifted to NASA by the National Reconnaissance Office, the US government’s spy satellite agency. The NRO originally ordered the mirror for a telescope that would peer down on the Earth, but the spy agency no longer needed it. Before NASA got its hands on the surplus mirror in 2012, scientists working on the preliminary design for what became Roman were thinking of a smaller telescope.

The larger telescope will make Roman a more powerful tool for science, and the NRO's donation eliminated the risk of a problem or delay in manufacturing a new mirror. But the upgrade came with a tradeoff: NASA had to build a more massive spacecraft and use a bigger rocket to accommodate it, adding to the observatory's cost.

Tests of Roman’s components have gone well this year. Work on Roman continued at Goddard through the government shutdown in the fall. On Webb, engineers uncovered one problem after another as they tried to verify the observatory would perform as intended in space. There were leaky valves, tears in the Webb’s sunshield, a damaged transducer, and loose screws. With Roman, engineers so far have found no “significant surprises” during ground testing, Townsend said.

“What we always hope when you’re doing this final round of environmental tests is that you’ve wrung out the hardware at lower levels of assembly, and it looks like, in Roman’s case, we did a spectacular job at the lower level,” she said.

With Roman now fully assembled, attention at Goddard will turn to an end-to-end functional test of the observatory early next year, followed by electromagnetic interference testing, and another round of acoustic and vibration tests. Then, perhaps around June of next year, NASA will ship the observatory to Kennedy Space Center, Florida, to prepare for launch on a SpaceX Falcon Heavy rocket.

“We’re really down to the last stretch of environmental testing for the system,” Townsend said. “It’s definitely already seen the worst environment until we get to launch.”


Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.



Utah leaders hinder efforts to develop solar energy supply


Solar power accounts for two-thirds of the new projects waiting to connect to the state’s power grid.

Utah Gov. Spencer Cox believes his state needs more power—a lot more. By some estimates, Utah will require as much new electricity in the next five years as it generated in all of the last century, both to meet the demands of a growing population and to power the data centers and AI developers the state is chasing to fuel its economy.

To that end, Cox announced Operation Gigawatt last year, declaring the state would double energy production in the next decade. Although the announcement was short on details, Cox, a Republican, promised his administration would take an “any of the above” approach, which aims to expand all sources of energy production.

Despite that goal, the Utah Legislature’s Republican supermajority, with Cox’s acquiescence, has taken a hard turn against solar power—which has been coming online faster than any other source in Utah and accounts for two-thirds of the new projects waiting to connect to the state’s power grid.

Cox signed a pair of bills passed this year that will make it more difficult and expensive to develop and produce solar energy in Utah by ending solar development tax credits and imposing a hefty new tax on solar generation. A third bill aimed at limiting solar development on farmland narrowly missed the deadline for passage but is expected to return next year.

While Operation Gigawatt emphasizes nuclear and geothermal as Cox’s preferred sources, the legislative broadside, and Cox’s willingness to go along with it, caught many in the solar industry off guard. The three bills, in their original form, could have brought solar development to a halt if not for solar industry lobbyists negotiating a lower tax rate and protecting existing projects as well as those under construction from the brunt of the impact.

“It took every dollar of political capital from all the major solar developers just to get to something tolerable, so that anything they have under development will get built and they can move on to greener pastures,” said one industry insider, indicating that solar developers will likely pursue projects in more politically friendly states. ProPublica spoke with three industry insiders—energy developers and lobbyists—all of whom asked to remain anonymous for fear of antagonizing lawmakers who, next month, will again consider legislation affecting the industry.

The Utah Legislature’s pivot away from solar mirrors President Donald Trump’s more hostile approach to the industry than his predecessor’s. Trump has ordered the phaseout of lucrative federal tax incentives for solar and other renewable energy, which had expanded under the Biden administration. The loss of federal incentives is a bigger hit to solar companies than the reductions to Utah’s tax incentives, industry insiders acknowledged. The administration has also canceled large wind and solar projects, which Trump has lamented as “the scam of the century.” He described solar as “farmer killing.”

Yet Cox criticized the Trump administration’s decision to kill a massive solar project in neighboring Nevada. Known as a governor who advocates for a return to more civil political discourse, Cox doesn’t often pick fights. But he didn’t pull punches with the decision to halt the Esmeralda 7 project planned on 62,300 acres of federal land. The central Nevada project was expected to produce 6.2 gigawatts of power—enough to supply nearly eight times the number of households in Las Vegas. (Although the Trump administration canceled the environmental review of the joint project proposed by multiple developers, it has the potential to move forward as individual projects.)

“This is how we lose the AI/energy arms race with China,” Cox wrote on X when news surfaced of the project’s cancellation. “Our country needs an all-of-the-above approach to energy (like Utah).”

But he didn’t take on his own Legislature, at least publicly.

Many of Utah’s Republican legislators have been skeptical of solar for years, criticizing its footprint on the landscape and viewing it as an unreliable energy source, while lamenting the retirement of coal-generated power plants. The economies of several rural counties rely on mining coal. But lawmakers’ skepticism hadn’t coalesced into successful anti-solar legislation—until this year. When Utah lawmakers convened at the start of 2025, they took advantage of the political moment to go after solar.

“This is a sentiment sweeping through red states, and it’s very disconcerting and very disturbing,” said Steve Handy, Utah director of The Western Way, which describes itself as a conservative organization advocating for an all-of-the-above approach to energy development.

The shift in sentiment against solar energy has created a difficult climate for an all-of-the-above approach. Solar projects can be built quickly on Utah’s vast, sun-drenched land, while nuclear is a long game with projects expected to take a decade or more to come online under optimistic scenarios.

Cox generally supports solar, “in the right places,” especially when the captured energy can be stored in large batteries for distribution on cloudy days and after the sun goes down.

Cox said that instead of vetoing the anti-solar bills, he spent his political capital to moderate the legislation’s impact. “I think you’ll see where our fingerprints were,” he told ProPublica. He didn’t detail specific changes for which he advocated but said the bills’ earlier iterations would have “been a lot worse.”

“We will continue to see solar in Utah.”

Cox’s any-of-the-above approach to energy generation draws from a decades-old Republican push similarly titled “all of the above.” The GOP policy’s aim was as much about preserving and expanding reliance on fossil fuels (indeed, the phrase may have been coined by petroleum lobbyists) as it was turning to cleaner energy sources such as solar, wind, and geothermal.

As governor of a coal-producing state, Cox hasn’t shown interest in reducing reliance on such legacy fuels. But as he slowly rolls out Operation Gigawatt, his focus has been on geothermal and nuclear power. Last month, he announced plans for a manufacturing hub for small modular reactors in the northern Utah community of Brigham City, which he hopes will become a nuclear supply chain for Utah and beyond. And on a recent trade mission to New Zealand, he signed an agreement to collaborate with the country on geothermal energy development.

Meanwhile, the bills Cox signed into law already appear to be slowing solar development in Utah. Since May, when the laws took effect, 51 planned solar projects withdrew their applications to connect to the state’s grid—representing more than a quarter of all projects in Utah’s transmission connection queue. Although projects drop out for many reasons, some industry insiders theorize the anti-solar legislation could be at play.

Caught in the political squeeze over power are Utah customers, who are footing higher electricity bills. Earlier this year, the state’s utility, Rocky Mountain Power, asked regulators to approve a 30 percent hike to fund increased fuel and wholesale energy costs, as well as upgrades to the grid. In response to outrage from lawmakers, the utility knocked the request down to 18 percent. Regulators eventually awarded the utility a 4.7 percent increase—a decision the utility promptly appealed to the state Supreme Court.

Juliet Carlisle, a University of Utah political science professor focusing on environmental policy, said the new solar tax could signal to large solar developers that Utah energy policy is “becoming more unpredictable,” prompting them to build elsewhere. This, in turn, could undermine Cox’s efforts to quickly double Utah’s electricity supply.

Operation Gigawatt “relies on rapid deployment across multiple energy sources, including renewables,” she said. “If renewable growth slows—especially utility-scale solar, which is currently the fastest-deploying resource—the state may face challenges meeting demand growth timelines.”

Rep. Kay Christofferson, R-Lehi, had sponsored legislation to end the solar industry’s state tax credits for several legislative sessions, but this was the first time the proposal succeeded.

Christofferson agrees Utah is facing unprecedented demand for power, and he supports Cox’s any-of-the-above approach. But he doesn’t think solar deserves the advantages of tax credits. Despite improving battery technology, he still considers it an intermittent source and thinks overreliance on it would work against Utah’s energy goals.

In testimony on his bill, Christofferson said he believed the tax incentives had served their purpose of getting a new industry off the ground—16 percent of Utah’s power generation now comes from solar, ranking it 16th in the nation for solar capacity.

Christofferson’s bill was the least concerning to the industry, largely because it negotiated a lengthy wind-down of the subsidies. Initially it would have ended the tax credit after Jan. 1, 2032. But after negotiations with the solar industry, he extended the deadline to 2035.

The bill passed the House, but when it reached the Senate floor, Sen. Brady Brammer, R-Pleasant Grove, moved the end of the incentives to 2028. He told ProPublica he believes solar is already established and no longer needs the subsidy. Christofferson tried to defend his compromise but ultimately voted with the legislative majority.

Christofferson’s bill wasn’t born of antipathy for renewable energy. The same can’t be said of the measure from Rep. Casey Snider, R-Paradise, who made clear in public statements and behind closed doors to industry lobbyists that the goal of his bill was to make solar pay.

The bill imposes a tax on all solar production. The proceeds will substantially increase the state’s endangered species fund, which Utah paradoxically uses to fight federal efforts to list threatened animals for protection. Snider cast his bill as pro-environment, arguing the money could also go to habitat protection.

As initially written, the bill would have taxed not only future projects, but also those already producing power and, more worrisome for the industry, projects under construction or in development with financing in place. The margins on such projects are thin, and the unanticipated tax could kill projects already in the works, one solar industry executive testified.

“Companies like ours are being effectively punished for investing in the state,” testified another.

The pushback drew attacks from Snider, who accused solar companies of hypocrisy on the environment.

Industry lobbyists who spoke to ProPublica said Snider wasn’t as willing to negotiate as Christofferson. However, they succeeded in reducing the tax rate on future developments and negotiated a smaller, flat fee for existing projects.

“Everyone sort of decided collectively to save the existing projects and let it go for future projects,” said one lobbyist.

Snider told ProPublica, “My goal was never to run anybody out of business. If we wanted to make it more heavy-handed, we could have. Utah is a conservative state, and I would have had all the support.”

Snider said, like the governor, he favors an any-of-the-above approach to energy generation and doesn’t “want to take down any particular industry or source.” But he believes utility-scale solar farms need to pay to mitigate their impact on the environment. He likened his bill to federal law that requires royalties from oil and gas companies to be used for conservation. He hopes federal lawmakers will use his bill as a model for federal legislation that would apply to solar projects nationwide.

“This industry needs to give back to the environment that they claim very heavily they are going to protect,” he said. “I do believe there’s a tinge of hypocrisy to this whole movement. You can’t say you’re good for the environment and not offset your impacts.”

One of the more emotional debates over solar is set to return next year, after a bill that would end tax incentives for solar development on agricultural land failed to get a vote in the final minutes of this year’s session. Sponsored by Rep. Colin Jack, R-St. George, the bill has been fast-tracked in the next session, which begins in January.

Jack said he was driven to act by ranchers who were concerned that solar companies were outbidding them for land they had been leasing to graze cows. Solar companies pay substantially higher rates than ranchers can. His bill initially had a slew of land use restrictions, such as mandated setbacks between projects and residential property and creeks, minimum lot sizes, and 4-mile “green zones” between projects, that solar lobbyists said would have strangled their industry. After negotiating with solar developers, Jack eliminated the land use restrictions while preserving provisions to prohibit tax incentives for solar farms on private agricultural land and to create standards for decommissioning projects.

Many in rural Utah recoil at rows of black panels disrupting the landscape and fear solar farms will displace the ranching and farming way of life. Indeed, some wondered whether Cox, who grew up on a farm in central Utah, would have been as critical of Trump scuttling a 62,300-acre solar farm in his own state as he was of the Nevada project’s cancellation.

Peter Greathouse, a rancher in western Utah’s Millard County, said he is worried about solar farms taking up grazing land in his county. “Twelve and a half percent is privately owned, and a lot of that is not farmable. So if you bring in these solar places that start to eat up the farmland, it can’t be replaced,” he said.

Utah is losing about 500,000 acres of agricultural land every 10 years, most of it to housing. A report by The Western Way estimated solar farms use 0.1 percent of the United States’ total land mass. That number is expected to grow to 0.46 percent by 2050—a tiny fraction of what is used by agriculture. Of the land managed by the Utah Trust Lands Administration, less than 3,000 of the 2.9 million acres devoted to grazing have been converted to solar farms.
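The trust-lands figure works out to a tiny sliver of the grazing estate. A quick check using the numbers above (under 3,000 of 2.9 million acres):

```python
# Share of Utah Trust Lands grazing acreage converted to solar,
# using the article's figures: under 3,000 of 2.9 million acres.
solar_acres = 3_000          # upper bound cited in the article
grazing_acres = 2_900_000

share = solar_acres / grazing_acres * 100
print(f"about {share:.1f}% of trust-land grazing acreage")
# prints "about 0.1% of trust-land grazing acreage"
```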

Other ranchers told ProPublica they’ve been able to stay on their land and preserve their way of life by leasing to solar. Landon Kesler’s family, which raises cattle for team roping competitions, has leased land to solar for more than a decade. The revenue has allowed the family to almost double its land holdings, providing more room to ranch, Kesler said.

“I’m going to be quite honest, it’s absurd,” Kesler said of efforts to limit solar on agricultural land. “Solar very directly helped us tie up other property to be used for cattle and ranching. It didn’t run us out; it actually helped our agricultural business thrive.”

Solar lobbyists and executives have been working to bolster the industry’s image with lawmakers ahead of the next legislative session. They’re arguing solar is a good neighbor.

“We don’t use water, we don’t need sidewalks, we don’t create noise, and we don’t create light,” said Amanda Smith, vice president of external affairs for AES, which has one solar project operating in Utah and a second in development. “So we just sort of sit out there and produce energy.”

Solar pays private landowners in Utah $17 million a year to lease their land. And, more important, solar developers argue, it’s critical to powering data centers the state is working to attract.

“We are eager to be part of a diversified electricity portfolio, and we think we bring a lot of values that will benefit communities, keep rates low and stable, and help keep the lights on,” Rikki Seguin, executive director of Interwest Energy Alliance, a western trade organization that advocates for utility-scale renewable energy projects, told an interim committee of lawmakers this summer.

The message didn’t get a positive reception from some lawmakers on the committee. Rep. Carl Albrecht, R-Richfield, who represents three rural Utah counties and was among solar’s critics last session, said the biggest complaint he hears from constituents is about “that ugly solar facility” in his district.

“Why, Rep. Albrecht, did you allow that solar field to be built? It’s black. It looks like the Dead Sea when you drive by it,” Albrecht said.

This story was originally published by ProPublica.




Sharks and rays gain landmark protections as nations move to curb international trade


Gov’ts agree to ban or restrict international trade in shark meat, fins, and other products.

For the first time, global governments have agreed to widespread international trade bans and restrictions for sharks and rays being driven to extinction.

Last week, more than 70 shark and ray species, including oceanic whitetip sharks, whale sharks, and manta rays, received new safeguards under the Convention on International Trade in Endangered Species of Wild Fauna and Flora. The convention, known as CITES, is a United Nations treaty that requires countries to regulate or prohibit international trade in species whose survival is threatened.

Sharks and rays are closely related fishes that play similar roles as top predators in the ocean, helping to maintain healthy marine ecosystems. They have been caught and traded for decades, contributing to a global market worth nearly $1 billion annually, according to Luke Warwick, director of shark and ray conservation at Wildlife Conservation Society (WCS), an international nonprofit dedicated to preserving animals and their habitats.

The sweeping conservation measures were adopted as the treaty’s 20th Conference of the Parties (COP20) concluded in Samarkand, Uzbekistan, signaling a landmark global commitment to stop or regulate the demand for shark meat, fins, and other products derived from the animals.

“These new protections are a powerful step toward ensuring these species have a real chance at recovery,” said Diego Cardeñosa, an assistant professor at Florida International University and lead scientist at the school’s Predator Ecology and Conservation Lab, which is developing new technologies to combat the illegal trade of sharks.

More than a third of shark and ray species are now threatened with extinction. Pelagic shark populations that live in the open ocean have declined by more than 70 percent over the last 50 years. Reef sharks have all but vanished from one in five coral reefs worldwide. “We’re in the middle of an extinction crisis for the species and it’s kind of a silent crisis,” said Warwick. “It’s only in the last decade or so we’ve really, really started to notice that this is happening, and the major driver of it is actually overfishing.”

Unlike tuna and other commercially valuable fish that have been tightly regulated for decades, sharks have long lacked comparable controls on their trade and have often been treated as if they were another fast-reproducing seafood commodity.

“People treat sharks and rays, or have done over the last 50 years, as if they’re like other fish,” Warwick said. But unlike many fish that produce millions of eggs a year, sharks and rays take much longer to mature and produce significantly fewer young. Manta rays, for instance, may only give birth to seven live pups in their lifetime. “But we’ve been catching and killing them, just like other fish, and that, sadly, has led to these catastrophic declines.”

Manta rays are targeted primarily for their large gill plates, which are used in some traditional medicines in Asia aimed at detoxifying the body and boosting immunity, though there is no scientific evidence to support these claims. Their meat is sometimes turned into animal feed or consumed locally.

Shark fins remain a delicacy in luxury Chinese cuisine, prized in expensive dishes like shark fin soup. Shark meat is increasingly sold as a low-cost source of protein. It’s also a common ingredient in cat and dog food.

The livers of deep-water species like gulper sharks are also harvested for their oil, which is used to produce squalene, a staple component of topical skincare products and makeup. Years of unregulated trade of the species have driven population declines of more than 80 percent in some regions.

“The cosmetic industry, really, in a way, is driving the trade of the sharks,” said Gabriel Vianna, a shark researcher from the Charles Darwin Foundation, an international nonprofit dedicated to conserving the Galapagos Islands. In recent years, squalene has also been increasingly used in pharmaceuticals and even COVID-19 vaccines. “We should be using synthetic options and not exploiting these species,” Vianna said.

But until last week, there were no international controls in place to regulate trade in these species despite growing demand for their livers.

That has now changed through the latest decisions adopted at CITES, which Warwick said mark a turning point in marine conservation.

For much of its 50-year history, the convention focused on protecting iconic land species like elephants, rhinos, primates, and parrots, or charismatic marine species like sea turtles, Warwick said. By 1981, CITES had imposed an international ban on all international trade of sea turtles, which Warwick credited for helping some species make remarkable comebacks in the last few decades. Only in the last 10 years, Warwick said, has the convention slowly begun recognizing sharks and rays with similar urgency.

This year at COP20, all proposed protections for sharks and rays were adopted, largely with unanimous support from CITES’ 185 member countries and the European Union, which Warwick said had never happened before.

The European Union is one of the top suppliers of shark meat to Southeast and East Asian markets, with its imports and exports adding up to more than 20 percent of global shark meat trade, according to the World Wildlife Fund. 

Gulper sharks, targeted for their livers, as well as smoothhound and tope sharks, which are primarily fished for their meat, were listed under CITES’ Appendix II. Each listing covers multiple species—20 species of gulper sharks and 30 species of smoothhounds—grouped together because their products cannot be reliably distinguished in trade.

The listing requires all CITES parties to strictly regulate international trade in the species and to demonstrate that such trade is traceable and biologically sustainable. Some species, including wedgefish and giant guitarfish—large shark-like rays targeted for their highly valuable fins—are now protected by a temporary suspension of trade.

Others, such as oceanic whitetips, whale sharks, and manta and devil rays, can no longer be traded internationally at all. Under the new protections, CITES now lists them as Appendix I species, meaning they face a real extinction risk due to trade and are afforded the treaty’s highest level of protection.

“If you find an oceanic whitetip fin being traded, 90 days from here onwards, that’s an illegal product,” he said.

For many shark advocates, the new listings are bittersweet.

“We are very happy but we are very sad at the same time,” said Vianna. “We shouldn’t be happy about this species being listed. We should actually be really worried that there’s such a problem with them.” Meaningful implementation of the new protections will be critical to the survival of many of these species, he said.

Research published in November by Cardeñosa and Warwick found that fins from several shark and ray species, such as oceanic whitetip sharks, which were previously listed under Appendix II, frequently turned up in Hong Kong—the world’s largest shark fin market—between 2015 and 2021. Appendix II allows for regulated trade, but little to no legal trade in species like the oceanic whitetip has been reported since CITES began regulating it in 2014, revealing a significant gap between the volume of sharks actually being traded and what is legally documented. For example, genetic analysis of shark fins in Hong Kong detected more than 70 times the number of oceanic whitetip shark fins reported in official CITES records, indicating that more than 90 percent of the trade is illegal.
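The scale of that gap follows from simple arithmetic. The numbers below are illustrative placeholders, not the study’s raw data; the point is only that a roughly 70-fold detection ratio implies an undocumented share well above the 90 percent threshold the researchers cite:

```python
# Hypothetical illustration of the trade gap described in the study.
# Assume official CITES records report some baseline number of fins,
# and genetic market surveys imply ~70x that amount in actual trade.
reported = 100            # fins in official records (illustrative unit)
detected = 70 * reported  # fins implied by genetic analysis

# Share of the actual trade that never appears in legal records
undocumented_share = (detected - reported) / detected
print(f"{undocumented_share:.1%}")  # prints 98.6%
```

A 70-fold detection ratio implies that nearly 99 percent of the observed trade is undocumented, comfortably consistent with the “more than 90 percent” figure reported.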

“This tells us that enforcement gaps remain, especially in large, complex supply chains,” Cardeñosa said in an email.

Now that the oceanic whitetip shark has been uplisted to Appendix I, which prohibits any international trade, Cardeñosa hopes loopholes that previously allowed the protected species and others to slip through will be closed.

“The new listings will not eliminate illegal trade overnight, but they will significantly strengthen the ability of countries to inspect, detect, and prosecute illegal shipments,” Cardeñosa said. “Parties must invest in identification tools, capacity building, and routine monitoring if these protections are to translate into real reductions in illegal trade.”

ProPublica is a Pulitzer Prize-winning investigative newsroom. Sign up for The Big Story newsletter to receive stories like this one in your inbox.

This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy and the environment. Sign up for their newsletter here.



Scientists built an AI co-pilot for prosthetic bionic hands

To test their AI-powered hand, the team asked intact and amputee participants to manipulate fragile objects: pick up a paper cup and drink from it, or take an egg from a plate and put it down somewhere else. Without the AI, they could succeed roughly one or two times in 10 attempts. With the AI assistant turned on, their success rate jumped to 80 or 90 percent. The AI also decreased the participants’ cognitive burden, meaning they had to focus less on making the hand work.
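The “co-pilot” framing suggests a shared-control scheme in which the machine shoulders part of the grip regulation. The sketch below is purely illustrative, not the paper’s actual controller: the function name, the linear blend, and the fixed autonomy level are all assumptions made for the example.

```python
# Minimal sketch of shared-autonomy blending (an assumption about the
# general approach, not the study's controller): the hand's grip command
# mixes the user's decoded intent with an AI policy that limits grip
# force on fragile objects like paper cups or eggs.
def blended_grip(user_cmd: float, ai_cmd: float, autonomy: float) -> float:
    """Mix user and AI commands; autonomy in [0, 1] sets the AI's share."""
    autonomy = max(0.0, min(1.0, autonomy))
    return (1.0 - autonomy) * user_cmd + autonomy * ai_cmd

# Example: user squeezes hard (0.9) on a paper cup; the AI favors a
# gentler grip (0.3), and a high autonomy setting pulls the output down.
print(round(blended_grip(0.9, 0.3, 0.7), 2))  # prints 0.48
```

Dialing `autonomy` toward 1 lets the AI override a crushing grip, which is one plausible way to explain both the higher success rate and the reduced cognitive burden: the user no longer has to micromanage force.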

But we’re still a long way away from seamlessly integrating machines with the human body.

Into the wild

“The next step is to really take this system into the real world and have someone use it in their home setting,” Trout says. So far, the AI bionic hand’s performance has been assessed only under controlled laboratory conditions, with settings and objects the team specifically chose or designed.

“I want to make a caveat here that this hand is not as dexterous or easy to control as a natural, intact limb,” George cautions. He thinks every incremental improvement in prosthetics lets amputees do more tasks in their daily lives. Still, to reach the Star Wars or Cyberpunk level, where bionic prostheses are as good as or better than natural limbs, we’re going to need more than incremental changes.

Trout says we’re almost there as far as robotics go. “These prostheses are really dexterous, with high degrees of freedom,” Trout says, “but there’s no good way to control them.” This in part comes down to the challenge of getting the information in and out of users themselves. “Skin surface electromyography is very noisy, so improving this interface with things like internal electromyography or using neural implants can really improve the algorithms we already have,” Trout argued. This is why the team is currently working on neural interface technologies and looking for industry partners.

“The goal is to combine all these approaches in one device,” George says. “We want to build an AI-powered robotic hand with a neural interface working with a company that would take it to the market in larger clinical trials.”

Nature Communications, 2025.  DOI: 10.1038/s41467-025-65965-9


Investors commit quarter-billion dollars to startup designing “Giga” satellites

A startup established three years ago to churn out a new class of high-power satellites has raised $250 million to ramp up production at its Southern California factory.

The company, named K2, announced the cash infusion on Thursday. K2’s Series C fundraising round was led by Redpoint Ventures, with additional funding from investment firms in the United States, the United Kingdom, and Germany. K2 has now raised more than $400 million since its founding in 2022 and is on track to launch its first major demonstration mission next year, officials said.

K2 aims to take advantage of a coming abundance of heavy- and super-heavy-lift launch capacity, with SpaceX’s Starship expected to begin deploying satellites as soon as next year. Blue Origin’s New Glenn rocket launched twice this year and will fly more in 2026 while engineers develop an even larger New Glenn with additional engines and more lift capability.

Underscoring this trend toward big rockets are other launchers like SpaceX’s Falcon 9 and Falcon Heavy, United Launch Alliance’s Vulcan, and new vehicles from companies like Rocket Lab, Relativity Space, and Firefly Aerospace. K2’s founders believe satellites will follow a similar progression, reversing a trend toward smaller spacecraft in recent years, to address emerging markets like in-space computing and data processing.

Mega, then Giga

K2 is designing two classes of satellites—Mega and Giga—that it will build at a 180,000-square-foot factory in Torrance, California. The company’s first “Mega Class” satellite is named Gravitas. It is scheduled to launch in March 2026 on a Falcon 9 rocket. Once in orbit, Gravitas will test several systems that are fundamental to K2’s growth strategy. One is a 20-kilowatt Hall-effect thruster that K2 says will be four times more powerful than any such thruster flown to date. Gravitas will also deploy twin solar arrays capable of generating 20 kilowatts of power.

“Gravitas brings our full stack together for the first time,” said Karan Kunjur, K2’s co-founder and CEO, in a company press release. “We are validating the architecture in space, from high-voltage power and large solar arrays to our guidance and control algorithms, and a 20 kW Hall thruster, and we will scale based on measured performance.”


Ars Live: 3 former CDC leaders detail impacts of RFK Jr.’s anti-science agenda

The Centers for Disease Control and Prevention is in critical condition. This year, the premier public health agency had its funding brutally cut and staff gutted, its mission sabotaged, and its headquarters riddled with literal bullets. The over 500 rounds fired were meant for its scientists and public health experts, who endured only to be sidelined, ignored, and overruled by Health Secretary Robert F. Kennedy Jr., an anti-vaccine activist hellbent on warping the agency to fit his anti-science agenda.

Then, on August 27, Kennedy fired CDC Director Susan Monarez just weeks after she was confirmed by the Senate. She had refused to blindly approve vaccine recommendations from a panel of vaccine skeptics and contrarians that he had hand-selected. The agency descended into chaos, and Monarez wasn’t the only one to leave the agency that day.

Three top leaders had reached their breaking point and coordinated their resignations upon the dramatic ouster: Drs. Demetre Daskalakis, Debra Houry, and Daniel Jernigan walked out of the agency as their colleagues rallied around them.

Dr. Daskalakis was the director of the CDC National Center for Immunization and Respiratory Diseases. He managed national responses to mpox, measles, seasonal flu, bird flu, COVID-19, and RSV.


No sterile neutrinos after all, say MicroBooNE physicists

Since the 1990s, physicists have pondered the tantalizing possibility of an exotic fourth type of neutrino, dubbed the “sterile” neutrino, that doesn’t interact with regular matter at all, apart from its fellow neutrinos, perhaps. But definitive experimental evidence for sterile neutrinos has remained elusive. Now it looks like the latest results from Fermilab’s MicroBooNE experiment have ruled out the sterile neutrino entirely, according to a paper published in the journal Nature.

How did the possibility of sterile neutrinos even become a thing? It all dates back to the so-called “solar neutrino problem.” Physicists first detected solar neutrinos from the Sun in 1966, but far fewer arrived than theory predicted, a conundrum that gave the problem its name. Meanwhile, in 1962, physicists had discovered a second type (“flavor”) of neutrino, the muon neutrino. This was followed by the discovery of a third flavor, the tau neutrino, in 2000.

Physicists already suspected that neutrinos might be able to switch from one flavor to another. In 2002, scientists at the Sudbury Neutrino Observatory (or SNO) announced that they had solved the solar neutrino problem. The missing solar (electron) neutrinos were just in disguise, having changed into a different flavor on the long journey between the Sun and the Earth. If neutrinos oscillate, then they must have a teensy bit of mass after all. That posed another knotty neutrino-related problem. There are three neutrino flavors, but none of them has a well-defined mass. Rather, different kinds of “mass states” mix together in various ways to produce electron, muon, and tau neutrinos. That’s quantum weirdness for you.

And there was another conundrum, thanks to results from Los Alamos’ LSND experiment and Fermilab’s MiniBooNE (MicroBooNE’s predecessor). Both found evidence of muon neutrinos oscillating into electron neutrinos in a way that shouldn’t be possible if there were just three neutrino flavors. So physicists suggested there might be a fourth flavor: the sterile neutrino, so named because unlike the other three, it does not couple to a charged counterpart via the electroweak force. Its existence would also have big implications for the nature of dark matter. But despite the odd tantalizing hint, sterile neutrinos have proven to be maddeningly elusive.


NASA just lost contact with a Mars orbiter, and will soon lose another one

Technicians work on the MAVEN spacecraft at NASA’s Kennedy Space Center in Florida ahead of its launch in 2013. Credit: NASA/Kim Shiflett

But NASA’s two other Mars orbiters have been in space for more than 20 years. The older of the two, named Mars Odyssey, has been at Mars since 2001 and will soon run out of fuel, probably sometime in the next couple of years. NASA’s Mars Reconnaissance Orbiter, which launched in 2005, is healthy for its age, with enough fuel to last into the 2030s. MRO is also important to NASA because it has the best camera at Mars, with the ability to map landing sites for future missions.

Two European spacecraft, Mars Express and the ExoMars Trace Gas Orbiter, have radios to relay data between mission controllers and NASA’s landers on the Martian surface. Mars Express, now 22 years old, suffers from the same aging concerns as Mars Odyssey and MRO. The ExoMars Trace Gas Orbiter is newer, having arrived at Mars in 2016, but is also operating beyond its original lifetime.

China and the United Arab Emirates also have orbiters circling Mars, but neither spacecraft is equipped to serve as a communications relay.

NASA’s Curiosity and Perseverance rovers have the capability for direct-to-Earth communications, but the orbiting relay network can support vastly higher data throughput. Without overhead satellites, much of the science data and many of the spectacular images collected by NASA’s rovers might never make it off the planet.

MAVEN’s unique orbit, stretching as far as 2,800 miles (4,500 kilometers) above Mars, has some advantages for data relay. In that orbit, MAVEN could relay science data from rovers on the surface for up to 30 minutes at a time, longer than the relay periods available through NASA’s lower-altitude orbiters. Because of this, MAVEN could support larger data volumes than any of the other relay options.


This is the oldest evidence of people starting fires


We didn’t start the fire. (Neanderthals did, at least 400,000 years ago.)

This artist’s impression shows what the fire at Barnham might have looked like. Credit: Craig Williams, The Trustees of the British Museum

Heat-reddened clay, fire-cracked stone, and fragments of pyrite mark where Neanderthals gathered around a campfire 400,000 years ago in what’s now Suffolk, England.

Based on chemical analysis of the sediment at the site, along with the telltale presence of pyrite, a mineral not naturally found nearby but very handy for striking sparks with flint, British Museum archaeologist Rob Davis and his colleagues say the Neanderthals probably started the fire themselves. That makes the abandoned English clay pit at Barnham the oldest evidence in the world that people (Neanderthal people, in this case) had learned not only to use fire, but also to create and control it.

A cozy Neanderthal campfire

Today, the Barnham site is part of an abandoned clay pit where workers first discovered stone tools in the early 1900s. But 400,000 years ago, it would have been a picturesque little spot at the edge of a stream-fed pond, surrounded by a mix of forest and grassland. There are no hominin fossils here, but archaeologists unearthed a Neanderthal skull about 100 kilometers to the south, so the hominins at Barnham were probably also Neanderthals. The site would have offered a group of Neanderthals a relatively quiet, sheltered spot to set up camp, according to Davis and his colleagues.

The cozy domesticity of that camp apparently centered on a hearth about the size of a small campfire. What’s left of that hearth today is a patch of clayey silt baked to a rusty red color by a series of fires; it stands out sharply against the yellowish clay that makes up the rest of the site. When ancient hearth fires heated that iron-rich yellow clay, it formed tiny grains of hematite that turned the baked clay a telltale red. Near the edge of the hearth, the archaeologists unearthed a handful of flint handaxes shattered by heat, alongside a scattering of other heat-cracked flint flakes.

And glinting against the dull clay lay two small pieces of a shiny sulfide mineral, aptly named pyrite—a key piece of Stone Age firestarting kits. Long before people struck flint and steel together to make fire, they struck flint and pyrite. Altogether, the evidence at Barnham suggests that Neanderthals were building and lighting their own fires 400,000 years ago.

Fire: the way of the future

Lighting a fire sounds like a simple thing, but once upon a time, it took cutting-edge technology. Working out how to start a fire on purpose—and then how to control its size and temperature—was the breakthrough that made nearly everything else possible: hafted stone weapons, cooked food, metalworking, and ultimately microprocessors and heavy-lift rockets.

“Something else that fire provides is additional time. The campfire becomes a social hub,” said Davis during a recent press conference. “Having fire… provides this kind of intense socialization time after dusk.” It may have been around fires like the one at Barnham, huddled together against the dark Pleistocene evening, that hominins began developing language, storytelling, and mythologies. And those things, Davis suggested, could have “played a critical part in maintaining social relationships over bigger distances or within more complex social groups.” Fire, in other words, helped make us more fully human and may have helped us connect in the same way that bonding over TV shows does today.

Archaeologists have worked for decades to try to pinpoint exactly when that breakthrough happened (although most now agree that it probably happened multiple times in different places). But evidence of fire is hard to find because it’s ephemeral by its very nature. The small patch of baked clay at Barnham hasn’t seen a fire in 400,000 years, but its light is still pushing back the shadows.

an artist's impression of a person's hands holding a piece of flint and a piece of pyrite, striking them together to make sparks

This was the first step toward the Internet. We could have turned back. Credit: Craig Williams, The Trustees of the British Museum

A million-year history of fire

Archaeologists suspect that the first hominins to use fire took advantage of nearby wildfires: Picture a Homo erectus lighting a branch from a nearby wildfire (which must have taken serious guts), then carefully carrying that torch back to camp to cook food or to ward off predators for a night. Evidence of that sort of thing—using fire, but not necessarily being able to summon it on command—dates back more than a million years at sites like Koobi Fora in Kenya and Swartkrans in South Africa.

Learning to start a fire whenever you want one is harder, but it’s essential if you want to cook your food regularly without waiting for the next lightning strike to spark a brushfire. It can also help maintain the careful control of temperature needed to make birch tar adhesives. “The advantage of fire-making lies in its predictability,” as Davis and his colleagues wrote in their paper. Knowing how to strike a light changed fire from an occasional luxury into a staple of hominin life.

There are hints that Neanderthals in Europe were using fire by around 400,000 years ago, based on traces of long-cold hearths at sites in France, Portugal, Spain, the UK, and Ukraine. (The UK site, Beeches Pit, is just 10 kilometers southwest of Barnham.) But none of those sites offer evidence that Neanderthals were making fire rather than just taking advantage of its natural appearance. That kind of evidence doesn’t show up in the archaeological record until 50,000 years ago, when groups of Neanderthals in France used pyrite and bifaces (multi-purpose flint tools with two worked faces, sharp edges, and a surprisingly ergonomic shape) to light their own hearth-fires; marks left on the bifaces tell the tale.

Barnham pushes that date back dramatically, but there’s probably even older evidence out there. Davis and his colleagues say the Barnham Neanderthals probably didn’t invent firestarting; they likely brought the knowledge with them from mainland Europe.

“It’s certainly possible that Homo sapiens in Africa had the ability to make fire, but it can’t be proven yet from the evidence. We only have the evidence at this date from Barnham,” said Natural History Museum London anthropologist Chris Stringer, a coauthor of the study, in the press conference.

a person holds a tiny fragment of pyrite between a thumb and forefinger

The two pyrite fragments at the side may have broken off a larger nodule when it was struck against a piece of flint. Credit: Jordan Mansfield, Pathways to Ancient Britain Project.

Digging into the details

Several types of evidence at the site point to Neanderthals starting their own fire, not borrowing from a local wildfire. Ancient wildfires leave traces in sediment that can last hundreds of thousands of years or more—microscopic bits of charcoal and ash. But the area that’s now Suffolk wasn’t in the middle of wildfire season when the Barnham hearth was in use. Chemical evidence, like the presence of heavy hydrocarbon molecules concentrated in the sediment around the hearth, suggests this fire was homemade (wildfires usually scatter lighter hydrocarbons across several square kilometers of landscape).

But the key piece of evidence at Barnham—the kind of clue that arson investigators probably dream about—is the pyrite. Pyrite isn’t a naturally common mineral in the area around Barnham; Neanderthals would have had to venture at least 12 kilometers southeast to find any. And although few hominins can resist the allure of picking up a shiny rock, it’s likely that these bits of pyrite had a more practical purpose.

To figure out what sort of fire might have produced the reddened clay, Davis and his colleagues did some experiments (which involved setting a bunch of fires atop clay taken from near the site). The archaeologists compared the baked clay from Barnham to the clay from beneath their experimental fires. The grain size and chemical makeup of the clay from the ancient Neanderthal hearth looked almost exactly like “12 or more heating events, each lasting 4 hours at temperatures of 400° Celsius or 600° Celsius,” as Davis and his colleagues wrote.

In other words, the hearth at Barnham hints at the rhythms of daily life for one group of Neanderthals 400,000 years ago. For starters, it seems that they kindled their campfire in the same spot over and over and left it burning for hours at a time. Flakes of flint nearby conjure up images of Neanderthals sitting around the fire, knapping stone tools as they told each other stories long into the night.

Nature, 2025. DOI: 10.1038/s41586-025-09855-6


Kiona is a freelance science journalist and resident archaeology nerd at Ars Technica.


Court: “Because Trump said to” may not be a legally valid defense

In one of those cases, a judge lifted the hold on construction, ruling that the lack of a sound justification for the hold made it “the height of arbitrary and capricious,” a legal standard that determines whether federal decision-making is acceptable under the Administrative Procedure Act. If this were a fictional story, that would be considered foreshadowing.

With no indication of how long the comprehensive assessment would take, 17 states sued to lift the hold on permitting. They were joined by the Alliance for Clean Energy New York, which represents companies that build wind projects or feed their supply chain. Both the plaintiffs and the agencies that were sued asked for summary judgment in the case.

The first issue Judge Saris addressed is standing: Are the states suffering appreciable harm from the suspension of wind projects? She noted that they would receive tax revenue from the projects, that their citizens should see reduced energy costs following their completion, and that the projects were intended to contribute to their climate goals, thus limiting harm to their citizens. At one point, Saris even referred to the government’s attempts to claim the parties lacked standing as “tilting at windmills.”

The government also argued that the suspension wasn’t a final decision—that would come after the review—and thus didn’t fall under the Administrative Procedure Act. But Saris ruled that the decision to suspend all activity pending the rule was the end of a decision-making process and was not being reconsidered by the government, so it qualified.

Because Trump told us to

With those basics out of the way, Saris turned to the meat of the case, which included a consideration of whether the agencies had been involved with any decision-making at all. “The Agency Defendants contend that because they ‘merely followed’ the Wind Memo ‘as the [Wind Memo] itself commands,’ the Wind Order did not constitute a ‘decision’ and therefore no reasoned explanation was required,” her ruling says. She concludes that precedent at the circuit court level blocks this defense, as it would mean that agencies would be exempt from the Administrative Procedure Act whenever the president told them to do anything.


Brazil weakens Amazon protections days after COP30


Backed by powerful corporations, nations are giving the public false choices: environmental protection or economic growth.

Deforestation fire in the Amazon rainforest. Credit: Brasil2

Despite claims of environmental leadership and promises to preserve the Amazon rainforest ahead of COP30, Brazil is stripping away protections for the region’s vital ecosystems faster than workers dismantled the tents that housed the recent global climate summit in Belém.

On Nov. 27, less than a week after COP30 ended, a powerful political bloc in Brazil’s National Congress representing agribusiness and development interests weakened safeguards for the Amazon’s rivers, forests, and Indigenous communities.

The rollback centered on provisions in an environmental licensing bill passed a few months before COP30. The law began to take shape well before that, during the Jair Bolsonaro presidency from 2019 to 2023. It reflected the deregulatory agenda of the rural caucus, the Frente Parlamentar da Agropecuária, which wielded significant power during his term and remains influential today.

Bolsonaro’s government openly supported weakening environmental licensing. His environment minister, Ricardo Salles, dismissed licensing as “a barrier to development” and pushed for broad deregulation.

Current President Luiz Inácio Lula da Silva vetoed many of its most controversial provisions in August, citing risks to Indigenous rights and environmental oversight. But in late November, the legislature overturned those vetoes and reinstated the contested sections.

“This is neither improving nor modernizing, it is simply deregulation,” said Sarah Sax, who analyzes Brazil’s climate and human rights policies as a researcher with Climate Rights International, a California-based nonprofit advocating for climate justice.

“It’s happening in Brazil in ways that mirror what you’re seeing around the world. These are proxy fights over democracy, human rights, and institutional power,” she said, noting a broader global pattern of industrial and political blocs pushing deregulation and weakening institutions designed to protect communities and ecosystems.

According to analyses by the Brazilian Academy of Sciences and other organizations, the provisions at issue will enable many projects to get permits by self-declaring compliance, without undergoing complete environmental impact assessments or third-party review.

Under the law, deforested properties or land cleared without a license can be retroactively legalized without restoring the land or ecological conditions, which rewards illegal deforestation. Larger projects, like irrigation, dams, and sanitation works, as well as roads and energy infrastructure, can proceed with minimal environmental scrutiny, risking more forest fragmentation and habitat destruction. And the licensing changes narrow who must be recognized and consulted during reviews, which could exclude communities without formal land titles.

A human rights issue

It’s alarming that the legislature overrode the vetoes, said Astrid Puentes Riaño, the United Nations special rapporteur on the human right to a healthy environment. As it stands now, the law may violate Brazil’s international environmental commitments, she added.

“What is at stake is [whether] Brazil, as a country, is able to effectively protect the environment, including all their fundamental resources,” she said.

She noted that Brazil is not facing this problem alone.

“I think that we, unfortunately, are seeing a wave of regressions globally toward weakened environmental impact assessments, because they’re seen as obstacles for development and investment,” she said.

But cutting reviews when science clearly shows that the planet is facing a “triple crisis of climate change, biodiversity loss, and toxic contamination” is a huge step in the wrong direction.

“Environmental impact assessments are not a checklist in a supermarket,” she said. “They are an essential element for states to prevent environmental, climate, human rights, and social impacts.”

She emphasized that weakening environmental review isn’t a technocratic tweak or political win for one side. It undermines the foundations of public health, Indigenous rights, and climate safety.

“This is not about politics, it’s about survival,” she said. “Some of these impacts on water, on air, on biodiversity, on people’s health, are irreversible. These are not things you can fix later.”

Climate backlash is scientifically unfounded

The fight over Brazil’s environmental licensing law is a microcosm of a broader tension in global climate policy: governments signal climate ambition at international meetings such as COP30 while doubling down on economic nationalism at home, claiming there is no money for climate action and instead financing measures to boost development and growth.

Claudio Angelo, with Brazilian NGO Observatório do Clima, said that this false-choice paradigm was “certainly an underlying theme” in the debates over the law.

“It has appeared in the speeches of most Congressmen who voted for the new legislation and to overturn Lula’s vetoes,” he said. “But, more worryingly, there was a lot of sheer disinformation.”

The two lobbying groups that pushed for the law that weakens environmental reviews repeatedly said that the existing licensing process is too slow and thus hampers economic progress. They claimed, without proof, that thousands of projects were stuck in the permitting process.

“But in the end, this may have been more about hubris than anything,” Angelo said. “Congress did that because it could. And because the private interests most Congressmen serve don’t want any regulation of any kind.”

Even without a complete analysis, it’s clear that cutting environmental reviews directly conflicts with Brazil’s climate plans by making it more difficult to stop deforestation.

Angelo expects some environmental groups will challenge the new law. Parts of it are subject to a 180-day waiting period, he said, so the final outcome is unclear. But a companion measure, passed as an executive order just this week, creates a fast-track permitting process for projects the government deems strategic, and it is effective immediately.

Puentes Riaño said recent advisory opinions from the International Court of Justice and the Inter-American Court of Human Rights make it clear that states must “use all means at their disposal to prevent actions that cause significant harm” to the Earth’s climate.

A growing body of research in ecological economics shows that such false choices are mainly a political narrative used by special interest groups to justify deregulation, despite evidence showing that degrading ecosystems undermines both climate goals and economic resilience.

Mainstream science and climate reports, including the Intergovernmental Panel on Climate Change’s Sixth Assessment Report and assessments by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services, directly contradict the idea that countries must choose between protecting ecosystems and achieving economic development.

The studies show that intact forests, healthy rivers, and secure Indigenous and local land rights are among the most effective and cost-efficient climate mitigation strategies available, delivering carbon sequestration, ecosystem resilience, public health and long-term economic stability. The IPCC explicitly recognizes community-led stewardship and ecosystem protection as core pillars of climate action, not afterthoughts.

Those scientific realities also underpinned the 2015 Paris Agreement, which ignited more public pressure for climate action, including youth-led mass demonstrations like the Fridays for Future marches that swelled in 2019. For a short time after COP21 in Paris, climate ambition rose worldwide, pushing governments to adopt stronger targets and framing ecosystems and community rights as essential to mitigation.

But the COVID-19 pandemic and Russia’s invasion of Ukraine unleashed overlapping economic shocks that reset priorities. Governments focused on energy security, food supply chains, and inflation, creating openings for industrial and agricultural lobbies to argue that environmental rules hindered economic recovery.

Those pressures dovetailed with persistent strains of economic nationalism and identity politics, strengthening political forces that frame environmental safeguards as constraints on sovereignty and growth. The result was global regulatory rollbacks, from the US and Canada to mining regions in Indonesia and Australia, each framed as necessary to speed development, stabilize supply chains, or serve economic self-interest.

Germany, for example, arrived at COP30 emphasizing its commitment to ambitious climate action. But weeks later, its new government under Chancellor Friedrich Merz pressed the European Union to weaken or delay the bloc’s 2035 phaseout of gas- and diesel-fueled cars.

The move mirrored Brazil’s own post-COP30 reversal. In both cases, political leaders under pressure from domestic industries framed their actions as necessary to defend national interests amid economic uncertainty.

Why it matters

Brazil’s post-COP30 shift toward deregulation in the name of economic development has far-reaching implications because Amazon forests influence global climate and weather patterns, circulating vast amounts of heat and water with effects that ripple far beyond the Amazon Basin.

Moisture from the rainforests creates a belt of rising, humid air that shapes rainfall patterns from the Andes to the US Gulf Coast. Research shows that when large areas of the Amazon are cleared or degraded, the system weakens, shifting precipitation patterns in ways that can amplify droughts in South America and intensify rainfall extremes elsewhere.

Drier Amazon conditions also warm the tropical Atlantic and can change winds that shape Atlantic hurricane formation, potentially boosting the frequency or intensity of storms that strike the Caribbean and North America. Research on long-distance links in the climate system shows that Amazon drying can also reduce summer rainfall across the US Midwest and Southern Plains, regions that depend on predictable precipitation for agriculture.

And the Amazon’s role as a critical carbon sink is also at risk. Its vegetation and soils store about 150 billion to 200 billion metric tons of carbon, equivalent to about 70 to 90 years of annual US fossil-fuel carbon dioxide emissions.

Brazil’s land-use sector is already one of the world’s largest sources of climate-warming pollution. Deforestation, fires, and forest degradation in the Amazon and Cerrado savanna account for 700 million to 800 million metric tons of climate-warming gases annually, equal to Germany’s yearly emissions.

Research shows that additional degradation enabled by the licensing law increases the risk of rainforest dieback, which could convert large tracts of rainforest to drier savanna-like conditions, pushing the region closer to a tipping point beyond which the Amazon would drive accelerated warming rather than helping to stabilize the climate.

Brazil’s reversal lands at a moment when the world can least afford mixed signals, and COP30 ended with Indigenous leaders warning that “our land is not for sale” and that “we can’t eat money,” reminding delegates that protecting forests is not an abstraction but a matter of survival.

Brazil’s decision to weaken environmental protections so soon after COP30 captures the larger crisis facing global climate policy: the widening gap between international promises and domestic political choices. And the Amazon can’t withstand much more waffling, Sax said.

“There is no planet B,” she said. “This is the fight.”

This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy and the environment.


Pompeii construction site confirms recipe for Roman concrete

Back in 2023, we reported on MIT scientists’ conclusion that the ancient Romans employed “hot mixing” with quicklime, among other strategies, to make their famous concrete, giving the material self-healing functionality. The only snag was that this didn’t match the recipe as described in historical texts. Now the same team is back with a fresh analysis of samples collected from a recently discovered site that confirms the Romans did indeed use hot mixing, according to a new paper published in the journal Nature Communications.

As we’ve reported previously, like today’s Portland cement (a basic ingredient of modern concrete), ancient Roman concrete was basically a mix of a semi-liquid mortar and aggregate. Portland cement is typically made by heating limestone and clay (as well as sandstone, ash, chalk, and iron) in a kiln. The resulting clinker is then ground into a fine powder with just a touch of added gypsum, which controls how quickly the cement sets. But the aggregate used to make Roman concrete was made up of fist-sized pieces of stone or bricks.

In his treatise De architectura (circa 30 BCE), the Roman architect and engineer Vitruvius wrote about how to build concrete walls for funerary structures that could endure for a long time without falling into ruin. He recommended the walls be at least two feet thick, made of either “squared red stone or of brick or lava laid in courses.” The brick or volcanic rock aggregate should be bound with mortar composed of hydrated lime and porous fragments of glass and crystals from volcanic eruptions (known as volcanic tephra).

Admir Masic, an environmental engineer at MIT, has studied ancient Roman concrete for several years. For instance, in 2019, Masic helped pioneer a new set of tools for analyzing Roman concrete samples from Privernum at multiple length scales—notably, Raman spectroscopy for chemical profiling and multi-detector energy dispersive spectroscopy (EDS) for phase mapping the material. Masic was also a co-author of a 2021 study analyzing samples of the ancient concrete used to build a 2,000-year-old mausoleum along the Appian Way in Rome known as the Tomb of Caecilia Metella, a noblewoman who lived in the first century CE.

And in 2023, Masic’s group analyzed samples taken from the concrete walls at Privernum, focusing on strange white mineral chunks known as “lime clasts,” which others had largely dismissed as resulting from subpar raw materials or poor mixing. Masic et al. concluded that was not the case. Rather, the Romans deliberately employed “hot mixing” with quicklime that gave the material self-healing functionality. When cracks begin to form in the concrete, they are more likely to move through the lime clasts. The clasts can then react with water, producing a solution saturated with calcium. That solution can either recrystallize as calcium carbonate to fill the cracks or react with the pozzolanic components to strengthen the composite material.
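The self-healing sequence described above boils down to a chain of well-known lime chemistry. As a simplified sketch (ignoring the slower pozzolanic side reactions with volcanic ash), the reactions are:

```latex
\begin{align}
\mathrm{CaO} + \mathrm{H_2O} &\rightarrow \mathrm{Ca(OH)_2}
  && \text{(quicklime hydrates exothermically during hot mixing)} \\
\mathrm{Ca(OH)_2} &\rightarrow \mathrm{Ca^{2+}} + 2\,\mathrm{OH^-}
  && \text{(water entering a crack dissolves the lime clast)} \\
\mathrm{Ca^{2+}} + \mathrm{CO_2} + 2\,\mathrm{OH^-} &\rightarrow \mathrm{CaCO_3} + \mathrm{H_2O}
  && \text{(calcium carbonate recrystallizes, sealing the crack)}
\end{align}
```

The heat released in the first step is what distinguishes “hot mixing” from the conventional use of pre-slaked lime, and it leaves behind the reactive lime clasts that fuel the later healing steps.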
