Science

a-japanese-lander-crashed-on-the-moon-after-losing-track-of-its-location

A Japanese lander crashed on the Moon after losing track of its location


“It’s not impossible, so how do we overcome our hurdles?”

Takeshi Hakamada, founder and CEO of ispace, attends a press conference in Tokyo on June 6, 2025, to announce the outcome of his company’s second lunar landing attempt. Credit: Kazuhiro Nogi/AFP via Getty Images

A robotic lander developed by a Japanese company named ispace plummeted to the Moon’s surface Thursday, destroying a small rover and several experiments intended to demonstrate how future missions could mine and harvest lunar resources.

Ground teams at ispace’s mission control center in Tokyo lost contact with the Resilience lunar lander moments before it was supposed to touch down in a region called Mare Frigoris, or the Sea of Cold, a basaltic plain in the Moon’s northern hemisphere.

A few hours later, ispace officials confirmed what many observers suspected. The mission was lost. It’s the second time ispace has failed to land on the Moon in as many tries.

“We wanted to make Mission 2 a success, but unfortunately we haven’t been able to land,” said Takeshi Hakamada, the company’s founder and CEO.

Ryo Ujiie, ispace’s chief technology officer, said the final data received from the Resilience lander—assuming it was correct—showed it at an altitude of approximately 630 feet (192 meters) and descending too fast for a safe landing. “The deceleration was not enough. That was a fact,” Ujiie told reporters in a press conference. “We failed to land, and we have to analyze the reasons.”

The company said in a press release that a laser rangefinder used to measure the lander’s altitude “experienced delays in obtaining valid measurement values.” The downward-facing laser fires light pulses toward the Moon during descent and clocks the time it takes to receive a reflection. Because the pulses travel at the speed of light, that round-trip time tells the lander’s guidance system how far it is above the lunar surface. But something went wrong in the altitude measurement system on Thursday.
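In principle, the altitude calculation itself is simple; the hard part is getting valid returns quickly enough during a fast descent. The sketch below is purely illustrative, assuming a generic time-of-flight altimeter with made-up timeout and plausibility checks; it is not ispace’s flight software, whose details have not been released.

    from typing import Optional

    C = 299_792_458.0  # speed of light, m/s

    def altitude_from_pulse(round_trip_s: float) -> float:
        """Halve the round-trip light time to get height above the surface."""
        return C * round_trip_s / 2.0

    def valid_reading(round_trip_s: Optional[float],
                      max_wait_s: float = 1e-3,        # assumed timeout, not ispace's value
                      max_altitude_m: float = 120_000.0) -> bool:
        """Reject returns that never arrived, arrived too late, or are implausible."""
        if round_trip_s is None or round_trip_s > max_wait_s:
            return False  # a delayed or missing return leaves guidance without an altitude fix
        return altitude_from_pulse(round_trip_s) <= max_altitude_m

    # A reflection received 1.28 microseconds after firing corresponds to roughly the
    # 192-meter altitude reported in the lander's final telemetry:
    print(round(altitude_from_pulse(1.28e-6), 1))  # ~191.9 m

A rangefinder that keeps failing such checks leaves the guidance system effectively blind, which is why delayed readings matter so much during a powered descent.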

“As a result, the lander was unable to decelerate sufficiently to reach the required speed for the planned lunar landing,” ispace said. “Based on these circumstances, it is currently assumed that the lander likely performed a hard landing on the lunar surface.”

Controllers sent a command to reboot the lander in hopes of reestablishing communication, but the Resilience spacecraft remained silent.

“Given that there is currently no prospect of a successful lunar landing, our top priority is to swiftly analyze the telemetry data we have obtained thus far and work diligently to identify the cause,” Hakamada said in a statement. “We will strive to restore trust by providing a report of the findings to our shareholders, payload customers, Hakuto-R partners, government officials, and all supporters of ispace.”

Overcoming obstacles

The Hakuto name harkens back to ispace’s origin in 2010 as a contender for the Google Lunar X-Prize, a competition that offered a $20 million grand prize to the first privately funded team to put a lander on the Moon. Hakamada’s group was called Hakuto, which means “white rabbit” in Japanese. The prize shut down in 2018 without a winner, leading some of the teams to dissolve or find new purpose. Hakamada stayed the course, raised more funding, and rebooted the program under the name Hakuto-R.

It’s a story of resilience, hence the name of ispace’s second lunar lander. The mission made it closer to the Moon than ispace’s first landing attempt in 2023, but Thursday’s failure is a blow to Hakamada’s project.

“As a fact, we tried twice and we haven’t been able to land on the Moon,” Hakamada said through an interpreter. “So we have to say it’s hard to land on the Moon, technically. We know it’s not easy. It’s not something that everyone can do. We know it’s hard, but the important point is it’s not impossible. The US private companies have succeeded in landing, and also JAXA in Japan has succeeded in landing, so it’s not impossible. So how do we overcome our hurdles?”

The Resilience lander and Tenacious rover, seen mounted near the top of the spacecraft, inside a test facility at the Tsukuba Space Center in Tsukuba, Ibaraki Prefecture, on Thursday, Sept. 12, 2024. Credit: Toru Hanai/Bloomberg via Getty Images

In April 2023, ispace’s first lander crashed on the Moon due to a similar altitude measurement problem. The spacecraft thought it was on the surface of the Moon, but was actually firing its engine to hover at an altitude of 3 miles (5 kilometers). The spacecraft ran out of fuel and went into a free fall before impacting the Moon.
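For a rough sense of why that was unsurvivable, a back-of-the-envelope estimate (assuming a drop from rest at 5 kilometers under lunar gravity and ignoring any remaining horizontal velocity) puts the impact speed well above 100 meters per second:

    from math import sqrt

    g_moon = 1.62    # lunar surface gravity, m/s^2
    height = 5_000   # assumed drop height in meters, starting from rest

    impact_speed = sqrt(2 * g_moon * height)  # ~127 m/s, roughly 460 km/h
    fall_time = sqrt(2 * height / g_moon)     # ~79 seconds of free fall
    print(f"{impact_speed:.0f} m/s after {fall_time:.0f} s")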

Engineers blamed software as the most likely reason for the altitude-measurement problem. During descent, ispace’s lander passed over a 10,000-foot-tall (3,000-meter) cliff, and the spacecraft’s computer interpreted the sudden altitude change as erroneous.

Ujiie, who leads ispace’s technical teams, said the failure mode Thursday was “similar” to that of the first mission two years ago. But at least in ispace’s preliminary data reviews, engineers saw different behavior from the Resilience lander, which flew with a new type of laser rangefinder after ispace’s previous supplier stopped producing the device.

“From Mission 1 to Mission 2, we improved the software,” Ujiie said. “Also, we improved how to approach the landing site… We see different phenomena from Mission 1, so we have to do more analysis to give you any concrete answers.”

Had ispace landed smoothly on Thursday, the Resilience spacecraft would have deployed a small rover developed by ispace’s European subsidiary. The rover was partially funded by the Luxembourg Space Agency with support from the European Space Agency. It carried a shovel to scoop up a small amount of lunar soil and a camera to take a photo of the sample. NASA had a contract with ispace to purchase the lunar soil in a symbolic proof of concept to show how the government might acquire material from commercial mining companies in the future.

The lander also carried a water electrolyzer experiment to demonstrate technologies that could split water molecules into hydrogen and oxygen, critical resources for a future Moon base. Other payloads aboard the Resilience spacecraft included cameras, a food production experiment, a radiation monitor, and a Swedish art project called “MoonHouse.”

The spacecraft chassis used for ispace’s first two landing attempts was about the size of a compact car, with a mass of about 1 metric ton (2,200 pounds) when fully fueled. The company’s third landing attempt is scheduled for 2027 with a larger lander. Next time, ispace will fly to the Moon through a partnership between the company’s US subsidiary and Draper Laboratory, which has a contract with NASA to deliver experiments to the lunar surface.

Track record

The Resilience lander launched in January on top of a SpaceX Falcon 9 rocket, riding to space in tandem with a commercial Moon lander named Blue Ghost from Firefly Aerospace. Firefly’s lander took a more direct journey to the Moon and achieved a soft landing on March 2. Blue Ghost operated on the lunar surface for two weeks and completed all of its objectives.

The trajectory of ispace’s lander was slower, following a lower-energy, more fuel-efficient path to the Moon before entering lunar orbit last month. Once in orbit, the lander made a few more course corrections to line up with its landing site, then commenced its final descent on Thursday.

Thursday’s landing attempt was the seventh time a privately developed Moon lander tried to conduct a controlled touchdown on the lunar surface.

Two Texas-based companies have had the most success. One of them, Houston-based Intuitive Machines, landed its Odysseus spacecraft on the Moon in February 2024, marking the first time a commercial lander reached the lunar surface intact. But the lander tipped over after touchdown, cutting its mission short after achieving some limited objectives. A second Intuitive Machines lander reached the Moon in one piece in March of this year, but it also fell over and didn’t last as long as the company’s first mission.

Firefly’s Blue Ghost operated for two weeks after reaching the lunar surface, accomplishing all of its objectives and becoming the first fully successful privately owned spacecraft to land and operate on the Moon.

Intuitive Machines, Firefly, and a third company—Astrobotic Technology—have launched their lunar missions under contract with a NASA program aimed at fostering a commercial marketplace for transportation to the Moon. Astrobotic’s first lander failed soon after its departure from Earth. The first two missions launched by ispace were almost fully private ventures, with limited participation from the Japanese space agency, Luxembourg, and NASA.

The Earth looms over the Moon’s horizon in this image from lunar orbit captured on May 27, 2025, by ispace’s Resilience lander. Credit: ispace

Commercial travel to the Moon only began in 2019, so there’s not much of a track record to judge the industry’s prospects. When NASA started signing contracts for commercial lunar missions, the then-head of the agency’s science division, Thomas Zurbuchen, estimated the initial landing attempts would have a 50-50 chance of success. On the whole, NASA’s experience with Intuitive Machines, Firefly, and Astrobotic isn’t too far off from Zurbuchen’s estimate, with one full success and a couple of partial successes.

The commercial track record worsens if you include private missions from ispace and Israel’s Beresheet lander.

But ispace and Hakamada haven’t given up on the dream. The company’s third mission will launch under the umbrella of the same NASA program that contracted with Intuitive Machines, Firefly, and Astrobotic. Hakamada cited the achievements of Firefly and Intuitive Machines as evidence that the commercial model for lunar missions is a valid one.

“The ones that have the landers, there are two companies I mentioned. Also, Blue Origin maybe coming up. Also, ispace is a possibility,” Hakamada said. “So, very few companies. We would like to catch up as soon as possible.”

It’s too early to know how the failure on Thursday might impact ispace’s next mission with Draper and NASA.

“I have to admit that we are behind,” said Jumpei Nozaki, director and chief financial officer at ispace. “But we do not really think we are behind from the leading group yet. It’s too early to decide that. The players in the world that can send landers to the Moon are very few, so we still have some competitive edge.”

“Honestly, there were some times I almost cried, but I need to lead this company, and I need to have a strong will to move forward, so it’s not time for me to cry,” Hakamada said.

Photo of Stephen Clark

Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.

A Japanese lander crashed on the Moon after losing track of its location Read More »

cambridge-mapping-project-solves-a-medieval-murder

Cambridge mapping project solves a medieval murder


“A tale of shakedowns, sex, and vengeance that expose[s] tensions between the church and England’s elite.”

Location of the murder of John Forde, taken from the Medieval Murder Maps. Credit: Medieval Murder Maps. University of Cambridge: Institute of Criminology

In 2019, we told you about a new interactive digital “murder map” of London compiled by University of Cambridge criminologist Manuel Eisner. Drawing on data catalogued in the city coroners’ rolls, the map showed the approximate location of 142 homicide cases in late medieval London. The Medieval Murder Maps project has since expanded to include maps of York and Oxford homicides, as well as podcast episodes focusing on individual cases.

It’s easy to lose oneself down the rabbit hole of medieval murder for hours, filtering the killings by year, choice of weapon, and location. Think of it as a kind of 14th-century version of Clue: It was the noblewoman’s hired assassins armed with daggers in the streets of Cheapside near St. Paul’s Cathedral. And that’s just the juiciest of the various cases described in a new paper published in the journal Criminal Law Forum.

The noblewoman was Ela Fitzpayne, wife of a knight named Sir Robert Fitzpayne, lord of Stogursey. The victim was a priest and her erstwhile lover, John Forde, who was stabbed to death in the streets of Cheapside on May 3, 1337. “We are looking at a murder commissioned by a leading figure of the English aristocracy,” said University of Cambridge criminologist Manuel Eisner, who heads the Medieval Murder Maps project. “It is planned and cold-blooded, with a family member and close associates carrying it out, all of which suggests a revenge motive.”

Members of the mapping project geocoded all the cases after determining approximate locations for the crime scenes. Written in Latin, the coroners’ rolls are records of sudden or suspicious deaths as investigated by a jury of local men, called together by the coroner to establish facts and reach a verdict. Those records contain such relevant information as where the body was found and by whom; the nature of the wounds; the jury’s verdict on cause of death; the weapon used and how much it was worth; the time, location, and witness accounts; whether the perpetrator was arrested, escaped, or sought sanctuary; and any legal measures taken.

A brazen killing

The murder of Forde was one of several premeditated revenge killings recorded in the area of Westcheap. Forde was walking on the street when another priest, Hascup Neville, caught up to him, ostensibly for a casual chat, just after Vespers but before sunset. As they approached Foster Lane, Neville’s four co-conspirators attacked: Ela Fitzpayne’s brother, Hugh Lovell; two of her former servants, Hugh of Colne and John Strong; and a man called John of Tindale. One of them cut Forde’s throat with a 12-inch dagger, while two others stabbed him in the stomach with long fighting knives.

At the inquest, the jury identified the assassins, but that didn’t result in justice. “Despite naming the killers and clear knowledge of the instigator, when it comes to pursuing the perpetrators, the jury turn a blind eye,” said Eisner. “A household of the highest nobility, and apparently no one knows where they are to bring them to trial. They claim Ela’s brother has no belongings to confiscate. All implausible. This was typical of the class-based justice of the day.”

Colne, the former servant, was eventually charged and imprisoned for the crime some five years later in 1342, but the other perpetrators essentially got away with it.

Eisner et al. uncovered additional historical records that shed more light on the complicated history and ensuing feud between the Fitzpaynes and Forde. One was an indictment in the Calendar of Patent Rolls of Edward III, detailing how Ela, her husband, and Forde, along with several other accomplices, raided a Benedictine priory in 1321. Among other crimes, the intruders “broke [the prior’s] houses, chests and gates, took away a horse, a colt and a boar… felled his trees, dug in his quarry, and carried away the stone and trees.” The gang also stole 18 oxen, 30 pigs, and about 200 sheep and lambs.

There were also letters that the Archbishop of Canterbury wrote to the Bishop of Winchester. Translations of the letters are published for the first time on the project’s website. The archbishop called out Ela by name for her many sins, including adultery “with knights and others, single and married, and even with clerics and holy orders,” and devised a punishment. This included not wearing any gold, pearls, or precious stones and giving money to the poor and to monasteries, plus a dash of public humiliation. Ela was ordered to perform a “walk of shame”—a tamer version than Cersei’s walk in Game of Thrones—every fall for seven years, carrying a four-pound wax candle to the altar of Salisbury Cathedral.

The London Archives. Inquest number 15 on 1336-7 City of London Coroner’s Rolls. Credit: The London Archives

Ela outright refused to do any of that, instead flaunting “her usual insolence.” Naturally, the archbishop had no choice but to excommunicate her. But Eisner speculates that this may have festered within Ela over the ensuing years, thereby sparking her desire for vengeance on Forde—who may have confessed to his affair with Ela to avoid being prosecuted for the 1321 raid. The archbishop died in 1333, four years before Forde’s murder, so Ela was clearly a formidable person with the patience and discipline to serve her revenge dish cold. Her marriage to Robert (her second husband) endured despite her seemingly constant infidelity, and she inherited his property when he died in 1354.

“Attempts to publicly humiliate Ela Fitzpayne may have been part of a political game, as the church used morality to stamp its authority on the nobility, with John Forde caught between masters,” said Eisner. “Taken together, these records suggest a tale of shakedowns, sex, and vengeance that expose tensions between the church and England’s elites, culminating in a mafia-style assassination of a fallen man of god by a gang of medieval hitmen.”

I, for one, am here for the Netflix true crime documentary on Ela Fitzpayne, “a woman in 14th century England who raided priories, openly defied the Archbishop of Canterbury, and planned the assassination of a priest,” per Eisner.

The role of public spaces

The ultimate objective of the Medieval Murder Maps project is to learn more about how public spaces shaped urban violence historically, the authors said. There were some interesting initial revelations back in 2019. For instance, the murders usually occurred in public streets or squares, and Eisner identified a couple of “hot spots” with higher concentrations than other parts of London. One was that particular stretch of Cheapside running from St Mary-le-Bow church to St. Paul’s Cathedral, where John Forde met his grisly end. The other was a triangular area spanning Gracechurch, Lombard, and Cornhill, radiating out from Leadenhall Market.

The perpetrators were mostly men (in only four cases were women the only suspects). As for weapons, knives and swords of varying types were the ones most frequently used, accounting for 68 percent of all the murders. The greatest risk of violent death in London was on weekends (especially Sundays), between early evening and the first few hours after curfew.

Eisner et al. have now extended their spatial analysis to include homicides committed in York and Oxford in the 14th century with similar conclusions. Murders most often took place in markets, squares, and thoroughfares—all key nodes of medieval urban life—in the evenings or on weekends. Oxford had significantly higher murder rates than York or London and also more organized group violence, “suggestive of high levels of social disorganization and impunity.” London, meanwhile, showed distinct clusters of homicides, “which reflect differences in economic and social functions,” the authors wrote. “In all three cities, some homicides were committed in spaces of high visibility and symbolic significance.”

Criminal Law Forum, 2025. DOI: 10.1007/s10609-025-09512-7  (About DOIs).

Photo of Jennifer Ouellette

Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

Cambridge mapping project solves a medieval murder Read More »

startup-puts-a-logical-qubit-in-a-single-piece-of-hardware

Startup puts a logical qubit in a single piece of hardware

A bit over a year ago, Nord Quantique showed that a similar setup could identify the most common form of error in these devices, one in which the system loses one of its photons. “We can store multiple microwave photons into each of these cavities, and the fact that we have redundancy in the system comes exactly from this,” said Nord Quantique’s CTO, Julien Camirand Lemyre. However, this system was unable to handle many of the less common errors that might also occur.

This time around, the company is showing that it can get an actual logical qubit into a variant of the same hardware. In the earlier version of its equipment, the resonator cavity had a single post and supported a single frequency. In the newer iteration, there were two posts and two frequencies. Each of those frequencies creates its own quantum resonator in the same cavity, with its own set of modes. “It’s this ensemble of photons inside this cavity that creates the logical qubit,” Lemyre told Ars.

The additional quantum information that can now be stored in the system enables it to identify more complex errors than the loss of a photon.

Catching, but not fixing errors

The company did two experiments with this new hardware. First, it ran multiple rounds of error detection on data stored in the logical qubit, essentially testing its ability to act like a quantum memory and retain the information stored there. Without correcting errors, the system rapidly decayed, with an error probability in each round of measurement of about 12 percent. By the time the system reached the 25th measurement, almost every instance had already encountered an error.
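Those two numbers are consistent. Assuming, as a simplification, that each round has an independent 12 percent chance of introducing an error, only about 4 percent of instances would still be error-free after 25 rounds:

    p_error = 0.12   # reported per-round error probability
    rounds = 25
    p_clean = (1 - p_error) ** rounds   # assumes errors are independent across rounds
    print(f"{p_clean:.1%}")             # ~4.1% of instances survive with no error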

The second time through, the company repeated the process, discarding any instances in which an error occurred. In almost every instance, that meant the results were discarded long before they got through two dozen rounds of measurement. But at these later stages, none of the remaining instances were in an erroneous state. That indicates that a successful correction of the errors—something the team didn’t try—would be able to fix all the detected problems.

Startup puts a logical qubit in a single piece of hardware Read More »

what-solar?-what-wind?-texas-data-centers-build-their-own-gas-power-plants

What solar? What wind? Texas data centers build their own gas power plants


Data center operators are turning away from the grid to build their own power plants.

Sisters Abigail and Jennifer Lindsey stand on their rural property on May 27 outside New Braunfels, Texas, where they posted a sign in opposition to a large data center and power plant planned across the street. Credit: Dylan Baddour/Inside Climate News

NEW BRAUNFELS, Texas—Abigail Lindsey worries the days of peace and quiet might be nearing an end at the rural, wooded property where she lives with her son. On the old ranch across the street, developers want to build an expansive complex of supercomputers for artificial intelligence, plus a large, private power plant to run it.

The plant would be big enough to power a major city, with 1,200 megawatts of planned generation capacity fueled by West Texas shale gas. It will only supply the new data center, and possibly other large data centers recently proposed, down the road.

“It just sucks,” Lindsey said, sitting on her deck in the shade of tall oak trees, outside the city of New Braunfels. “They’ve come in and will completely destroy our way of life: dark skies, quiet and peaceful.”

The project is one of many like it proposed in Texas, where a frantic race to boot up energy-hungry data centers has led many developers to plan their own gas-fired power plants rather than wait for connection to the state’s public grid. Egged on by supportive government policies, this buildout promises to lock in strong gas demand for a generation to come.

The data center and power plant planned across from Lindsey’s home is a partnership between an AI startup called CloudBurst and the natural gas pipeline giant Energy Transfer. It was Energy Transfer’s first-ever contract to supply gas for a data center, but it is unlikely to be its last. In a press release, the company said it was “in discussions with a number of data center developers and expects this to be the first of many agreements.”

Previously, conventional wisdom assumed that this new generation of digital infrastructure would be powered by emissions-free energy sources like wind, solar and battery power, which have lately seen explosive growth. So far, that vision isn’t panning out, as desires to build quickly overcome concerns about sustainability.

“There is such a shortage of data center capacity and power,” said Kent Draper, chief commercial officer at Australian data center developer IREN, which has projects in West Texas. “Even the large hyperscalers are willing to turn a blind eye to their renewable goals for some period of time in order to get access.”

The Hays Energy Project is a 990 MW gas-fired power plant near San Marcos, Texas. Credit: Dylan Baddour/Inside Climate News

IREN prioritizes renewable energy for its data centers—giant warehouses full of advanced computers and high-powered cooling systems that can be configured to produce cryptocurrency or generate artificial intelligence. In Texas, that’s only possible because the company began work here years ago, early enough to secure a timely connection to the state’s grid, Draper said.

There were more than 2,000 active generation interconnection requests as of April 30, totaling 411,600 MW of capacity, according to grid operator ERCOT. A bill awaiting signature on Gov. Greg Abbott’s desk, S.B. 6, looks to filter out unserious large-load projects bloating the queue by imposing a $100,000 fee for interconnection studies.

Wind and solar farms require vast acreage and generate energy intermittently, so they work best as part of a diversified electrical grid that collectively provides power day and night. But as the AI gold rush gathered momentum, a surge of new project proposals has created years-long wait times to connect to the grid, prompting many developers to bypass it and build their own power supply.

Operating alone, a wind or solar farm can’t run a data center. Battery technologies still can’t store such large amounts of energy for the length of time required to provide steady, uninterrupted power for 24 hours per day, as data centers require. Small nuclear reactors have been touted as a means to meet data center demand, but the first new units remain a decade from commercial deployment, while the AI boom is here today.
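The arithmetic behind that storage constraint is stark. Using the New Braunfels project’s planned 1,200 MW as an illustrative load, riding out a single 24-hour period on batteries alone would take tens of gigawatt-hours of storage, far beyond what individual utility-scale battery installations provide today:

    load_mw = 1_200   # planned generation capacity for the New Braunfels project
    hours = 24        # period of steady supply a data center requires
    energy_mwh = load_mw * hours
    print(f"{energy_mwh:,} MWh, or {energy_mwh / 1_000:.1f} GWh, of storage")
    # 28,800 MWh; today's large grid batteries typically store hundreds of MWh
    # to a few GWh, so this is a rough order-of-magnitude gap, not a close call.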

Now, Draper said, gas companies approach IREN all the time, offering to quickly provide additional power generation.

Gas provides almost half of all power generation capacity in Texas, far more than any other source. But the amount of gas power in Texas has remained flat for 20 years, while wind and solar have grown sharply, according to records from the US Energy Information Administration. Facing a tidal wave of proposed AI projects, state lawmakers have taken steps to try to slow the expansion of renewable energy and position gas as the predominant supply for a new era of demand.

This buildout promises strong demand and high gas prices for a generation to come, a boon to Texas’ fossil fuel industry, the largest in the nation. It also means more air pollution and emissions of planet-warming greenhouse gases, even as the world continues to barrel past temperature records.

Texas, home to 9 percent of the US population, accounts for about 15 percent of the country’s current gas-powered generation capacity but 26 percent of its planned future generation as of the end of 2024, according to data from Global Energy Monitor. Both the current and planned shares are far larger than those of any other state.

GEM identified 42 new gas turbine projects under construction, in development, or announced in Texas before the start of this year. None of those projects are sited at data centers. However, other projects announced since then, like CloudBurst and Energy Transfer outside New Braunfels, will include dedicated gas power plants on site at data centers.

For gas companies, the boom in artificial intelligence has quickly become an unexpected gold mine. US gas production has risen steadily over 20 years since the fracking boom began, but gas prices have tumbled since 2024, dragged down by surging supply and weak demand.

“The sudden emergence of data center demand further brightens the outlook for the renaissance in gas pricing,” said a 2025 oil and gas outlook report by East Daley Analytics, a Colorado-based energy intelligence firm. “The obvious benefit to producers is increased drilling opportunities.”

It forecast up to a 20 percent increase in US gas production by 2030, driven primarily by a growing gas export sector on the Gulf Coast. Several large export projects will finish construction in the coming years, with demand for up to 12 billion cubic feet of gas per day, the report said, while new power generation for data centers would account for 7 billion cubic feet per day of additional demand. That means profits for power providers, but also higher costs for consumers.

Natural gas, a mixture primarily composed of methane, burns much cleaner than coal but still creates air pollution, including soot, some hazardous chemicals, and greenhouse gases. Unburned methane released into the atmosphere has more than 80 times the near-term warming effect of carbon dioxide, leading some studies to conclude that ubiquitous leaks in gas supply infrastructure make it as impactful as coal to the global climate.

Credit: Dylan Baddour/Inside Climate News

It’s a power source that’s heralded for its ability to get online fast, said Ed Hirs, an energy economics lecturer at the University of Houston. But the years-long wait times for turbines have quickly become the industry’s largest constraint in an otherwise positive outlook.

“If you’re looking at a five-year lead time, that’s not going to help Alexa or Siri today,” Hirs said.

The reliance on gas power for data centers is a departure from previous thought, said Larry Fink, founder of global investment firm BlackRock, speaking to a crowd of industry executives at an oil and gas conference in Houston in March.

About four years ago, he recounted, anyone building a data center insisted it be powered by renewables. Two years ago, that had softened to a preference.

“Today?” Fink said. “They care about power.”

Gas plants for data centers

Since the start of this year, developers have announced a flurry of gas power deals for data centers. In the small city of Abilene, the builders of Stargate, one of the world’s largest data center projects, applied for permits in January to build 360 MW of gas power generation, authorized to emit 1.6 million tons of greenhouse gases and 14 tons of hazardous air pollutants per year. Later, the company announced the acquisition of an additional 4,500 MW of gas power generation capacity.

Also in January, a startup called Sailfish announced ambitious plans for a 2,600-acre, 5,000 MW cluster of data centers in the tiny North Texas town of Tolar, population 940.

“Traditional grid interconnections simply can’t keep pace with hyperscalers’ power demands, especially as AI accelerates energy requirements,” Sailfish founder Ryan Hughes told the website Data Center Dynamics at the time. “Our on-site natural gas power islands will let customers scale quickly.”

CloudBurst and Energy Transfer announced their data center and power plant outside New Braunfels in February, and another company partnership also announced plans for a 250 MW gas plant and data center near Odessa in West Texas. In May, a developer called Tract announced a 1,500-acre, 2,000 MW data center campus with some on-site generation and some purchased gas power near the small Central Texas town of Lockhart.

Not all new data centers need gas plants. A 120 MW South Texas data center project announced in April would use entirely wind power, while an enormous, 5,000 MW megaproject outside Laredo announced in March hopes to eventually run entirely on private wind, solar, and hydrogen power (though it will use gas at first). Another collection of six data centers planned in North Texas hopes to draw 1,400 MW from the grid.

Altogether, Texas’ grid operator predicts statewide power demand will nearly double within five years, driven largely by data centers for artificial intelligence. It mirrors a similar situation unfolding across the country, according to analysis by S&P Global.

“There is huge concern about the carbon footprint of this stuff,” said Dan Stanzione, executive director of the Texas Advanced Computing Center at the University of Texas at Austin. “If we could decarbonize the power grid, then there is no carbon footprint for this.”

However, despite massive recent expansions of renewable power generation, the boom in artificial intelligence appears to be moving the country farther from, not closer to, its decarbonization goals.

Restrictions on renewable energy

Anticipating a buildout of power supply, state lawmakers have proposed or passed new rules to support the deployment of more gas generation and slow the surging expansion of wind and solar power projects. Supporters of these bills say they aim to capitalize on Texas’ position as the nation’s top gas producer.

Some energy experts say the rules proposed throughout the legislative session could dismantle the state’s leadership in renewables as well as the state’s ability to provide cheap and reliable power.

“It absolutely would [slow] if not completely stop renewable energy,” said Doug Lewin, a Texas energy consultant, about one of the proposed rules in March. “That would really be extremely harmful to the Texas economy.”

While the bills deemed “industry killers” for renewables missed key deadlines, failing to reach Abbott’s desk, they illustrate some lawmakers’ aspirations for the state’s energy industry.

One failed bill, S.B. 388, would have required every watt of new solar brought online to be accompanied by a watt of new gas. Another set of twin bills, H.B. 3356 and S.B. 715, would have forced existing wind and solar companies to buy fossil-fuel based power or connect to a battery storage resource to cover the hours the energy plants are not operating.

When the Legislature last met in 2023, it created a $5 billion public “energy fund” to finance new gas plants but not wind or solar farms. It also created a new tax abatement program that excluded wind and solar. This year’s budget added another $5 billion to double the fund.

Bluebonnet Electric Cooperative is currently completing construction on a 190 MW gas-fired peaker plant near the town of Maxwell in Caldwell County. Credit: Dylan Baddour/Inside Climate News

Among the lawmakers leading the effort to scale back the state’s deployment of renewables is state Sen. Lois Kolkhorst, a Republican from Brenham. One bill she co-sponsored, S.B. 819, aimed to create new siting rules for utility-scale renewable projects and would have required them to get permits from the Public Utility Commission that no other energy source—coal, gas or nuclear—needs. “It’s just something that is clearly meant to kneecap an industry,” Lewin said about the bill, which failed to pass.

Kolkhorst said the bill sought to balance the state’s need for power while respecting landowners across the state.

Former state Rep. John Davis, now a board member at Conservative Texans for Energy Innovation, said the session shows how renewables have become a red meat issue.

More than 20 years ago, Davis and Kolkhorst worked together in the Capitol as Texas deregulated its energy market, which encouraged renewables to enter the grid’s mix, he said. Now Davis herds sheep and goats on his family’s West Texas ranch, where seven wind turbines provide roughly 40 percent of their income.

He never could have dreamed how significant renewable energy would become for the state grid, he said. That’s why he’s disappointed with the direction the legislature is headed with renewables.

“I can’t think of anything more conservative, as a conservative, than wind and solar,” Davis said. “These are things God gave us—use them and harness them.”

A report published in April found that targeted limitations on solar and wind development in Texas could increase electricity costs for consumers and businesses. The report, prepared by Aurora Energy Research for the Texas Association of Business, said restricting the further deployment of renewables would drive power prices up 14 percent by 2035.

“Texas is at a crossroads in its energy future,” said Olivier Beaufils, a top executive at Aurora Energy Research. “We need policies that support an all-of-the-above approach to meet the expected surge in power demand.”

Likewise, the commercial intelligence firm Wood Mackenzie expects the power demand from data centers to drive up prices of gas and wholesale consumer electricity.

Pollution from gas plants

Even when new power plants aren’t built on the site of data centers, they might still be developed because of demand from the server farms.

For example, in 2023, developer Marathon Digital started up a Bitcoin mine in the small town of Granbury on the site of the 1,100 MW Wolf Hollow II gas power plant. It held contracts to purchase 300 MW from the plant.

One year later, the power plant operator sought permits to install eight additional “peaker” gas turbines able to produce up to 352 MW of electricity. These small units, designed to turn on intermittently during hours of peak demand, release more pollution than typical gas turbines.

Those additional units would be approved to release 796,000 tons per year of greenhouse gases, 251 tons per year of nitrogen oxides and 56 tons per year of soot, according to permitting documents. That application is currently facing challenges from neighboring residents in state administrative courts.

About 150 miles away, neighbors are challenging another gas plant permit application in the tiny town of Blue. At 1,200 MW, the $1.2 billion plant proposed by Sandow Lakes Energy Co. would be among the largest in the state and would almost entirely serve private customers, likely including the large data centers that operate about 20 miles away.

Travis Brown and Hugh Brown, no relation, stand by a sign marking the site of a proposed 1,200 MW gas-fired power plant in their town of Blue on May 7. Credit: Dylan Baddour/Inside Climate News

This plan bothers Hugh Brown, who moved out to these green, rolling hills of rural Lee County in 1975, searching for solitude. Now he lives on 153 wooded acres that he’s turned into a sanctuary for wildlife.

“What I’ve had here is a quiet, thoughtful life,” said Brown, skinny with a long grey beard. “I like not hearing what anyone else is doing.”

He worries about the constant roar of giant cooling fans, the bright lights overnight and the air pollution. According to permitting documents, the power plant would be authorized to emit 462 tons per year of ammonia gas, 254 tons per year of nitrogen oxides, 153 tons per year of particulate matter, or soot, and almost 18 tons per year of “hazardous air pollutants,” a collection of chemicals that are known to cause cancer or other serious health impacts.

It would also be authorized to emit 3.9 million tons of greenhouse gases per year, about as much as 72,000 standard passenger vehicles.

“It would be horrendous,” Brown said. “There will be a constant roaring of gigantic fans.”

In a statement, Sandow Lakes Energy denied that the power plant will be loud. “The sound level at the nearest property line will be similar to a quiet library,” the statement said.

Sandow Lakes Energy said the plant will support the local tax base and provide hundreds of temporary construction jobs and dozens of permanent jobs. Sandow also provided several letters signed by area residents who support the plant.

“We recognize the critical need for reliable, efficient, and environmentally responsible energy production to support our region’s growth and economic development,” wrote Nathan Bland, president of the municipal development district in Rockdale, about 20 miles from the project site.

Brown stands next to a pond on his property ringed with cypress trees he planted 30 years ago. Credit: Dylan Baddour/Inside Climate News

Sandow says the plant will be connected to Texas’ public grid, and many supporting letters for the project cited a need for grid reliability. But according to permitting documents, the 1,200 MW plant will supply only 80 MW to the grid and only temporarily, with the rest going to private customers.

“Electricity will continue to be sold to the public until all of the private customers have completed projects slated to accept the power being generated,” said a permit review by the Texas Commission on Environmental Quality.

Sandow has declined to name those customers. However, the plant is part of Sandow’s massive, master-planned mixed-use development in rural Lee and Milam counties, where several energy-hungry tenants are already operating, including Riot Platforms, the largest cryptocurrency mine on the continent. The seven-building complex in Rockdale is built to use up to 700 MW, and in April, it announced the acquisition of a neighboring, 125 MW cryptocurrency mine, previously operated by Rhodium. Another mine by Bitmain, also one of the world’s largest Bitcoin companies, has 560 MW of operating capacity with plans to add 180 MW more in 2026.

In April, residents of Blue gathered at the volunteer fire department building for a public meeting with Texas regulators and Sandow to discuss questions and concerns over the project. Brown, owner of the wildlife sanctuary, spoke into a microphone and noted that the power plant was placed at the far edge of Sandow’s 33,000-acre development, 20 miles from the industrial complex in Rockdale but near many homes in Blue.

“You don’t want to put it up into the middle of your property where you could deal with the negative consequences,” Brown said, speaking to the developers. “So it looks to me like you are wanting to make money, in the process of which you want to strew grief in your path and make us bear the environmental costs of your profit.”

Inside Climate News’ Peter Aldhous contributed to this report.

This story originally appeared on Inside Climate News.

Photo of Inside Climate News

What solar? What wind? Texas data centers build their own gas power plants Read More »

us-science-is-being-wrecked,-and-its-leadership-is-fighting-the-last-war

US science is being wrecked, and its leadership is fighting the last war


Facing an extreme budget, the National Academies hosted an event that ignored it.

WASHINGTON, DC—The general outline of the Trump administration’s proposed 2026 budget was released a few weeks back, and it included massive cuts for most agencies, including every one that funds scientific research. Late last week, those agencies began releasing details of what the cuts would mean for the actual projects and people they support. And the results are as bad as the initial budget had suggested: one-of-a-kind scientific experiment facilities and hardware retired, massive cuts in supported scientists, and entire areas of research halted.

And this comes in an environment where previously funded grants are being terminated, funding is being held up for ideological screening, and universities have been subjected to arbitrary funding freezes. Collectively, things are heading for damage to US science that will take decades to recover from. It’s a radical break from the trajectory science had been on.

That’s the environment that the US’s National Academies of Science found itself in yesterday while hosting the State of the Science event in Washington, DC. It was an obvious opportunity for the nation’s leading scientific organization to warn the nation of the consequences of the path that the current administration has been traveling. Instead, the event largely ignored the present to worry about a future that may never exist.

The proposed cuts

The top-line budget numbers proposed earlier indicated things would be bad: nearly 40 percent taken off the National Institutes of Health’s budget, the National Science Foundation down by over half. But now, many of the details of what those cuts mean are becoming apparent.

NASA’s budget includes sharp cuts for planetary science, which would be cut in half and then stay flat for the rest of the decade, with the Mars Sample Return mission canceled. All other science budgets, including Earth Science and Astrophysics, take similar hits; one astronomer posted a graphic showing how many present and future missions that would mean. Active missions that have returned unprecedented data, like Juno and New Horizons, would go, as would two Mars orbiters. As described by Science magazine’s news team, “The plans would also kill off nearly every major science mission the agency has not yet begun to build.”

A chart prepared by astronomer Laura Lopez showing just how many astrophysics missions will be cancelled. Credit: Laura Lopez

The National Science Foundation, which funds much of the US’s fundamental research, is also set for brutal cuts. Biology, engineering, and education will all be slashed by over 70 percent; computer science, math and physical science, and social and behavioral science will all see cuts of over 60 percent. International programs will take an 80 percent cut. The funding rate of grant proposals is expected to drop from 26 percent to just 7 percent, meaning the vast majority of grants submitted to the NSF will be a waste of time. The number of people involved in NSF-funded activities will drop from over 300,000 to just 90,000. Almost every program to broaden participation in science will be eliminated.

As for specifics, they’re equally grim. The fleet of research ships will essentially become someone else’s problem: “The FY 2026 Budget Request will enable partial support of some ships.” We’ve been able to better pin down the nature and location of gravitational wave events as detectors in Japan and Italy joined the original two LIGO detectors; the NSF will reverse that progress by shutting one of the LIGOs. The NSF’s contributions to detectors at the Large Hadron Collider will be cut by over half, and one of the two very large telescopes it was helping fund will be cancelled (say goodbye to the Thirty Meter Telescope). “Access to the telescopes at Kitt Peak and Cerro Tololo will be phased out,” and the NSF will transfer the facilities to other organizations.

The Department of Health and Human Services has been less detailed about the specific cuts its divisions will see, largely focusing on the overall numbers, which are down considerably. The NIH, which is facing a cut of over 40 percent, will be reorganized, with its 19 institutes pared down to just eight. This will result in some odd pairings, such as the dental and eye institutes ending up in the same place; genomics and biomedical imaging will likewise end up under the same roof. Other groups like the Centers for Disease Control and Prevention and the Food and Drug Administration will also face major cuts.

Issues go well beyond the core science agencies, as well. In the Department of Energy, funding for wind, solar, and renewable grid integration has been zeroed out, essentially ending all programs in this area. Hydrogen and fuel cells face a similar fate. Collectively, these programs had received over $600 million in 2024’s budget. Other areas of science at the DOE, such as high-energy physics, fusion, and biology, receive relatively minor cuts that are largely in line with the ones faced by administration priorities like fossil and nuclear energy.

Will this happen?

It goes without saying that this would amount to an abandonment of US scientific leadership at a time when most estimates of China’s research spending show it approaching US-like levels of support. Not only would it eliminate many key facilities, instruments, and institutions that have helped make the US a scientific powerhouse, but it would also block the development of newer and additional ones. The harms are so widespread that even topics that the administration claims are priorities would see severe cuts.

And the damage is likely to last for generations, as support is cut at every stage of the educational pipeline that prepares people for STEM careers. This includes careers in high-tech industries, which may require relocation overseas due to a combination of staffing concerns and heightened immigration controls.

That said, we’ve been here before in the first Trump administration, when budgets were proposed with potentially catastrophic implications for US science. But Congress limited the damage and maintained reasonably consistent budgets for most agencies.

Can we expect that to happen again? So far, the signs are not especially promising. The House has largely adopted the Trump administration’s budget priorities, despite the fact that the budget they pass turns its back on decades of supposed concerns about deficit spending. While the Senate has yet to take up the budget, it has also been very pliant during the second Trump administration, approving grossly unqualified cabinet picks such as Robert F. Kennedy Jr.

All of which would seem to call for the leadership of US science organizations to press the case for the importance of science funding to the US and highlight the damage that these cuts would cause. But, if yesterday’s National Academies event is anything to judge by, the leadership is not especially interested.

Altered states

As the nation’s premier science organization, and one that performs lots of analyses for the government, the National Academies would seem to be in a position to have its concerns taken seriously by members of Congress. And, given that the present and future of science in the US is being set by policy choices, a meeting entitled the State of the Science would seem like the obvious place to address those concerns.

If so, it was not obvious to Marcia McNutt, the president of the NAS, who gave the presentation. She made some oblique references to current problems, saying, “We are embarking on a radical new experiment in what conditions promote science leadership, with the US being the treatment group, and China as the control,” and acknowledged that “uncertainties over the science budgets for next year, coupled with cancellations of billions of dollars of already hard-won research grants, is causing an exodus of researchers.”

But her primary focus was on the trends that have been operative in science funding and policy leading up to but excluding the second Trump administration. McNutt suggested this was needed to look beyond the next four years. However, that ignores the obvious fact that US science will be fundamentally different if the Trump administration can follow through on its plans and policies; the trends that have been present for the last two decades will be irrelevant.

She was also remarkably selective about her avoidance of discussing Trump administration priorities. After noting that faculty surveys have suggested they spend roughly 40 percent of their time handling regulatory requirements, she twice mentioned that the administration’s anti-regulatory stance could be a net positive here (once calling it “an opportunity to help”). Yet she neglected to note that many of the abandoned regulations represent a retreat from science-driven policy.

McNutt also acknowledged the problem of science losing the bipartisan support it has enjoyed, as trust in scientists among US conservatives has been on a downward trend. But she suggested it was scientists’ responsibility to fix the problem, even though it’s largely the product of one party deciding it can gain partisan advantage by raising doubts about scientific findings in fields like climate change and vaccine safety.

The panel discussion that came after largely followed McNutt’s lead in avoiding any mention of the current threats to science. The lone exception was Heather Wilson, president of the University of Texas at El Paso and a former Republican member of the House of Representatives and secretary of the Air Force during the first Trump administration. Wilson took direct aim at Trump’s cuts to funding for underrepresented groups, arguing, “Talent is evenly distributed, but opportunity is not.” After arguing that “the moral authority of science depends on the pursuit of truth,” she highlighted the cancellation of grants that had been used to study diseases that are more prevalent in some ethnic groups, saying “that’s not woke science—that’s genetics.”

Wilson was clearly the exception, however, as the rest of the panel largely avoided direct mention of either the damage already done to US science funding or the impending catastrophe on the horizon. We’ve asked the National Academies’ leadership a number of questions about how it perceives its role at a time when US science is clearly under threat. As of this article’s publication, however, we have not received a response.

At yesterday’s event, however, only one person showed a clear sense of what they thought that role should be—Wilson again, whose strongest words were directed at the National Academies themselves, which she said should “do what you’ve done since Lincoln was president,” and stand up for the truth.

Photo of John Timmer

John is Ars Technica’s science editor. He has a Bachelor of Arts in Biochemistry from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. When physically separated from his keyboard, he tends to seek out a bicycle, or a scenic location for communing with his hiking boots.

US science is being wrecked, and its leadership is fighting the last war Read More »

science-phds-face-a-challenging-and-uncertain-future

Science PhDs face a challenging and uncertain future


Smaller post-grad classes are likely due to research budget cuts.

Credit: Thomas Barwick/Stone via Getty Images

Since the National Science Foundation first started collecting postgraduation data nearly 70 years ago, the number of PhDs awarded in the United States has consistently risen. Last year, more than 45,000 students earned doctorates in science and engineering, about an eight-fold increase compared to 1958.

But this level of production of science and engineering PhD students is now in question. Facing significant cuts to federal science funding, some universities have reduced or paused their PhD admissions for the upcoming academic year. In response, experts are beginning to wonder about the short- and long-term effects those shifts will have on the number of doctorates awarded and the consequent impact on science if PhD production does drop.

Such questions touch on longstanding debates about academic labor. PhD training is a crucial part of nurturing scientific expertise. At the same time, some analysts have worried about an oversupply of PhDs in some fields, while students have suggested that universities are exploiting them as low-cost labor.

Many budding scientists go into graduate school with the goal of staying in academia and ultimately establishing their own labs. For at least 30 years, there has been talk of a mismatch between the number of doctorates and the limited academic job openings. According to an analysis conducted in 2013, only 3,000 faculty positions in science and engineering are added each year—even though more than 35,000 PhDs are produced in these fields annually.
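Those two figures alone sketch the squeeze. Treating both numbers as steady annual flows (a simplification of the 2013 analysis cited above), a quick division suggests that well under one in ten new PhDs could be absorbed into faculty openings in any given year:

    new_positions = 3_000   # estimated faculty slots added per year, per the 2013 analysis
    new_phds = 35_000       # science and engineering doctorates produced per year
    print(f"{new_positions / new_phds:.1%}")  # ~8.6% of each year's PhDs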

Decades of this asymmetrical dynamic have created a hypercompetitive and high-pressure environment in the academic world, said Siddhartha Roy, an environmental engineer at Rutgers University who co-authored a recent study on tenure-track positions in engineering. “If we look strictly at academic positions, we have a huge oversupply, and it’s not sustainable,” he said.

But while the academic job market remains challenging, experts point out that PhD training also prepares individuals for career paths in industry, government, and other science and technology fields. If fewer doctorates are awarded and funding continues to be cut, some argue, American science will weaken.

“The immediate impact is there’s going to be less science,” said Donna Ginther, a social researcher who studies scientific labor markets at the University of Kansas. In the long run, that could mean scientific innovations, such as new drugs or technological advances, will stall, she said: “We’re leaving that scientific discovery on the table.”

Historically, one of the main goals of training PhD students has been to retain those scientists as future researchers in their respective fields. “Academia has a tendency to want to produce itself, reproduce itself,” said Ginther. “Our training is geared towards creating lots of mini-mes.”

But it is no secret in the academic world that tenure-track faculty positions are scarce, and the road to obtaining tenure is difficult. Although it varies across different STEM fields, the number of doctorates granted each year consistently surpasses the number of tenure-track positions available. A survey gathering data from the 2022-2023 academic year, conducted by the Computing Research Association, found that around 11 percent of PhD graduates in computational science (for which employment data was reported) moved on to tenure-track faculty positions.

Roy found a similar figure for engineering: Around one out of every eight individuals who obtain their doctorate—12.5 percent—will eventually land a tenure-track faculty position, a trend that remained stable between 2014 and 2021, the last year for which his team analyzed data. The bottleneck in faculty positions, according to one recent study, leads about 40 percent of postdoctoral researchers to leave academia.

However, in recent years, researchers who advise graduate students have begun to acknowledge careers beyond academia, including positions in industry, nonprofits, government, consulting, science communication, and policy. “We, as academics, need to take a broader perspective on what and how we prepare our students,” said Ginther.

As opposed to faculty positions, some of these labor markets can be more robust and provide plenty of opportunities for those with a doctorate, said Daniel Larremore, a computer scientist at the University of Colorado Boulder who studies academic labor markets, among other topics. Whether there is a mismatch between the number of PhDs and employment opportunities will depend on the subject of study and which fields are growing or shrinking, he added. For example, he pointed out that there is currently a boom in machine learning and artificial intelligence, so there is a lot of demand from industry for computer science graduates. In fact, commitments to industry jobs after graduation seem to be at a 30-year high.

But not all newly minted PhDs immediately find work. According to the latest NSF data, students in biological and biomedical sciences experienced a decline in job offers in the past 20 years, with 68 percent having definite commitments after graduating in 2023, compared to 72 percent in 2003. “The dynamics in the labor market for PhDs depends very much on what subject the PhD is in,” said Larremore.

Still, employment data show that doctorate holders fare better than the general population. In 2024, the unemployment rate for college graduates with a doctoral degree in the US was 1.2 percent, less than half the national average at the time, according to the Bureau of Labor Statistics. In NSF’s recent survey, 74 percent of science and engineering graduating doctorates had definite commitments for employment or postdoctoral study or training positions, three points higher than in 2003.

“Overproducing for the number of academic jobs available? Absolutely,” said Larremore. “But overproducing for the economy in general? I don’t think so.”

The experts who spoke with Undark described science PhDs as a benefit for society: Ultimately, scientists with PhDs contribute to the economy of a nation, be it through academia or alternative careers. Many are now concerned about the impact that cuts to scientific research may have on that contribution. Already, there are reports of universities scaling back graduate student admissions in light of funding uncertainties, worried that they might not be able to cover students’ education and training costs. Those changes could result in smaller graduating classes in future years.

Smaller classes of PhD students might not be a bad thing for academia, given the limited faculty positions, said Roy. And for most non-academic jobs, Roy said, a master’s degree is more than sufficient. However, people with doctorates do contribute to other sectors like industry, government labs, and entrepreneurship, he added.

In Ginther’s view, fewer scientists with doctoral training could deal a devastating blow to the broader scientific enterprise. “Science is a long game, and the discoveries now take a decade or two to really hit the market, so it’s going to impinge on future economic growth.”

These long-term impacts of reductions in funding might be hard to reverse and could lead to the withering of the scientific endeavor in the United States, Larremore said: “If you have a thriving ecosystem and you suddenly halve the sunlight coming into it, it simply cannot thrive in the way that it was.”

This article was originally published on Undark. Read the original article.


some-parts-of-trump’s-proposed-budget-for-nasa-are-literally-draconian

Some parts of Trump’s proposed budget for NASA are literally draconian


“That’s exactly the kind of thing that NASA should be concentrating its resources on.”

Artist’s illustration of the DRACO nuclear rocket engine in space. Credit: Lockheed Martin

New details of the Trump administration’s plans for NASA, released Friday, revealed the White House’s desire to end the development of an experimental nuclear thermal rocket engine that could have shown a new way of exploring the Solar System.

Trump’s NASA budget request is rife with spending cuts. Overall, the White House proposes reducing NASA’s budget by about 24 percent, from $24.8 billion this year to $18.8 billion in fiscal year 2026. In previous stories, Ars has covered many of the programs impacted by the proposed cuts, which would cancel the Space Launch System rocket and Orion spacecraft and terminate numerous robotic science missions, including the Mars Sample Return, probes to Venus, and future space telescopes.

Instead, the leftover funding for NASA’s human exploration program would go toward supporting commercial projects to land on the Moon and Mars.

NASA’s initiatives to pioneer next-generation space technologies are also hit hard in the White House’s budget proposal. If the Trump administration gets its way, NASA’s Space Technology Mission Directorate, or STMD, will see its budget cut nearly in half, from $1.1 billion to $568 million.

Trump’s budget request isn’t final. Both Republican-controlled houses of Congress will write their own versions of the NASA budget, which must be reconciled before going to the White House for President Trump’s signature.

“The budget reduces Space Technology by approximately half, including eliminating failing space propulsion projects,” the White House wrote in an initial overview of the NASA budget request released May 2. “The reductions also scale back or eliminate technology projects that are not needed by NASA or are better suited to private sector research and development.”

Breathing fire

Last week, the White House and NASA put a finer point on these “failing space propulsion projects.”

“This budget provides no funding for Nuclear Thermal Propulsion and Nuclear Electric Propulsion projects,” officials wrote in a technical supplement released Friday detailing Trump’s NASA budget proposal. “These efforts are costly investments, would take many years to develop, and have not been identified as the propulsion mode for deep space missions. The nuclear propulsion projects are terminated to achieve cost savings and because there are other nearer-term propulsion alternatives for Mars transit.”

Foremost among these cuts, the White House proposes to end NASA’s participation in the Demonstration Rocket for Agile Cislunar Operations (DRACO) project. NASA said this proposal “reflects the decision by our partner to cancel” the DRACO mission, which would have demonstrated a nuclear thermal rocket engine in space for the first time.

NASA’s partner on the DRACO mission was the Defense Advanced Research Projects Agency, or DARPA, the Pentagon’s research and development arm. A DARPA spokesperson confirmed the agency was closing out the project.

“DARPA has completed the agency’s involvement in the Demonstration Rocket for Agile Cislunar Orbit (DRACO) program and is transitioning its knowledge to our DRACO mission partner, the National Aeronautics and Space Administration (NASA), and to other potential DOD programs,” the spokesperson said in a response to written questions.

A nuclear rocket engine, which was to be part of NASA’s aborted NERVA program, is tested at Jackass Flats, Nevada, in 1967. Credit: Corbis via Getty Images

Less than two years ago, NASA and DARPA announced plans to move forward with the roughly $500 million DRACO project, targeting a launch into Earth orbit aboard a traditional chemical rocket in 2027. “With the help of this new technology, astronauts could journey to and from deep space faster than ever, a major capability to prepare for crewed missions to Mars,” former NASA administrator Bill Nelson said at the time.

The DRACO mission would have consisted of several elements, including a nuclear reactor to rapidly heat up super-cold liquid hydrogen fuel stored in an insulated tank onboard the spacecraft. Temperatures inside the engine would reach nearly 5,000° Fahrenheit, boiling the hydrogen and driving the resulting gas through a nozzle, generating thrust. From the outside, the spacecraft’s design looks a lot like the upper stage of a traditional rocket. However, theoretically, a nuclear thermal rocket engine like DRACO’s would offer twice the efficiency of the highest-performing conventional rocket engines. That translates to significantly less fuel that a mission to Mars would have to carry across the Solar System.
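The payoff from that doubled efficiency can be sketched with the Tsiolkovsky rocket equation, which relates a maneuver’s velocity change to the share of a vehicle’s mass that must be propellant. The numbers below are illustrative assumptions (a 4 km/s deep-space burn, 450 seconds of specific impulse for a high-performing chemical engine versus roughly 900 seconds for a nuclear thermal engine), not actual DRACO or Mars-mission figures:

```python
import math

G0 = 9.81  # standard gravity, m/s^2

def propellant_fraction(delta_v_m_s: float, isp_s: float) -> float:
    """Fraction of initial mass that must be propellant, from the
    Tsiolkovsky rocket equation: m_prop/m0 = 1 - exp(-dv / (Isp * g0))."""
    return 1.0 - math.exp(-delta_v_m_s / (isp_s * G0))

# Illustrative values only, not DRACO specifications.
delta_v = 4_000.0  # m/s, a rough deep-space departure burn
for label, isp in [("chemical (~450 s)", 450.0), ("nuclear thermal (~900 s)", 900.0)]:
    share = propellant_fraction(delta_v, isp)
    print(f"{label}: {share:.0%} of departure mass is propellant")
```

Under those assumed numbers, the propellant share of the departure mass drops from roughly 60 percent to roughly 36 percent, which is the kind of saving described above.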

Essentially, a nuclear thermal rocket engine combines the high-thrust capability of a chemical engine with some of the fuel efficiency benefits of low-thrust solar-electric engines. With DRACO, engineers sought hard data to verify their understanding of nuclear propulsion and wanted to make sure the nuclear engine’s challenging design actually worked. DRACO would have used high-assay low-enriched uranium to power its nuclear reactor.

Nuclear electric propulsion uses an onboard nuclear reactor to power plasma thrusters that create thrust by accelerating an ionized gas, like xenon, through a magnetic field. Nuclear electric propulsion would provide another leap in engine efficiency beyond the capabilities of a system like DRACO and may ultimately offer the most attractive option for enduring deep space transportation.

NASA led the development of DRACO’s nuclear rocket engine, while DARPA was responsible for the overall spacecraft design, operations, and the thorny problem of securing regulatory approval to launch a nuclear reactor into orbit. The reactor on DRACO would have launched in “cold” mode before activating in space, reducing the risk to people on the ground in the event of a launch accident. The Space Force agreed to pay for DRACO’s launch on a United Launch Alliance Vulcan rocket.

DARPA and NASA selected Lockheed Martin as the lead contractor for the DRACO spacecraft in 2023. BWX Technologies, a leader in the US nuclear industry, won the contract to develop the mission’s reactor.

“We received the notice from DARPA that it ended the DRACO program,” a Lockheed Martin spokesperson said. “While we’re disappointed with the decision, it doesn’t change our vision of how nuclear power influences how we will explore and operate in the vastness of space.”

Mired in the lab

More than 60 years have passed since a US-built nuclear reactor launched into orbit. Aviation Week reported in January that one problem facing DRACO engineers involved questions about how to safely test the nuclear thermal engine on the ground while adhering to nuclear safety protocols.

“We’re bringing two things together—space mission assurance and nuclear safety—and there’s a fair amount of complexity,” said Matthew Sambora, a DRACO program manager at DARPA, in an interview with Aviation Week. At the time, DARPA and NASA had already given up on a 2027 launch to concentrate on developing a prototype engine using helium as a propellant before moving on to an operational engine with more energetic liquid hydrogen fuel, Aviation Week reported.

Greg Meholic, an engineer at the Aerospace Corporation, highlighted the shortfall in ground testing capability in a presentation last year. Nuclear thermal propulsion testing “requires that engine exhaust be scrubbed of radiologics before being released,” he wrote. This requirement “could result in substantially large, prohibitively expensive facilities that take years to build and qualify.”

These safety protocols weren’t as stringent when NASA and the Air Force first pursued nuclear propulsion in the 1960s. Now, the first serious 21st-century effort to fly a nuclear rocket engine in space is grinding to a halt.

“Given that our near-term human exploration and science needs do not require nuclear propulsion, current demonstration projects will end,” wrote Janet Petro, NASA’s acting administrator, in a letter accompanying the Trump administration’s budget release last week.

This figure illustrates the major elements of a typical nuclear thermal rocket engine. Credit: NASA/Glenn Research Center

NASA’s 2024 budget allocated $117 million for nuclear propulsion work, an increase from $91 million the previous year. Congress added more funding for NASA’s nuclear propulsion programs over the Biden administration’s proposed budget in recent years, signaling support on Capitol Hill that may save at least some nuclear propulsion initiatives next year.

It’s true that nuclear propulsion isn’t required for any NASA missions currently on the books. Today’s rockets are good at hurling cargo and people off planet Earth, but once a spacecraft arrives in orbit, there are several ways to propel it toward more distant destinations.

NASA’s existing architecture for sending astronauts to the Moon uses the SLS rocket and Orion spacecraft, both of which are proposed for cancellation and look a lot like the vehicles NASA used to fly astronauts to the Moon more than 50 years ago. SpaceX’s reusable Starship, designed with an eye toward settling Mars, uses conventional chemical propulsion, with methane and liquid oxygen propellants that SpaceX one day hopes to generate on the surface of the Red Planet.

So NASA, SpaceX, and other companies don’t need nuclear propulsion to beat China back to the Moon or put the first human footprints on Mars. But there’s a broad consensus that in the long run, nuclear rockets offer a better way of moving around the Solar System.

The military’s motive for funding nuclear thermal propulsion was its potential for becoming a more efficient means of maneuvering around the Earth. Many of the military’s most important spacecraft are limited by fuel, and the Space Force is investigating orbital refueling and novel propulsion methods to extend the lifespan of satellites.

NASA’s nuclear power program is not finished. The Trump administration’s budget proposal calls for continued funding for the agency’s fission surface power program, with the goal of fielding a nuclear reactor that could power a base on the surface of the Moon or Mars. Lockheed and BWXT, the contractors involved in the DRACO mission, are part of the fission surface power program.

There is some funding in the White House’s budget request for tech demos using other methods of in-space propulsion. NASA would continue funding experiments in long-term storage and transfer of cryogenic propellants like liquid methane, liquid hydrogen, and liquid oxygen. These joint projects between NASA and industry could pave the way for orbital refueling and orbiting propellant depots, aligning with the direction of companies like SpaceX, Blue Origin, and United Launch Alliance.

But many scientists and engineers believe nuclear propulsion offers the only realistic path for a sustainable campaign ferrying people between the Earth and Mars. A report commissioned by NASA and the National Academies concluded in 2021 that an aggressive tech-development program could advance nuclear thermal propulsion enough for a human expedition to Mars in 2039. The prospects for nuclear electric propulsion were murkier.

This would have required NASA to substantially increase its budget for nuclear propulsion immediately, likely by an order of magnitude beyond the agency’s baseline funding level, or to an amount exceeding $1 billion per year, said Bobby Braun, co-chair of the National Academies report, in a 2021 interview with Ars. That didn’t happen.

Going nuclear

The interplanetary transportation architectures envisioned by NASA and SpaceX will, at least initially, primarily use chemical propulsion for the cruise between Earth and Mars.

Kurt Polzin, chief engineer of NASA’s space nuclear propulsion projects, said significant technical hurdles stand in the way of any propulsion system selected to power heavy cargo and humans to Mars.

“Anybody who says that they’ve solved the problem, you don’t know that because you don’t have enough data,” Polzin said last week at the Humans to the Moon and Mars Summit in Washington.

“We know that to do a Mars mission with a Starship, you need lots of refuelings at Earth, you need lots of refuelings at Mars, which you have to send in advance,” Polzin said. “You either need to send that propellant in advance or send a bunch of material and hardware to the surface to be set up and robotically make your propellant in situ while you’re there.”

Elon Musk’s SpaceX is betting on chemical propulsion for round-trip flights to Mars with its Starship rocket. This will require assembly of propellant-generation plants on the Martian surface. Credit: SpaceX

Last week, SpaceX founder Elon Musk outlined how the company plans to land its first Starships on Mars. His roadmap includes more than 100 cargo flights to deliver equipment to produce methane and liquid oxygen propellants on the surface of Mars. This is necessary for any Starship to launch off the Red Planet and return to Earth.

“You can start to see that this starts to become a Rube Goldberg way to do Mars,” Polzin said. “Will I say it can’t work? No, but I will say that it’s really, really difficult and challenging. Are there a lot of miracles to make it work? Absolutely. So the notion that SpaceX has solved Mars or is going to do Mars with Starship, I would challenge that on its face. I don’t think the analysis and the data bear that out.”

Engineers know how methane-fueled rocket engines perform in space. Scientists have been producing liquid oxygen and liquid methane since the late 1800s. Scaling up a propellant plant on Mars to produce thousands of tons of cryogenic liquids is another matter. In the long run, this might be a suitable solution for Musk’s vision of creating a city on Mars, but it comes with immense startup costs and risks. Still, nuclear propulsion remains untested in space as well.

“The thing with nuclear is there are challenges to making it work, too,” Polzin said. “However, all of my challenges get solved here at Earth and in low-Earth orbit before I leave. Nuclear is nice. It has a higher specific impulse, especially when we’re talking about nuclear thermal propulsion. It has high thrust, which means it will get our astronauts there and back quickly, but I can carry all the fuel I need to get back with me, so I don’t need to do any complicated refueling at Mars. I can return without having to make propellant or send any pre-positioned propellant to get back.”

The tug of war over nuclear propulsion is nothing new. The Air Force started a program to develop reactors for nuclear thermal rockets at the height of the Cold War. NASA took over the Air Force’s role a few years later, and the project proceeded into the next phase, called the Nuclear Engine for Rocket Vehicle Application (NERVA). President Richard Nixon ultimately canceled the NERVA project in 1973 after the government had spent $1.4 billion on it, equivalent to about $10 billion in today’s dollars. Despite nearly two decades of work, NERVA never flew in space.

Doing the hard things

The Pentagon and NASA studied several more nuclear thermal and nuclear electric propulsion initiatives before DRACO. Today, there’s a nascent commercial business case for compact nuclear reactors beyond just the government. But there’s scant commercial interest in mounting a full-scale nuclear propulsion demonstration solely with private funding.

Fred Kennedy, co-founder and CEO of a space nuclear power company called Dark Fission, said most venture capital investors lack the appetite to wait the 15 or 20 years it may take for nuclear propulsion to deliver financial returns.

“It’s a truism: Space is hard,” said Kennedy, a former DARPA program manager. “Nuclear turns out to be hard for reasons we can all understand. So space-nuclear is hard-squared, folks. As a result, you give this to your average associate at a VC firm and they get scared quick. They see the moles all over your face, and they run away screaming.”

But commercial launch costs are coming down. With sustained government investment and streamlined regulations, “this is the best chance we’ve had in a long time” to get a nuclear propulsion system into space, Kennedy said.

Technicians prepare a nozzle for a prototype nuclear thermal rocket engine in 1964. Credit: NASA

“I think, right now, we’re in this transitional period where companies like mine are going to have to rely on some government largesse, as well as hopefully both commercial partnerships and honest private investment,” Kennedy said. “Three years ago, I would have told you I thought I could have done the whole thing with private investment, but three years have turned my hair white.”

Those who share Kennedy’s view thought they were getting an ally in the Trump administration. Jared Isaacman, the billionaire commercial astronaut Trump nominated to become the next NASA administrator, promised to prioritize nuclear propulsion in his tenure as head of the nation’s space agency.

During his Senate confirmation hearing in April, Isaacman said NASA should turn over management of heavy-lift rockets, human-rated spacecraft, and other projects to commercial industry. This change, he said, would allow NASA to focus on the “near-impossible challenges that no company, organization, or agency anywhere in the world would be able to undertake.”

The example Isaacman gave in his confirmation hearing was nuclear propulsion. “That’s something that no company would ever embark upon,” he told lawmakers. “There is no obvious economic return. There are regulatory challenges. That’s exactly the kind of thing that NASA should be concentrating its resources on.”

But the White House suddenly announced on Saturday that it was withdrawing Isaacman’s nomination days before the Senate was expected to confirm him for the NASA post. While there’s no indication that Trump’s withdrawal of Isaacman had anything to do with any specific part of the White House’s funding plan, his removal leaves NASA without an advocate for nuclear propulsion and a number of other projects falling under the White House’s budget ax.


Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.


milky-way-galaxy-might-not-collide-with-andromeda-after-all

Milky Way galaxy might not collide with Andromeda after all

100,000 computer simulations reveal Milky Way’s fate—and it might not be what we thought.

It’s been textbook knowledge for over a century that our Milky Way galaxy is doomed to collide with another large spiral galaxy, Andromeda, in the next 5 billion years and merge into one even bigger galaxy. But a fresh analysis published in the journal Nature Astronomy is casting that longstanding narrative in a more uncertain light. The authors conclude that the likelihood of this collision and merger is closer to the odds of a coin flip, with a roughly 50 percent probability that the two galaxies will avoid such an event during the next 10 billion years.

Both the Milky Way and the Andromeda galaxies (M31) are part of what’s known as the Local Group (LG), which also hosts other smaller galaxies (some not yet discovered) as well as dark matter (per the prevailing standard cosmological model). Both already have remnants of past mergers and interactions with other galaxies, according to the authors.

“Predicting future mergers requires knowledge about the present coordinates, velocities, and masses of the systems partaking in the interaction,” the authors wrote. That involves not just the gravitational force between them but also dynamical friction. It’s the latter that dominates when galaxies are headed toward a merger, since it causes galactic orbits to decay.

This latest analysis is the result of combining data from the Hubble Space Telescope and the European Space Agency’s (ESA) Gaia space telescope to perform 100,000 Monte Carlo computer simulations, taking into account not just the Milky Way and Andromeda but the full LG system. Those simulations yielded a very different prediction: There is approximately a 50/50 chance of the galaxies colliding within the next 10 billion years. There is still a 2 percent chance that they will collide in the next 4 to 5 billion years. “Based on the best available data, the fate of our galaxy is still completely open,” the authors concluded.
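For a sense of how a Monte Carlo treatment turns measurement uncertainty into a probability, here is a deliberately stripped-down sketch: it samples an assumed approach speed and a poorly constrained transverse speed for Andromeda, then asks how often a simple two-body Keplerian orbit would bring the pair inside an arbitrary 50-kiloparsec “merger” distance. Every number in it is an assumption chosen for illustration; the published analysis models the full Local Group, dynamical friction, and satellites such as M33 and the Large Magellanic Cloud, none of which appear here.

```python
import numpy as np

G = 4.3009e-6           # gravitational constant, kpc * (km/s)^2 / Msun
M_TOTAL = 3.0e12        # assumed combined Milky Way + Andromeda mass, Msun
R_NOW = 780.0           # assumed current separation, kpc
MERGE_IF_CLOSER = 50.0  # toy threshold: closest approach inside this counts as a "merger", kpc

rng = np.random.default_rng(0)
n = 100_000
v_rad = rng.normal(-110.0, 5.0, n)          # approach speed, km/s (illustrative)
v_tan = np.abs(rng.normal(50.0, 30.0, n))   # poorly known transverse speed, km/s (illustrative)

mu = G * M_TOTAL
energy = 0.5 * (v_rad**2 + v_tan**2) - mu / R_NOW   # specific orbital energy
ang_mom = R_NOW * v_tan                              # specific angular momentum
ecc = np.sqrt(1.0 + 2.0 * energy * ang_mom**2 / mu**2)
r_peri = (ang_mom**2 / mu) / (1.0 + ecc)             # Keplerian closest approach, kpc

merges = (energy < 0) & (r_peri < MERGE_IF_CLOSER)   # bound orbit and close enough
print(f"Fraction of samples that 'merge' in this toy model: {merges.mean():.0%}")
```

The point of the sketch is only that the answer comes out as a distribution, not a date: small changes in the assumed transverse velocity swing the outcome, which is why the real study lands on something close to a coin flip.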


could-floating-solar-panels-on-a-reservoir-help-the-colorado-river?

Could floating solar panels on a reservoir help the Colorado River?


Floating solar panels appear to conserve water while generating green electricity.

The Gila River Indian Community in Arizona has lined 3,000 feet of their canals with solar panels. Credit: Jake Bolster/Inside Climate News

GILA RIVER INDIAN RESERVATION, Ariz.—About 33 miles south of Phoenix, Interstate 10 bisects a line of solar panels traversing the desert like an iridescent snake. The solar farm’s shape follows the path of a canal, with panels serving as awnings to shade the gently flowing water from the unforgiving heat and wind of the Sonoran Desert.

The panels began generating power last November for the Akimel O’otham and Pee Posh tribes—known together as the Gila River Indian Community, or GRIC—on their reservation in south-central Arizona, and they are the first of their kind in the US. The community is studying the effects of these panels on the water in the canal, hopeful that they will protect a precious resource from the desert’s unflinching sun and wind.

In September, GRIC is planning to break ground on another experimental effort to conserve water while generating electricity: floating solar. Between its canal canopies and the new project that would float photovoltaic panels on a reservoir it is building, GRIC hopes to one day power all of its canal and irrigation operations with solar electricity, transforming itself into one of the most innovative and closely watched water users in the West in the process.

The community’s investments come at a critical time for the Colorado River, which supplies water to about 40 million people across seven Western states, Mexico and 30 tribes, including GRIC. Annual consumption from the river regularly exceeds its supply, and a decadeslong drought, fueled in part by climate change, continues to leave water levels at Lake Powell and Lake Mead dangerously low.

Covering water with solar panels is not a new idea. But for some it represents an elegant mitigation of water shortages in the West. Doing so could reduce evaporation, generate more carbon-free electricity and require dams to run less frequently to produce power.

But, so far, the technology has not been included in the ongoing Colorado River negotiations between the Upper Basin states of Colorado, New Mexico, Utah, and Wyoming, the Lower Basin states of Arizona, California, and Nevada, tribes and Mexico. All are expected to eventually agree on cuts to the system’s water allocations to maintain the river’s ability to provide water and electricity for residents and farms, and keep its ecosystem alive.

“People in the US don’t know about [floating solar] yet,” said Scott Young, a former policy analyst in the Nevada state legislature’s counsel bureau. “They’re not willing to look at it and try and factor it” into the negotiations.

Several Western water managers Inside Climate News contacted for this story said they were open to learning more about floating solar—Colorado has even studied the technology through pilot projects. But, outside of GRIC’s project, none knew of any plans to deploy floating solar anywhere in the basin. Some listed costly and unusual construction methods and potentially modest water savings as the primary obstacles to floating solar maturing in the US.

A tantalizing technology with tradeoffs

A winery in Napa County, California, deployed the first floating solar panels in the US on an irrigation pond in 2007. The country was still years away from passing federal legislation to combat the climate crisis, and the technology matured here haltingly. As recently as 2022, according to a Bloomberg analysis, most of the world’s 13 gigawatts of floating solar capacity had been built in Asia.

Unlike many Asian countries, the US has an abundance of undeveloped land where solar could be constructed, said Prateek Joshi, a research engineer at the National Renewable Energy Laboratory (NREL) who has studied floating solar, among other forms of energy. “Even though [floating solar] may play a smaller role, I think it’s a critical role in just diversifying our energy mix and also reducing the burden of land use,” he said.

Credit: Paul Horn/Inside Climate News

This February, NREL published a study that found floating solar on the reservoirs behind federally owned dams could provide enough electricity to power 100 million US homes annually, but only if all the developable space on each reservoir were used.

Lake Powell could host almost 15 gigawatts of floating solar using about 23 percent of its surface area, and Lake Mead could generate over 17 gigawatts of power on 28 percent of its surface. Such large-scale development is “probably not going to be the case,” Joshi said, but even if a project used only a fraction of the developable area, “there’s a lot of power you could get from a relatively small percentage of these Colorado Basin reservoirs.”

The study did not measure how much water evaporation floating solar would prevent, but previous NREL research has shown that photovoltaic panels—sometimes called “floatovoltaics” when they are deployed on reservoirs—could also save water by changing the way hydropower is deployed.

Some of a dam’s energy could come from solar panels floating on its reservoir to prevent water from being released solely to generate electricity. As late as December, when a typical Western dam would be running low, lakes with floating solar could still have enough water to produce hydropower, reducing reliance on more expensive backup energy from gas-fired power plants.

Joshi has spoken with developers and water managers about floating solar before, and said there is “an eagerness to get this [technology] going.” The technology, however, is not flawless.

Solar arrays can be around 20 percent more expensive to install on water than land, largely because of the added cost of buoys that keep the panels afloat, according to a 2021 NREL report. The water’s cooling effect can boost panel efficiency, but floating solar panels may produce slightly less energy than a similarly sized array on land because they can’t be tilted as directly toward the sun as land-based panels.

And while the panels likely reduce water loss from reservoirs, they may also increase a water body’s emissions of greenhouse gases, which in turn warm the climate and increase evaporation. This January, researchers at Cornell University found that floating solar covering more than 70 percent of a pond’s surface area increased the water’s CO2 and methane emissions. These kinds of impacts “should be considered not only for the waterbody in which [floating solar] is deployed but also in the broader context of trade-offs of shifting energy production from land to water,” the study’s authors wrote.

“Any energy technology has its tradeoffs,” Joshi said, and in the case of floating solar, some of its benefits—reduced evaporation and land use—may not be easy to express in dollars and cents.

Silver buckshot

There is perhaps no bigger champion for floating solar in the West than Scott Young. Before he retired in 2016, he spent much of his 18 years working for the Nevada Legislature researching the effects of proposed legislation, especially in the energy sector.

On an overcast, blustery May day in southwest Wyoming near his home, Young said that in the past two years he has promoted the technology to Colorado River negotiators, members of Congress, environmental groups and other water managers from the seven basin states, all of whom he has implored to consider the virtues of floating solar arrays on Lake Powell and Lake Mead.

Young grew up in the San Francisco Bay area, about 40 miles, he estimated, from the pioneering floating solar panels in Napa. He stressed that he does not have any ties to industry; he is just a concerned Westerner who wants to diversify the region’s energy mix and save as much water as possible.

But so far, when he has been able to get someone’s attention, Young said his pitch has been met with tepid interest. “Usually the response is: ‘Eh, that’s kind of interesting,’” said Young, dressed in a black jacket, a maroon button-down shirt and a matching ball cap that framed his round, open face. “But there’s no follow-up.”

The Bureau of Reclamation “has not received any formal proposals for floating solar on its reservoirs,” said an agency spokesperson, who added that the bureau has been monitoring the technology.

In a 2021 paper published with NREL, Reclamation estimated that floating solar on its reservoirs could generate approximately 1.5 terawatts of electricity, enough to power about 100 million homes. But, in addition to potentially interfering with recreation, aquatic life, and water safety, floating solar’s effect on evaporation proved difficult to model broadly.

So many environmental factors determine how water is lost or consumed in a reservoir—solar intensity, wind, humidity, lake circulation, water depth, and temperature—that the study’s authors concluded Reclamation “should be wary of contractors’ claims of evaporation savings” without site-specific studies. Those same factors affect the panels’ efficiency, and in turn, how much hydropower would need to be generated from the reservoir they cover.

The report also showed the Colorado River was ripe with floating solar potential—more than any other basin in the West. That’s particularly true in the Upper Basin, where Young has been heartened by Colorado’s approach to the technology.

In 2023, the state passed a law requiring several agencies to study the use of floating solar. Last December, the Colorado Water Conservation Board published its findings, and estimated that the state could save up to 407,000 acre feet of water by deploying floating solar on certain reservoirs. An acre foot covers one acre with a foot of water, or 325,851 gallons, just about three years’ worth of water for a family of four.
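As a quick sanity check on that conversion, assuming a round figure of 80 gallons of household use per person per day (an assumption for illustration, not a number from the study):

```python
# Rough arithmetic check, not a figure from the Colorado study.
GALLONS_PER_ACRE_FOOT = 325_851
people = 4
gallons_per_person_per_day = 80  # assumed household-use rate

days = GALLONS_PER_ACRE_FOOT / (people * gallons_per_person_per_day)
print(f"{days:.0f} days, or about {days / 365:.1f} years")  # ~1,018 days, roughly 2.8 years
```

That works out to a little under three years, consistent with the figure above.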

When Young saw the Colorado study quantifying savings from floating solar, he felt hopeful. “407,000 acre feet from one state,” he said. “I was hoping that would catch people’s attention.”

Saving that much water would require using over 100,000 acres of surface water, said Cole Bedford, the Colorado Water Conservation Board’s chief operating officer, in an email. “On some of these reservoirs a [floating solar] system would diminish the recreational value such that it would not be appropriate,” he said. “On others, recreation, power generation, and water savings could be balanced.”

Colorado is not planning to develop another project in the wake of this study, and Bedford said that the technology is not a silver bullet solution for Colorado River negotiations.

“While floating solar is one tool in the toolkit for water conservation, the only true solution to the challenges facing the Colorado River Basin is a shift to supply-driven, sustainable uses and operations,” he said.

Some of the West’s largest and driest cities, like Phoenix and Denver, ferry Colorado River water to residents hundreds of miles away from the basin using a web of infrastructure that must reliably operate in unforgiving terrain. Like their counterparts at the state level, water managers in these cities have heard floatovoltaics floated before, but they say the technology is currently too immature and costly to be deployed in the US.

Lake Pleasant, which holds some of the Central Arizona Project’s Colorado River water, is also a popular recreation space, complicating its floating solar potential. Credit: Jake Bolster/Inside Climate News

In Arizona, the Central Arizona Project (CAP) delivers much of the Colorado River water used by Phoenix, Tucson, tribes, and other southern Arizona communities with a 336-mile canal running through the desert, and Lake Pleasant, the project’s 811,784-acre-foot reservoir.

Though CAP is following GRIC’s deployment of solar over canals, it has no immediate plans to build solar over its canal, or Lake Pleasant, according to Darrin Francom, CAP’s assistant general manager for operations, power, engineering, and maintenance, in part because the city of Peoria technically owns the surface water.

Covering the whole canal with solar to save the 4,000 acre feet that evaporates from it could be prohibitively expensive for CAP. “The dollar cost per that acre foot [saved] is going to be in the tens of, you know, maybe even hundreds of thousands of dollars,” Francom said, mainly due to working with novel equipment and construction methods. “Ultimately,” he continued, “those costs are going to be borne by our ratepayers,” which gives CAP reason to pursue other lower-cost ways to save water, like conservation programs, or to seek new sources.

An intake tower moves water into and out of the dam at Lake Pleasant. Credit: Jake Bolster/Inside Climate News

The increased costs associated with building solar panels on water instead of on land have made such projects unpalatable to Denver Water, Colorado’s largest water utility, which moves water out of the Colorado River Basin and through the Rocky Mountains to customers on the Front Range. “Floating solar doesn’t pencil out for us for many reasons,” said Todd Hartman, a company spokesperson. “Were we to add more solar resources—which we are considering—we have abundant land-based options.”

GRIC spent about $5.6 million, financed with Inflation Reduction Act grants, to construct 3,000 feet of solar over a canal, according to David DeJong, project director for the community’s irrigation district.

Young is aware there is no single solution to the problems plaguing the Colorado River Basin, and he knows floating solar is not a perfect technology. Instead, he thinks of it as a “silver buckshot,” he said, borrowing a term from John Entsminger, general manager for the Southern Nevada Water Authority—a technology that can be deployed alongside a constellation of behavioral changes to help keep the Colorado River alive.

Given the duration and intensity of the drought in the West and the growing demand for water and clean energy, Young believes the US needs to act now to embed this technology into the fabric of Western water management going forward.

As drought in the West intensifies, “I think more lawmakers are going to look at this,” he said. “If you can save water in two ways—why not?”

“We’re not going to know until we try”

If all goes according to plan, GRIC’s West Side Reservoir will be finished and ready to store Colorado River water by the end of July. The community wants to cover just under 60 percent of the lake’s surface area with floating solar.

“Do we know for a fact that this is going to be 100 percent effective and foolproof? No,” said DeJong, GRIC’s project director for its irrigation district. “But we’re not going to know until we try.”

The Gila River Indian Community spent about $5.6 million, with the help of Inflation Reduction Act grants, to cover a canal with solar. Credit: Jake Bolster/Inside Climate News

GRIC’s panels will have a few things going for them that projects on lakes Mead or Powell probably wouldn’t. West Side Reservoir will not be open to recreation, limiting the panels’ impacts on people. And the community already has the funds—Inflation Reduction Act grants and some of its own money—to pay for the project.

But GRIC’s solar ambitions may be threatened by the hostile posture toward solar and wind energy from the White House and congressional Republicans, and the project is vulnerable to an increasingly volatile economy. Since retaking office, President Donald Trump, aided by billionaire Elon Musk, has made deep cuts in renewable energy grants at the Environmental Protection Agency. It is unclear whether or to what extent the Bureau of Reclamation has slashed its grant programs.

“Under President Donald J. Trump’s leadership, the Department is working to cut bureaucratic waste and ensure taxpayer dollars are spent efficiently,” said a spokesperson for the Department of the Interior, which oversees Reclamation. “This includes ensuring Bureau of Reclamation projects that use funds from the Infrastructure Investments and Jobs Act and the Inflation Reduction Act align with administration priorities. Projects are being individually assessed by period of performance, criticality, and other criteria. Projects have been approved for obligation under this process so that critical work can continue.”

And Trump’s tariffs could cause costs to balloon beyond the community’s budget, which could either reduce the size of the array or cause delays in soliciting proposals, DeJong said.

While the community will study the panels over canals to understand the water’s effects on solar panel efficiency, it won’t do similar research on the panels on West Side Reservoir, though DeJong said they have been in touch with NREL about studying them. The enterprise will be part of the system that may one day offset all the electrical demand and carbon footprint of GRIC’s irrigation system.

“The community, they love these types of innovative projects. I love these innovative projects,” said GRIC Governor Stephen Roe Lewis, standing in front of the canals in April. Lewis had his dark hair pulled back in a long ponytail and wore a blue button down that matched the color of the sky.

“I know for a fact this is inspiring a whole new generation of water protectors—those that want to come back and they want to go into this cutting-edge technology,” he said. “I couldn’t be more proud of our team for getting this done.”

DeJong feels plenty of other water managers across the West could learn from what is happening at GRIC. In fact, the West Side Reservoir was intentionally constructed near Interstate 10 so that people driving by on the highway could one day see the floating solar the community intends to build there, DeJong said.

“It could be a paradigm shift in the Western United States,” he said. “We recognize all of the projects we’re doing are pilot projects. None of them are large scale. But it’s the beginning.”

This story originally appeared on Inside Climate News.


research-roundup:-7-stories-we-almost-missed

Research roundup: 7 stories we almost missed


Ping-pong bots, drumming chimps, picking styles of two jazz greats, and an ancient underground city’s soundscape

Time lapse photos show a new ping-pong-playing robot performing a top spin. Credit: David Nguyen, Kendrick Cancio and Sangbae Kim

It’s a regrettable reality that there is never time to cover all the interesting scientific stories we come across each month. In the past, we’ve featured year-end roundups of cool science stories we (almost) missed. This year, we’re experimenting with a monthly collection. May’s list includes a nifty experiment to make a predicted effect of special relativity visible; a ping-pong playing robot that can return hits with 88 percent accuracy; and the discovery of the rare genetic mutation that makes orange cats orange, among other highlights.

Special relativity made visible

The Terrell-Penrose-Effect: Fast objects appear rotated

Credit: TU Wien

Perhaps the most well-known features of Albert Einstein’s special theory of relativity are time dilation and length contraction. In 1959, two physicists predicted another feature of relativistic motion: an object moving near the speed of light should also appear to be rotated. It’s not been possible to demonstrate this experimentally, however—until now. Physicists at the Vienna University of Technology figured out how to reproduce this rotational effect in the lab using laser pulses and precision cameras, according to a paper published in the journal Communications Physics.

They found their inspiration in art, specifically an earlier collaboration with an artist named Enar de Dios Rodriguez, who collaborated with VUT and the University of Vienna on a project involving ultra-fast photography and slow light. For this latest research, they used objects shaped like a cube and a sphere and moved them around the lab while zapping them with ultrashort laser pulses, recording the flashes with a high-speed camera.

Getting the timing just right effectively mimics what an observer would see if light traveled at only 2 meters per second. After photographing the objects many times using this method, the team then combined the still images into a single image. The results: the cube looked twisted and the sphere’s north pole was in a different location—a demonstration of the rotational effect predicted back in 1959.

DOI: Communications Physics, 2025. 10.1038/s42005-025-02003-6  (About DOIs).

Drumming chimpanzees

A chimpanzee feeling the rhythm. Credit: Current Biology/Eleuteri et al., 2025.

Chimpanzees are known to “drum” on the roots of trees as a means of communication, often combining that action with what are known as “pant-hoot” vocalizations (see above video). Scientists have found that the chimps’ drumming exhibits key elements of musical rhythm much like humans, according to  a paper published in the journal Current Biology—specifically non-random timing and isochrony. And chimps from different geographical regions have different drumming rhythms.

Back in 2022, the same team observed that individual chimps had unique styles of “buttress drumming,” which served as a kind of communication, letting others in the same group know their identity, location, and activity. This time around they wanted to know if this was also true of chimps living in different groups and whether their drumming was rhythmic in nature. So they collected video footage of the drumming behavior among 11 chimpanzee communities across six populations in East Africa (Uganda) and West Africa (Ivory Coast), amounting to 371 drumming bouts.

Their analysis of the drum patterns confirmed their hypothesis. The western chimps drummed in regularly spaced hits, used faster tempos, and started drumming earlier during their pant-hoot vocalizations. Eastern chimps would alternate between shorter and longer spaced hits. Since this kind of rhythmic percussion is one of the earliest evolved forms of human musical expression and is ubiquitous across cultures, findings such as this could shed light on how our love of rhythm evolved.

DOI: Current Biology, 2025. 10.1016/j.cub.2025.04.019  (About DOIs).

Distinctive styles of two jazz greats

Wes Montgomery (left) and Joe Pass (right) playing guitars

Jazz lovers likely need no introduction to Joe Pass and Wes Montgomery, 20th century guitarists who influenced generations of jazz musicians with their innovative techniques. Montgomery, for instance, didn’t use a pick, preferring to pluck the strings with his thumb—a method he developed because he practiced at night after working all day as a machinist and didn’t want to wake his children or neighbors. Pass developed his own range of picking techniques, including fingerpicking, hybrid picking, and “flat picking.”

Chirag Gokani and Preston Wilson, both with Applied Research Laboratories and the University of Texas, Austin, greatly admired both Pass and Montgomery and decided to explore the acoustics underlying their distinctive playing, modeling the interactions of the thumb, fingers, and pick with a guitar string. They described their research during a meeting of the Acoustical Society of America in New Orleans, LA.

Among their findings: Montgomery achieved his warm tone by playing closer to the bridge and mostly plucking at the string. Pass’s rich tone arose from a combination of using a pick and playing closer to the guitar neck. There were also differences in how much a thumb, finger, and pick slip off the string:  use of the thumb (Montgomery) produced more of a “pluck” compared to the pick (Pass), which produced more of a “strike.” Gokani and Wilson think their model could be used to synthesize digital guitars with a more realistic sound, as well as helping guitarists better emulate Pass and Montgomery.

Sounds of an ancient underground city

A collection of images from the underground tunnels of Derinkuyu.

Credit: Sezin Nas

Turkey is home to the underground city Derinkuyu, originally carved out inside soft volcanic rock around the 8th century BCE. It was later expanded to include four main ventilation channels (and some 50,000 smaller shafts) serving seven levels, which could be closed off from the inside with a large rolling stone. The city could hold up to 20,000 people and it  was connected to another underground city, Kaymakli, via tunnels. Derinkuyu helped protect Arab Muslims during the Arab-Byzantine wars, served as a refuge from the Ottomans in the 14th century, and as a haven for Armenians escaping persecution in the early 20th century, among other functions.

The tunnels were rediscovered in the 1960s and about half of the city has been open to visitors since 2016. The site is naturally of great archaeological interest, but there has been little to no research on the acoustics of the site, particularly the ventilation channels—one of Derinkuyu’s most unique features, according to Sezin Nas, an architectural acoustician at Istanbul Galata University in Turkey.  She gave a talk at a meeting of the Acoustical Society of America in New Orleans, LA, about her work on the site’s acoustic environment.

Nas analyzed a church, a living area, and a kitchen, measuring sound sources and reverberation patterns, among other factors, to create a 3D virtual soundscape. The hope is that a better understanding of this aspect of Derinkuyu could improve the design of future underground urban spaces—as well as one day using her virtual soundscape to enable visitors to experience the sounds of the city themselves.

MIT’s latest ping-pong robot

Robots playing ping-pong have been a thing since the 1980s, of particular interest to scientists because it requires the robot to combine the slow, precise ability to grasp and pick up objects with dynamic, adaptable locomotion. Such robots need high-speed machine vision, fast motors and actuators, precise control, and the ability to make accurate predictions in real time, not to mention being able to develop a game strategy. More recent designs use AI techniques to allow the robots to “learn” from prior data to improve their performance.

MIT researchers have built their own version of a ping-pong playing robot, incorporating a lightweight design and the ability to precisely return shots. They built on prior work developing the Humanoid, a small bipedal two-armed robot—specifically, modifying the Humanoid’s arm by adding an extra degree of freedom to the wrist so the robot could control a ping-pong paddle. They tested their robot by mounting it on a ping-pong table and lobbing 150 balls at it from the other side of the table, capturing the action with high-speed cameras.

The new bot can execute three different swing types (loop, drive, and chip) and during the trial runs it returned the ball with impressive accuracy across all three types: 88.4 percent, 89.2 percent, and 87.5 percent, respectively. Subsequent tweaks to their system brought the robot’s strike speed up to 19 meters per second (about 42 MPH), within the 12 to 25 meters per second range of advanced human players. The addition of control algorithms gave the robot the ability to aim. The robot still has limited mobility and reach because it has to be fixed to the ping-pong table, but the MIT researchers plan to rig it to a gantry or wheeled platform in the future to address that shortcoming.

Why orange cats are orange

an orange tabby kitten

Cat lovers know orange cats are special for more than their unique coloring, but that’s the quality that has intrigued scientists for almost a century. Sure, lots of animals have orange, ginger, or yellow hues, like tigers, orangutans, and golden retrievers. But in domestic cats that color is specifically linked to sex. Almost all orange cats are male. Scientists have now identified the genetic mutation responsible and it appears to be unique to cats, according to a paper published in the journal Current Biology.

Prior work had narrowed down the region on the X chromosome most likely to contain the relevant mutation. The scientists knew that females usually have just one copy of the mutation and in that case have tortoiseshell (partially orange) coloring, although in rare cases, a female cat will be orange if both X chromosomes have the mutation. Over the last five to ten years, there has been an explosion in genome resources (including complete sequenced genomes) for cats which greatly aided the team’s research, along with taking additional DNA samples from cats at spay and neuter clinics.

From an initial pool of 51 candidate variants, the scientists narrowed it down to three genes, only one of which was likely to play any role in gene regulation: Arhgap36. It wasn’t known to play any role in pigment cells in humans, mice, or non-orange cats. But orange cats are special; their mutation (sex-linked orange) turns on Arhgap36 expression in pigment cells (and only pigment cells), thereby interfering with the molecular pathway that controls coat color in other orange-shaded mammals. The scientists suggest that this is an example of how genes can acquire new functions, thereby enabling species to better adapt and evolve.

DOI: Current Biology, 2025. 10.1016/j.cub.2025.03.075  (About DOIs).

Not a Roman “massacre” after all

Two of the skeletons excavated by Mortimer Wheeler in the 1930s, dating from the 1st century AD.

Credit: Martin Smith

In 1936, archaeologists excavating the Iron Age hill fort Maiden Castle in the UK unearthed dozens of human skeletons, all showing signs of lethal injuries to the head and upper body—likely inflicted with weaponry. At the time, this was interpreted as evidence of a pitched battle between the Britons of the local Durotriges tribe and invading Romans. The Romans slaughtered the native inhabitants, thereby bringing a sudden violent end to the Iron Age. At least that’s the popular narrative that has prevailed ever since in countless popular articles, books, and documentaries.

But a paper published in the Oxford Journal of Archaeology calls that narrative into question. Archaeologists at Bournemouth University have re-analyzed those burials, incorporating radiocarbon dating into their efforts. They concluded that those individuals didn’t die in a single brutal battle. Rather, it was Britons killing other Britons over multiple generations between the first century BCE and the first century CE—most likely in periodic localized outbursts of violence in the lead-up to the Roman conquest of Britain. It’s possible there are still many human remains waiting to be discovered at the site, which could shed further light on what happened at Maiden Castle.

DOI: Oxford Journal of Archaeology, 2025. 10.1111/ojoa.12324  (About DOIs).

Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

Research roundup: 7 stories we almost missed Read More »

why-incels-take-the-“blackpill”—and-why-we-should-care

Why incels take the “Blackpill”—and why we should care


“Don’t work for Soyciety”

A growing number of incels are NEET (Not in Education, Employment, or Training). That should concern us all.

The Netflix series Adolescence explores the roots of misogynistic subcultures. Credit: Netflix

The online incel (“involuntary celibate”) subculture is mostly known for its extreme rhetoric, primarily against women, sometimes erupting into violence. But a growing number of self-identified incels are using their ideology as an excuse for not working or studying. This could constitute a kind of coping mechanism to make sense of their failures—not just in romantic relationships but also in education and employment, according to a paper published in the journal Gender, Work & Organization.

Contrary to how it’s often portrayed, the so-called “manosphere” is not a monolith. Those who embrace the “Redpill” ideology, for example, might insist that women control the “sexual marketplace” and are only interested in ultramasculine “Chads.” They champion self-improvement as a means to make themselves more masculine and successful, and hence (they believe) more attractive to women—or at least better able to manipulate women.

By contrast, the “Blackpilled” incel contingent is generally more nihilistic. These individuals reject the Redpill notion of alpha-male masculinity and the accompanying focus on self-improvement. They believe that dating and social success are entirely determined by one’s looks and/or genetics. Since there is nothing they can do to improve their chances with women or their lot in life, why even bother?

“People have a tendency to lump all these different groups together as the manosphere,” co-author AnnaRose Beckett-Herbert of McGill University told Ars. “One critique I have of the recent Netflix show Adolescence—which was well done overall—is they lump incels in with figures like Andrew Tate, as though it’s all interchangeable. There’s areas of overlap, like extreme misogyny, but there are really important distinctions. We have to be careful to make those distinctions because the kind of intervention or prevention efforts that we might direct towards the Redpill community versus the Blackpill community might be very different.”

Incels constitute a fairly small fraction of the manosphere, but the vast majority of incels appear to embrace the Blackpill ideology, per Beckett-Herbert. That nihilistic attitude can extend to any kind of participation in what incels term “Soyciety”—including educational attainment and employment. When that happens, such individuals are best described by the acronym NEET (Not in Education, Employment, or Training).

“It’s not that we have large swaths of young men that are falling into this rabbit hole,” said Beckett-Herbert. “Their ideology is pretty fringe, but we’re seeing the community grow, and we’re seeing the ideology spread. It used to be contained to romantic relationships and sex. Now we’re seeing this broader disengagement from society as a whole. We should all be concerned about that trend.”

The NEET trend is also tied to the broader cultural discourse on how boys and young men are struggling in contemporary society. While prior studies tended to focus on the misogynistic rhetoric and propensity for violence among incels, “I thought that the unemployment lens was interesting because it’s indicative of larger problems,” said Beckett-Herbert. “It’s important to remember that it’s not zero-sum. We can care about the well-being of women and girls and also acknowledge that young men are struggling, too. Those don’t have to be at odds.”

“Lie down and rot”

Beckett-Herbert and her advisor/co-author, McGill University sociologist Eran Shor, chose the incels.is platform as a data source for their study due to its ease of public access and relatively high traffic, with nearly 20,000 members. The pair used Python code to scrape 100 pages, amounting to around 10,000 discussion threads between October and December 2022. A pilot study revealed 10 keywords that appeared most frequently in those threads: “study,” “school,” “NEET,” “job,” “work,” “money,” “career,” “wage,” “employ,” and “rot.” (“They use the phrase ‘lie down and rot’ a lot,” said Beckett-Herbert.)

This allowed Beckett-Herbert and Shor to narrow their sample down to 516 threads with titles containing those keywords. They randomly selected a subset of 171 discussion threads for further study. That analysis yielded four main themes that dominated the discussion threads: political/ideological arguments about being NEET; boundary policing; perceived discrimination; and bullying and marginalization.
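The paper doesn’t reproduce the scraping code itself, but the workflow described above (collect listing pages, keep threads whose titles contain one of the keywords, then randomly sample a subset for coding) is easy to picture. Below is a minimal Python sketch of that kind of pipeline; the forum URL, the page structure, the CSS selector, and the library choices (requests and BeautifulSoup) are assumptions made for illustration, not details taken from the study.

```python
# Minimal sketch of the scrape-and-filter workflow described above.
# The URL structure, CSS selector, and libraries are assumptions for
# illustration; the study only specifies that Python was used.
import random
import requests
from bs4 import BeautifulSoup

KEYWORDS = ["study", "school", "neet", "job", "work",
            "money", "career", "wage", "employ", "rot"]

def scrape_thread_titles(base_url, n_pages=100):
    """Collect thread titles and links from paginated forum listing pages."""
    threads = []
    for page in range(1, n_pages + 1):
        html = requests.get(f"{base_url}/page-{page}", timeout=30).text
        soup = BeautifulSoup(html, "html.parser")
        for link in soup.select("a.thread-title"):  # hypothetical selector
            threads.append({"title": link.text.strip(), "url": link.get("href")})
    return threads

def filter_by_keywords(threads, keywords=KEYWORDS):
    """Keep only threads whose titles contain at least one keyword."""
    return [t for t in threads
            if any(kw in t["title"].lower() for kw in keywords)]

# Roughly: ~10,000 scraped threads -> 516 keyword matches -> 171 sampled
all_threads = scrape_thread_titles("https://forum.example.invalid/threads")
matching = filter_by_keywords(all_threads)
sample = random.sample(matching, k=min(171, len(matching)))
```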

Roughly one-quarter of the total comments consisted of political or ideological arguments promoting being NEET, with most commenters advocating minimizing one’s contributions to society as much as possible. They suggested going on welfare, for instance, to “take back” from society, or declared they should be exempt from paying any taxes, as “compensation for our suffering.” About 25 percent—a vocal minority—pushed back on glorifying the NEET lifestyle and offered concrete suggestions for self-improvement. (“Go outside and try at least,” one user commented.)

Such pushback often led to boundary policing. Those who do pursue jobs or education run the risk of being dubbed “fakecels” and becoming alienated from the rest of the incel community. (“Don’t work for a society that hates you,” one user commented.) “There’s a lot of social psychological research on groupthink and group polarization that is relevant here,” said Beckett-Herbert. “A lot of these young men may not have friends in their real life. This community is often their one source of social connection. So the incel ideology becomes core to their identity: ‘I’m part of this community, and we don’t work. We are subhumans.'”

There were also frequent laments about being discriminated against for not being attractive (“lookism”), both romantically and professionally, as well as deep resentment of women’s increased presence in the workplace, deemed a threat to men’s own success. “They love to cherry-pick all these findings from psychology research [to support their position],” said Beckett-Herbert. For instance, “There is evidence that men who are short or not conventionally attractive are discriminated against in hiring. But there’s also a lot of evidence suggesting that this actually affects women more. Women who are overweight face a greater bias against them in hiring than men do, for example.”

Beckett-Herbert and Shor also found that about 15 percent of the comments in their sample concerned users’ experiences being harassed or bullied (usually by other men), their mental health challenges (anxiety, depression), and feeling estranged or ostracized at school or work—experiences that cemented their reluctance to work or engage in education or vocational training.

Many of these users also mentioned being autistic, in keeping with prior research showing a relatively high share of people with autism in incel communities. The authors were careful to clarify, however, that most people with autism “are not violent or hateful, nor do they identify as incels or hold explicitly misogynistic views,” they wrote. “Rather, autism, when combined with other mental health issues such as depression, anxiety, and hopelessness, may make young men more vulnerable to incel ideologies.”

There are always caveats. In this case, the study was limited to a single incel forum, which might not be broadly representative of similar discussions on other platforms. And there could be a bit of selection bias at play: not every forum member actively participates in discussion threads (lurkers), and non-NEET incels might be less likely to do so, either because they have less free time or because they don’t wish to be dismissed as “fakecels.” However, Beckett-Herbert and Shor note that their findings are consistent with previous studies suggesting there is a disproportionately large number of NEETs within the incel community.

A pound of prevention

Is effective intervention even possible for members of the incel community, given their online echo chamber? Beckett-Herbert acknowledges that it is very difficult to break through to such people. “De-radicalization is a noble, worthy line of research,” she said. “But the existing evidence from that field of study suggests that prevention is easier and more effective than trying to pull these people out once they’re already in.” Potential strategies might include fostering better digital and media literacy, i.e., teaching kids to be cognizant of the content they’re consuming online. Exposure time is another key issue.

“A lot of these young people don’t have healthy outlets that are not in the digital world,” said Beckett-Herbert. “They come home from school and spend hours and hours online. They’re lonely and isolated from real-world communities and structures. Some of these harmful ideologies might be downstream of these larger root causes. How can we help boys do better in school, feel better prepared for the labor market? How can we help them make more friends? How can we get them involved in real-world activities that will diminish their time spent online? I think that that can go a long way. Just condemning them or banning their spaces—that’s not a good long-term solution.”

While there are multiple well-publicized instances of self-identified incels committing violent acts—most notably Elliot Rodger, who killed six people in 2014—Beckett-Herbert emphasizes not losing sight of incels’ fundamental humanity. “We focus a lot on the misogyny, the potential for violence against women, and that is so important,” she said. “You will not hear me saying we should not focus on that. But we also should note that statistically, an incel is much more likely to commit suicide or be violent towards themselves than they are toward someone else. You can both condemn their ideology and find it abhorrent and also remember that we need to have empathy for these people.”

Many people—women especially—might find that a tall order, and Beckett-Herbert understands that reluctance. “I do understand people’s hesitancy to empathize with them, because it feels like you’re giving credence to their rhetoric,” she said. “But at the end of the day, they are human, and a lot of them are really struggling, marginalized people coming from pretty sad backgrounds. When you peruse their online world, it’s the most horrifying, angering misogyny right next to some of the saddest mental health, suicidal, low self-esteem stuff you’ve ever seen. I think humanizing them and having empathy is going to be foundational to any intervention efforts to reintegrate them. But it’s something I wrestle with a lot.”

Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

Why incels take the “Blackpill”—and why we should care Read More »

testing-a-robot-that-could-drill-into-europa-and-enceladus

Testing a robot that could drill into Europa and Enceladus


We don’t currently have a mission to put it on, but NASA is making sure it’s ready.

Geysers on Saturn’s moon Enceladus Credit: NASA

Scientists have concluded that Europa and Enceladus, moons of Jupiter and Saturn respectively, both harbor liquid water oceans beneath their outer icy shells. The Europa Clipper mission should reach Europa around April of 2030. If it collects data hinting at the moon’s potential habitability, robotic lander missions could be the only way to confirm whether there’s really life in there.

To make these lander missions happen, NASA’s Jet Propulsion Laboratory team has been working on a robot that could handle the search for life and already tested it on the Matanuska Glacier in Alaska. “At this point this is a pretty mature concept,” says Kevin Hand, a planetary scientist at JPL who led this effort.

Into the unknown

There are only a few things we know for sure about conditions on the surface of Europa, and nearly all of them don’t bode well for lander missions. First, Europa is exposed to very harsh radiation, which is a problem for electronics. Second, because of the geometry of Europa’s orbit around Jupiter, the window of visibility—when a potential robotic lander could contact Earth—lasts less than half of the 85 hours it takes the moon to complete its day-night cycle. So, for more than half the mission, the robot would need to fend for itself, with no human ground teams to get it out of trouble. The lander would also need to run on non-rechargeable batteries, because the vast distance to the Sun would make solar panels prohibitively massive.

And that’s just the beginning. Unlike on Mars, we don’t have any permanent orbiters around Europa that could provide a communication infrastructure, and we don’t have high-resolution imagery of the surface, which would make the landing particularly tricky. “We don’t know what Europa’s surface looks like at the centimeter to meter scale. Even with the Europa Clipper imagery, the highest resolution will be about half a meter per pixel across a few select regions,” Hand explains.

Because Europa has an extremely thin atmosphere that doesn’t provide any insulation, temperatures on top of its ice shell are estimated to vary between minus 160° Celsius at the daytime maximum and minus 220° C at night, which means the ice the lander would be there to sample is most likely hard as concrete. In building their robot, Hand’s team had to figure out a design that could deal with all of these issues.

The work on the robotic system for the Europa lander mission began more than 10 years ago. Back then, the 2013–2022 decadal strategy for planetary science cited the Europa Clipper as the second-highest priority large-scale planetary mission, so a lander seemed like a natural follow-up.

Autonomy and ice drilling

The robot developed by Hand’s team has legs that enable it to stabilize itself on various types of surfaces, from rock-hard ice to loose, soft snow. To orient itself in the environment, it uses a stereoscopic camera with an LED light source for illumination hooked to computer-vision algorithms—a system similar to the one currently used by the Perseverance rover on Mars. “Stereoscopic cameras can triangulate points in an image and build a digital surface topography model,” explains Joseph Bowkett, a JPL researcher and engineer who worked on the robot’s design.
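The article doesn’t detail the vision pipeline beyond that description, but the core idea of stereoscopic ranging is that, for a calibrated and rectified camera pair, depth follows directly from the per-pixel disparity between the left and right images. Here is a minimal Python sketch using OpenCV’s block matcher; the camera parameters and algorithm choice are illustrative assumptions rather than the actual JPL implementation.

```python
# Rough sketch of stereo depth estimation:
# depth = focal length (px) * baseline (m) / disparity (px).
# Parameters and algorithm choice are assumptions, not the JPL system.
import cv2
import numpy as np

def depth_map_from_stereo(left_gray, right_gray, focal_px, baseline_m):
    """Compute an approximate per-pixel depth map from a rectified stereo pair."""
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan        # unmatched or invalid pixels
    return focal_px * baseline_m / disparity  # depth in meters

# Back-projecting each pixel of the depth map to an (x, y, z) point yields the
# kind of digital surface topography model used to plan arm and drill motions.
```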

The team built an entirely new robotic arm with seven degrees of freedom. Force torque sensors installed in most of its joints act a bit like a nervous system, informing the robot when key components sustain excessive loads to prevent it from damaging the arm or the drill. “As we press down on the surface [and] conduct drilling and sampling, we can measure the forces and react accordingly,” Bowkett says. The finishing touch was the ICEPICK, a drilling and sampling tool the robot uses to excavate samples from the ice up to 20 centimeters deep.

Because of the long periods the lander would need to operate without any human supervision, the team also gave it a wide range of autonomous systems, which operate at two different levels. High-level autonomy is responsible for scheduling and prioritizing tasks within a limited energy budget. The robot can drill into a sampling site, analyze samples with onboard instruments, and decide whether it makes sense to keep drilling at the same spot or to choose a different sampling site. The high-level system is also tasked with choosing the most important results for downlink back to Earth.

Low-level autonomy breaks all these high-level tasks down into step-by-step decisions on how to operate the drill and how to move the arm in the safest and most energy-efficient way.
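As a rough illustration of what budget-aware, two-level autonomy can look like in software, here is a hypothetical Python sketch: a high-level planner greedily selects the tasks with the best expected science return per unit of energy, while a low-level layer would then translate each selected task into individual arm and drill motions. The task names, energy costs, and scoring rule are invented for illustration and do not reflect the actual JPL flight software.

```python
# Hypothetical sketch of budget-aware task planning; all values are invented.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    energy_wh: float       # estimated energy cost in watt-hours
    science_value: float   # expected value of the resulting data

def plan_activities(tasks, energy_budget_wh):
    """High-level autonomy: greedily pick the most valuable tasks per watt-hour."""
    plan, remaining = [], energy_budget_wh
    ranked = sorted(tasks, key=lambda t: t.science_value / t.energy_wh, reverse=True)
    for task in ranked:
        if task.energy_wh <= remaining:
            plan.append(task)
            remaining -= task.energy_wh
    return plan

def execute(task):
    """Stand-in for low-level autonomy, which would break each task into arm and
    drill motions while watching the force-torque sensors for excessive loads."""
    print(f"executing {task.name} ({task.energy_wh} Wh)")

candidates = [
    Task("drill_site_A_5cm", 40, 3.0),
    Task("analyze_sample_A", 25, 5.0),
    Task("drill_site_A_deeper", 60, 4.0),
    Task("downlink_best_results", 15, 6.0),
]
for task in plan_activities(candidates, energy_budget_wh=100):
    execute(task)
```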

The robot was tested in simulation software first, then indoors at JPL’s facilities, and finally at the Matanuska Glacier in Alaska, where it was lowered from a helicopter that acted as a proxy for a landing vehicle. It was tested at three different sites, ranked from the easiest to the most challenging, and it completed all of the baseline activities as well as all of the extras. The latter included tasks like drilling 27 centimeters deep into ice at the most difficult site, where the robot was awkwardly positioned on an eight- to 12-degree slope. It passed all the tests with flying colors.

And then it got shelved.

Switching the ocean worlds

Hand’s team put their Europa landing robot through the Alaskan field test campaign between July and August 2022. But when the new decadal strategy for planetary science came out in 2023, it turned out that the Europa lander was not among the missions selected. The National Academies committee responsible for formulating these decadal strategies did not recommend giving it a go, mainly because they believed harsh radiation in the Jovian system would make detecting biosignatures “challenging” for a lander.

An Enceladus lander, on the other hand, remained firmly on the table. “I was also on the team developing EELS, a robot intended for a potential Enceladus mission, so thankfully I can speak about both. The radiation challenges are indeed far greater for Europa,” Bowkett says.

Another argument for changing our go-to ocean world is that water plumes containing salts along with carbon- and nitrogen-bearing molecules have already been observed on Enceladus, which means there is a slight chance biosignatures could be detected by a flyby mission. The surface of Enceladus, according to the decadal strategy document, should be capable of preserving biogenic evidence for a long time and seems more conducive to a lander mission. “Luckily, many of the lessons on how to conduct autonomous sampling on Europa, we believe, will transfer to Enceladus, with the benefit of a less damaging radiation environment,” Bowkett told Ars.

The dream of a Europa landing is not completely dead, though. “I would love to get into Europa’s ocean with a submersible and further down to the seafloor. I would love for that to happen,” Hand says. “But technologically it’s quite a big leap, and you always have to balance your dream missions with the number of technological miracles that need to be solved to make these missions possible.”

Science Robotics, 2025. DOI: 10.1126/scirobotics.adi5582

Jacek Krywko is a freelance science and technology writer who covers space exploration, artificial intelligence research, computer science, and all sorts of engineering wizardry.

Testing a robot that could drill into Europa and Enceladus Read More »