This led a Mexican-European research collaboration to search for DNA from elsewhere in the Columbian mammoth’s range, which extended down into Central America. The researchers focused on the Basin of Mexico, well south of where any woolly mammoths were likely to be found. Although warmer climates generally degrade DNA more quickly, the team had a couple of things working in its favor. To begin with, there were a lot of bones. The Basin of Mexico has been heavily built up over the centuries, and many mammoth remains have been discovered there, including over 100 individuals during the construction of Mexico City’s international airport.
In addition, the team focused entirely on the mitochondrial genome. In contrast to the two sets of chromosomes in each cell, a typical cell might have hundreds of mitochondria, each of which could carry dozens of copies of its genome. So, while the much smaller mitochondrial genome doesn’t provide as much detail about ancestry, it’s at least likely to survive at high enough levels to provide something to work with.
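The copy-number advantage described above can be made concrete with back-of-envelope arithmetic. The per-cell counts below are assumed midpoints of the article’s “hundreds” and “dozens,” not measured values:

```python
# Rough mitochondrial vs. nuclear genome copy-number comparison per cell.
# Counts are assumed midpoints of the article's "hundreds" and "dozens".
nuclear_copies = 2                # two sets of chromosomes per cell
mitochondria_per_cell = 300       # "hundreds" of mitochondria (assumed)
genomes_per_mitochondrion = 24    # "dozens" of genome copies (assumed)

mt_copies = mitochondria_per_cell * genomes_per_mitochondrion
print(f"{mt_copies} mitochondrial genome copies per cell")
print(f"~{mt_copies // nuclear_copies}x more than nuclear copies")
```

Even with much more pessimistic assumptions, mitochondrial sequences outnumber any given nuclear sequence by orders of magnitude, which is why some usually survive when nuclear DNA is too degraded to recover.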
And indeed they did. Altogether, the researchers obtained 61 new mitochondrial genomes from the 83 Mexican mammoth samples they tested. Of these, 28 were of high enough quality for analysis.
Off on their own
By building a family tree using this genetic data, along with that from other Columbian and woolly mammoth samples, the researchers could potentially determine how different populations were related. And one thing became very clear almost immediately: They were in a very weird location on that tree.
To begin with, all of them clustered together in a single block, although there were three distinct groupings within that block. But the placement of that block within the larger family tree was notably strange. For one thing, there were woolly mammoths on either side of it, suggesting the lineage was an offshoot of woolly mammoths. That would make sense if all Columbian mammoths clustered together with the Mexican ones. But they don’t. Some Columbian mammoths from much farther north are actually more closely related to woolly mammoths than they are to the Mexican mammoths.
“Once a species enters, they never leave,” interior secretary says. But there’s more to the story.
A female northern spotted owl catches a mouse on a stick at the Hoopa Valley Tribe on the Hoopa Valley Reservation on Aug. 28, 2024. Credit: The Washington Post/Getty Images
“You can check out any time you like, but you can never leave.”
It’s the ominous slogan for “Hotel California,” an iconic fictional lodging dreamed up by the Eagles in 1976. One of the rock band’s lead singers, Don Henley, said in an interview that the song and place “can have a million interpretations.”
For US Interior Secretary Doug Burgum, what comes to mind is a key part of one of the country’s most central conservation laws.
“The Endangered Species List has become like the Hotel California: once a species enters, they never leave,” Burgum wrote in an April post on X. He’s referring to the roster of more than 1,600 species of imperiled plants and animals that receive protections from the federal government under the Endangered Species Act to prevent their extinctions. “In fact, 97 percent of species that are added to the endangered list remain there. This is because the status quo is focused on regulation more than innovation.”
US Secretary of the Interior Doug Burgum speaks during a press conference on Aug. 11, 2025. Credit: Yasin Ozturk/Anadolu via Getty Images
Since January, the Endangered Species Act has been a frequent target of the Trump administration, which claims that the law’s strict regulations inhibit development and “energy domination.” Several recent executive orders direct the federal government to change ESA regulations in a way that could enable businesses—fossil fuel firms in particular—to bypass the typical environmental reviews associated with project approval.
More broadly, though, Burgum and other conservative politicians are implying the law is ineffective at achieving its main goal: recovering biodiversity. But a number of biologists, environmental groups, and legal experts say that recovery delays for endangered species are not a result of the law itself.
Instead, they point to systemically low conservation funding and long-standing political flip-flopping as wildlife faces mounting threats from climate change and widespread habitat loss.
“We continue to wait until species are in dire straits before we protect them under the Endangered Species Act,” said David Wilcove, a professor of ecology, evolutionary biology, and public affairs at Princeton University, “and in doing that, we are more or less ensuring that it’s going to be very difficult to recover them and get them off the list.”
Endangered species by the numbers
Since the Endangered Species Act was enacted in 1973, the US Fish and Wildlife Service and the National Oceanic and Atmospheric Administration have listed more than 2,370 species of plants and animals as threatened or endangered—from schoolbus-sized North Atlantic right whales off the East Coast to tiny Oahu tree snails in Hawaii. In some cases, the list covers biodiversity abroad to prevent further harm from the global wildlife trade.
Once a plant or animal is added, it receives certain protections by the federal government to stanch population losses. Those measures include safeguards from adverse effects of federal activities, restrictions on hunting or development, and active conservation plans like seed planting or captive rearing of animals.
Despite these steps, only 54 of the several thousand species listed from 1973 to 2021 recovered to the point where they no longer needed protection. A number of factors play into this low recovery rate, according to a 2022 study.
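Those figures imply a recovery rate of roughly 2 percent, consistent with the “97 percent remain” statistic Burgum quoted. A quick check using the numbers cited in this article:

```python
# Recovery and extinction rates computed from figures quoted in the article.
listed = 2370      # species listed as threatened or endangered since 1973
recovered = 54     # delisted due to recovery, 1973-2021
extinct = 26       # listed species that went extinct after listing

recovery_pct = recovered / listed * 100
extinction_pct = extinct / listed * 100
print(f"recovered and delisted: {recovery_pct:.1f}%")
print(f"went extinct while listed: {extinction_pct:.1f}%")
```

Note the asymmetry the experts emphasize: the share of listed species lost to extinction is even smaller than the share that recovered.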
The team of researchers who worked on it dove into the population sizes for species of concern, the timelines of their listings, and recovery efforts.
A few trends emerged: Most of the imperiled plants and animals in the US do not receive protections until their populations have fallen to “dangerously low levels,” with less genetic diversity and more vulnerability to extinction from extreme events like severe weather or disease outbreaks.
Additionally, the process to get a species listed frequently took several years, allowing time for populations to dip even lower, said Wilcove, a co-author of the study.
“It’s simply a biological fact that if you don’t start protecting a species until it’s down to a small number of individuals, you’re going to face a long uphill battle,” he said. On top of that, “there are more species in trouble, but at the same time, we are providing less funding on a per-species basis for the Fish and Wildlife Service, so we’re basically asking them to do more and more with less and less.”
These findings echo a similar paper Wilcove co-authored in 1993. Since that analysis was published, the number of listings has risen, while federal funding per species has dropped substantially. “Hotel California” isn’t the right analogy for the endangered species list, in Wilcove’s view: He says it’s more akin to “the critical care unit of the hospital”—one that is struggling to stay afloat.
“It’s as though you built a great hospital and then didn’t pay any money for medical equipment or doctors,” he said. “The hospital isn’t going to work.”
Even so, the law has prevented a lot of deaths, experts say. Since it was passed, just 26 listed species have gone extinct, many of which had not been seen in the wild for years prior to their listing. An estimated 47 species have perished while being considered for listing, still exposed to the threats that reduced their populations in the first place, according to an analysis by High Country News. Some listing decisions take more than a decade.
“I think the marquee statistic is how few animals have gone extinct under the watch of the federal government,” said Andrew Mergen, the director of Harvard Law School’s Emmett Environmental Law and Policy Clinic. He spent more than 30 years serving as legal counsel in the US Department of Justice, where he litigated a bevy of cases related to the Endangered Species Act.
“Our goal should be to get them off the list and to recover them, but it requires a commitment to this enterprise that we don’t see very often,” Mergen said.
History shows it can be done. Bald eagles—widely considered an emblem of American patriotism—nearly disappeared in the 1960s, with just 417 known nesting pairs left in the lower 48 states. This was largely due to habitat loss and the pesticide DDT, which caused eagle eggshells to become too brittle to survive incubation. By the time the bald eagle was listed as threatened or endangered in all lower 48 states in 1978, DDT had been outlawed, a regulation that the ESA helped enforce, experts say.
A bald eagle flies over the Massapequa Preserve on March 25, 2025 in Massapequa, New York. Credit: Bruce Bennett/Getty Images
This step, along with captive breeding programs, reintroduction efforts, law enforcement, and habitat protection, helped recover populations to nearly 10,000 nesting pairs. In 2007, bald eagles came off the list. Other once-endangered animals like American alligators and Steller sea lions have also been delisted in recent decades due to targeted limits on actions that led to their decline, such as hunting.
Recovery gets trickier when threats to species are more multifaceted, according to Taal Levi, an associate professor at Oregon State University.
“The other class of species with complex, multicausal, or poorly understood threats can be like Hotel California,” Levi said over email. “This is in part because we don’t always have funding to research the threats, and if we identify them, we don’t always have funding to mitigate the threats.”
That is particularly true for the primary driver of biodiversity decline: habitat loss. Levi studies the endangered Humboldt marten, a small carnivore that lives on the Northern California and Southern Oregon coast. The animal was once widespread, but logging in old-growth and coniferous forests decimated its habitat. Now, Levi said, it is difficult to fund research that helps unveil basic things about the animals, including what constitutes high-quality habitat. Other animals, like endangered Florida panthers, also struggle to maintain high populations in environments fragmented by urbanization.
“Sometimes being in Hotel California isn’t the worst thing,” Levi wrote in his email. “We’d prefer that Florida Panthers expand into other available habitat to the North of South Florida, but in lieu of that, maintaining them on the ESA seems wise to prevent their extinction.”
The private lands predicament
The federal government manages around 640 million acres of public lands and more than 3.4 million square nautical miles of ocean, and it has final say on how endangered species are protected within these areas. However, more than two-thirds of species listed under the Endangered Species Act depend at least in part on private lands, with 10 percent residing only on such property.
The law prohibits any action that would harm a listed species wherever it might be, even if unintentionally. There is also a provision that enables the government to designate certain “critical habitat” areas that are crucial for a species’ survival, including on private land.
As a result, landowners and businesses often see endangered species as a detriment to their operations, said Jonathan Adler, an environmental law professor at William & Mary Law School in Virginia.
“Your ability to use that land is going to be limited, and you can be prosecuted… That creates a lot of conflict, and it discourages landowners from being cooperative,” he said. Adler published a paper in 2024 that argued the Endangered Species Act has been largely ineffective at conserving species, mainly due to the private land problem.
In some cases, this dynamic can create what Adler calls “perverse incentives” for landowners to destroy a habitat before a species is found on their land or listed to avoid any restrictions or costs associated with the endangered label.
Take the red-cockaded woodpecker, which typically relies on old-growth pine trees for nesting. The bird was part of the first cohort listed as endangered under the act, which limited timber production in many areas of North Carolina. However, a 2007 analysis of timber harvests from 1984 to 1990 found that the closer a timber plot was to red-cockaded woodpeckers, the more likely the pines were to be harvested at a young age, most likely to prevent the trees from reaching maturity and to avoid critical habitat regulation altogether.
Adler argues that the ESA in its current form has too many sticks and not enough carrots. Over the years, Congress has implemented a few strategies to incentivize biodiversity protection on private lands, including tax benefits and conservation easements. An easement is a voluntary legal agreement in which landowners receive compensation for a portion of their land while still owning it, in exchange for accepting certain restrictions, such as limiting development or following sustainable farming practices. Environmental groups often purchase conservation easements as well.
This strategy has helped protect the California tiger salamander, San Joaquin kit fox, waterfowl, and other imperiled species. However, providing incentives to landowners for conservation is becoming less common under the Trump administration, Princeton’s Wilcove said.
The Department of the Interior did not respond to requests for comment.
“You shouldn’t reduce the prohibition on harming endangered species, but you should make it easier for landowners to do the right thing, and there are ways for doing that, and this administration is not a champion of those ways,” Wilcove said. “We’re waiting too long to protect species, and when we get around to protecting them, we’re not giving the government sufficient resources to do the job.”
Is the Endangered Species Act itself endangered?
The Endangered Species Act was passed with wide bipartisan support. But it has become one of the most highly litigated environmental laws in the US, in part because anyone can petition to have a species listed as endangered.
A number of conservative presidential administrations and members of Congress have tried to soften the law’s power, but more environmentally minded administrations often strengthened it once again.
“It’s been a very strong law, partly because so much of the public supports it,” said Kristen Boyles, an attorney at the nonprofit Earthjustice, which has frequently filed ESA-related lawsuits. “Whenever legislative changes have been proposed, we’ve pretty much been able to defeat those.”
But experts say things may be different this time around as the Trump administration takes a more accelerated and aggressive approach to the ESA at a time when environmentalists can’t count on the Supreme Court to push back.
Since January, the president has issued several executive orders that would allow certain fossil fuel projects to get a fast-pass trip through environmental reviews, including those that could harm endangered animals or plants. In April, the Fish and Wildlife Service proposed rescinding certain habitat protections for endangered species, effectively allowing such activities as logging and oil drilling even if they degrade the surrounding environment.
Meanwhile, the Department of the Interior and NOAA have in recent months cut funding for conservation programs and laid off many of the people responsible for carrying out the Endangered Species Act’s mandate. That includes rangers who were monitoring animals like the endangered Pacific fisher in California’s Yosemite National Park.
People observe North Atlantic right whales from a boat in Canada’s Bay of Fundy. Credit: Francois Gohier/VW Pics/Universal Images Group via Getty Images
“One thing that I would say to [Secretary Burgum] is that you have a duty to faithfully execute the law as a member of the executive branch as it was enacted by Congress,” Harvard’s Mergen said. “That’s going to mean that you should not cut all your biologists out but invest in the recovery of these species, understanding what’s putting them at risk and mitigating those harms.”
Conservation funding declined long before Trump entered office, so there is “plenty of blame to go around,” Wilcove said. But political flip-flopping on how recovery projects are carried out inhibits their effectiveness, he added. “If you’re lurching between administrations that care and administrations that are hostile, it’s going to be very hard to make progress.”
For all the discussion about the economic costs of endangered species regulations, studies show that funding biodiversity protection has a strong return on investment for society.
For instance, coastal mangroves around the world reduce property damage from storms by more than $65 billion annually and protect more than 15 million people, according to 2020 research. The Fish and Wildlife Service estimates that insect crop pollination equates to $34 billion in value each year.
Protecting vulnerable animals can also benefit industries that depend on healthy landscapes and oceans. Researchers estimated in 2007 that protecting water flow in the Rio Grande in Texas for the endangered Rio Grande silvery minnow produces average annual benefits of over $200,000 for west Texas agriculture and over $1 million for El Paso municipal and industrial water users.
Endangered species can be a boon for the outdoor tourism industry, too. NOAA Fisheries estimates that the endangered North Atlantic right whale generated $2.3 billion in sales in the whale-watching industry and across the broader economy in 2008 alone, compared to annual costs of about $30 million related to shipping and fishing restrictions protecting them.
Beyond financial gains, humanity has pulled a wealth of knowledge from nature to help treat and cure diseases. For example, the anti-cancer compound paclitaxel was originally extracted from the bark of the Pacific yew tree and is “too fiendishly complex” a chemical structure for researchers to have invented on their own, according to the federal government.
Preventing endangered species from going extinct ensures that we can someday still discover what we don’t yet know, according to Dave Owen, an environmental law professor at the University of California Law, San Francisco.
“Even seemingly simple species are extraordinarily complex; they contain an incredible variety of chemicals, microbes, and genetic adaptations, all of which we can learn from—but only if the species is still around,” he said over email.
Last month, the Fish and Wildlife Service announced that the Roanoke logperch—a freshwater fish—has recovered enough to be removed from the endangered species list altogether.
In a post on X, the Interior secretary declared this is “proof that the Endangered Species List is no longer Hotel California. Under the Trump admin, species can finally leave!”
But this striped fish’s recovery didn’t happen overnight. Federal agencies, local partners, landowners, and conservationists spent more than three decades, millions of dollars, and countless hours removing obsolete dams, restoring wetlands, and reintroducing fish populations to help pull the Roanoke logperch back from the brink. And it was the Biden administration that first proposed delisting the fish in 2024.
These types of success stories give reasons for hope, Wilcove said.
“What I’m optimistic about is our ability to save species, if we put our mind and our resources to it.”
This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy, and the environment.
And we have known for sure that the armor was around back then, given that we’ve found the skin-derived osteoderms that comprise the armor in Jurassic deposits. But with little more than a rib and a handful of mouth parts to go on, it wasn’t really possible to say much more than that.
Until now, that is. Because the new Spicomellus remains show extremely clearly that the armor of ankylosaurs got less elaborate over time.
The small, solid-looking spikes found along the edges of later ankylosaurs? Forget those. Spicomellus had a back that was probably bristling with sharper spines, along with far larger ones along its outer edges. Each rib appears to have generated as many as six individual spikes. At a handful of locations, these spikes extended out to nearly a meter, looking more like lances than anything needed to ward off a close-in attack.
And the largest of these were along its neck. On the upper surface of the neck, several osteoderms fused into a massive half-collar of bone from which five or more individual spikes extended, each among the longest on the animal’s body. And there were three of these structures along the neck. “No known ankylosaur possesses any condition close to the extremely long pairs of spines on the cervical half-ring of Spicomellus,” its discoverers note.
As if its hedgehog-on-acid appearance weren’t enough, handles present on the tail vertebrae suggest that it also had a weaponized tail. All told, the researchers sum things up by saying, “The new specimen reveals extreme dermal armour modifications unlike those of any other vertebrate, extinct or extant, which fall far outside of the range of morphologies shown by other armoured dinosaurs.”
Out go the hypotheses
Because it’s so unusual, the skeleton’s characteristics are difficult to place within a neat family tree of the ankylosaurs. The researchers find that some details of the skeleton do suggest Spicomellus groups among the ankylosaurs, and they conclude that it’s probably an early branch from the main lineage. But without any other significant examples from the lineage at that time, it’s an extremely tentative conclusion. Still, the alternative is that this thing is unrelated to the only other organisms that share at least a few of its bizarre features, which is a difficult idea to swallow.
The ship made it all the way through reentry, turned to a horizontal position to descend through scattered clouds, then relit three of its engines to flip back to a vertical orientation for the final braking maneuver before splashdown.
Things to improve on
There are several takeaways from Tuesday’s flight that will require some improvements to Starship, but these are more akin to what officials might expect from a rocket test program and not the catastrophic failures of the ship that occurred earlier this year.
One of the Super Heavy booster’s 33 engines prematurely shut down during ascent. This has happened before, and while it didn’t affect the booster’s overall performance, engineers will investigate the failure to try to improve the reliability of SpaceX’s Raptor engines, each of which can generate more than a half-million pounds of thrust.
Later in the flight, cameras pointed at one of the ship’s rear flaps showed structural damage to the back of the wing. It wasn’t clear what caused the damage, but super-heated plasma burned through part of the flap as the ship fell deeper into the atmosphere. Still, the flap remained largely intact and was able to help control the vehicle through reentry and splashdown.
“We’re kind of being mean to this Starship a little bit,” Huot said on SpaceX’s live webcast. “We’re really trying to put it through the paces and kind of poke on what some of its weak points are.”
Small chunks of debris were also visible peeling off the ship during reentry. The origin of the glowing debris wasn’t immediately clear, but it may have been parts of the ship’s heat shield tiles. On this flight, SpaceX tested several different tile designs, including ceramic and metallic materials, and one tile design that uses “active cooling” to help dissipate heat during reentry.
A bright flash inside the ship’s engine bay during reentry also appeared to damage the vehicle’s aft skirt, the stainless steel structure that encircles the rocket’s six main engines.
“That’s not what we want to see,” Huot said. “We just saw some of the aft skirt just take a hit. So we’ve got some visible damage on the aft skirt. We’re continuing to reenter, though. We are intentionally stressing the ship as we go through this, so it is not guaranteed to be a smooth ride down to the Indian Ocean.
“We’ve removed a bunch of tiles in kind of critical places across the vehicle, so seeing stuff like that is still valuable to us,” he said. “We are trying to kind of push this vehicle to the limits to learn what its limits are as we design our next version of Starship.”
Shana Diez, a Starship engineer at SpaceX, perhaps summed up Tuesday’s results best on X: “It’s not been an easy year but we finally got the reentry data that’s so critical to Starship. It feels good to be back!”
Collapsing gas clouds in the early universe may have formed lower-mass stars as well.
Stars form in the universe from massive clouds of gas. Credit: European Southern Observatory, CC BY-SA
For decades, astronomers have wondered what the very first stars in the universe were like. These stars formed new chemical elements, which enriched the universe and allowed the next generations of stars to form the first planets.
The first stars were initially composed of pure hydrogen and helium, and they were massive—hundreds to thousands of times the mass of the Sun and millions of times more luminous. Their short lives ended in enormous explosions called supernovae, so they had neither the time nor the raw materials to form planets, and they should no longer exist for astronomers to observe.
At least that’s what we thought.
Two studies published in the first half of 2025 suggest that collapsing gas clouds in the early universe may have formed lower-mass stars as well. One study uses a new astrophysical computer simulation that models turbulence within the cloud, causing fragmentation into smaller, star-forming clumps. The other study—an independent laboratory experiment—demonstrates how molecular hydrogen, a molecule essential for star formation, may have formed earlier and in larger abundances. The process involves a catalyst that may surprise chemistry teachers.
As an astronomer who studies star and planet formation and their dependence on chemical processes, I am excited at the possibility that chemistry in the first 50 million to 100 million years after the Big Bang may have been more active than we expected.
These findings suggest that the second generation of stars—the oldest stars we can currently observe and possibly the hosts of the first planets—may have formed earlier than astronomers thought.
Primordial star formation
Video illustration of the star and planet formation process. Credit: Space Telescope Science Institute.
Stars form when massive clouds of hydrogen many light-years across collapse under their own gravity. The collapse continues until a luminous sphere surrounds a dense core that is hot enough to sustain nuclear fusion.
Nuclear fusion happens when two or more atomic nuclei gain enough energy to fuse together. This process creates a new element and releases an incredible amount of energy, which heats the stellar core. In the first stars, hydrogen nuclei fused together to create helium.
The new star shines because its surface is hot, but the energy fueling that luminosity percolates up from its core. The luminosity of a star is its total energy output in the form of light. The star’s brightness is the small fraction of that luminosity that we directly observe.
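The gap between a star’s luminosity and the brightness we observe follows the inverse-square law: the flux we receive falls off with the square of the distance. A minimal sketch, using standard reference values for the Sun:

```python
import math

def apparent_flux(luminosity_watts, distance_m):
    """Flux received at distance d from a source of total luminosity L,
    via the inverse-square law: F = L / (4 * pi * d**2)."""
    return luminosity_watts / (4 * math.pi * distance_m ** 2)

L_SUN = 3.828e26   # solar luminosity in watts (standard reference value)
AU = 1.496e11      # Earth-Sun distance in meters

flux_at_earth = apparent_flux(L_SUN, AU)
print(f"{flux_at_earth:.0f} W/m^2")  # ~1361 W/m^2, the solar constant
```

The same function shows why distant stars are so faint: move the Sun to even the nearest stellar distances and the received flux drops by a factor of tens of billions.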
This process where stars form heavier elements by nuclear fusion is called stellar nucleosynthesis. It continues in stars after they form as their physical properties slowly change. The more massive stars can produce heavier elements such as carbon, oxygen, and nitrogen, all the way up to iron, in a sequence of fusion reactions that end in a supernova explosion.
Supernovae can create even heavier elements, completing the periodic table of elements. Lower-mass stars like the Sun, with their cooler cores, can sustain fusion only up to carbon. As they exhaust the hydrogen and helium in their cores, nuclear fusion stops, and the stars slowly evaporate.
The remnant of a high-mass star supernova explosion imaged by the Chandra X-ray Observatory, left, and the remnant of a low-mass star evaporating in a blue bubble, right. Credit: CC BY 4.0
High-mass stars have high pressure and temperature in their cores, so they burn bright and use up their gaseous fuel quickly. They last only a few million years, whereas low-mass stars—those less than two times the Sun’s mass—evolve much more slowly, with lifetimes of billions or even trillions of years.
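That tradeoff between mass and lifetime can be sketched with the textbook scaling t ≈ 10 Gyr × (M/M☉)⁻²·⁵, which assumes luminosity grows roughly as M³·⁵. This power law is only a rough approximation (it flattens for the most massive stars), but it captures the range the article describes:

```python
def main_sequence_lifetime_gyr(mass_solar):
    """Rough main-sequence lifetime in billions of years, from the
    textbook scaling t ~ 10 Gyr * (M/M_sun)**-2.5 (assumes L ~ M**3.5;
    an approximation that breaks down for the most massive stars)."""
    return 10.0 * mass_solar ** -2.5

for m in (0.1, 0.5, 1.0, 10.0):
    print(f"{m:5.1f} M_sun -> {main_sequence_lifetime_gyr(m):,.3g} Gyr")
```

A 0.1-solar-mass star comes out at trillions of years, far longer than the current age of the universe, while a 10-solar-mass star gets only a few tens of millions; that is why surviving first-generation stars are plausible only if some formed at low mass.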
If the earliest stars were all high-mass stars, then they would have exploded long ago. But if low-mass stars also formed in the early universe, they may still exist for us to observe.
Chemistry that cools clouds
The first star-forming gas clouds, called protostellar clouds, were warm—roughly room temperature. Warm gas has internal pressure that pushes outward against the inward force of gravity trying to collapse the cloud. A hot air balloon stays inflated by the same principle. If the flame heating the air at the base of the balloon stops, the air inside cools, and the balloon begins to collapse.
Only the most massive protostellar clouds with the most gravity could overcome the thermal pressure and eventually collapse. In this scenario, the first stars were all massive.
The only way to form the lower-mass stars we see today is for the protostellar clouds to cool. Gas in space cools by radiation, which transforms thermal energy into light that carries the energy out of the cloud. Hydrogen and helium atoms are not efficient radiators below several thousand degrees, but molecular hydrogen, H₂, is great at cooling gas at low temperatures.
When energized, H₂ emits infrared light, which cools the gas and lowers the internal pressure. That process would make gravitational collapse more likely in lower-mass clouds.
For decades, astronomers have reasoned that a low abundance of H₂ early on resulted in hotter clouds whose internal pressure was too high for them to easily collapse into stars. They concluded that only clouds with enormous masses, and therefore stronger gravity, would collapse, leaving only more massive stars.
Helium hydride
In a July 2025 journal article, physicist Florian Grussie and collaborators at the Max Planck Institute for Nuclear Physics demonstrated that the first molecule to form in the universe, helium hydride, HeH⁺, could have been more abundant in the early universe than previously thought. They used a computer model and conducted a laboratory experiment to verify this result.
Helium hydride? In high school science you probably learned that helium is a noble gas, meaning it does not react with other atoms to form molecules or chemical compounds. As it turns out, it does—but only under the extremely sparse and dark conditions of the early universe, before the first stars formed.
HeH⁺ reacts with hydrogen deuteride—HD, which is one normal hydrogen atom bonded to a heavier deuterium atom—to form H₂. In the process, HeH⁺ also acts as a coolant and releases heat in the form of light. So the high abundance of both molecular coolants earlier on may have allowed smaller clouds to cool faster and collapse to form lower-mass stars.
Gas flow also affects stellar initial masses
In another study, published in July 2025, astrophysicist Ke-Jung Chen led a research group at the Academia Sinica Institute of Astronomy and Astrophysics using a detailed computer simulation that modeled how gas in the early universe may have flowed.
The team’s model demonstrated that turbulence, or irregular motion, in giant collapsing gas clouds can form lower-mass cloud fragments from which lower-mass stars condense.
The study concluded that turbulence may have allowed these early gas clouds to form stars ranging from roughly the Sun’s mass up to 40 times more massive.
The galaxy NGC 1140 is small and contains large amounts of primordial gas with far fewer elements heavier than hydrogen and helium than are present in our Sun. This composition makes it similar to the intensely star-forming galaxies found in the early universe. These early universe galaxies were the building blocks for large galaxies such as the Milky Way. Credit: ESA/Hubble & NASA, CC BY-ND
The two new studies both predict that the first population of stars could have included low-mass stars. Now, it is up to us observational astronomers to find them.
This is no easy task. Low-mass stars have low luminosities, so they are extremely faint. Several observational studies have recently reported possible detections, but none are yet confirmed with high confidence. If they are out there, though, we will find them eventually.
The Conversation is an independent source of news and views, sourced from the academic and research community. Our team of editors work with these experts to share their knowledge with the wider public. Our aim is to allow for better understanding of current affairs and complex issues, and hopefully improve the quality of public discourse on them.
It’s not just you. Survey says: “Twitter sucks now and all the cool kids are moving to Bluesky”
Credit: Getty Images | Chris Delmas
Marine biologist and conservationist David Shiffman was an early power user and evangelist for science engagement on the social media platform formerly known as Twitter. Over the years, he trained more than 2,000 early career scientists on how to best use the platform for professional goals: networking with colleagues, sharing new scientific papers, and communicating with interested members of the public.
But when Elon Musk bought Twitter in 2022, renaming it X, changes to both the platform’s algorithm and moderation policy soured Shiffman on the social media site. He started looking for a viable alternative among the fledgling platforms that had begun to pop up: most notably Threads, Post, Mastodon, and Bluesky. He was among the first wave of scientists to join Bluesky and found that, even in its infancy, it had many of the features he had valued in “golden age” Twitter.
Shiffman also noticed that he wasn’t the only one in the scientific community having issues with Twitter. This impression was further bolstered by news stories in outlets like Nature, Science, and the Chronicle of Higher Education noting growing complaints about Twitter and increased migration over to Bluesky by science professionals. (Full disclosure: I joined Bluesky around the same time as Shiffman, for similar reasons: Twitter had ceased to be professionally useful, and many of the science types I’d been following were moving to Bluesky. I nuked my Twitter account in November 2024.)
A curious Shiffman decided to conduct a scientific survey, announcing the results in a new paper published in the journal Integrative and Comparative Biology. The findings confirm that, while Twitter was once the platform of choice for a majority of science communicators, those same people have since abandoned it in droves. And of the alternatives available, Bluesky seems to be their new platform of choice.
Shiffman, the author of Why Sharks Matter, described early Twitter recently on the blog Southern Fried Science as “the world’s most interesting cocktail party.”
“Then it stopped being useful,” Shiffman told Ars. “I was worried for a while that this incredibly powerful way of changing the world using expertise was gone. It’s not gone. It just moved. It’s a little different now, and it’s not as powerful as it was, but it’s not gone. It was for me personally, immensely reassuring that so many other people were having the same experience that I was. But it was also important to document that scientifically.”
Eager to gather solid data on the migration phenomenon to bolster his anecdotal observations, Shiffman turned to social scientist Julia Wester, one of the scientists who had joined Twitter at Shiffman’s encouragement years earlier before she, too, became fed up and migrated to Bluesky. Despite being “much less online” than the indefatigable Shiffman, Wester was intrigued by the proposition. “I was interested not just in the anecdotal evidence, the conversations we were having, but also in identifying the real patterns,” she told Ars. “As a social scientist, when we hear anecdotal evidence about people’s experiences, I want to know what that looks like across the population.”
Shiffman and Wester targeted scientists, science communicators, and science educators who used (or had used) both Twitter and Bluesky. Questions explored user attitudes toward, and experiences with, each platform in a professional capacity: when they joined, respective follower and post counts, which professional tasks they used each platform for, the usefulness of each platform for those purposes relative to 2021, how they first heard about Bluesky, and so forth.
The authors acknowledge that they are looking at a very specific demographic among social media users in general and that there is an inevitable self-selection effect. However, “You want to use the sample and the method that’s appropriate to the phenomenon that you’re looking at,” said Wester. “For us, it wasn’t just the experience of people using these platforms, but the phenomenon of migration. Why are people deciding to stay or move? How are they deciding to use both of these platforms? For that, I think we did get a pretty decent sample for looking at the dynamic tensions, the push and pull between staying on one platform or opting for another.”
They ended up with a final sample size of 813 people. Over 90 percent of respondents said they had used Twitter for learning about new developments in their field; 85.5 percent for professional networking; and 77.3 percent for public outreach. Roughly three-quarters of respondents said that the platform had become significantly less useful for each of those professional uses since Musk took over. Nearly half still have Twitter accounts but use it much less frequently or not at all, while about 40 percent have deleted their accounts entirely in favor of Bluesky.
Making the switch
User complaints about Twitter included a noticeable increase in spam, porn, bots, and promoted posts from users who paid for a verification badge, many spreading extremist content. “I very quickly saw material that I did not want my posts to be posted next to or associated with,” one respondent commented. There were also complaints about the rise in misinformation and a significant decline in both the quantity and quality of engagement, with respondents describing their experiences as “unpleasant,” “negative,” or “hostile.”
The survey responses also revealed a clear push/pull dynamic when it came to the choice to abandon Twitter for Bluesky. That is, people felt they were being pushed away from Twitter and were actively looking for alternatives. As one respondent put it, “Twitter started to suck and all the cool people were moving to Bluesky.”
Bluesky was user-friendly with no algorithm, a familiar format, and helpful tools like starter packs of who to follow in specific fields, which made the switch a bit easier for many newcomers daunted by the prospect of rebuilding their online audience. Bluesky users also appreciated the moderation on the platform and having the ability to block or mute people as a means of disengaging from more aggressive, unpleasant conversations. That said, “If Twitter was still great, then I don’t think there’s any combination of features that would’ve made this many people so excited about switching,” said Shiffman.
Per Shiffman and Wester, an “overwhelming majority” of respondents said that Bluesky has a “vibrant and healthy online science community,” while Twitter no longer does. And many Bluesky users reported getting more bang for their buck, so to speak: Someone with 50,000 Twitter/X followers, for example, might get five likes on a given post, while on Bluesky they may have only 5,000 followers but get 100 likes on the same post.
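Framed as an engagement rate (likes per follower), the hypothetical numbers above work out to a 200-fold difference; a quick sketch using the article’s illustrative figures, not survey data:

```python
def engagement_rate(likes, followers):
    """Likes on a post as a fraction of follower count."""
    return likes / followers

# Illustrative numbers from the example above, not measured data
twitter = engagement_rate(5, 50_000)    # 0.0001, i.e. 0.01%
bluesky = engagement_rate(100, 5_000)   # 0.02, i.e. 2%
print(f"Bluesky engagement is {bluesky / twitter:.0f}x higher")  # 200x
```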
According to Shiffman, Twitter always used to be in the top three in terms of referral traffic for posts on Southern Fried Science. Then came the “Muskification,” and suddenly Twitter referrals weren’t even cracking the top 10. By contrast, in 2025 thus far, Bluesky has driven “a hundred times as many page views” to Southern Fried Science as Twitter. Ironically, “the blog post that’s gotten the most page views from Twitter is the one about this paper,” said Shiffman.
Ars social media manager Connor McInerney confirmed that Ars Technica has also seen a steady dip in Twitter referral traffic thus far in 2025. Furthermore, “I can say anecdotally that over the summer we’ve seen our Bluesky traffic start to surpass our Twitter traffic for the first time,” McInerney said, attributing the growth to a combination of factors. “We’ve been posting to the platform more often and our audience there has grown significantly. By my estimate our audience has grown by 63 percent since January. The platform in general has grown a lot too—they had 10 million users in September of last year, and this month the latest numbers indicate they’re at 38 million users. Conversely, our Twitter audience has remained fairly static across the same period of time.”
Bubble, schmubble
As for scientists looking to share scholarly papers online, Shiffman pulled the Altmetrics stats for his and Wester’s new paper. “It’s already one of the 10 most shared papers in the history of that journal on social media,” he said, with 14 shares on Twitter/X vs over a thousand shares on Bluesky (as of 4 pm ET on August 20). “If the goal is showing there’s a more active academic scholarly conversation on Bluesky—I mean, damn,” he said.
And while there has been a steady drumbeat of op-eds of late in certain legacy media outlets accusing Bluesky of being trapped in its own liberal bubble, Shiffman, for one, has few concerns about that. “I don’t care about this, because I don’t use social media to argue with strangers about politics,” he wrote in his accompanying blog post. “I use social media to talk about fish. When I talk about fish on Bluesky, people ask me questions about fish. When I talk about fish on Twitter, people threaten to murder my family because we’re Jewish.” He described the current incarnation of Twitter as no better than 4Chan or TruthSocial in terms of the percentage of “conspiracy-prone extremists” in the audience. “Even if you want to stay, the algorithm is working against you,” he wrote.
“There have been a lot of opinion pieces about why Bluesky is not useful because the people there tend to be relatively left-leaning,” Shiffman told Ars. “I haven’t seen any of those same people say that Twitter is bad because it’s relatively right-leaning. Twitter is not a representative sample of the public either.” And given his focus on ocean conservation and science-based, data-driven environmental advocacy, he is likely to find a more engaged and persuadable audience at Bluesky.
The survey results show that at this point, Bluesky seems to have hit a critical mass for the online scientific community. That said, Shiffman, for one, laments that the powerful Black Science Twitter contingent, for example, has thus far not switched to Bluesky in significant numbers. He would like to conduct a follow-up study to look into how many still use Twitter vs those who may have left social media altogether, as well as Bluesky’s demographic diversity—paving the way for possible solutions should that data reveal an unwelcoming environment for non-white scientists.
There are certainly limitations to the present survey. “Because this is such a dynamic system and it’s changing every day, I think if we did this study now versus when we did it six months ago, we’d get slightly different answers and dynamics,” said Wester. “It’s still relevant because you can look at the factors that make people decide to stay or not on Bluesky, to switch to something else, to leave social media altogether. That can tell us something about what makes a healthy, vibrant conversation online. We’re capturing one of the responses: ‘I’ll see you on Bluesky.’ But that’s not the only response. Public science communication is as important now as it’s ever been, so looking at how scientists have pivoted is really important.”
We recently reported on research indicating that social media as a system might well be doomed, since its very structure gives rise to the toxic dynamics that plague so much of social media: filter bubbles, algorithms that amplify the most extreme views to boost engagement, and a small number of influencers hogging the lion’s share of attention. That paper concluded that any intervention strategies were likely to fail. Both Shiffman and Wester, while acknowledging the reality of those dynamics, are less pessimistic about social media’s future.
“I think the problem is not with how social media works, it’s with how any group of people work,” said Shiffman. “Humans evolved in tiny social groupings where we helped each other and looked out for each other’s interests. Now I have to have a fight with someone 10,000 miles away who has no common interest with me about whether or not vaccines are bad. We were not built for that. Social media definitely makes it a lot easier for people who are anti-social by nature and want to stir conflict to find those conflicts. Something that took me way too long to learn is that you don’t have to participate in every fight you’re invited to. There are people who are looking for a fight and you can simply say, ‘No, thank you. Not today, Satan.'”
“The contrast that people are seeing between Bluesky and present-day Twitter highlights that these are social spaces, which means that you’re going to get all of the good and bad of humanity entering into that space,” said Wester. “But we have had new social spaces evolve over our whole history. Sometimes when there’s something really new, we have to figure out the rules for that space. We’re still figuring out the rules for these social media spaces. The contrast in moderation policies and the use (or not) of algorithms between those two platforms that are otherwise very similar in structure really highlights that you can shape those social spaces by creating rules and tools for how people interact with each other.”
Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.
“The underlying issue here is whether US missile defense should remain focused on the threat from rogue states and… accidental launches, and explicitly refrain from countering missile threats from China or Russia,” DesJarlais said. He called the policy of Mutually Assured Destruction “outdated.”
President Donald Trump speaks alongside Secretary of Defense Pete Hegseth in the Oval Office at the White House on May 20, 2025, in Washington, DC. President Trump announced his plans for the Golden Dome, a national ballistic and cruise missile defense system. Credit: Chip Somodevilla/Getty Images
Moulton’s amendment on nuclear deterrence failed to pass the committee in a voice vote, as did another Moulton proposal that would have tapped the brakes on developing space-based interceptors.
But one of Moulton’s amendments did make it through the committee. This amendment, if reconciled with the Senate, would prohibit the Pentagon from developing a privatized or subscription-based missile defense intercept capability. The amendment says the US military can own and operate such a system.
Ultimately, the House Armed Services Committee voted 55–2 to send the NDAA to a vote on the House floor. Then, lawmakers must hash out the differences between the House version of the NDAA and a bill written in the Senate before sending the final text to the White House for President Trump to sign into law.
More questions than answers
The White House says the missile shield will cost $175 billion over the next three years. But that’s just to start. A network of space-based missile sensors and interceptors, as prescribed in Trump’s executive order, will eventually number thousands of satellites in low-Earth orbit. The Congressional Budget Office reported in May that the Golden Dome program may ultimately cost up to $542 billion over 20 years.
The problem with all of the Golden Dome cost estimates is that the Pentagon has not settled on an architecture. We know the system will consist of a global network of satellites with sensors to detect and track missile launches, plus numerous interceptors in orbit to take out targets in space and during their “boost phase” when they’re moving relatively slowly through the atmosphere.
The Pentagon will order more sea- and ground-based interceptors to destroy missiles, drones, and aircraft as they near their targets within the United States. All of these weapons must be interconnected with a sophisticated command and control network that doesn’t yet exist.
Will Golden Dome’s space-based interceptors use kinetic kill vehicles to physically destroy missiles targeting the United States? Or will the interceptors rely on directed energy weapons like lasers or microwave signals to disable their targets? How many interceptors are actually needed?
These are all questions without answers. Despite the lack of detail, congressional Republicans approved $25 billion for the Pentagon to get started on the Golden Dome program as part of the Trump-backed One Big Beautiful Bill Act. The bill passed Congress with a party-line vote last month.
Israel’s Iron Dome aerial defense system intercepts a rocket launched from the Gaza Strip on May 11, 2021. Credit: Jack Guez/AFP via Getty Images
Moulton earned a bachelor’s degree in physics and master’s degrees in business and public administration from Harvard University. He served as a Marine Corps platoon leader in Iraq and was part of the first company of Marines to reach Baghdad during the US invasion of 2003. Moulton ran for the Democratic presidential nomination in 2020 but withdrew from the race before the first primary contest.
The text of our interview with Moulton is published below. It is lightly edited for length and clarity.
Ars: One of your amendments that passed committee would prevent the DoD from using a subscription or pay-for-service model for the Golden Dome. What prompted you to write that amendment?
Moulton: There were some rumors we heard that this is a model that the administration was pursuing, and there was reporting in mid-April suggesting that SpaceX was partnering with Anduril and Palantir to offer this kind of subscription service where, basically, the government would pay to access the technology rather than own the system. This isn’t an attack on any of these companies or anything. It’s a reassertion of the fundamental belief that these are responsibilities of our government. The decision to engage an intercontinental ballistic missile is a decision that the government must make, not some contractors working at one of these companies.
Ars: Basically, the argument you’re making is that war-fighting should be done by the government and the armed forces, not by contractors or private companies, right?
Moulton: That’s right, and it’s a fundamental belief that I’ve had for a long time. I was completely against contractors in Iraq when I was serving there as a younger Marine, but I can’t think of a place where this is more important than when you’re talking about nuclear weapons.
Ars: One of the amendments that you proposed, but didn’t pass, was intended to reaffirm the nation’s strategy of nuclear deterrence. What was the purpose of this amendment?
Moulton: Let’s just start by saying this is fundamentally why we have to have a theory that forms a foundation for spending hundreds of billions of taxpayer dollars. Golden Dome has no clear design, no real cost estimate, and no one has explained how this protects or enhances strategic stability. And there’s a lot of evidence that it would make strategic stability worse because our adversaries would no longer have confidence in Mutual Assured Destruction, and that makes them potentially much more likely to initiate a strike or overreact quickly to some sort of confrontation that has the potential to go nuclear.
In the case of the Russians, it means they could activate their nuclear weapon in space and just take out our Golden Dome interceptors if they think we might get into a nuclear exchange. I mean, all these things are horrific consequences.
Like I said in our hearing, there are two explanations for Golden Dome. The first is that every nuclear theorist for the last 75 years was wrong, and thank God, Donald Trump came around and set us right because in his first administration and every Democratic and Republican administration, we’ve all been wrong—and really the future of nuclear deterrence is nuclear defeat through defense and not Mutually Assured Destruction.
The other explanation, of course, is that Donald Trump decided he wants the golden version of something his friend has. You can tell me which one’s more likely, but literally no one has been able to explain the theory of the case. It’s dangerous, it’s wasteful… It might be incredibly dangerous. I’m happy to be convinced that Golden Dome is the right solution. I’m happy to have people explain why this makes sense and it’s a worthwhile investment, but literally nobody has been able to do that. If the Russians attack us… we know that this system is not going to be 100 percent effective. To me, that doesn’t make a lot of sense. I don’t want to gamble on… which major city or two we lose in a scenario like that. I want to prevent a nuclear war from happening.
Several Chinese DF-5B intercontinental ballistic missiles, each capable of delivering up to 10 independently maneuverable nuclear warheads, are seen during a parade in Beijing on September 3, 2015. Credit: Xinhua/Pan Xu via Getty Images
Ars: What would be the way that an administration should propose something like the Golden Dome? Not through an executive order? What process would you like to see?
Moulton: As a result of a strategic review and backed up by a lot of serious theory and analysis. The administration proposes a new solution and has hearings about it in front of Congress, where they are unafraid of answering tough questions. This administration is a bunch of cowards who refuse to answer tough questions in Congress because they know they can’t back up their president’s proposals.
Ars: I’m actually a little surprised we haven’t seen any sort of architecture yet. It’s been six months, and the administration has already missed a few of Trump’s deadlines for selecting an architecture.
Moulton: It’s hard to develop an architecture for something that doesn’t make sense.
Ars: I’ve heard from several retired military officials who think something like the Golden Dome is a good idea, but they are disappointed in the way the Trump administration has approached it. They say the White House hasn’t stated the case for it, and that risks politicizing something they view as important for national security.
Moulton: One idea I’ve had is that the advent of directed energy weapons (such as lasers and microwave weapons) could flip the cost curve and actually make defense cheaper than offense, whereas in the past, it’s always been cheaper to develop more offensive capabilities rather than the defensive means to shoot at them.
And this is why the Anti-Ballistic Missile Treaty in the early 1970s was so effective, because there was this massive arms race where we were constantly just creating a new offensive weapon to get around whatever defenses our adversary proposed. The reason why everyone would just quickly produce a new offensive weapon before that treaty was put into place is because it was easy to do.
My point is that I’ve even thrown them this bone, and I’m saying, ‘Here, maybe that’s your reason, right?” And they just look at me dumbfounded because obviously none of them are thinking about this. They’re just trying to be lackeys for the president, and they don’t recognize how dangerous that is.
Ars: I’ve heard from a chorus of retired and even current active duty military leaders say the same thing about directed energy weapons. You essentially can use one platform in space to take numerous laser shots at a missile instead of expending multiple interceptors for one kill.
Moulton: Yes, that’s basically the theory of the case. Now, my hunch is that if you actually did the serious analysis, you would determine that it still decreases strategic stability. So in terms of the overall safety and security of the United States, whether it’s directed energy weapons or kinetic interceptors, it’s still a very bad plan.
But I’m even throwing that out there to try to help them out here. “Maybe this is how you want to make your case.” And they just look at me like deer in the headlights because, obviously, they’re not thinking about the national security of the United States.
Ars: I also wanted to ask about the Space Force’s push to develop weapons to use against other satellites in orbit. They call these counter-space capabilities. They could be using directed energy, jamming, robotic arms, anti-satellite missiles. This could take many different forms, and the Space Force, for the first time, is talking more openly about these issues. Are these kinds of weapons necessary, in your view, or are they too destabilizing?
Moulton: I certainly wish we could go back to a time when the Russians and Chinese were not developing space weapons—or were not weaponizing space, I should say, because that was the international agreement. But the reality of the world we live in today is that our adversaries are violating that agreement. We have to be prepared to defend the United States.
Ars: Are there any other space policy issues on your radar or things you have concerns about?
Moulton: There’s a lot. There’s so much going on with space, and that’s the reason I chose this subcommittee, even though people would expect me to serve on the subcommittee dealing with the Marine Corps, because I just think space is incredibly important. We’re dealing with everything from promotion policy in the Space Force to acquisition reform to rules of engagement, and anything in between. There’s an awful lot going on there, but I do think that one of the most important things to talk about right now is how dangerous the Golden Dome could be.
For many beer lovers, a nice thick head of foam is one of life’s pure pleasures, and the longer that foam lasts, the better the beer-drinking experience. A team of Swiss researchers spent seven years studying why some beer foams last longer than others and found that the degree of fermentation—i.e., whether a given beer has been singly, doubly, or triply fermented—is crucial, according to a new paper published in the journal Physics of Fluids.
As previously reported, foams are ubiquitous in everyday life, found in foods (whipped cream), beverages (beer, cappuccino), shaving cream and hair-styling mousse, packing peanuts, building insulation, flame-retardant materials, and so forth. All foams are the result of air being beaten into a liquid formula that contains some kind of surfactant (active surface agent), usually fats or proteins in edible foams, or chemical additives in non-edible products. That surfactant strengthens the liquid film walls of the bubbles to keep them from collapsing.
Individual bubbles typically form a sphere because that’s the shape with the minimum surface area for any volume and hence is the most energy-efficient. One reason for the minimizing principle when it comes to a bubble’s shape is that many bubbles can then tightly pack together to form a foam. But bubbles “coarsen” over time, the result of gravity pulling down on the liquid and thinning out the walls. Eventually, they start to look more like soccer balls (polyhedrons). In a coarsening foam, smaller bubbles are gradually absorbed by larger ones. There is less and less liquid to separate the individual bubbles, so they press together to fill the space.
This “jamming” is why foams are typically far more rigid than their gas (95 percent) and liquid (5 percent) components. The more tightly the bubbles jam together, the less they can move around and the greater the pressure inside them becomes, giving them properties of a solid.
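The claim above that a sphere is the minimum-surface shape for a given volume can be sanity-checked numerically by comparing it against a cube of equal volume (a simple illustration, not a calculation from the paper):

```python
import math

def sphere_area(volume):
    """Surface area of a sphere enclosing the given volume."""
    r = (3 * volume / (4 * math.pi)) ** (1 / 3)  # radius from volume
    return 4 * math.pi * r ** 2

def cube_area(volume):
    """Surface area of a cube enclosing the same volume."""
    s = volume ** (1 / 3)  # side length from volume
    return 6 * s ** 2

V = 1.0  # arbitrary bubble volume; the ratio is scale-independent
print(sphere_area(V) / cube_area(V))  # ~0.806: the sphere needs ~19% less surface
```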
Various factors can affect foam stability. For instance, in 2019, Japanese researchers investigated a phenomenon known as “collective bubble collapse,” or CBC, in which breaking one bubble at the edge of a foam results in a cascading effect as the breakage spreads to other bubbles in the foam. They identified two distinct mechanisms for the resulting CBCs: a so-called “propagating mode,” in which a broken bubble is absorbed into the liquid film, and a “penetrating mode,” in which the breakage of a bubble causes droplets to shoot off and hit other bubbles, causing them to break in turn.
In early June, shortly after the beginning of the Atlantic hurricane season, Google unveiled a new model designed specifically to forecast the tracks and intensity of tropical cyclones.
Part of the Google DeepMind suite of AI-based weather research models, the “Weather Lab” model for cyclones was a bit of an unknown for meteorologists at its launch. In a blog post at the time, Google said its new model, trained on a vast dataset that reconstructed past weather and a specialized database containing key information about hurricane tracks, intensity, and size, had performed well during pre-launch testing.
“Internal testing shows that our model’s predictions for cyclone track and intensity are as accurate as, and often more accurate than, current physics-based methods,” the company said.
Google said it would partner with the National Hurricane Center, an arm of the National Oceanic and Atmospheric Administration that has provided credible forecasts for decades, to assess the performance of its Weather Lab model in the Atlantic and East Pacific basins.
All eyes on Erin
It had been a relatively quiet Atlantic hurricane season until a few weeks ago, with overall activity running below normal levels. So there were no high-profile tests of the new model. But about 10 days ago, Hurricane Erin rapidly intensified in the open Atlantic Ocean, becoming a Category 5 hurricane as it tracked westward.
From a forecast standpoint, it was pretty clear that Erin was not going to directly strike the United States, but meteorologists sweat the details. And because Erin was such a large storm, we had concerns about how close Erin would get to the East Coast of the United States (close enough, it turns out, to cause some serious beach erosion) and its impacts on the small island of Bermuda in the Atlantic.
In a statement to Politico’s E&E News days after the order was lifted in May, the White House claimed that Hochul “caved” and struck an agreement to allow “two natural gas pipelines to advance” through New York.
Hochul denied that any such deal was made.
Trump has made no effort to conceal his disdain for wind power and other renewable energies, and his administration has actively sought to stymie growth in the industry while providing what critics have described as “giveaways” to fossil fuels.
In a Truth Social post on Wednesday, Trump called wind and solar energy the “SCAM OF THE CENTURY,” criticizing states that have built and rely on them for power.
“We will not approve wind or farmer destroying Solar,” Trump wrote. “The days of stupidity are over in the USA!!!”
On Trump’s first day in office, the president issued a memorandum halting approvals, permits, leases, and loans for both offshore and onshore wind projects.
The GOP also targeted wind energy in the One Big Beautiful Bill Act, accelerating the phaseout of tax credits for wind and solar projects while mandating lease sales for fossil fuels and making millions of acres of federal land available for mining.
The administration’s subsequent consideration of rules to further restrict access to tax credits for wind and solar projects alarmed even some Republicans, prompting Iowa Sen. Chuck Grassley and Utah Sen. John Curtis to place holds on Treasury nominees as they awaited the department’s formal guidance.
Those moves have rattled the wind industry and created uncertainty about the viability of ongoing and future projects.
“The unfortunate message to investors is clear: the US is no longer a reliable place for long-term energy investments,” said the American Clean Power Association, a trade association, in a statement on Friday.
To Kathleen Meil, local clean energy deployment director at the League of Conservation Voters, that represents a loss not only for the environment but also for the US economy.
“It’s really easy to think about the visible—the 4,200 jobs across all phases of development that you see… They’ve hit more than 2 million union work hours on Revolution Wind,” Meil said.
“But what’s also really transformational is that it’s already triggered $1.3 billion in investment through the supply chain. So it’s not just coastal communities that are benefiting from these jobs,” she said.
“This hurts so many people. And why? There’s just no justification.”
This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy, and the environment. Sign up for their newsletter here.
Mexican tetras in pitch-black caverns had no use for the energetically costly organs.
Photographs of Astyanax mexicanus, surface form with eyes (top) and cave form without eyes (bottom). Credit: Daniel Castranova, NICHD/NIH
Time and again, whenever a population was swept into a cave and survived long enough for natural selection to have its way, the eyes disappeared. “But it’s not that everything has been lost in cavefish,” says geneticist Jaya Krishnan of the Oklahoma Medical Research Foundation. “Many enhancements have also happened.”
Though the demise of their eyes continues to fascinate biologists, in recent years, attention has shifted to other intriguing aspects of cavefish biology. It has become increasingly clear that they haven’t just lost sight but also gained many adaptations that help them to thrive in their cave environment, including some that may hold clues to treatments for obesity and diabetes in people.
Casting off expensive eyes
It has long been debated why the eyes were lost. Some biologists used to argue that they just withered away over generations because cave-dwelling animals with faulty eyes experienced no disadvantage. But another explanation is now considered more likely, says evolutionary physiologist Nicolas Rohner of the University of Münster in Germany: “Eyes are very expensive in terms of resources and energy. Most people now agree that there must be some advantage to losing them if you don’t need them.”
Scientists have observed that mutations in different genes involved in eye formation have led to eye loss. In other words, says Krishnan, “different cavefish populations have lost their eyes in different ways.”
Meanwhile, the fishes’ other senses tend to have been enhanced. Studies have found that cave-dwelling fish can detect lower levels of amino acids than surface fish can. They also have more tastebuds and a higher density of sensory cells along their bodies that let them sense water pressure and flow.
Regions of the brain that process other senses are also expanded, says developmental biologist Misty Riddle of the University of Nevada, Reno, who coauthored a 2023 article on Mexican tetra research in the Annual Review of Cell and Developmental Biology. “I think what happened is that you have to, sort of, kill the eye program in order to expand the other areas.”
Killing the processes that support the formation of the eye is quite literally what happens. Just like non-cave-dwelling members of the species, all cavefish embryos start making eyes. But after a few hours, cells in the developing eye start dying, until the entire structure has disappeared. Riddle thinks this apparent inefficiency may be unavoidable. “The early development of the brain and the eye are completely intertwined—they happen together,” she says. That means the least disruptive way for eyelessness to evolve may be to start making an eye and then get rid of it.
In what Krishnan and Rohner have called “one of the most striking experiments performed in the field of vertebrate evolution,” a study published in 2000 showed that the fate of the cavefish eye is heavily influenced by its lens. Scientists showed this by transplanting the lens of a surface fish embryo to a cavefish embryo, and vice versa. When they did this, the eye of the cavefish grew a retina, rod cells, and other important parts, while the eye of the surface fish stayed small and underdeveloped.
Starving and bingeing
It’s easy to see why cavefish would be at a disadvantage if they were to maintain expensive tissues they aren’t using. Since relatively little lives or grows in their caves, the fish are likely surviving on a meager diet of mostly bat feces and organic waste that washes in during the rainy season. Researchers keeping cavefish in labs have discovered that, genetically, the creatures are exquisitely adapted to absorbing and storing nutrients. “They’re constantly hungry, eating as much as they can,” Krishnan says.
Intriguingly, the fish have at least two mutations that are associated with diabetes and obesity in humans. In the cavefish, though, they may be the basis of some traits that are very helpful to a fish that occasionally has a lot of food but often has none. When scientists compare cavefish and surface fish kept in the lab under the same conditions, cavefish fed regular amounts of standard fish food “get fat. They get high blood sugar,” Rohner says. “But remarkably, they do not develop obvious signs of disease.”
Fats can be toxic for tissues, Rohner explains, so they are stored in fat cells. “But when these cells get too big, they can burst, which is why we often see chronic inflammation in humans and other animals that have stored a lot of fat in their tissues.” Yet a 2020 study by Rohner, Krishnan, and their colleagues revealed that even very well-fed cavefish had fewer signs of inflammation in their fat tissues than surface fish did.
Even in their sparse cave conditions, wild cavefish can sometimes get very fat, says Riddle. This is presumably because, whenever food ends up in the cave, the fish eat as much of it as possible, since there may be nothing else for a long time to come. Intriguingly, Riddle says, their fat is usually bright yellow because of high levels of carotenoids, the substance in carrots that your grandmother used to tell you was good for your… eyes.
“The first thing that came to our mind, of course, was that they were accumulating these because they don’t have eyes,” says Riddle. In this species, such ideas can be tested: Scientists can cross surface fish (with eyes) and cavefish (without eyes) and look at what their offspring are like. When that’s done, Riddle says, researchers see no link between eye presence or size and the accumulation of carotenoids. Some eyeless cavefish had fat that was practically white, indicating lower carotenoid levels.
Instead, Riddle thinks these carotenoids may be another adaptation to suppress inflammation, which might be important in the wild, as cavefish are likely overeating whenever food arrives.
Studies by Krishnan, Rohner, and colleagues published in 2020 and 2022 have found other adaptations that seem to help tamp down inflammation. Cavefish cells produce lower levels of certain molecules called cytokines that promote inflammation, as well as lower levels of reactive oxygen species—tissue-damaging byproducts of the body’s metabolism that are often elevated in people with obesity or diabetes.
Krishnan is investigating this further, hoping to understand how the well-fed cavefish remain healthy. Rohner, meanwhile, is increasingly interested in how cavefish survive not just overeating, but long periods of starvation, too.
No waste
On a more fundamental level, researchers still hope to figure out why the Mexican tetra evolved into cave forms while any number of other Mexican river fish that also regularly end up in caves did not. (Globally, there are more than 200 cave-adapted fish species, but species that also still have populations on the surface are quite rare.) “Presumably, there is something about the tetras’ genetic makeup that makes it easier for them to adapt,” says Riddle.
Though cavefish are now well-established lab animals used in research and are easy to purchase for that purpose, preserving them in the wild will be important to safeguard the lessons they still hold for us. “There are hundreds of millions of the surface fish,” says Rohner, but cavefish populations are smaller and more vulnerable to pressures like pollution and people drawing water from caves during droughts.
One of Riddle’s students, David Perez Guerra, is now involved in a committee to support cavefish conservation. And researchers themselves are increasingly careful, too. “The tissues of the fish collected during our lab’s last field trip benefited nine different labs,” Riddle says. “We wasted nothing.”
“Our capsule’s engines are not pointed in the right direction for optimum boost,” said Sarah Walker, SpaceX’s director of Dragon mission management. “So, this trunk module has engines pointed in the right direction to maximize efficiency of propellant usage.”
When NASA says it’s the right time, SpaceX controllers will command the Draco thrusters to ignite and gently accelerate the massive 450-ton complex. All told, the reboost kit can add about 20 mph, or 9 meters per second, to the space station’s already-dizzying speed, according to Walker.
Spetch said that’s roughly equivalent to the total reboost impulse provided by one-and-a-half Russian Progress cargo vehicles. That’s about one-third to one-fourth of the total orbit maintenance the ISS needs in a year.
“The boost kit will help sustain the orbiting lab’s altitude, starting in September, with a series of burns planned periodically throughout the fall of 2025,” Spetch said.
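The quoted figures hang together arithmetically. As a rough sanity check, here is a back-of-the-envelope sketch: the 450-ton station mass and the "20 mph, or 9 meters per second" delta-v come from the article, while the implied annual delta-v is inferred from the stated "one-third to one-fourth" fraction and is illustrative, not an official NASA or SpaceX number.

```python
# Back-of-the-envelope check of the reboost numbers quoted above.
# Station mass (~450 t) and delta-v (~20 mph) are from the article;
# the annual delta-v range is inferred from the "one-third to one-fourth
# of yearly orbit maintenance" claim and is an assumption for illustration.

STATION_MASS_KG = 450_000  # ~450-ton ISS complex


def mph_to_ms(mph: float) -> float:
    """Convert miles per hour to meters per second."""
    return mph * 1609.344 / 3600


delta_v = mph_to_ms(20)               # ~8.94 m/s, matching the quoted ~9 m/s
impulse = STATION_MASS_KG * delta_v   # total impulse in newton-seconds

# If this boost covers one-fourth to one-third of a year's maintenance,
# the station needs roughly 3-4x this delta-v annually.
annual_dv_low, annual_dv_high = 3 * delta_v, 4 * delta_v

print(f"delta-v:  {delta_v:.2f} m/s")
print(f"impulse:  {impulse / 1e6:.2f} MN*s")
print(f"implied annual delta-v: {annual_dv_low:.0f}-{annual_dv_high:.0f} m/s")
```

The conversion confirms that 20 mph is about 8.94 m/s, consistent with the article's rounded 9 m/s, and the implied total impulse on a 450-ton complex is on the order of 4 meganewton-seconds.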
After a few months docked at the ISS, the Dragon cargo capsule will depart and head for a parachute-assisted splashdown in the Pacific Ocean off the coast of California. SpaceX will recover the pressurized capsule to fly again, while the trunk containing the reboost kit will jettison and burn up in the atmosphere.
SpaceX’s Dragon spacecraft approaches the International Space Station for docking at 7:05 am EDT (11:05 UTC) on Monday. Credit: NASA TV/Ars Technica
While this mission is SpaceX’s 33rd cargo flight to the ISS under the auspices of NASA’s multibillion-dollar Commercial Resupply Services contract, it’s also SpaceX’s 50th overall Dragon mission to the outpost. This tally includes 17 flights of the human-rated Crew Dragon.
“With CRS-33, we’ll mark our 50th voyage to ISS,” Walker said. “Just incredible. Together, these missions have (carried) well over 300,000 pounds of cargo and supplies to the orbiting lab and well over 1,000 science and research projects that are not only helping us to understand how to live and work effectively in space… but also directly contributing to critical research that serves our lives here on Earth.”
Future Dragon trunks will be able to accommodate a reboost kit or unpressurized science payloads, depending on NASA’s needs at the space station.
The design of the Dragon reboost kit is a smaller-scale version of what SpaceX will build for a much larger Dragon trunk under an $843 million contract signed with NASA last year for the US Deorbit Vehicle. This souped-up Dragon will dock with the ISS and steer it back into the atmosphere after the lab’s decommissioning in the early 2030s. The deorbit vehicle will have 46 Draco thrusters—16 to control the craft’s orientation and 30 in the trunk to provide the impulse needed to drop the station out of orbit.