Author name: Shannon Garcia

Stoke Space gives us another reason to take it very seriously

Stoke Space announced a significant capital raise on Wednesday, a total of $510 million as part of Series D funding. The new financing doubles the total capital raised by Stoke Space, founded in 2020, to $990 million.

The infusion of money will provide the company with “the runway to complete development” of the Nova rocket and demonstrate its capability through its first flights, said Andy Lapsa, the company’s co-founder and chief executive, in a news release characterizing the new funding.

Stoke is working toward a 2026 launch of the medium-lift Nova rocket. The rocket’s innovative design is intended to be fully reusable from the payload fairing on down, with a regeneratively cooled heat shield on the vehicle’s second stage. In fully reusable mode, Nova will have a payload capacity of 3 metric tons to low-Earth orbit, and up to 7 tons in fully expendable mode.

Another bright fundraising star

There are some striking parallels between Stoke Space’s latest fundraising announcement and the trajectory of another forward-leaning launch company, Relativity Space. The latter was founded in 2016 with the promise of 3D-printing a rocket nearly in its entirety.

In November 2020, Relativity disclosed its own Series D funding, $500 million. At the time, the company had about 230 employees and was planning a launch the following year. Stoke presently has about 280 employees and intends to launch Nova next year.

Instead of lifting off in 2021, however, Relativity’s Terran 1 rocket would not launch for the first time until 2023, and since that time, the company has not flown again. In fact, Relativity nearly filed for bankruptcy last year before it received a large infusion of cash from Eric Schmidt, the former Google executive. Relativity has now largely abandoned additive manufacturing rockets and is focused on the development of a more traditional rocket, the Terran R vehicle.

NEPA, Permitting and Energy Roundup #2

It’s been about a year since the last one of these. Given the long cycle, I have done my best to check for changes, but things may have changed on any given topic by the time you read this.

NEPA is a constant thorn in the side of anyone attempting to do anything.

A certain kind of person responds with: “Good.”

That kind of person does not want humans to do physical things in the world.

  1. They like the world as it is, or as it used to be.

  2. They do not want humans messing with it further.

  3. They often also think humans are bad, and should stop existing entirely.

  4. Or believe humans deserve to suffer or do penance.

  5. Or do not trust people to make good decisions and safeguard what matters.

  6. To them: If humans want to do something to the physical world?

  7. That intention is highly suspicious.

  8. We probably should not let them do that.

This is in sharp contrast with the type of person who:

  1. Cares about the environment.

  2. Who wants good things rather than bad to happen to people.

  3. Who wants the Earth not to boil and the air to be clean and so on.

That person notices that NEPA long ago started doing more harm than good.

The central problem lies in the core structure.

NEPA is based on following endlessly expanding procedural requirements. NEPA does not ask whether costs exceed benefits, or whether something is a good idea.

It only asks about whether procedure was followed sufficiently, or whether blame can be identified somewhere.

The lesson is to never go full NEPA.

Instead, one of Balsa’s central policy goals is an entire reimagining of NEPA.

The proposal is to replace NEPA’s procedural requirement with (when necessary) an analysis of costs and benefits, followed by a vote of stakeholders on whether to proceed. Ask the right question, whether the project is worthwhile, not the wrong question of what paperwork is in order.

This post is not about laying out that procedure. This post is mostly about telling various Tales From the NEPA. It is also telling tales of energy generation from around the world, including places that do not share our full madness.

Versions and components of this post have been in my drafts for a long time, so not all of them will be as recent as is common here.

That was the plan, but the vibes have changed, and NEPA is pretty clearly a large net negative for climate change, which has to win in a fight at this point over the local concerns it protects. There’s a new plan.

Kill it. Repeal NEPA. Full stop.

Emmett Shear: Previously I believed that there was probably enough protection offered by NEPA / CEQA that it offset the damage. At this point, it’s pretty clear we should simply repeal it and figure out if we need to replace anything later.

Repeal NEPA.

Eli Dourado: NEPA is the most harmful law in the United States and must be repealed. In addition to causing forest fires and miring Starbase in litigation, it results in delays and endless litigation for any project that the federal government touches. It should be target #1 for DOGE.

Sadly this is not yet within the Overton window of relevant Congressional staff. We need to make this happen.

The problem is that DOGE works by cutting off payments, which doesn’t let you hit NEPA. But if you want to strike a blow that matters? This is it.

Thomas Hochman: Trump has revoked Carter’s 1977 EO – the one that empowered CEQ to issue binding NEPA regulations.

This could dramatically reshape how federal agencies conduct NEPA reviews. In the post-Marin Audubon landscape, this is a HUGE deal!

Let’s walk through a few of the specifics.

CEQ will formally propose repealing the existing NEPA regulations that have guided agencies since the late 1970s.

This is major: those regs currently supply the standard NEPA procedures (e.g., EIS format, “major federal action,” significance criteria, scoping, etc.).

Rescinding them will leave agencies free to adopt leaner, agency-specific processes—or rely on new guidance.

CEQ will lead a “working group” composed of representatives from various agencies.

This group’s job is to develop or revise each agency’s own NEPA procedures so that they’re consistent with the new (post-rescission) approach.

As I wrote in Green Tape, establishing this internal guidance at the agency level will be crucial.

And finally: general permits and permits-by-rule!!!

Eli Dourado: NEPA is still there but CEQ’s authority to issue regs is gone (and was already under dispute in the courts). NEPA the statute still applies.

Cremieux: Some pretty major components of permitting reform on day one might be the biggest news in the day one EOs.

There’s trillions in value in these EOs.

I am delighted.

You love to see it. This day one move gave me a lot of hope things would go well; alas, other things happened that were less good for my hopes.

As with everything Trump Administration, we will see what actually happens once the lawyers get involved. This is not an area where ‘ignore the law and simply do things’ seems likely to work out. Some of it will stick, but how much?

Yay nuclear deregulation, yes, obviously. Alex Tabarrok opens with ‘yes, I know how that sounds,’ but actually it sounds great if you’re familiar with the current regulations. I do see why one would pause before going all the way to ‘treat small modular reactors like x-ray machines’; I’d want to see the safety case and all that, but probably yes.

Nuclear is attempting a comeback, now that AI demand, and the experience of trying the alternative of doing nothing, has awoken everyone to the idea that more nuclear would be a good thing.

Alexander Kaufman: The Senate voted nearly unanimously (88-2) to pass major legislation designed to reverse the American nuclear industry’s decades-long decline and launch a reactor-building spree to meet surging demand for green electricity at home and to catch up with booming rivals overseas.

The bill slashes the fees the Nuclear Regulatory Commission charges developers, speeds up the process for licensing new reactors and hiring key staff, and directs the agency to work with foreign regulators to open doors for U.S. exports.

The NRC is also tasked with rewriting its mission statement to avoid unnecessarily limiting the “benefits of nuclear energy technology to society,” essentially reinterpreting its raison d’être to include protecting the public against the dangers of not using atomic power in addition to whatever safety threat reactors themselves pose.

There is a lot of big talk about how much this will change the rules on nuclear power regulation. As usual I remain skeptical of big impacts, but it would not take much to reach a tipping point. As the same post notes when discussing the reactors in Georgia, once you relearn what you are doing things get a lot better and cheaper.

That pair of reactors, which just came online last month at the Alvin W. Vogtle Electric Generating Plant in Georgia, cost more than $30 billion. As the expenses mounted, other projects to build the same kind of reactor elsewhere in the country were canceled.

The timing could hardly have been worse. After completing the first reactor, the second one cost far less and came online faster. But the disastrous launch dissuaded any other utilities from investing in a third reactor, which economists say would take even less time and money now that the supply chains, design and workforce are established.

After seeing the results, the secretary of energy called for ‘hundreds’ more large nuclear reactors, two hundred by 2050.

NextEra looking to restart a nuclear plant in Iowa that closed in 2020.

Ontario eyeing a new nuclear plant near Port Hope, 8-10 GWs.

It seems the world is a mix of people who shut down nuclear power out of spite and mood affiliation and intuitions that nuclear is dangerous or harmful when it is orders of magnitude safer and fully green, versus those who realize we should be desperate to build more.

Matthew Yglesias: The all-time energy champ

Matthew Yglesias: The Fukushima incident was deadly not because anyone died in the accident but because the post-Fukushima nuclear shutdown caused more Japanese people to freeze to death to conserve energy.

Dean Ball is excited by the bill, including its prize for next-gen nuclear tech and the potential momentum for future action.

There is a long way to go. It seems we do things like this? And the lifetime for nuclear power plants has nothing to do with their physical capabilities or risks?

Alec Stapp: Apparently we have been arbitrarily limiting licenses for nuclear power reactors to 40 years because of… “antitrust considerations”??

Nuclear Regulatory Commission: The Atomic Energy Act authorizes the Nuclear Regulatory Commission to issue licenses for commercial power reactors to operate for up to 40 years. These licenses can be renewed for an additional 20 years at a time. The period after the initial licensing term is known as the period of extended operation. Economic and antitrust considerations, not limitations of nuclear technology, determined the original 40-year term for reactor licenses. However, because of this selected time period, some systems, structures, and components may have been engineered on the basis of an expected 40-year service life.

Or how many nuclear engineers does it take to change a light bulb? $50k worth.

How much for a $200 panel meter in a control room? Trick question, it’s $20k.

And yet nuclear is still at least close to cost competitive.

The Senate also previously forced Biden to drop his attempt to renominate Jeff Baran to the Nuclear Regulatory Commission (NRC), on the basis of Baran being starkly opposed to the concept of building nuclear power plants.

Why has Biden effectively opposed nuclear power? My model is that it is the same reason he is effectively opposing power transmission and green energy infrastructure. Biden thinks throwing money and rhetoric at problems makes solutions happen. He does not understand, even in his best moments, that throwing up or not removing barriers to doing things stops those things from happening even when that was not your intention.

Thus, he can also do things like offer $1.5 billion in conditional commitments to support recommissioning a Michigan nuclear power plant, because he understands that more nuclear power plants would be a good thing. And he can say things like ‘White House to support new nuclear power plants in the U.S.’ That does not have to cause him to, in general, do the things that cause there to be more nuclear power plants. Because he cannot understand that those are things like ‘appoint people to the NRC that might ever want to approve a new nuclear power plant in practice.’ Luckily, it sounds like the new bill does indeed help.

Small modular nuclear reactor (SMR) planned for Idaho, called most advanced in the nation, was cancelled in January after customers could not be found to buy the electricity. Only a few months later, everyone is scrambling for more electricity to run their data centers. It seems like if you build it, Microsoft or Google or Amazon will be happy to plop a data center next to that shiny new reactor, no? And certainly plenty of other places would welcome one. So odd that this got slated first for Idaho.

Alberta signs deal to jointly assess the development and deployment of SMRs. One SMR is to be built in Ontario by end of 2028, to be online in 2029.

Slovakia to build a new nuclear reactor. Also talk of increased capacity in France, Italy, Britain, Japan, Canada, Poland and The Netherlands in the thread, from May. From December 2023: Poland authorizes 24 new small nuclear plants.

The Philippines is considering nuclear as well.

Support for Nuclear in Australia has increased dramatically to 61%-37%.

Claim that the shutdown of nuclear power in Germany was even more corrupt than we realized, with the Green Party altering expert conclusions to stop a reconsideration. The claims have been denied.

Unfortunately, we are allowing an agreement whereby Korea Hydro & Nuclear Power (KHNP) will not be allowed to bid on new nuclear projects in Western countries, due to an IP issue with Westinghouse, on top of them paying royalties for any Asian projects that move forward. The good news is that if Westinghouse wins the projects, KHNP and KEPCO are prime sub-contractors anyway, so it is unclear this is that much of a functional barrier.

India’s energy mix is rapidly improving.

John Raymond Hanger: Good morning with good news: Solar and wind were 92% of India’s generation additions in 2022. It deployed as much solar in 2022 as the UK has ever built. Coal also was down 78%.

India’s large wind & solar additions are vital climate action. Wonderful!

David Bryan: Confusingly written. Coal in India is at 55%. Wind is at 10% & solar is at 12% – sometimes more, sometimes less.

A Zaugurz: mmmkay “India has an estimated 65.3 GW of proposed, on-grid coal capacity under active development: 30.4 GW under construction and 34.9 GW in pre-construction”

Stocks are different from flows are different from changes in flow.

India was still adding more coal capacity even as of December. But almost all of their new capacity was Solar and Wind, and they are clearly turning the corner on new additions. One still has to then make emissions go down, and then make net emissions drop below zero. One step at a time.

Also, 15% of the installed base is already not bad at all. Renewables are a big deal. A shame nuclear is only 2%.

Khavda in India, now the world’s largest renewable energy park using a combination of solar and wind energy.

Back in America, who is actually building the most solar?

Why, Texas, of course. California talks a good game, but what matters most (aside from sunlight where California has the edge) is not getting in the way.

EIAGov: More than half of the new utility-scale solar capacity scheduled to come online in 2024 is planned for three states: Texas (35%), California (10%) and Florida (6%).

Alec Stapp: Blue states talk a big game on clean energy goals while Texas just goes and builds it.

Texas is building grid-scale solar at a much faster rate than California.

Can’t be due to regulations — must be because CA is a small state with little sunshine 🙃

The numbers mean that, despite being the state with at least the third most sunlight after Arizona and New Mexico, California is bringing online less solar per capita than the nation overall.
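
To make the per-capita comparison concrete, here is a minimal back-of-the-envelope sketch in Python. The new-solar shares are the EIA figures quoted above; the population shares are rough estimates I am assuming for illustration (they are not from this post), so treat the output as directional only.

```python
# Rough per-capita comparison of planned 2024 utility-scale solar additions.
# Solar shares come from the EIA figures quoted above; population shares are
# approximate assumptions for illustration (~9.1% TX, ~11.6% CA, ~6.8% FL).

new_solar_share = {"Texas": 0.35, "California": 0.10, "Florida": 0.06}
population_share = {"Texas": 0.091, "California": 0.116, "Florida": 0.068}

for state, solar in new_solar_share.items():
    ratio = solar / population_share[state]
    print(f"{state}: {ratio:.2f}x the national per-capita rate of new solar")

# A ratio below 1.0 means the state is adding less new utility-scale solar per
# resident than the country as a whole -- the situation described for California.
```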

If you want to install home solar, it is going to be expensive, in the sense that the cost of the panels themselves is now less than 10% of your all-in price.

Patrick Collison: Grid storage to grow 80% in 2024.

This is a great start, but still a drop in the bucket, as I understand it, compared to what we will need if we intend to largely rely on solar and wind in the future.

One enemy of transmission lines and other grid capabilities is NIMBYs who block projects. This includes the projects that never get proposed, in anticipation that they would be blocked or would require time and money to avoid being blocked.

Tyler Cowen reprints an anonymous email he got, which notes that there is also an incentive problem.

When you increase power transmission capacity, you make power fungible between areas. Which is good, unless you are in the power selling business, in which case this could mean more competition and less profit. By sticking to smaller local projects, you can both avoid scrutiny and mostly get the thing actually built, and also avoid competition.

That makes a lot of sense. It suggests we need to look at who is tasked with building new transmission lines, and who should be bearing the costs, including the need to struggle to make the plans and ensure they actually happen.

Why do we produce so little energy in America? Partly because it is so cheap.

Alex Tabarrok: The US has some of the lowest electricity prices in the world. Shown below are industrial retail electricity prices in EU27, USA, UK, China and Japan. Electricity is critical for AI compute, electric cars and more generally reducing carbon footprints. The US needs to build much more electricity infrastructure, by some estimates tripling or quadrupling production. That’s quite possible with deregulation and permitting reform. I am pleased to learn, moreover, that we are starting from a better base than I had imagined.

Amazing how much prices elsewhere have risen lately, and how timid everyone’s response has been.

Harvard was going to do something useful and run a geoengineering experiment. They cancelled it, because of course they did. And their justifications were, well…

James Temple (MIT Technology Review): Proponents of solar geoengineering research argue we should investigate the concept because it may significantly reduce the dangers of climate change. Further research could help scientists better understand the potential benefits, risks and tradeoffs between various approaches. 

But critics argue that even studying the possibility of solar geoengineering eases the societal pressure to cut greenhouse gas emissions.

Maxwell Tabarrok: The moral hazard argument against geoengineering is ridiculous. The central problem of climate change is that firms ignore the cost of carbon emissions.

Since these costs are already ignored, decreasing them will not change their actions, but it will save lives.

It is difficult to grasp how horrible this reasoning actually is. I can’t even. Imagine this principle extended to every other bad thing.

Yes, actually implementing such solutions comes with a lot of costs and dangers. That makes it seem like a good idea to learn now what those are via experiments? Better to find out now than to wait until the crisis gets sufficiently acute that people or nations get desperate?

The alternative hypothesis is that many people who claim to care about the climate crisis are remarkably uninterested in the average temperatures in the world not going up. We have a lot of evidence for this hypothesis.

It goes like this.

Chris Elmendorf: A $650m project would:

– subtract 20 acres from wildlife refuge

– add 35 acres to same refuge

– connect 160 renewable energy projects to grid

Not with NEPA + local enviros standing in the way. Even after “years” of enviro study.

Kevin Stevens: An environmental group successfully blocked the last miles of a nearly complete 102 mile transmission line that would connect 160 renewable sites to the Midwest. Brutal.

I mean it’s completely insane that we would let 20 acres stop this at all, the cost/benefit is so obviously off the charts even purely for the environmental impacts alone. But also they are adding 35 other acres. At some point, you have to wonder why you are negotiating with people who are never willing to take any deal at all.

The answer is, you are forced to ‘negotiate,’ they pretend to do so back, you give them concessions like the above, and then they turn around and keep suing, with each step adding years of delay. The result is known as a ‘doom loop.’

Clean energy projects are the very projects most likely to get stuck in the litigation doom loop. A recent Stanford study found that clean energy projects are disproportionately subject to the strictest level of review. These reviews are also litigated at higher rates — 62% of the projects currently pending the strictest review are clean energy projects. The best emissions modelers show that our emissions reductions goals are not possible without permitting reform.

That is why we’re proposing a time limit on injunctions. Under our proposal, after four years of litigation and review, courts could no longer prevent a project from beginning construction. This solution would pair nicely with the two-year deadlines imposed on agencies to finish review in the Fiscal Responsibility Act. If the courts believe more environmental review is necessary, they could order the government to perform it, but they could no longer paralyze new energy infrastructure construction.

This kills projects, and not the ones you want to kill. I am actually surprised the graph here lists rates that are this low.

If we are not going to do any other modifications, a time limit on court challenges seems like the very least we can do. My preferred solution is to change the structure entirely.

The good news is that some actions are exempt. But the exemptions are illustrative.

Thomas Hochman: Perhaps the funniest categorical exclusion under NEPA is the one that allows the Department of the Interior to make an arrest without filling out an environmental assessment.

Alec Stapp: When everything qualifies as a “major federal action” under NEPA, you get absurd outcomes like this where agencies have to waste time creating categorical exclusions for every little thing.

This is how state capacity withers and dies.

So in practice, what does NEPA look like?

Congestion Pricing in NYC was a case in point before Hochul betrayed us.

It looks like this, seriously, read how the UFT itself made its claims.

United Federation of Teachers: In our lawsuit, we assert that this program, scheduled to go into effect this spring, cannot be put in place without the completion of a thorough environmental impact statement that includes the potential effects of the plan on the city’s air quality.

The current plan would not eliminate air and noise pollution or traffic, but would simply shift that pollution and traffic to the surrounding areas, particularly Staten Island, the Bronx, upper Manhattan and Northern New Jersey, causing greater environmental injustice in our city.

[Copy of lawsuit here.]

Emmett Shear: This NYC teacher’s union is suing to stop congestion pricing using a claim that it will somehow have a negative impact on the environment when fewer people drive into the city. Truly extraordinary.

Joey Politano: “Teachers Union Sues NYC Over Congestion Pricing Proposal’s Lack of Thorough Environmental Review” would almost be too on the nose for an Onion headline about the problems with American transit & environmental policy, and yet here we are.

Alec Stapp: NYC teachers union claims the environmental review for congestion pricing wasn’t thorough enough. Actual photo of the 4,000-page environmental review:

Alec Stapp: Reminder that congestion pricing was passed by the democratically-elected state legislature in 2019. Vetocracy is bad.

That’s right. Reducing the use of cars via congestion pricing has been insufficiently studied, in case it causes air pollution in other areas, and would cause ‘injustice.’ And apparently the 4,000-page review pictured above does not count as taking review seriously; it’s not enough.

It is amazing to me we put up with such nonsense.

Alternatively, it looks similar to this, technically the National Historic Preservation Act:

AP: Tribes, environmental groups ask US court to block $10 billion energy transmission project in Arizona.

Alec Stapp: The biggest clean energy project in the country is being sued by environmental groups.

This outdated version of “environmentalism” needs to die.

It’s time to build, not block.

The project is being sued under the National Historic Preservation Act. The NHPA is possibly the second most abused law in this space (the first being NEPA).

This is the last thing you see before your clean energy project gets sued into oblivion.

Same group sues to block geothermal project [in Nevada].

Here we have a lithium mine and a geothermal project in California, and conservation groups once again are suing.

E&E News: Environmental groups on Thursday sued officials who signed off on a lithium project in the Salton Sea that a top Biden official has helped advance.

Comité Civico del Valle and Earthworks filed the legal complaint in Imperial County Superior Court against county officials who approved conditional permits for Controlled Thermal Resources’ Hell’s Kitchen lithium and geothermal project.

The groups argue that the county’s approval of the direct lithium extraction and geothermal brine project near the southeastern shore of the Salton Sea violates county and state laws, such as the California Environmental Quality Act.

Alec Stapp: Conservation groups suing to stop a lithium and geothermal project in California. Yet another example of conservation groups at direct odds with climate goals. Clean energy deployment requires building stuff in the real world, full stop.

Armand Domalewski: so so so many environmental groups are just climate arsonists

And by rule of three, the kicker:

Thomas Hochman: This is the most classic NEPA story of all time: The US Forest Service wanted to implement a wildfire prevention plan, so it had to fill out an environmental impact statement. Before they could complete the environmental impact statement, though, half the forest burned down.

Scott Lincicome: 10/10. no notes. A little googling here reveals the kicker: the appellant apparently filed the appeal/complaint to protect the forest (a goshawk habitat)… that subsequently burned down bc of her appeal/complaint.

CEQA is like NEPA, only it is by California, and it is even worse.

Dan Federman: It breaks my brain that NIMBYs have succeeded in blocking coastal wind farms that aren’t visible from shore, but yet Santa Barbara somehow has oil rigs visible from its gorgeous beaches 🤯

Max Dubler: You have to understand that California environmental law is chiefly concerned with *preserving the environment that existed in 1972,* not protecting nature. For example, oil companies sued under environmental law to block LA’s ban on oil drilling.

Alex Armlovich: According to CEQA, the California Environment of 1970 Quality Act, removing the oil derricks for renewables would impact the visual & cultural resources of this historic beach drilling site

Years of study & litigation needed to protect our heritage drilling environment 🛢️👨‍🏭⛽

Here is one CEQA issue. This also points out that you can write in all the exemptions you want, and none of that will matter unless those in charge actually use them.

Alec Stapp: Environmental review is now holding up bus shelters by six months. Literally can’t even build the smallest physical infrastructure quickly.

Chris Elmendorf: Why is LA’s transit agency cowering before NIMBYs rather than invoking the new @Scott_Wiener-authored CEQA exemption for transit improvements?

Bus stops certainly would seem to meet SB 922’s definition of “transit prioritization project,” which includes “transit stop access and safety improvement.”

But instead of invoking the exemption, the city prepared a CEQA “negative declaration,” which is the most legally vulnerable kind of CEQA document.

It looks like city’s neg dec was made just months prior to effective date of SB 922. So what? City could have approved an exemption too as soon as SB 922 took effect.

Or city could approve it tomorrow.

Rather than putting bus shelters on hold just b/c a lawsuit was filed.

Halting transit projects just b/c a lawsuit was filed seems especially dumb at the present moment, when Leg has made clear it wants these projects streamlined and elite/journalist opinion has turned against CEQA abuse.

If a court dared to enjoin the project, there’d be uproar & Leg would probably respond by strengthening the transit exemption.

Just look at what the NIMBYs “won” by stopping 500 apartments on a valet parking lot in SF (AB 1633), or student housing in Berkeley (AB 1307).

Is this just a case of bureaucratic risk aversion (@pahlkadot) or autopiloting of dumb processes? Is there an actual problem with SB 922 that makes it unusable for ordinary LA bus stops?

Curious to hear from anyone who knows.

My presumption is it is basically autopiloting, that the people who realize it is dumb do not have the reach to the places where people don’t care. It is all, of course, madness.

The good news is that the recent CEQA ruling says that it should no longer give the ‘fullest possible protection’ to everything, so things should get somewhat better.

I wish this number were slightly higher for effect, but still, seriously:

R Street: 49% of CEQA lawsuits are against environmentally advantageous projects!

Somehow, rather than struggling to improve the situation, many Democrats seem to strive to make the inability to do things even worse.

For example, we have this thread from January detailing the proposed Clean Electricity Transmission Acceleration Act. Here are some highlights of an alternative even worse future, where anyone attempting to do anything is subject to arbitrary hold up for ransom, and also has to compensate any losers of any kind, including social and economic costs, and destroying any limitations on scope of issues. The bill even spends billions to fund these extractive oppositional efforts directly.

Chris Elmendorf: The bill defines “enviro impact” to include not only enviro impacts, but also “aesthetic, historic, cultural, economic, social, or health” effects. (Whereas CEQA is still about “physical environment”–even in the infamous Berkeley case.)

The bill creates utterly open-ended authority for fed. agencies to demand a “community benefit agreement” as price of any permit for which an EIS was prepared. This converts NEPA from procedural statute into grant of substantive reg / exaction authority.

In exercising the “community benefit agreement” authority, what is a federal agency supposed to consider? Consideration #1 is the deepness of the permit-applicant’s pocket. Seriously.

And in case the new, expansive definition of “enviro impact” wasn’t clear enough, the bill adds that CBAs may be imposed to offset any *social or economic* (as well as enviro) impacts of the project.

The bill would also destroy the caselaw that limits scope of enviro review to scope of agency’s regulatory discretion, not only via the CBA provision but also by expressly requiring analysis of effects “not within control of any federal agency.”

And the bill would send a torrent of federal dollars into the coffers of groups who’d exploit NEPA for labor or other side hustles – there’s $3 billion of “community engagement” grants to arm nonprofits & others.

And in case NEPA turned up to 11 isn’t enough, there’s also a new, judicially enforceable mandate for “community impact reports” if a project may affect an “environmental justice community.”

There’s also a wild provision that seems to prevent federal agencies from considering any project alternatives in an EIS unless (a) the alternative would have no adverse impact on any “overburdened community,” or (b) it serves a compelling interest *in that community.*

One more observation: the bill subtly nudges NEPA toward super-statute status by directing conflicts b/t NEPA “and any other provision of law” to be resolved in favor of NEPA.

Or we could have black-clad anarchists storming electric vehicle factories, as happened at Tesla’s plant in Berlin. Although we do have ‘Georgia greens’ suing over approval of an EV plant there.

It turns out everyone basically let this mess happen because Congress wanted to get home for Christmas? No one understood what they were doing?

This seems like it should be publicized more, as part of the justification for killing this requirement outright, and finding a better way to accomplish the same thing. It is amazing how often the worst laws have origin stories like this.

Patrick McKenzie: Sometimes we spend a trillion dollars because not spending a trillion dollars would require an exhausting amount of discussions and it is almost Christmas.

Please accept a trillion dollars as a handwavy gesture in the direction of the impact of NEPA; my true estimate if I gave myself a few hours to think would probably be higher.

I know everyone says that once you pass a regulation it is almost impossible to remove. But what if… we… did it anyway?

It is good that these exclusions are available. It is rather troublesome that they are so necessary?

Nicholas Bagley: A number of federal agencies have categorical exclusions from NEPA for … picnics.

If you need a special exception to make the lawyers comfortable with picnics, maybe you’ve gone too far?

“29. Approval of recreational activities (such as Coast Guard unit picnic) which do not involve significant physical alteration of the environment, increase disturbance by humans of sensitive natural habitats, or disturbance of historic properties, and which do not occur in, or adjacent to, areas inhabited by threatened or endangered species.”

I mean, modest proposal time, perhaps?

If your physical activity:

  1. Does not significantly physically alter the environment.

  2. Does not disturb sensitive natural habitats.

  3. Does not disturb historic properties.

  4. Does not occur in or adjacent to areas inhabited by threatened or endangered species.

Or, actually, how about if your physical activity:

  1. Does not significantly physically alter the environment.

Then why are we not done? What is there we need to know, that this does not imply?

Shouldn’t we be able to declare this in a common sense way, and then get sued in court if it turns out we were lying or wrong, with penalties and costs imposed if someone sues in profoundly silly fashion, such as over a picnic?

The good news: We are getting some new ones.

Alec Stapp: Huge permitting reform news:

The Bureau of Land Management is giving geothermal energy exploration a categorical exclusion from environmental review under NEPA.

If you care about clean energy abundance, this is a massive win.

Arnab Datta: ICYMI – great news, BLM is adopting categorical exclusions to streamline permitting for geothermal exploration.

What’s the upshot? Exploration for geothermal resources should be a little bit easier.

As a result of the FRA (passed last year), agencies can now more easily adopt the categorical exclusions of other agencies. That’s what BLM is doing, adopting the CXs from the Navy and USFS.

Ex: Here’s the Navy CX. Applications to BLM for geophysical surveys will be easier.

Why is this important? BLM (and the federal government writ-large) owns a LOT of land, particularly in the Mountain West where heat resources are strongest, most ripe for geothermal production.

We previously recommended that BLM expand its CXs for geothermal exploration. This is a great first step, but there’s more to do.

Patrick McKenzie: I’ve been doing some work with a geothermal non-profit, and my inexpert understanding is that while first-of-their-kind projects are the immediate blocker, NEPA lawsuits were a major worry with expanding rollout to blue states after proof of concepts get accomplished and tweaked.

The (without loss of generality) Californias of the world are huge energy consumers, cannot simply import electricity from (without loss of generality) Texas (though you can tweak that assumption a tiny bit on margins), and local organized political opposition is a real factor.

If you’re curious as to why geothermal is likely to be a much larger part of U.S. and world energy mixes than you model currently, see this.

Short version: fracking makes it viable in many more places than it is currently.

There is a lot more to do on the exclusion front. It seems like obvious low-hanging fruit to exclude as many green projects as possible. Yes, this suggests the laws are bad and should be replaced entirely, but until then we work with the system we have.

Alec Stapp: Other federal agencies should start thinking about how to use categorical exclusions from NEPA environmental review to make it easier to build in the US.

Here’s some low-hanging fruit:

@HUDgov should update its categorical exclusion to cover office-to-residential conversions.

That seems like it should fall under ‘wait why do we even need an exclusion again?’

And that’s not all.

Alec Stapp: Good news on permitting reform!

The Department of Energy is giving a categorical exclusion from NEPA environmental review to:

– transmission projects that use existing rights of way

– solar projects on disturbed lands

– energy storage projects on disturbed lands

Sam Drolet: This is huge. It’s good to see agencies starting to use categorical exclusions in a sensible way to streamline permitting.

Christian Fong: A lot of great rules coming out right now from the Biden admin, but one that has gone under the radar is on NEPA reforms from the DOE! Specifically, expanding the list of projects that qualify for categorical exclusions, which can speed up NEPA reviews from 2 years to 2 months!

…

For solar, CXes were initially granted only if projects were built in a previously disturbed/developed land and were under 10 acres in size. This rule has removed the acreage limit, so that even projects 1000+ acres in size can still qualify if on previously disturbed lands.

A new CX was established for storage, with similar qualifications around previously disturbed/developed land, as well as the ability for projects to use a small bit of contiguous undisturbed land, as storage may be colocated with existing energy/tx/industry infrastructure.

Given full NEPA EISes can take 2 years, and new tx lines can take 10+ years to build, these rules are particularly important for improving tx capacity through reconductoring, GETs, etc. DOE just released its liftoff report on this topic here.

A new paper suggested a ‘green bargain’ could be struck on permitting reform, one that is a win for everyone. It misunderstands what people are trying to win.

Zachary Liscow: NEW PAPER: “Getting Infrastructure Built: The Law and Economics of Permitting,” on:

– What to consider in design of permitting rules

– The evidence

– A possible “green bargain” that benefits efficiency, the environment, & democracy

Infrastructure is often slowed by permitting rules. One example is NYC congestion pricing, which was passed by the legislature in 2019, had a 4,000-page environmental assessment, and is now subject to 5 lawsuits.

But how can we speed up permitting and make infrastructure less expensive, while still protecting the environment and promoting democratic participation?

Environmental permitting might be part of why infrastructure is so expensive in the US. Urban transit costs about 3x the rich/middle-income country average and 6x some European countries.

At the same time, US environmental outcomes aren’t particularly good. Based on the Yale Center for Environmental Law & Policy’s Environmental Performance Index, the US (at 51, just the 25th percentile) is considerably worse than the OECD average (at 58).

So what to do? I have a framework w/ 2 dimensions. 1: Improve the capacity of the executive to decide – for example, by limiting the power of litigation to delay. 2: Improve the capacity to plan, including by adding broad-based participation. Currently the US is weak along both.

I propose a “green bargain” that strengthens both executive power and capacity, empowering the executive to decide, but coupling that w/ increased capacity to plan, especially in ways that promote broad-based participation.

Can we create a win-win-win for:

  1. Efficiency

  2. Democracy

  3. The Environment?

Yes, most certainly, in a big way. The current system is horribly inefficient in many ways that benefit neither democracy nor the environment; indeed, this problem is frequently harmful to both. If these are the stakeholders, then there are any number of reasonable ‘good governance’ plans one could use.

So what is the deal proposed? As far as I can tell it is this:

  1. Increase executive power over decisions.

  2. Raise the standards required for judicial review and make court challenges harder in various ways – time limits, standing requirements, limits on later new objections, limits on challenges to negotiated agreements, more categorical exclusions.

  3. Limits on court injunctions to stop projects.

  4. Increase executive capacity on all levels of government so they can handle it.

  5. Improve quality and scope of executive reviews and enhance public participation.

Do I support all of these proposals on the margin? Absolutely. Most would be good individually, the rest make sense as part of the package.

Do I think that this should be convincing to a sincere environmentalist, that they should trust that this will lead to good outcomes? Alas, my answer is essentially no, if this was applied universally.

I do think this should be convincing if it is applied exclusively to green energy projects and complementary infrastructure. If the end goal is solar panels or batteries, and one believes there is a climate crisis, then one should have a strong presumption that this should dominate local concerns and that delays and cost overruns kill projects.

Here is the other core problem: Many obstructionists do not want better outcomes.

Or in other words:

If someone’s goal is to accomplish good things that make life better, such as reducing how much carbon is in the atmosphere or ensuring the air and water are clean, and is willing to engage in trade to make the world improve and not boil, but has different priorities and weightings and values than you have?

Then you can and should engage in trade, talk price. We can make a deal.

If someone’s goal is to stop development and efficiency because they believe development and efficiency are bad, either locally or globally? If they think humanity and civilization (or at least your civilization) are bad and want them to suffer and repent? Or consider every downside a sacred value that should veto any action?

If they actively do not want the problem solved because they want to use the problem as leverage to demand other things, and you are not a fan of those other things?

Then you are very much out of luck. There is no deal.

My expectation is that even if your deal is a clear win for people and the environment, in a way they can trust, you are going to get a lot of opposition from environmental groups anyway. Here, I worry that this proposal also does not give them sufficient reason to trust. Half the time the executive will be a Republican.

There is also this issue:

John Arnold: I used to think decarbonization was hard because voters prioritized the goals of the energy system in the following order:

  1. Affordable

  2. Reliable

  3. Secure

  4. Clean

But I missed one. The actual order of prioritization is:

  1. Jobs

  2. Affordable

  3. Reliable

  4. Secure

  5. Clean

That, however infuriating, is something we can work with. There is no inherent conflict between jobs and energy. It trades off with affordable, but we can talk price.

I have so had it with all the ‘yes this saves the Earth but think of the local butterfly species’ arguments, not quite literally this case but yeah, basically.

Alec Stapp: Very funny to me that the framing of this NYT article is sincerely like:

“What’s more important: Saving earth or satisfying the idiosyncratic preferences of a small handful of activists?”

That’s not a close call!

Act fast, this closes July 15: Introducing the Modernizing NEPA Challenge.

In alignment with ongoing efforts at DOT to improve the NEPA process, this Modernizing NEPA Challenge seeks:

  • To encourage project sponsors to publish documents associated with NEPA that increase accessibility and transparency for the public, reviewing agencies, and historically under-represented populations and

  • To incentivize project sponsors to implement collaborative, real-time agency reviews to save time and improve the quality of documents associated with NEPA.

More details at the link. The goal is to get collaborative tools and documents, and interactive documents, that make it easier to navigate the NEPA process.

Thomas Hochman: Almost every pro-NEPA argument can be traced back to two studies: Adelman’s “Permitting Reform’s False Choice” and Ruple’s “Measuring the NEPA Litigation Burden.”

Today on Green Tape, we take a closer look at both studies.

Note that the majority of the pie is green, as in clearly net good for the planet, even if you take the position that fossil fuels are always bad – and I’d argue the opposite, that anything replacing coal on the margin is obviously net good too.

Ruple’s study analyzes 1,499 federal court opinions involving NEPA challenges from 2001-2013. He comes up with two key findings:

  1. Only about 0.22% of NEPA decisions (1 in 450) face legal challenges

  2. Less than 1% of NEPA reviews are environmental impact statements (EISs), and about 5% of NEPA reviews are environmental assessments (EAs).

But in “Measuring the NEPA Litigation Burden,” Ruple makes the same error that he’s made throughout his work on permitting: he takes the average volume of litigation across all NEPA reviews and makes a conclusion about NEPA’s impact on infrastructure in particular. In other words, his denominator is wildly inflated.

Ruple’s dataset includes NEPA reviews at every level of stringency: categorical exclusions (CatExes), EAs, and EISs. And as Ruple himself points out, around 95% of NEPA reviews are CatExes. This is because NEPA is triggered by almost every federal action, and thus CatExes are required for everything from federal hiring to, yes, picnics.
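
To see why the inflated denominator matters, here is a minimal sketch of the arithmetic in Python, using the figures quoted above. The assumption that litigation almost never targets categorical exclusions is mine, added for illustration rather than taken from either study.

```python
# Illustrative arithmetic for the "inflated denominator" critique of the 0.22% figure.
# The 0.22% litigation rate and the ~95% CatEx / ~5% EA / <1% EIS split are the numbers
# quoted above; assuming nearly all challenges target EAs and EISs is my simplification.

total_reviews = 100_000                    # arbitrary scale; only the ratios matter
litigated = 0.0022 * total_reviews         # "about 1 in 450" across all NEPA decisions

ea_share, eis_share = 0.05, 0.01           # approximate shares of EAs and EISs
ea_eis_reviews = (ea_share + eis_share) * total_reviews

# If essentially all of those challenges land on EAs/EISs, the litigation rate among
# the reviews that actually burden infrastructure projects is far higher than 0.22%.
rate_among_ea_eis = litigated / ea_eis_reviews
print(f"Litigation rate among EAs/EISs: {rate_among_ea_eis:.1%}")  # roughly 4%
```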

Their findings are remarkable: solar, pipeline, wind, and transmission projects saw litigation rates of 64%, 50%, 38%, and 31% respectively. The cancellation rates for each of these project types were also extraordinarily high, ranging from 12% to 32%.

Barring a rebuttal I do not expect, that seems definitive to me.

What about the other study?

The basic flaw in Adelman’s analysis is that he sees the low percentage of renewable projects that undergo NEPA as evidence that NEPA isn’t a big deal. In reality, the exact opposite is true.

As in, NEPA is so obnoxious that where there would be NEPA issues, the projects never even get proposed. We only get renewable projects, mostly, where they have sufficient protections from this. Again, this seems definitive to me.

My grand solution to NEPA would be to repeal the paperwork and impact statement requirements, and replace them with a requirement for cost-benefit analysis. That is a complex proposal that I am confident would work if done properly, but which I agree is tricky.

The grander, simpler solution is repeal NEPA first and ask questions later. At this point, I think that’s the play.

A solution in between those two would perhaps be to change the remedy for failure, so that any little lapse does not stop an entire project.

This is another approach to the fundamental problem of sacred values versus cost-benefit.

Right now, we are essentially saying that a wide variety of potential harms are sacred values, that we would not compromise at any price, such that if there is any danger that they might be compromised then that is a full prohibition.

But of course that is crazy. With notably rare exceptions, that is not how most anything should ever work.

Thus, an alternative solution is to keep all the requirements in place, and allow all the lawsuits to proceed.

But we change the remedy from injunctions to damages.

As in, suppose a group sues you, and says that your project might violate some statute or do harm in some way. Okay, fine. They file that claim, it is now established. You can choose to wait until the claim is resolved, if the claim actually is big enough and plausible enough that you are worried they might win.

Or, you can convince an insurance company to post a bond for you, covering the potential damages (and let’s say you can get dinged for double or triple the actual harms, more if you ‘did it on purpose’ and knew you were breaking the rules, in some sense, or something). So you can choose to do the project anyway, without a delay, and if it turns out you messed up or broke the rules and the bill comes due, then you have to pay that bill. And since it is a multiplier, everyone is still ahead.
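
As a toy illustration of the decision this remedy would put in front of a developer, here is a minimal sketch in Python. Every number in it is hypothetical, chosen only to show the shape of the calculation, not taken from this post.

```python
# Toy model of the "damages plus multiplier instead of injunction" remedy sketched above.
# All inputs are hypothetical placeholders for illustration.

cost_of_delay_per_year = 50e6     # assumed cost of waiting out litigation, per year
expected_years_of_delay = 4       # assumed length of the litigation doom loop
prob_plaintiffs_win = 0.2         # assumed chance the challenge succeeds on the merits
actual_harm_if_wrong = 30e6       # assumed harm if the project really broke the rules
damages_multiplier = 3            # "dinged for double or triple the actual harms"

cost_of_waiting = cost_of_delay_per_year * expected_years_of_delay
expected_bonded_damages = prob_plaintiffs_win * actual_harm_if_wrong * damages_multiplier

# Under the proposal, the developer posts a bond and builds anyway whenever the expected
# (multiplied) damages are smaller than the cost of sitting out the injunction.
proceed = expected_bonded_damages < cost_of_waiting
print(f"Expected bonded damages: ${expected_bonded_damages/1e6:.0f}M, "
      f"cost of waiting: ${cost_of_waiting/1e6:.0f}M, proceed anyway: {proceed}")
```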

Synology caves, walks back some drive restrictions on upcoming NAS models

If you were considering the purchase of a Synology NAS but were leery of the unreasonably high cost of populating it with special Synology-branded hard disk drives, you can breathe a little easier today. In a press release dated October 8, Synology noted that with the release of its latest Disk Station Manager (DSM) update, some of its 2025 model-year products—specifically, the Plus, Value, and J-series DiskStation NAS devices—would “support the installation and storage pool creation of non-validated third-party drives.”

This unexpected move comes just a few months after Synology aggressively expanded its “verified drive” policy down-market to the entire Plus line of DiskStations. Prior to today, the network-attached storage vendor had shown no signs of swerving from the decision, painting it as a pro-consumer move intended to enhance reliability. “Extensive internal testing has shown that drives that follow a rigorous validation process when paired with Synology systems are at less risk of drive failure and ongoing compatibility issues,” Synology previously claimed in an email to Ars.

What is a “verified” or “validated” drive?

Synology first released its own brand of hard disk drives back in 2021 and began requiring their use in a small but soon-to-increase number of its higher-end NAS products. Although the drives were rebadged offerings from other manufacturers—there are very few hard disk drive OEMs, and Synology isn’t one of them—the company claimed that its branded disks underwent significant additional validation and testing that, when coupled with customized firmware, yielded reliability and performance improvements over off-the-shelf components.

However, those drives came with what was in some cases a substantial price increase over commodity hardware. Although I couldn’t find an actual published MSRP list, some spot checking on several web stores shows that the Synology HAT5310 enterprise SATA drive (a drive with the same warranty and expected service life as a Seagate Exos or Western Digital Gold) is available in 8TB at $299, 12TB at $493, and 20TB at an eye-watering $605. (For comparison, identically sized Seagate Exos disks are $220 at 8TB, $345 at 12TB, and $399 at 20TB.) Other Synology drive models tell similar pricing stories.

Actually, we are going to tell you the odds of recovering New Glenn’s second launch

The only comparison available is SpaceX, with its Falcon 9 rocket. The company made its first attempt at a powered descent of the Falcon 9 into the ocean during its sixth launch in September 2013. On the vehicle’s ninth flight, it successfully made a controlled ocean landing. SpaceX made its first drone ship landing attempt in January 2015, a failure. Finally, on the vehicle’s 20th launch, SpaceX successfully put the Falcon 9 down on land, with the first successful drone ship landing following on the 23rd flight in April 2016.

SpaceX did not attempt to land every one of these 23 flights, but the company certainly experienced a number of failures as it worked to safely bring back an orbital rocket onto a small platform out at sea. Blue Origin’s engineers, some of whom worked at SpaceX at the time, have the benefit of those learnings. But it is still a very, very difficult thing to do on the second flight of a new rocket. The odds aren’t 3,720-to-1, but they’re probably not 75 percent, either.

Reuse a must for the bottom line

Nevertheless, for the New Glenn program to break even financially and eventually turn a profit, it must demonstrate reuse fairly quickly. According to multiple sources, the New Glenn first stage costs in excess of $100 million to manufacture. It is a rather exquisite piece of hardware, with many costs baked into the vehicle to make it rapidly reusable. But those benefits only come after a rocket is landed in good condition.

On its nominal plan, Blue Origin plans to refurbish the “Never Tell Me The Odds” booster for the New Glenn program’s third flight, a highly anticipated launch of the Mark 1 lunar lander. Such a refurbishment—again, on a nominal timeline—could be accomplished within 90 days. That seems unlikely, though. SpaceX did not reuse the first Falcon 9 booster it landed, and the first booster to re-fly required 356 days of analysis and refurbishment.

Nevertheless, we’re not supposed to talk about the odds with this mission. So instead, we’ll just note that the hustle and ambition from Blue Origin is a welcome addition to the space industry, which benefits from both.

Ars Live: Is the AI bubble about to pop? A live chat with Ed Zitron.

As generative AI has taken off since ChatGPT’s debut, inspiring hundreds of billions of dollars in investments and infrastructure developments, the top question on many people’s minds has been: Is generative AI a bubble, and if so, when will it pop?

To help us potentially answer that question, I’ll be hosting a live conversation with prominent AI critic Ed Zitron on October 7 at 3:30 pm ET as part of the Ars Live series. As Ars Technica’s senior AI reporter, I’ve been tracking both the explosive growth of this industry and the mounting skepticism about its sustainability.

You can watch the discussion live on YouTube when the time comes.

Zitron is the host of the Better Offline podcast and CEO of EZPR, a media relations company. He writes the newsletter Where’s Your Ed At, where he frequently dissects OpenAI’s finances and questions the actual utility of current AI products. His recent posts have examined whether companies are losing money on AI investments, the economics of GPU rentals, OpenAI’s trillion-dollar funding needs, and what he calls “The Subprime AI Crisis.”

During our conversation, we’ll dig into whether the current AI investment frenzy matches the actual business value being created, what happens when companies realize their AI spending isn’t generating returns, and whether we’re seeing signs of a peak in the current AI hype cycle. We’ll also discuss what it’s like to be a prominent and sometimes controversial AI critic amid the drumbeat of AI mania in the tech industry.

While Ed and I don’t see eye to eye on everything, his sharp criticism of the AI industry’s excesses should make for an engaging discussion about one of tech’s most consequential questions right now.

Please join us for what should be a lively conversation about the sustainability of the current AI boom.


Ars Live: Is the AI bubble about to pop? A live chat with Ed Zitron. Read More »

scientists-revive-old-bulgarian-recipe-to-make-yogurt-with-ants

Scientists revive old Bulgarian recipe to make yogurt with ants

Fermenting milk to make yogurt, cheeses, or kefir is an ancient practice, and different cultures have their own traditional methods, often preserved in oral histories. The forests of Bulgaria and Turkey have an abundance of red wood ants, for instance, so a time-honored Bulgarian yogurt-making practice involves dropping a few live ants (or crushed-up ant eggs) into the milk to jump-start fermentation. Scientists have now figured out why the ants are so effective in making edible yogurt, according to a paper published in the journal iScience. The authors even collaborated with chefs to create modern recipes using ant yogurt.

“Today’s yogurts are typically made with just two bacterial strains,” said co-author Leonie Jahn from the Technical University of Denmark. “If you look at traditional yogurt, you have much bigger biodiversity, varying based on location, households, and season. That brings more flavors, textures, and personality.”

If you want to study traditional culinary methods, it helps to go where those traditions emerged, since the locals likely still retain memories and oral histories of said culinary methods—in this case, Nova Mahala, Bulgaria, where co-author Sevgi Mutlu Sirakova’s family still lives. To recreate the region’s ant yogurt, the team followed instructions from Sirakova’s uncle. They used fresh raw cow milk, warmed until scalding, “such that it could ‘bite your pinkie finger,'” per the authors. Four live red wood ants were then collected from a local colony and added to the milk.

The authors secured the milk with cheesecloth and wrapped the glass container in fabric for insulation before burying it inside the ant colony, covering the container completely with the mound material. “The nest itself is known to produce heat and thus act as an incubator for yogurt fermentation,” they wrote. They retrieved the container 26 hours later to taste it and check the pH, stirring it to observe the coagulation. The milk had definitely begun to thicken and sour, producing the early stage of yogurt. Tasters described it as “slightly tangy, herbaceous,” with notes of “grass-fed fat.”

Scientists revive old Bulgarian recipe to make yogurt with ants Read More »

rocket-report:-alpha-explodes-on-test-stand;-europe-wants-a-mini-starship

Rocket Report: Alpha explodes on test stand; Europe wants a mini Starship


“We are trying to find a partner that is willing to invest.”

An Electron rocket launches a Synspective satellite in 2022. Credit: Rocket Lab

Welcome to Edition 8.13 of the Rocket Report! It’s difficult for me to believe, but we have now entered the fourth quarter of the year. Accordingly, there are three months left in 2025, with a lot of launch action still to come. The remainder of the year will be headlined by Blue Origin’s New Glenn rocket making its second flight (and landing attempt), and SpaceX’s Starship making its final test flight of the year. There is also the slim possibility that Rocket Lab’s Neutron vehicle will make its debut this year, but it will almost certainly slip into 2026.

As always, we welcome reader submissions, and if you don’t want to miss an issue, please subscribe using the box below (the form will not appear on AMP-enabled versions of the site). Each report will include information on small-, medium-, and heavy-lift rockets as well as a quick look ahead at the next three launches on the calendar.

An Alpha rocket blows up on the test stand. The booster stage for Firefly Aerospace’s next Alpha rocket was destroyed Monday in a fiery accident on the company’s vertical test stand in Central Texas, Ars reports. Firefly released a statement confirming the rocket “experienced an event that resulted in a loss of the stage.” The company confirmed all personnel were safe and said ground teams followed “proper safety protocols.” Imagery posted on social media platforms showed a fireball engulfing the test stand and a column of black smoke rising into the sky over Firefly’s facility roughly 40 miles north of Austin.

Supposed to be a return-to-flight mission … Engineers were testing the rocket before shipment to Vandenberg Space Force Base, California, to prepare for launch later this year with a small commercial satellite for Lockheed Martin. The booster destroyed Monday was slated to fly on the seventh launch of Firefly’s Alpha rocket, an expendable, two-stage launch vehicle capable of placing a payload of a little over 2,200 pounds, or a metric ton, into low-Earth orbit. This upcoming launch was supposed to be the Alpha rocket’s return to flight after an in-flight failure in April, when the upper stage’s engine shut down before the rocket could reach orbit and deploy its satellite payload.

Europe wants a mini Starship. The European Space Agency signed a contract Monday with Avio, the Italian company behind the small Vega rocket, to begin designing a reusable upper stage capable of flying into orbit, returning to Earth, and launching again. The deal is worth 40 million euros ($47 million), Ars reports. In a statement, Avio said it will “define the requirements, system design, and enabling technologies needed to develop a demonstrator capable of safely returning to Earth and being reused in future missions.”

Don’t expect progress too quickly … At the end of the two-year contract, Avio will deliver a preliminary design for the reusable upper stage and the ground infrastructure needed to make it a reality. The preliminary design review is a milestone in the early phases of an aerospace project, typically occurring many years before completion. For example, Europe’s flagship Ariane 6 rocket passed its preliminary design review in 2016, eight years before its first launch. Avio and ESA did not release any specifications on the size or performance of the launcher.


Rocket Lab scores 10 more Electron launches. Synspective, a Japanese company developing a constellation of radar imaging satellites, has signed a deal with Rocket Lab for an additional 10 Electron launches, Space News reports. The companies announced the agreement on Tuesday at the International Astronautical Congress, confirming that each launch would carry a single StriX radar imaging satellite.

A repeat customer … Synspective signed a separate contract in June 2024 for 10 Electron launches, scheduled for 2025 through 2027. That was the largest single contract for Electron to date. Rocket Lab notes that Synspective is its largest Electron customer, with six launches completed to date and a backlog of 21 launches through the end of the decade. Synspective aims to place 30 synthetic aperture radar imaging satellites in orbit by 2030. This contract ensures that Electron will continue flying for quite a while.

German investment could benefit small launchers. During his address at Germany’s third annual Space Congress, Defense Minister Boris Pistorius announced that Germany would invest 35 billion euros ($41 billion) in space-related defense projects by 2030, European Spaceflight reports. “The conflicts of the future will no longer be limited to the Earth’s surface or the deep sea,” he said. “They will also be fought openly in orbit. That’s why we are building structures within the Bundeswehr to enable us to effectively defend and deter [threats] in space in the medium and long term.”

Launch an investment area … The investment will cover five main priorities: hardening against data disruptions and attacks; improved space situational awareness; redundancy through several networked satellite constellations; secure, diverse, and on-demand launch capabilities; and a dedicated military satellite operations center. Although Germany’s heavy-lift needs will continue to be met by Ariane 6, a program to which the country contributes heavily, domestic small-launch providers such as Rocket Factory Augsburg, Isar Aerospace, and HyImpulse are likely to see a boost in support.

Blue Origin seeks to expand New Shepard program. Blue Origin is developing three new suborbital New Shepard launch systems and is considering expanding flight services beyond West Texas, Aviation Week reports. The current two-ship fleet will be retired by the end of 2027, with the first of three new spacecraft expected to debut next year, Senior Vice President Phil Joyce said during the Global Spaceport Alliance forum.

Looking for an overseas partner … Joyce said the new vehicles feature upgraded systems throughout, particularly in the propulsion system. The new ships are designed for quicker turnaround, which will enable Blue Origin to offer weekly flights. The company’s West Texas spaceport can accommodate three New Shepard vehicles, though Blue Origin is interested in possibly offering the suborbital flight service from another location, including outside the US, Joyce said. “We are trying to find a partner that is willing to invest,” he added. (submitted by Chuckgineer)

Next Nuri launch set for November. The Korea AeroSpace Administration completed a review of preparations for the next launch of the Nuri rocket and announced that the vehicle was ready for a window that would open on November 28. The main payload will be a satellite to observe Earth’s aurora and magnetic field, along with a smaller secondary payload.

Coming back after a while … The liquid-fueled Nuri rocket is the first booster to be entirely developed within Korea, and has a lift capacity of 3.3 metric tons to low-Earth orbit. The rocket failed on its debut launch in October 2021, but flew successfully in 2022 and 2023. If the rocket launches in November, it will be Nuri’s first mission in two and a half years. (submitted by CP)

Galactic Energy scores big fundraising round. Beijing-based Galactic Energy has raised what appears to be China’s largest disclosed round for a launch startup as it nears orbital test flights of new rockets, Space News reports. The company announced Series D financing of 2.4 billion yuan ($336 million) in a statement on Sunday. The funding will be used for the Pallas series of reusable liquid propellant launchers and the Ceres-2 solid rocket, both of which appear close to test launches. The investment will also go toward related production, testing, and launch facilities.

Big funding, big ambitions … Founded in February 2018, Galactic Energy has established a strong record of reliability with its light-lift Ceres-1 solid rocket, and it previously raised $154 million in C-round funding in late 2023 for its Pallas-1 plans. Pallas-1, a kerosene-liquid oxygen rocket, is designed to carry 7 metric tons of payload to a 200-km low-Earth orbit. New plans for Pallas-2 envision a capability of 20,000 to 58,000 kg, depending on whether it flies in a single-stick or tri-core configuration, with an aggressive target of a debut launch in 2026.

Blue Origin seeks to reuse next New Glenn booster. There’s a good bit riding on the second launch of Blue Origin’s New Glenn rocket, Ars reports. Most directly, the fate of a NASA science mission to study Mars’ upper atmosphere hinges on a successful launch. The second flight of Blue Origin’s heavy-lifter will send two NASA-funded satellites toward the red planet to study the processes that drove Mars’ evolution from a warmer, wetter world to the cold, dry planet of today. But there’s more on the line. If Blue Origin plans to launch its first robotic Moon lander early next year—as currently envisioned—the company needs to recover the New Glenn rocket’s first stage booster.

Managing prop … Crews will again dispatch Blue Origin’s landing platform into the Atlantic Ocean, just as they did for the first New Glenn flight in January. The debut launch of New Glenn successfully reached orbit, a difficult feat for the inaugural flight of any rocket. But the booster fell into the Atlantic Ocean after three of the rocket’s engines failed to reignite to slow down for landing. Engineers identified seven changes to resolve the problem, focusing on what Blue Origin calls “propellant management and engine bleed control improvements.” Company officials expressed confidence this week the booster will be recovered.

SpaceX nearing next Starship test flight. With the next Starship launch, scheduled for no earlier than October 13, SpaceX officials hope to show they can repeat the successes of the 10th test flight of the vehicle in late August, Ars reports. On its surface, the flight plan for SpaceX’s next Starship flight looks a lot like the last one. The rocket’s Super Heavy booster will again splash down in the Gulf of Mexico just offshore from SpaceX’s launch site in South Texas. And Starship, the rocket’s upper stage, will fly on a suborbital arc before reentering the atmosphere over the Indian Ocean for a water landing northwest of Australia.

Preparing for a future ship catch … There are, however, some changes to SpaceX’s flight plan for the next Starship. Most of these changes will occur during the ship’s reentry, when the vehicle’s heat shield is exposed to temperatures of up to 2,600° Fahrenheit (1,430° Celsius). These include new tests of ceramic thermal protection tiles to “intentionally stress-test vulnerable areas across the vehicle.” Another new test objective for the upcoming Starship flight will be a “dynamic banking maneuver” during the final phase of the trajectory “to mimic the path a ship will take on future flights returning to Starbase,” SpaceX said. This will help engineers test Starship’s subsonic guidance algorithms.

Senators seek to halt space shuttle move. A former NASA astronaut turned US senator has joined with other lawmakers to insist that his two rides to space remain on display in the Smithsonian, Ars reports. Sen. Mark Kelly (D-Ariz.) has joined fellow Democratic Senators Mark Warner and Tim Kaine, both of Virginia, and Dick Durbin of Illinois in an effort to halt the move of space shuttle Discovery to Houston, as enacted into law earlier this year. In a letter sent to the leadership of the Senate Committee on Appropriations, Kelly and his three colleagues cautioned that any effort to transfer the winged orbiter would “waste taxpayer dollars, risk permanent damage to the shuttle, and mean fewer visitors would be able to visit it.”

Seeking to block Cruz control … In the letter, the senators asked that Committee Chair Susan Collins (R-Maine) and Vice Chair Patty Murray (D-Wash.) block funding for Discovery‘s relocation in both the fiscal year 2026 Interior-Environment appropriations bill and FY26 Commerce, Justice, Science appropriations bill. The letter is the latest response to a campaign begun by Sens. John Cornyn and Ted Cruz, both Republicans from Texas, to remove Discovery from its 13-year home at the National Air and Space Museum’s Steven F. Udvar-Hazy Center in Chantilly, Virginia, and put it on display at Space Center Houston, the visitor center for NASA’s Johnson Space Center in Texas.

Next three launches

October 3: Falcon 9 | Starlink 11-39 | Vandenberg Space Force Base, Calif. | 13:00 UTC

October 6: Falcon 9 | Starlink 10-59 | Cape Canaveral Space Force Station, Fla. | 04:32 UTC

October 8: Falcon 9 | Starlink 11-17 | Vandenberg Space Force Base, Calif. | 01:00 UTC

Photo of Eric Berger

Eric Berger is the senior space editor at Ars Technica, covering everything from astronomy to private space to NASA policy, and author of two books: Liftoff, about the rise of SpaceX; and Reentry, on the development of the Falcon 9 rocket and Dragon. A certified meteorologist, Eric lives in Houston.

Rocket Report: Alpha explodes on test stand; Europe wants a mini Starship Read More »

trump-offers-universities-a-choice:-comply-for-preferential-funding

Trump offers universities a choice: Comply for preferential funding

On Wednesday, The Wall Street Journal reported that the Trump administration had offered nine schools a deal: manage your universities in a way that aligns with administration priorities and get “substantial and meaningful federal grants,” along with other benefits. Failure to accept the bargain would result in a withdrawal of federal programs that would likely cripple most universities. The offer, sent to a mixture of state and private universities, would see the government dictate everything from hiring and admissions standards to grading, and it includes provisions that appear intended to make conservative ideas more welcome on campus.

The document was sent to the University of Arizona, Brown University, Dartmouth College, Massachusetts Institute of Technology, the University of Pennsylvania, the University of Southern California, the University of Texas, Vanderbilt University, and the University of Virginia. However, independent reporting indicates that the administration will ultimately extend the deal to all colleges and universities.

Ars has obtained a copy of the proposed “Compact for Academic Excellence in Higher Education,” which makes the scope of the bargain clear in its introduction. “Institutions of higher education are free to develop models and values other than those below, if the institution elects to forego federal benefits,” it suggests, while mentioning that those benefits include access to fundamental needs, like student loans, federal contracts, research funding, tax benefits, and immigration visas for students and faculty.

It is difficult to imagine how it would be possible to run a major university without access to those programs, making this less a compact and more of an ultimatum.

Poorly thought through

The Compact itself would see universities agree to cede admissions standards to the federal government. The government, in this case, is demanding only the use of “objective” criteria such as GPA and standardized test scores as the basis of admissions decisions, and that schools publish those criteria on their websites. They would also have to publish anonymized data comparing how admitted and rejected students did relative to these criteria.

Trump offers universities a choice: Comply for preferential funding Read More »

in-their-own-words:-the-artemis-ii-crew-on-the-frenetic-first-hours-of-their-flight

In their own words: The Artemis II crew on the frenetic first hours of their flight

No one will be able to sleep when the launch window opens, however.

Wiseman: About seven seconds prior to liftoff, the four main engines light, and they come up to full power. And then the solids light, and that’s when you’re going. What’s crazy to me is that it’s six and a half seconds into flight before the solids clear the top of the tower. Five million pounds of machinery going straight uphill. Six and a half seconds to clear the tower. As a human, I can’t wait to feel that force.

A little more than two minutes into flight, the powerful side-mounted boosters will separate. They will have done the vast majority of lifting to that point, with the rocket already reaching a velocity of 3,100 mph (5,000 kph) and an altitude of 30 miles (48 km), well on its way to space. As mission specialists, Koch and Hansen will largely be along for the ride. Wiseman, the commander, and Glover, the pilot, will be tracking the launch, although the rocket’s flight will be fully automated unless something goes wrong.

Wiseman: Victor and I, we have a lot of work. We have a lot of systems to monitor. Hopefully, everything goes great, and if it doesn’t, we’re very well-trained on what to do next.

After 8 minutes and 3 seconds, the rocket’s core stage will shut down, and the upper stage and Orion spacecraft will separate about 10 seconds later. They will be in space, with about 40 minutes to prepare for their next major maneuver.

In orbit

Koch: The wildest thing in this mission is that literally, right after main-engine cutoff, the first thing Jeremy and I do is get up and start working. I don’t know of a single other mission, certainly not in my memory, where that has been the case in terms of physical movement in the vehicle, setting things up.

Koch, Wiseman, and Glover have all flown to space before, either on a SpaceX Dragon or Russian Soyuz vehicle, and spent several months on the International Space Station. So they know how their bodies will react to weightlessness. Nearly half of all astronauts experience “space adaptation syndrome” during their first flight to orbit, and there is really no way to predict who it will afflict beforehand. This is a real concern for Hansen, a first-time flier, who is expected to hop out of his seat and start working.

Canadian astronaut Jeremy Hansen is a first-time flier on Artemis II. Credit: NASA

Hansen: I’m definitely worried about that, just from a space motion sickness point of view. So I’ll just be really intentional. I won’t move my head around a lot. Obviously, I’m gonna have to get up and move. And I’ll just be very intentional in those first few hours while I’m moving around. And the other thing that I’ll do—it’s very different from Space Station—is I just have everything memorized, so I don’t have to read the procedure on those first few things. So I’m not constantly going down to the [tablet] and reading, and then up. And I’ll just try to minimize what I do.

Koch and Hansen will set up and test essential life support systems on the spacecraft because if the bathroom does not work, they’re not going to the Moon.

Hansen: We kind of split the vehicle by side. So Christina is on the side of the toilet. She’s taking care of all that stuff. I’m on the side of the water dispenser, which is something they want to know: Can we dispense water? It’s not a very complicated system. We just got to get up, get the stuff out of storage, hook it up. I’ll have some camera equipment that I’ll pull out of there. I’ve got the masks we use if we have a fire and we’re trying to purge the smoke. I’ve got to get those set up and make sure they’re good to go. So it’s just little jobs, little odds and ends.

Unlike on a conventional rocket mission, the Artemis II vehicle’s upper stage, known as the Interim Cryogenic Propulsion Stage, will not fire right away. Rather, after separating from the core stage, Orion will be in an elliptical orbit that will take it out to an apogee of 1,200 nautical miles, nearly five times higher than the International Space Station. There, the crew will be farther from Earth than anyone since the Apollo program.

In their own words: The Artemis II crew on the frenetic first hours of their flight Read More »

california’s-newly-signed-ai-law-just-gave-big-tech-exactly-what-it-wanted

California’s newly signed AI law just gave Big Tech exactly what it wanted

On Monday, California Governor Gavin Newsom signed the Transparency in Frontier Artificial Intelligence Act into law, requiring AI companies to disclose their safety practices while stopping short of mandating actual safety testing. The law requires companies with annual revenues of at least $500 million to publish safety protocols on their websites and report incidents to state authorities, but it lacks the stronger enforcement teeth of the bill Newsom vetoed last year after tech companies lobbied heavily against it.

The legislation, S.B. 53, replaces Senator Scott Wiener’s previous attempt at AI regulation, known as S.B. 1047, which would have required safety testing and “kill switches” for AI systems. Instead, the new law asks companies to describe how they incorporate “national standards, international standards, and industry-consensus best practices” into their AI development, without specifying what those standards are or requiring independent verification.

“California has proven that we can establish regulations to protect our communities while also ensuring that the growing AI industry continues to thrive,” Newsom said in a statement, though the law’s actual protective measures remain largely voluntary beyond basic reporting requirements.

According to the California state government, the state houses 32 of the world’s top 50 AI companies, and more than half of global venture capital funding for AI and machine learning startups went to Bay Area companies last year. So while the recently signed bill is state-level legislation, what happens in California AI regulation will have a much wider impact, both by legislative precedent and by affecting companies that craft AI systems used around the world.

Transparency instead of testing

Where the vetoed S.B. 1047 would have mandated safety testing and kill switches for AI systems, the new law focuses on disclosure. Companies must report what the state calls “potential critical safety incidents” to California’s Office of Emergency Services and provide whistleblower protections for employees who raise safety concerns. The law defines catastrophic risk narrowly as incidents potentially causing 50+ deaths or $1 billion in damage through weapons assistance, autonomous criminal acts, or loss of control. The attorney general can levy civil penalties of up to $1 million per violation for noncompliance with these reporting requirements.

California’s newly signed AI law just gave Big Tech exactly what it wanted Read More »

is-the-“million-year-old”-skull-from-china-a-denisovan-or-something-else?

Is the “million-year-old” skull from China a Denisovan or something else?


Homo longi by any other name

Now that we know what Denisovans looked like, they’re turning up everywhere.

This digital reconstruction makes Yunxian 2 look less like a Homo erectus and more like a Denisovan (or Homo longi, according to the authors). Credit: Feng et al. 2025

A fossil skull from China that made headlines last week may or may not be a million years old, but it’s probably closely related to Denisovans.

The fossil skull, dubbed Yunxian 2, is one of three unearthed from a terrace alongside the Han River, in central China, in a layer of river sediment somewhere between 600,000 and 1 million years old. Archaeologists originally identified them as Homo erectus, but Hanjiang Normal University paleoanthropologist Xiaobo Feng and his colleagues’ recent digital reconstruction of Yunxian 2 suggests the skulls may actually have belonged to someone a lot more similar to us: a hominin group defined as a species called Homo longi or a Denisovan, depending on who’s doing the naming.

The recent paper adds fuel—and a new twist—to that debate. And the whole thing may hinge on a third skull from the same site, still waiting to be published.


Denisovan or Homo longi?

The Yunxian skull was cracked and broken after hundreds of thousands of years under the crushing weight of all that river mud, but the authors used CT scans to digitally put the pieces back together. (They got some clues from a few intact bits of Yunxian 1, which lay buried in the same layer of mud just 3 meters away.) In the end, Feng and his colleagues found themselves looking at a familiar face; Yunxian 2 bears a striking resemblance to a 146,000-year-old Denisovan skull.

That skull, from Harbin in northeast China, made headlines in 2021 when a team of paleoanthropologists claimed it was part of an entirely new species, which they dubbed Homo longi. According to that first study, Homo longi was a distinct hominin species, separate from us, Neanderthals, and even Denisovans. That immediately became a point of contention because of features the skull shared with some other suspected Denisovan fossils.

Earlier this year, a team of researchers, which included one of the 2021 study’s authors, took samples of ancient proteins preserved in the Harbin skull; of the 95 proteins they found, three of them matched proteins only encoded in Denisovan DNA. While the June 2025 study suggested that Homo longi was a Denisovan all along, the new paper draws a different conclusion: Homo longi is a species that happens to include the population we’ve been calling Denisovans. As study coauthor Xijun Ni, of the Chinese Academy of Sciences, puts it in an email to Ars Technica, “Given their similar age range, distribution areas, and available morphological data, it is likely that Denisovans belong to the Homo longi species. However, little is known about Denisovan morphology.”

Of course, that statement—that we know little about Denisovan morphology (the shapes and features of their bones)—only applies if you don’t accept the results of the June 2025 study mentioned above, which clocked the Harbin skull as a Denisovan and therefore told us what one looks like.

And Feng and his colleagues, in fact, don’t accept those results. Instead, they consider Harbin part of some other group of Homo longi, and they question the earlier study’s methods and results. “The peptide sequences from Harbin, Penghu, and other fossils are too short and provide conflicting information,” Ni tells Ars Technica. Feng and his colleagues also question the results of another study, which used mitochondrial DNA to identify Harbin as a Denisovan.

In other words, Feng and his colleagues are pretty invested in defining Homo longi as a species and Denisovans as just one sub-group of that species. But that’s hard to square with DNA data.

Alas, poor Yunxian 2, I knew him well

Yunxian 2 has a wide face with high, flat cheekbones, a wide nasal opening, and heavy brows. Its cranium is higher and rounder than that of Homo erectus (and than the original reconstruction, done in the 1990s), but it’s still longer and lower than is normal for our species. Overall, it could have held about 1,143 cubic centimeters of brain, which is in the ballpark of modern people. But its shape may have left less room for the frontal lobe (the area where a lot of social skills, logic, motor skills, and executive function happen) than you’d expect in a Neanderthal or a Homo sapiens skull.

Feng and his colleagues measured the distances between 533 specific points on the skull: anatomical landmarks like muscle attachment points or the joints between certain bones. They compared those measurements to ones from 26 fossil hominin skulls and several dozen modern human skulls, using a computer program to calculate how similar each skull was to all of the others.
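For readers curious what that kind of landmark comparison looks like in practice, here is a minimal, hypothetical sketch, not the authors’ actual pipeline: it generates placeholder landmark coordinates for a few named specimens, aligns each pair with a Procrustes superimposition, and clusters the specimens by the resulting shape distances. The specimen names and coordinates are stand-ins for illustration only.

```python
# A rough illustration of landmark-based shape comparison (not the study's code).
# Placeholder data: each "skull" is a (533, 3) array of random 3D landmark points.
import numpy as np
from scipy.spatial import procrustes
from scipy.cluster.hierarchy import linkage, dendrogram

N_LANDMARKS = 533  # number of anatomical landmarks per skull, as in the study
rng = np.random.default_rng(0)
specimens = {
    name: rng.normal(size=(N_LANDMARKS, 3))  # stand-in coordinates, not real fossils
    for name in ["Yunxian 2", "Harbin", "Dali", "Jinniushi", "Modern human"]
}

names = list(specimens)
n = len(names)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        # procrustes() rescales, rotates, and translates one landmark set onto the
        # other, then returns a "disparity": the leftover sum of squared differences.
        _, _, disparity = procrustes(specimens[names[i]], specimens[names[j]])
        dist[i, j] = dist[j, i] = disparity

# Cluster specimens by shape distance; more similar skulls group together first.
condensed = dist[np.triu_indices(n, k=1)]
tree = linkage(condensed, method="average")
print(dendrogram(tree, labels=names, no_plot=True)["ivl"])
```

With real coordinates in place of the random placeholders, this is the general sort of analysis that puts Yunxian 2 in the same cluster as Harbin and the other suspected Denisovan (or Homo longi) skulls described below.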

Yunxian 2 fits neatly into a lookalike group with the Harbin skull, along with two other skulls that paleoanthropologists have flagged as belonging to either Denisovans or Homo longi. Those two skulls are a 200,000- to 260,000-year-old skull found in Dali County in northwestern China and a 260,000-year-old skull from Jinniushi (sometimes spelled Jinniushan) Cave in China.

Those morphological comparisons hint at how the individuals who once inhabited these skulls might have been related to each other, but that’s also where things get dicey.


An older reconstruction of the Yunxian 2 skull gives it a flatter look. Credit: government of Wuhan

Digging into the details

Most of what we know about how we’re related to our closest extinct hominin relatives (Neanderthals and Denisovans) comes from comparing our DNA to theirs and tracking how small changes in the genetic code build up over time. Based on DNA, our species last shared a common ancestor with Neanderthals and Denisovans sometime around 750,000 years ago in Africa. One branch of the family tree led to us; the other branch split again around 600,000 years ago, leading to Neanderthals and Denisovans (or Homo longi, if you prefer).
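Those dates come from molecular-clock arithmetic: count the differences between two genomes and divide by an assumed mutation rate, remembering that changes accumulate along both branches. A back-of-the-envelope version, using ballpark figures commonly cited for humans rather than numbers from this study, looks like this:

```python
# Illustrative molecular-clock arithmetic only; the rate, generation time, and
# divergence fraction below are ballpark assumptions, not values from the paper.
MUTATION_RATE_PER_SITE_PER_GEN = 1.25e-8  # approximate human de novo rate per base pair
GENERATION_TIME_YEARS = 29                # assumed average generation time
rate_per_year = MUTATION_RATE_PER_SITE_PER_GEN / GENERATION_TIME_YEARS

# If two lineages differ at a fraction d of compared sites, and differences
# accumulate along both branches, the split time T is roughly d / (2 * rate).
d = 0.00065  # hypothetical per-site divergence between two lineages
T = d / (2 * rate_per_year)
print(f"Estimated split: roughly {T:,.0f} years ago")  # ~754,000 years with these inputs
```

Real estimates are far more involved—they account for recombination, population structure, and uncertainty in the rates—but the basic logic is the same.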

In other words, DNA tells us that Neanderthals and Denisovans are more closely related to each other than either is to us. (Unless you’re looking at mitochondrial DNA, which suggests that we’re more closely related to Neanderthals than to Denisovans; it’s complicated, and there’s a lot we still don’t understand.)

“Ancient mtDNA and genomic data show different phylogenetic relationships among Denisovans, Neanderthals and Homo sapiens,” says Ni. So depending on which set of data you use and where your hominin tree starts, it can be possible to get different answers about who is most closely related to whom. The fact that all of these groups interbred with each other can explain this complexity, but makes building family trees challenging.

It is very clear, however, that Feng and his colleagues’ picture of the relationships between us and our late hominin cousins, based on similarities among fossil skulls in their study, looks very different from what the genomes tell us. In their model, we’re more closely related to Denisovans, and the Neanderthals are off on their own branch of the family tree. Feng and his colleagues also say those splits happened much earlier, with Neanderthals branching off on their own around 1.38 million years ago; we last shared a common ancestor with Homo longi around 1 million years ago.

That’s a big difference from DNA results, especially when it comes to timing. And the timing is likely to be the biggest controversy here. In a recent commentary on Feng and his colleagues’ study, University of Wisconsin paleoanthropologist John Hawks argues that you can’t just leave genetic evidence out of the picture.

“What this research should have done is to put the anatomical comparisons into context with the previous results from DNA, especially the genomes that enable us to understand the relationships of Denisovan, Neanderthal, and modern human groups,” Hawks writes.

(It’s worth a side note that most news stories describe Yunxian 2 as being a million years old, and so do Feng and his colleagues. But electron spin resonance dating of fossil animal bones from the same sediment layer suggests the skull could be as young as 600,000 years old or as old as 1.1 million. That still needs to be narrowed down to everyone’s satisfaction.)

What’s in a name?

Of course, DNA also tells us that even after all this branching and migrating, the three species were still similar enough to reproduce, which they did several times. Many groups of modern people still carry traces of Neanderthal and Denisovan DNA in their genomes, courtesy of those exchanges. And some ancient Neanderthal populations were carrying around even older chunks of human DNA in the same way. That arguably makes species definitions a little fuzzy at best—and maybe even irrelevant.

“I think all these groups, including Neanderthals, should be recognized within our own species, Homo sapiens,” writes Hawks. He contends that the differences among these hominin groups “were the kind that evolve among the populations of a single species over time, not starkly different groups that tread the landscape in mutually unrecognizable ways.”

But humans love to classify things (a trait we may have shared with Neanderthals and Denisovans), so those species distinctions are likely to persist even if the lines between them aren’t so solid. As long as that’s the case, names and classifications will be fodder for often heated debate. And Feng’s team is staking out a position that’s very different from Hawks’. “‘Denisovan’ is a label for genetic samples taken from the Denisova Cave. It should not be used everywhere. Homo longi is a formally named species,” says Ni.

Technically, Denisovans don’t have a formal species name, a Latinized moniker like Homo erectus that comes with a clear(ish) spot on the family tree. Homo longi would be a more formal species name, but only if scientists can agree on whether they’re actually a species.


An archaeologist comes face to face with the Yunxian 3 skull. Credit: government of Wuhan

The third Yunxian skull

Paleoanthropologists unearthed a third skull from the Yunxian site in 2022. It bears a strong resemblance to the other two from the area (and is apparently in better shape than either of them), and it dates to about the same timeframe. A 2022 press release describes it as “the most complete Homo erectus skull found in Eurasia so far,” but if Feng and his colleagues are right, it may actually be a remarkably complete Homo longi (and/or Denisovan) skull. And it could hold the answers to many of the questions anthropologists like Feng and Hawks are currently debating.

“It remains pretty obvious that Yunxian 3 is going to be central to testing the relationships of this sample [of fossil hominins in Feng and colleagues’ paper],” writes Hawks.

The problem is that Yunxian 3 is still being cleaned and prepared. Preparing a fossil is a painstaking, time-consuming process that involves very carefully excavating it from the rocky matrix it’s embedded in, using everything from air-chisels to paintbrushes. And until that’s done and a scientific report on the skull is published, other paleoanthropologists don’t have access to any information about its features—which would be super useful for figuring out how to define whatever group we eventually decide it belongs to.

For the foreseeable future, the relationships between us and our extinct cousins (or at least our ideas about those relationships) will keep changing as we get more data. Eventually, we may have enough data from enough fossils and ancient DNA samples to form a clearer picture of our past. But in the meantime, if you’re drawing a hominin family tree, use a pencil.

Science, 2025. DOI: 10.1126/science.ado9202

Photo of Kiona N. Smith

Kiona is a freelance science journalist and resident archaeology nerd at Ars Technica.

Is the “million-year-old” skull from China a Denisovan or something else? Read More »

burnout-and-elon-musk’s-politics-spark-exodus-from-senior-xai,-tesla-staff

Burnout and Elon Musk’s politics spark exodus from senior xAI, Tesla staff


Not a fun place to work, apparently

Disillusionment with Musk’s activism, strategic pivots, and mass layoffs causes churn.

Elon Musk’s business empire has been hit by a wave of senior departures over the past year, as the billionaire’s relentless demands and political activism accelerate turnover among his top ranks.

Key members of Tesla’s US sales team, battery and power-train operations, public affairs arm, and its chief information officer have all recently departed, as well as core members of the Optimus robot and AI teams on which Musk has bet the future of the company.

Churn has been even more rapid at xAI, Musk’s two-year-old artificial intelligence start-up, which he merged with his social network X in March. Its chief financial officer and general counsel recently departed after short stints, within a week of each other.

The moves are part of an exodus from the conglomerate of the world’s richest man, as he juggles five companies from SpaceX to Tesla with more than 140,000 employees. The Financial Times spoke to more than a dozen current and former employees to gain an insight into the tumult.

While many left happily after long service to found start-ups or take career breaks, there has also been an uptick in those quitting from burnout, or disillusionment with Musk’s strategic pivots, mass lay-offs and his politics, the people said.

“The one constant in Elon’s world is how quickly he burns through deputies,” said one of the billionaire’s advisers. “Even the board jokes, there’s time and then there’s ‘Tesla time.’ It’s a 24/7 campaign-style work ethos. Not everyone is cut out for that.”

Robert Keele, xAI’s general counsel, ended his 16-month tenure in early August by posting an AI-generated video of a suited lawyer screaming while shoveling molten coal. “I love my two toddlers and I don’t get to see them enough,” he commented.

Mike Liberatore lasted three months as xAI chief financial officer before defecting to Musk’s arch-rival Sam Altman at OpenAI. “102 days—7 days per week in the office; 120+ hours per week; I love working hard,” he said on LinkedIn.

Top lieutenants said Musk’s intensity has been sharpened by the launch of ChatGPT in late 2022, which shook up the established Silicon Valley order.

Employees also perceive Musk’s rivalry with Altman—with whom he co-founded OpenAI, before they fell out—to be behind the pressure being put on staff.

“Elon’s got a chip on his shoulder from ChatGPT and is spending every waking moment trying to put Sam out of business,” said one recent top departee.

Last week, xAI accused its rival of poaching engineers with the aim of “plundering and misappropriating” its code and data center secrets. OpenAI called the lawsuit “the latest chapter in Musk’s ongoing harassment.”

Other insiders pointed to unease about Musk’s support of Donald Trump and advocacy for far-right provocateurs in the US and Europe.

They said some staff dreaded difficult conversations with their families about Musk’s polarizing views on everything from the rights of transgender people to the murder of conservative activist Charlie Kirk.

Musk, Tesla, and xAI declined to comment.

Tesla has traditionally been the most stable part of Musk’s conglomerate. But many of the top team left after it culled 14,000 jobs in April 2024. Some departures were triggered as Musk moved investment away from new EV and battery projects that many employees saw as key to its mission of reducing global emissions—and prioritized robotics, AI, and self-driving robotaxis.

Musk cancelled a program to build a low-cost $25,000 EV that could be sold across emerging markets—dubbed NV-91 internally and Model 2 by fans online, according to five people familiar with the matter.

Daniel Ho, who helped oversee the project as director of vehicle programs and reported directly to Musk, left in September 2024 and joined Google’s self-driving taxi arm, Waymo.

Public policy executives Rohan Patel and Hasan Nazar and the head of the power-train and energy units Drew Baglino also stepped down after the pivot. Rebecca Tinucci, leader of the supercharger division, went to Uber after Musk fired the entire team and slowed construction on high-speed charging stations.

In late summer, David Zhang, who was in charge of the Model Y and Cybertruck rollouts, departed. Chief information officer Nagesh Saldi left in November.

Vineet Mehta, a company veteran of 18 years, described as “critical to all things battery” by a colleague, resigned in April. Milan Kovac, in charge of the Optimus humanoid robotics program, departed in June.

He was followed this month by Ashish Kumar, the Optimus AI team lead, who moved to Meta. “Financial upside at Tesla was significantly larger,” wrote Kumar on X in response to criticism he left for money. “Tesla is known to compensate pretty well, way before Zuck made it cool.”

Amid a sharp fall in sales—which many blame on Musk alienating liberal customers—Omead Afshar, a close confidant known as the billionaire’s “firefighter” and “executioner,” was dismissed as head of sales and operations in North America in June. Afshar’s deputy Troy Jones followed shortly after, ending 15 years of service.

“Elon’s behavior is affecting morale, retention, and recruitment,” said one long-standing lieutenant. He “went from a position from where people of all stripes liked him, to only a certain section.”

Few who depart criticize Musk for fear of retribution. But Giorgio Balestrieri, who had worked for Tesla for eight years in Spain, is among a handful to go public, saying this month he quit believing that Musk had done “huge damage to Tesla’s mission and to the health of democratic institutions.”

“I love Tesla and my time there,” said another recent leaver. “But nobody that I know there isn’t thinking about politics. Who the hell wants to put up with it? I get calls at least once a week. My advice is, if your moral compass is saying you need to leave, that isn’t going to go away.”

But Tesla chair Robyn Denholm said: “There are always headlines about people leaving, but I don’t see the headlines about people joining.

“Our bench strength is outstanding… we actually develop people really well at Tesla and we are still a magnet for talent.”

At xAI, some staff have balked at Musk’s free-speech absolutism and perceived lax approach to user safety as he rushes out new AI features to compete with OpenAI and Google. Over the summer, the Grok chatbot integrated into X praised Adolf Hitler, after Musk ordered changes to make it less “woke.”

Ex-CFO Liberatore was among the executives who clashed with some of Musk’s inner circle over corporate structure and tough financial targets, people with knowledge of the matter said.

“Elon loyalists who exhibit his traits are laying off people and making decisions on safety that I think are very concerning for people internally,” one of the people added. “Mike is a business guy, a capitalist. But he’s also someone who does stuff the right way.”

The Wall Street Journal first reported some of the details of the internal disputes.

Linda Yaccarino, chief executive of X, resigned in July after the social media platform was subsumed by xAI. She had grown frustrated with Musk’s unilateral decision-making and his criticism over advertising revenue.

xAI’s co-founder and chief engineer, Igor Babuschkin, stepped down a month later to found his own AI safety research project.

Communications executives Dave Heinzinger and John Stoll spent three and nine months at X, respectively, before returning to their former employers, according to people familiar with the matter.

X also lost a rash of senior engineers and product staff who reported directly to Musk and were helping to navigate the integration with xAI.

This includes head of product engineering Haofei Wang and consumer product and payments boss Patrick Traughber. Uday Ruddarraju, who oversaw X and xAI’s infrastructure engineering, and infrastructure engineer Michael Dalton were poached by OpenAI.

Musk shows no sign of relenting. xAI’s flirtatious “Ani bot” has caused controversy over sexually explicit interactions with teenage Grok app users. But the company’s owner has installed a hologram of Ani in the lobby of xAI to greet staff.

“He’s the boss, the alpha and anyone who doesn’t treat him that way, he finds a way to delete,” one former top Tesla executive said.

“He does not have shades of grey, is highly calculated, and focused… that makes him hard to work with. But if you’re aligned with the end goal, and you can grin and bear it, it’s fine. A lot of people do.”

Additional reporting by George Hammond.

© 2025 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.

Burnout and Elon Musk’s politics spark exodus from senior xAI, Tesla staff Read More »