Science


SpaceX launches a pair of NASA satellites to probe the origins of space weather


“This is going to really help us understand how to predict space weather in the magnetosphere.”

This artist’s illustration of the Earth’s magnetosphere shows the solar wind (left) streaming from the Sun, and then most of it being blocked by Earth’s magnetic field. The magnetic field lines seen here fold in toward Earth’s surface at the poles, creating polar cusps. Credit: NASA/Goddard Space Flight Center

Two NASA satellites rocketed into orbit from California aboard a SpaceX Falcon 9 rocket Wednesday, commencing a $170 million mission to study a phenomenon of space physics that has eluded researchers since the dawn of the Space Age.

The twin spacecraft are part of the NASA-funded TRACERS mission, which will spend at least a year measuring plasma conditions in narrow regions of Earth’s magnetic field known as polar cusps. As the name suggests, these regions are located over the poles. They play an important but poorly understood role in creating colorful auroras as plasma streaming out from the Sun interacts with the magnetic field surrounding Earth.

The same process drives geomagnetic storms capable of disrupting GPS navigation, radio communications, electrical grids, and satellite operations. These outbursts are usually triggered by solar flares or coronal mass ejections that send blobs of plasma out into the Solar System. If one of these flows happens to be aimed at Earth, we are treated to auroras but left vulnerable to the storm’s harmful effects.

For example, an extreme geomagnetic storm last year degraded GPS navigation signals, resulting in more than $500 million in economic losses in the agriculture sector as farms temporarily suspended spring planting. In 2022, a period of elevated solar activity contributed to the loss of 40 SpaceX Starlink satellites.

“Understanding our Sun and the space weather it produces is more important to us here on Earth, I think, than most realize,” said Joe Westlake, director of NASA’s heliophysics division.

NASA’s two TRACERS satellites launched Wednesday aboard a SpaceX Falcon 9 rocket from Vandenberg Space Force Base, California. Credit: SpaceX

The launch of TRACERS was delayed 24 hours after a regional power outage disrupted air traffic control over the Pacific Ocean near the Falcon 9 launch site on California’s Central Coast, according to the Federal Aviation Administration. SpaceX called off the countdown Tuesday less than a minute before liftoff, then rescheduled the flight for Wednesday.

TRACERS, short for Tandem Reconnection and Cusp Electrodynamics Reconnaissance Satellites, will study a process known as magnetic reconnection. As particles in the solar wind head out into the Solar System at up to 1 million mph, they bring along pieces of the Sun’s magnetic field. When the solar wind reaches our neighborhood, it begins interacting with Earth’s magnetic field.

The high-energy collision breaks and reconnects magnetic field lines, flinging solar wind particles across Earth’s magnetosphere at speeds that can approach the speed of light. Earth’s field draws some of these particles into the polar cusps, down toward the upper atmosphere. This is what creates dazzling auroral light shows and potentially damaging geomagnetic storms.

Over our heads

But scientists still aren’t sure how it all works, despite the fact that it’s happening right over our heads, within the reach of countless satellites in low-Earth orbit. A single spacecraft won’t do the job, though: scientists need at least two, each positioned in a bespoke polar orbit and specially instrumented to measure magnetic fields, electric fields, electrons, and ions.

That’s because magnetic reconnection is a dynamic process, and a single satellite would provide just a snapshot of conditions over the polar cusps every 90 minutes. By the time the satellite comes back around on another orbit, conditions will have changed, but scientists wouldn’t know how or why, according to David Miles, principal investigator for the TRACERS mission at the University of Iowa.

“You can’t tell, is that because the system itself is changing?” Miles said. “Is that because this magnetic reconnection, the coupling process, is moving around? Is it turning on and off, and if it’s turning on and off, how quickly can it do it? Those are fundamental things that we need to understand… how the solar wind arriving at the Earth does or doesn’t transfer energy to the Earth system, which has this downstream effect of space weather.”

This is why the tandem part of the TRACERS name is important. The novel part of this mission is that it features two identical spacecraft, each about the size of a washing machine, flying at an altitude of 367 miles (590 kilometers). Over the course of the next few weeks, the TRACERS satellites will drift into a formation with one trailing the other by about two minutes as they zip around the world at nearly five miles per second. This positioning will allow the satellites to sample the polar cusps one right after the other, instead of forcing scientists to wait another 90 minutes for a data refresh.
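Those numbers are easy to sanity-check with textbook orbital mechanics. Here is a quick back-of-the-envelope sketch in Python, assuming a circular orbit; everything in it other than the article’s 590-kilometer altitude is a standard physical constant:

```python
import math

MU = 398600.4418   # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6371.0   # Earth's mean radius, km
ALTITUDE = 590.0   # TRACERS altitude from the article, km

a = R_EARTH + ALTITUDE                  # circular orbit radius, km
v = math.sqrt(MU / a)                   # orbital speed, km/s
T = 2 * math.pi * math.sqrt(a**3 / MU)  # orbital period, s

print(f"speed:  {v:.2f} km/s ({v / 1.609:.2f} mi/s)")   # ~7.57 km/s, ~4.7 mi/s
print(f"period: {T / 60:.1f} minutes")                  # ~96 minutes
print(f"120 s spacing: {120 * v:.0f} km along track")   # ~900 km between the pair
```

The result lines up with the article: just under five miles per second, with each orbit taking roughly an hour and a half.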

With TRACERS, scientists hope to pick apart smaller, fast-moving changes with each satellite pass. Within a year, TRACERS should collect 3,000 measurements of magnetic reconnection, a sample size large enough to start identifying why some space weather events evolve differently than others.

“Not only will it get a global picture of reconnection in the magnetosphere, but it’s also going to be able to statistically study how reconnection depends on the state of the solar wind,” said John Dorelli, TRACERS mission scientist at NASA’s Goddard Space Flight Center. “This is going to really help us understand how to predict space weather in the magnetosphere.”

One of the two TRACERS satellites undergoes launch preparations at Millennium Space Systems, the spacecraft’s manufacturer. Credit: Millennium Space Systems

“If we can understand these various different situations, whether it happens suddenly if you have one particular kind of event, or it happens in lots of different places, then we have a better way to model that and say, ‘Ah, here’s the likelihood of seeing a certain kind of effect that would affect humans,'” said Craig Kletzing, the principal investigator who led the TRACERS science team until his death in 2023.

There is broader knowledge to be gained with a mission like TRACERS. Magnetic reconnection is ubiquitous throughout the Universe, and the same physical processes produce solar flares and coronal mass ejections from the Sun.

Hitchhiking to orbit

Several other satellites shared the ride to space with TRACERS on Wednesday.

These secondary payloads included a NASA-sponsored mission named PExT, a small technology demonstration satellite carrying an experimental communications package capable of connecting with three different networks: NASA’s government-owned Tracking and Data Relay Satellites (TDRS) and commercial satellite networks owned by SES and Viasat.

What’s unique about the Polylingual Experimental Terminal, or PExT, is its ability to roam across multiple satellite relay networks. The International Space Station and other satellites in low-Earth orbit currently connect to controllers on the ground through NASA’s TDRS satellites. But NASA will retire its TDRS satellites in the 2030s and begin purchasing data relay services using commercial satellite networks.

The space agency expects to have multiple data relay providers, so radios on future NASA satellites must be flexible enough to switch between networks mid-mission. PExT is a pathfinder for these future missions.

Another NASA-funded tech demo named Athena EPIC was also aboard the Falcon 9 rocket. Led by NASA’s Langley Research Center, the mission rides on a scalable satellite platform developed by a company named NovaWurks, which pieces together building-block modules to provide everything a spacecraft needs to operate in space.

Athena EPIC hosts a single science instrument to measure how much energy Earth radiates into space, an important data point for climate research. But the mission’s real goal is to showcase how an adaptable satellite design, such as this one using NovaWurks’ building block approach, might be useful for future NASA missions.

A handful of other payloads rounded out the manifest for Wednesday’s launch. They included REAL, a NASA-funded CubeSat project to investigate the Van Allen radiation belts and space weather, and LIDE, an experimental 5G communications satellite backed by the European Space Agency. Five commercial spacecraft from the Australian company Skykraft also launched to join a constellation of small satellites providing tracking and voice communications between air traffic controllers and aircraft over remote parts of the world.


Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.



What exactly is Golden Dome? This Space Force general owes Trump an answer.


“Basically, I’ve been given 60 days to come up with the objective architecture.”

Gen. Michael Guetlein, overseeing the development of the Golden Dome missile defense system, looks on as President Donald Trump speaks in the Oval Office of the White House on May 20, 2025, in Washington, DC. Credit: Jim Watson/AFP via Getty Images

The newly installed head of the Pentagon’s Golden Dome missile defense shield, a monumental undertaking projected to cost $175 billion over the next three years, knows the clock is ticking to show President Donald Trump some results before the end of his term in the White House.

“We are going to try to craft a schedule to have incremental demonstrations every six months because we are on a short timeline,” said Gen. Michael Guetlein, who was confirmed by the Senate last week to become the military’s Golden Dome czar.

Speaking on Tuesday, his second day on the job leading the Golden Dome initiative, Guetlein said his team will “move out with a sense of urgency and move out with incremental wins” as the military races to meet Trump’s timeline.

Guetlein discussed his new job with retired Gen. John “Jay” Raymond, the first chief of the Space Force, at an event in Washington, DC, hosted by the Space Foundation.

Analysts and retired military officials doubt the Pentagon can achieve all of Trump’s Golden Dome promises by the end of 2028. It’s not yet clear what the Pentagon can finish in three years, but Guetlein said Tuesday his team will deliver “a capability” on that schedule. “We’ve got to exploit anything and everything we’ve possibly got,” he said, echoing a tenet of Space Force policy to “exploit what we have, buy what we can, and build what we must.”

This means the Space Force will lean heavily on commercial companies, research labs, academia, and, in the case of Canada, international partners to build the Golden Dome.

“Golden Dome for America requires a whole-of-nation response to deter and, if necessary, to defeat attacks against the United States,” the Defense Department said in a statement Tuesday. “We have the technological foundation, national talent, and decisive leadership to advance our nation’s defenses. We are proud to stand behind Gen. Mike Guetlein as he takes the helm of this national imperative.”

President Trump signed an executive order in January calling for the development of a layered missile defense shield to protect the US homeland. He initially called the project the Iron Dome for America, named for Israel’s Iron Dome missile defense system. But Israel’s Iron Dome, which has proven effective against missile attacks from Iran and its proxies in the Middle East, only has to defend an area the size of New Jersey. The Pentagon’s system, now named Golden Dome, will ostensibly cover the entire United States.

Lay of the land

Advocates for the Golden Dome point to recent events to justify the program. These include Russia’s first use of an intermediate-range ballistic missile against Ukraine last year, and Ukraine’s successful drone attack on a Russian airbase last month. Waves of Iranian missile and drone attacks on Israel have tested the mettle of that country’s Iron Dome.

In the January 27 executive order, the White House said the military’s plan must defend against many types of aerial threats, including ballistic, hypersonic, and advanced cruise missiles, plus “other next-generation aerial attacks,” a category that appears to include drones and shorter-range unguided missiles.

This will require a network of sensors on the ground and in space, including heat-seeking sensors and radars to track incoming aerial threats, and interceptors based on the ground, at sea, and in space capable of destroying missiles at any point in flight—boost phase, midcourse, and during final approach to a target.

This illustration shows how the Missile Defense Agency’s HBTSS satellites can track hypersonic missiles as they glide and maneuver through the atmosphere, evading detection by conventional missile-tracking spacecraft, such as the Space Force’s DSP and SBIRS satellites. Credit: Northrop Grumman

The good news for backers of the Golden Dome program is that the Pentagon and commercial industry were developing most of these elements before Trump’s executive order. The Space Development Agency (SDA) launched a batch of prototype missile-tracking and data-relay satellites in 2023, pathfinders for a constellation of hundreds of spacecraft in low-Earth orbit that will begin launching later this year.

In some cases, the military has already fielded Golden Dome components in combat. The Army has operated the Patriot missile system since the 1980s and the Terminal High Altitude Area Defense (THAAD) interceptors for more than 15 years to defend against lower-level threats like small rockets, aircraft, and drones. The Navy’s Aegis Ballistic Missile Defense System uses sea-launched interceptors to target longer-range missiles in space.

The Missile Defense Agency manages the Ground-based Midcourse Defense (GMD) program, which consists of operational silo-launched missile interceptors based in Alaska and California that could be used to defend against a limited missile strike from a rogue state like North Korea.

GMD has cost approximately $70 billion to date and has worked a little more than half the time the military has tested it against a missile target. On the plus side, GMD has achieved four straight successful intercepts in tests since 2014. But despite its immense cost, GMD is antiquated and would not be effective against a large volley of missiles coming from another nuclear superpower, like China.

Golden Dome will bring all of these systems together, and add more to the mix in order to “double down on the protection of the homeland and protect our American citizens,” Guetlein said.

What’s next?

Guetlein identified several short-term priorities for what is officially called the “Office of Golden Dome for America.” One of them is to begin bringing together the military’s existing missile detection and tracking assets, ground- and sea-based interceptors, and the communication pathways, or “comm pipes,” to connect all the pieces in a sophisticated command-and-control network.

“That includes the sensors, that includes the shooters, as well as the comm pipes,” Guetlein said. “How do we bring all that to bear simultaneously in protection of the homeland, while utilizing the capabilities that are already there and not trying to re-create them?”

The Pentagon said in a statement Tuesday that Guetlein’s office will devise an “objective architecture” for the missile defense shield and “socialize” it by late September. This presumably means sharing some information about the architecture with Congress and the public. So far, Space Force officials have hesitated to provide any specifics, at least in public statements and congressional hearings. They often prefer to describe Golden Dome as a “system of systems” instead of something entirely new.

“Basically, I’ve been given 60 days to come up with the objective architecture. I owe that back to the Deputy Secretary of Defense in 60 days,” Guetlein said. “So, in 60 days, I’ll be able to talk in depth about, ‘Hey, this is our vision for what we want to get after for Golden Dome.'”

Although the major pieces of a layered anti-missile system like Golden Dome may appear obvious to anyone with a casual familiarity with missile defense and space—we just named a few of these elements above—the Trump administration has not published any document describing what the Pentagon might actually achieve in the next three years.

Despite the lack of detail, Congress voted to approve $25 billion as a down payment for Golden Dome in the Trump-backed “One Big Beautiful Bill” signed into law July 4. The bulk of the Golden Dome-related budget is earmarked for procurement of more Patriot and THAAD missile batteries, an increase in funding for SDA’s missile-tracking satellites, ballistic missile defense command-and-control networks, and development of “long-range kill chains” for combat targeting.

Two of the US Army’s THAAD missile batteries are seen deployed in Israel in this 2019 photo. Credit: US Army/Staff Sgt. Cory Payne

So, most of the funding allocated to Golden Dome over the next year will go toward bolstering programs already in the Pentagon’s portfolio. But the military will tie them all together with an integrated command-and-control system that can sense an adversarial missile launch, plot its trajectory, and then generate a targeting solution and send it to an interceptor on the ground or in space to eliminate the threat.

Eventually, military leaders want satellites to handle all of these tasks autonomously in space and do it fast enough for US or allied forces to respond to an imminent threat.

“We know how to get data,” a retired senior military official recently told Ars. “The question is, how do you fuse that data in real time with the characteristics of a fire control system, which means real-time feedback of all this data, filtering that data, filtering out sensors that aren’t helping as much as other ones, and then using that to actually command and control against a large-scale attack of diverse threats.

“I feel like those are still two different things,” said the official, who spoke on background with Ars. “It’s one thing to have all the data and be able to process it. It’s another thing to be able to put it into an active, real-time fire control system.”

Trump introduced Guetlein, the Space Force’s former vice chief of space operations, as his nominee for director of the Golden Dome program in an Oval Office event on May 20. At the time, Trump announced the government had “officially selected an architecture” for Golden Dome. That appears to still be the work in front of Guetlein and his team, which is set to grow with new hiring but will remain “small and flat,” the general said Tuesday.

Guetlein has a compelling résumé to lead Golden Dome. Before becoming the second-ranking officer in the Space Force, he served as head of Space Systems Command, which is responsible for most of the service’s acquisition and procurement activities. His prior assignments included stints as deputy director of the National Reconnaissance Office, program executive at the Missile Defense Agency, program manager for the military’s missile warning satellites, and corporate fellow at SpaceX.

Weapons in space

Guetlein identified command and control and the development of space-based interceptors as two of the most pressing technical challenges for Golden Dome. He believes the command-and-control problem can be “overcome in pretty short order.”

“I think the real technical challenge will be building the space-based interceptor,” Guetlein said. “That technology exists. I believe we have proven every element of the physics that we can make it work. What we have not proven is, first, can I do it economically, and then second, can I do it at scale? Can I build enough satellites to get after the threat? Can I expand the industrial base fast enough to build those satellites? Do I have enough raw materials, etc.?”

This is the challenge that ultimately killed the Strategic Defense Initiative (SDI) or “Star Wars” program proposed by former President Ronald Reagan in the 1980s as a way to counter the threat of a nuclear missile attack from the Soviet Union. The first concept for SDI called for 10,000 interceptors to be launched into Earth orbit. This was pared down to 4,600, then finally to fewer than 1,000 before the cancellation of the space-based element in 1993.

Thirty years ago, the United States lacked the technology and industrial capacity to build and launch so many satellites. It’s a different story today. SpaceX has launched more than 9,000 Starlink communications satellites in six years, and Amazon recently kicked off the deployment of more than 3,200 Internet satellites of its own.

Space-based interceptors are a key tenet of Trump’s executive order on Golden Dome. Specifically, the order calls for space-based interceptors capable of striking a ballistic missile during its boost phase shortly after launch. These interceptors would essentially be small satellites positioned in low-Earth orbit, likely a few hundred miles above the planet, circling the world every 90 minutes ready for commands to prevent nuclear Armageddon.

A Standard Missile 3 Block IIA launches from the Aegis Ashore Missile Defense Test Complex at the Pacific Missile Range Facility in Kauai, Hawaii, on December 10, 2018, during a test to intercept an intermediate-range ballistic missile target in space. Credit: Mark Wright/DOD

Reuters reported Tuesday that the Defense Department, which reportedly favored SpaceX to play a central role in Golden Dome, is now looking to other companies, including Amazon’s Project Kuiper and other big defense contractors. SpaceX founder Elon Musk has fallen out of favor with the Trump administration, but the company’s production line continues to churn out spacecraft for the National Reconnaissance Office’s global constellation of spy satellites. And it’s clear the cheapest and most reliable way to launch Golden Dome interceptors into orbit will be using SpaceX’s Falcon 9 rocket.

How many space-based interceptors?

“I would envision that there would be certainly more than 1,000 of those in orbit in different orbital planes,” said retired Air Force Gen. Henry “Trey” Obering III, a senior executive advisor at Booz Allen Hamilton and former commander of the Missile Defense Agency. “You could optimize those orbital planes against the Russian threat or Chinese threat, or both, or all the above, between Iran, North Korea, China, and Russia.”

In an interview with Ars, Obering suggested the interceptors could be modest in size and mass, somewhat smaller than SpaceX’s Starlink satellites, and could launch 100 or 200 at a time on a rocket like SpaceX’s Falcon 9. None of this capability existed in the Reagan era.

Taking all of that into account, it’s understandable why Guetlein and others believe Golden Dome is doable.

But major questions remain unanswered about its ultimate cost and the realism of Trump’s three-year schedule. Some former defense officials have questioned the technical viability of using space-based interceptors to target a missile during its boost phase, within the first few minutes of launch.

It’s true that there are also real emerging threats, such as hypersonic missiles and drones, that the US military is currently ill-equipped to defend against.

“The strategic threats are diversifying, and then the actors are diversifying,” the former military space official told Ars. “It’s no longer just Russia. It’s China now, and to a lesser extent, North Korea and potentially Iran. We’ll see where that goes. So, when you put that all together, our ability to deter and convince a potential adversary, or at least make them really uncertain about how successful they could be with a strike, is degraded compared to what it used to be.”

The official said the Trump administration teed up the Golden Dome executive order without adequately explaining the reasons for it. That’s a political failing that could come back to bite the program. The lack of clarity didn’t stop Congress from approving this year’s $25 billion down payment, but there are more key decision points ahead.

“I’m a little disappointed no one’s really defined the problem very well,” the retired military official said. “It definitely started out as a solution without a problem statement, like, ‘I need an Iron Dome, just like Israel.’ But I feel like the entire effort would benefit from a better problem statement.”




Conspiracy theorists don’t realize they’re on the fringe


Gordon Pennycook: “It might be one of the biggest false consensus effects that’s been observed.”

Credit: Aurich Lawson / Thinkstock

Belief in conspiracy theories is often attributed to some form of motivated reasoning: People want to believe a conspiracy because it reinforces their worldview, for example, or doing so meets some deep psychological need, like wanting to feel unique. However, it might also be driven by overconfidence in their own cognitive abilities, according to a paper published in the Personality and Social Psychology Bulletin. The authors were surprised to discover that not only are conspiracy theorists overconfident, they also don’t realize their beliefs are on the fringe, massively overestimating, by as much as a factor of four, how much other people agree with them.

“I was expecting the overconfidence finding,” co-author Gordon Pennycook, a psychologist at Cornell University, told Ars. “If you’ve talked to someone who believes conspiracies, it’s self-evident. I did not expect them to be so ready to state that people agree with them. I thought that they would overestimate, but I didn’t think that there’d be such a strong sense that they are in the majority. It might be one of the biggest false consensus effects that’s been observed.”

In 2015, Pennycook made headlines when he co-authored a paper demonstrating how certain people interpret “pseudo-profound bullshit” as deep observations. Pennycook et al. were interested in identifying individual differences between those who are susceptible to pseudo-profound BS and those who are not and thus looked at conspiracy beliefs, their degree of analytical thinking, religious beliefs, and so forth.

They presented several randomly generated statements, containing “profound” buzzwords, that were grammatically correct but made no sense logically, along with a 2014 tweet by Deepak Chopra that met the same criteria. They found that the less skeptical participants were less logical and analytical in their thinking and hence much more likely to consider these nonsensical statements as being deeply profound. That study was a bit controversial, in part for what was perceived to be its condescending tone, along with questions about its methodology. But it did snag Pennycook et al. a 2016 Ig Nobel Prize.

Last year we reported on another Pennycook study, presenting results from experiments in which an AI chatbot engaged in conversations with people who believed at least one conspiracy theory. That study showed that the AI interaction significantly reduced the strength of those beliefs, even two months later. The secret to its success: the chatbot, with its access to vast amounts of information across an enormous range of topics, could precisely tailor its counterarguments to each individual. “The work overturns a lot of how we thought about conspiracies, that they’re the result of various psychological motives and needs,” Pennycook said at the time.

Miscalibrated from reality

Pennycook has been working on this new overconfidence study since 2018, perplexed by observations indicating that people who believe in conspiracies also seem to have a lot of faith in their cognitive abilities—contradicting prior research finding that conspiracists are generally more intuitive. To investigate, he and his co-authors conducted eight separate studies that involved over 4,000 US adults.

The assigned tasks were designed in such a way that participants’ actual performance and how they perceived their performance were unrelated. For example, in one experiment, they were asked to guess the subject of an image that was largely obscured. The subjects were then asked direct questions about their belief (or lack thereof) concerning several key conspiracy claims: the Apollo Moon landings were faked, for example, or that Princess Diana’s death wasn’t an accident. Four of the studies focused on testing how subjects perceived others’ beliefs.

The results showed a marked association between subjects’ tendency to be overconfident and belief in conspiracy theories. And while conspiracy claims were endorsed by a majority of participants just 12 percent of the time, believers thought they were in the majority 93 percent of the time. This suggests that overconfidence is a primary driver of belief in conspiracies.

That’s not to say believers in conspiracy theories are massively overconfident; there is no data on the degree, because the studies didn’t set out to quantify it, per Pennycook. Rather, “They’re overconfident, and they massively overestimate how much people agree with them,” he said.

Ars spoke with Pennycook to learn more.

Ars Technica: Why did you decide to investigate overconfidence as a contributing factor to believing conspiracies?

Gordon Pennycook: There’s a popular sense that people believe conspiracies because they’re dumb and don’t understand anything, they don’t care about the truth, and they’re motivated by believing things that make them feel good. Then there’s the academic side, where that idea molds into a set of theories about how needs and motivations drive belief in conspiracies. It’s not someone falling down the rabbit hole and getting exposed to misinformation or conspiratorial narratives. They’re strolling down: “I like it over here. This appeals to me and makes me feel good.”

Believing things that no one else agrees with makes you feel unique. Then there’s various things I think that are a little more legitimate: People join communities and there’s this sense of belongingness. How that drives core beliefs is different. Someone may stop believing but hang around in the community because they don’t want to lose their friends. Even with religion, people will go to church when they don’t really believe. So we distinguish beliefs from practice.

What we observed is that they do tend to strongly believe these conspiracies despite the fact that there’s counter evidence or a lot of people disagree. What would lead that to happen? It could be their needs and motivations, but it could also be that there’s something about the way that they think where it just doesn’t occur to them that they could be wrong about it. And that’s where overconfidence comes in.

Ars Technica: What makes this particular trait such a powerful driving force?

Gordon Pennycook: Overconfidence is one of the most important core underlying components, because if you’re overconfident, it stops you from really questioning whether the thing that you’re seeing is right or wrong, and whether you might be wrong about it. You have an almost moral purity of complete confidence that the thing you believe is true. You cannot even imagine what it’s like from somebody else’s perspective. You couldn’t imagine a world in which the things that you think are true could be false. Having overconfidence is that buffer that stops you from learning from other people. You end up not just going down the rabbit hole, you’re doing laps down there.

Overconfidence doesn’t have to be learned, parts of it could be genetic. It also doesn’t have to be maladaptive. It’s maladaptive when it comes to beliefs. But you want people to think that they will be successful when starting new businesses. A lot of them will fail, but you need some people in the population to take risks that they wouldn’t take if they were thinking about it in a more rational way. So it can be optimal at a population level, but maybe not at an individual level.

Ars Technica: Is this overconfidence related to the well-known Dunning-Kruger effect?

Gordon Pennycook: It’s because of Dunning-Kruger that we had to develop a new methodology to measure overconfidence, because the people who are the worst at a task are the worst at knowing that they’re the worst at the task. But that’s because the same things that you use to do the task are the things you use to assess how good you are at the task. So if you were to give someone a math test and they’re bad at math, they’ll appear overconfident. But if you give them a test of assessing humor and they’re good at that, they won’t appear overconfident. That’s about the task, not the person.

So we have tasks where people essentially have to guess, and it’s transparent. There’s no reason to think that you’re good at the task. In fact, people who think they’re better at the task are not better at it, they just think they are. They just have this underlying kind of sense that they can do things, they know things, and that’s the kind of thing that we’re trying to capture. It’s not specific to a domain. There are lots of reasons why you could be overconfident in a particular domain. But this is something that’s an actual trait that you carry into situations. So when you’re scrolling online and come up with these ideas about how the world works that don’t make any sense, it must be everybody else that’s wrong, not you.

Ars Technica: Overestimating how many people agree with them seems to be at odds with conspiracy theorists’ desire to be unique.  

Gordon Pennycook: In general, people who believe conspiracies often have contrary beliefs. We’re working with a population where coherence is not to be expected. They say that they’re in the majority, but it’s never a strong majority. They just don’t think that they’re in a minority when it comes to the belief. Take the case of the Sandy Hook conspiracy, where adherents believe it was a false flag operation. In one sample, 8 percent of people thought that this was true. That 8 percent thought 61 percent of people agreed with them.

So they’re way off. They really, really miscalibrated. But they don’t say 90 percent. It’s 60 percent, enough to be special, but not enough to be on the fringe where they actually are. I could have asked them to rank how smart they are relative to others, or how unique they thought their beliefs were, and they would’ve answered high on that. But those are kind of mushy self-concepts. When you ask a specific question that has an objectively correct answer in terms of the percent of people in the sample that agree with you, it’s not close.

Ars Technica: How does one even begin to combat this? Could last year’s AI study point the way?

Gordon Pennycook: The AI debunking effect works better for people who are less overconfident. In those experiments, very detailed, specific debunks had a much bigger effect than people expected. After eight minutes of conversation, a quarter of the people who believed the thing didn’t believe it anymore, but 75 percent still did. That’s a lot. And some of them, not only did they still believe it, they still believed it to the same degree. So no one’s cracked that. Getting any movement at all in the aggregate was a big win.

Here’s the problem. You can’t have a conversation with somebody who doesn’t want to have the conversation. In those studies, we’re paying people, but they still get out what they put into the conversation. If you don’t really respond or engage, then our AI is not going to give you good responses because it doesn’t know what you’re thinking. And if the person is not willing to think. … This is why overconfidence is such an overarching issue. The only alternative is some sort of propagandistic sit-them-downs with their eyes open and try to de-convert them. But you can’t really convert someone who doesn’t want to be converted. So I’m not sure that there is an answer. I think that’s just the way that humans are.

Personality and Social Psychology Bulletin, 2025. DOI: 10.1177/01461672251338358  (About DOIs).


Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.



LA’s Museum of Jurassic Technology damaged by fire

Not all of the artifacts housed within the MJT’s labyrinthine space are, shall we say, truly historical; Wilson has a sense of humor, a vivid imagination, and a cheeky fondness for the absurd. Lawrence Weschler tracked down the provenance (where relevant) of the exhibits in his 1996 book, Mr. Wilson’s Cabinet of Wonder: Pronged Ants, Horned Humans, Mice on Toast, and Other Marvels of Jurassic Technology. (It’s a delightful read.)

Weschler’s blog provides the most detailed account of what happened when the fire broke out on the night of July 8. Wilson, who lives out back, saw what was happening, grabbed a couple of fire extinguishers, and ran to the gift shop entry hall, where he emptied the canisters into what he described as “a ferocious column of flame lapping up the far street-facing corner wall.”

That wasn’t enough to put out the fire, but fortunately, Wilson’s daughter and son-in-law soon arrived with a much bigger extinguisher and doused the flames. Firefighters showed up shortly thereafter to stamp out any lingering embers and told Wilson, “Just one more minute and you’d likely have lost the whole building.” Wilson said the smoke damage looked “as if a thin creamy brown liquid had been evenly poured over all the surfaces—the walls, the vitrines, the ceiling, the carpets, and eyepieces, everything.”

Staff and volunteers have been working to repair the damage ever since, with smoke damage repairs being particularly labor-intensive. Weschler closed his blog post with a call for donations to the MJT’s general fund to help the cash-strapped museum weather this particular storm, praising the MJT as “one of the most truly sublime institutions in the country.”



Win for chemical industry as EPA shutters scientific research office


Deregulation runs rampant

Companies feared rules and lawsuits based on Office of Research and Development assessments.

Soon after President Donald Trump took office in January, a wide array of petrochemical, mining, and farm industry coalitions ramped up what has been a long campaign to limit use of the Environmental Protection Agency’s assessments of the health risks of chemicals.

That effort scored a significant victory Friday when EPA Administrator Lee Zeldin announced his decision to dismantle the agency’s Office of Research and Development (ORD).

The industry lobbyists didn’t ask for hundreds of ORD staff members to be laid off or reassigned. But the elimination of the agency’s scientific research arm goes a long way toward achieving the goal they sought.

In a January 27 letter to Zeldin organized by the American Chemistry Council, more than 80 industry groups—including leading oil, refining, and mining associations—asked him to end regulators’ reliance on ORD assessments of the risks that chemicals pose for human health. The future of that research, conducted under EPA’s Integrated Risk Information System program, or IRIS, is now uncertain.

“EPA’s IRIS program within ORD has a troubling history of being out of step with the best available science and methods, lacking transparency, and being unresponsive to peer review and stakeholder recommendations,” said an American Chemistry Council spokesperson in an email when asked about the decision to eliminate ORD. “This results in IRIS assessments that jeopardize access to critical chemistries, undercut national priorities, and harm American competitiveness.”

The spokesperson said the organization supports EPA evaluating its resources to ensure tax dollars are being used efficiently and effectively.

Christopher Frey, an associate dean at North Carolina State University who served as EPA assistant administrator in charge of ORD during the Biden administration, defended the quality of the science done by the office, which he said is “the poster case study of what it means to do science that’s subject to intense scrutiny.”

“There’s industry with a tremendous vested interest in the policy decisions that might occur later on” based on ORD’s assessments, Frey said. “What the industry does is try to engage in a proxy war over the policy by attacking the science.”

Among the IRIS assessments that stirred the most industry concern were those outlining the dangers of formaldehyde, ethylene oxide, arsenic, and hexavalent chromium. Regulatory actions had begun or were looming on all during the Biden administration.

The Biden administration also launched a lawsuit against Denka Performance Elastomer, whose LaPlace, Louisiana, plant had been the only US manufacturer of neoprene, based in part on the IRIS assessment of one of the plant’s air pollutants, chloroprene, as a likely human carcinogen. Denka, a spinoff of DuPont, announced it was ceasing production in May because of the cost of pollution controls.

Public health advocates charge that eliminating the IRIS program, or shifting its functions to other offices in the agency, will rob the EPA of the independent expertise to inform its mission of protection.

“They’ve been trying for years to shut down IRIS,” said Darya Minovi, a senior analyst with the Union of Concerned Scientists and lead author of a new study on Trump administration actions that the group says undermine science. “The reason why is because when IRIS conducts its independent scientific assessments using a great amount of rigor… you get stronger regulations, and that is not in the best interest of the big business polluters and those who have a financial stake in the EPA’s demise.”

The UCS report tallied more than 400 firings, funding cuts, and other attacks on science in the first six months of the Trump administration, resulting in 54 percent fewer grants for research on topics including cancer, infectious disease, and environmental health.

EPA’s press office did not respond to a query on whether the IRIS controversy helped inform Zeldin’s decision to eliminate ORD, which had been anticipated since staff were informed of the potential plan at a meeting in March. In the agency’s official announcement Friday afternoon, Zeldin said the elimination of the office was part of “organizational improvements” that would deliver $748.8 million in savings to taxpayers. The reduction in force, combined with previous departures and layoffs, has reduced the agency’s workforce by 23 percent, to 12,448, the EPA said.

With the cuts, the EPA’s workforce will be at its lowest level since fiscal year 1986.

“Under President Trump’s leadership, EPA has taken a close look at our operations to ensure the agency is better equipped than ever to deliver on our core mission of protecting human health and the environment while Powering the Great American Comeback,” Zeldin said in the prepared statement. “This reduction in force will ensure we can better fulfill that mission while being responsible stewards of your hard-earned tax dollars.”

The agency will be creating a new Office of Applied Science and Environmental Solutions; a report by E&E News said an internal memo indicated the new office would be much smaller than ORD, and would focus on coastal areas, drinking water safety, and methodologies for assessing environmental contamination.

Zeldin’s announcement also said that scientific expertise and research efforts will be moved to “program offices”—for example, those concerned with air pollution, water pollution, or waste—to tackle “statutory obligations and mission essential functions.” That phrase has a particular meaning: The chemical industry has long complained that Congress never passed a law creating IRIS. Congress did, however, pass many laws requiring that the agency carry out its actions based on the best available science, and the IRIS program, established during President Ronald Reagan’s administration, is how the agency has carried out the task of assessing the science on chemicals since 1985.

Justin Chen, president of the American Federation of Government Employees Council 238, the union representing 8,000 EPA workers nationwide, said the organizational structure of ORD put barriers between the agency’s researchers and the agency’s political decision-making, enforcement, and regulatory teams—even though they all used ORD’s work.

“For them to function properly, they have to have a fair amount of distance away from political interference, in order to let the science guide and develop the kind of things that they do,” Chen said.

“They’re a particular bugbear for a lot of the industries which are heavy donors to the Trump administration and to the right wing,” Chen said. “They’re the ones, I believe, who do all the testing that actually factors into the calculation of risk.”

ORD also was responsible for regularly doing assessments that the Clean Air Act requires on pollutants like ozone and particulate matter, which result from the combustion of fossil fuels.

Frey said a tremendous amount of ORD work has gone into ozone, which is the result of complex interactions of precursor pollutants in the atmosphere. The open source computer modeling on ozone transport, developed by ORD researchers, helps inform decision-makers grappling with how to address smog around the country. The Biden administration finalized stricter standards for particulate matter in its final year based on ORD’s risk assessment, and the Trump administration is now undoing those rules.

Aidan Hughes contributed to this report.

This story originally appeared on Inside Climate News.




Nearly 3,000 people are leaving NASA, and this director is one of them

You can add another name to the thousands of employees leaving NASA as the Trump administration primes the space agency for a 25 percent budget cut.

On Monday, NASA announced that Makenzie Lystrup will leave her post as director of the Goddard Space Flight Center on Friday, August 1. Lystrup has held the top job at Goddard since April 2023, overseeing a staff of more than 8,000 civil servants and contractor employees and a budget last year of about $4.7 billion.

These figures make Goddard the largest of NASA’s 10 field centers primarily devoted to scientific research and development of robotic space missions, with a budget and workforce comparable to NASA’s human spaceflight centers in Texas, Florida, and Alabama. Officials at Goddard manage the James Webb and Hubble telescopes in space, and Goddard engineers are assembling the Nancy Grace Roman Space Telescope, another flagship observatory scheduled for launch late next year.

“We’re grateful to Makenzie for her leadership at NASA Goddard for more than two years, including her work to inspire a Golden Age of explorers, scientists, and engineers,” Vanessa Wyche, NASA’s acting associate administrator, said in a statement.

Cynthia Simmons, Goddard’s deputy director, will take over as acting chief at the space center. Simmons started work at Goddard as a contract engineer 25 years ago.

Lystrup came to NASA from Ball Aerospace, now part of BAE Systems, where she managed the company’s work on civilian space projects for NASA and other federal agencies. Before joining Ball Aerospace, Lystrup earned a doctorate in astrophysics from University College London and conducted research as a planetary astronomer.

Formal dissent

The announcement of Lystrup’s departure from Goddard came hours after the release of an open letter to NASA’s interim administrator, Transportation Secretary Sean Duffy, signed by hundreds of current and former agency employees. The letter, titled “The Voyager Declaration,” identifies what the signatories call “recent policies that have or threaten to waste public resources, compromise human safety, weaken national security, and undermine the core NASA mission.”



Southwestern drought likely to continue through 2100, research finds

This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy, and the environment. Sign up for their newsletter here.

The drought in the Southwestern US is likely to last for the rest of the 21st century and potentially beyond as global warming shifts the distribution of heat in the Pacific Ocean, according to a study published last week led by researchers at the University of Texas at Austin.

Using sediment cores collected in the Rocky Mountains, paleoclimatology records, and climate models, the researchers found warming driven by greenhouse gas emissions can alter patterns of atmospheric and marine heat in the North Pacific Ocean in a way resembling what’s known as the negative phase of the Pacific Decadal Oscillation (PDO), fluctuations in sea surface temperatures that result in decreased winter precipitation in the American Southwest. But in this case, the phenomenon can last far longer than the usual 30-year cycle of the PDO.

“If the sea surface temperature patterns in the North Pacific were just the result of processes related to stochastic [random] variability in the past decade or two, we would have just been extremely unlucky, like a really bad roll of the dice,” said Victoria Todd, the lead author of the study and a PhD student in geosciences at University of Texas at Austin. “But if, as we hypothesize, this is a forced change in the sea surface temperatures in the North Pacific, this will be sustained into the future, and we need to start looking at this as a shift, instead of just the result of bad luck.”

Currently, the Southwestern US is experiencing a megadrought resulting in the aridification of the landscape, a decades-long drying of the region brought on by climate change and the overconsumption of the region’s water. That’s led to major rivers and their basins, such as the Colorado and Rio Grande rivers, seeing reduced flows and a decline of the water stored in underground aquifers, which is forcing states and communities to reckon with a sharply reduced water supply. Farmers have cut back on the amount of water they use. Cities are searching for new water supplies. And states, tribes, and federal agencies are engaging in tense negotiations over how to manage declining resources like the Colorado River going forward.



How Android phones became an earthquake warning system

Of course, the trick is that you only send out the warning if there’s an actual earthquake, and not when a truck is passing by. Here, the sheer volume of Android phones sold plays a key role. As a first pass, AEA can simply ignore events that aren’t picked up by a lot of phones in the same area. But we also know a lot about the patterns of shaking that earthquakes produce. Different waves travel at different speeds, cause different types of ground motion, and may be produced at different intensities as the earthquake progresses.

So, the people behind AEA also include a model of earthquakes and seismic wave propagation, and check whether the pattern seen in phones’ accelerometers is consistent with that model. It only triggers an alert when there’s widespread phone activity that matches the pattern expected for an earthquake.

Raising awareness

In practical terms, AEA is distributed as part of the core Android software and is enabled by default, so it is active on most Android phones. It starts monitoring when the phone has been stationary for a little while, checking for acceleration data that’s consistent with the P or S waves produced by earthquakes. If it gets a match, it forwards the information along with some rough location data (to preserve privacy) to Google servers. Software running on those servers then performs the positional analysis to see if the waves are widespread enough to have been triggered by an earthquake.
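Google hasn’t published AEA’s detection code, but the description above maps onto a simple on-device trigger loop. Here is a minimal, hypothetical sketch in Python; the function names, the thresholds, and the crude shaking test standing in for the real P/S-wave matching are all invented for illustration:

```python
import math
import time

G = 9.81  # a resting phone reads ~1 g of acceleration (m/s^2)

def mag(sample):
    """Magnitude of a 3-axis accelerometer sample (x, y, z) in m/s^2."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def is_stationary(samples, tol=0.05):
    """Treat the phone as stationary if its net acceleration has stayed
    within a small tolerance of 1 g for the whole window."""
    return all(abs(mag(s) - G) < tol for s in samples)

def sudden_shaking(samples, threshold=0.15):
    """Crude stand-in for the real P/S-wave matcher: any jump well above
    the noise floor of a phone that was sitting still."""
    return any(abs(mag(s) - G) > threshold for s in samples)

def maybe_report(history, live, coarse_location):
    """If a resting phone suddenly shakes, emit a small report carrying
    only rough location data (to preserve privacy) for server-side checks."""
    if is_stationary(history) and sudden_shaking(live):
        return {"loc": coarse_location, "t": time.time()}
    return None
```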

If the pattern does match an earthquake, the server estimates the quake’s size and location, and uses that information to estimate the ground motion that will be experienced in different locations. Based on that, AEA sends out one of two alerts, either “be aware” or “take action.” The “be aware” alert is similar to a standard Android notification, but it plays a distinctive sound and is sent to users further from the epicenter. In contrast, the “take action” warning that’s sent to those nearby will display one of two messages in the appropriate language, either “Protect yourself” or “Drop, cover, and hold on.” It ignores any do-not-disturb settings, takes over the entire screen, and also plays a distinct noise.
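The server-side decision can be sketched in the same hypothetical style. The phone-count threshold and distance cutoffs below are invented; per the description above, the real system fits reports to a model of seismic wave propagation rather than simply counting phones and drawing distance rings:

```python
MIN_PHONES = 100        # assumed: ignore events too few phones felt
TAKE_ACTION_KM = 50     # assumed radius of strong expected shaking
BE_AWARE_KM = 200       # assumed radius of lighter expected shaking

def epicenter_estimate(reports):
    """Very rough epicenter: the centroid of the reporting phones."""
    lats = [r["loc"][0] for r in reports]
    lons = [r["loc"][1] for r in reports]
    return sum(lats) / len(lats), sum(lons) / len(lons)

def alert_tier(reports, user_distance_km):
    """Pick the alert a given user should receive, if any."""
    if len(reports) < MIN_PHONES:
        return None            # a passing truck, not an earthquake
    if user_distance_km < TAKE_ACTION_KM:
        return "take action"   # full-screen "Drop, cover, and hold on"
    if user_distance_km < BE_AWARE_KM:
        return "be aware"      # notification-style heads-up
    return None
```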



Local cuisine was on the menu at Cafe Neanderthal

Gazelle prepared “a la Amud,” or “a la Kebara”?

Neanderthals at Kebara had pretty broad tastes in meat. The butchered bones found in the cave were mostly an even mix of small ungulates (largely gazelle) and medium-sized ones (red deer, fallow deer, wild goats, and boar), with just a few larger game animals thrown in. And it looks like the Kebara Neanderthals were “use the whole deer” sorts of hunters because the bones came from all parts of the animals’ bodies.

On the other hand (or hoof), at Amud, archaeologists found that the butchered bones were almost entirely long bone shafts—legs, in other words—from gazelle. Apparently, the Neanderthal hunters at Amud focused more on gazelle than on larger prey like red deer or boar, and they seemingly preferred meat from the legs.

And not too fresh, apparently—the bones at Kebara showed fewer cut marks, and the marks that were there tended to be straighter. Meanwhile, at Amud, the bones were practically cluttered with cut marks, which crisscrossed over each other and were often curved, not straight. According to Jallon and her colleagues, the difference probably wasn’t a skill issue. Instead, it may be a clue that Neanderthals at Amud liked their meat dried, boiled, or even slightly rotten.

That’s based on comparisons to what bones look like when modern hunter-gatherers butcher their game, along with archaeologists’ experiments with stone tool butchery. First, differences in skill between newbie butchers and advanced ones don’t produce the same pattern of cut marks Jallon and her colleagues saw at Amud. But “it has been shown that decaying carcasses tend to be more difficult to process, often resulting in the production of haphazard, deep, and sinuous cut marks,” as Jallon and her colleagues wrote in their recent paper.

So apparently, for reasons unknown to modern archaeologists, the meat on the menu at Amud was, shall we say, a bit less fresh than that at Kebara. Said menu was also considerably less varied. All of that meant that if you were a Neanderthal from Amud and stopped by Kebara for dinner (or vice versa), your meal might seem surprisingly foreign.


fanfic-study-challenges-leading-cultural-evolution-theory

Fanfic study challenges leading cultural evolution theory


Fanfic community craves familiarity much more than novelty—but reports greater enjoyment from novelty.

Credit: Aurich Lawson | Marvel

It’s conventional wisdom that when it comes to creative works—TV shows, films, music, books—consumers crave an optimal balance between novelty and familiarity. What we choose to consume and share with others, in turn, drives cultural evolution.

But what if that conventional wisdom is wrong? An analysis based on data from a massive online fan fiction (fanfic) archive contradicts this so-called “balance theory,” according to a paper published in the journal Humanities and Social Sciences Communications. The fanfic community seems to overwhelmingly prefer more of the same, consistently choosing familiarity over novelty; however, they reported greater overall enjoyment when they took a chance and read something more novel. In short: “Sameness entices, but novelty enchants.”

Strictly speaking, authors have always copied characters and plots from other works (cf. many of William Shakespeare’s plays), although the advent of copyright law complicated matters. Modern fan fiction as we currently think of it arguably emerged with the 1967 publication of the first Star Trek fanzine (Spockanalia), which included spinoff fiction based on the series. Star Trek also spawned the subgenre of slash fiction, when writers began creating stories featuring Kirk and Spock (Kirk/Spock, or K/S) in a romantic (often sexual) relationship.

The advent of the World Wide Web brought fan fiction to the masses, starting with Usenet newsgroups and mailing lists and culminating in massive online archives where creators could upload their work to be read and commented on by readers. The subculture has since exploded; there’s fanfic based on everything from Sherlock Holmes to The X-Files, Buffy the Vampire Slayer, Game of Thrones, the Marvel Cinematic Universe, and Harry Potter. You name it, there’s probably fanfic about it.

There are also many subgenres within fanfic beyond slash, some of them rather weird, like a magical pregnancy (Mpreg) story in which Sherlock Holmes and Watson fall so much in love with each other that one of them becomes magically pregnant. (One suspects Sherlock would not handle morning sickness very well.) Sometimes fanfic even breaks into the cultural mainstream: E.L. James’ bestselling Fifty Shades of Grey started out as fan fiction set in the world of Stephenie Meyer’s Twilight series.

So fanfic is a genuine cultural phenomenon—hence its fascination for Simon DeDeo, a complexity scientist at Carnegie Mellon University and the Santa Fe Institute who studies cultural evolution and the emergence of social hierarchies. (I reported on DeDeo’s work analyzing the archives of London’s Old Bailey in 2014.) While opinion remains split—even among the authors of the original works—as to whether fanfic is a welcome homage to the original works that just might help drive book sales or whether it constitutes a form of copyright infringement, DeDeo enthusiastically embraces the format.

“It’s the dark matter of creativity,” DeDeo told Ars. “I love that it exists. It’s a very non-elitist form. There’s no New York Times bestseller list. It would be hard to name the most famous fan fiction writers. The world building has been done. The characters exist. The plot elements have already been put together. So the bar to entry is lower. Maybe sometime in the 19th century we get a notion of genius and the individual creator, but that’s not really what storytelling has been about for the majority of human history. In that one sense, fan fiction is closer to what we were doing around the campfire.”

Star Trek arguably spawned contemporary fan fiction—including stories imagining Kirk and Spock as romantic partners. Credit: Paramount Pictures

That’s a boon for fanfic writers, most of whom have non-creative day jobs; fanfic provides them with a creative outlet. Every year, when DeDeo asks students in his classes whether they read and/or write fanfic, a significant percentage always raise their hands. (He once asked a woman why she wrote slash. Her response: “Because no one was writing porn that I wanted to read.”) In fact, that’s how this current study came about. Co-author Elise Jing is one of DeDeo’s former students with a background in both science and the humanities—and she’s also a fanfic connoisseur.

Give them more of the same

Jing thought (and DeDeo concurred) that the fanfic subculture provided an excellent laboratory for studying cultural evolution. “It’s tough to get students to read a book. They write fan fiction voluntarily. This is stuff they care about writing and care about reading. Nobody gets prestige or power in the larger society from writing fan fiction,” said DeDeo. “This is not a top-down model where Hollywood is producing something and then the fans are consuming it. The fans are producing and consuming so it’s a truly self-contained culture that’s constantly evolving. It’s a pure product consumption cycle. People read it, they bookmark it, they write comments on it, and all that gives us insight into how it’s being received. If you’re a psychologist, you couldn’t pay to get this kind of data.”

Fanfic is a tightly controlled ecosystem, so it lacks many of the confounding factors that make it so difficult to study mainstream cultural works. The fan fiction community is also enormous, so the potential datasets are huge. For this study, the authors relied on data from the online Archive of Our Own (AO3), which boasts nearly 9 million users covering more than 70,000 different fandoms and some 15 million individual works. (Sadly, the site has since shut down access to its data over concerns that the data would be used to train AI.)

According to DeDeo, the idea was to examine the question of cultural evolution on a population level, rather than on the individual level: “How do these individual things agglomerate to produce the culture?”

Strong positive correlation is found between the response variables except for the Kudos-to-hits ratio. Topic novelty is weakly positively correlated with Kudos-to-hits ratio but negatively correlated with the other response variables. Credit: E. Jing et al., 2025

The results were striking. AO3 members overwhelmingly preferred familiarity in their fan fiction, i.e., more of the same. One notable exception was a short story that was both hugely popular and highly novel. Simply titled “I Am Groot,” the story featured the character from Guardians of the Galaxy. The text is just “I am Groot” repeated 40,000 times—a stroke of genius in that this is entirely consistent with the canonical MCU character, whose entire dialogue consists of those words, with meaning conveyed by shifts of tone and context. But such exceptions proved to be very rare.

“We were so stunned that balance theory wasn’t working,” said DeDeo, who credits Jing with the realization that they were dealing with two distinct pieces of the puzzle: how much is being consumed, and how much people like what they consume, i.e., enjoyment. Their analysis revealed, first, that people really don’t want an optimized mix of familiar and new; they want the same thing over and over again, even within the fanfic community. But when people do make the effort to try something new, they tend to enjoy it more than just consuming more of the same.
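
In code terms, the two measures separate roughly like this pandas sketch; the column names and toy numbers are invented for illustration, not drawn from the paper’s dataset:

```python
import pandas as pd

# Toy stand-in for AO3 works: a novelty score plus raw engagement counts.
works = pd.DataFrame({
    "topic_novelty": [0.10, 0.25, 0.50, 0.75, 0.90],
    "hits":          [9000, 6500, 3000, 1200,  600],   # consumption
    "kudos":         [ 360,  280,  150,   78,   45],   # approval clicks
})

# Enjoyment proxy: of the readers who actually tried it, how many liked it?
works["kudos_per_hit"] = works["kudos"] / works["hits"]

# The paper's headline pattern: consumption falls with novelty,
# while per-reader enjoyment rises with it.
print(works["topic_novelty"].corr(works["hits"]))           # strongly negative
print(works["topic_novelty"].corr(works["kudos_per_hit"]))  # positive
```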

In short, “We are anti-balance theory,” said DeDeo. “In biology, for example, you make a small variation in the species and you get micro-evolution. In culture, a minor variation is just less likely to be consumed. So it really is a mystery how we evolve at all culturally; it’s not happening by gradual movement. We can see that there’s novelty. We can see that when people encounter novelty, they enjoy it. But we can’t quite make sense of how these two competing effects work out.”

“This is the great paradox,” said DeDeo. “Culture has to be stable. Without long-term stability, there’s no coherent body of work that can even constitute a culture if fan fiction totally changes every year. That inherent cultural conservatism is in some sense a precondition for culture to exist at all.” Yet culture does evolve, even within the fanfic community.

One possible alternative is some kind of punctuated equilibrium model for cultural evolution, in which things remain stable but undergo occasional leaps forward. “One story about how culture evolves is that eventually, the stuff that’s more enjoyable than what people keep re-consuming somehow becomes accessible to the majority of the community,” said DeDeo. “Novelty might act as a gravitational pull on the center and [over time] some new material gets incorporated into the culture.” He draws an analogy to established tech companies like IBM versus startups, most of which die out, but those few that succeed often push the culture substantially forward.

Perhaps there are two distinct groups of people: those who actively seek out new things and those who routinely click on familiar subject matter because, even though their enjoyment might be less, it’s not worth overcoming their inertia to try something new. Perhaps it is those who seek novelty who sow the seeds of eventual shifts in trends.

“Is it that we’re tired? Is it that we’re lazy? Is this a conflict within a human or within a culture?” said DeDeo. “We don’t know because we only get the raw numbers. If we could track an individual reader to see how they moved between these two spaces, that would be really interesting.”

Humanities and Social Sciences Communications, 2025. DOI: 10.1057/s41599-025-05166-3  (About DOIs).

Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

Fanfic study challenges leading cultural evolution theory Read More »

rough-road-to-“energy-dominance”-after-gop-kneecaps-wind-and-solar

Rough road to “energy dominance” after GOP kneecaps wind and solar


Experts argue that Trump’s One Big Beautiful Bill Act will increase costs for consumers.

As the One Big Beautiful Bill Act squeaked its way through Congress earlier this month, its supporters heralded what they described as a new era for American energy and echoed what has become a familiar phrase among President Donald Trump’s supporters.

“Congress has taken decisive action to advance President Trump’s energy dominance agenda,” said American Petroleum Institute President and CEO Mike Sommers in a statement after the House passed the bill.

Republicans concurred, with legislators ranging from Rep. Mariannette Miller-Meeks of Iowa, chair of the Conservative Climate Caucus, to Energy and Commerce Committee Chairman Rep. Brett Guthrie of Kentucky releasing statements after the bill’s passage championing its role in securing “energy dominance.”

The idea and rhetoric of energy dominance has its roots in the first Trump administration, although a formal definition for the phrase is hard to come by. When Trump signed an executive order this February establishing the National Energy Dominance Council, he included expanding energy production, lowering prices and reducing reliance on foreign entities among the council’s goals, while also emphasizing the importance of oil production and liquefied natural gas (LNG) exports.

The phrase has become something of a battle cry among the president’s supporters, with EPA Administrator Lee Zeldin writing in the Washington Examiner on July 8 that “Trump is securing America’s energy future in a modern-day version of how our Founding Fathers secured our freedom.”

“Through American energy dominance, we’re not just powering homes and businesses,” Zeldin said. “We’re Powering the Great American Comeback.”

But despite claims from Republican officials and the fossil fuel industry that the megabill will help secure energy dominance, some experts worry that the legislation’s cuts to wind and solar actually undermine those goals at a time when electricity demand is rising, limiting America’s ability to add new generation capacity, raising prices for consumers and ceding global leadership in the clean energy transition.

Dan O’Brien, a senior modeling analyst at the climate policy think tank Energy Innovation, said the bill will increase domestic production of oil and gas by increasing lease sales for drilling—mostly in the Gulf of Mexico, onshore, and in Alaska.

A January study commissioned by the American Petroleum Institute reported that a legislatively directed offshore oil and natural gas leasing program, which API says is similar to the measures included months later in the One Big Beautiful Bill Act, would increase oil and natural gas production by 140,000 barrels of oil equivalent (BOE) per day by 2034.

That number would rise to 510,000 BOE per day by 2040, the study says.

Losses likely to outweigh the gains

However, O’Brien said the gains America can expect from the fossil fuel industry pale in comparison to losses from renewable energy.

Energy Innovation’s analysis projects that the bill will add less than 20 gigawatts of fossil fuel generation capacity by 2035 while eliminating more than 360 gigawatts of expected additions from renewable energy.

The difference between those numbers—a decrease of 344 gigawatts—is roughly equivalent to the energy use of about 100 million homes, O’Brien said.

According to O’Brien, if the One Big Beautiful Bill had not been passed, the US could have expected to add around 1,000 gigawatts of electricity generation capacity in the next 10 years.

But as a result of the bill, “around a third of that will be lost,” O’Brien said.

Those losses largely stem from the bill’s rollback of incentives for wind and solar projects.

“Solar and wind are subject to different—and harsher—treatment under the OBBB than other technologies,” according to the law firm Latham & Watkins. Tax credits for those projects are now set to phase out on a significantly faster timeline, rolling back some of the commitments promised under the Inflation Reduction Act.

Lucero Marquez, the associate director for federal climate policy at the Center for American Progress, said that removing those incentives undercuts America’s ability to achieve its energy needs.

“America needs affordable, reliable, and domestically produced energy, which wind and solar does,” Marquez said. “Gutting clean energy incentives really just does not help meet those goals.”

New projects will also be subject to rules “primarily intended to prevent Chinese companies from claiming the tax credits and to reduce reliance on China for supply chains of clean energy technologies,” the Bipartisan Policy Center wrote in an explainer.

However, those rules are “extremely complex” and could lead to “decreased U.S. manufacturing and increased Chinese dominance in these supply chains, contrary to their goal,” according to the think tank.

Surging energy prices

O’Brien said Energy Innovation’s modeling suggests that, with fewer wind and solar projects coming online, existing power plants, which are more expensive to run than new renewable projects would have been, will have to run more frequently to make up for the missing generation.

The consequence, according to O’Brien, is that energy prices will rise, and the total amount of energy produced will fall as demand drops in response to the more expensive supply.

An analysis by the REPEAT Project from the Princeton ZERO Lab and Evolved Energy Research similarly predicted increased energy prices for consumers as a result of the bill.

According to that analysis, average household energy costs will increase by over $280 per year by 2035, a more than 13 percent hike.

One of the authors of that analysis, Princeton University professor Jesse D. Jenkins, did not respond to interview requests for this article but previously wrote in an email to Inside Climate News that Republicans’ claims about securing energy dominance through the bill “don’t hold up.”

In an emailed statement responding to questions about those analyses and how their findings align with the administration’s goals of attaining energy dominance, White House assistant press secretary Taylor Rogers wrote that “since Day One, President Trump has taken decisive steps to unleash American energy, which has driven oil production and reduced the cost of energy.”

“The One, Big, Beautiful Bill will turbocharge energy production by streamlining operations for maximum efficiency and expanding domestic production capacity,” Rogers wrote, “which will deliver further relief to American families and businesses.”

In an emailed statement, Rep. Guthrie said that the bill “takes critical steps toward both securing our energy infrastructure and bringing more dispatchable power online.”

“Specifically, the bill does this by repairing and beginning to refill the Strategic Petroleum Reserve that was drained during the Biden-Harris Administration, and through the creation of the Energy Dominance Financing program to support new investments that unleash affordable and reliable energy,” the Energy and Commerce chairman wrote.

Cullen Hendrix, a senior fellow at the Peterson Institute for International Economics, also said that the bill “advances the administration’s stated goal of energy dominance,” but added that it does so “primarily in sunsetting, last-generation technologies, while ceding the renewable energy future to others.”

“It wants lower energy costs at home and more U.S. energy exports abroad—for both economic and strategic reasons … the OBBB delivers on that agenda,” Hendrix said.

Still, Hendrix added that “the United States that emerges from all this may be a bigger player in a declining sector—fossil fuels—and a massively diminished player in a rapidly growing one: renewable energy.”

“It will help promote the Trump administration’s ambitions of fossil dominance (or at least influence) but on pain of helping build a renewable energy sector for the future,” Hendrix wrote. “That is net-negative globally (and locally) from a holistic perspective.”

Adam Hersh, a senior economist at the Economic Policy Institute, argued that he sees a lot in the bill “that is going to move us in the opposite direction from energy dominance.”

“They should have named this bill the ‘Energy Inflation Act,’ because what it’s going to mean is less energy generated and higher costs for households and for businesses, and particularly manufacturing businesses,” Hersh said.

Hersh also said that even if the bill does lead to increased exports of US-produced energy, that would have a direct negative impact on costs for consumers at home.

“That’s only going to increase domestic prices for energy, and this has long been known and why past administrations have been reluctant to expand exports of LNG,” Hersh said. “That increased demand for the products and competition for the resources will mean higher energy prices for U.S. consumers and businesses.”

“Pushing against energy dominance”

Frank Maisano, a senior principal at the lobbying firm Bracewell LLP, said that although the bill creates important opportunities for things such as oil and gas leasing and the expansion of geothermal and hydrogen energy, the bill’s supporters “undercut themselves” by limiting opportunities for growth in wind and solar.

“The Biden folks tried to lean heavily onto the energy transition because they wanted to limit emissions,” Maisano said. “They wanted to push oil and gas out and push renewables in.”

Now, “these guys are doing the opposite, which is to push oil and gas and limit wind and solar,” Maisano said. “Neither of those strategies are good strategies. You need to have a combination of all these strategies and all these generation sources, especially on the electricity side, to make it work and to meet the challenges that we face.”

Samantha Gross, director of the Brookings Institution’s Energy Security and Climate Initiative, said that while she isn’t concerned about whether the US will build enough electricity generation to meet the needs of massive consumers like data centers and AI, she is worried that the bill pushes the next generation of that growth further towards fossil fuels.

“I don’t think energy dominance—not just right this instant, but going forward—is just in fossil fuels,” Gross said.

Even beyond the One Big Beautiful Bill, Gross said that many of the administration’s actions run counter to their stated objectives on energy.

“You hear all this talk about energy dominance, but for me it’s just a phrase, because a lot of things that the administration is actually doing are pushing against energy dominance,” Gross said.

“If you think about the tariff policy, for instance, ‘drill, baby, drill’ and a 50 percent tariff on pipeline steel do not go together. Those are pulling in completely opposite directions.”

Aside from domestic energy needs, Gross also worried that the pullback from renewable energy will harm America’s position on the global stage.

“It’s pretty clear which way the world is going,” Gross said. “I worry that we’re giving up … I don’t like the term ‘energy dominance,’ but future leadership in the world’s energy supply by pulling back from those.”

“We’re sort of ceding those technologies to China in a way that is very frustrating to me.”

Yet even in the wake of the bill’s passage, some experts see hope for the future of renewable energy in the US.

Kevin Book, managing director at the research firm ClearView Energy Partners, said that the bill “sets up a slower, shallower transition” toward renewable energy. However, he added that he doesn’t think it represents the end of that transition.

“Most of the capacity we’re adding to our grid in America these days is renewable, and it’s not simply because of federal incentives,” Book said. “So if you take away those federal incentives, there were still economic drivers.”

Still, Book said that the final impacts of the Trump administration’s actions on renewable energy are yet to be seen.

“The One Big Beautiful Bill Act is not the end of the story,” Book said. “There’s more coming, either regulatorily and/or legislatively.”

This story originally appeared on Inside Climate News.

Rough road to “energy dominance” after GOP kneecaps wind and solar Read More »

there-could-be-“dark-main-sequence”-stars-at-the-galactic-center

There could be “dark main sequence” stars at the galactic center


Dark matter particle and antiparticle collisions could make some stars immortal.

For a star, its initial mass is everything. It determines how quickly it burns through its hydrogen and how it will evolve once it starts fusing heavier elements. It’s so well understood that scientists have devised a “main sequence” that acts a bit like a periodic table for stars, correlating their mass and age with their properties.

The main sequence, however, is based on an assumption that’s almost always true: All of the energy involved comes from the gravity-driven fusion of lighter elements into heavier ones. But three astrophysicists have considered an alternative source of energy that may apply at the very center of our galaxy—energy released when dark matter particles and antiparticles collide and annihilate. While we don’t even know whether dark matter can do that, it’s a hypothetical with some interesting consequences, like seemingly immortal stars, and others that move backward along the main sequence path.

Dark annihilations

We haven’t figured out what dark matter is, but there are lots of reasons to think that it is composed of elementary particles. And if those behave like all of the particles we understand well, then there will be both regular and antimatter versions. Should those collide, they should annihilate each other, releasing energy in the process. Given dark matter’s general propensity not to interact with anything, these collisions will be extremely rare except in locations with very high dark matter concentrations.
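
The back-of-the-envelope accounting (my gloss, not the paper’s) is simple: each annihilation converts the rest mass of both particles entirely into energy, and because a collision requires two particles to meet, the rate per unit volume scales with the square of their number density; hence the need for extreme concentrations.

```latex
% Energy released when a dark matter particle and antiparticle of mass
% m_\chi annihilate (both rest masses converted entirely to energy):
E = 2\, m_{\chi} c^{2}

% Annihilation rate per unit volume for number density n_\chi and
% velocity-averaged cross-section <sigma v>; the n^2 scaling is why
% only very dense regions release significant energy:
\Gamma = n_{\chi}^{2}\, \langle \sigma v \rangle
```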

The only place that’s likely to happen is at the very center of our galaxy. And, for a while, there was an excess of radiation coming from the galactic core that people thought might be due to dark matter annihilations, although it eventually turned out to have a more mundane explanation.

At the extreme densities found within a light year of the supermassive black hole at the center of our galaxy, concentrations are high enough that these collisions could be a major source of energy. And so astronomers have considered what all that energy might do to stars that end up in a black hole’s orbit, finding that under the right circumstances, dark matter destruction could provide more energy to a star than fusion.

That prompted three astrophysicists (Isabelle John, Rebecca Leane, and Tim Linden) to look at things in an organized fashion, modeling a “dark main sequence” of stars as they might exist in close proximity to the Milky Way’s center.

The intense gravity and radiation found near the galaxy’s core mean that stars can’t form there. So anything that’s in a tight orbit formed somewhere else before gravitational interactions pushed it into the grasp of the galaxy’s central black hole. The researchers used a standard model of stellar evolution to build a collection of moderate-sized stars, from one to 20 solar masses at 0.05 solar mass intervals. These are allowed to ignite fusion at their cores and then shift into a dark-matter-rich environment.

Since we have no idea how often dark matter particles might run into each other, John, Leane, and Linden used two different collision frequencies. These determine how much energy is imparted to the stars by dark matter, which the researchers simply add as a supplement to the amount of fusion energy the stars are producing. Then the stars are allowed to evolve forward in time.
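
As a toy illustration of that bookkeeping (my own sketch under textbook scaling assumptions, not the authors’ stellar-evolution code), you can treat the dark matter input as a fixed luminosity that offsets how much fusion the star must supply, and watch the hydrogen-burning lifetime stretch:

```python
L_SUN = 3.8e26       # watts
M_SUN = 2.0e30       # kg
C = 3.0e8            # m/s
EFFICIENCY = 0.007   # fraction of mass turned to energy by H -> He fusion
BURN_FRACTION = 0.1  # rough fraction of a star's hydrogen that gets burned

def lifetime_gyr(mass_msun, dm_lum_watts):
    """Hydrogen-burning lifetime (billions of years) when a constant
    dark matter luminosity covers part of the star's energy output."""
    surface_lum = L_SUN * mass_msun ** 3.5    # empirical mass-luminosity law
    fusion_lum = surface_lum - dm_lum_watts   # fusion supplies the remainder
    if fusion_lum <= 0:
        return float("inf")  # dark matter alone supports the star: "immortal"
    fuel = EFFICIENCY * BURN_FRACTION * mass_msun * M_SUN * C**2
    return fuel / fusion_lum / 3.15e7 / 1e9   # seconds -> Gyr

for m in (1.0, 5.0, 20.0):
    base = lifetime_gyr(m, 0.0)
    boosted = lifetime_gyr(m, 0.9 * L_SUN * m ** 3.5)  # DM covers 90% of output
    print(f"{m:5.1f} Msun: {base:10.2f} Gyr -> {boosted:10.2f} Gyr")
```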

(The authors note that stars that are thrown into the grasp of a supermassive black hole tend to have very eccentric orbits, so they spend a lot of time outside the zone where dark matter collisions take place with a significant frequency. So, what they’ve done is the equivalent of having these stars experience the energy input given their average orbital distance from the galaxy’s core. In reality, a star would spend some years with higher energy input and some years with lower input as it moves about its orbit.)
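
That averaging step could be approximated along these lines (again my own sketch, assuming a simple power-law density profile): weight the local dark matter density by the time the star spends at each point of its eccentric Keplerian orbit.

```python
import math

def orbit_averaged_density(a, e, gamma=1.5, n=100_000):
    """Time-average rho(r) proportional to r^-gamma over one Keplerian
    orbit with semi-major axis a and eccentricity e. Uses
    r = a(1 - e cos E) and the fact that the time spent near eccentric
    anomaly E scales as (1 - e cos E)."""
    weighted, total = 0.0, 0.0
    for i in range(n):
        E = 2 * math.pi * i / n
        dt = 1 - e * math.cos(E)   # relative time spent at this phase
        r = a * dt                 # orbital radius at this phase
        weighted += dt * r ** (-gamma)
        total += dt
    return weighted / total

# A star on a highly eccentric orbit samples the dense inner region only
# briefly, so its average input differs from the density at r = a:
print(orbit_averaged_density(a=1.0, e=0.0))   # circular baseline
print(orbit_averaged_density(a=1.0, e=0.9))   # eccentric orbit
```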

Achieving immortality

The physics of what happens is based on the same balance of forces that governs fusion-powered stars, but it produces some very strange results. Given only fusion power, a star will exist at a balance point. If gravity compresses it, fusion speeds up, more energy is released, and that energy causes the star to expand outward again. That causes the density to drop, slowing fusion back down again.

The dark matter annihilations essentially provide an additional source of energy that stays constant regardless of what happens to the star’s density. At the low end of the mass range the researchers considered, this can cause the star to nearly shut off fusion, leaving it looking like a far younger star than it actually is. That has the effect of causing the star to move backward along the main sequence diagram.

The researchers note that even lighter stars could essentially get so much additional energy that they can’t hold together and end up dissipating, something that’s been seen in models run by other researchers.

As the mass gets higher, stars reach the point where they essentially give up on fusion and get by with nothing but dark matter annihilations. They have enough mass to hold together gravitationally but end up too diffuse for fusion to continue. And they’ll stay that way as long as they continue to get additional injections of energy. “A star like this might look like a young, still-forming star,” the authors write, “but has features of a star that has undergone nuclear fusion in the past and is effectively immortal.”

John, Leane, and Linden find that the higher-mass stars remain dense enough for fusion to continue even in proximity to the galaxy’s black hole. But the additional energy keeps that fusion happening at only a moderate rate. These stars proceed through the main sequence, but at such an exceptionally slow pace that running the simulation for a total of 10 billion years didn’t see them change significantly.

The other strange thing here is that all of this is very sensitive to how much dark matter annihilation is taking place. A star that’s “immortal” at one average distance will progress slowly through the main sequence if its average distance is a light year farther out. Similarly, stars that are too light to survive at one location will hold together if they are a bit farther from the supermassive black hole.

Is there anything to this?

The big caution is that this work only looks at the average input from dark matter annihilation. In reality, a star that might be immortal at its average distance will likely spend a few years too hot to hold together, and then several years cooling off in conditions that should allow fusion to reignite. It would be nice to see a model run with this sort of pulsed input, perhaps basing it on the orbits of some of the stars we’ve seen that get close to the Milky Way’s central black hole.

In the meantime, John, Leane, and Linden write that their results are consistent with some of the oddities that are apparent in the stars we’ve observed at the galaxy’s center. These have two distinctive properties: They appear heavier than the average star in the Milky Way, and all seem to be quite young. If there is a “dark main sequence,” then the unusual heft can be explained simply by the fact that lower mass stars end up dissipating due to the additional energy. And the model would suggest that these stars simply appear to be young because they haven’t undergone much fusion.

The researchers suggest that we could have a clearer picture if we were able to spend enough time observing the stars at our galaxy’s core with a large enough telescope, allowing us to understand their nature and orbits.

Physical Review D, 2025. DOI: Not yet available  (About DOIs).

John is Ars Technica’s science editor. He has a Bachelor of Arts in Biochemistry from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. When physically separated from his keyboard, he tends to seek out a bicycle, or a scenic location for communing with his hiking boots.

There could be “dark main sequence” stars at the galactic center Read More »