Science


Sizing up the 5 companies selected for Europe’s launcher challenge

The European Space Agency has selected five launch startups to become eligible for up to 169 million euros ($198 million) in funding to develop alternatives to Arianespace, the continent’s incumbent launch service provider.

The five companies ESA selected are Isar Aerospace, MaiaSpace, Rocket Factory Augsburg, PLD Space, and Orbex. Only one of these companies, Isar Aerospace, has attempted to launch a rocket into orbit. Isar’s Spectrum rocket failed moments after liftoff from Norway on a test flight in March.

None of these companies is guaranteed an ESA contract or funding. Over the next several months, the European Space Agency and the five launch companies will negotiate with European governments for funding leading up to ESA’s ministerial council meeting in November, when ESA member states will set the agency’s budget for at least the next two years. Only then will ESA be ready to sign binding agreements.

In a press release, ESA referred to the five companies as “preselected challengers” in a competition for ESA support in the form of launch contracts and an ESA-sponsored demonstration to showcase upgraded launch vehicles capable of lifting heavier payloads into orbit. So far, all five of the challengers are focusing on small rockets.

Earlier this year, ESA released a request for proposals to European industry for bids to compete in the European Launch Challenge. ESA received 12 proposals from European companies and selected five to move on to the next phase of the challenge.

A new way of doing business

In this competition, ESA is eschewing a rule that governs nearly all of the space agency’s other programs. This policy, known as geographic return, guarantees industrial contracts to ESA member states commensurate with the amount of money they put into each project. The most obvious example is Europe’s Ariane rocket family, whose development was funded primarily by France, with Germany as the second-largest contributor. Accordingly, the Ariane 6 rocket’s core stage and engines are built in France, and its upper stage is manufactured in Germany.


Wildfires are challenging air quality monitoring infrastructure


Can the US’s system to monitor air pollutants keep up with a changing climate?

The Downtown Manhattan skyline stands shrouded in a reddish haze as a result of Canadian wildfires on June 6, 2023. Credit: Lokman Vural Elibol/Anadolu Agency via Getty Images

Ten years ago, Tracey Holloway, an atmospheric scientist at the University of Wisconsin–Madison, would have said that air pollution in the United States was a huge success story. “Our air had been getting cleaner and cleaner almost everywhere, for almost every pollutant,” she said. But in June 2023, as wildfire smoke from Canada spread, the air quality dropped to historically low levels in her home state of Wisconsin.

Just last month, the region’s air quality dipped once more to unhealthy levels. Again, wildfires were to blame.

While the US has made significant strides in curbing car and industrial pollution through setting emission limits on industrial facilities and automakers, the increasing frequency and intensity of fires are “erasing the gains that we have obtained through this pollutant control effort,” said Nga Lee “Sally” Ng, an aerosol researcher at Georgia Institute of Technology.

The changing dynamics present a challenge for both residents and researchers tracking air quality. Many of the high-quality monitors used to measure pollution sit near large cities and stationary sources, such as coal-fired power plants, and don’t cover the US uniformly. Regions that lack such stations are called air quality monitoring deserts, and they may leave vulnerable populations in the dark about their local conditions.

The current infrastructure also isn’t set up to fully account for the shifting behavior of wildfire smoke, which can travel hundreds of miles or more from its source and affect air quality and public health in distant communities. That smoke can also carry toxins, such as lead released when cars and homes burn.

“Fires are really changing the story,” said Holloway.

Air quality has been recognized as a national issue in the United States since the Air Pollution Control Act of 1955. With the enactment of the Clean Air Act in 1970 and its subsequent amendments, researchers and federal agencies began monitoring levels of pollutants—particularly carbon monoxide, nitrogen dioxide, ozone, particulate matter, and sulfur dioxide—to determine whether they met the established National Ambient Air Quality Standards.

The Environmental Protection Agency uses these pollutant levels to calculate an air quality index, or AQI, a numerical and color-coded system scaled from 0 to 500 that informs the public how safe the air is. Higher numbers, associated with red, purple, and maroon, indicate worse quality; in June 2023, for example, parts of Wisconsin topped 300, indicating “hazardous” air. All residents were advised to stay indoors as much as possible.
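To make the scale concrete, here is a minimal sketch in Python of the EPA’s published AQI category breakpoints and their associated colors; the function and its name are illustrative only, not part of any official EPA tool.

```python
# Illustrative sketch of the EPA's AQI categories (0-500 scale).
# Breakpoints follow the published EPA category ranges; the function
# itself is just an example, not an official EPA implementation.

AQI_CATEGORIES = [
    (0, 50, "Good", "green"),
    (51, 100, "Moderate", "yellow"),
    (101, 150, "Unhealthy for Sensitive Groups", "orange"),
    (151, 200, "Unhealthy", "red"),
    (201, 300, "Very Unhealthy", "purple"),
    (301, 500, "Hazardous", "maroon"),
]

def categorize_aqi(aqi: int):
    """Return the (category, color) pair for an AQI value from 0 to 500."""
    for low, high, label, color in AQI_CATEGORIES:
        if low <= aqi <= high:
            return label, color
    raise ValueError("AQI values are reported on a 0-500 scale")

# Example: parts of Wisconsin topped 300 in June 2023.
print(categorize_aqi(310))  # ('Hazardous', 'maroon')
```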

The EPA and other federal agencies make use of various networks of advanced ground monitors that can pick up on different air pollutants, and many experts say that the US has one of the most advanced air quality tracking systems in the world.

Still, there are gaps: Regulatory monitors cost around $50,000 upfront and require continuous maintenance, so states place them in locations where researchers expect pollution may be high. Currently, there are 4,821 active monitors across the US in the EPA’s AirData system—many of which were installed in the 1990s and 2000s—but they are more likely to be near more populated areas and in states in the West and Northeast, creating air quality monitoring deserts elsewhere, according to a new study published in April.

When looking at their distribution, researchers at The Pennsylvania State University found that 59 percent of US counties—home to more than 50 million people—lacked an air quality monitoring site. Many of those air quality monitoring deserts were in rural areas in the South and Midwest. Counties with higher poverty rates and a higher concentration of Black and Hispanic residents were also more likely to be air quality monitoring deserts when accounting for population.

Similarly, a Reuters investigation found that 120 million Americans live in counties that have no monitors for small particle pollution and that in 2020, “the government network of 3,900 monitoring devices nationwide has routinely missed major toxic releases and day-to-day pollution dangers,” including those linked to refinery explosions. (In response to a request for comment, an EPA spokesperson noted that the agency “continues to work closely with state, local, and tribal monitoring programs to expand the use of air sensors to improve measurement coverage, which provide near-real time data to a number of publicly available sources, such as the AIRNow Fire and Smoke Map.”)

These gaps in coverage can be accentuated with wildfires, which often originate in sparsely populated areas without monitor coverage. Wildfires can also be unpredictable, making it difficult to identify priority sites for new monitors. “You certainly can’t anticipate what areas are going to see wildfire smoke,” said Mary Uhl, executive director of Western States Air Resources Council, which shares air quality information across 15 western state air agencies. Meanwhile, wildfire pollutants can spread widely from their original source, and smoke particles can sometimes travel for up to 10 days, Ng pointed out.

Such shifting dynamics are driving researchers to expand their monitoring infrastructure and complement it with crowdsourced and satellite data to capture the widespread pollution. “There will be potential to increase the spatial covering of these monitoring networks,” said Ng. “Because, as you can see, we could still make use of better measurement, maybe at the different community level, to better understand the air that we are being exposed to.”

To expand coverage in a cost-efficient way, agencies are investigating a variety of different approaches and technologies. Low-cost monitors now allow people to crowdsource data about air quality in their communities. (However, these tend to be less precise and accurate than the high-grade instruments.) State, local, and tribal agencies also play a critical role in monitoring air quality, such as New York’s Community Air Monitoring Initiative, which tracked pollution for a year using mobile monitoring in 10 disadvantaged communities with high air pollution burdens. And the EPA has a pilot program that loans compact mobile air monitoring systems to air quality professionals, who can set them up in their vehicles to map air quality during and after wildfires.

Satellites can also provide critical information since they can estimate levels of gases and pollutants, providing data about where pollution levels are highest and how pollutants are transported. “We can see where we’re missing things in those deserts,” said Uhl.

This strategy might be helpful to address the challenge with wildfires because satellites can get a more global view of the spread of pollutants. Fires “change season to season, so they’re not always coming from the same place,” said Holloway, who leads a team that uses NASA satellite data to monitor air quality. “And I think really what you need is a way of evaluating what’s going on over a wide area. And these satellites up in space, I think, offer exactly the tool for the job.”

Other advancements allow scientists to study the composition of pollution more granularly, since different pollutants can have different toxicities and health effects. For example, particulate matter 2.5, or PM2.5—which has a diameter of 2.5 micrometers or less—can cause respiratory and heart problems. Ng led the establishment of a system called ASCENT, or the Atmospheric Science and Chemistry Measurement Network, which measures the specific chemical constituents in PM2.5 to identify which particles might be the most toxic to human health, along with aiming to answer many other scientific questions.

After the Eaton Canyon and Palisades fires that burned across Los Angeles County in January 2025, Ng and colleagues used the system and found that lead concentrations rose to approximately 110 times average levels, likely due to the burning of lead-containing vehicles, plastics, buildings, and other fuel. The system works as a “magnifying glass to look into PM2.5,” said Ng. Currently, they have 12 sites and hope to expand ASCENT to more locations in the future if resources are available.

Different approaches to collecting air quality monitoring data, along with computational modeling, could be combined to improve researchers’ understanding of air pollution and expand air quality information to underserved populations, said Holloway.

Today, although wildfires represent a new challenge, “we have all these additional tools to help us understand air quality,” said Uhl. “And in the end, that’s what we want to do: We want to understand it. We want to be able to have some ideas, some ways to predict it, to ultimately protect public health.”

This article was originally published on Undark. Read the original article.


As California faces court battles, states scramble to save their climate goals


With or without the authority to enforce stricter vehicle emissions standards, states plan to meet their climate goals.

Traffic jam forms on Interstate 5 north of Los Angeles. Credit: Hans Gutknecht/MediaNews Group/Los Angeles Daily News

This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy, and the environment. Sign up for their newsletter here.

When President Donald Trump signed legislation to revoke California’s authority to enforce stricter tailpipe emissions standards and to ban sales of gas-powered cars by 2035, the effects rippled far beyond the Golden State.

Seventeen states relied on California’s Clean Air Act waivers to adopt stronger vehicle pollution rules on their own, including New York, New Jersey, Oregon, Massachusetts, and Washington.

California, joined by several states, immediately sought a court injunction, calling the revocation illegal on the basis that the waivers are not subject to congressional review and that it violated decades of legal precedent and procedure. These same states recently launched an Affordable Clean Cars Coalition to coordinate legal action and policy to defend their rights to transition to cleaner vehicles.

As the legal battle plays out, states that have relied on the waivers are expanding multimillion-dollar programs to keep their EV transitions on track. Among their efforts: amping up rebates, tightening rules on the carbon intensity of fuels, and cracking down on pollution where trucks congregate.

“Climate change is still around, whether we have the waiver or not. So we have to figure out ways to make sure that we’re doing what we can to address the problem at hand,” said Michelle Miano, who heads the environmental protection division of the New Mexico Environment Department.

According to data from the California Air Resources Board, the states that have adopted the tougher pollution rules account for about 40 percent of new light-duty vehicle registrations and 25 percent of new heavy-duty vehicle registrations in the United States, where the transportation sector was the largest source of greenhouse gas emissions as of 2022.

Among these stronger rules are Advanced Clean Cars (ACC) I and II and Advanced Clean Trucks (ACT), which require automakers to sell a growing share of electric passenger cars and medium- and heavy-duty trucks to reduce emissions from their gasoline-powered counterparts.

The goal is for all new vehicles sold to be electric by 2035.

Bolstering incentives 

Without ACC and ACT, states are betting they can increase demand for EVs by reducing the cost of buying a vehicle through rebates, vouchers, and grants, and by boosting the number of charging stations in their states. These incentives can range from a few thousand dollars for individual EV purchases to hundreds of thousands for building charging infrastructure and fleet upgrades.

On June 18, New York announced a $53 million expansion to its voucher program for electrifying last-mile truck fleets, offering vouchers from $340,000 to $425,000 for each truck, depending on the model.

“Despite the current federal administration’s efforts to erode certainty in the ongoing transition to cleaner vehicles, New York State will continue to act to protect our air, lands, and waters,” said Amanda Lefton, commissioner of the Department of Environmental Conservation.

In Oregon, where over a third of in-state emissions come from transportation, the government this month opened applications for $34 million in grants toward purchasing zero-emission trucks, developing EV charging stations, and retrofitting diesel trucks. Lawmakers are considering expanding a popular rebate program through a bill introduced in February. The program so far has given car owners almost $100 million for EV purchases. (The program has been suspended twice after running out of money. It resumed as of May 2025.)

In Massachusetts, Gov. Maura Healey promised in May to announce “dedicated additional grant funding” for electric vehicles and vowed to increase “grant funding opportunities” for charging. Advocacy groups, including the Environmental League of Massachusetts, are counting on increased funding for the state’s MOR-EV rebate program, which provides up to $3,500 for new EV purchases. This year, the rebate program has distributed $15.7 million in total incentives, according to the program’s statistics page.

In Washington state, lawmakers earmarked $126 million—a $16 million increase from 2024—to subsidize purchases of electric truck fleets and chargers. Many states are targeting trucks because they account for an outsized share of emissions relative to their numbers on the road.

Will Toor, executive director of the Colorado Energy Office, credited state rebates and investments in charging infrastructure for helping Colorado reach a 20 percent electric vehicle market share in the first quarter of 2025. One in five new cars sold in the state was electric. Toor also credited the state agency’s EV buyer’s education campaign launched in late 2022, which promoted available rebates and incentives for prospective EV owners.

The scope and generosity of these programs vary widely depending on each state’s climate priorities, budget capacity, and access to federal or market-based funding streams.

“Those types of incentives can be expensive,” said Terrence Gray, director of the Rhode Island Department of Environmental Management. “In Rhode Island, our budget is tight. There’s not a lot of funding available right now, so we would have to make a very strong argument that there’s a strong cost benefit to invest in these types of areas.”

With the Trump administration threatening to cut federal funding for EV rebates provided through the Biden-era Inflation Reduction Act, states will increasingly have to rely on their own resources to fund these programs.

“The federal government isn’t going to come save us,” said Alex Ambrose, an analyst with the nonpartisan think tank New Jersey Policy Perspective.

Some are already ahead on this. California and Washington state have devised carbon markets that charge major polluters—like oil refiners, power plants, large industrial facilities, and fuel suppliers—for each ton of carbon dioxide they release. California’s auctions bring in about $3 billion to $4 billion per year, money that supports programs such as public transit and EV rebates. Washington’s system, launched in 2023, covers around 97 major emitters and has raised over $3 billion in its first two years, funding clean transportation, air quality devices, and EV chargers.

New York, New Jersey, Massachusetts, and other Northeast and Mid-Atlantic states have signed on to the Regional Greenhouse Gas Initiative, or RGGI, a cooperative cap-and-invest program launched in 2009 that limits emissions from the power sector and reinvests proceeds into clean energy programs like EV rebates.

Making fuels greener

While many states focus on promoting electric vehicles, others are also targeting the fuel that gas-powered cars burn, adopting or developing standards that lower its carbon intensity.

These policies require fuel producers and importers to blend cleaner alternatives like biofuels, renewable diesel, or electricity into the fuel mix.

Patterned after California’s program, Washington has had a clean fuel standard in effect since 2023, targeting a 20 percent reduction in the carbon intensity of transportation fuels by 2034 compared to 2017 levels.

Oregon has a similar program in place that aims to reduce carbon intensity in fuels by 37 percent by 2035.

New Mexico approved its Clean Transportation Fuel Standard in March 2024. A formal adoption hearing before the Environmental Improvement Board is scheduled to begin in September.

“We know that those (electric) vehicles aren’t for everyone and so we are very respectful of folks that decide to not purchase them,” said Miano, New Mexico’s environment protection division head.

No East Coast states have enacted a clean fuel standard, but New York state legislators may change that.

There are bills in the State Senate and Assembly that, if passed, would require fuel providers to reduce the carbon intensity of their transportation fuels by at least 20 percent by 2030. (Legislation has passed the Senate but remains at the committee level in the Assembly as of June.)

Michigan also had bills introduced in its Senate and House in 2023, but neither passed before the 2024 session ended. Similar bills have not been introduced since then.

Some of these clean fuel standards have faced criticism from environmental advocates, who argue that they allow polluters to buy their way out of reducing emissions.

But Trisha DelloIacono, policy head at advocacy group CALSTART, said the fuel standards remain one of the few politically viable tools to gradually shift the transportation sector toward cleaner fuels.

“What we need to be looking at right now is incremental changes and incremental progress in a place where we’re fighting tooth and nail to hold on to what we have,” DelloIacono said.

Where trucks congregate

There’s also a policy tool called indirect source rules, or ISR.

The rules are called “indirect” because they don’t regulate the vehicles themselves, but the facilities that attract emissions-heavy traffic, like large warehouses, ports, or rail yards. The rules hold facility owners or operators responsible for reducing or offsetting the pollution from their profitable traffic.

Studies show that the pollution from these trucks often ends up in nearby neighborhoods, which are disproportionately lower-income and communities of color.

California is currently the only state enforcing ISRs.

In Southern California, large warehouses must take steps to reduce the pollution caused by truck visits, either by switching to electric vehicles, installing chargers, or paying into a clean air fund. It’s the first rule of its kind in the country and it survived a court challenge in 2023, paving the way for other states to consider similar action.

New York is one of them. Its lawmakers introduced a bill in January that could require warehouses with over 50,000 square feet to reduce emissions from trucks by meeting certain benchmarks, such as hosting electric deliveries or offering bike loading zones. New York City has its own version of the rule under deliberation in the Council. As of June 2025, the bill remains stalled in the environmental committee. City Council has until December to act before the bill expires.

In New Jersey, where warehouse growth has boomed, legislators in 2024 proposed a bill that would require “high-traffic facilities” to apply for air pollution permits and provide plans to reduce diesel truck pollution.

“This is really being pushed by the community groups and environmental justice communities, especially in North Jersey. But also, warehouses are starting to pop up even in very rural parts of South Jersey. So this is very quickly becoming a statewide issue in New Jersey,” said Ambrose of the New Jersey Policy Perspective.

In Colorado, the regional air quality council announced plans in April to ask the state’s air quality control commission to use ISRs in areas with the worst air quality.

Industry groups, especially in the logistics sector, are pushing back. The industry group Supply Chain Federation told The Wall Street Journal that the southern California ISR was a “backdoor approach [that] does little to cut emissions and instead raises costs, disrupts supply chains.”

Still, experts say this may be one of the few options left for states to cut emissions from traffic-heavy facilities. Because these rules don’t directly regulate the car companies or trucks themselves, they don’t need federal approval.

“We definitely have to be nimble and fluid and also understand the kind of landscape in the state,” DelloIacono said.


Oldest wooden tools in East Asia may have come from any of three species

That leaves a few possibilities: Denisovans, Homo heidelbergensis (the common ancestor of Neanderthals, Denisovans, and our species), or Homo erectus. All three species could have lived in the area at the time. But nobody at Gantangqing left behind any convenient, readily identifiable bones along with their wooden tools, stone tools, and butchered animal bones (so inconsiderate of them), making it hard to pin down exactly which species these 300,000-year-old hunter-gatherers belonged to.

Homo erectus had been in Asia for more than a million years by the time Gantangqing’s lakeshore was occupied; the oldest Homo erectus fossils in Asia are from Indonesia and date back 1.8 million years. The species also stuck around until quite recently. In caves at a site called Zhoukoudian, outside Beijing in northern China, Homo erectus remains date to sometime between 700,000 and 200,000 years ago (there’s still a lot of debate about exactly how old the site is).

All of that means that Homo erectus’ presence in the region overlaps the age of the wood tools at Gantangqing. And the stone tools found nearby are fairly simple cores and flakes that don’t rule out Homo erectus as their makers. Archaeologists haven’t unearthed evidence of Homo erectus making or using sophisticated wooden tools like this, but for a species that managed to harness fire and cross miles of ocean, it’s not too wild a speculation.

On the other hand, we know that Denisovans were probably in the area, too, or at least not too far away. A recently identified Denisovan skull from Harbin, China, is 146,000 years old but bears a striking resemblance to other hominin skulls from sites all over China, which range from 300,000 to 200,000 years old. And making finely crafted wooden tools fits with everything we know about Denisovan capabilities.

Then there’s Homo heidelbergensis, the direct ancestor of Denisovans. In fact, it’s a little hard to tell where hominins stop being Homo heidelbergensis and start being Denisovans, or even whether the distinction matters. It’s a problem paleoanthropologists refer to as the “muddle in the Middle,” since both species date to the Middle Pleistocene. So if Homo erectus and Denisovans are in the running, so is Homo heidelbergensis, by default.

And unless someone finds a telltale skull nearby or another very similar toolkit at a site with telltale skulls to consult, we may not know for sure.

Science, 2025. DOI: 10.1126/science.adr8540  (About DOIs).


Figuring out why a nap might help people see things in new ways


An EEG signal of sleep is associated with better performance on a mental task.

The guy in the back may be doing a more useful activity. Credit: XAVIER GALIANA

Dmitri Mendeleev famously saw the complete arrangement of the periodic table after falling asleep at his desk. He claimed that in his dream he saw a table where all the elements fell into place, and he wrote it all down when he woke up. By having a eureka moment right after a nap, he joined a club full of rather talented people: Mary Shelley, Thomas Edison, and Salvador Dalí.

To figure out if there’s a grain of truth to all these anecdotes, a team of German scientists at the University of Hamburg, led by cognitive science researcher Anika T. Löwe, conducted an experiment designed to trigger such nap-following strokes of genius—and catch them in the act with EEG brain monitoring gear. And they kind of succeeded.

Catching Edison’s cup

“Thomas Edison had this technique where he held a cup or something like that when he was napping in his chair,” says Nicolas Schuck, a professor of cognitive science at the University of Hamburg and senior author of the study. “When he fell asleep too deeply, the cup falling from his hand would wake him up—he was convinced that was the way to trigger these eureka moments.” While dozing off in a chair with a book or a cup doesn’t seem particularly radical, a number of cognitive scientists got serious about re-creating Edison’s approach to insights and testing it in their experiments.

One such recent study was done at Sorbonne University by Célia Lacaux, a cognitive neuroscientist, and her colleagues. Over 100 participants were presented with a mathematical problem and told it could be solved by applying two simple rules in a stepwise manner. However, there was also an undescribed shortcut that made reaching the solution much quicker. The goal was to see if participants would figure this shortcut out after an Edison-style nap. The scientists would then check whether the eureka moment showed up in the EEG.

Lacaux’s team also experimented with different objects for participants to hold while napping: spoons, steel spheres, stress balls, etc. It turned out Edison was right, and a cup was by far the best choice. It also turned out that most participants recognized there was a hidden rule after the falling cup woke them up. The nap was brief, only long enough to enter the light, non-REM N1 phase of sleep.

Initially, Schuck’s team wanted to replicate the results of Lacaux’s study. They even bought the exact same make of cups, but the cups failed this time. “For us, it just didn’t work. People who fell asleep often didn’t drop these cups—I don’t know why,” Schuck says.

The bigger surprise, however, was that the N1 phase sleep didn’t work either.

Tracking the dots

Schuck’s team set up an experiment that involved asking 90 participants to track dots on a screen in a series of trials, with a 20-minute-long nap in between. The dots were rather small, colored either purple or orange, placed in a circle, and they moved in one of two directions. The task for the participants was to determine the direction the dots were moving. That could range from easy to really hard, depending on the amount of jitter the team introduced.

The insight the participants could discover was hidden in the color coding. After a few trials where the dots’ direction was random, the team introduced a change that tied the movement to the color: orange dots always moved in one direction, and the purple dots moved in the other. It was up to the participants to figure this out, either while awake or through a nap-induced insight.

Those dots were the first difference between Schuck’s experiment and the Sorbonne study. Lacaux had her participants cracking a mathematical problem that relied on analytical skills. Schuck’s task was more about perceptiveness and out-of-the-box thinking.

The second difference was that the cups failed to drop and wake participants up. Muscles usually relax more when sleep gets deeper, which is why most people drop whatever they’re holding either at the end of the N1 phase or at the onset of the N2 phase, when the body starts to lose voluntary motor control. “We didn’t really prevent people from reaching the N2 phase, and it turned out the participants who reached the N2 phase had eureka moments most often,” Schuck explains.

Over 80 percent of people who reached the deeper, N2 phase of sleep found the color-coding solution. Participants who fell into a light N1 sleep had a 61 percent success rate; that dropped to just 55 percent in a group that stayed awake during their 20-minute nap time. In a control group that did the same task without a nap break, only 49 percent of participants figured out the hidden trick.

The divergent results in Lacaux’s and Schuck’s experiments were puzzling, so the team looked at the EEG readouts, searching for features in the data that could predict eureka moments better than sleep phases alone. And they found something.

The slope of genius

The EEG signal in the human brain contains both low and high frequencies, and the distribution of power across those frequencies can be summarized as a spectral slope. When we are awake, there are a lot of high-frequency signals, and this slope looks rather flat. During sleep, the high frequencies get muted, low-frequency signals dominate, and the slope gets steeper. Usually, the deeper we sleep, the steeper our EEG slope is.
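To make the idea concrete, here is a minimal sketch of how a spectral slope like this can be estimated from an EEG trace: compute the power spectrum and fit a line to it in log-log space. This is an illustration only, not the study’s actual analysis pipeline; the sampling rate, frequency band, and function name are placeholder assumptions.

```python
# Minimal sketch: estimate an EEG spectral slope by fitting a line to the
# power spectrum in log-log space. Steeper (more negative) slopes roughly
# correspond to deeper sleep. Sampling rate and frequency band are
# illustrative choices, not the values used in the study.
import numpy as np
from scipy.signal import welch

def spectral_slope(eeg, fs=250.0, fmin=1.0, fmax=40.0):
    """Fit log10(power) vs. log10(frequency) and return the slope."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))  # 4-second windows
    band = (freqs >= fmin) & (freqs <= fmax)
    slope, _intercept = np.polyfit(np.log10(freqs[band]), np.log10(psd[band]), 1)
    return slope

# Example with synthetic data: white noise has a roughly flat spectrum
# (slope near 0), loosely analogous to the flatter spectra seen in wakefulness.
rng = np.random.default_rng(0)
print(spectral_slope(rng.standard_normal(250 * 60)))  # one minute of fake "EEG"
```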

The team noticed that eureka moments were highly correlated with a steep EEG spectral slope—the steeper the slope, the more likely people were to have a breakthrough. In fact, models based on the EEG signal alone predicted eureka moments better than models based on sleep phases, and even better than models based on sleep phases and EEG readouts combined.

“Traditionally, people divided sleep EEG readouts down into discrete stages like N1 or N2, but as usual in biology, things in reality are not as discrete,” Schuck says. “They’re much more continuous, there’s kind of a gray zone.” He told Ars that looking specifically at the EEG trace may help us better understand what exactly happens in the brain when a sudden moment of insight arrives.

But Schuck wants to get even more data in the future. “We’re currently running a study that’s been years in the making: We want to use both EEG and [functional magnetic resonance imaging] at the same time to see what happens in the brain when people are sleeping,” Schuck says. The addition of fMRI imaging will enable Schuck and his colleagues to see which areas of the brain get activated during sleep. What the team wants to learn from combining EEG and fMRI imagery is how sleep boosts memory consolidation.

“We also hope to get some insights, no pun intended, into the processes that play a role in generating insights,” Schuck adds.

PLOS Biology, 2025.  DOI: 10.1371/journal.pbio.3003185


Jacek Krywko is a freelance science and technology writer who covers space exploration, artificial intelligence research, computer science, and all sorts of engineering wizardry.


Judge: You can’t ban DEI grants without bothering to define DEI

Separately, the Supreme Court’s ruling in Trump v. CASA blocked the use of nationwide injunctions against illegal activity. So, while the government’s actions have been determined to be illegal, Young can only protect the people who were parties to this suit. Anyone who lost a grant but wasn’t a member of any of the parties involved, or based in any of the states that sued, remains on their own.

Those issues aside, the ruling largely focuses on whether the termination of grants violates the Administrative Procedure Act, which governs how the executive branch handles decision- and rule-making. Specifically, it requires that any decisions of this sort cannot be “arbitrary and capricious.” And Young concludes that the government hasn’t cleared that bar.

Arbitrary and capricious

The grant cancellations, Young concludes, “Arise from the NIH’s newly minted war against undefined concepts of diversity, equity, and inclusion and gender identity, that has expanded to include vaccine hesitancy, COVID, influencing public opinion and climate change.” The “undefined” aspect plays a key part in his reasoning. Referring to DEI, he writes, “No one has ever defined it to this Court—and this Court has asked multiple times.” It’s not defined in Trump’s executive order that launched the “newly minted war,” and Young found that administrators within the NIH issued multiple documents that attempted to define it, not all of which were consistent with each other, and in some cases seemed to use circular reasoning.

He also noted that the officials who sent these memos had a tendency to resign shortly afterward, writing, “it is not lost on the Court that oftentimes people vote with their feet.”

As a result, the NIH staff had no solid guidance for determining whether a given grant violated the new anti-DEI policy, or how that might be weighed against the scientific merit of the grant. So, how were they to identify which grants needed to be terminated? The evidence revealed at trial indicates that they didn’t need to make those decisions; DOGE made them for the NIH. In one case, an NIH official approved a list of grants to terminate received from DOGE only two minutes after it showed up in his inbox.


New evidence that some supernovae may be a “double detonation”

Type Ia supernovae are critical tools in astronomy, since they all appear to explode with the same intensity, allowing us to use their brightness as a measure of distance. The distance measures they’ve given us have been critical to tracking the expansion of the Universe, which led to the recognition that there’s some sort of dark energy hastening the Universe’s expansion. Yet there are ongoing arguments over exactly how these events are triggered.

There’s widespread agreement that type Ia supernovae are the explosions of white dwarf stars. Normally, these stars are composed primarily of moderately heavy elements like carbon and oxygen, and lack the mass to trigger additional fusion. But if some additional material is added, the white dwarf can reach a critical mass and reignite a runaway fusion reaction, blowing the star apart. But the source of the additional mass has been somewhat controversial.

There’s an additional hypothesis, however, that doesn’t require as much mass: a relatively small explosion on a white dwarf’s surface can compress the interior enough to restart fusion in stars that haven’t yet reached a critical mass. Now, observations of the remains of a supernova provide some evidence for the existence of these so-called “double detonation” supernovae.

Deconstructing white dwarfs

White dwarfs are the remains of stars with a similar mass to our Sun. After having gone through periods during which hydrogen and helium were fused, these tend to end up as carbon- and oxygen-rich embers: hot due to their history, but incapable of reaching the densities needed to fuse these elements. Left on their own, these stellar remnants will gradually cool.

But many stars are not left on their own; they exist in binary systems with a companion, or even larger systems. These companions can provide the material needed to boost white dwarfs to the masses that can restart fusion. There are two potential pathways for this to happen. Many stars go through periods where they are so large that their gravitational pull is barely enough to hold on to their outer layers. If the white dwarf orbits closely enough, it can pull in material from the other star, boosting its mass until it passes a critical threshold, at which point fusion can restart.


Rice could be key to brewing better non-alcoholic beer


Rice enhances flavor profiles for nonalcoholic beer, reduces fermentation time, and may contribute to flavor stability. Credit: Paden Johnson/CC BY-NC-SA

He and his team—including Christian Schubert, a visiting postdoc from the Research Institute for Raw Materials and Beverage Analysis in Berlin—brewed their own non-alcoholic beers, ranging from those made with 100 percent barley malt to ones made with 100 percent rice. They conducted a volatile chemical analysis to identify specific compounds present in the beers and assembled two sensory panels of tasters (one in the US, one in Europe) to assess aromas, flavors, and mouthfeel.

The panelists determined that the rice-brewed beers had less worty flavor, and the chemical analysis revealed why: lower levels of aldehyde compounds. Instead, other sensory attributes emerged, most notably vanilla or buttery notes. “If a brewer wanted a more neutral character, they could use nonaromatic rice,” the authors wrote. Along with brewing beers with 50 percent barley/50 percent rice, this would produce non-alcoholic beers likely to appeal more broadly to consumers.

The panelists also noted that higher rice content resulted in beers with a fatty/creamy mouthfeel—likely because higher rice content was correlated with increased levels of larger alcohol molecules, which are known to contribute to a pleasant mouthfeel. But it didn’t raise the alcohol content above the legal threshold for a nonalcoholic beer.

There were cultural preferences, however. The US panelists didn’t mind worty flavors as much as the European tasters did, which might explain why the former chose beers brewed with 70 percent barley/30 percent rice as the optimal mix. Their European counterparts preferred the opposite ratio (30 percent barley/70 percent rice). The explanation “may lie in the sensory expectations shaped by each region’s brewing traditions,” the authors wrote. Fermentation also occurred more quickly as the rice content increased because of higher levels of glucose and fructose.

The second study focused on testing 74 different rice cultivars to determine their extract yields, an important variable when it comes to an efficient brewing process, since higher yields mean brewers can use less grain, thereby cutting costs. This revealed that cultivars with lower amylose content cracked more easily to release sugars during the mashing process, producing the highest yields. And certain varieties also had lower gelatinization temperatures for greater ease of processing.

International Journal of Food Properties, 2025. DOI: 10.1080/10942912.2025.2520907  (About DOIs)

Journal of the American Society of Brewing Chemists, 2025. DOI: 10.1080/03610470.2025.2499768


Pentagon may put SpaceX at the center of a sensor-to-shooter targeting network


Under this plan, SpaceX’s satellites would play a big role in the Space Force’s kill chain.

The Trump administration plans to cancel a fleet of orbiting data relay satellites managed by the Space Development Agency and replace it with a secretive network that, so far, relies primarily on SpaceX’s Starlink Internet constellation, according to budget documents.

The move prompted questions from lawmakers during a Senate hearing on the Space Force’s budget last week. While details of the Pentagon’s plan remain secret, the White House proposal would commit $277 million in funding to kick off a new program called “pLEO SATCOM” or “MILNET.”

The funding line for a proliferated low-Earth orbit satellite communications network hasn’t appeared in a Pentagon budget before, but plans for MILNET already exist in a different form. Meanwhile, the budget proposal for fiscal year 2026 would eliminate funding for a new tranche of data relay satellites from the Space Development Agency. The pLEO SATCOM or MILNET program would replace them, providing crucial support for the Trump administration’s proposed Golden Dome missile defense shield.

“We have to look at what are the other avenues to deliver potentially a commercial proliferated low-Earth orbit constellation,” Gen. Chance Saltzman, chief of space operations, told senators last week. “So, we are simply looking at alternatives as we look to the future as to what’s the best way to scale this up to the larger requirements for data transport.”

What will these satellites do?

For six years, the Space Development Agency’s core mission has been to provide the military with a more resilient, more capable network of missile tracking and data relay platforms in low-Earth orbit. Those would augment the Pentagon’s legacy fleet of large, billion-dollar missile warning satellites that are parked more than 20,000 miles away in geostationary orbit.

These satellites detect the heat plumes from missile launches—and also large explosions and wildfires—to provide an early warning of an attack. The US Space Force’s early warning satellites were critical in allowing interceptors to take out Iranian ballistic missiles launched toward Israel last month.

Experts say there are good reasons for the SDA’s efforts. One motivation was the realization over the last decade or so that a handful of expensive spacecraft make attractive targets for an anti-satellite attack. It’s harder for a potential military adversary to go after a fleet of hundreds of smaller satellites. And if they do take out a few of these lower-cost satellites, it’s easier to replace them with little impact on US military operations.

Missile-tracking satellites in low-Earth orbit, flying at altitudes of just a few hundred miles, are also closer to the objects they are designed to track, meaning their infrared sensors can detect and locate dimmer heat signatures from smaller projectiles, such as hypersonic missiles.

The military’s Space Development Agency is in the process of buying, building, and launching a network of hundreds of missile-tracking and communications satellites. Credit: Northrop Grumman

But tracking the missiles isn’t enough. The data must reach the ground in order to be useful. The SDA’s architecture includes a separate fleet of small communications satellites to relay data from the missile tracking network, and potentially surveillance spacecraft tracking other kinds of moving targets, to military forces on land, at sea, or in the air through a series of inter-satellite laser crosslinks.

The military refers to this data relay component as the transport layer. When it was established in the first Trump administration, the SDA set out to deploy tranches of tracking and data transport satellites. Each new tranche would come online every couple of years, allowing the Pentagon to tap into new technologies as fast as industry develops them.

The SDA launched 27 so-called “Tranche 0” satellites in 2023 to demonstrate the concept’s overall viability. The first batch of more than 150 operational SDA satellites, called Tranche 1, is due to begin launching later this year. The SDA plans to begin deploying more than 250 Tranche 2 satellites in 2027. Another set of satellites, Tranche 3, would have followed a couple of years later. Now, the Pentagon seeks to cancel the Tranche 3 transport layer, while retaining the Tranche 3 tracking layer under the umbrella of the Space Development Agency.

Out of the shadows

While SpaceX’s role isn’t mentioned explicitly in the Pentagon’s budget documents, the MILNET program is already on the books, and SpaceX is the lead contractor. It has been made public in recent months, after years of secrecy, although many details remain unclear. Managed in a partnership between the Space Force and the National Reconnaissance Office (NRO), MILNET is designed to use military-grade versions of Starlink Internet satellites to create a “hybrid mesh network” the military can rely on for a wide range of applications.

The military version of the Starlink platform is called Starshield. SpaceX has already launched nearly 200 Starshield satellites for the NRO, which uses them for intelligence, surveillance, and reconnaissance missions.

At an industry conference last month, the Space Force commander in charge of operating the military’s communications satellites revealed new information about MILNET, according to a report by Breaking Defense. The network uses SpaceX-made user terminals with additional encryption to connect with Starshield satellites in orbit.

Col. Jeff Weisler, commander of a Space Force unit called Delta 8, said MILNET will comprise some 480 satellites operated by SpaceX but overseen by a military mission director “who communicates to the contracted workforce to execute operations at the timing and tempo of warfighting.”

The Space Force has separate contracts with SpaceX to use the commercial Starlink service. MILNET’s dedicated constellation of more secure Starshield satellites is separate from Starlink, which now has more than 7,000 satellites in space.

“We are completely relooking at how we’re going to operate that constellation of capabilities for the joint force, which is going to be significant because we’ve never had a DoD hybrid mesh network at LEO,” Weisler said last month.

So, the Pentagon already relies on SpaceX’s communication services, not to mention the company’s position as the leading launch provider for Space Force and NRO satellites. With MILNET’s new role as a potential replacement for the Space Development Agency’s data relay network, SpaceX’s satellites would become a cog in combat operations.

Gen. Chance Saltzman, chief of Space Operations in the US Space Force, looks on before testifying before a House Defense Subcommittee on May 6, 2025. Credit: Brendan Smialowski/AFP via Getty Images

The data transport layer, whether it’s SDA’s architecture or a commercial solution like Starshield, will “underpin” the Pentagon’s planned Golden Dome missile defense system, Saltzman said.

But it’s not just missiles. Data relay satellites in low-Earth orbit will also have a part in the Space Force’s initiatives to develop space-based platforms to track moving targets on the ground and in the air. Eventually, all Space Force satellites could have the ability to plug into MILNET to send their data to the ground.

A spokesperson for the Department of the Air Force, which includes the Space Force, told Air & Space Forces Magazine that the pLEO, or MILNET, constellation “will provide global, integrated, and resilient capabilities across the combat power, global mission data transport, and satellite communications mission areas.”

That all adds up to a lot of bits and bytes, and the Space Force’s need for data backhaul is only going to increase, according to Col. Robert Davis, head of the Space Sensing Directorate at Space Systems Command.

He said the SDA’s satellites will use onboard edge processing to create two-dimensional missile track solutions. Eventually, the SDA’s satellites will be capable of 3D data fusion with enough fidelity to generate a full targeting solution that could be transmitted directly to a weapons system for it to take action without needing any additional data processing on the ground.

“I think the compute [capability] is there,” Davis said Tuesday at an event hosted by the Mitchell Institute, an aerospace-focused think tank in Washington, DC. “Now, it’s a comm[unication] problem and some other technical integration challenges. But how do I do that 3D fusion on orbit? If I do 3D fusion on orbit, what does that allow me to do? How do I get low-latency comms to the shooter or to a weapon itself that’s in flight? So you can imagine the possibilities there.”

The possibilities include exploiting automation, artificial intelligence, and machine learning to sense, target, and strike an enemy vehicle—a truck, tank, airplane, ship, or missile—nearly instantaneously.

“If I’m on the edge doing 3D fusion, I’m less dependent on the ground and I can get around the globe with my mesh network,” Davis said. “There’s inherent resilience in the overall architecture—not just the space architecture, but the overall architecture—if the ground segment or link segment comes under attack.”

Questioning the plan

Military officials haven’t disclosed the cost of MILNET, either in its current form or in the future architecture envisioned by the Trump administration. For context, SDA has awarded fixed-price contracts worth more than $5.6 billion for approximately 340 data relay satellites in Tranches 1 and 2.

That comes out to roughly $16 million per spacecraft, at least an order of magnitude more expensive than a Starlink satellite coming off of SpaceX’s assembly line. Starshield satellites, with their secure communications capability, are presumably somewhat more expensive than an off-the-shelf Starlink.

Some former defense officials and lawmakers are uncomfortable with putting commercially operated satellites in the “kill chain,” the term military officials use for the process of identifying threats, making a targeting decision, and taking military action.

It isn’t clear yet whether SpaceX will operate the MILNET satellites in this new paradigm, but the company has a longstanding preference for doing so. SpaceX built a handful of tech demo satellites for the Space Development Agency a few years ago, but didn’t compete for subsequent SDA contracts. One reason for this, sources told Ars, is that the SDA operates its satellite constellation from government-run control centers.

Instead, the SDA chose L3Harris, Lockheed Martin, Northrop Grumman, Rocket Lab, Sierra Space, Terran Orbital, and York Space Systems to provide the next batches of missile tracking and data transport satellites. RTX, formerly known as Raytheon, withdrew from a contract after the company determined it couldn’t make money on the program.

The tracking satellites will carry different types of infrared sensors, some with wide fields of view to detect missile launches as they happen, and others with narrow-angle sensors to maintain custody of projectiles in flight. The data relay satellites will employ different frequencies and anti-jam waveforms to supply encrypted data to military forces on the ground.

This frame from a SpaceX video shows a stack of Starlink Internet satellites attached to the upper stage of a Falcon 9 rocket, moments after the launcher’s payload fairing is jettisoned. Credit: SpaceX

The Space Development Agency’s path hasn’t been free of problems. The companies the agency selected to build its spacecraft have faced delays, largely due to supply chain issues, and some government officials have worried the Army, Navy, Air Force, and Marine Corps aren’t ready to fully capitalize on the information streaming down from the SDA’s satellites.

The SDA hired SAIC, a government services firm, earlier this year with a $55 million deal to act as a program integrator with responsibility to bring together satellites from multiple contractors, keep them on schedule, and ensure they provide useful information once they’re in space.

SpaceX, on the other hand, is a vertically integrated company. It designs, builds, and launches its own Starlink and Starshield satellites. The only major components of SpaceX’s spy constellation for the NRO that the company doesn’t build in-house are the surveillance sensors, which come from Northrop Grumman.

Buying a service from SpaceX might save money and reduce the chances of further delays. But lawmakers argued there’s a risk in relying on a single company for something that could make or break real-time battlefield operations.

Sen. Chris Coons (D-Del.), ranking member of the Senate Appropriations Subcommittee on Defense, raised concerns that the Space Force is canceling a program with “robust competition and open standards” and replacing it with a network that is “sole-sourced to SpaceX.”

“This is a massive and important contract,” Coons said. “Doesn’t handing this to SpaceX make us dependent on their proprietary technology and avoid the very positive benefits of competition and open architecture?”

Later in the hearing, Sen. John Hoeven (R-N.D.) chimed in with his own warning about the Space Force’s dependence on contractors. Hoeven’s state is home to one of the SDA’s satellite control centers.

“We depend on the Air Force, the Space Force, the Department of Defense, and the other services, and we can’t be dependent on private enterprise when it comes to fighting a war, right? Would you agree with that?” Hoeven asked Saltzman.

“Absolutely, we can’t be dependent on it,” Saltzman replied.

Air Force Secretary Troy Meink said military officials haven’t settled on a procurement strategy. He didn’t mention SpaceX by name.

“As we go forward, MILNET, the term, should not be taken as just a system,” Meink said. “How we field that going forward into the future is something that’s still under consideration, and we will look at the acquisition of that.”

An Air Force spokesperson confirmed the requirements and architecture for MILNET are still in development, according to Air & Space Forces Magazine. The spokesperson added that the department is “investigating” how to scale MILNET into a “multi-vendor satellite communication architecture that avoids vendor lock.”

This doesn’t sound all that different than the SDA’s existing technical approach for data relay, but it shifts more responsibility to commercial companies. While there’s still a lot we don’t know, contractors with existing mega-constellations would appear to have an advantage in winning big bucks under the Pentagon’s new plan.

There are other commercial low-Earth orbit constellations coming online, such as Amazon’s Project Kuiper broadband network, that could play a part in MILNET. However, if the Space Force is looking for a turnkey commercial solution, Starlink and Starshield are the only options available today, putting SpaceX in a strong position for a massive windfall.


Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.

Pentagon may put SpaceX at the center of a sensor-to-shooter targeting network Read More »

gop-budget-bill-poised-to-crush-renewable-energy-in-the-us

GOP budget bill poised to crush renewable energy in the US

An early evaluation shows the administration’s planned energy policies would result in the drilling of 50,000 new oil wells every year for the next few years, he said, adding that it “ensures the continuation of land devastation… the poisoning of soil and groundwater due to fossil fuels and the continuation of gas blowouts and fires.”

There is nothing beneficial about the tax, he said, “only guaranteed misery.”

An analysis by the Rhodium Group, an energy policy research institute, projected that the Republican regime’s proposed energy policies would result in about 4 billion tons more greenhouse gas emissions than a continuation of current policies—enough to raise the average global temperature by 0.0072° Fahrenheit.

The overall budget bill was also panned in a June 28 statement by the president of North America’s Building Trades Unions, Sean McGarvey.

McGarvey called it “a massive insult to the working men and women of North America’s Building Trades Unions and all construction workers.”

He said that, as written, the budget “stands to be the biggest job-killing bill in the history of this country,” potentially costing as many jobs as shutting down 1,000 Keystone XL pipeline projects, threatening an estimated 1.75 million construction jobs and over 3 billion work hours, which translates to $148 billion in lost annual wages and benefits.

“These are staggering and unfathomable job loss numbers, and the bill throws yet another lifeline and competitive advantage to China in the race for global energy dominance,” he said.

Research in recent years shows how right-wing populist and nationalist ideologies have used anti-renewable energy arguments to win voters, in defiance of environmental logic and scientific fact, in part by using social media to spread misleading and false information about wind, solar, and other emissions-free electricity sources.

The same forces now seem to be at work in the US, said Stephan Lewandowsky, a cognitive psychologist at the University of Bristol who studies how people respond to misinformation and propaganda, and why people reject well-established scientific facts, such as those regarding climate change.

“This is a bonus for fossil fuels at the expense of future generations and the future of the American economy,” he said. “Other countries will continue working towards renewable-energy economies, especially China. That competitive advantage will eventually pay out to the detriment of American businesses. You can’t negotiate with the laws of physics.”

This story originally appeared on Inside Climate News.

GOP budget bill poised to crush renewable energy in the US Read More »

a-mammoth-tusk-boomerang-from-poland-is-40,000-years-old

A mammoth tusk boomerang from Poland is 40,000 years old

A boomerang carved from a mammoth tusk is one of the oldest in the world, and it may be even older than archaeologists originally thought, according to a recent round of radiocarbon dating.

Archaeologists unearthed the mammoth-tusk boomerang in Poland’s Oblazowa Cave in the 1990s, and they originally dated it to around 18,000 years old, which made it one of the world’s oldest intact boomerangs. But according to recent analysis by University of Bologna researcher Sahra Talamo and her colleagues, the boomerang may have been made around 40,000 years ago. If they’re right, it offers tantalizing clues about how people lived on the harsh tundra of what’s now Poland during the last Ice Age.

A boomerang carved from mammoth tusk

The mammoth-tusk boomerang is about 72 centimeters long, gently curved, and shaped so that one end is slightly more rounded than the other. It still bears scratches and scuffs from the mammoth’s life, along with fine, parallel grooves that mark where some ancient craftsperson shaped and smoothed the boomerang. On the rounded end, a series of diagonal marks would have made the weapon easier to grip. It’s smoothed and worn from frequent handling: the last traces of the life of some Paleolithic hunter.

Based on experiments with a replica, the Polish mammoth boomerang flies smoothly but doesn’t return, similar to certain types of Aboriginal Australian boomerangs. In fact, it looks a lot like a style used by Aboriginal people from Queensland, Australia, but that’s a case of people in different times and places coming up with very similar designs to fit similar needs.

But critically, according to Talamo and her colleagues, the boomerang is about 40,000 years old.

That’s a huge leap from the original radiocarbon date, made in 1996, which was based on a sample of material from the boomerang itself and estimated an age of 18,000 years. But Talamo and her colleagues claim that the original date didn’t line up well with the ages of other nearby artifacts from the same layer of the cave floor. That made them suspect that the boomerang sample may have gotten contaminated by modern carbon somewhere along the way, making it look younger. To test the idea, the archaeologists radiocarbon dated samples from 13 animal bones—plus one from a human thumb—unearthed from the same layer of cave floor sediment as the boomerang.
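To see why a trace of modern carbon can make a very old sample read so much younger, here is a minimal sketch using the standard conventional radiocarbon-age relationship (built on the Libby mean life of 8,033 years); the contamination fractions are purely illustrative and are not figures from the study.

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; conventional radiocarbon ages use the Libby half-life of 5,568 years

def fraction_modern(true_age_years):
    """Fraction of the original carbon-14 activity left in a sample of a given true age."""
    return math.exp(-true_age_years / LIBBY_MEAN_LIFE)

def apparent_age(measured_fraction):
    """Conventional radiocarbon age implied by a measured carbon-14 fraction."""
    return -LIBBY_MEAN_LIFE * math.log(measured_fraction)

true_age = 40_000
for contamination in (0.0, 0.01, 0.05, 0.10):
    # A contaminated sample mixes ancient carbon with modern carbon (fraction = 1.0).
    measured = (1 - contamination) * fraction_modern(true_age) + contamination
    print(f"{contamination:.0%} modern carbon -> apparent age of about {apparent_age(measured):,.0f} years")
```

Under these assumptions, roughly 10 percent modern carbon is enough to drag a 40,000-year-old sample down to an apparent age near 18,000 years.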

A mammoth tusk boomerang from Poland is 40,000 years old Read More »

research-roundup:-6-cool-science-stories-we-almost-missed

Research roundup: 6 cool science stories we almost missed


Final Muon g-2 results, an ultrasonic mobile brain imaging helmet, re-creating Egyptian blue, and more.

The “world’s smallest violin” created by Loughborough University physicists. Credit: Loughborough University

It’s a regrettable reality that there is never enough time to cover all the interesting scientific stories we come across each month. In the past, we’ve featured year-end roundups of cool science stories we (almost) missed. This year, we’re experimenting with a monthly collection. June’s list includes the final results from the Muon g-2 experiment, re-creating the recipe for Egyptian blue, embedding coded messages in ice bubbles, and why cats seem to have a marked preference for sleeping on their left sides.

Re-creating Egyptian blues


Close-up image of an ancient wooden Egyptian falcon. Researchers have found a way to reproduce the blue pigment visible on the artifact. Credit: Matt Unger, Carnegie Museum of Natural History

Artists in ancient Egypt were particularly fond of the color known as Egyptian blue—deemed the world’s oldest synthetic pigment—since it was a cheap substitute for pricier materials like lapis lazuli or turquoise. But archaeologists have puzzled over exactly how it was made, particularly given the wide range of hues, from deep blue to gray or green. That knowledge had long been forgotten. However, scientists at Washington State University have finally succeeded in recreating the recipe, according to a paper published in the journal npj Heritage Science.

The interdisciplinary team came up with 12 different potential recipes using varying percentages of silicon dioxide, copper, calcium, and sodium carbonate. They heated the samples to 1,000° Celsius (about what ancient artists could have achieved), varying the time between one and 11 hours. They also cooled the samples at different rates. Then they analyzed the samples using microscopy and other modern techniques and compared them to the Egyptian blue on actual Egyptian artifacts to find the best match.
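The matching step can be pictured as a small parameter sweep: make candidates, measure each one, and keep whichever best matches a reference measurement from an artifact. The sketch below is only a schematic of that search; the parameter values, the measure function, and the reference number are hypothetical stand-ins, not the study’s actual chemistry or analysis.

```python
import itertools

heating_hours = [1, 5, 11]      # the study varied firing time from one to 11 hours
cooling = ["fast", "slow"]      # and how quickly samples were cooled
copper_fraction = [0.10, 0.15]  # hypothetical compositional tweak

def measure(recipe):
    """Stand-in for synthesizing a sample and measuring its color/spectrum."""
    hours, cool, cu = recipe
    return hours * 0.1 + (0.5 if cool == "slow" else 0.0) + cu  # toy score, not real chemistry

reference = 1.2  # stand-in for the measurement taken from an actual artifact

candidates = list(itertools.product(heating_hours, cooling, copper_fraction))
best = min(candidates, key=lambda r: abs(measure(r) - reference))
print("closest match:", best)
```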

Their samples are now on display at the Carnegie Museum of Natural History in Pittsburgh. Apart from its historical interest, Egyptian blue also has fascinating optical, magnetic, and biological properties that could prove useful in practical applications today, per the authors. For instance, it might be used for counterfeit-proof inks, since it emits light in the near-infrared, and its chemistry is similar to high-temperature superconductors.

npj Heritage Science, 2025. DOI: 10.1038/s40494-025-01699-7  (About DOIs).

World’s smallest violin

It’s an old joke, possibly dating back to the 1970s. Whenever someone is complaining about an issue that seems trivial in the grand scheme of things, it’s tradition to rub one’s thumb and forefinger together and declare, “This is the world’s smallest violin playing just for you.” (In my snarky circles we used to say the violin was “playing ‘My Heart Bleeds for You.’”) Physicists at Loughborough University have now made what they claim really is the world’s smallest violin, just 35 microns long and 13 microns wide.

There are various lithographic methods for creating patterned electronic devices, such as photolithography, which can be used either with a mask or without. The authors relied on scanning probe thermal lithography instead, specifically a cutting-edge nano-sculpting machine they dubbed the NanoFrazor. The first step was to coat a small chip with two layers of a gel material and then place it under the NanoFrazor. The instrument’s heated tip burned the violin pattern into the gel. Then they “developed” the gel by dissolving the underlayer so that only a violin-shaped cavity remained.

Next, they poured on a thin layer of platinum and rinsed off the chip with acetone. The resulting violin is a microscopic image rather than a playable tiny instrument—you can’t even see it without a microscope—but it’s still an impressive achievement that demonstrates the capabilities of the lab’s new nano lithography system. And the whole process can take as little as three hours.

Muon g-2 anomaly no more?


Overhead view of the Muon g-2 experiment at Fermilab. Credit: Fermilab

The Muon g-2 experiment (pronounced “gee minus two”) is designed to look for tantalizing hints of physics beyond the Standard Model of particle physics. It does this by precisely measuring the magnetic moment of a subatomic particle known as the muon. Back in 2001, an earlier run of the experiment at Brookhaven National Laboratory found a slight discrepancy, hinting at possible new physics, but that controversial result fell short of the critical threshold required to claim a discovery.

Physicists have been making new measurements ever since in hopes of resolving this anomaly. For instance, in 2021, we reported on data from the updated Muon g-2 experiment that showed “excellent agreement” with the discrepancy Brookhaven recorded. They improved on their measurement precision in 2023. And now it seems the anomaly is very close to being resolved, according to a preprint posted to the physics arXiv based on analysis of a data set triple the size of the one used for the 2023 analysis. (You can watch a video explanation here.)

The final Muon g-2 result is in agreement with the 2021 and 2023 results, but much more precise, with error bars four times smaller than those of the original Brookhaven experiment. Combine that with new predictions by the related Muon g-2 Theory Initiative using a new means of calculating the muon’s magnetic moment, and the discrepancy between theoretical prediction and experiment narrows even further.
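For a rough sense of why the gap narrows, note that the tension between a measurement and a prediction is typically quoted in standard deviations, with the two uncertainties combined in quadrature. The numbers in the sketch below are toy values in arbitrary units, not the published g-2 results; they simply show that a tighter measurement only deepens an anomaly if the prediction stays put, while a prediction that shifts toward the measurement shrinks it.

```python
import math

def tension_sigma(measured, measured_err, predicted, predicted_err):
    """Difference between a measurement and a prediction, in combined standard deviations."""
    return abs(measured - predicted) / math.hypot(measured_err, predicted_err)

# Toy numbers for illustration only (arbitrary units, not the published values).
old_picture = tension_sigma(measured=10.0, measured_err=2.0, predicted=2.0, predicted_err=2.0)
new_picture = tension_sigma(measured=10.0, measured_err=0.5, predicted=9.0, predicted_err=2.0)
print(f"old picture: {old_picture:.1f} sigma, new picture: {new_picture:.1f} sigma")
```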

While some have declared victory and the Muon g-2 experiment has concluded, theorists are still sounding a note of caution as they seek to further refine their models. Meanwhile, Fermilab is building a new experiment designed to hunt for muon-to-electron conversions. If they find any, that would definitely constitute new physics beyond the Standard Model.

arXiv, 2025. DOI: 10.48550/arXiv.2506.03069 (About DOIs).

Message in a bubble


Physicists have embedded Morse code messages in ice bubbles. Credit: Keke Shao et al., 2025

Forget sending messages in a bottle. Scientists have figured out how to encode messages in both binary and Morse code in air bubbles trapped in ice, according to a paper published in the journal Cell Reports Physical Science. Trapped air bubbles are usually shaped like eggs or needles, and the authors discovered that they could manipulate the sizes, shapes, and distribution of those bubbles by varying the freezing rate. (Faster rates produce egg-shaped bubbles, slower rates produce needle-shaped ones, for example.)

To encode messages, the researchers assigned different bubble sizes, shapes, and orientations to Morse code and binary characters and used their freezing method to produce ice bubbles representing the desired characters. Next, they took a photograph of the ice layer and converted it to gray scale, training a computer to identify the position and the size of the bubbles and decode the message into English letters and Arabic numerals. The team found that binary coding could store messages 10 times longer than Morse code.
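As a toy illustration of the scheme, one can map each bit of a text message to a bubble type, since the freezing rate controls the shape (faster freezing yields egg-shaped bubbles, slower freezing yields needles). The mapping and 8-bit text encoding below are hypothetical stand-ins, not the paper’s actual code, which also exploited bubble size and orientation.

```python
# Toy bit-to-bubble mapping; the real system classifies bubbles from grayscale photos.
BIT_TO_BUBBLE = {"0": "egg", "1": "needle"}
BUBBLE_TO_BIT = {v: k for k, v in BIT_TO_BUBBLE.items()}

def encode(message: str) -> list[str]:
    """Turn a text message into a sequence of bubble types (8 bits per character)."""
    bits = "".join(f"{ord(ch):08b}" for ch in message)
    return [BIT_TO_BUBBLE[b] for b in bits]

def decode(bubbles: list[str]) -> str:
    """Recover the text from a sequence of classified bubble types."""
    bits = "".join(BUBBLE_TO_BIT[b] for b in bubbles)
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

bubbles = encode("ICE")
assert decode(bubbles) == "ICE"
print(f"'ICE' becomes {len(bubbles)} bubbles, starting with {bubbles[:8]}")
```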

Someday, this freezing method could be used for short message storage in Antarctica and similar very cold regions where traditional information storage methods are difficult and/or too costly, per the authors. However, Qiang Tang of the University of Australia, who was not involved in the research, told New Scientist that he did not see much practical application for the breakthrough in cryptography or security, “unless a polar bear may want to tell someone something.”

Cell Reports Physical Science, 2025. DOI: 10.1016/j.xcrp.2025.102622 (About DOIs).

Cats prefer to sleep on left side


Caliban marches to his own drum and prefers to nap on his right side. Credit: Sean Carroll

The Internet was made for cats, especially YouTube, which features millions of videos of varying quality, documenting the crazy antics of our furry feline friends. Those videos can also serve the interests of science, as evidenced by the international team of researchers who analyzed 408 publicly available videos of sleeping cats to study whether the kitties showed any preference for sleeping on their right or left sides. According to a paper published in the journal Current Biology, two-thirds of those videos showed cats sleeping on their left sides.

Why should this behavioral asymmetry be the case? There are likely various reasons, but the authors hypothesize that it has something to do with kitty perception and their vulnerability to predators while asleep (usually 12 to 16 hours a day). The right hemisphere of the brain dominates in spatial attention, while the right amygdala is dominant for processing threats. That’s why most species react more quickly when a predator approaches from the left. Because a cat’s left visual field is processed in the dominant right hemisphere of its brain, “sleeping on the left side can therefore be a survival strategy,” the authors concluded.

Current Biology, 2025. DOI: 10.1016/j.cub.2025.04.043 (About DOIs).

A mobile ultrasonic brain imaging helmet


A personalized 3D-printed helmet for mobile functional ultrasound brain imaging. Credit: Sadaf Soloukey et al., 2025

Brain imaging is a powerful tool for both medical diagnosis and neuroscience research, from noninvasive methods like EEG, MRI, fMRI, and diffuse optical tomography to more invasive techniques like intracranial EEG. But the dream is to be able to capture the human brain functioning in real-world scenarios instead of in the lab. Dutch scientists are one step closer to achieving that goal with a specially designed 3D-printed helmet that relies upon functional ultrasound imaging (fUSi) to enable high-quality 2D imaging, according to a paper published in the journal Science Advances.

Unlike fMRI, which requires subjects to remain stationary, the helmet monitors the brain as subjects are walking and talking (accompanied by a custom mobile fUSi acquisition cart). The team recruited two 30-something male subjects who had undergone cranioplasty to embed an implant made of polyetheretherketone (PEEK). While wearing the helmet, the subjects were asked to perform stationary motor and sensory tasks: pouting or brushing their lips, for example. Then the subjects walked in a straight line, pushing the cart for a minute up to 30 meters while licking their lips to demonstrate multitasking. The sessions ran over a 20-month period, thereby demonstrating that the helmet is suitable for long-term use. The next step is to improve the technology to enable mobile 3D imaging of the brain.

Science Advances, 2025. DOI: 10.1126/sciadv.adu9133  (About DOIs).


Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

Research roundup: 6 cool science stories we almost missed Read More »