
The $4.3 billion space telescope Trump tried to cancel is now complete


“We’re going to be making 3D movies of what is going on in the Milky Way galaxy.”

Artist’s concept of the Nancy Grace Roman Space Telescope. Credit: NASA Goddard Space Flight Center Scientific Visualization Studio

A few weeks ago, technicians inside a cavernous clean room in Maryland made the final connection to complete assembly of NASA’s Nancy Grace Roman Space Telescope.

Parts of this new observatory, named for NASA’s first chief astronomer, recently completed a spate of tests to ensure it can survive the shaking and intense sound of a rocket launch. Engineers placed the core of the telescope inside a thermal vacuum chamber, where it withstood the airless conditions and extreme temperature swings it will see in space.

Then, on November 25, teams at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, joined the inner and outer portions of the Roman Space Telescope. With this milestone, NASA declared the observatory complete and on track for launch as soon as fall 2026.

“The team is ecstatic,” said Jackie Townsend, the observatory’s deputy project manager at Goddard, in a recent interview with Ars. “It has been a long road, but filled with lots of successes and an ordinary amount of challenges, I would say. It’s just so rewarding to get to this spot.”

An ordinary amount of challenges is not something you usually hear a NASA official say about a one-of-a-kind space mission. NASA does hard things, and they usually take more time than originally predicted. Astronomers endured more than 10 years of delays, fixes, and setbacks before the James Webb Space Telescope finally launched in 2021.

Webb is the largest telescope ever put into space. After launch, Webb had to perform a sequence of more than 50 major deployment steps, with 178 release mechanisms that had to work perfectly. Any one of the more than 300 single points of failure could have doomed the mission. In the end, Webb unfolded its giant segmented mirror and delicate sunshield without issue. After a quarter-century of development and more than $11 billion spent, the observatory is finally delivering images and science results. And they’re undeniably spectacular.

The completed Nancy Grace Roman Space Telescope, seen here with its solar panels deployed inside a clean room at NASA’s Goddard Space Flight Center in Maryland. Credit: NASA/Jolearra Tshiteya

Seeing far and wide

Roman is far less complex, with a 7.9-foot (2.4-meter) primary mirror nearly three times smaller than Webb’s. While it lacks Webb’s deep vision, Roman will see wider swaths of the sky, enabling a cosmic census of billions of stars and galaxies, both nearby and at the far reaches of the Universe. This broad vision will support research into dark matter and dark energy, which are thought to make up about 95 percent of the Universe. The rest of the Universe is made of regular atoms and molecules that we can see and touch.
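For a rough sense of scale, the difference in light-gathering power follows from the mirror diameters alone, since collecting area scales with the square of the diameter. A back-of-envelope comparison, assuming Webb’s widely cited 6.5-meter primary:

```python
# Light-collecting area scales with mirror diameter squared.
roman_diameter_m = 2.4
webb_diameter_m = 6.5   # Webb's primary mirror span, a widely cited figure

area_ratio = (webb_diameter_m / roman_diameter_m) ** 2
print(f"Webb collects roughly {area_ratio:.1f}x as much light as Roman")
# -> Webb collects roughly 7.3x as much light as Roman
```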

It is also illustrative to compare Roman with the Hubble Space Telescope, whose primary mirror is the same size. This means Roman will produce images with similar resolution to Hubble’s. The distinction lies deep inside Roman, where technicians have delicately laid an array of detectors to register the faint infrared light coming through the telescope’s aperture.

“Things like night vision goggles will use the same basic detector device, just tuned to a different wavelength,” Townsend said.

These detectors are located in Roman’s Wide Field Instrument, the mission’s primary imaging camera. There are 18 of them, each 4,096×4,096 pixels wide, combining to form a roughly 300-megapixel camera sensitive to visible and near-infrared light. Teledyne, the company that produced the detectors, says this is the largest infrared focal plane ever made.
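The detector figures quoted above multiply out to the camera’s total pixel count. A quick sanity check:

```python
# Roman Wide Field Instrument pixel count, from the figures quoted above:
# 18 detectors, each 4,096 x 4,096 pixels.
detectors = 18
pixels_per_side = 4096

total_pixels = detectors * pixels_per_side ** 2
print(f"{total_pixels:,} pixels (~{total_pixels / 1e6:.0f} megapixels)")
# -> 301,989,888 pixels (~302 megapixels)
```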

The near-infrared channel on Hubble’s Wide Field Camera 3, which covers much the same part of the spectrum as Roman, has a single 1,024×1,024-pixel detector.

“That’s how you get to a much higher field-of-view for the Roman Space Telescope, and it was one of the key enabling technologies,” Townsend told Ars. “That was one place where Roman invested significant dollars, even before we started as a mission, to mature that technology so that it was ready to infuse into this mission.”

With these detectors in its bag, Roman will cover much more cosmic real estate than Hubble. For example, Roman will be able to re-create Hubble’s famous Ultra Deep Field image with the same sharpness, but expand it to show countless stars and galaxies over an area of the sky at least 100 times larger.

This infographic illustrates the differences between the sizes of the primary mirrors and detectors on the Hubble, Roman, and Webb telescopes. Credit: NASA

Roman has a second instrument, the Roman Coronagraph, with masks, filters, and adaptive optics to block out the glare from stars and reveal the faint glow from objects around them. It is designed to photograph planets 100 million times fainter than their stars, or 100 to 1,000 times better than similar instruments on Webb and Hubble. Roman can also detect exoplanets using the tried-and-true transit method, but scientists expect the new telescope will find a lot more than past space missions, thanks to its wider vision.

“With Roman’s construction complete, we are poised at the brink of unfathomable scientific discovery,” said Julie McEnery, Roman’s senior project scientist at NASA Goddard, in a press release. “In the mission’s first five years, it’s expected to unveil more than 100,000 distant worlds, hundreds of millions of stars, and billions of galaxies. We stand to learn a tremendous amount of new information about the universe very rapidly after Roman launches.”

Big numbers are crucial for learning how the Universe works, and Roman will feed vast volumes of data down to astronomers on Earth. “So much of what physics is trying to understand about the nature of the Universe today needs large number statistics in order to understand,” Townsend said.

In one of Roman’s planned sky surveys, the telescope will cover in nine months what would take Hubble between 1,000 and 2,000 years. In another survey, Roman will cover an area equivalent to 3,455 full moons in about three weeks, then go back and observe a smaller portion of that area repeatedly over five-and-a-half days—jobs that Hubble and Webb can’t do.
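For a sense of what “3,455 full moons” means in astronomers’ units, here is a back-of-envelope conversion to square degrees, assuming the Moon’s roughly 0.5-degree apparent diameter (an approximation, not a figure from the mission):

```python
import math

# Convert "3,455 full moons" of sky coverage into square degrees,
# treating the full Moon as a disk about 0.5 degrees across.
moon_diameter_deg = 0.5
moon_area_deg2 = math.pi * (moon_diameter_deg / 2) ** 2

survey_area_deg2 = 3455 * moon_area_deg2
print(f"~{survey_area_deg2:.0f} square degrees")
# -> ~678 square degrees
```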

“We will do fundamentally different science,” Townsend said. “In some subset of our observations, we’re going to be making 3D movies of what is going on in the Milky Way galaxy and in distant galaxies. That is just something that’s never happened before.”

Getting here and getting there

Roman’s promised scientific bounty will come at a cost of $4.3 billion, including expenses for development, manufacturing, launch, and five years of operations.

This is about $300 million more than NASA expected when it formally approved Roman for development in 2020, an overrun the agency blamed on complications related to the coronavirus pandemic. Otherwise, Roman’s budget has been stable since NASA officials finalized the mission’s architecture in 2017, when it was still known by a bulky acronym: WFIRST, the Wide Field InfraRed Survey Telescope.

At that time, the agency reclassified the Roman Coronagraph as a technology demonstration, allowing managers to relax their requirements for the instrument and stave off concerns about cost growth.

Roman survived multiple attempts by the first Trump administration to cancel the mission. Each time, Congress restored funding to keep the observatory on track for launch in the mid-2020s. With Donald Trump back in the White House, the administration’s budget office earlier this year again wanted to cancel Roman. Eventually, the Trump administration released its fiscal year 2026 budget request in May, calling for a drastic cut to Roman, but not total cancellation.

Once again, both houses of Congress signaled their opposition to the cuts, and the mission remains on track for launch next year, perhaps as soon as September. This is eight months ahead of the schedule NASA has publicized for Roman for the last few years.

Townsend told Ars the mission escaped the kind of crippling cost overruns and delays that afflicted Webb through careful planning and execution. “Roman was under a cost cap, and we operated to that,” she said. “We went through reasonable efforts to preclude those kinds of highly complex deployments that lead you to having trouble in integration and test.”

The outer barrel section of the Roman Space Telescope inside a thermal vacuum chamber at NASA’s Goddard Space Flight Center, Maryland. Credit: NASA/Sydney Rohde

There are only a handful of mechanisms that must work after Roman’s launch. They include a deployable cover designed to shield the telescope’s mirror during launch and solar array wings that will unfold once Roman is in space. The observatory will head to an observing post about a million miles (1.5 million kilometers) from Earth.

“We don’t have moments of terror for the deployment,” Townsend said. “Obviously, launch is always a risk, the tip-off rates that you have when you separate from the launch vehicle… Then, obviously, getting the aperture door open so that it’s deployed is another one. But these feel like normal aerospace risks, not unusual, harrowing moments for Roman.”

It also helps that Roman will use a primary mirror gifted to NASA by the National Reconnaissance Office, the US government’s spy satellite agency. The NRO originally ordered the mirror for a telescope that would peer down on the Earth, but the spy agency no longer needed it. Before NASA got its hands on the surplus mirror in 2012, scientists working on the preliminary design for what became Roman were thinking of a smaller telescope.

The larger telescope will make Roman a more powerful tool for science, and the NRO’s donation eliminated the risk of a problem or delay in manufacturing a new mirror. But the bigger mirror meant NASA had to build a more massive spacecraft and use a bigger rocket to accommodate it, adding to the observatory’s cost.

Tests of Roman’s components have gone well this year. Work on Roman continued at Goddard through the government shutdown in the fall. On Webb, engineers uncovered one problem after another as they tried to verify the observatory would perform as intended in space. There were leaky valves, tears in Webb’s sunshield, a damaged transducer, and loose screws. With Roman, engineers so far have found no “significant surprises” during ground testing, Townsend said.

“What we always hope when you’re doing this final round of environmental tests is that you’ve wrung out the hardware at lower levels of assembly, and it looks like, in Roman’s case, we did a spectacular job at the lower level,” she said.

With Roman now fully assembled, attention at Goddard will turn to an end-to-end functional test of the observatory early next year, followed by electromagnetic interference testing, and another round of acoustic and vibration tests. Then, perhaps around June of next year, NASA will ship the observatory to Kennedy Space Center, Florida, to prepare for launch on a SpaceX Falcon Heavy rocket.

“We’re really down to the last stretch of environmental testing for the system,” Townsend said. “It’s definitely already seen the worst environment until we get to launch.”

Photo of Stephen Clark

Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.



Hints grow stronger that dark energy changes over time

In its earliest days, the Universe was a hot, dense soup of subatomic particles, including hydrogen and helium nuclei, aka baryons. Tiny fluctuations created a rippling pattern through that early ionized plasma, which froze into a three-dimensional pattern as the Universe expanded and cooled. Those ripples, or bubbles, are known as baryon acoustic oscillations (BAO). It’s possible to use BAOs as a kind of cosmic ruler to investigate the effects of dark energy over the history of the Universe.
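To illustrate the “cosmic ruler” idea: if the physical size of the BAO imprint is known (a sound horizon of roughly 150 megaparsecs is a commonly quoted value), then measuring the angle it subtends on the sky gives the distance to the galaxies carrying it, by simple geometry. The numbers below are illustrative, not DESI measurements:

```python
import math

# Standard-ruler geometry: distance = physical size / angular size (radians).
bao_scale_mpc = 150.0       # approximate comoving BAO scale
observed_angle_deg = 1.0    # hypothetical measured angular size on the sky

distance_mpc = bao_scale_mpc / math.radians(observed_angle_deg)
print(f"~{distance_mpc:.0f} megaparsecs to that galaxy sample")
# -> ~8594 megaparsecs
```

Repeating this measurement for galaxy samples at many different distances is what lets DESI trace the expansion history slice by slice.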

DESI is a state-of-the-art instrument that can capture light from up to 5,000 celestial objects simultaneously.

That’s what DESI was designed to do: take precise measurements of the apparent size of these bubbles (both near and far) by determining the distances to galaxies and quasars over 11 billion years. That data can then be sliced into chunks to determine how fast the Universe was expanding at each point of time in the past, the better to model how dark energy was affecting that expansion.

An upward trend

Last year’s results were based on analysis of a full year’s worth of data taken from seven different slices of cosmic time, including 450,000 quasars, the largest such sample ever collected, with a record-setting 0.82 percent precision on the most distant epoch (8 to 11 billion years ago). While there was basic agreement with the Lambda-CDM model, when those first-year results were combined with data from other studies (involving the cosmic microwave background radiation and Type Ia supernovae), some subtle differences cropped up.

Essentially, those differences suggested that dark energy might be getting weaker. In terms of confidence, the results amounted to a 2.6-sigma level for DESI’s data combined with CMB datasets. When the supernovae data were added, those numbers shifted to 2.5-sigma, 3.5-sigma, or 3.9-sigma levels, depending on which particular supernova dataset was used.

It’s important to combine the DESI data with other independent measurements because “we want consistency,” said DESI co-spokesperson Will Percival of the University of Waterloo. “All of the different experiments should give us the same answer to how much matter there is in the Universe at present day, how fast the Universe is expanding. It’s no good if all the experiments agree with the Lambda-CDM model, but then give you different parameters. That just doesn’t work. Just saying it’s consistent to the Lambda-CDM, that’s not enough in itself. It has to be consistent with Lambda-CDM and give you the same parameters for the basic properties of that model.”



Our Universe is not fine-tuned for life, but it’s still kind of OK


Inspired by the Drake equation, researchers optimize a model universe for life.

Physicists including Robert H. Dicke and Fred Hoyle have argued that we are living in a universe that is perfectly fine-tuned for life. Following the anthropic principle, they claimed that the only reason fundamental physical constants have the values we measure is because we wouldn’t exist if those values were any different. There would simply have been no one to measure them.

But now a team of British and Swiss astrophysicists has put that idea to the test. “The short answer is no, we are not in the most likely of the universes,” said Daniele Sorini, an astrophysicist at Durham University. “And we are not in the most life-friendly universe, either.” Sorini led a study aimed at establishing how different amounts of the dark energy present in a universe would affect its ability to produce stars. Stars, he assumed, are a necessary condition for intelligent life to appear.

But worry not. While our Universe may not be the best for life, the team says it’s still pretty OK-ish.

Expanding the Drake equation

Back in the 1960s, Frank Drake, an American astrophysicist and astrobiologist, proposed an equation aimed at estimating the number of intelligent civilizations in our Universe. The equation started with stars as a precondition for life and worked its way down in scale from there. How many new stars appear in the Universe per year? How many of the stars are orbited by planets? How many of those planets are habitable? How many of those habitable planets can develop life? Eventually, you’re left with an estimate of the number of planets that host intelligent civilizations.

The problem with the Drake equation was that it wasn’t really supposed to yield a definite number. We couldn’t—and still can’t—know the values for most of its variables, like the fraction of the planets that developed life. So far, we know of only one such planet, and you can’t infer any statistical probabilities when you only have one sample. The equation was meant more as a guide for future researchers, giving them ideas of what to look for in their search for extraterrestrial life.
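The equation’s structure is simple enough to write down; the values are not. A sketch with placeholder numbers chosen purely for illustration (most of these factors are unknown, which is exactly the point made above):

```python
# Drake equation: multiply a chain of fractions down from the star
# formation rate to N, the number of detectable civilizations.
R_star = 1.5   # new stars per year in the galaxy (rough estimate)
f_p    = 0.9   # fraction of stars with planets
n_e    = 0.5   # habitable planets per planet-bearing star (guess)
f_l    = 0.1   # fraction of habitable planets that develop life (guess)
f_i    = 0.01  # fraction of those that develop intelligence (guess)
f_c    = 0.1   # fraction that emit detectable signals (guess)
L      = 1e4   # years such signals are emitted (guess)

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(f"N ~ {N:.2f} detectable civilizations")
```

With these particular guesses, N comes out below one; change any of the guessed fractions and the answer swings by orders of magnitude, which is why the equation works better as a guide than as a prediction.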

But even without knowing the actual values of all those variables present in the Drake equation, one thing was certain: The more stars you had at the beginning, the better the odds for life were. So Sorini’s team focused on stars.

“Our work is connected to the Drake equation in that it relies on the same logic,” Sorini said. “The difference is we are not adding to the life side of the equation. We’re adding to the stars’ side of the equation.” His team attempted to identify the basic constituents of a universe that’s good at producing stars.

“By ‘constituents,’ I mean ordinary matter, the stuff we are made of—the dark matter, which is a weirder, invisible type of matter, and the dark energy, which is what is making the expansion of a universe proceed faster and faster,” Sorini explained. Of all those constituents, his team found that dark energy has a key influence on the star formation rate.

Into the multiverse

Dark energy accelerates the expansion of the Universe, counteracting gravity and pushing matter further apart. If there’s enough dark energy, it would be difficult to form the dark matter web that structures galaxies. “The idea is ‘more dark energy, fewer galaxies—so fewer stars,’” Sorini said.

The effect of dark energy in a universe can be modeled by a number called the cosmological constant. “You could reinterpret it as a form of energy that can make your universe expand faster,” Sorini said.
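One standard way to see the constant’s role is the Friedmann equation for a flat universe, where the expansion rate at any epoch gets one term from matter (which dilutes as the universe expands) and one from the cosmological constant (which does not). A sketch with approximate present-day parameter values, for illustration only:

```python
import math

# Flat-universe Friedmann equation:
#   H(z)^2 = H0^2 * (Omega_m * (1 + z)^3 + Omega_lambda)
# The matter term thins out as space expands; the constant term stays put,
# so dark energy eventually dominates and the expansion accelerates.
H0 = 70.0           # Hubble constant, km/s/Mpc (approximate)
omega_m = 0.3       # matter density fraction
omega_lambda = 0.7  # cosmological-constant (dark energy) fraction

def hubble(z):
    """Expansion rate at redshift z, in km/s/Mpc."""
    return H0 * math.sqrt(omega_m * (1 + z) ** 3 + omega_lambda)

print(hubble(0.0))  # today: equals H0
print(hubble(1.0))  # when the universe was half its present scale
```

At z = 1 the matter term is eight times larger than today, so the expansion rate comes out around 123 km/s/Mpc; far in the future, it tends toward the floor set by the dark-energy term alone. Raising omega_lambda raises that floor, which is the “more dark energy, fewer galaxies” lever the study turns.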

(The cosmological constant was originally a number Albert Einstein came up with to fix the fact that his theory of general relativity caused the expansion of what was thought to be a static universe. Einstein later learned that the Universe actually was expanding and declared the cosmological constant his greatest blunder. But the idea eventually managed to make a comeback after it was discovered that the Universe’s expansion is accelerating.)

The cosmological constant was one of the variables Sorini’s team manipulated to determine if we are living in a universe that is maximally efficient at producing stars. Sorini based this work on an idea put forward by Steven Weinberg, a Nobel Prize-winning physicist, back in 1989. “Weinberg proposed that there could be a multiverse of all possible universes, each with a different value of dark energy,” Sorini explained. Sorini’s team modeled that multiverse composed of thousands upon thousands of possible universes, each complete with a past and future.

Cosmological fluke

To simulate the history of all those universes, Sorini used a slightly modified version of a star formation model he developed back in 2021 with John A. Peacock, a British astronomer at the University of Edinburgh, Scotland, and co-author of the study. It wasn’t the most precise model, but the approximations it suggested produced a universe that was reasonably close to our own. The team validated the results by predicting the stellar mass fraction in the total mass of the Milky Way Galaxy, which we know stands somewhere between 2.2 and 6.6 percent. The model came up with 6.7 percent, which was deemed good enough for the job.

In the next step, Sorini and his colleagues defined a large set of possible universes in which the value of the cosmological constant ranged from a tiny fraction of the one we observe in our Universe all the way up to a value 100,000 times higher than our own.

It turned out our Universe was not the best at producing stars. But it was decent.

“The value of the cosmological constant in the most life-friendly universe would be measured at roughly one-tenth of the value we observe in our own,” Sorini said.

In a universe like that, the fraction of the matter that gets turned into stars would stand at 27 percent. “But we don’t seem to be that far from the optimal value. In our Universe, stars are formed with around 23 percent of the matter,” Sorini said.

The last question the team addressed was how lucky we are to even be here. According to Sorini’s calculations, if all universes in the multiverse are equally likely, the chances of having a cosmological constant at or lower than the value present in our Universe are just 0.5 percent. In other words, we rolled the dice and got a pretty good score, although it could have been a bit better. The odds of getting a cosmological constant at one-tenth of our own or lower were just 0.2 percent.

Things also could have been much worse. The flip side of these odds is that the number of possible universes that are worse than our own vastly exceeds the number of universes that are better.

“That is of course all subject to the assumptions of our model, and the only assumption about life we made was that more stars lead to higher chances for life to appear,” Sorini said. In the future, his team plans to go beyond that idea and make the model more sophisticated by considering more parameters. “For example, we could ask ourselves what the chances are of producing carbon in order to have life as we know it or something like that,” Sorini said.

Monthly Notices of the Royal Astronomical Society, 2024. DOI: https://doi.org/10.1093/mnras/stae2236

Photo of Jacek Krywko

Jacek Krywko is a freelance science and technology writer who covers space exploration, artificial intelligence research, computer science, and all sorts of engineering wizardry.
