

CenturyLink nightmares: Users keep asking Ars for help with multi-month outages


More CenturyLink horror stories

Three more tales of CenturyLink failing to fix outages until hearing from Ars.

Horror-poster take on the classic White Zombie, with CenturyLink rendering the Internet powerless.

Credit: Aurich Lawson | White Zombie (Public Domain)


CenturyLink hasn’t broken its annoying habit of leaving customers without service for weeks or months and repeatedly failing to show up for repair appointments.

We’ve written about CenturyLink’s failure to fix long outages several times in the past year and a half. In each case, desperate customers contacted Ars because the telecom provider didn’t reconnect their service. And each time, CenturyLink finally sprang into action and fixed the problems shortly after hearing from an Ars reporter.

Unfortunately, it keeps happening, and CenturyLink (also known as Lumen) can’t seem to explain why. In the last two months alone, we heard from CenturyLink customers in three states who were without service for periods ranging from three weeks to over four months.

In early December, we heard from John in Boulder, Colorado, who preferred that we not publish his last name. John said he and his wife had been without CenturyLink phone and DSL Internet service for over three weeks.

“There’s no cell service where we live, so we have to drive to find service… We’ve scheduled repairs [with CenturyLink] three different times, but each time nobody showed up, emailed, or called,” he told us. They pay $113 a month for phone and DSL service, he said.

John also told us his elderly neighbors were without service. He read our February 2024 article about a 39-day outage in Oregon and wondered if we could help. We also published an August 2023 article about CenturyLink leaving an 86-year-old woman in Minnesota with no Internet service for a month and a May 2024 article about CenturyLink leaving a couple in Oregon with no service for two months, then billing them for $239.

We contacted CenturyLink about the outages affecting John and his neighbor, providing both addresses to the company. Service for both was fixed several hours later. Suddenly, a CenturyLink “repair person showed up today, replaced both the modem and the phone card in the nearest pedestal, and we are reconnected to the rest of the world,” John told us.

John said he also messaged a CenturyLink technician whose contact information he saved from a previous visit for a different matter. It turned out this technician had been promoted to area supervisor, so John’s outreach to him may also have contributed to the belated fix. However it happened, CenturyLink confirmed to Ars that service was restored for both John and his neighbor on the same day.

“Good news, we were able to restore service to both customers today,” a company spokesperson told us. “One had a modem issue, which needed to be replaced, and the other had a problem with their line.”

What were you waiting for?

After getting confirmation that the outages were fixed, we asked the CenturyLink spokesperson whether the company has “a plan to make sure that customer outages are always fixed when a customer contacts the company instead of waiting for a reporter to contact the company on the customer’s behalf weeks later.”

Here is the answer we got from CenturyLink: “Restoring customer service is a priority, and we apologized for the delay. We’re looking at why there was a repair delay.”

It appears that nothing has changed. Even as John’s problem was fixed, CenturyLink users in other states suffered even longer outages, and no one showed up for scheduled repair appointments. These outages weren’t fixed until late January—and only after the customers contacted us to ask for help.

Karen Kurt, a resident of Sheridan, Oregon, emailed us on January 23 to report that she had been without CenturyLink DSL Internet service since November 4, 2024. One of her neighbors was also suffering through the months-long outage.

“We have set up repair tickets only to have them voided and/or canceled,” Kurt told us. “We have sat at home on the designated repair day from 8 am to 5 pm, and no one shows up.” Kurt’s CenturyLink phone and Internet service costs $172.04 a month, according to a recent bill she provided us. Kurt said she also has frequent CenturyLink phone outages, including some stretches that occurred during the three-month Internet outage.

Separately, a CenturyLink customer named David Stromberg in Bellevue, Washington, told us that his phone service had been out since September 16. He repeatedly scheduled repair appointments, but the scheduled days went by with no repairs. “Every couple weeks, they do this and the tech doesn’t show up,” he said.

“Quick” fixes

As far as we can tell, there weren’t any complex technical problems preventing CenturyLink from ending these outages. Once the public relations department heard from Ars, CenturyLink sent technicians to each area, and the customers had their services restored.

On the afternoon of January 24, we contacted CenturyLink about the outage affecting Kurt and her neighbor. CenturyLink restored service for both houses less than three hours later, finally ending outages that lasted over 11 weeks.

On Sunday, January 26, we informed CenturyLink’s public relations team about the outage affecting Stromberg in Washington. Service was restored about 48 hours later, ending the phone outage that lasted well over four months.

As we’ve done in previous cases, we asked CenturyLink why the outages lasted so long and why the company repeatedly failed to show up for repair appointments. We did not receive any substantive answer. “Services have been restored, and appropriate credits will be provided,” the CenturyLink spokesperson replied.

Stromberg said getting the credit wasn’t so simple. “We contacted them after service was restored. They credited the full amount, but it took a few phone calls. They also gave us a verbal apology,” he told us. He said they pay $80.67 a month for CenturyLink phone service and that they get Internet access from Comcast.

Kurt said she had to call CenturyLink each month the outage dragged on to obtain a bill credit. Though the outage is over, she said her Internet access has been unreliable since the fix, with webpages often taking painfully long times to load.

Kurt has only a 1.5Mbps DSL connection, so it’s not a modern Internet connection even on a good day. CenturyLink told us it found no further problems on its end, so it appears that Kurt is stuck with what she has for now.

Desperation

“We are just desperate,” Kurt told us when she first reached out. Kurt, a retired teacher, said she and her husband were driving to a library to access the Internet and help grandchildren with schoolwork. She said there’s no reliable cell service in the area and that they are on a waiting list for Starlink satellite service.

Kurt said her husband once suggested they switch to a different Internet provider, and she pointed out that there aren’t any better options. On the Starlink website, entering their address shows they are in an area labeled as sold out.

Although repair appointments came and went without a fix, Kurt said she received emails from CenturyLink falsely claiming that service had been restored. Kurt said she spoke with technicians doing work nearby and asked if CenturyLink is trying to force people to drop the service because it doesn’t want to serve the area anymore.

Kurt said a technician replied that there are some areas CenturyLink doesn’t want to serve anymore but that her address isn’t on that list. A technician explained that they have too much work, she said.

CenturyLink has touted its investments in modern fiber networks but hasn’t upgraded the old copper lines in Kurt’s area and many others.

“This is DSL. No fiber here!” Kurt told us. “Sometimes when things are congested, you can make a sandwich while things download. I have been told that is because this area is like a glass of water. At first, there were only a few of us drinking out of the glass. Now, CenturyLink has many more customers drinking out of that same glass, and so things are slower/congested at various times of the day.”

Kurt said the service tends to work better in mid-morning, early afternoon, after 9 pm on weeknights, and on weekends. “Sometimes pages take a bit of time to load. That is especially frustrating while doing school work with my grandson and granddaughter,” she said.

CenturyLink Internet even slower than expected

After the nearly three-month outage ended, Kurt told us on January 27 that “many times, we will get Internet back for two or three days, only to lose it again.” This seemed to be what happened on Sunday, February 2, when Kurt told us her Internet stopped working again and that she couldn’t reach a human at CenturyLink. She restarted the router but could not open webpages.

We followed up with CenturyLink’s public relations department again, but this time, the company said its network was performing as expected. “We ran a check and called Karen regarding her service,” CenturyLink told us on February 3. “Everything looks good on our end, with no problems reported since the 24th. She mentioned that she could access some sites, but the speed seemed really slow. We reminded her that she has a 1.5Mbps service. Karen acknowledged this but felt it was slower than expected.”

Kurt told us that her Internet is currently slower than it was before the outage. “Before October, at least the webpages loaded,” she said. Now, “the pages either do not load, continue to attempt to load, or finally time out.”

While Kurt is suffering from a lack of broadband competition, municipalities sometimes build public broadband networks when private companies fail to adequately serve their residents. ISPs such as CenturyLink have lobbied against these efforts to expand broadband access.

In May 2024, we wrote about how public broadband advocates say they’ve seen a big increase in opposition from “dark money” groups that don’t have to reveal their donors. At the time, CenturyLink did not answer questions about specific donations but defended its opposition to government-operated networks.

“We know it will take everyone working together to close the digital divide,” CenturyLink told us then. “That’s why we partner with municipalities on their digital inclusion efforts by providing middle-mile infrastructure that supports last-mile networks. We have and will continue to raise legitimate concerns when government-owned networks create an anti-competitive environment. There needs to be a level playing field when it comes to permitting, right-of-way fees, and cross subsidization of costs.”

Stuck with CenturyLink

Kurt said that CenturyLink has set a “low bar” for its service, and it isn’t even meeting that low standard. “I do not use the Internet a lot. I do not use the Internet for gaming or streaming things. The Internet here would never be able to do that. But I do expect the pages to load properly and fully,” she said.

Kurt said she and her husband live in a house they built in 2007 and originally were led to believe that Verizon service would be available. “Prior to purchasing the property, we did our due diligence and sought out all utility providers… Verizon insisted it was their territory on at least two occasions,” she said.

But when it was time to install phone and Internet lines, it turned out Verizon didn’t serve the location, she said. This is another problem we’ve written about multiple times—ISPs incorrectly claiming to offer service in an area, only to admit they don’t after a resident moves in. (Verizon sold its Oregon wireline operations to Frontier in 2010.)

“We were stuck with CenturyLink,” and “CenturyLink did not offer Internet when we first built this home,” Kurt said. They subscribed to satellite Internet offered by WildBlue, which was acquired by ViaSat in 2009. They used satellite for several years until they could get CenturyLink’s DSL Internet.

Now they’re hoping to replace CenturyLink with Starlink, which uses low-Earth orbit satellites that offer faster service than older satellite services. They’re on the waiting list for Starlink and are interested in Amazon’s Kuiper satellite service, which isn’t available yet.

“We are hoping one of these two vendors will open up a spot for us and we can move our Internet over to satellite,” Kurt said. “We have also heard that Starlink and Amazon are going to be starting up phone service as well as Internet. That would truly be a gift to us. If we could move all of our services over to something reliable, our life would be made so much easier.”

Not enough technicians for copper network

John, the Colorado resident who had a three-week CenturyLink outage, said his default DSL speed is 10Mbps downstream and 2Mbps upstream. He doubled that by getting a second dedicated line to create a bonded connection, he said.

When John set up repair appointments during the outage, the “dates came and went without the typical ‘your tech’s on their way’ email, without anyone showing up,” he said. John said he repeatedly called CenturyLink and was told there was a bad cable that was being fixed.

“Every time I called, I’d get somebody who said that it was a bad cable and it was being fixed. Every single time, they’d say it would be fixed by 11 pm the following day,” he said. “It wasn’t, so I’d call again. I asked to talk with a supervisor, but that was always denied. Every time, they said they’d expedite the request. The people I talked with were all very nice and very apologetic about our outage, but they clearly stayed in their box.”

John still had the contact information for the CenturyLink technician who set up his bonded connection and messaged him around the same time he contacted Ars. When a CenturyLink employee finally showed up to fix the problem, he “found that our DSL was out because our modem was bad, and the phone was out because there was a bad dial-tone card in the closest pedestal. It took this guy less than an hour to get us back working—and it wasn’t a broken cable,” John said.

John praised CenturyLink’s local repair team but said his requests for repairs apparently weren’t routed to the right people. A CenturyLink manager told John that the local crew never got the repair ticket from the phone-based customer service team, he said.

The technician who fixed the service offered some insight into the local problems, John told us. “He said that in the mountains of western Boulder County, there are a total of five techs who know how to work with copper wire,” John told us. “All the other employees only work with fiber. CenturyLink is losing the people familiar with copper and not replacing them, even though copper is what the west half of the county depends on.”

Lumen says it has 1.08 million fiber broadband subscribers and 1.47 million “other broadband subscribers,” defined as “customers that primarily subscribe to lower speed copper-based broadband services marketed under the CenturyLink brand.”

John doesn’t know whether his copper line will ever be upgraded to fiber. His house is 1.25 miles from the nearest fiber box. “I wonder if they’ll eventually replace lines like the one to our house or if they’ll drop us as customers when the copper line eventually degrades to the point it’s not usable,” he said.


Jon is a Senior IT Reporter for Ars Technica. He covers the telecom industry, Federal Communications Commission rulemakings, broadband consumer affairs, court cases, and government regulation of the tech industry.



The Severance writer and cast on corporate cults, sci-fi, and more

The following story contains light spoilers for season one of Severance but none for season 2.

The first season of Severance walked the line between science-fiction thriller and Office Space-like satire, using a clever conceit (characters can’t remember what happens at work while at home, and vice versa) to open up new storytelling possibilities.

It hinted at additional depths, but it’s really season 2’s expanded worldbuilding that begins to uncover additional themes and ideas.

After watching the first six episodes of season two and speaking with the series’ showrunner and lead writer, Dan Erickson, as well as a couple of members of the cast (Adam Scott and Patricia Arquette), I see a show that’s about more than critiquing corporate life. It’s about all sorts of social mechanisms of control. It’s also a show with a tremendous sense of style and deep influences in science fiction.

Corporation or cult?

When I started watching season 2, I had just finished watching two documentaries about cults—The Vow, about a multi-level marketing and training company that turned out to be a sex cult, and Love Has Won: The Cult of Mother God, about a small, Internet-based religious movement that believed its founder was the latest human form of God.

There were hints of cult influences in the Lumon corporate structure in season 1, but without spoiling anything, season 2 goes much deeper into them. As someone who has worked at a couple of very large media corporations, I enjoyed Severance’s send-up of corporate culture. And as someone who has worked in tech startups—both good and dysfunctional ones—and who grew up in a radical religious environment, I now enjoy its send-up of cult social dynamics and power plays.

Employees watch a corporate propaganda video

Lumon controls what information is presented to its employees to keep them in line. Credit: Apple

When I spoke with showrunner Dan Erickson and actor Patricia Arquette, I wasn’t surprised to learn that it wasn’t just me—the influence of stories about cults on season 2 was intentional.

Erickson explained:

I watched all the cult documentaries that I could find, as did the other writers, as did Ben, as did the actors. What we found as we were developing it is that there’s this weird crossover. There’s this weird gray zone between a cult and a company, or any system of power, especially one where there is sort of a charismatic personality at the top of it like Kier Eagan. You see that in companies that have sort of a reverence for their founder.

Arquette also did some research on cults. “Very early on when I got the pilot, I was pretty fascinated at that time with a lot of cult documentaries—Wild Wild Country, and I don’t know if you could call it a cult, but watching things about Scientology, but also different military schools—all kinds of things like that with that kind of structure, even certain religions,” she recalled.



Weight saving and aero optimization feature in the 2025 Porsche 911 GT3


Among the changes are better aero, shorter gearing, and the return of the Touring.

A pair of Porsche 911 GT3s parked next to a wall

The Porsche 911 GT3 is to other 911s as other 911s are to regular cars. Credit: Jonathan Gitlin


VALENCIA, SPAIN—A Porsche 911 is rather special compared to most “normal” cars. The rear-engined sports car might be bigger and less likely to swap ends than the 1960s version, but it remains one of the more nimble and engaging four-wheeled vehicles you can buy. The 911 comes in a multitude of variants, but among driving enthusiasts, few are better regarded than the GT3. And Porsche has just treated the current 911 GT3 to its midlife refresh, which it will build in regular and Touring flavors.

The GT3 is a 911 you can drive to the track, spend the day lapping, and drive home again. It’s come a long way since the 1999 original—that car made less power than a base 911 does now. Now, the recipe is a bit more involved, with a naturally aspirated flat-six engine mounted behind the rear axle that generates 502 hp (375 kW) and 331 lb-ft (450 Nm) and a redline that doesn’t interrupt play until 9,000 rpm. You’ll need to exercise it to reach those outputs—peak power arrives at 8,500, although peak torque happens a bit sooner at around 6,000 revs.

It’s a mighty engine indeed, derived from the racing version of the 911, with some tweaks for road legality. So there are things like individual throttle valves, dry sump lubrication, solid cam finger followers (instead of hydraulic valve lifters), titanium con rods, and forged pistons.

I’ve always liked GT3s in white.

For this car, Porsche has also worked on reducing its emissions, fitting four catalytic converters to the exhaust, plus a pair of particulate filters, which together help cut NOx emissions on the US test cycle by 44 percent. This adds 3 lbs (1.4 kg) of mass and increases exhaust back pressure by 17 percent. But there are also new cylinder heads and reprofiled camshafts (from the even more focused, even more expensive GT3 RS), which increase drivability and power delivery in the upper rev range by keeping the valves open for longer.

Those tweaks might not be immediately noticeable when you look at last year’s GT3, but the shorter gearing definitely will be. The final drive ratios for both the standard seven-speed PDK dual-clutch gearbox and the six-speed manual have been reduced by 8 percent. This lowers the top speed a little—a mostly academic thing anyway outside of the German Autobahn and some very long runways—but it increases the pulling force on the rear wheels in each gear across the entire rev range. In practical terms, it means you can take a corner in a gear higher than you would in the old car.

There have been suspension tweaks, too. The GT3 moved to double front wishbone suspension (replacing the regular car’s MacPherson struts) in 2021, but now the front pivot point has been lowered to reduce the car diving under braking, and the trailing arms have a new teardrop profile that improves brake cooling and reduces drag a little. Porsche has altered the bump stops, giving the suspension almost an inch (24 mm) more travel at the front axle and slightly more (27 mm) at the rear axle, which in turn means more body control on bumpy roads.

A white Porsche 911 GT3 seen in profile

Credit: Porsche

New software governs the power steering. Because factors like manufacturing tolerances, wear, and even temperature can alter how steering components interact with each other, the software automatically tailors friction compensation to axle friction. Consequently, the steering is more precise and more linear in its behavior, particularly in the dead-ahead position.

The GT3 also has new front and rear fascias, again derived from the racing GT3. There are more cooling inlets, vents, and ducts, plus a new front diffuser that reduces lift at the front axle at speed. Porsche has tuned the GT3’s aerodynamics to be constant across the speed range, and like the old model, it generates around 309 lbs (140 kg) of downforce at 125 mph (200 km/h). Under the car, there are diffusers on the rear lower wishbones, and Porsche has improved brake and driveshaft cooling.

Finally, Porsche has made some changes to the interior. For instance, the GT3 now gains the same digital display seen on other facelifted 911s (the 992.2 generation if you’re a Porsche nerd), similar to the one you’d find in a Taycan, Macan, or Panamera.

Some people may mourn the loss of the big physical tachometer, but I’m not one of them. The car has a trio of UI settings: a traditional five-dial display, a more reduced three-dial display, and a track mode with just the big central tach, which you can reorient so the red line is at 12 o’clock, as was the case with many an old Porsche racing car, rather than its normal position down around 5 o’clock. And instead of a push button to start the car, there’s a twister—if a driver spins on track, it’s more intuitive to restart the car by twisting the control the way you would a key.

You can see the starter switch on the left of the steering wheel. Porsche

Finally, there are new carbon fiber seats, which now have folding backrests for better access to the rear. (However, unless I’m mistaken, you can’t adjust the angle of the backrest.) In a very clever and welcome touch, the headrest padding is removable so that your head isn’t forced forward when wearing a helmet on track. Such is the attention to detail here. (Customers can also spec the car with Porsche’s 18-way sports seats instead.)

Regular, Touring, Lightweight, Weissach

The new GT3 is available in two different versions. There’s the standard car, with its massive rear wing (complete with gooseneck mounts), which is the one you’d pick if your diet included plenty of track days. For those who want a 911 that revs to 9 but don’t plan on spending every weekend chasing lap times, Porsche has reintroduced the GT3 Touring. This version ditches the rear wing for the regular 911 rear deck, the six-speed manual is standard (with PDK as an option), and you can even specify rear seats—traditionally, the GT3 has eliminated those items in favor of weight saving.

Of course, it’s possible to cut even more weight from the GT3 with the Weissach Pack for the winged car or a lightweight package for the Touring. These options involve lots of carbon fiber bits for the interior and the rear axle, a carbon fiber roof for the Touring, and even the option of a carbon fiber roll cage for the GT3. The lightweight package for the Touring also includes an extra-short gear lever with a shorter throw.

The track mode display might be too minimalist for road driving—I tend to like being able to see my directions as well as the rpm and speed—but it’s perfect for track work. Note the redline at 12 o’clock. Porsche

Although Porsche had to add some weight to the 992.2 compared to the 992.1 thanks to thicker front brake discs and more door-side impact protection, the standard car still weighs just 3,172 lbs (1,439 kg), which you can reduce to 3,131 lbs (1,420 kg) if you fit all the lightweight goodies, including the ultra-lightweight magnesium wheels.

Behind the wheel

I began my day with a road drive in the GT3 Touring—a PDK model. Porsche wasn’t kidding about the steering. I hesitate to call it telepathic, as that’s a bit of a cliché, but it’s extremely direct, particularly the initial turn-in. There’s also plenty of welcome feedback from the front tires. In an age when far too many cars have essentially numb steering, the GT3 is something of a revelation. And it’s proof that electronic power steering can be designed and tuned to deliver a rewarding experience.

The cockpit ergonomics are spot-on, with plenty of physical controls rather than relegating everything to a touchscreen. If you’re short like me and you buy a GT3, you’ll want to have the buckets set for your driving position—while the seat adjusts for height, as you raise it up, it also pitches forward a little, making the seat back more vertical than I’d like. (The seats slide fore and aft, so they’re not quite fixed buckets as they would be in a racing car.)

The anti-dive effect of that front suspension is quite noticeable under braking, and in either Normal or Sport mode, the damper settings are well-calibrated for bumpy back roads. It’s a supple ride, if not quite a magic carpet. On the highway, the Touring cruises well, although the engine can start to sound a little droning at a constant rpm. But the highway is not what the GT3 is optimized for.

On a dusty or wet road, you need to be alert if you’re going to use a lot of throttle at low speed. Jonathan Gitlin

On winding mountain roads, again in Normal or Sport, the car comes alive. Second and third gears are perfect for these conditions, allowing you to keep the car within its power band. And boy, does it sound good as it howls between 7,000 and 9,000 rpm. Porsche’s naturally aspirated flat-sixes have a hard edge to them—the 911 RSR was always the loudest race car in the pack—and the GT3 is no exception. Even with the sports exhaust in fruity mode, there’s little of the pops, bangs, and crackles you might hear in other sports cars, but the drama comes from the 9,000 rpm redline.

Porsche asked us to keep traction control and ESC enabled during our drive—there are one-touch buttons to disable them—and given the muddy and dusty state of the roads, this was a wise idea. (The region was beset by severe flooding recently, and there was plenty of evidence of that on the route.) Even with TC on, the rear wheels would break traction if you were injudicious with the throttle, and presumably that would be the same in the wet. But it’s very easy to catch, even if you are only of moderate driving ability, like your humble correspondent.

After lunch, it was time to try the winged car, this time on the confines of the Ricardo Tormo circuit just outside the city. On track, the handling was very neutral around most of the corners, with some understeer through the very slow turn 2. While a low curb weight and more than 500 hp made for a very fast accelerating car, the braking performance was probably even more impressive, allowing you to stand on the pedal and shed speed with no fade and little disturbance to the body control. Again, I am no driving god, but the GT3 was immensely flattering on track, and unlike much older 911s, it won’t try to swap ends on you when trail-braking or the like.

The landing was not nearly as jarring as you might think. Porsche

After some time behind the wheel, I was treated to some passenger laps by one of my favorite racing drivers, the inimitable Jörg Bergmeister. Unlike us journalists, he was not required to stay off the high curbs, and he demonstrated how well the car settles after launching its right-side wheels into the air over one of them. It settles down very quickly! He also demonstrated that the GT3 can be plenty oversteer-y on the exit of corners if you know what you’re doing, aided by the rear-wheel steering. It’s a testament to his driving that I emerged from two passenger laps far sweatier than I was after lapping the track myself.

The GT3 and GT3 Touring should be available from this summer in the US, with a starting price of $222,500. Were I looking for a 911 for road driving, I think I might be more tempted by the much cheaper 911 Carrera T, which is also pared to the bone weight-wise but uses the standard 380 hp (283 kW) turbocharged engine (which is still more power than the original GT3 of 1999). That car delivers plenty of fun at lower speeds, so it’s probably more usable on back roads.

A green Porsche 911 GT3 seen at sunset

Credit: Porsche

But if you want a 911 for track work, this new GT3 is simply perfect.


Jonathan is the Automotive Editor at Ars Technica. He has a BSc and PhD in Pharmacology. In 2014 he decided to indulge his lifelong passion for the car by leaving the National Human Genome Research Institute and launching Ars Technica’s automotive coverage. He lives in Washington, DC.



AI haters build tarpits to trap and trick AI scrapers that ignore robots.txt


Making AI crawlers squirm

Attackers explain how an anti-spam defense became an AI weapon.

Last summer, Anthropic inspired backlash when its ClaudeBot AI crawler was accused of hammering websites a million or more times a day.

And it wasn’t the only artificial intelligence company making headlines for supposedly ignoring instructions in robots.txt files to avoid scraping web content on certain sites. Around the same time, Reddit’s CEO called out all AI companies whose crawlers he said were “a pain in the ass to block,” despite the tech industry otherwise agreeing to respect “no scraping” robots.txt rules.

Watching the controversy unfold was a software developer whom Ars has granted anonymity to discuss his development of malware (we’ll call him Aaron). Shortly after he noticed Facebook’s crawler exceeding 30 million hits on his site, Aaron began plotting a new kind of attack on crawlers that “clobber” websites, one he told Ars he hoped would give “teeth” to robots.txt.

Building on an anti-spam cybersecurity tactic known as tarpitting, he created Nepenthes, malicious software named after a carnivorous plant that will “eat just about anything that finds its way inside.”

Aaron clearly warns users that Nepenthes is aggressive malware. It’s not to be deployed by site owners uncomfortable with trapping AI crawlers and sending them down an “infinite maze” of static files with no exit links, where they “get stuck” and “thrash around” for months, he tells users. Once trapped, the crawlers can be fed gibberish data, aka Markov babble, which is designed to poison AI models. That’s likely an appealing bonus feature for any site owners who, like Aaron, are fed up with paying for AI scraping and just want to watch AI burn.

Tarpits were originally designed to waste spammers’ time and resources, but creators like Aaron have now evolved the tactic into an anti-AI weapon. As of this writing, Aaron confirmed that Nepenthes can effectively trap all the major web crawlers, with one exception: so far, only OpenAI’s crawler has managed to escape.

It’s unclear how much damage tarpits or other AI attacks can ultimately do. Last May, Laxmi Korada, Microsoft’s director of partner technology, published a report detailing how leading AI companies were coping with poisoning, one of the earliest AI defense tactics deployed. He noted that all companies have developed poisoning countermeasures, while OpenAI “has been quite vigilant” and excels at detecting the “first signs of data poisoning attempts.”

Despite these efforts, he concluded that data poisoning was “a serious threat to machine learning models.” And in 2025, tarpitting represents a new threat, potentially increasing the costs of fresh data at a moment when AI companies are heavily investing and competing to innovate quickly while rarely turning significant profits.

“A link to a Nepenthes location from your site will flood out valid URLs within your site’s domain name, making it unlikely the crawler will access real content,” a Nepenthes explainer reads.
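That flooding behavior can be sketched in a few lines. This is an illustrative toy in the spirit of Nepenthes, not its actual implementation; the function names, the word list, and the link scheme are all invented for the example:

```python
import hashlib
import random

# Toy tarpit page generator: every URL under the trap deterministically yields
# a page of gibberish plus links one level deeper into an infinite URL space,
# so a crawler that ignores robots.txt never runs out of "content" to fetch.

WORDS = ["data", "signal", "vector", "archive", "node", "index", "flux", "meta"]

def babble(seed: str, n_words: int = 40) -> str:
    """Deterministic pseudo-Markov gibberish derived from the URL path."""
    rng = random.Random(hashlib.sha256(seed.encode()).digest())
    return " ".join(rng.choice(WORDS) for _ in range(n_words))

def trap_page(path: str, n_links: int = 10) -> str:
    """Render one tarpit page: gibberish text plus links one level deeper."""
    rng = random.Random(hashlib.sha256(path.encode()).digest())
    links = "".join(
        f'<a href="{path.rstrip("/")}/{rng.randrange(10**8):08d}">more</a>\n'
        for _ in range(n_links)
    )
    return f"<html><body><p>{babble(path)}</p>\n{links}</body></html>"
```

Because every page links only deeper into the maze and never back out to real content, a crawler that follows the links spends its requests on an unbounded supply of machine-generated noise.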

The only AI company that responded to Ars’ request to comment was OpenAI, whose spokesperson confirmed that OpenAI is already working on a way to fight tarpitting.

“We’re aware of efforts to disrupt AI web crawlers,” OpenAI’s spokesperson said. “We design our systems to be resilient while respecting robots.txt and standard web practices.”

But to Aaron, the fight is not about winning. Instead, it’s about resisting an AI industry that he sees further decaying the Internet with tech that no one asked for, like chatbots that replace customer service agents or the rise of inaccurate AI search summaries. By releasing Nepenthes, he hopes to do as much damage as possible, perhaps spiking companies’ AI training costs, dragging out training efforts, or even accelerating model collapse, with tarpits helping to delay the next wave of enshittification.

“Ultimately, it’s like the Internet that I grew up on and loved is long gone,” Aaron told Ars. “I’m just fed up, and you know what? Let’s fight back, even if it’s not successful. Be indigestible. Grow spikes.”

Nepenthes instantly inspires another tarpit

Nepenthes was released in mid-January and instantly became more popular than Aaron expected after tech journalist Cory Doctorow boosted a Mastodon post by tech commentator Jürgen Geuter praising the novel AI attack method. Aaron was shocked to see engagement with Nepenthes skyrocket.

“That’s when I realized, ‘oh this is going to be something,'” Aaron told Ars. “I’m kind of shocked by how much it’s blown up.”

It’s hard to tell how widely Nepenthes has been deployed. Site owners are discouraged from disclosing that they’ve deployed the malware, so crawlers that ignore robots.txt instructions face unknown “consequences.”

Aaron told Ars that while “a handful” of site owners have reached out and “most people are being quiet about it,” his web server logs indicate that people are already deploying the tool. Likely, site owners want to protect their content, deter scraping, or mess with AI companies.

When software developer and hacker Gergely Nagy, who goes by the handle “algernon” online, saw Nepenthes, he was delighted. At that time, Nagy told Ars that nearly all of his server’s bandwidth was being “eaten” by AI crawlers.

Already blocking scraping and attempting to poison AI models through a simpler method, Nagy took his defense method further and created his own tarpit, Iocaine. He told Ars the tarpit immediately killed off about 94 percent of bot traffic to his site, which was primarily from AI crawlers. Soon, social media discussion drove users to inquire about Iocaine deployment, including not just individuals but also organizations wanting to take stronger steps to block scraping.

Iocaine takes ideas (not code) from Nepenthes, but it’s more intent on using the tarpit to poison AI models. Nagy used a reverse proxy to trap crawlers in an “infinite maze of garbage” in an attempt to slowly poison their data collection as much as possible for daring to ignore robots.txt.

Taking its name from “one of the deadliest poisons known to man” from The Princess Bride, Iocaine is jokingly depicted as the “deadliest poison known to AI.” While there’s no way of validating that claim, Nagy’s motto is that the more poisoning attacks that are out there, “the merrier.” He told Ars that his primary reasons for building Iocaine were to help rights holders wall off valuable content and stop AI crawlers from crawling with abandon.

Tarpits aren’t perfect weapons against AI

Running malware like Nepenthes can burden servers, too. Aaron likened the cost of running Nepenthes to running a cheap virtual machine on a Raspberry Pi, and Nagy said that serving crawlers Iocaine costs about the same as serving his website.

But Aaron told Ars that Nepenthes wasting resources is the chief objection he’s seen preventing its deployment. Critics fear that deploying Nepenthes widely will not only burden their servers but also increase the costs of powering all that AI crawling for nothing.

“That seems to be what they’re worried about more than anything,” Aaron told Ars. “The amount of power that AI models require is already astronomical, and I’m making it worse. And my view of that is, OK, so if I do nothing, AI models, they boil the planet. If I switch this on, they boil the planet. How is that my fault?”

Aaron also defends against this criticism by suggesting that a broader impact could slow down AI investment enough to possibly curb some of that energy consumption. Perhaps due to the resistance, AI companies will be pushed to seek permission first to scrape or agree to pay more content creators for training on their data.

“Any time one of these crawlers pulls from my tarpit, it’s resources they’ve consumed and will have to pay hard cash for, but, being bullshit, the money [they] have spent to get it won’t be paid back by revenue,” Aaron posted, explaining his tactic online. “It effectively raises their costs. And seeing how none of them have turned a profit yet, that’s a big problem for them. The investor money will not continue forever without the investors getting paid.”

Nagy agrees that the more anti-AI attacks there are, the greater the potential is for them to have an impact. And by releasing Iocaine, Nagy showed that social media chatter about new attacks can inspire new tools within a few days. Marcus Butler, an independent software developer, similarly built his poisoning attack called Quixotic over a few days, he told Ars. Soon afterward, he received messages from others who built their own versions of his tool.

Butler is not in the camp of wanting to destroy AI. He told Ars that he doesn’t think “tools like Quixotic (or Nepenthes) will ‘burn AI to the ground.'” Instead, he takes a more measured stance, suggesting that “these tools provide a little protection (a very little protection) against scrapers taking content and, say, reposting it or using it for training purposes.”

But for a certain sect of Internet users, every little bit of protection seemingly helps. Geuter linked Ars to a list of tools bent on sabotaging AI. Ultimately, he expects that tools like Nepenthes are “probably not gonna be useful in the long run” because AI companies can likely detect and drop gibberish from training data. But Nepenthes represents a sea change, Geuter told Ars, providing a useful tool for people who “feel helpless” in the face of endless scraping and showing that “the story of there being no alternative or choice is false.”

Criticism of tarpits as AI weapons

Critics debating Nepenthes’ utility on Hacker News suggested that most AI crawlers could easily avoid tarpits like Nepenthes, with one commenter describing the attack as being “very crawler 101.” Aaron said that was his “favorite comment” because if tarpits are considered elementary attacks, he has “2 million lines of access log that show that Google didn’t graduate.”

But efforts to poison AI or waste AI resources don’t just mess with the tech industry. Governments globally are seeking to leverage AI to solve societal problems, and attacks on AI’s resilience seemingly threaten to disrupt that progress.

Nathan VanHoudnos is a senior AI security research scientist in the federally funded CERT Division of the Carnegie Mellon University Software Engineering Institute, which partners with academia, industry, law enforcement, and government to “improve the security and resilience of computer systems and networks.” He told Ars that new threats like tarpits seem to replicate a problem that AI companies are already well aware of: “that some of the stuff that you’re going to download from the Internet might not be good for you.”

“It sounds like these tarpit creators just mainly want to cause a little bit of trouble,” VanHoudnos said. “They want to make it a little harder for these folks to get” the “better or different” data “that they’re looking for.”

VanHoudnos co-authored a paper on “Counter AI” last August, pointing out that attackers like Aaron and Nagy are limited in how much they can mess with AI models. They may have “influence over what training data is collected but may not be able to control how the data are labeled, have access to the trained model, or have access to the AI system,” the paper said.

Further, AI companies are increasingly turning to the deep web for unique data, so any efforts to wall off valuable content with tarpits may be coming right when crawling on the surface web starts to slow, VanHoudnos suggested.

But according to VanHoudnos, AI crawlers are also “relatively cheap,” and companies may deprioritize fighting against new attacks on crawlers if “there are higher-priority assets” under attack. And tarpitting “does need to be taken seriously because it is a tool in a toolkit throughout the whole life cycle of these systems. There is no silver bullet, but this is an interesting tool in a toolkit,” he said.

Offering a choice to abstain from AI training

Aaron told Ars that he never intended Nepenthes to be a major project but that he occasionally puts in work to fix bugs or add new features. He said he’d consider working on integrations for real-time reactions to crawlers if there was enough demand.

Currently, Aaron predicts that Nepenthes might be most attractive to rights holders who want AI companies to pay to scrape their data. And many people seem enthusiastic about using it to reinforce robots.txt. But “some of the most exciting people are in the ‘let it burn’ category,” Aaron said. These people are drawn to tools like Nepenthes as an act of rebellion against AI making the Internet less useful and enjoyable for users.

Geuter told Ars that he considers Nepenthes “more of a sociopolitical statement than really a technological solution (because the problem it’s trying to address isn’t purely technical, it’s social, political, legal, and needs way bigger levers).”

To Geuter, a computer scientist who has been writing about the social, political, and structural impact of tech for two decades, AI is the “most aggressive” example of “technologies that are not done ‘for us’ but ‘to us.'”

“It feels a bit like the social contract that society and the tech sector/engineering have had (you build useful things, and we’re OK with you being well-off) has been canceled from one side,” Geuter said. “And that side now wants to have its toy eat the world. People feel threatened and want the threats to stop.”

As AI evolves, so do attacks, with one 2021 study showing that increasingly stronger data poisoning attacks, for example, were able to break data sanitization defenses. Whether these attacks can ever do meaningful destruction or not, Geuter sees tarpits as a “powerful symbol” of the resistance that Aaron and Nagy readily joined.

“It’s a great sign to see that people are challenging the notion that we all have to do AI now,” Geuter said. “Because we don’t. It’s a choice. A choice that mostly benefits monopolists.”

Tarpit creators like Nagy will likely be watching to see if poisoning attacks continue growing in sophistication. On the Iocaine site—which, yes, is protected from scraping by Iocaine—he posted this call to action: “Let’s make AI poisoning the norm. If we all do it, they won’t have anything to crawl.”

Photo of Ashley Belanger

Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.



Nvidia GeForce RTX 5090 costs as much as a whole gaming PC—but it sure is fast


Even setting aside Frame Generation, this is a fast, power-hungry $2,000 GPU.

Credit: Andrew Cunningham

Nvidia’s GeForce RTX 5090 starts at $1,999 before you factor in upsells from the company’s partners or price increases driven by scalpers and/or genuine demand. It costs more than my entire gaming PC.

The new GPU is so expensive that you could build an entire well-specced gaming PC with Nvidia’s next-fastest GPU in it—the $999 RTX 5080, which we don’t have in hand yet—for the same money, or maybe even a little less with judicious component selection. It’s not the most expensive GPU that Nvidia has ever launched—2018’s $2,499 Titan RTX has it beat, and 2022’s RTX 3090 Ti also cost $2,000—but it’s safe to say it’s not really a GPU intended for the masses.

At least as far as gaming is concerned, the 5090 is the very definition of a halo product; it’s for people who demand the best and newest thing regardless of what it costs (the calculus is probably different for deep-pocketed people and companies who want to use them as some kind of generative AI accelerator). And on this front, at least, the 5090 is successful. It’s the newest and fastest GPU you can buy, and the competition is not particularly close. It’s also a showcase for DLSS Multi-Frame Generation, a new feature unique to the 50-series cards that Nvidia is leaning on heavily to make its new GPUs look better than they already are.

Founders Edition cards: Design and cooling

                    RTX 5090     RTX 4090     RTX 5080     RTX 4080 Super
CUDA cores          21,760       16,384       10,752       10,240
Boost clock         2,410 MHz    2,520 MHz    2,617 MHz    2,550 MHz
Memory bus width    512-bit      384-bit      256-bit      256-bit
Memory bandwidth    1,792 GB/s   1,008 GB/s   960 GB/s     736 GB/s
Memory size         32GB GDDR7   24GB GDDR6X  16GB GDDR7   16GB GDDR6X
TGP                 575 W        450 W        360 W        320 W

We won’t spend too long talking about the specific designs of Nvidia’s Founders Edition cards since many buyers will experience the Blackwell GPUs with cards from Nvidia’s partners instead (the cards we’ve seen so far mostly look like the expected fare: gargantuan triple-slot triple-fan coolers, with varying degrees of RGB). But it’s worth noting that Nvidia has addressed a couple of my functional gripes with the 4090/4080-series design.

The first was the sheer dimensions of each card—not an issue unique to Nvidia, but one that frequently caused problems for me as someone who tends toward ITX-based PCs and smaller builds. The 5090 and 5080 FE designs are the same length and height as the 4090 and 4080 FE designs, but they only take up two slots instead of three, which will make them an easier fit for many cases.

Nvidia has also tweaked the cards’ 12VHPWR connector, recessing it into the card and mounting it at a slight angle instead of having it sticking straight out of the top edge. The height of the 4090/4080 FE design made some cases hard to close up once you factored in the additional height of a 12VHPWR cable or Nvidia’s many-tentacled 8-pin-to-12VHPWR adapter. The angled connector still extends a bit beyond the top of the card, but it’s easier to tuck the cable away so you can put the side back on your case.

Finally, Nvidia has changed its cooler—whereas most OEM GPUs mount all their fans on the top of the GPU, Nvidia has historically placed one fan on each side of the card. In a standard ATX case with the GPU mounted parallel to the bottom of the case, this wasn’t a huge deal—there’s plenty of room for that air to circulate inside the case and to be expelled by whatever case fans you have installed.

But in “sandwich-style” ITX cases, where a riser cable wraps around so the GPU can be mounted parallel to the motherboard, the fan on the bottom side of the GPU was poorly placed. In many sandwich-style cases, the GPU fan will dump heat against the back of the motherboard, making it harder to keep the GPU cool and creating heat problems elsewhere besides. The new GPUs mount both fans on the top of the cards.

Nvidia’s Founders Edition cards have had heat issues in the past—most notably the 30-series GPUs—and that was my first question going in. A smaller cooler plus a dramatically higher peak power draw seems like a recipe for overheating.

Temperatures for the various cards we re-tested for this review. The 5090 FE is the toastiest of all of them, but it still has a safe operating temperature.

At least for the 5090, the smaller cooler does mean higher temperatures—around 10 to 12 degrees Celsius higher when running the same benchmarks as the RTX 4090 Founders Edition. And while temperatures of around 77 degrees aren’t hugely concerning, this is sort of a best-case scenario, with an adequately cooled testbed case with the side panel totally removed and ambient temperatures at around 21° or 22° Celsius. You’ll just want to make sure you have a good amount of airflow in your case if you buy one of these.

Testbed notes

A new high-end Nvidia GPU is a good reason to tweak our test bed and suite of games, and we’ve done both here. Mainly, we added a 1050 W Thermaltake Toughpower GF A3 power supply—Nvidia recommends at least 1000 W for the 5090, and this one has a native 12VHPWR connector for convenience. We’ve also swapped the Ryzen 7 7800X3D for a slightly faster Ryzen 7 9800X3D to reduce the odds that the CPU will bottleneck performance as we try to hit high frame rates.

As for the suite of games, we’ve removed a couple of older titles and added some with built-in benchmarks that will tax these GPUs a bit more, especially at 4K with all the settings turned up. Those games include the RT Overdrive preset in the perennially punishing Cyberpunk 2077 and Black Myth: Wukong in Cinematic mode, both games where even the RTX 4090 struggles to hit 60 fps without an assist from DLSS. We’ve also added Horizon Zero Dawn Remastered, a recent release that doesn’t include ray-tracing effects but does support most DLSS 3 and FSR 3 features (including FSR Frame Generation).

We’ve tried to strike a balance between games with ray-tracing effects and games without it, though most AAA games these days include it, and modern GPUs should be able to handle it well (best of luck to AMD with its upcoming RDNA 4 cards).

For the 5090, we’ve run all tests in 4K—if you don’t care about running games in 4K, even if you want super-high frame rates at 1440p or on some kind of ultrawide monitor, the 5090 is probably overkill. When we run upscaling tests, we use the newest DLSS version available for Nvidia cards, the newest FSR version available for AMD cards, and the newest XeSS version available for Intel cards (not relevant here, just stating for the record), and we use the “Quality” setting (at 4K, that equates to an actual rendering resolution of 1440p).
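That 1440p figure is easy to verify: DLSS Quality mode renders at 1/1.5 of the output resolution on each axis, so for a 4K output the arithmetic works out like this:

```python
# DLSS "Quality" renders at 1/1.5 of the output resolution on each axis.
out_w, out_h = 3840, 2160              # 4K output resolution
scale = 1.5                            # Quality-mode per-axis upscaling factor
render_w, render_h = round(out_w / scale), round(out_h / scale)
print(render_w, render_h)              # 2560 1440 -- i.e., 1440p
```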

Rendering performance: A lot faster, a lot more power-hungry

Before we talk about Frame Generation or “fake frames,” let’s compare apples to apples and just examine the 5090’s rendering performance.

The card mainly benefits from four things compared to the 4090: the updated Blackwell GPU architecture, a nearly 33 percent increase in the number of CUDA cores, an upgrade from GDDR6X to GDDR7, and a move from a 384-bit memory bus to a 512-bit bus. It also jumps from 24GB of RAM to 32GB, but games generally aren’t butting up against a 24GB limit yet, so the capacity increase by itself shouldn’t really change performance if all you’re focused on is gaming.
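The bandwidth jump in the spec table follows directly from the wider bus and faster memory. As a quick sanity check (assuming the commonly cited per-pin data rates of 28 Gbps for the 5090’s GDDR7 and 21 Gbps for the 4090’s GDDR6X, which are not stated in this article):

```python
# Peak memory bandwidth = (bus width in bytes) x (per-pin data rate in Gbps).
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(512, 28))   # 1792.0 GB/s -- matches the 5090 spec
print(bandwidth_gb_s(384, 21))   # 1008.0 GB/s -- matches the 4090 spec
```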

And for people who prioritize performance over all else, the 5090 is a big deal—it’s the first consumer graphics card from any company that is faster than a 4090, as Nvidia never spruced up the 4090 last year when it did its mid-generation Super refreshes of the 4080, 4070 Ti, and 4070.

Comparing natively rendered games at 4K, the 5090 is between 17 percent and 40 percent faster than the 4090, with most of the games we tested landing somewhere in the low to high 30 percent range. That’s an undeniably big bump, one that’s roughly commensurate with the increase in the number of CUDA cores. Tests run with DLSS enabled (both upscaling-only and with Frame Generation running in 2x mode) improve by roughly the same amount.

You could find things to be disappointed about if you went looking for them. That 30-something-percent performance increase comes with a 35 percent increase in power use in our testing under load with punishing 4K games—the 4090 tops out around 420 W, whereas the 5090 went all the way up to 573 W, with the 5090 coming closer to its 575 W TDP than the 4090 does to its theoretical 450 W maximum. The 50-series cards use the same TSMC 4N manufacturing process as the 40-series cards, and increasing the number of transistors without changing the process results in a chip that uses more power (though it should be said that capping frame rates, running at lower resolutions, or running less-demanding games can rein in that power use a bit).

Power draw under load goes up by an amount roughly commensurate with performance. The 4090 was already power-hungry; the 5090 is dramatically more so. Credit: Andrew Cunningham

The 5090’s 30-something percent increase over the 4090 might also seem underwhelming if you recall that the 4090 was around 55 percent faster than the previous-generation 3090 Ti while consuming about the same amount of power. To be even faster than a 4090 is no small feat—AMD’s fastest GPU is more in line with Nvidia’s 4080 Super—but if you’re comparing the two cards using the exact same tests, the relative leap is less seismic.

That brings us to Nvidia’s answer for that problem: DLSS 4 and its Multi-Frame Generation feature.

DLSS 4 and Multi-Frame Generation

As a refresher, Nvidia’s DLSS Frame Generation feature, as introduced in the GeForce 40-series, takes DLSS upscaling one step further. The upscaling feature inserted interpolated pixels into a rendered image to make it look like a sharper, higher-resolution image without having to do all the work of rendering all those pixels. DLSS FG would interpolate an entire frame between rendered frames, boosting your FPS without dramatically boosting the amount of work your GPU was doing. If you used DLSS upscaling and FG at the same time, Nvidia could claim that seven out of eight pixels on your screen were generated by AI.

DLSS Multi-Frame Generation (hereafter MFG, for simplicity’s sake) does the same thing, but it can generate one to three interpolated frames for every rendered frame. The marketing numbers have gone up, too; now, 15 out of every 16 pixels on your screen can be generated by AI.
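The marketing arithmetic works out if you assume DLSS Performance-mode upscaling, where one in four output pixels is natively rendered (an assumption on our part; Nvidia doesn’t spell out the mode behind these figures):

```python
# Fraction of on-screen pixels that are AI-generated under DLSS upscaling
# plus Frame Generation, assuming Performance-mode upscaling (1/4 rendered).
rendered_px = 1 / 4                    # upscaling: 1 in 4 output pixels rendered

fg_2x = 1 - rendered_px * (1 / 2)      # 1 rendered frame + 1 generated frame
mfg_4x = 1 - rendered_px * (1 / 4)     # 1 rendered frame + 3 generated frames

print(fg_2x)                           # 0.875  -> 7 of every 8 pixels
print(mfg_4x)                          # 0.9375 -> 15 of every 16 pixels
```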

Nvidia might point to this and say that the 5090 is over twice as fast as the 4090, but that’s not really comparing apples to apples. Expect this issue to persist over the lifetime of the 50-series. Credit: Andrew Cunningham

Nvidia provided reviewers with a preview build of Cyberpunk 2077 with DLSS MFG enabled, which gives us an example of how those settings will be exposed to users. For 40-series cards that only support the regular DLSS FG, you won’t notice a difference in games that support MFG—Frame Generation is still just one toggle you can turn on or off. For 50-series cards that support MFG, you’ll be able to choose from among a few options, just as you currently can with other DLSS quality settings.

The “2x” mode is the old version of DLSS FG and is supported by both the 50-series cards and 40-series GPUs; it promises one generated frame for every rendered frame (two frames total, hence “2x”). The “3x” and “4x” modes are new to the 50-series and promise two and three generated frames (respectively) for every rendered frame. Like the original DLSS FG, MFG can be used in concert with normal DLSS upscaling, or it can be used independently.

One problem with the original DLSS FG was latency—user input was only being sampled at the natively rendered frame rate, meaning you could be looking at 60 frames per second on your display but only having your input polled 30 times per second. Another is image quality; as good as the DLSS algorithms can be at guessing and recreating what a natively rendered pixel would look like, you’ll inevitably see errors, particularly in fine details.

Both these problems contribute to the third problem with DLSS FG: Without a decent underlying frame rate, the lag you feel and the weird visual artifacts you notice will both be more pronounced. So DLSS FG can be useful for turning 120 fps into 240 fps, or even 60 fps into 120 fps. But it’s not as helpful if you’re trying to get from 20 or 30 fps up to a smooth 60 fps.
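A quick model shows why the base frame rate matters so much, assuming input is sampled once per natively rendered frame, as described above (the function name here is our own):

```python
# Frame Generation multiplies displayed frames but not input samples: the
# interval between polled inputs is set by the native frame rate alone.
def feel(base_fps: float, gen_mult: int) -> tuple[float, float]:
    displayed_fps = base_fps * gen_mult
    input_interval_ms = 1000 / base_fps      # unchanged by generated frames
    return displayed_fps, input_interval_ms

print(feel(30, 4))   # looks like 120 fps, but input lags at ~33 ms intervals
print(feel(120, 2))  # looks like 240 fps, and input keeps up at ~8 ms
```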

We’ll be taking a closer look at the DLSS upgrades in the next couple of weeks (including MFG and the new transformer model, which will supposedly increase upscaling quality and supports all RTX GPUs). But in our limited testing so far, the issues with DLSS MFG are basically the same as with the first version of Frame Generation, just slightly more pronounced. In the built-in Cyberpunk 2077 benchmark, the most visible issues are with some bits of barbed-wire fencing, which get smoother-looking and less detailed as you crank up the number of AI-generated frames. But the motion does look fluid and smooth, and the frame rate counts are admittedly impressive.

But as we noted in last year’s 4090 review, the xx90 cards portray FG and MFG in the best light possible since the card is already capable of natively rendering such high frame rates. It’s on lower-end cards where the shortcomings of the technology become more pronounced. Nvidia might say that the upcoming RTX 5070 is “as fast as a 4090 for $549,” and it might be right in terms of the number of frames the card can put up on your screen every second. But responsiveness and visual fidelity on the 4090 will be better every time—AI is a good augmentation for rendered frames, but it’s iffy as a replacement for rendered frames.

A 4090, amped way up

Nvidia’s GeForce RTX 5090. Credit: Andrew Cunningham

The GeForce RTX 5090 is an impressive card—it’s the only consumer graphics card to be released in over two years that can outperform the RTX 4090. The main caveats are its sky-high power consumption and sky-high price; by itself, it costs as much as (and consumes as much power as) an entire mainstream gaming PC. The card is aimed at people who care about speed way more than they care about price, but it’s still worth putting it into context.

The main controversy, as with the 40-series, is how Nvidia talks about its Frame Generation-inflated performance numbers. Frame Generation and Multi-Frame Generation are tools in a toolbox—there will be games where they make things look great and run fast with minimal noticeable impact to visual quality or responsiveness, games where those impacts are more noticeable, and games that never add support for the features at all. (As well-supported as DLSS generally is in new releases, it is incumbent upon game developers to add it—and update it when Nvidia puts out a new version.)

But using those Multi-Frame Generation-inflated FPS numbers to make topline comparisons to last-generation graphics cards just feels disingenuous. No, an RTX 5070 will not be as fast as an RTX 4090 for just $549, because not all games support DLSS MFG, and not all games that do support it will run it well. Frame Generation still needs a good base frame rate to start with, and the slower your card is, the more issues you might notice.

Fuzzy marketing aside, Nvidia is still the undisputed leader in the GPU market, and the RTX 5090 extends that leadership for what will likely be another entire GPU generation, since both AMD and Intel are focusing their efforts on higher-volume, lower-cost cards right now. DLSS is still generally better than AMD’s FSR, and Nvidia does a good job of getting developers of new AAA game releases to support it. And if you’re buying this GPU to do some kind of rendering work or generative AI acceleration, Nvidia’s performance and software tools are still superior. The misleading performance claims are frustrating, but Nvidia still gains a lot of real advantages from being as dominant and entrenched as it is.

The good

  • Usually 30-something percent faster than an RTX 4090
  • Redesigned Founders Edition card is less unwieldy than the bricks that were the 4090/4080 design
  • Adequate cooling, despite the smaller card and higher power use
  • DLSS Multi-Frame Generation is an intriguing option if you’re trying to hit 240 or 360 fps on your high-refresh-rate gaming monitor

The bad

  • Much higher power consumption than the 4090, which already consumed more power than any other GPU on the market
  • Frame Generation is good at making a game that’s running fast run faster; it’s not as good at bringing a slow game up to 60 Hz
  • Nvidia’s misleading marketing around Multi-Frame Generation is frustrating—and will likely be more frustrating for lower-end cards since they aren’t getting the same bumps to core count and memory interface that the 5090 gets

The ugly

  • You can buy a whole lot of PC for $2,000, and we wouldn’t bet on this GPU being easy to find at MSRP

Photo of Andrew Cunningham

Andrew is a Senior Technology Reporter at Ars Technica, with a focus on consumer tech including computer hardware and in-depth reviews of operating systems like Windows and macOS. Andrew lives in Philadelphia and co-hosts a weekly book podcast called Overdue.



Sleeping pills stop the brain’s system for cleaning out waste


Cleanup on aisle cerebellum

A specialized system sends pulses of pressure through the fluids in our brain.

Our bodies rely on their lymphatic system to drain excessive fluids and remove waste from tissues, feeding those back into the bloodstream. It’s a complex yet efficient cleaning mechanism that works in every organ except the brain. “When cells are active, they produce waste metabolites, and this also happens in the brain. Since there are no lymphatic vessels in the brain, the question was what was it that cleaned the brain,” Natalie Hauglund, a neuroscientist at Oxford University who led a recent study on the brain-clearing mechanism, told Ars.

Earlier studies done mostly on mice discovered that the brain had a system that flushed its tissues with cerebrospinal fluid, which carried away waste products in a process called glymphatic clearance. “Scientists noticed that this only happened during sleep, but it was unknown what it was about sleep that initiated this cleaning process,” Hauglund explains.

Her study found the glymphatic clearance was mediated by a hormone called norepinephrine and happened almost exclusively during the NREM sleep phase. But it only worked when sleep was natural. Anesthesia and sleeping pills shut this process down nearly completely.

Taking it slowly

The glymphatic system in the brain was discovered back in 2013 by Dr. Maiken Nedergaard, a Danish neuroscientist and a coauthor of Hauglund’s paper. Since then, there have been numerous studies aimed at figuring out how it worked, but most of them had one problem: they were done on anesthetized mice.

“What makes anesthesia useful is that you can have a very controlled setting,” Hauglund says.

Most brain imaging techniques require a subject, an animal or a human, to be still. In mouse experiments, that meant immobilizing their heads so the research team could get clear scans. “But anesthesia also shuts down some of the mechanisms in the brain,” Hauglund argues.

So, her team designed a study to see how the brain-clearing mechanism works in mice that could move freely in their cages and sleep naturally whenever they felt like it. “It turned out that with the glymphatic system, we didn’t really see the full picture when we used anesthesia,” Hauglund says.

Looking into the brain of a mouse that runs around and wiggles during sleep, though, wasn’t easy. The team pulled it off with a technique called flow fiber photometry, which images fluids tagged with fluorescent markers through an optical fiber implanted in the brain. Once the fibers were implanted, the team put fluorescent tags in the mice’s blood, in their cerebrospinal fluid, and on the norepinephrine hormone. “Fluorescent molecules in the cerebrospinal fluid had one wavelength, blood had another wavelength, and norepinephrine had yet another wavelength,” Hauglund says.

This way, her team could get a fairly precise idea about the brain fluid dynamics when mice were awake and asleep. And it turned out that the glymphatic system basically turned brain tissues into a slowly moving pump.

Pumping up

“Norepinephrine is released from a small area of the brain in the brain stem,” Hauglund says. “It is mainly known as a response to stressful situations. For example, in fight or flight scenarios, you see norepinephrine levels increasing.” Its main effect is causing blood vessels to contract. Still, in more recent research, people found out that during sleep, norepinephrine is released in slow waves that roll over the brain roughly once a minute. This oscillatory norepinephrine release proved crucial to the operation of the glymphatic system.

“When we used the flow fiber photometry method to look into the brains of mice, we saw these slow waves of norepinephrine, but we also saw how it works in synchrony with fluctuation in the blood volume,” Hauglund says.

Every time the norepinephrine level went up, it caused the contraction of the blood vessels in the brain, and the blood volume went down. At the same time, the contraction increased the volume of the perivascular spaces around the blood vessels, which were immediately filled with the cerebrospinal fluid.

When the norepinephrine level went down, the process worked in reverse: the blood vessels dilated, letting the blood in and pushing the cerebrospinal fluid out. “What we found was that norepinephrine worked a little bit like a conductor of an orchestra and makes the blood and cerebrospinal fluid move in synchrony in these slow waves,” Hauglund says.

And because the study was designed to monitor this process in freely moving, undisturbed mice, the team learned exactly when all this was going on. When mice were awake, the norepinephrine levels were much higher but relatively steady. The team observed the opposite during the REM sleep phase, where the norepinephrine levels were consistently low. The oscillatory behavior was present exclusively during the NREM sleep phase.

So, the team wanted to check how the glymphatic clearance would work when they gave the mice zolpidem, a sleeping drug that had been proven to increase NREM sleep time. In theory, zolpidem should have boosted brain-clearing. But it turned it off instead.

Non-sleeping pills

“When we looked at the mice after giving them zolpidem, we saw they all fell asleep very quickly. That was expected—we take zolpidem because it makes it easier for us to sleep,” Hauglund says. “But then we saw those slow fluctuations in norepinephrine, blood volume, and cerebrospinal fluid almost completely stopped.”

No fluctuations meant the glymphatic system didn’t remove any waste. This was a serious issue, because one of the cellular waste products it is supposed to remove is amyloid beta, found in the brains of patients suffering from Alzheimer’s disease.

Hauglund speculates that zolpidem may induce a state very similar to sleep while shutting down important processes that happen during sleep. And while heavy zolpidem use has been associated with an increased risk of Alzheimer’s disease, it is not clear whether that risk arises because the drug inhibits oscillatory norepinephrine release in the brain. To better understand this, Hauglund wants to get a closer look at how the glymphatic system works in humans.

“We know we have the same wave-like fluid dynamics in the brain, so this could also drive the brain clearance in humans,” Haugland told Ars. “Still, it’s very hard to look at norepinephrine in the human brain because we need an invasive technique to get to the tissue.”

But she said norepinephrine levels in people can be estimated based on indirect clues. One of them is pupil dilation and contraction, which works in synchrony with norepinephrine levels. Another clue may lie in microarousals—very brief, imperceptible awakenings that, Hauglund thinks, may be correlated with the brain-clearing mechanism. “I am currently interested in this phenomenon […]. Right now we have no idea why microarousals are there or what function they have,” Hauglund says.

The last step on her roadmap is making better sleeping pills. “We need sleeping drugs that don’t have this inhibitory effect on the norepinephrine waves. If we can have a sleeping pill that helps people sleep without disrupting their sleep at the same time, it will be very important,” Hauglund concludes.

Cell, 2025. DOI: 10.1016/j.cell.2024.11.027

Photo of Jacek Krywko

Jacek Krywko is a freelance science and technology writer who covers space exploration, artificial intelligence research, computer science, and all sorts of engineering wizardry.

Sleeping pills stop the brain’s system for cleaning out waste Read More »

fire-destroys-starship-on-its-seventh-test-flight,-raining-debris-from-space

Fire destroys Starship on its seventh test flight, raining debris from space

This launch debuted a more advanced, slightly taller version of Starship, known as Version 2 or Block 2, with larger propellant tanks, a new avionics system, and redesigned feed lines flowing methane and liquid oxygen propellants to the ship’s six Raptor engines. SpaceX officials did not say whether any of these changes might have caused the problem on Thursday’s launch.

SpaceX officials have repeatedly and carefully set expectations for each Starship test flight. They routinely refer to the rocket as experimental, and the primary focus of the rocket’s early demo missions is to gather data on the performance of the vehicle. What works, and what doesn’t work?

Still, the outcome of Thursday’s test flight is a clear disappointment for SpaceX. This was the seventh test flight of SpaceX’s enormous rocket and the first time Starship failed to complete its launch sequence since the second flight in November 2023. Until now, SpaceX has made steady progress, and each Starship flight has achieved more milestones than the one before.

On the first flight in April 2023, the rocket lost control a little more than two minutes after liftoff, and the ground-shaking power of the booster’s 33 engines shattered the concrete foundation beneath the launch pad. Seven months later, on Flight 2, the rocket made it eight minutes before failing. On that mission, Starship failed at roughly the same point of its ascent, just before the cutoff of the vehicle’s six methane-fueled Raptor engines.

Back then, a handful of images from the Florida Keys and Puerto Rico showed debris in the sky after Starship activated its self-destruct mechanism due to an onboard fire caused by a dump of liquid oxygen propellant. But that flight occurred in the morning, with bright sunlight along the ship’s flight path.

This time, the ship disintegrated and reentered the atmosphere at dusk, with impeccable lighting conditions accentuating the debris cloud’s appearance. These twilight conditions likely contributed to the plethora of videos posted to social media on Thursday.

Starship and Super Heavy head downrange from SpaceX’s launch site near Brownsville, Texas. Credit: SpaceX

The third Starship test flight last March saw the spacecraft reach its planned trajectory and fly halfway around the world before succumbing to the scorching heat of atmospheric reentry. In June, the fourth test flight ended with controlled splashdowns of the rocket’s Super Heavy booster in the Gulf of Mexico and of Starship in the Indian Ocean.

In October, SpaceX caught the Super Heavy booster with mechanical arms at the launch pad for the first time, proving out the company’s audacious approach to recovering and reusing the rocket. On this fifth test flight, SpaceX modified the ship’s heat shield to better handle the hot temperatures of reentry, and the vehicle again made it to an on-target splashdown in the Indian Ocean.

Most recently, Flight 6 on November 19 demonstrated the ship’s ability to reignite its Raptor engines in space for the first time and again concluded with a bullseye splashdown. But SpaceX aborted an attempt to again catch the booster back at Starbase due to a problem with sensors on the launch pad’s tower.

With Flight 7, SpaceX hoped to test more changes to the heat shield protecting Starship from reentry temperatures up to 2,600° Fahrenheit (1,430° Celsius). Musk has identified the heat shield as one of the most difficult challenges still facing the program. In order for SpaceX to reach its ambition for the ship to become rapidly reusable, with minimal or no refurbishment between flights, the heat shield must be resilient and durable.

Fire destroys Starship on its seventh test flight, raining debris from space Read More »

the-8-most-interesting-pc-monitors-from-ces-2025

The 8 most interesting PC monitors from CES 2025


Monitors worth monitoring

Here are upcoming computer screens with features that weren’t around last year.

Yes, that’s two monitors in a suitcase.


Plenty of computer monitors debuted at the Consumer Electronics Show (CES) in Las Vegas this year, but many of the updates at this year’s event were pretty minor and could easily have been part of 2024’s show.

But some brought new and interesting features to the table for 2025—in this article, we’ll tell you all about them.

LG’s 6K monitor

Pixel addicts are always right at home at CES, and the most interesting high-resolution computer monitor to come out of this year’s show is the LG UltraFine 6K Monitor (model 32U990A).

People seeking more than 3840×2160 resolution have limited options, and they’re all rather expensive (looking at you, Apple Pro Display XDR). LG’s 6K monitor means there’s another option for professionals needing extra pixels for development, engineering, and creative work. And LG’s 6144×3456, 32-inch display has extra oomph thanks to something no other 6K monitor has: Thunderbolt 5.

This is the only image LG provided for the monitor. Credit: LG

LG hasn’t confirmed the refresh rate of its 6K monitor, so we don’t know how much bandwidth it needs. But it’s possible that pairing the UltraFine with a Thunderbolt 5 PC could trigger Bandwidth Boost, a Thunderbolt 5 feature that automatically increases bandwidth from 80Gbps to 120Gbps. For comparison, Thunderbolt 4 maxes out at 40Gbps. Thunderbolt 5 also requires 140 W power delivery and maxes out at 240 W. That’s a notable bump from Thunderbolt 4’s 100–140 W.

Considering that Apple’s only 6K monitor has Thunderbolt 3, Thunderbolt 5 is a differentiator. With this capability, the LG UltraFine is, ironically, better equipped in this regard for use with the new MacBook Pros and Mac mini (which all have Thunderbolt 5) than Apple’s own monitors are. LG may be aware of this, as the 32U990A’s aesthetic could be considered very Apple-like.

Inside the 32U990A’s silver chassis is a Nano IPS panel. In recent years, LG has advertised its Nano IPS panels as having “nanometer-sized particles” applied to their LED backlight to absorb “excess, unnecessary light wavelengths” for “richer color expression.” LG’s 6K monitor claims to cover 98 percent of DCI-P3 and 99.5 percent of Adobe RGB. IPS Black monitors, meanwhile, have higher contrast ratios (up to 3,000:1) than standard IPS panels. However, LG has released Nano IPS monitors with 2,000:1 contrast, the same contrast ratio as Dell’s 6K, IPS Black monitor.

LG hasn’t shared other details, like price or a release date. But the monitor may cost more than Dell’s Thunderbolt 4-equipped monitor, which is currently $2,480.

Brelyon’s multi-depth monitor


Someone from CNET using the Ultra Reality Extend. Credit: CNET/YouTube

Brelyon is headquartered in San Mateo, California, and was founded by scientists and executives from MIT, IMAX, UCF, and DARPA. It’s been selling display technology for commercial and defense applications since 2022. At CES, the company unveiled the Ultra Reality Extend, describing it as an “immersive display line that renders virtual images in multiple depths.”

“As the first commercial multi-focal monitor, the Extend model offers multi-depth programmability for information overlay, allowing users to see images from 0.7 m to as far as 2.5 m of depth virtually rendered behind the monitor; organizing various data streams at different depth layers, or triggering focal cues to induce an ultra immersive experience akin to looking out through a window,” Brelyon’s announcement said.

Brelyon says the monitor runs 4K at 60 Hz with 1 bit of monocular depth for an 8K effect. The monitor includes “OLED-based curved 2D virtual images, with the largest stretching to 122 inches and extending 2.5 meters deep, viewable through a 30-inch frame,” according to the firm’s announcement. The closer you sit, the greater the field of view you get.

The Extend leverages “new GPU capabilities to process light and video signals inside our display platforms,” Brelyon CEO Barmak Heshmat said in a statement this week. He added: “We are thinking beyond headsets and glasses, where we can leverage GPU capabilities to do real-time driving of higher-bandwidth display interfaces.”

Brelyon says this was captured from the Extend, with its camera lens focus changing from 70 cm to 2,500 cm. Credit: Brelyon

Advancements in AI-based video processing, as well as other software advancements and hardware improvements, purportedly enable the Extend to upscale lower-dimension streams to multiple, higher-dimension ones. Brelyon describes its product as a “generative display system” that uses AI computation and optics to assign different depth values to content in real time for rendering images and information overlays.

The idea of a virtual monitor that surpasses the field of view of typical desktop monitors while allowing users to see the real world isn’t new. Tech firms (including many at CES) usually try to accomplish this through AR glasses. But head-mounted displays still struggle with problems like heat, weight, computing resources, battery, and aesthetics.

Brelyon’s monitor seemingly demoed well at CES. Sam Rutherford, a senior writer at Engadget, watched a clip from the Marvel’s Spider-Man video game on the Extend and said that “trees and light poles whipping past in my face felt so real I started to flinch subconsciously.” He added that the monitor separated “different layers of the content to make snow in the foreground look blurry as it whipped across the screen, while characters in the distance” still looked sharp.

The monitor costs $5,000 to $8,000 depending on how you’ll use it and whether you have other business with Brelyon, per Engadget, and CES is one of the few places where people could actually see the display in action.

Samsung’s 3D monitor


Samsung’s depiction of the 3D effect of its 3D PC monitor. Credit: Samsung

It’s 2025, and tech companies are still trying to convince people to bring a 3D display into their homes. This week, Samsung took its first swing since 2009 at 3D screens with the Odyssey 3D monitor.

In lieu of 3D glasses, the Odyssey 3D achieves its 3D effect with a lenticular lens “attached to the front of the panel and its front stereo camera,” Samsung says, as well as eye tracking and view mapping. Unlike other recent 3D monitors, the Odyssey 3D claims to be able to make 2D content look three-dimensional even if that content doesn’t officially support 3D.

You can find more information in our initial coverage of Samsung’s Odyssey 3D, but don’t bet on finding 3D monitors in many people’s homes soon. The technology for quality 3D displays that work without glasses has been around for years but still has never taken off.

Dell’s OLED productivity monitor

With improvements in burn-in, availability, and brightness, finding OLED monitors today is much easier than it was two years ago. But a lot of the OLED monitors released recently target gamers with features like high refresh rates, ultrawide panels, and RGB. These features are unneeded or unwanted by non-gamers but contribute to OLED monitors’ already high pricing. Numerous smaller OLED monitors were announced at CES, with 27-inch, 4K models being a popular addition. Most of them are still high-refresh gaming monitors, though.

The Dell 32-inch QD-OLED, on the other hand, targets “play, school, and work,” Dell’s announcement says. And its naming (based on a new naming convention Dell announced this week that kills XPS and other longstanding branding) signals that this is a mid-tier monitor from Dell’s entry-level lineup.


OLED for normies. Credit: Dell

The monitor’s specs, which include a 120 Hz refresh rate, AMD FreeSync Premium, and USB-C power delivery at up to 90 W, make it a good fit for pairing with many mainstream laptops.

Dell also says this is the first QD-OLED with spatial audio, which uses head tracking to alter audio coming from the monitor’s five 5 W speakers. This is a feature we’ve seen before, but not on an OLED monitor.

For professionals and/or Mac users who prefer the sleek looks, reputation, higher power delivery, and I/O hubs associated with Dell’s popular UltraSharp line, Dell made two more notable announcements at CES: an UltraSharp 32 4K Thunderbolt Hub Monitor (U3225QE) coming out on February 25 for $950 and an UltraSharp 27 4K Thunderbolt Hub Monitor (U2725QE) coming out the same day for $700.

The suitcase monitors

Before we get into the Base Case, please note that this product has no release date because its creators plan to go to market via crowdfunding. Base Case says it will launch its Indiegogo campaign next month, but even then, we don’t know if the project will be funded, if any final product will work as advertised, or if customers will receive orders in a timely fashion. Still, this is one of the most unusual monitors at CES, and it’s worth discussing.

The Base Case is shaped like a 24x14x16.5-inch rolling suitcase, but when you open it up, you’ll find two 24-inch monitors for connecting to a laptop. Each screen reportedly has a 1920×1080 resolution, a 75 Hz refresh rate, and a max brightness claim of 350 nits. Base Case is also advertising PC and Mac support (through DisplayLink), as well as HDMI, USB-C, USB-A, Thunderbolt, and Ethernet ports. Telescoping legs allow the case to rise 10 inches so the display can sit closer to eye level.

Ultimately, the Base Case would see owners lug around a 20-pound product for the ability to quickly create a dual-monitor setup equipped with a healthy amount of I/O. Tom’s Guide demoed a prototype at CES and reported that the monitors took “seconds to set up.”

In case you’re worried that the Base Case prioritizes displays over storage, note that its makers plan on adding a front pocket to the suitcase that can fit a laptop. The pocket wasn’t on the prototype Tom’s Guide saw, though.

Again, this is far from a finalized product, but Base Case has alluded to a $2,400 starting price. For comparison to other briefcase-locked displays—and yes, doing this is possible—LG’s StanbyME Go (27LX5QKNA) tablet in a briefcase currently has a $1,200 MSRP.

Corsair’s PC-mountable touchscreen

A promotional image of the touchscreen.

If the Base Case is on the heftier side of portable monitors, Corsair’s Xeneon Edge is certainly on the minute side. The 14.5-inch LCD touchscreen isn’t meant to be a primary display, though. Corsair built it as a secondary screen for providing quick information, like the song your computer is playing, the weather, the time, and calendar events. You could also use the 2560×720 pixels to display system information, like component usage and temperatures.

Corsair says its iCue software will be able to provide system information on the Xeneon, but because the Xeneon Edge works like a regular monitor, you could (and likely would prefer to) use your own methods. Still, the Xeneon Edge stands out from other small, touchscreen PC monitors with its clean UI that can succinctly communicate a lot of information on the tiny display at once.

Specs-wise, this is a 60 Hz IPS panel with 5-point capacitive touch. Corsair says the monitor can hit 350 nits of brightness.

You can connect the Xeneon Edge to a computer via USB-C (DisplayPort Alt mode) or HDMI. There are also screw holes, so PC builders could install it via a 360 mm radiator mounting point inside their PC case.

Alternatively, Corsair recommends attaching the touchscreen to the outside of a PC case through the monitor’s 14 integrated magnets. Corsair said in a blog post that the “magnets are underneath the plastic casing so the metal surface you stick it to won’t get scratched.” Or, in traditional portable monitor style, the Xeneon Edge could also just sit on a desk with its included stand.


Corsair demos different ways the screen could attach to a case. Credit: TechPowerUp/YouTube

Corsair plans to release the Xeneon Edge in Q2. Expected pricing is “around $249,” Tom’s Hardware reported.

MSI’s side panel display panel

Why attach a monitor to your PC case when you can turn your PC case into a monitor instead?

MSI says that the touchscreen embedded into this year’s MEG Vision X AI 2nd gaming desktop’s side panel can work like a regular computer monitor. Similar to Corsair’s monitor, MSI’s display has a corresponding app that can show system information and other customizations, which you can toggle with controls on the front of the case, PCMag reported.

MSI used an IPS panel with 1920×1080 resolution for the display, which also has an integrated mic and speaker. MSI says “electric vehicle control centers” inspired the design. We’ve seen similar PC cases, like iBuyPower’s more translucent side panel display and the touchscreen on Hyte’s pentagonal PC case, before. But MSI is bringing the design to a more mainstream form factor by including it in a prebuilt desktop, potentially opening the door for future touchscreen-equipped desktops.

Considering the various locations people place their desktops and the different angles at which they may try to look at this screen, I’m curious about the monitor’s viewing angles and brightness. IPS seems like a good choice since it tends to have strong image quality when viewed from different angles. A video PCMag shot from the show floor shows images on the monitor appearing visible and lively:

Hands on with MSI’s MEG Vision X AI Desktop: Now, your PC tower’s a monitor, too.

World’s fastest monitor

There’s a competitive air at CES that leads tech brands to try to one-up each other on spec sheets. Some of the most heated competition concerns monitor refresh rates; for years, we’ve been meeting the new world’s fastest monitor at CES. This year is no different.

The brand behind the monitor is Koorui, a three-year-old Chinese firm whose website currently lists monitors and keyboards. Koorui hasn’t confirmed when it will make its 750 Hz display available, where it will sell it, or what it will cost. That should bring some skepticism about this product actually arriving for purchase in the US. However, Koorui did bring the display to the CES show floor.

The speedy display had a refresh rate test running at CES, and according to several videos we’ve seen from attendees, the monitor appeared to consistently hit the 750 Hz mark.

World’s first 750Hz monitor???

For those keeping track, high-end gaming monitors—namely ones targeting professional gamers—hit 360 Hz in 2020. Koorui’s announcement means max monitor refresh rates have increased 108.3 percent in five years.

One CES attendee noticed, however, that the monitor wasn’t showing any gameplay. This could be due to the graphical and computing prowess needed to demonstrate the benefits of a 750 Hz monitor. A system capable of 750 frames per second would give people a chance to see if they could detect improved motion resolution but would also be very expensive. It’s also possible that the monitor Koorui had on display wasn’t ready for that level of scrutiny yet.

Like many eSports monitors, the Koorui is 24.5 inches, with a resolution of 1920×1080. Perhaps more interesting than Koorui taking the lead in the perennial race for higher refresh rates is the TN monitor’s claimed color capabilities. TN monitors aren’t as popular as they were years ago, but OEMs still employ them sometimes for speed.

They tend to be less colorful than IPS and VA monitors, though. Most offer sRGB color gamuts instead of covering the larger DCI-P3 color space. Asus’ 540 Hz ROG Swift Pro PG248QP, for example, is a TN monitor claiming 125 percent sRGB coverage. Koorui’s monitor claims to cover 95 percent of DCI-P3, due to the use of a quantum dot film. Again, there’s a lot that prospective shoppers should confirm about this monitor if it becomes available.

For those seeking the fastest monitors with more concrete release plans, several companies announced 600 Hz monitors coming out this year. Acer, for example, has a 600 Hz Nitro XV240 F6 (also a TN monitor) that it plans to release in North America this quarter at a starting price of $600.

Photo of Scharon Harding

Scharon is a Senior Technology Reporter at Ars Technica writing news, reviews, and analysis on consumer gadgets and services. She’s been reporting on technology for over 10 years, with bylines at Tom’s Hardware, Channelnomics, and CRN UK.

The 8 most interesting PC monitors from CES 2025 Read More »

how-i-program-with-llms

How I program with LLMs

The second issue is that we can do better. I am happy we now live in a time when programmers write their own tests, but we do not hold ourselves to the same standards with tests as we do with production code. That is a reasonable trade-off; there are only so many hours in the day. But what LLMs lack in arithmetical prowess, they make up for in enthusiasm.

Let’s ask for an even better test.

In the tests, implement the simplest, most readable version of the standard code for quartiles over a fixed set of known values in a slice. Then pass the test cases through the standard code and the reservoir sampler and confirm they are within an epsilon of each other. Structure the comparison code such that it can be used in a fuzz test, too.

This got us some new test code:

// referenceQuartiles calculates the exact quartiles for a slice of float64 values
// using linear interpolation, matching the behavior expected from the sampler.
func referenceQuartiles(data []float64) (q1, median, q3 float64) { … }

// compareQuartiles checks if two sets of quartiles are within epsilon of each other.
// Returns true if they match within the tolerance, false otherwise.
func compareQuartiles(q1a, meda, q3a, q1b, medb, q3b, epsilon float64) bool { … }

// checkQuartiles is a test helper that compares sampler output against the reference
// implementation and reports any differences.
func checkQuartiles(t *testing.T, data []float64, epsilon float64) {
	t.Helper()

	// Get reference values
	wantQ1, wantMed, wantQ3 := referenceQuartiles(data)

	// Get sampler values using a large reservoir for accuracy
	qs := NewQuartileSampler(1000)
	for _, v := range data {
		qs.Add(v)
	}
	gotQ1, gotMed, gotQ3 := qs.Quartiles()

	if !compareQuartiles(gotQ1, gotMed, gotQ3, wantQ1, wantMed, wantQ3, epsilon) {
		t.Errorf("Quartiles mismatch:\ngot  (q1=%v, med=%v, q3=%v)\nwant (q1=%v, med=%v, q3=%v)\nepsilon=%v",
			gotQ1, gotMed, gotQ3, wantQ1, wantMed, wantQ3, epsilon)
	}
}
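The body of referenceQuartiles is elided above, but one plausible shape for it, sketched by us rather than taken from the LLM’s actual output, is a sort followed by linear interpolation between closest ranks (the quantile helper and its interpolation convention are our assumptions):

```go
package main

import (
	"fmt"
	"sort"
)

// quantile returns the q-th quantile (0 to 1) of already-sorted data,
// linearly interpolating between the two closest ranks. Hypothetical
// helper; the article's real implementation may use a different convention.
func quantile(sorted []float64, q float64) float64 {
	if len(sorted) == 0 {
		return 0
	}
	pos := q * float64(len(sorted)-1)
	lo := int(pos)
	frac := pos - float64(lo)
	if lo+1 >= len(sorted) {
		return sorted[len(sorted)-1]
	}
	return sorted[lo]*(1-frac) + sorted[lo+1]*frac
}

func referenceQuartiles(data []float64) (q1, median, q3 float64) {
	// Copy before sorting so the caller's slice is untouched.
	sorted := append([]float64(nil), data...)
	sort.Float64s(sorted)
	return quantile(sorted, 0.25), quantile(sorted, 0.5), quantile(sorted, 0.75)
}

func main() {
	q1, med, q3 := referenceQuartiles([]float64{1, 2, 3, 4, 5})
	fmt.Println(q1, med, q3) // 2 3 4
}
```

The point of keeping this version dead simple is that it serves as an independent oracle: any bug it shares with the reservoir sampler would have to be coincidental.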

The original test from above has been reworked to use checkQuartiles, and we have something new:

func FuzzQuartileSampler(f *testing.F) {
	// Add some seed corpus
	f.Add([]float64{1, 2, 3, 4, 5})

	f.Fuzz(func(t *testing.T, data []float64) {
		// Use a larger epsilon for fuzzing since we might get more extreme values
		checkQuartiles(t, data, 0.2)
	})
}

This is fun because it’s wrong. My running gopls tool immediately says:

fuzzing arguments can only have the following types:
    string, bool, float32, float64,
    int, int8, int16, int32, int64,
    uint, uint8, uint16, uint32, uint64,
    []byte

Pasting that error back into the LLM gets it to regenerate the fuzz test such that it is built around a func(t *testing.T, data []byte) function that uses math.Float64frombits to extract floats from the data slice. Interactions like this point us toward automating the feedback from tools; all it needed was the obvious error message to make solid progress toward something useful. I was not needed.
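The regenerated fuzz test isn’t shown, but its core trick is decoding the fuzzer’s []byte into float64s with math.Float64frombits. A minimal sketch of how that decoding could work (the helper name and the NaN/infinity filtering are our assumptions, not the LLM’s actual output):

```go
package main

import (
	"encoding/binary"
	"fmt"
	"math"
)

// floatsFromBytes reinterprets a fuzzer-supplied byte slice as float64s,
// eight bytes at a time, dropping NaNs and infinities so that quartile
// comparisons against the reference implementation stay meaningful.
func floatsFromBytes(data []byte) []float64 {
	floats := make([]float64, 0, len(data)/8)
	for len(data) >= 8 {
		bits := binary.LittleEndian.Uint64(data[:8])
		f := math.Float64frombits(bits)
		if !math.IsNaN(f) && !math.IsInf(f, 0) {
			floats = append(floats, f)
		}
		data = data[8:] // any trailing partial chunk is ignored
	}
	return floats
}

func main() {
	// Round-trip a single value: encode 1.5 as eight little-endian bytes.
	buf := make([]byte, 8)
	binary.LittleEndian.PutUint64(buf, math.Float64bits(1.5))
	fmt.Println(floatsFromBytes(buf)) // [1.5]
}
```

Inside the fuzz target, such a helper would sit between the func(t *testing.T, data []byte) signature gopls demands and the existing checkQuartiles call.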

A quick survey of the last few weeks of my LLM chat history (which, as I mentioned earlier, is not a proper quantitative analysis by any measure) suggests that when there is a tooling error, the LLM can make useful progress more than 80 percent of the time without me adding any insight. About half the time, it can completely resolve the issue without me saying anything of note. I am just acting as the messenger.

How I program with LLMs Read More »

tech-worker-movements-grow-as-threats-of-rto,-ai-loom

Tech worker movements grow as threats of RTO, AI loom


Advocates say tech worker movements got too big to ignore in 2024.

Credit: Aurich Lawson | Getty Images

It feels like tech workers have caught very few breaks over the past several years, between ongoing mass layoffs, stagnating wages amid inflation, AI supposedly coming for jobs, and unpopular orders to return to office that, for many, threaten to disrupt work-life balance.

But in 2024, a potentially critical mass of tech workers seemed to reach a breaking point. As labor rights groups advocating for tech workers told Ars, these workers are banding together in sustained strong numbers and are either winning or appear tantalizingly close to winning better worker conditions at major tech companies, including Amazon, Apple, Google, and Microsoft.

In February, the industry-wide Tech Workers Coalition (TWC) noted that “the tech workers movement is far more expansive and impactful” than even labor rights advocates realized, noting that unionized tech workers have gone beyond early stories about Googlers marching in the streets and now “make the headlines on a daily basis.”

Ike McCreery, a TWC volunteer and ex-Googler who helped found the Alphabet Workers Union, told Ars that although “it’s hard to gauge numerically” how much movements have grown, “our sense is definitely that the momentum continues to build.”

“It’s been an exciting year,” McCreery told Ars, while expressing particular enthusiasm that even “highly compensated tech workers are really seeing themselves more as workers” in these fights—which TWC “has been pushing for a long time.”

In 2024, TWC broadened efforts to help workers organize industry-wide, helping everyone from gig workers to project managers build both union and non-union efforts to push for change in the workplace.

Such widespread organizing “would have been unthinkable only five years ago,” TWC noted in February, and it’s clear from some of 2024’s biggest wins that some movements are making gains that could further propel that momentum in 2025.

Workers could also gain the upper hand if unpopular policies increase what one November study called “brain drain.” That’s a trend where tech companies adopting potentially alienating workplace tactics risk losing top talent at a time when key industries like AI and cybersecurity are facing severe talent shortages.

Advocates told Ars that unpopular policies have always fueled workers movements, and RTO and AI are just the latest adding fuel to the fire. As many workers prepare to head back to offices in 2025 where worker surveillance is only expected to intensify, they told Ars why they expect to see workers’ momentum continue at some of the world’s biggest tech firms.

Tech worker movements growing

In August, Apple ratified a labor contract at America’s first unionized Apple Store—agreeing to a modest increase in wages, about 10 percent over three years. While small, that win came just a few weeks before the National Labor Relations Board (NLRB) determined that Amazon was a joint employer of unionized contract-based delivery drivers. And Google lost a similar fight last January when the NLRB ruled it must bargain with a union representing YouTube Music contract workers, Reuters reported.

For many workers, joining these movements helped raise wages. In September, facing mounting pressure, Amazon raised warehouse worker wages—investing $2.2 billion, its “biggest investment yet,” to broadly raise base salaries for workers. And more recently, Amazon was hit with a strike during the busy holiday season, as warehouse workers hoped to further hobble the company during a clutch financial quarter to force more bargaining. (Last year, Amazon posted record-breaking $170 billion holiday quarter revenues and has said the current strike won’t hurt revenues.)

Even typically union-friendly Microsoft drew worker backlash and criticism in 2024 following layoffs of 650 video game workers in September.

These mass layoffs are driving some workers to join movements. A senior director for organizing with Communications Workers of America (CWA), Tom Smith, told Ars that shortly after the 600-member Tech Guild—”the largest single certified group of tech workers” to organize at the New York Times—reached a tentative deal to increase wages “up to 8.25 percent over the length of the contract,” about “460 software engineers at a video game company owned by Microsoft successfully unionized.”

Smith told Ars that while workers for years have pushed for better conditions, “these large units of tech workers achieving formal recognition, building lasting organization, and winning contracts” at “a more mass scale” are maturing, following in the footsteps of unionizing Googlers and today influencing a broader swath of tech industry workers nationwide. From CWA’s viewpoint, workers in the video game industry seem best positioned to seek major wins next, Smith suggested, likely starting with Microsoft-owned companies and eventually affecting indie game companies.

CWA, TWC, and Tech Workers Union 1010 (a group run by tech workers that’s part of the Office and Professional Employees International Union) all now serve as dedicated groups supporting workers movements long-term, and that stability has helped these movements mature, McCreery told Ars. Each group plans to continue meeting workers where they are to support and help expand organizing in 2025.

Cost of RTOs may be significant, researchers warn

While layoffs likely remain the most extreme threat to tech workers broadly, a return-to-office (RTO) mandate can be just as jarring for remote tech workers who are either unable to comply or else unwilling to give up the better work-life balance that comes with no commute. Advocates told Ars that RTO policies have pushed workers to join movements, while limited research suggests that companies risk losing top talents by implementing RTO policies.

In perhaps the biggest example from 2024, when Amazon announced that it would require workers in the office five days a week starting next year, a poll on Blind, the anonymous platform where workers discuss employers, found that an overwhelming majority of more than 2,000 Amazon employees were “dissatisfied.”

“My morale for this job is gone…” one worker said on Blind.

Workers criticized the “non-data-driven logic” of the RTO mandate, prompting an Amazon executive to remind them that they could take their talents elsewhere if they didn’t like it. Many confirmed that’s exactly what they planned to do. (Amazon later announced it would be delaying RTO for many office workers after belatedly realizing there was a lack of office space.)

Other companies mandating RTO faced similar backlash from workers, who continued to question the logic driving the decision. One February study showed that RTO mandates don’t make companies any more valuable but do make workers more miserable. And last month, Brian Elliott, an executive advisor who wrote a book about the benefits of flexible teams, noted that only one in three executives thinks RTO had “even a slight positive impact on productivity.”

But not every company drew a hard line the way that Amazon did. For example, Dell gave workers a choice to remain remote and accept they can never be eligible for promotions, or mark themselves as hybrid. Workers who refused the RTO said they valued their free time and admitted to looking for other job opportunities.

Very few studies have been done analyzing the true costs and benefits of RTO, a November academic study titled “Return to Office and Brain Drain” said, and so far companies aren’t necessarily backing the limited findings. The researchers behind that study noted that “the only existing study” measuring how RTO impacts employee turnover showed this year that senior employees left for other companies after Microsoft’s RTO mandate, but Microsoft disputed that finding.

Seeking to build on this research, the November study tracked “over 3 million tech and finance workers’ employment histories reported on LinkedIn” and analyzed “the effect of S&P 500 firms’ return-to-office (RTO) mandates on employee turnover and hiring.”

Choosing to only analyze the firms requiring five days in office, the final sample covered 54 RTO firms, including big tech companies like Amazon, Apple, and Microsoft. From that sample, researchers concluded that average employee turnover increased by 14 percent after RTO mandates at bigger firms. And since big firms typically have lower turnover, the increase in turnover is likely larger at smaller firms, the study’s authors concluded.

The study also supported the conclusion that “employees with the highest skill level are more likely to leave” and found that “RTO firms take significantly longer time to fill their job vacancies after RTO mandates.”

“Together, our evidence suggests that RTO mandates are costly to firms and have serious negative effects on the workforce,” the study concluded, echoing some remote workers’ complaints about the seemingly non-data-driven logic of RTO, while urging that further research is needed.

“These turnovers could potentially have short-term and long-term effects on operation, innovation, employee morale, and organizational culture,” the study concluded.

A co-author of the “brain drain” study, Mark Ma, told Ars that by contrast, Glassdoor going fully remote at least anecdotally seemed to “significantly” increase the number and quality of applications—possibly also improving retention by offering the remote flexibility that many top talents today require.

Ma said that next his team hopes to track where people who leave firms over RTO policies go next.

“Do they become self-employed, or do they go to a competitor, or do they fund their own firm?” Ma speculated, hoping to trace these patterns more definitively over the next several years.

Additionally, Ma plans to investigate individual firms’ RTO impacts, as well as impacts on niche classes of workers with highly sought-after skills—such as in areas like AI, machine learning, or cybersecurity—to see if it’s easier for them to find other jobs. In the long-term, Ma also wants to monitor for potentially less-foreseeable outcomes, such as RTO mandates possibly increasing firms’ number of challengers in their industry.

Will RTO mandates continue in 2025?

Many tech workers may be wondering if there will be a spike in return-to-office mandates in 2025, especially since one of the most politically influential figures in tech, Elon Musk, recently reiterated that he thinks remote work is “poison.”

Musk, of course, banned remote work at Tesla, as well as when he took over Twitter. And as co-lead of the US Department of Government Efficiency (DOGE), Musk reportedly plans to ban remote work for government employees, as well. If other tech firms are influenced by Musk’s moves and join executives who seem to be mandating RTO based on intuition, it’s possible that more tech workers could be forced to return to office or else seek other employment.

But Ma told Ars that he doesn’t expect to see “a big spike in the number of firms announcing return to office mandates” in 2025.

His team only found eight major firms in tech and finance that issued five-day return-to-office mandates in 2024, which was the same number of firms flagged in 2023, suggesting no major increase in RTOs from year to year. Ma told Ars that while big firms like Amazon ordering employees to return to the office made headlines, many firms seem to be continuing to embrace hybrid models, sometimes allowing employees to choose when or if they come into the office.

That seeming preference for hybrid work models aligns with “future of work” surveys outlining workplace trends and employee preferences that the Consumer Technology Association (CTA) conducted for years but has apparently since discontinued. In 2021, CTA reported that “89 percent of tech executives say flexible work arrangements are the most important employee benefit and 65 percent say they’ll hire more employees to work remotely.” The following year, apparently the last time CTA published the survey, it suggested hybrid models could help attract talent in a competitive market hit with “an unprecedented demand for workers with high-tech skills.”

The CTA did not respond to Ars’ requests to comment on whether it expects hybrid work arrangements to remain preferred over five-day return-to-office policies next year.

CWA’s Smith told Ars that workers movements are growing partly because “folks are engaged in this big fight around surveillance and workplace control,” as well as anything “having to do with to what extent will people return to offices and what does that look like if and when people do return to offices?”

Without data backing RTO mandates, Ma’s study suggests that firms will struggle to retain highly skilled workers at a time when tech innovation remains a top priority for the US. As workers appear increasingly put off by policies—like RTO or AI-driven workplace monitoring or efficiency efforts threatening to replace workers with AI—Smith’s experience seems to show that disgruntled workers could find themselves drawn to unions that could help them claw back control over work-life balance. And the cost of the ensuing shuffle to some of the largest tech firms in the world could be “significant,” Ma’s study warned.

TWC’s McCreery told Ars that on top of unpopular RTO policies driving workers to join movements, workers have also become more active in protesting unpopular politics, frustrated to see their talents apparently used to further controversial conflicts and military efforts globally. Some workers think workplace organizing could be more powerful than voting to oppose political actions their companies take.

“The workplace really remains an important site of power for a lot of people where maybe they don’t feel like they can enact their values just by voting or in other ways,” McCreery said.

While unpopular policies “have always been a reason workers have joined unions and joined movements,” McCreery said that “the development of more of these unpopular policies” like RTO and AI-enhanced surveillance “really targeted” at workers has increased “the political consciousness and the sense” that tech workers are “just like any other workers.”

Layoffs at companies like Microsoft and Amazon during periods when revenue is increasing by double digits also unify workers, advocates told Ars. Forbes noted Microsoft laid off 1,000 workers “just five days before reporting a 17.6 percent increase in revenue to $62 billion,” while Amazon’s 1,000-worker layoffs followed a 14 percent rise in revenue to $170 billion. And demand for AI led to the highest profit margins Amazon’s seen for its cloud business in a decade, CNBC reported in October.

CWA’s Smith told Ars as companies continue to rake in profits and workers feel their work-life balance slipping away while their efforts in the office are potentially “used to increase control and cause broader suffering,” some of the biggest fights workers raised in 2024 may intensify next year.

“It’s like a shock to employees, these industries pushing people to lower your expectations because we’re going to lay off hundreds of thousands of you just because we can while we make more profits than we ever have,” Smith said. “I think workers are going to step into really broad campaigns to assert a different worldview on employment security.”

Photo of Ashley Belanger

Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.



Ars’ favorite games of 2024 that were not released in 2024


Look what we found lying around

The games that found us in 2024, from 2003 space sims to 2022 backyard survival.

More than 18,500 games will have been released onto the PC gaming platform Steam in the year 2024, according to SteamDB. Dividing that by the number of people covering games at Ars, or the gaming press at large, or even everybody who games and writes about it online, yields a brutal ratio.

Games often float down the river of time to us, filtered by friends, algorithms, or pure happenstance. They don’t qualify for our best games of the year list, but they might be worth mentioning on their own. Many times, they’re better games than they were at release, whether through patches or simply perspective. And they are almost always lower priced.

Inspired by the cruel logic of calendars and year-end lists, I asked my coworkers to tell me about their favorite games of 2024 that were not from 2024. What resulted were some quirky gems, some reconsiderations, and some titles that just happened to catch us at the right time.

Stardew Valley

Screenshot from Stardew Valley, in front of the blacksmith's shop, where a player character is holding up a bone (for some reason).

Credit: ConcernedApe



ConcernedApe; Basically every platform

After avoiding it forever and even bouncing off of it once or twice, I finally managed to fall face-first into Stardew Valley (2016) in 2024. And I’ve fallen hard—I only picked it up in October, but Steam says I’ve spent about 110 hours playing farmer.

In addition to being a fun distraction and a great way to kill both short and long stretches of time, what struck me is how remarkably soothing the game has been. I’m a nervous flyer, and it’s only gotten worse since the pandemic, but I’ve started playing Stardew on flights, and having my little farm to focus on has proven to be a powerful weapon against airborne anxiety—even when turbulence starts up. Ars sent me on three trips in the last quarter of the year, and Stardew got me through all the flights.

Hell, I’m even enjoying the multiplayer—and I don’t generally do multiplayer. My cousin Shaun and I have been meeting up most weekends to till the fields together, and the primary activity tends to be seeing who can apply the most over-the-top creatively scatological names to the farm animals. I’ve even managed to lure Ur-Quan Masters designer Paul Reiche III to Pelican Town for a few weekends of hoedowns and harvests. (Perhaps unsurprisingly, Paul was already a huge fan of the game. And also of over-the-top creatively scatological farm animal names. Between him and Shaun, I’m amassing quite a list!)

So here’s to you, Stardew Valley. You were one of the brightest parts of my 2024, and a game that I already know I’ll return to for years.

Lee Hutchinson

Grounded

First-person perspective of a suburban house in the background, fall leaves on a tree nearby, and a relatively giant spider approaching the player, who is holding a makeshift bow and arrow, ready to fire.

Credit: Xbox Game Studios

Obsidian; Windows, Switch, Xbox, PlayStation

My favorite discovery this year has probably been Grounded, a Microsoft-published, Obsidian Entertainment-developed survival crafting game that was initially released back in 2022 (2020 if you count early access) but received its final planned content update in April.

You play as one of four plucky tweens, zapped down to a fraction-of-an-inch high as part of a nefarious science experiment. The game is heavily inspired by 1989’s classic Honey, I Shrunk the Kids, both in its ’80s setting and its graphical design. Explore the backyard, fight bugs, find new crafting materials, build out a base of operations, and power yourself up with special items and steadily better equipment so you can figure out what happened to you and get back to your regular size.

Grounded came up because I was looking for another game for the four-player group I’ve also played Deep Rock Galactic and Raft with. Like Raft, Grounded has a main story with achievable objectives and an endpoint, plus a varied enough mix of activities that everyone will be able to find something they like doing. Some netcode hiccups notwithstanding, if you like survival crafting-style games but don’t like Minecraft-esque, objective-less, make-your-own-fun gameplay, Grounded might scratch an itch for you.

Andrew Cunningham

Fights in Tight Spaces

A black-colored figure does a backwards flip kick on a red goon holding a gun, while three other red and maroon goons point guns at him from a perpendicular angle, inside a grayscale room.

Credit: Raw Fury

Ground Shatter; Windows, Switch, Xbox, PlayStation

I spent a whole lot of time browsing, playing, and thinking about roguelike deckbuilders in 2024. Steam’s recommendation algorithm noticed and tossed 2021’s Fights in Tight Spaces at me. I was on a languid week’s vacation, Steam Deck packed, with just enough distance from the genre by then to maybe dip a toe back in. More than 15 hours later, Steam’s “Is this relevant to you?” question is easy to answer.

Back in college, I spent many weekends rounding out my Asian action film knowledge, absorbing every instance of John Woo, Jackie Chan, Jet Li, Flying Guillotine, Drunken Master, and whatever I could scavenge from friends and rental stores. I thrilled to frenetic fights staged in cramped, cluttered, or quirky spaces. When the hero ducks so that one baddie punches the other one, then backflips over a banister to two-leg kick the guy coming up from beneath? That’s the stuff.

Fights gives you card-based, turn-by-turn versions of those fights. You can see everything your opponents are going to do, in what order, and how much it would hurt if they hit you. Your job is to pick cards that move, hit, block, counter, slip, push, pull, and otherwise mess with these single-minded dummies, such that you dodge the pain and they either miss or take each other out. Woe be unto the guy with a pistol who thinks he’s got one up on you, because he’s standing right by a window, and you’ve got enough momentum to kick a guy right into him.

This very low-spec game has a single-color visual style, beautifully smooth animations, and lots of difficulty tweaking to prevent frustration. The developer plans to release a game “in the same universe,” Knights in Tight Spaces, in 2025, and that’s an auto-buy for me now.

Kevin Purdy

The Elder Scrolls III: Morrowind

Axe-wielding polygonal character, wearing furs and armor, complete with bear face above his head, in front of a wooden lodge in a snowy landscape.

Credit: Bethesda Game Studios

Bethesda; Windows, Xbox

The Elder Scrolls III: Morrowind always had a sort of mythic quality for me. It came out when I was 18 years old—the perfect age for it, really. And more than any other game I had ever played, it inspired hope and imagination for where the medium might go.

In the ensuing years, Morrowind (2002) ended up seeming like the end of the line instead of the spark that would start something new. With some occasional exceptions, modern games have emphasized predictable formulae and proven structures over the kind of experimentation, depth, and weirdness that Morrowind embraced. Even Bethesda’s own games gradually became stodgier.

So Morrowind lived in my memory for years, a sort of holy relic of what gaming could have been before AAA game design became quite so oppressively formalist.

After playing hundreds of hours of Starfield this year, I returned to Morrowind for the first time in 20 years.

To be clear: I quite liked Starfield, counter to the popular narrative about it—though I definitely understood why it wasn’t for everyone. But people criticized Starfield for lacking the magic of a game like Morrowind, and I was skeptical of that criticism. As such, my return to the island of Vvardenfell was a test: did Morrowind really have a magic that Starfield lacks, even when taken out of the context of its time and my youthful imagination and open-mindedness?

I was surprised to find that the result was a strong affirmative. I still like Starfield, but its cardinal sin is that it is unimaginative because it is derivative—of No Man’s Sky, of Privateer and Elite, of Mass Effect, of various 70s and 80s sci-fi films and TV series, and most of all, of Bethesda Game Studios’ earlier work.

In contrast, Morrowind is a fever dream of bold experimentation that seems to come more from the creativity of ambitious designers who were too young to know any better, than from the proven designs of past hits.

I played well over a hundred hours of Morrowind this year, and while I did find it tedious at times, it’s engrossing for anyone who’s willing to put up with its archaic pacing and quirks.

To be clear, many of the design experiments in the game simply don’t work, with systems that are easily exploited. Its designers’ naivety shines through clearly, and its rough edges serve as clear reminders of why today’s strict formalism has taken root, especially in AAA games where too-big budgets and payrolls leave no room at all for risk.

Regardless, it’s been wild to go back and play this game from 2002 and realize that in the 22 years since there have been very few other RPGs that were nearly as brazenly creative. I love it for that, just as much as I did when I was 18.

Samuel Axon

Tetrisweeper

Tetris-style colored blocks fallen inside a column on top of settled blocks, most of which are gray and have Minesweeper-like numbers indicating an explosive tile nearby.

Credit: Kertis Jones Interactive

Kertis Jones; Itch.io, coming to Steam

If you ask someone to list the most addictive puzzle games of all time, Tetris and Minesweeper will probably be at or near the top of the list. So it shouldn’t be too surprising that Tetrisweeper makes an even more addictive experience by combining the two grid-based games together in a frenetic, brain-melting mess.

Tetrisweeper starts just like Tetris, asking you to arrange four-block pieces dropping down a well to make lines without gaps. But in Tetrisweeper, those completed lines won’t clear until you play a game of Minesweeper on top of those dropped pieces, using adjacency information and logical rules to mark which ones are safe and which ones house game-ending mines (if you want to learn more about Minesweeper, there’s a book I can recommend).

At first, playing Tetris with your keyboard fingers while managing Minesweeper with your mouse hand can feel a little unwieldy—a bit like trying to drive a car and cook an omelet at the same time. After a few games, though, you’ll learn how to split your attention effectively to drop pieces and solve complex mine patterns nearly simultaneously. That’s when you start to master the game’s intricate combo multiplier system and bonus scoring, striving for point-maximizing Tetrisweeps and T-spins (my high score is just north of 3 million, but pales in comparison to that of the best players).

While Tetrisweeper grew out of a 2020 Game Jam, I didn’t discover the game until this year, when it helped me clear my head during many a work break (and passed the time during a few dull Zoom calls as well). I’m hoping the game’s planned Steam release—still officially listed as “Coming Soon”—will help attract even more addicts than its current itch.io availability.

Kyle Orland

Freelancer

Ship with three thruster engines approaching a much larger freighter, long and slightly cylindrical, in murky green space, with a HUD around the borders.

Digital Anvil; Windows

What if I told you that Star Citizen creator Chris Roberts previously tried to make Star Citizen more than two decades ago but left the project and saw it taken over by real, non-crazy professionals who had the discipline to actually finish something?

That’s basically the story behind 2003’s forgotten PC game Freelancer. What started as a ludicrously ambitious space life sim concept ended up as a sincere attempt to make games like Elite and Wing Commander: Privateer far more accessible.

That meant a controversial, mouse-based control scheme instead of flight sticks, as well as cutting-edge graphics, celebrity voice actors, carefully designed economy and progression systems, and flashy cutscenes.

I followed the drama of Freelancer’s development in forums, magazines, and gaming news websites when I was younger. I bought the hype as aggressively as Star Citizen fans did years later. The game that came out wasn’t what I was dreaming of, and that disappointment prevented me from finishing it.

Fast-forward to 2024: on a whim, I played Freelancer from beginning to end for the first time.

And honestly? It’s great. In a space trading sim genre that’s filled with giant piles of jank (the X series) or inaccessible titles that fly a little too far into the simulation zone for some (Elite Dangerous), Freelancer might be the most fun you can have with the genre even today.

It’s understandable that it didn’t have much lasting cultural impact since the developers who took it over lacked the wild ambition of the man who started it, but I enjoyed a perfectly pleasant 20–30 hours smuggling space goods and shooting pirates—and I didn’t have to spend $48,000 of real money on a ship to get that.

Samuel Axon

Cyberpunk 2077

A woman with a red mohawk, wearing a belly shirt, amidst a dense, steel, multi-colored cityscape, suffused with neon.

Credit: CD Projekt Red

CD Projekt Red; Windows, Xbox, PlayStation (macOS in 2025)

Can one simply play, as a game, one of the biggest and most argued-over gaming narratives of all time? Four years after its calamitous launch sparked debates about AAA gaming sprawl, developer crunch, game review practicalities, and, eventually, post-release redemption arcs, what do you get when you launch Cyberpunk 2077?

I got a first-person shooter, one with some interesting ideas, human-shaped characters you’d expect from the makers of The Witcher 3, and some confused and unrefined systems and ideas. I enjoyed my time with it, appreciate the work put into it, and can recommend it to anyone who is okay with something that’s not quite an in-depth FPS RPG (or “immersive sim”) but likes a bit of narrative thrust to their shooting and hacking.

You can’t fit everything about Cyberpunk 2077 into one year-end blurb (or a 1.0 release, apparently), so I’ll stick to the highs and lows. I greatly enjoyed the voice performances, especially from Keanu Reeves and Idris Elba (the latter in the Phantom Liberty DLC), and those behind Jackie, Viktor Vektor, and the female version of protagonist V. I was surprised at how good the shooting felt, given the developer’s first time out; the discovery of how a “Smart” shotgun worked will stick with me a while. The driving: less so. There were moments of quiet, ambient world appreciation, now that the game’s engine is running okay. And the side quests have that Witcher-ish quality to them, where they’re never as straightforward as described and also tell little stories about life in this place.

What seems missing to me, most crucially, are the bigger pieces, the real choices and unexpected consequences, and the sense of really living in this world. You can choose one of three backgrounds, but it only comes up as an occasional dialogue option. You can build your character in myriad ways, and there are lots of dialogue options. But the main quest keeps you on a fairly strict path, with the options to talk, hack, or stealth your way past inevitable shootouts not as great as you might think. Once you’ve brought your character up to power-fantasy levels, the larger city becomes a playground, but not one I much enjoyed playing in. (Plus, the idea of idle wandering and amassing wealth, given the main plot contrivance, is kind of ridiculous, but this is a game, after all).

Phantom Liberty, in my experience, patches up every one of these weaknesses inside its smaller play space, providing more real choices and a tighter story, with more set pieces arriving at a faster pace. If you can buy this game bundled with its DLC, by all means, do so. I didn’t encounter any game-breaking bugs in my mid-2024 playthrough, nor even many crashes. Your mileage may vary, especially on consoles, as other late-coming players have seen.

Waiting on this game a good bit certainly helps me grade it on a curve; nobody today is losing $60 on something that looks like it’s playing over a VNC connection. When CD Projekt Red carries on in this universe, I think they’ll have learned a lot from what they delivered here, much like we’ve all learned about pre-release expectations. It’s okay to take your time getting to a gargantuan game; there are lots of games from prior years to look into.

Kevin Purdy

Photo of Kevin Purdy

Kevin is a senior technology reporter at Ars Technica, covering open-source software, PC gaming, home automation, repairability, e-bikes, and tech history. He has previously worked at Lifehacker, Wirecutter, iFixit, and Carbon Switch.
