
Civilization VII, one month later: The community and developers chime in


Executive Producer Dennis Shirk talks with Ars about the state of the game.

Civilization VII has a lot of visual polish and great gameplay systems. A flurry of patches has been improving other aspects, too. Credit: 2K Games

A month ago, Civilization VII launched to generally positive critical reviews, but user reviews on Steam and Metacritic weren’t nearly so positive, at least at first.

Take a look at the Civilization subreddit, and you’ll see a general consensus: The bones of this game are great, and even most of the radical changes to the classic formula (like breaking the game into much more distinct ages) are a welcome refresh.

On the other hand, there’s also a sentiment that players are disappointed that some expected features are missing, some gameplay elements need additional polish, and most of all, the user interface was a bit of a mess at launch.

A month later, developer Firaxis has already released a few patches and has more planned. As the game’s state continues to evolve, this seems like a good time to check in on it.

I spent some time in the Civ community and spoke with Dennis Shirk, the game’s executive producer, to learn how the launch went, how the game has changed since launch, and what its next steps are.

Breaking with tradition

Civilization VII broke with tradition in a few ways—splitting the game into distinct ages that each play like a separate game, allowing anachronistic leader/civilization combinations, and removing worker units, to name a few.

You might have expected those to be the source of any controversy around the game’s launch, but that hasn’t really been the case. In my review, I wrote that those shifts take the franchise in a new direction, bring over the best ideas from competing titles, and address long-standing problems with the Civilization experience.

If you want a more traditional experience, you can go back to Civilization V, Civilization IV, Civilization II, or whichever your favorite was. Those games are infinitely replayable, so there’s no need to retread them with a sequel.

“Our rule that we live by at Firaxis is the rule of thirds. We want to keep one-third of the game the same as previous iterations, one-third tweaked and improved upon, and one-third new,” Shirk told me. “Did we lean farther into the last third than we have in the past? We may have, but it was a risk we were willing to take to deliver a completely new part of the experience.”

A suboptimal starting position

The Civilization subreddit is full of positive responses to those changes, and the large contingent of Civ geeks on the Ars editorial staff are mostly in agreement that they’re good changes, too. (The game has been a frequent discussion topic in the Ars Slack for several weeks.)

The last month has seen players giving critical feedback, and Firaxis has been releasing patches to address complaints. For example, patch 1.1.0 on March 4 fixed some visual problems with the technology tree and made big changes to some victory conditions in the Modern Age, among other things.

Players have noted positive changes that weren’t mentioned in patch notes, too. Reddit user AndyNemmity posted that the “AI is significantly better in Military” after a recent patch, writing:

I know most of you don’t see the Military AI in the fog of war, but I work on the AI mod, and run a ton of autoplays. I am 10+ autoplays with the new patch, and the base game military AI is VASTLY improved.

Before, the AI would get stuck on the map in tons of different scenarios, often dying because they have an entire army stuck on the map, and can’t use it. This is fixed. Now the autoplays look like actual militaries, warring, attacking, killing independents quickly and efficiently.

The goodwill about the bones of the game and the positive responses to some patch additions are still accompanied by some consternation about the UI.

“Part of launching a game, especially when big changes are made, is figuring out what is resonating with players, and what may be an opportunity for improvement,” Shirk said when asked about the launch challenges. “In this instance, the UI did not meet players’ expectations, and we are committed to addressing that—although it will take time.”

There’s still a fair bit to be done, and modders have been filling the gaps. Modder Sukritact released a UI overhaul that addressed several complaints—including showing the gains and losses players will see if they replace a tile improvement or building with another one in the city view.

Players praised these tweaks, going so far as to call that example in particular a “game changer.” A few days later, it was announced on the Civilization Discord that Firaxis had hired Sukritact as a technical artist.

A panel that shows a detailed explanation of the bonuses affecting a tile improvement

This mod by Sukritact adds much-needed information to the city view. The modder has since been hired by Firaxis. Credit: RileyTaugor

The community has speculated that the game was rushed out the door before it was ready, primarily citing the UI issues.

“In hindsight, our UI team needed more time and space to take the UI where it needed to go, to really expose the level of information our players expect,” Shirk admitted. “Our team has been working hard to address these issues through rapid patching, and players will continue to see support for the foreseeable future.”

That said, debate about the UI is happening in the context of a wider discussion about the scope of Civilization VII’s launch.

A tale of 10 platforms

Every past mainline Civilization game launched only on desktop platforms like Windows or Mac, but Civilization VII greatly expanded that. Depending on what counts (we’ll say here that the Steam Deck counts as distinct from Linux, and the Xbox Series S is distinct from the Xbox Series X), there were 10 launch platforms for Civilization VII:

  • Windows
  • Linux
  • macOS
  • Steam Deck
  • Nintendo Switch
  • PlayStation 4
  • PlayStation 5
  • Xbox One
  • Xbox Series S
  • Xbox Series X

That’s a lot to target at launch, and players in the subreddit have speculated that Firaxis was spread a bit thin here, making this part of the explanation for a relatively buggy UI on day one.

Some also speculated that the classic desktop PC platform got a worse experience in order to accommodate console and Steam Deck players. For example, players lamented the lack of a drag and drop feature for views like the policy selection screen.

The developers have made it crystal clear that PC is the top priority, though. “Our core audience is absolutely PC, so we always start there, and work our way outward, adapting UI systems along the way, iterating on different UX approaches,” Shirk said.

He added that the controller support was developed with a partner, suggesting that supporting consoles out of the gate might not have taxed the team working on the desktop interface as much as some feared.

At least in one respect, Firaxis has already publicly walked the walk: at one point, it made the controversial decision to temporarily pause cross-save between PC and console so it could push updates to PC faster. Patching games on consoles requires a relatively slow and laborious certification process, but that’s not the case for updating a game on Steam.

The cloud save menu in Civilization VII

Cross-loading cloud saves across PC and console was turned off for a while so Firaxis could iterate faster on PC. Credit: Samuel Axon

Meanwhile, some console and handheld players have complained about their version of the interface.

The most commonly cited UI problem on consoles and handhelds concerns moving the camera and hex selector across the map efficiently. Currently, moving the camera is easy: you just use the left stick to pan around. But doing this doesn’t move the hex selector with it, so you have to drag the selector hex by hex all the way across the map.

Some similar games have a button you can press to bring the selector to where the camera is. In Civilization VII, the R3 button brings the camera to where the selector is, not vice versa—which isn’t useful.

Shirk talked a bit about the process of developing the controller-based interface and the challenges the team faced:

We’ve been lucky enough to have some great partners help us develop the controller support, which added some strong console specific features like the radial menu. However, when you’re working with different interfaces across different platforms, there are many assumptions that cannot be made like they can on PC. For example, a player using a mouse is not walled off from anything, but switch that to a controller, and a completely different thought process has to come into play.

As for solutions, he added:

We’re working to give all versions the attention they deserve. When it comes to UI updates, we’re having team members continue to look at the community feedback in-depth and see how we can improve the experience for players regardless of system.

When I asked about drag-and-drop on desktop, and R3’s selection functionality on console and handheld, he said “the examples you shared are among features we are tracking and exploring how to address,” and that the March 4 1.1.0 patch that brought some UI changes was just a start. He added that the 1.1.1 patch coming March 25 will be when “fans will really start to see the results of their feedback.”

“And to answer your original question, ‘R3’ is coming along for the ride,” he said.

Following the legacy path to balanced gameplay

It seems like the UI is on the right track, but some tweaks need to happen on the gameplay front too, as players and critics tell it.

There are complaints about the AI—something as old as the franchise itself, to be fair. Some improvements have already been made, but players continue to report that AI civs keep founding cities close to players’ capitals for no apparent reason, causing frustration.

A small city appears close to the player's capital

“Ashoka traveled across the entire continent just to settle four tiles away from my capital,” said DayTemporary3369, the Reddit user who posted this screenshot. They weren’t alone in this complaint. Credit: DayTemporary3369

Religion gameplay needs attention, as there’s no way to stop other leaders’ missionaries, leading to unsatisfying back-and-forth conversion gameplay. Similarly, players feel there aren’t enough defenses against espionage.

“If they’re all allowed to target me at the same time, I should be allowed to defend myself from all of them, provided I have enough influence,” said Reddit user Pay_No_Heed on the topic of counter-espionage. The complaint is reasonable, though a working design solution may not be as obvious as it seems.

Players have also complained that ages end too abruptly, and that holds true for the end of the game, which happens when the Modern Age concludes. It’s a quibble I also shared in my review. Many players are maxing out the game’s age length setting to combat this. Past Civilization games offered a “one more turn” option to extend the game past when someone had won. Firaxis has said this is coming to the end of the modern age in a future update.

There’s also the Civilopedia, the in-game database of concepts and help documentation. Players have noted it’s more barebones than expected, with several key concepts lacking entries or explanation. Firaxis acknowledged this complaint and said it’s being worked on.

“Yes, with each update we’re improving what’s exposed in the Civilopedia, including more gameplay data, easier navigation, et cetera. Expect much more to come in future updates,” Shirk explained.

In general, the game needs to have more information exposed to players. The gap is big enough that Reddit user JordiTK posted the heavily upvoted “Ultimate List of Things That Civilization VII Doesn’t Tell You.” It’s almost 5,000 words long, with more than 100 items.

Almost every prior Civilization game has had players complaining that it didn’t explain itself well enough, but the sentiment seems stronger this time. For what it’s worth, Shirk says the team recognizes this.

“Internally, our primary design goal for Civilization VII was to focus and iterate on the new mechanics, to really make sure this design would sing,” he said. “This focus on the new probably led us to work with a few false assumptions about what base level information players would need with our legacy systems, and it wasn’t something that came up as loudly as it should have in user testing.”

It’s not “We Love the Developer Day” just yet

While everyone in the community and within Firaxis agrees there’s still work to be done, the tone has improved since launch, thanks both to these patches and to the developer’s community manager engaging frequently on Steam, Discord, and Reddit.

The launch situation was made a little worse than it needed to be because of, strangely enough, confusion around nomenclature. Players who paid for the pricier special editions of the game were given “Advanced Access” a few days before the main launch date.

After it became apparent there were problems, some communications to players on storefronts and on Reddit called it “early access.” That caused a bit of a stir: until then, players hadn’t perceived the special editions’ Advanced Access as early access, a term the industry typically uses to signal that a game is incomplete and in a pre-release state.

When asked about this, a spokesperson for 2K Games (the game’s publisher) gave a statement to Ars that read:

Our goal is always to deliver the best product possible, including during Advanced Access periods. With a game the size and scope of Civilization VII there will always be fixes and optimizations once the entirety of the player base is able to jump in. The intent behind the Advanced Access granted to purchasers of the Deluxe Edition and Founders Edition was not to offer a work in progress product, and we take the feedback delivered during that period seriously.

We’re working hard to make sure that players have the best experience in the world of 4X strategy for years to come, and player feedback remains critical in helping us grow and build the future of Civ.

That suggests the use of “early access” was just a misstatement and not an attempt to cover for a rough pre-launch access period, but it wasn’t a great start to the conversation.

Since then, though, some of the most critical problems have been addressed, and the studio shared a roadmap that promised “UI updates and polish” in patches on March 4 (1.1.0, already released), March 25 (1.1.1), and sometime in April (1.2.0). The roadmap lists “additional UI updates & polish” for beyond April, too, confirming this will be a lengthy process.

A roadmap promising updates in March, April, and beyond

Here’s the updated roadmap from Firaxis. Credit: 2K Games

This frequent communication, combined with the fact that players recognize there’s a good game here that needs some more polish, has meant that most of the discussions in the community during this first month have been pretty optimistic, despite the launch woes.

There was a time years ago when games were marketed leading up to their launch, but then the communication with players was over. In today’s market (especially for complex games like Civilization) there’s often a need to iterate in public. Players understand that and will roll with it if it’s communicated clearly to them. Firaxis stumbled on that in the opening days, but it’s now clear the studio understands that well, and the updates are rolling out.

We’ve seen a lot of rough launches for big games in recent years, and they often turn quite toxic. That said, the core Civilization community seems more patient and optimistic than you typically see in situations like this. That’s a credit to Firaxis’ years of goodwill, but it’s also a credit to the moderators and other leaders in the game’s community.

When I reviewed Civilization VII, I wrote that the core systems were strong, and that the game likely has a bright future ahead of it—but I also said it might make sense to wait a few weeks to dive in because of UI and balance issues.

It’s a few weeks later, and it looks like the game is on the right track, but there’s still a way to go if you’re looking for an impeccably polished product. That hasn’t stopped me from enjoying the dozens of hours I’ve played so far, though.


Samuel Axon is a senior editor at Ars Technica. He covers Apple, software development, gaming, AI, entertainment, and mixed reality. He has been writing about gaming and technology for nearly two decades at Engadget, PC World, Mashable, Vice, Polygon, Wired, and others. He previously ran a marketing and PR agency in the gaming industry, led editorial for the TV network CBS, and worked on social media marketing strategy for Samsung Mobile at the creative agency SPCSHP. He also is an independent software and game developer for iOS, Windows, and other platforms, and he is a graduate of DePaul University, where he studied interactive media and software development.


Leaked GeForce RTX 5060 and 5050 specs suggest Nvidia will keep playing it safe

Nvidia has launched all of the GeForce RTX 50-series GPUs that it announced at CES, at least technically—whether you’re buying from Nvidia, AMD, or Intel, it’s nearly impossible to find any of these new cards at their advertised prices right now.

But hope springs eternal, and newly leaked specs for GeForce RTX 5060 and 5050-series cards suggest that Nvidia may be announcing these lower-end cards soon. These kinds of cards are rarely exciting, but Steam Hardware Survey data shows that these xx60 and xx50 cards are what the overwhelming majority of PC gamers are putting in their systems.

The specs, posted by a reliable leaker named Kopite and reported by Tom’s Hardware and others, suggest a refresh that’s in line with what Nvidia has done with most of the 50-series so far. Along with a move to the next-generation Blackwell architecture, the 5060 GPUs each come with a small increase to the number of CUDA cores, a jump from GDDR6 to GDDR7, and an increase in power consumption, but no changes to the amount of memory or the width of the memory bus. The 8GB versions, in particular, will probably continue to be marketed primarily as 1080p cards.

| | RTX 5060 Ti (leaked) | RTX 4060 Ti | RTX 5060 (leaked) | RTX 4060 | RTX 5050 (leaked) | RTX 3050 |
|---|---|---|---|---|---|---|
| CUDA cores | 4,608 | 4,352 | 3,840 | 3,072 | 2,560 | 2,560 |
| Boost clock | Unknown | 2,535 MHz | Unknown | 2,460 MHz | Unknown | 1,777 MHz |
| Memory bus width | 128-bit | 128-bit | 128-bit | 128-bit | 128-bit | 128-bit |
| Memory bandwidth | Unknown | 288 GB/s | Unknown | 272 GB/s | Unknown | 224 GB/s |
| Memory size | 8GB or 16GB GDDR7 | 8GB or 16GB GDDR6 | 8GB GDDR7 | 8GB GDDR6 | 8GB GDDR6 | 8GB GDDR6 |
| TGP | 180 W | 160 W | 150 W | 115 W | 130 W | 130 W |

As with the 4060 Ti, the 5060 Ti is said to come in two versions, one with 8GB of RAM and one with 16GB. One of the 4060 Ti’s problems was that its relatively narrow 128-bit memory bus limited its performance at 1440p and 4K resolutions even with 16GB of RAM—the bandwidth increase from GDDR7 could help with this, but we’ll need to test to see for sure.
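Peak bandwidth follows directly from bus width and per-pin data rate, which is why a GDDR7 upgrade can matter even when the bus stays at 128 bits. A rough sketch of the arithmetic (the 28 Gbps GDDR7 rate is an assumption based on other 50-series cards; Nvidia hasn’t confirmed the 5060 Ti’s memory speed):

```python
def mem_bandwidth_gbs(bus_width_bits: int, rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes transferred per cycle
    (bus width / 8) times the effective per-pin data rate."""
    return bus_width_bits / 8 * rate_gbps

# RTX 4060 Ti: 128-bit bus, 18 Gbps GDDR6
print(mem_bandwidth_gbs(128, 18))  # 288.0, matching the table above

# Hypothetical RTX 5060 Ti: same 128-bit bus, GDDR7 at an assumed 28 Gbps
print(mem_bandwidth_gbs(128, 28))  # 448.0, a roughly 56% increase
```

If the assumed data rate is in the right ballpark, the bandwidth uplift comes entirely from the faster memory, not the bus.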


Six ways Microsoft’s portable Xbox could be a Steam Deck killer

Bring old Xbox games to PC


The ultimate handheld system seller. Credit: Microsoft / Bizarre Creations

Microsoft has made a lot of hay over the way recent Xbox consoles can play games dating all the way back to the original Xbox. If Microsoft wants to set its first gaming handheld apart, it should make those old console games officially available on a Windows-based system for the first time.

The ability to download previous console games dating back to the Xbox 360 era (or beyond) would be an instant “system seller” feature for any portable Xbox. While this wouldn’t be a trivial technical lift on Microsoft’s part, the same emulation layer that powers Xbox console backward compatibility could surely be ported to Windows with a little bit of work. That process might be easier with a specific branded portable, too, since Microsoft would be working with full knowledge of what hardware was being used.

If Microsoft can give us a way to play Geometry Wars 2 on the go without having to deal with finicky third-party emulators, we’ll be eternally grateful.

Multiple hardware tiers

Xbox Series S (left), next to Xbox Series X (right).


One size does not fit all when it comes to consoles or to handhelds. Credit: Sam Machkovech

On the console side, Microsoft’s split simultaneous release of the Xbox Series S and X showed an understanding that not everyone wants to pay more money for the most powerful possible gaming hardware. Microsoft should extend this philosophy to gaming handhelds by releasing different tiers of portable Xbox hardware for price-conscious consumers.

Raw hardware power is the most obvious differentiator that could set a more expensive tier of Xbox portables apart from any cheaper options. But Microsoft could also offer portable options that reduce the overall bulk (a la the Nintendo Switch Lite) or offer relative improvements in screen size and quality (a la the Steam Deck OLED and Switch OLED).

“Made for Xbox”


It worked for Valve, it can work for Microsoft. Credit: Valve

One of the best things about console gaming is that you can be confident any game you buy for a console will “just work” with your hardware. In the world of PC gaming handhelds, Valve has tried to replicate this with the “Deck Verified” program to highlight Steam games that are guaranteed to work in a portable setting.

Microsoft is well-positioned to work with game publishers to launch a similar program for its own Xbox-branded portable. There’s real value in offering gamers assurances that “Made for Xbox” PC games will “just work” on their Xbox-branded handheld.

This kind of verification system could also help simplify and clarify hardware requirements across different tiers of portable hardware power; any handheld marketed as “level 2” could play any games marketed as level 2 or below, for instance.
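The proposed rule is simple to state precisely. A minimal sketch of the tier check (the numeric tiers are hypothetical, not an announced Microsoft scheme):

```python
def can_play(handheld_tier: int, game_tier: int) -> bool:
    """A handheld can run any game whose required tier is at or below its own."""
    return game_tier <= handheld_tier

print(can_play(2, 1))  # True: a "level 2" handheld runs level-1 games
print(can_play(2, 2))  # True: and games at its own level
print(can_play(2, 3))  # False: level-3 games need stronger hardware
```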


AMD says top-tier Ryzen 9900X3D and 9950X3D CPUs arrive March 12 for $599 and $699

Like the 7950X3D and 7900X3D, these new X3D chips combine a pair of AMD’s CPU chiplets, one that has the extra 64MB of cache stacked underneath it and one that doesn’t. For the 7950X3D, you get eight cores with extra cache and eight without; for the 7900X3D, you get eight cores with extra cache and four without.

It’s up to AMD’s chipset software to decide what kinds of apps get to run on each kind of CPU core. Non-gaming workloads prioritize the normal CPU cores, which are generally capable of slightly higher peak clock speeds, while games that benefit disproportionately from the extra cache are run on those cores instead. AMD’s software can “park” the non-V-Cache CPU cores when you’re playing games to ensure they’re not accidentally being run on less-suitable CPU cores.

We didn’t have issues with this core parking technology when we initially tested the 7950X3D and 7900X3D, and AMD has steadily made improvements since then to make sure that core parking is working properly. The new 9000-series X3D chips should benefit from that work, too. To get the best results, AMD officially recommends a fresh and fully updated Windows install, along with the newest BIOS for your motherboard and the newest AMD chipset drivers; swapping out another Ryzen CPU for an X3D model (or vice versa) without reinstalling Windows can occasionally lead to CPUs being parked (or not parked) when they are supposed to be (or not supposed to be).
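As a loose illustration of the scheduling policy described above (a hypothetical sketch, not AMD’s actual driver logic; the process names are invented, and the core numbering follows the 7950X3D’s 8+8 layout):

```python
# Core layout modeled on the 7950X3D: one CCD with stacked V-Cache,
# one without. Which physical CCD gets the cache varies; this is illustrative.
VCACHE_CORES = list(range(0, 8))   # CCD0: extra 64MB of stacked cache
FREQ_CORES = list(range(8, 16))    # CCD1: higher peak clocks, no extra cache

def pick_cores(process_name: str, known_games: set[str]) -> list[int]:
    """Return the cores a process should prefer: cache-sensitive games go to
    the V-Cache CCD (the other CCD is 'parked'), everything else goes to the
    higher-clocking cores."""
    if process_name in known_games:
        return VCACHE_CORES
    return FREQ_CORES

games = {"civ7.exe", "cyberpunk2077.exe"}
print(pick_cores("civ7.exe", games))     # [0, 1, 2, 3, 4, 5, 6, 7]
print(pick_cores("blender.exe", games))  # [8, 9, 10, 11, 12, 13, 14, 15]
```

In practice this decision is made by AMD’s chipset driver working with the Windows Game Bar’s game detection, which is why AMD recommends keeping both fully updated.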


Blood Typers is a terrifically tense, terror-filled typing tutor

When you think about it, the keyboard is the most complex video game controller in common use today, with over 100 distinct inputs arranged in a vast grid. Yet even the most complex keyboard-controlled games today tend to only use a relative handful of all those available keys for actual gameplay purposes.

The biggest exception to this rule is a typing game, which by definition asks players to send their fingers flying across every single letter on the keyboard (and then some) in quick succession. By default, though, typing games tend to take the form of extremely basic typing tutorials, where the gameplay amounts to little more than typing out words and sentences by rote as they appear on screen, maybe with a few cute accompanying animations.


Typing “gibbon” quickly has rarely felt this tense or important. Credit: Outer Brain Studios

Blood Typers adds some much-needed complexity to that basic type-the-word-you-see concept, layering its typing tests on top of a full-fledged survival horror game reminiscent of the original PlayStation era. The result is an amazingly tense and compelling action adventure that also serves as a great way to hone your touch-typing skills.

See it, type it, do it

For some, Blood Typers may bring up first-glance memories of Typing of the Dead, Sega’s campy, typing-controlled take on the House of the Dead light gun game series. But Blood Typers goes well beyond Typing of the Dead‘s on-rails shooting, offering an experience that’s more like a typing-controlled version of Resident Evil.

Practically every action in Blood Typers requires typing a word that you see on-screen. That includes basic locomotion, which is accomplished by typing any of a number of short words scattered at key points in your surroundings to automatically walk to that point. It’s a bit awkward at first, but it quickly becomes second nature as you memorize the names of various checkpoints and adjust to using the shift keys to turn the camera as you move.


Each of those words on the ground is a waypoint that you can type to move toward. Credit: Outer Brain Studios

When any number of undead enemies appear, a quick tap of the tab key switches you to combat mode, which asks you to type longer words that appear above those enemies to use your weapons. More difficult enemies require multiple words to take down, including some with armor that requires typing a single word repeatedly before you can move on.

While you start each scenario in Blood Typers with a handy melee weapon, you’ll end up juggling a wide variety of projectile firearms that feel uniquely tuned to the typing gameplay. The powerful shotgun, for instance, can take out larger enemies with just a single word, while the SMG lets you type only the first few letters of each word for a rapid-fire feel. The flamethrower, on the other hand, can set whole groups of nearby enemies aflame, which makes each subsequent attack word that much shorter and faster.
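As a toy illustration of the mechanic (not the game’s actual code; the weapon names and the three-letter prefix rule are assumptions based on the description above), a hit check might look like:

```python
def hits(weapon: str, prompt: str, typed: str) -> bool:
    """Does the typed input register a hit for this weapon?"""
    if weapon == "smg":
        # rapid fire: only the first few letters of each word are needed
        return len(typed) >= 3 and prompt.startswith(typed)
    # melee, shotgun, etc.: the full word must be typed out
    return typed == prompt

print(hits("shotgun", "gibbon", "gibbon"))  # True: the full word lands the shot
print(hits("smg", "gibbon", "gib"))         # True: a prefix is enough
print(hits("shotgun", "gibbon", "gib"))     # False: partial words don't count
```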


“Literally just a copy”—hit iOS game accused of unauthorized HTML5 code theft

Viral success (for someone else)

VoltekPlay writes on Reddit that it was only alerted to the existence of My Baby or Not! on iOS by “a suspicious burst of traffic on our itch.io page—all coming from Google organic search.” Only after adding a “where did you find our game?” player poll to the page were the developers made aware of some popular TikTok videos featuring the iOS version.

“Luckily, some people in the [Tiktok] comments mentioned the real game name—Diapers, Please!—so a few thousand players were able to google their way to our page,” VoltekPlay writes. “I can only imagine how many more ended up on the thief’s App Store page instead.”

Earlier this week, the $2.99 iOS release of My Baby or Not! was quickly climbing iOS’s paid games charts, attracting an estimated 20,000 downloads overall, according to Sensor Tower.

Marwane Benyssef’s only previous iOS release, Kiosk Food Night Shift, also appears to be a direct copy of an itch.io release.


The App Store listing credited My Baby or Not! to “Marwane Benyssef,” a new iOS developer with no apparent history in the game development community. Benyssef’s only other iOS game, Kiosk Food Night Shift, was released last August and appears to be a direct copy of Kiosk, a pay-what-you-want title that was posted to itch.io last year (with a subsequent “full” release on Steam this year).

In a Reddit post, the team at VoltekPlay said that they had filed a DMCA copyright claim against My Baby or Not! Apple subsequently shared that claim with Benyssef, VoltekPlay writes, along with a message that “Apple encourages the parties to a dispute to work directly with one another to resolve the claim.”

This morning, Ars reached out to Apple to request comment on the situation. Apple has yet to respond, but in the meantime it appears to have removed Benyssef’s developer page and all traces of their games from the iOS App Store.


AMD Radeon RX 9070 and 9070 XT review: RDNA 4 fixes a lot of AMD’s problems


For $549 and $599, AMD comes close to knocking out Nvidia’s GeForce RTX 5070.

AMD’s Radeon RX 9070 and 9070 XT are its first cards based on the RDNA 4 GPU architecture. Credit: Andrew Cunningham

AMD’s Radeon RX 9070 and 9070 XT are its first cards based on the RDNA 4 GPU architecture. Credit: Andrew Cunningham

AMD is a company that knows a thing or two about capitalizing on a competitor’s weaknesses. The company got through its early-2010s nadir partially because its Ryzen CPUs struck just as Intel’s current manufacturing woes began to set in, first with somewhat-worse CPUs that were great value for the money and later with CPUs that were better than anything Intel could offer.

Nvidia’s untrammeled dominance of the consumer graphics card market should also be an opportunity for AMD. Nvidia’s GeForce RTX 50-series graphics cards have given buyers very little to get excited about, with an unreachably expensive high-end 5090 refresh and modest-at-best gains from 5080 and 5070-series cards that are also pretty expensive by historical standards, when you can buy them at all. Tech YouTubers—both the people making the videos and the people leaving comments underneath them—have been almost uniformly unkind to the 50 series, hinting at consumer frustrations and pent-up demand for competitive products from other companies.

Enter AMD’s Radeon RX 9070 XT and RX 9070 graphics cards. These are aimed right at the middle of the current GPU market at the intersection of high sales volume and decent profit margins. They promise good 1440p and entry-level 4K gaming performance and improved power efficiency compared to previous-generation cards, with fixes for long-time shortcomings (ray-tracing performance, video encoding, and upscaling quality) that should, in theory, make them more tempting for people looking to ditch Nvidia.

Table of Contents

RX 9070 and 9070 XT specs and speeds

|  | RX 9070 XT | RX 9070 | RX 7900 XTX | RX 7900 XT | RX 7900 GRE | RX 7800 XT |
| --- | --- | --- | --- | --- | --- | --- |
| Compute units (stream processors) | 64 RDNA4 (4,096) | 56 RDNA4 (3,584) | 96 RDNA3 (6,144) | 84 RDNA3 (5,376) | 80 RDNA3 (5,120) | 60 RDNA3 (3,840) |
| Boost clock | 2,970 MHz | 2,520 MHz | 2,498 MHz | 2,400 MHz | 2,245 MHz | 2,430 MHz |
| Memory bus width | 256-bit | 256-bit | 384-bit | 320-bit | 256-bit | 256-bit |
| Memory bandwidth | 650 GB/s | 650 GB/s | 960 GB/s | 800 GB/s | 576 GB/s | 624 GB/s |
| Memory size | 16GB GDDR6 | 16GB GDDR6 | 24GB GDDR6 | 20GB GDDR6 | 16GB GDDR6 | 16GB GDDR6 |
| Total board power (TBP) | 304 W | 220 W | 355 W | 315 W | 260 W | 263 W |
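As a quick sanity check on the table, memory bandwidth follows directly from bus width and per-pin data rate. A minimal sketch (the ~20.3 Gbps effective rate is inferred by inverting the table’s own 650 GB/s figure rather than taken from an AMD spec sheet):

```python
# Bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 (bits -> bytes).
# The 20.3 Gbps rate here is back-calculated from the table's 650 GB/s figure.
bus_bits = 256
gbps_per_pin = 20.3

bandwidth_gbs = bus_bits * gbps_per_pin / 8
print(f"{bandwidth_gbs:.0f} GB/s")  # ~650 GB/s, matching the table
```

The same formula explains the 7900 XTX’s lead: its 384-bit bus is 50 percent wider, so it gets more bandwidth from the same class of GDDR6.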

AMD’s high-level performance promise for the RDNA 4 architecture revolves around big increases in performance per compute unit (CU). An RDNA 4 CU, AMD says, is nearly twice as fast as an RDNA 2 CU in rasterized performance (that is, rendering without ray-tracing effects enabled) and nearly 2.5 times as fast with ray-tracing effects enabled. Performance for at least some machine learning workloads also goes way up—twice as fast as RDNA 3 and four times as fast as RDNA 2.

We’ll see this in more detail when we start comparing performance, but AMD seems to have accomplished this goal. Despite having 64 or 56 compute units (for the 9070 XT and 9070, respectively), the cards’ performance often competes with AMD’s last-generation flagships, the RX 7900 XTX and 7900 XT. Those cards came with 96 and 84 compute units, respectively. The 9070 cards are specced a lot more like last generation’s RX 7800 XT—including the 16GB of GDDR6 on a 256-bit memory bus, as AMD still isn’t using GDDR6X or GDDR7—but they’re much faster than the 7800 XT was.
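Back-of-the-envelope arithmetic makes the point: multiplying compute units by boost clock (a crude throughput proxy that ignores memory bandwidth and any per-CU changes) shows the 9070 XT’s raw product actually trails the 7900 XT’s, so RDNA 4’s per-CU gains must be carrying the difference.

```python
# Rough CU x clock products from the spec table above. This deliberately
# ignores per-CU IPC, memory bandwidth, and cache differences -- which is
# exactly why the result is interesting: raw CU x clock alone can't explain
# the 9070 XT matching or beating the 7900 XT in our benchmarks.
cards = {
    "RX 9070 XT":  (64, 2970),   # (compute units, boost MHz)
    "RX 9070":     (56, 2520),
    "RX 7900 XTX": (96, 2498),
    "RX 7900 XT":  (84, 2400),
    "RX 7800 XT":  (60, 2430),
}

baseline = cards["RX 7900 XT"][0] * cards["RX 7900 XT"][1]
for name, (cus, mhz) in cards.items():
    print(f"{name}: {cus * mhz / baseline:.2f}x the 7900 XT's raw CU*clock product")
```

The 9070 XT lands around 0.94x the 7900 XT by this crude measure, yet usually beats it in practice.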

AMD has dramatically increased the performance per compute unit for RDNA 4. Credit: AMD

The 9070 series also uses a new 4 nm manufacturing process from TSMC, an upgrade from the 7000 series’ 5 nm process (and the 6 nm process used for the separate memory controller dies in higher-end RX 7000-series models that used chiplets). AMD’s GPUs are normally a bit less efficient than Nvidia’s, but the architectural improvements and the new manufacturing process allow AMD to do some important catch-up.

Both of the 9070 models we tested were ASRock Steel Legend models, and the 9070 and 9070 XT had identical designs—we’ll probably see a lot of this from AMD’s partners since the GPU dies and the 16GB RAM allotments are the same for both models. Both use two 8-pin power connectors; AMD says partners are free to use the 12-pin power connector if they want, but given Nvidia’s ongoing issues with it, most cards will likely stick with the reliable 8-pin connectors.

AMD doesn’t appear to be making and selling reference designs for the 9070 series the way it did for some RX 7000 and 6000-series GPUs or the way Nvidia does with its Founders Edition cards. From what we’ve seen, 2 or 2.5-slot, triple-fan designs will be the norm, the way they are for most midrange GPUs these days.

Testbed notes

We used the same GPU testbed for the Radeon RX 9070 series as we have for our GeForce RTX 50-series reviews.

An AMD Ryzen 7 9800X3D ensures that our graphics cards will be CPU-limited as little as possible. An ample 1050 W power supply, 32GB of DDR5-6000, and an AMD X670E motherboard with the latest BIOS installed round out the hardware. On the software side, we use an up-to-date installation of Windows 11 24H2 and recent GPU drivers for older cards, ensuring that our tests reflect whatever optimizations Microsoft, AMD, Nvidia, and game developers have made since the last generation of GPUs launched.

We have numbers for all of Nvidia’s RTX 50-series GPUs so far, plus most of the 40-series cards, most of AMD’s RX 7000-series cards, and a handful of older GPUs from the RTX 30-series and RX 6000 series. We’ll focus on comparing the 9070 XT and 9070 to other 1440p-to-4K graphics cards since those are the resolutions AMD is aiming at.

Performance

At $549 and $599, the 9070 series is priced to match Nvidia’s $549 RTX 5070 and undercut the $749 RTX 5070 Ti. So we’ll focus on comparing the 9070 series to those cards, plus the top tier of GPUs from the outgoing RX 7000-series.

Some 4K rasterized benchmarks.

Starting at the top with rasterized benchmarks with no ray-tracing effects, the 9070 XT does a good job of standing up to Nvidia’s RTX 5070 Ti, coming within a few frames per second of its performance in all the games we tested (and scoring very similarly in the 3DMark Time Spy Extreme benchmark).

Both cards are considerably faster than the RTX 5070—between 15 and 28 percent for the 9070 XT and between 5 and 13 percent for the regular 9070 (our 5070 scored weirdly low in Horizon Zero Dawn Remastered, so we’d treat those numbers as outliers for now). Both 9070 cards also stack up well next to the RX 7000 series here—the 9070 can usually just about match the performance of the 7900 XT, and the 9070 XT usually beats it by a little. Both cards thoroughly outrun the old RX 7900 GRE, which was AMD’s $549 GPU offering just a year ago.

The 7900 XT does have 20GB of RAM instead of 16GB, which might help its performance in some edge cases. But 16GB is still perfectly generous for a 1440p-to-4K graphics card—the 5070 only offers 12GB, which could end up limiting its performance in some games as RAM requirements continue to rise.

On ray-tracing improvements

Nvidia got a jump on AMD when it introduced hardware-accelerated ray-tracing in the RTX 20-series in 2018. And while these effects were only supported in a few games at the time, many modern games offer at least some kind of ray-traced lighting effects.

AMD caught up a little when it began shipping its own ray-tracing support in the RDNA2 architecture in late 2020, but the issue since then has always been that AMD cards have taken a larger performance hit than GeForce GPUs when these effects are turned on. RDNA3 promised improvements, but our tests still generally showed the same deficit as before.

So we’re looking for two things with RDNA4’s ray-tracing performance. First, we want the numbers to be higher than they were for comparably priced RX 7000-series GPUs, the same thing we look for in non-ray-traced (or rasterized) rendering performance. Second, we want the size of the performance hit to go down. To pick an example: the RX 7900 GRE could compete with Nvidia’s RTX 4070 Ti Super in games without ray tracing, but it was closer to a non-Super RTX 4070 in ray-traced games. That gap has helped keep AMD’s cards from being across-the-board competitive with Nvidia’s—is that any different now?

Benchmarks for games with ray-tracing effects enabled. Both AMD cards generally keep pace with the 5070 in these tests thanks to RDNA 4’s improvements.

The picture our tests paint is mixed but tentatively positive. The 9070 series and RDNA4 post solid improvements in the Cyberpunk 2077 benchmarks, substantially closing the performance gap with Nvidia. In games where AMD’s cards performed well enough before—here represented by Returnal—performance goes up, but roughly proportionately with rasterized performance. And both 9070 cards still punch below their weight in Black Myth: Wukong, falling substantially behind the 5070 under the punishing Cinematic graphics preset.

So the benefits you see, as with any GPU update, will depend a bit on the game you’re playing. There’s also a possibility that game optimizations and driver updates made with RDNA4 in mind could boost performance further. We can’t say that AMD has caught all the way up to Nvidia here—in ray-traced games, the 9070 and 9070 XT are both closer to the GeForce RTX 5070 than to the 5070 Ti, despite staying closer to the 5070 Ti in rasterized tests—but there is real, measurable improvement here, which is what we were looking for.

Power usage

The 9070 series’ performance increases are particularly impressive when you look at the power-consumption numbers. The 9070 comes close to the 7900 XT’s performance but uses 90 W less power under load. It beats the RTX 5070 most of the time but uses around 30 W less power.

The 9070 XT is a little less impressive on this front—AMD has set clock speeds pretty high, and this can increase power use disproportionately. The 9070 XT is usually 10 or 15 percent faster than the 9070 but uses 38 percent more power. The XT’s power consumption is similar to the RTX 5070 Ti’s (a GPU it often matches) and the 7900 XT’s (a GPU it always beats), so it’s not too egregious, but it’s not as standout as the 9070’s.
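The efficiency math is simple but worth spelling out. A minimal sketch, assuming a flat 12.5 percent midpoint for the XT’s "10 or 15 percent" speed advantage:

```python
# Perf-per-watt comparison built only from figures in the review:
# ~12.5% faster (midpoint of "10 or 15 percent") at 304 W vs 220 W TBP.
xt_speedup = 1.125          # 9070 XT vs 9070, assumed midpoint
xt_power_ratio = 304 / 220  # TBP ratio from the spec table, ~1.38x

perf_per_watt_ratio = xt_speedup / xt_power_ratio
print(f"9070 XT delivers ~{perf_per_watt_ratio:.0%} of the 9070's perf-per-watt")
```

By this rough measure, the XT gives up roughly a fifth of the 9070’s efficiency in exchange for those higher clocks.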

AMD gives 9070 owners a couple of new toggles for power limits, though, which we’ll talk about in the next section.

Experimenting with “Total Board Power”

We don’t normally dabble much with overclocking when we review CPUs or GPUs—we’re happy to leave that to folks at other outlets. But when we review CPUs, we do usually test them with multiple power limits in place. Playing with power limits is easier (and occasionally safer) than actually overclocking, and it often comes with large gains to either performance (a chip that performs much better when given more power to work with) or efficiency (a chip that can run at nearly full speed without using as much power).

Initially, I experimented with the RX 9070’s power limits by accident. AMD sent me one version of the 9070 but exchanged it because of a minor problem the OEM identified with some units early in the production run. I had, of course, already run most of our tests on it, but that’s the way these things go sometimes.

By bumping the regular RX 9070’s TBP up just a bit, you can nudge it closer to 9070 XT-level performance.

The replacement RX 9070 card, an ASRock Steel Legend model, was performing significantly better in our tests, sometimes nearly closing the gap between the 9070 and the XT. It wasn’t until I tested power consumption that I discovered the explanation—by default, it was using a 245 W power limit rather than the AMD-defined 220 W limit. Usually, these kinds of factory tweaks don’t make much of a difference, but for the 9070, this power bump gave it a nice performance boost while still keeping it close to the 250 W power limit of the GeForce RTX 5070.

The 90-series cards we tested both add some power presets to AMD’s Adrenalin app in the Performance tab under Tuning. These replace and/or complement some of the automated overclocking and undervolting buttons that exist here for older Radeon cards. Clicking Favor Efficiency or Favor Performance can ratchet the card’s Total Board Power (TBP) up or down, limiting performance so that the card runs cooler and quieter or allowing the card to consume more power so it can run a bit faster.

The 9070 cards get slightly different performance tuning options in the Adrenalin software. These buttons mostly change the card’s Total Board Power (TBP), making it simple to either improve efficiency or boost performance a bit. Credit: Andrew Cunningham

For this particular ASRock 9070 card, the default TBP is set to 245 W. Selecting “Favor Efficiency” sets it to the default 220 W. You can double-check these values using an app like HWInfo, which displays both the current TBP and the maximum TBP in its Sensors Status window. Clicking the Custom button in the Adrenalin software gives you access to a Power Tuning slider, which for our card allowed us to ratchet the TBP up by up to 10 percent or down by as much as 30 percent.
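Applied to this card’s 245 W factory default, the slider’s stated limits work out as follows (only the 245 W figure and the +10/−30 percent bounds come from the text; the arithmetic is ours):

```python
# The Adrenalin Power Tuning slider range for this particular ASRock 9070,
# as described above: +10 percent up, -30 percent down from the 245 W default.
default_tbp = 245.0
tbp_max = default_tbp * 1.10   # +10 percent cap
tbp_min = default_tbp * 0.70   # -30 percent floor

print(f"Custom TBP range: {tbp_min:.1f} W to {tbp_max:.1f} W")
```

That puts the ceiling around 270 W, still comfortably under the 304 W the 9070 XT draws at stock.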

This is all the firsthand testing we did with the power limits of the 9070 series, though I would assume that adding a bit more power also adds more overclocking headroom (bumping up the power limits is common for GPU overclockers no matter who makes your card). AMD says that some of its partners will ship 9070 XT models set to a roughly 340 W power limit out of the box but acknowledges that “you start seeing diminishing returns as you approach the top of that [power efficiency] curve.”

But it’s worth noting that the driver has another automated set-it-and-forget-it power setting you can easily use to find your preferred balance of performance and power efficiency.

A quick look at FSR4 performance

There’s a toggle in the driver for enabling FSR 4 in FSR 3.1-supporting games. Credit: Andrew Cunningham

One of AMD’s headlining improvements to the RX 90-series is the introduction of FSR 4, a new version of its FidelityFX Super Resolution upscaling algorithm. Like Nvidia’s DLSS and Intel’s XeSS, FSR 4 can take advantage of RDNA 4’s machine learning processing power to do hardware-backed upscaling instead of taking a hardware-agnostic approach as the older FSR versions did. AMD says this will improve upscaling quality, but it also means FSR4 will only work on RDNA 4 GPUs.

The good news is that FSR 3.1 and FSR 4 are forward- and backward-compatible. Games that have already added FSR 3.1 support can automatically take advantage of FSR 4, and games that support FSR 4 on the 90-series can just run FSR 3.1 on older and non-AMD GPUs.

FSR 4 comes with a small performance hit compared to FSR 3.1 at the same settings, but better overall quality can let you drop to a faster preset like Balanced or Performance and end up with more frames-per-second overall. Credit: Andrew Cunningham

The only game in our current test suite to be compatible with FSR 4 is Horizon Zero Dawn Remastered, and we tested its performance using both FSR 3.1 and FSR 4. In general, we found that FSR 4 improved visual quality at the cost of just a few frames per second when run at the same settings—not unlike using Nvidia’s recently released “transformer model” for DLSS upscaling.

Many games will let you choose which version of FSR you want to use. But for FSR 3.1 games that don’t have a built-in FSR 4 option, there’s a toggle in AMD’s Adrenalin driver you can hit to switch to the better upscaling algorithm.

Even if they come with a performance hit, new upscaling algorithms can still improve performance by making the lower-resolution presets look better. We run all of our testing in “Quality” mode, which generally renders at two-thirds of native resolution and scales up. But if FSR 4 running in Balanced or Performance mode looks the same to your eyes as FSR 3.1 running in Quality mode, you can still end up with a net performance improvement in the end.
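To put numbers on that trade-off, here’s a sketch of the render resolutions behind each preset at 4K output. The per-axis scale factors are FSR’s published upscaling ratios (Quality 1.5x, Balanced 1.7x, Performance 2.0x); individual games may deviate from these defaults.

```python
# Internal render resolution for each FSR preset at a 3840x2160 output,
# using FSR's documented per-axis upscaling ratios.
output = (3840, 2160)
presets = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

for name, factor in presets.items():
    w, h = (round(d / factor) for d in output)
    print(f"{name}: renders at {w}x{h}, upscaled to {output[0]}x{output[1]}")
```

Quality mode at 4K renders internally at 2560×1440, which is why a better-looking algorithm at Performance mode (1920×1080 internal) can be a net win for frame rates.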

RX 9070 or 9070 XT?

Just $50 separates the advertised price of the 9070 from that of the 9070 XT, something both Nvidia and AMD have done in the past that I find a bit annoying. If you have $549 to spend on a graphics card, you can almost certainly scrape together $599 for a graphics card. All else being equal, I’d tell most people trying to choose one of these to just spring for the 9070 XT.

That said, availability and retail pricing for these might be all over the place. If your choices are a regular RX 9070 or nothing, or an RX 9070 at $549 and an RX 9070 XT at any price higher than $599, I would just grab a 9070 and not sweat it too much. The two cards aren’t that far apart in performance, especially if you bump the 9070’s TBP up a little bit, and games that are playable on one will be playable at similar settings on the other.

Pretty close to great

If you’re building a 1440p or 4K gaming box, the 9070 series might be the ones to beat right now. Credit: Andrew Cunningham

We’ve got plenty of objective data in here, so I don’t mind saying that I came into this review kind of wanting to like the 9070 and 9070 XT. Nvidia’s 50-series cards have mostly upheld the status quo, and for the last couple of years, the status quo has been sustained high prices and very modest generational upgrades. And who doesn’t like an underdog story?

I think our test results mostly justify my priors. The RX 9070 and 9070 XT are very competitive graphics cards, helped along by a particularly mediocre RTX 5070 from Nvidia. In non-ray-traced games, both cards wipe the floor with the 5070 and come close to competing with the $749 RTX 5070 Ti. In games and synthetic benchmarks with ray-tracing effects on, both cards can usually match or slightly beat the similarly priced 5070, partially (if not entirely) addressing AMD’s longstanding performance deficit here. Neither card comes close to the 5070 Ti in these games, but they’re also not priced like a 5070 Ti.

Just as impressively, the Radeon cards compete with the GeForce cards while consuming similar amounts of power. At stock settings, the RX 9070 uses roughly the same amount of power under load as a 4070 Super but with better performance. The 9070 XT uses about as much power as a 5070 Ti, with similar performance before you turn ray-tracing on. Power efficiency was a small but consistent drawback for the RX 7000 series compared to GeForce cards, and the 9070 cards mostly erase that disadvantage. AMD is also less stingy with the RAM, giving you 16GB for the price Nvidia charges for 12GB.

Some of the old caveats still apply. Radeons still take a bigger performance hit, proportionally, than GeForce cards when ray-tracing effects are enabled. DLSS already looks pretty good and is widely supported, while FSR 3.1/FSR 4 adoption is still relatively low. Nvidia has a nearly monopolistic grip on the dedicated GPU market, which means many apps, AI workloads, and games support its GPUs best/first/exclusively. AMD is always playing catch-up to Nvidia in some respect, and Nvidia keeps progressing quickly enough that it feels like AMD never quite has the opportunity to close the gap.

AMD also doesn’t have an answer for DLSS Multi-Frame Generation. The benefits of that technology are fairly narrow, and you already get most of those benefits with single-frame generation. But it’s still a thing that Nvidia does that AMD doesn’t.

Overall, the RX 9070 cards are both awfully tempting competitors to the GeForce RTX 5070—and occasionally even the 5070 Ti. They’re great at 1440p and decent at 4K. Sure, I’d like to see them priced another $50 or $100 cheaper to well and truly undercut the 5070 and bring 1440p-to-4K performance to a sub-$500 graphics card. It would be nice to see AMD undercut Nvidia’s GPUs as ruthlessly as it undercut Intel’s CPUs nearly a decade ago. But these RDNA4 GPUs have way fewer downsides than previous-generation cards, and they come at a moment of relative weakness for Nvidia. We’ll see if the sales follow.

The good

  • Great 1440p performance and solid 4K performance
  • 16GB of RAM
  • Decisively beats Nvidia’s RTX 5070, including in most ray-traced games
  • RX 9070 XT is competitive with RTX 5070 Ti in non-ray-traced games for less money
  • Both cards match or beat the RX 7900 XT, AMD’s second-fastest card from the last generation
  • Decent power efficiency for the 9070 XT and great power efficiency for the 9070
  • Automated options for tuning overall power use to prioritize either efficiency or performance
  • Reliable 8-pin power connectors available in many cards

The bad

  • Nvidia’s ray-tracing performance is still usually better
  • At $549 and $599, pricing matches but doesn’t undercut the RTX 5070
  • FSR 4 isn’t as widely supported as DLSS and may not be for a while

The ugly

  • Playing the “can you actually buy these for AMD’s advertised prices” game


Andrew is a Senior Technology Reporter at Ars Technica, with a focus on consumer tech including computer hardware and in-depth reviews of operating systems like Windows and macOS. Andrew lives in Philadelphia and co-hosts a weekly book podcast called Overdue.

AMD Radeon RX 9070 and 9070 XT review: RDNA 4 fixes a lot of AMD’s problems Read More »

shadowveil-is-a-stylish,-tough-single-player-auto-battler

Shadowveil is a stylish, tough single-player auto-battler

One thing Shadowveil: Legend of the Five Rings does well is invoke terror. Not just the terror of an overwhelming mass of dark energy encroaching on your fortress, which is what the story suggests. More than that, it’s the terror of hoping your little computer-controlled fighters will do the smart thing, then being forced to watch, helpless, as they are consumed by algorithmic choices, bad luck, your strategies, or some combination of all three.

Shadowveil, the first video game based on the more than 30-year-old Legend of the Five Rings fantasy franchise, is a roguelite auto-battler. You pick your Crab Clan hero (berserker hammer-wielder or tactical support type), train up some soldiers, and assign all of them abilities, items, and buffs you earn as you go. When battle starts, you choose which hex to start your fighters on, double-check your load-outs, then click to start and watch what happens. You win and march on, or you lose and regroup at base camp, buying some upgrades with your last run’s goods.

Shadowveil: Legend of the Five Rings launch trailer.

In my impressions after roughly seven hours of playing, Shadowveil could do more to soften its learning curve, but it presents a mostly satisfying mix of overwhelming odds and achievement. What’s irksome now could get patched, and what’s already there is intriguing, especially for the price.

The hard-worn path to knowledge

There are almost always more enemies than you have fighters, so it’s your job to find efficiencies, choke points, and good soldier pairings.

Credit: Palindrome Interactive


Some necessary disclosure: Auto-battlers are not one of my go-to genres. Having responsibility for all the prep, but no control over what fighters will actually do when facing a glut of enemies, can feel punishing, unfair, and only sometimes motivating to try something different. Add that chaos and uncertainty to procedurally generated paths (like in Slay the Spire), and sometimes the defeats felt like my fault, sometimes like the random number generator’s doing.

Losing is certainly anticipated in Shadowveil. The roguelite elements are the items and currencies you pick up from victories and carry back after defeat. With these, you can unlock new kinds of fighters, upgrade your squad members, and otherwise grease the skids for future runs. You’ll have to make tough choices here, as there are more than a half-dozen resources, some unique to each upgrade type, and some you might not pick up at all in any given run.

Shadowveil is a stylish, tough single-player auto-battler Read More »

george-orwell’s-1984-as-a-’90s-pc-game-has-to-be-seen-to-be-believed

George Orwell’s 1984 as a ’90s PC game has to be seen to be believed

Quick, to the training sphere!

The Big Brother announcement promised the ability to “interact with everything” and “disable and destroy intrusive tele-screens and spy cameras watching the player’s every move” across “10 square blocks of Orwell’s retro-futuristic world.” But footage from the demo falls well short of that promise, instead covering some extremely basic Riven-style puzzle gameplay (flip switches to turn on the power; use a screwdriver to open the grate; etc.) played from a first-person view.

Sample gameplay from the newly unearthed Big Brother demo.

It all builds up to a sequence where (according to a walk-through included on the demo disc) you have to put on a “zero-g suit” before planting a bomb inside a “zero gravity training sphere” guarded by robots. Sounds like inhabiting the world of the novel to us!

Aside from the brief mentions of the Thought Police and MiniPac, the short demo does include a few other incidental nods to its licensed source material, including a “WAR IS PEACE” propaganda banner and an animated screen with the titular Big Brother seemingly looking down on you. Still, the entire gameplay scenario is so far removed from anything in the actual 1984 novel as to make you wonder why they bothered with the license in the first place. Of course, MediaX answers that question in the game’s announcement, predicting that “while the game stands on its own as an entirely new creation in itself and will attract the typical game audience, the ‘Big Brother’ game will undoubtedly also attract a large literary audience.”

We sadly never got the chance to see how that “large literary audience” would have reacted to a game that seemed poised to pervert both the name and themes of 1984 so radically. In any case, this demo can now sit alongside the release of 1984’s Fahrenheit 451 and 1992’s The Godfather: The Action Game on any list of the most questionable game adaptations of respected works of art.

George Orwell’s 1984 as a ’90s PC game has to be seen to be believed Read More »

kaizen:-a-factory-story-makes-a-game-of-perfecting-1980s-japanese-manufacturing

Kaizen: A Factory Story makes a game of perfecting 1980s Japanese manufacturing

Zach Barth, the namesake of game studio Zachtronics, tends to make a certain kind of game.

Besides crafting the free browser game Infiniminer, which inspired the entire global Minecraft industry, Barth and his collaborators made SpaceChem, Infinifactory, TIS-100, Shenzhen I/O, Opus Magnum, and Exapunks. Each one of them is some combination of puzzle game, light capitalism horror, and the most memorable introductory-level computer science, chemistry, or logistics class into which you unwittingly enrolled. Each game is its own thing, but they share a certain brain feel, summed up perhaps best by the Zachtronics team itself in a book: Zach-Like.

Barth and his crew have made other kinds of games, including a forward-looking visual novel about AI, Eliza, and multiplayer card battler Nerts!. And Barth himself told PC Gamer that he hates “saying Zach-like.” But fans of refining inputs, ordering operations, and working their way past constraints will thrill to learn that Zach is, in fact, back.

Announcement trailer for Kaizen: A Factory Story.

Kaizen: A Factory Story, from developer Coincidence and comprising “the original Zachtronics team,” puts you, an American neophyte business type, in charge of a factory making toys, tiny electronics, and other goods during the Japanese economic boom of the 1980s. You arrange the spacing and order of operations of the mechanical arms that snap the head onto a robot toy, or the battery onto a Walkman, for as little time, power, and financial cost as possible.

Kaizen: A Factory Story makes a game of perfecting 1980s Japanese manufacturing Read More »

salty-game-dev-comments,-easier-mods-are-inside-command-&-conquer’s-source-code

Salty game dev comments, easier mods are inside Command & Conquer’s source code

Inside the source code are some wonderful reminders of what Windows game development from 1995 to 2003 was really like. One experienced modder posted some gems on Bluesky, like a “HACK ALERT!” text string added just to prevent the Watcom IDE from crashing because of a “magic text heap length” crash: “Who knows why, but it works,” wrote that poor soul.

This writer’s personal favorite is this little bit in the RampOptions.cpp file in Generals, credited to John K. McDonald Jr., which expresses concerns about “TheRampOptions” existing with a set value:

if (TheRampOptions)
    // oh shit.
    return;

In addition to helping out modders and entertaining experienced coders, the GPL-licensed source code releases do a lot to help preserve these games, such that they can be reworked to run on future platforms. Projects like OpenRA and OpenSAGE already offer open source reimplementations of those games’ code, but having the original source can only help. C&C community stalwart Luke “CCHyper” Feenan worked with EA leaders to get the code back into a build-ready state and said in a press release that the updated code should make the classic games easier to patch in the future.

As part of the source code release, the Command & Conquer team also dropped off 35 minutes of newly unearthed alpha and archival footage from the later SAGE-engine-based Generals and Renegade games.

Archival footage from alpha versions of Command & Conquer: Generals and Renegade, released by EA as part of their source code release.

It’s heartening to see that with the right combination of people and purpose, classic games can find renewed interest and longevity inside a big publisher.

Salty game dev comments, easier mods are inside Command & Conquer’s source code Read More »

details-on-amd’s-$549-and-$599-radeon-rx-9070-gpus,-which-aim-at-nvidia-and-4k

Details on AMD’s $549 and $599 Radeon RX 9070 GPUs, which aim at Nvidia and 4K

AMD is releasing the first detailed specifications of its next-generation Radeon RX 9070 series GPUs and the RDNA4 graphics architecture today, almost two months after teasing them at CES.

The short version is that these are both upper-midrange graphics cards targeting resolutions of 1440p and 4K and meant to compete mainly with Nvidia’s incoming and outgoing 4070- and 5070-series GeForce GPUs, including the RTX 4070, RTX 5070, RTX 4070 Ti and Ti Super, and the RTX 5070 Ti.

AMD says the RX 9070 will start at $549, the same price as Nvidia’s RTX 5070. The slightly faster 9070 XT starts at $599, $150 less than the RTX 5070 Ti. The cards go on sale March 6, a day after Nvidia’s RTX 5070.

Neither Nvidia nor Intel has managed to keep its GPUs in stores at their announced starting prices so far, though, so how well AMD’s pricing stacks up to Nvidia in the real world may take a few weeks or months to settle out. For its part, AMD says it’s confident that it has enough supply to meet demand, but that’s as specific as the company’s reassurances got.

Specs and speeds: Radeon RX 9070 and 9070 XT

|  | RX 9070 XT | RX 9070 | RX 7900 XTX | RX 7900 XT | RX 7900 GRE | RX 7800 XT |
| --- | --- | --- | --- | --- | --- | --- |
| Compute units (stream processors) | 64 RDNA4 (4,096) | 56 RDNA4 (3,584) | 96 RDNA3 (6,144) | 84 RDNA3 (5,376) | 80 RDNA3 (5,120) | 60 RDNA3 (3,840) |
| Boost clock | 2,970 MHz | 2,520 MHz | 2,498 MHz | 2,400 MHz | 2,245 MHz | 2,430 MHz |
| Memory bus width | 256-bit | 256-bit | 384-bit | 320-bit | 256-bit | 256-bit |
| Memory bandwidth | 650 GB/s | 650 GB/s | 960 GB/s | 800 GB/s | 576 GB/s | 624 GB/s |
| Memory size | 16GB GDDR6 | 16GB GDDR6 | 24GB GDDR6 | 20GB GDDR6 | 16GB GDDR6 | 16GB GDDR6 |
| Total board power (TBP) | 304 W | 220 W | 355 W | 315 W | 260 W | 263 W |

As is implied by their similar price tags, the 9070 and 9070 XT have more in common than not. Both are based on the same GPU die—the 9070 has 56 of the chip’s compute units enabled, while the 9070 XT has 64. Both cards come with 16GB of RAM (4GB more than the 5070, the same amount as the 5070 Ti) on a 256-bit memory bus, and both use two 8-pin power connectors by default, though the 9070 XT can use significantly more power than the 9070 (304 W, compared to 220 W).

AMD says that its partners are free to make Radeon cards with the 12VHPWR or 12V-2×6 power connectors on them, though given the apparently ongoing issues with the connector, we’d expect most Radeon GPUs to stick with the known quantity that is the 8-pin connector.

AMD says that the 9070 series is made using a 4 nm TSMC manufacturing process and that the chips are monolithic rather than being split up into chiplets as some RX 7000-series cards were. AMD’s commitment to its memory controller chiplets was always hit or miss with the 7000-series—the high-end cards tended to use them, while the lower-end GPUs were usually monolithic—so it’s not clear one way or the other whether this means AMD is giving up on chiplet-based GPUs altogether or if it’s just not using them this time around.

Details on AMD’s $549 and $599 Radeon RX 9070 GPUs, which aim at Nvidia and 4K Read More »