As AMD and Nvidia start rolling out their latest graphics cards, there is one thing that is clear as day: AMD is moving to reestablish itself as the market leader when it comes to affordability, and I couldn’t be happier.
I’ve had the privilege of playing many of the best PC games of the past few years on the best graphics card on the market, as well as on many of the best cheap graphics cards at any given time, and a number of things are coming into focus that may not have been visible before the RTX era.
First, we’ve all known that graphics cards are getting more expensive, especially high-end cards, and in the Ampere and Big Navi era, the price gap between the two major card makers narrowed (excluding the RTX 3090 and RTX 3090 Ti, which had no competing AMD Radeon RX card to price against).
Also, we can recognize that simply running these cards is becoming much more costly, both in terms of the additional hardware required and your actual electricity bill. In fact, I wrote a scathing op-ed on this topic not too long ago.
Now that AMD has released its Ryzen 7000-series chips, though, and especially after it announced its Radeon RX 7000-series graphics cards, I realize I might have been too hasty in lumping AMD in with the worst offenders in this regard.
When good enough is good enough
A problem that has long dogged top-tier graphics cards is diminishing returns: past a certain point, you have far more power than you really need, and the RTX 4090 is a perfect case in point.
It is unquestionably the most powerful consumer graphics card on the planet, but unless you are a creative professional who needs that level of raw performance, it is absolute overkill.
Yes, it can play Cyberpunk 2077 at 4K with every setting maxed and stay above 40 fps natively, but what’s the point? You can get much better frame rates from an RTX 3080 with DLSS set to Performance mode. And honestly, it looks just as good, especially if you aren’t comparing the two side by side.
And that’s taking into account that Cyberpunk 2077 is one of the most taxing games out there. Most PC games don’t push the envelope nearly as far.
The RX 7900 XTX, meanwhile, looks to land somewhere between the RTX 3090 and RTX 4090 in terms of performance, which is just about all you’ll really ever need for gaming.
Beyond this point, you’re really just paying the extra $600/£600 for the bragging rights: that’s the gap between the RX 7900 XTX’s $999 MSRP and the RTX 4090’s $1,599. Even the Nvidia RTX 4080, which has yet to go on sale, carries a significantly higher MSRP at $1,199. So even against its stated competitor, the Radeon RX 7900 XTX comes out ahead.
Ultimately, if the RX 7900 XT and RX 7900 XTX come anywhere close to the promised performance, it will be very difficult to recommend anything else to anyone but the super-enthusiast set.
About those power cables…
There is also the matter of the 16-pin 12VHPWR power connector, which Nvidia first used late in the RTX 3000 series on the RTX 3090 Ti and has now adopted across its RTX 4000-series cards.
This connector has been in the news lately. The bundled adapter takes four standard 8-pin connectors, the kind that come with all recent power supplies, and combines them into a single 16-pin plug. RTX 4090 customers have purportedly seen their very expensive graphics cards get burned out by failing adapters and, in at least a couple of cases, by native 12VHPWR cables from ATX 3.0 power supplies.
We have not seen anything wrong with the power cable on our RTX 4090 review unit, and without the results of an official investigation by Nvidia and its partners, or any independent tests that can verify the problem, it’s better to treat these as possibly isolated incidents involving those individual cables rather than as a more systemic problem (for now).
But do you know what is a systemic problem? Creating a proprietary-ish power connector that pushes consumers who have already sunk a lot of money into a graphics card toward yet more investment, like an ATX 3.0 power supply with a native 12VHPWR cable. Sure, the card comes with an adapter, but there is something to be said for a graphics card that just uses the same 8-pin connectors everyone else uses, which is the route AMD chose with the RX 7900 XTX. Point, AMD.
And those power requirements…
There is a new benchmark out indicating that the RTX 4090 Ti is on its way, and while it looks impressive by the numbers, the RTX 4090 already has a 450W power requirement, and an overclocked card can pull well north of the half-a-freaking-kilowatt mark. What will an RTX 4090 Ti look like? Do we even want to know at this point?
There are ad campaigns running right now to get those in the UK to flee the high energy costs expected this winter by taking a 30-day trip to Europe because it is cheaper than heating their homes. Is it exaggerated? I have no idea, but the resigned shrugs I’m seeing from some UK colleagues over the prospect of higher energy bills tell me that it’s at least capital-T True if not factually so.
Leaving aside climate change and the many issues inherent in that nightmare, Nvidia and Intel seem to have decided that the way to stay on top is to brute-force it by pushing as much power through their transistors as possible, which is an increasingly expensive proposition.
Even in the US, energy bills are higher than they used to be, and running an obscenely high-powered graphics card or processor (or both) for the pleasure of an extra 30 to 40 fps on top of the 90-plus fps you’d get with a lower-powered card just isn’t a worthwhile trade-off for the vast majority of people.
It was probably the biggest complaint in my aforementioned op-ed, and it’s one that AMD is at least making progress on. Keeping the board power of the RX 7900 XTX to just 335W is incredibly impressive, and if AMD really has squeezed the kind of performance it claims out of that relatively modest power draw, I’m sold.
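If you want to put rough numbers on that difference, here’s a quick back-of-the-envelope sketch in Python. The wattages are the board power figures mentioned above (450W for the RTX 4090, 335W for the RX 7900 XTX); the daily gaming hours and electricity rates are my own assumptions, so substitute your own.

```python
# Back-of-the-envelope GPU running-cost comparison.
# Board power figures are the ones cited above (450W RTX 4090,
# 335W RX 7900 XTX); hours and rates are assumptions -- use your own.

HOURS_PER_DAY = 3          # assumed daily gaming time
RATE_USD_PER_KWH = 0.16    # assumed average US residential rate
RATE_GBP_PER_KWH = 0.34    # assumed UK rate this winter

def annual_cost(watts: float, rate_per_kwh: float) -> float:
    """Yearly electricity cost of a card running at full board power."""
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * rate_per_kwh

for name, watts in [("RTX 4090 (450W)", 450), ("RX 7900 XTX (335W)", 335)]:
    usd = annual_cost(watts, RATE_USD_PER_KWH)
    gbp = annual_cost(watts, RATE_GBP_PER_KWH)
    print(f"{name}: ${usd:.0f} or £{gbp:.0f} per year")
```

Scale the hours or the rate up, or factor in overclocking headroom and a beefier power supply, and the gap between the two only grows.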
Add to that the AMD Ryzen 7000 series: it isn’t the most powerful silicon out there, and it’s not exactly low-wattage, but on efficiency it’s leaps and bounds ahead of Intel’s single-minded, throw-more-power-at-the-problem approach to better performance.
We have yet to see how well the RX 7900 XTX and RX 7900 XT perform, so only time will tell, but at this point, I’m already sold on AMD this generation, and I can’t imagine I’ll be the only one.