As we approach the end of this generation of graphics cards, there’s a lot of excitement about what’s next for Nvidia and AMD. I’m certainly one of those people eager to see what Team Green and Team Red have in store, especially if they can do more to prioritize energy efficiency and customer value rather than chasing power and performance that neither gamers nor the planet can afford.
That said, I’ve been in a pretty privileged position compared to most people, as I’ve been able to play games on pretty much every current-gen graphics card for work, and so I’ve learned a thing or two about the current state of the market for the best graphics cards, and where the technology needs to go in the next generation.
Ray tracing is still a work in progress
Ray tracing is a fascinating technology with enormous potential for creating stunningly lifelike scenes by mimicking the way our eyes actually perceive light, but it is incredibly computationally expensive.
The number of calculations required to realistically light a scene in real time is enormous, which is why real-time ray tracing was long deemed virtually impossible on consumer-grade hardware. That is, of course, until Nvidia released its Turing architecture with the GeForce RTX 2000 series graphics cards.
As the first generation of consumer graphics cards with real-time ray tracing, it’s understandable that it was an interesting experimental feature, but you really couldn’t do much with it while gaming without absolutely cratering the frame rate. This is still true even as the Nvidia Ampere generation of cards comes to an end.
These cards are better able to handle real-time ray tracing, especially at lower resolutions, but you’ll still need to compromise between resolution and ray tracing. For example, no graphics card can effectively ray trace a scene at native 4K resolution without turning it into a slideshow, not even the RTX 3090 Ti, which manages around 24 fps in Cyberpunk 2077 with ray tracing enabled.
Meanwhile, AMD is on its first generation of graphics card hardware with real-time ray tracing, and its ray tracing performance is more or less where Nvidia’s Turing cards were, which isn’t horrible, but it's still definitely first-gen technology.
Upscaling is the future
(Slideshow: 3 comparison images)
So how does anyone effectively play any of the best PC games in high resolution with ray tracing enabled anyway, if even the best gaming PC possible today is going to suffer?
I’m glad you asked, because the real revolutionary development of the last few years wasn’t ray tracing, but graphics upscaling. Nvidia’s Deep Learning Super Sampling (DLSS) and AMD’s FidelityFX Super Resolution (as well as AMD Radeon Super Resolution) have made PC gaming at high resolutions and settings with ray tracing possible.
In our slideshow above, you can see native 4K with all settings and ray tracing set to their ultra presets, comparing how the game looks without DLSS, with DLSS set to quality, and with DLSS set to performance. As far as I can tell, the difference is barely apparent when running the benchmark or playing the game.
Without upscaling, those with Nvidia GTX 1060s and AMD RX 5700 XTs would honestly have very little reason to upgrade to a new graphics card.
Some of the best games don’t take advantage of this hardware, and those that do can still suck.
The thing about games is that they are rarely about amazing graphics; they are about the experience. The kind of hardware we’re looking at now makes some games look great, but if they’re poorly optimized, what’s the point? You end up with Cyberpunk 2077, a game that was released so broken on PC that it wiped a substantial amount of market value from the studio that made it, CD Projekt Red.
Meanwhile, something like Vampire Survivors can pretty much dominate Steam, even though it looks like it could run on an overclocked NES, largely because it gets to the heart of what makes us want to play games in the first place: we want them to be fun. And the fact is, you don’t need an RTX 3090 Ti to have fun, and I think a lot of us forget that.
If Nvidia and AMD were smart, they would focus less on high-end graphics improvements and more on efficiency and value, so gamers who do want to get the best graphics and performance out of a game can do so without having to spend a fortune. Gamers will be less and less able to afford the best Nvidia GeForce graphics cards and the best AMD graphics cards in the years to come, and it would honestly suck to watch an already expensive hobby become even more inaccessible.