In its continuing bid to become the next Crysis, Cyberpunk 2077 recently released the "technology preview" of its RT Overdrive mode. That language is important, because it rightly suggests that most people either can't or shouldn't try out the fully path traced rendering mode for what was already a demanding game. Do you already own one of the best graphics cards? RT Overdrive could leave you longing for an upgrade.
Well, maybe. While the fully path traced version of the game is supposed to make everything look amazing, there's a reason most of the comparisons we've seen show RT Overdrive image quality versus pure rasterization-based rendering. If you're already running RT Ultra settings, the improvements aren't nearly as noticeable — they're still there, but it turns out hybrid rendering does a pretty good job of capturing most of the benefits of ray tracing without having to cast 635 rays per pixel (on average, according to Nvidia).
This is a tour de force for Nvidia's RTX hardware, including the beefed-up ray tracing cores, Shader Execution Reordering, Opacity Micro-Maps, and DLSS 3 Frame Generation. Maybe in two more generations of hardware, this is the sort of rendering we'll use in future state-of-the-art games. But that last item, in particular, deserves a bit more investigation.
As I've stated on several occasions, Frame Generation doesn't feel nearly as fast as the inflated numbers in charts might lead you to believe. You can see a full breakdown of five games with DLSS 3 support in our RTX 4070 review, but my take is that a 50% increase in frames to screen via Frame Generation feels more like a 10–20% improvement in performance. It will look smoother to someone watching over your shoulder than it feels to the person actually playing, though, so maybe it's great for streamers?
But don't be concerned about some of those proprietary technologies limiting access to RT Overdrive mode. For better or worse, you can give path tracing a shot on everything from the original RTX 20-series up through the 40-series, as well as AMD's RX 6000- and 7000-series GPUs and Intel's Arc GPUs. Thankfully, Cyberpunk 2077 also has support for FSR 2.1 and XeSS upscaling algorithms, so there's still a chance that non-Nvidia cards will manage playable framerates.
And that's what we set out to determine: Just how well can the various graphics cards run RT Overdrive mode, and how much upscaling will you need to hit playable framerates?
RT Overdrive Image Quality Enhancements
Let's start with a look at visual fidelity. Here I've captured the game's benchmark on an RTX 4090 running at 1080p using four different settings. Of course, you don't need an RTX 4090 to run Cyberpunk 2077's RT Overdrive mode. That's the point of this story. Also, you don't need RT Overdrive mode; it doesn't make the story or the gameplay in Cyberpunk 2077 any better (or worse). But, spoiler alert: If you want a snowball's chance in Hades of running the game well while using full path tracing, you're probably going to want an Nvidia RTX 40-series or a high-end 30-series part.
Anyway, in the top-left is RT Overdrive running at native 1080p with no upscaling. The top-right has RT Overdrive with DLSS Quality mode and Frame Generation enabled (the "actually useful" mode for most people), while the bottom-left uses the previously maximum quality RT Ultra preset with DLSS Quality upscaling and Frame Generation. Finally, in the bottom right is the Ultra preset (no ray tracing) running at native resolution.
Now, if you watch that video and think, "So, what's all the hubbub about?" you're not alone. To be fair, there are areas where the extra RT effects of Overdrive mode are more apparent; the benchmark sequence isn't a great representation of the biggest changes. But you'll find a lot of the game looks quite similar to the previous RT Ultra hybrid rendering mode. If you want an easier way to compare the quality, here's a gallery of three different scenes, using the same settings as in the video.
First, it's important to note that many of the "differences" in the above images are just part of the dynamic nature of the game engine. There's swirling smoke and pulsing lights in the bar at the start, for example, and those are never the same between runs. The same goes for particle effects in the outside world, and the various NPC entities are randomly generated at runtime, so changes in clothing also occur.
Mostly, you'll notice in those screenshots that RT Overdrive looks similar to RT Ultra, only with much lower performance — and DLSS Quality mode plus Frame Generation mitigates that quite a bit, at least at 1080p with an RTX 4090. There are some nuances, but you have to search for them, and in normal gameplay there are lots of areas where you wouldn't really notice the difference.
The best-case results for Overdrive looking substantially better are in darker areas with multiple light sources. Outside, in sunlight? Not so much. Sure, having muzzle flashes accurately light up the environment can look pretty cool, but it doesn't fix the numerous other complaints people have with the game. It's the age-old story of better graphics not inherently making for a better game.
The Ultra preset without any ray tracing still looks good as well, though the missing graphics effects become easier to spot — screen space reflections, less accurate lighting and shadows, and that sort of thing. The shadows in the bar scene for example have relatively crisp outlines, which isn't how they would actually look in the real world (due to the numerous light sources), and the puddle on the sidewalk doesn't reflect things that aren't visible on the screen.
Cyberpunk 2077 RT Overdrive is supposed to be a look at the future of gaming, and if so, most gamers are probably just fine sticking with hybrid rendering for quite some time. Ray tracing and path tracing do have some advantages in that there's no need to "pre-bake" the lighting: Everything from shadows to reflections to ambient occlusion can be calculated in real time. That could potentially mean less work for artists and level designers, but most gamers would likely prefer saving money on a new graphics card over theoretically trimming a corporation's development costs. And as long as most gamers aren't using full path tracing, developers have to build both hybrid and path traced modes anyway, which ends up costing more, not less.
If the lack of major improvements in graphics fidelity compared to the existing RT Ultra preset leaves you underwhelmed, don't worry. RT Overdrive can look better than I've shown, but due to time constraints I mostly focused on the benchmarking aspect of this article. Overdrive will also provide you with a good excuse to upgrade your graphics card, assuming you're looking for one, because the performance requirements are Mt. Everest steep. Future games may put full path tracing to better use, but that only matters if you can run those games, and for most people that's going to be a stretch. Which brings us to the benchmarks.
Cyberpunk 2077 Overdrive Test Setup
We're using our standard 2023 GPU test PC, with a Core i9-13900K and all the other bells and whistles. For Cyberpunk 2077 RT Overdrive, we're going to include results from a decent collection of Nvidia, AMD, and Intel graphics cards — not everything, but enough to give you a reasonable estimate of where any "missing" cards might land.
We've tested using four different settings: RT Overdrive without any upscaling or Frame Generation, RT Overdrive with DLSS 2 / FSR 2 / XeSS upscaling in Performance mode (4X upscaling), RT Overdrive with DLSS 2 / FSR 2 upscaling in Ultra Performance mode (9X upscaling), and the previously existing RT Ultra settings with DLSS 2 / FSR 2 / XeSS upscaling in Performance mode. For the RTX 40-series GPUs, we'll also run each of those options with Frame Generation (aka DLSS 3) enabled. It's a lot of benchmarks.
We're also testing at 1920x1080, 2560x1440, and 3840x2160, to provide a comprehensive look at performance. There's a catch, however: We're absolutely not going to test every possible DXR-capable graphics card at every one of those settings. We've probably already run too many "useless" benchmarks, considering the early state of the RT Overdrive release. Native rendering will get down into the single digits pretty quickly — hence the inclusion of Ultra Performance upscaling. The 4K Native results for example are going to be somewhat sparse, and they're mostly to show just how demanding that setting would be if we didn't have upscaling algorithms.
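If it helps to visualize the workload, here's a rough sketch of that test matrix in Python. The labels are just shorthand for this article, and as noted above, not every card gets run through every combination.

```python
from itertools import product

settings = [
    "RT Overdrive, native",
    "RT Overdrive, Performance upscaling (4X)",
    "RT Overdrive, Ultra Performance upscaling (9X)",
    "RT Ultra, Performance upscaling (4X)",
]
resolutions = ["1920x1080", "2560x1440", "3840x2160"]

# Every GPU is a candidate for these combinations...
base_runs = list(product(settings, resolutions))
# ...and RTX 40-series cards repeat each one with Frame Generation enabled.
print(len(base_runs))      # 12 potential combinations per card
print(len(base_runs) * 2)  # 24 for the RTX 40-series parts
```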
There are now hundreds of games with some form of upscaling support, so expectations should be pretty familiar by now. The difference is that full path tracing means there's a far stronger correlation between pixels rendered and framerates. 1920x1080 means Cyberpunk 2077 has to render just over two million pixels. Performance mode upscaling cuts that number down to around 500K pixels, while Ultra Performance mode means the GPU only has to spit out about 230K pixels. If scaling were perfectly tied to pixels rendered, we'd see a 4X and 9X improvement in framerates for Performance and Ultra Performance upscaling, respectively.
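If you want to sanity check those pixel counts, the math is simple: Performance mode renders at half the output resolution on each axis, and Ultra Performance at one third. A quick sketch:

```python
def internal_resolution(width, height, axis_divisor):
    """Render resolution before upscaling; the divisor applies to each axis."""
    return width // axis_divisor, height // axis_divisor

for name, divisor in [("Native", 1), ("Performance (4X)", 2), ("Ultra Performance (9X)", 3)]:
    w, h = internal_resolution(1920, 1080, divisor)
    print(f"{name}: {w}x{h} = {w * h:,} pixels per frame")

# Native: 1920x1080 = 2,073,600 pixels per frame
# Performance (4X): 960x540 = 518,400 pixels per frame
# Ultra Performance (9X): 640x360 = 230,400 pixels per frame
```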
Nvidia says RT Overdrive mode on average performs 635 RT calculations per pixel rendered, so running at 640x360 (Ultra Performance mode with 1080p) will be a lot easier, particularly on the less RT-capable GPUs. And with that out of the way, let's get to the benchmarks.
Cyberpunk 2077 RT Overdrive: Native Performance
Disclaimer: This is most assuredly not the way most people will want to play around with RT Overdrive. Still, it's an interesting look at theoretical worst-case ray tracing performance. We've tested all of the cards for the 1080p results, and then we tested progressively fewer cards at 1440p and 4K. Frame Generation (without upscaling) is also an option that we've tested on the RTX 40-series parts.
While we might hope for some measurement of pure ray tracing performance, keep in mind that this is a single game engine, running code that was almost certainly optimized primarily for Nvidia GPUs, so it's definitely not an agnostic look at the ray tracing hardware capabilities. We'll have more to say on that in the "Pure RT" section.
First, the good news: All of the RTX 40-series cards can break 30 fps at native 1080p. Sure, only the RTX 4090 can break 60 fps without help, but turning on Frame Generation boosts performance into the 50 fps and higher range on all of the cards. Nvidia's previous generation RTX 30-series parts are a bit less impressive, with the RTX 3080 just barely hitting 30 fps average while the 3090 gets a little bit of breathing room with a 35 fps result.
AMD and Intel GPUs on the other hand prove completely incapable of handling even 1080p native with full path tracing. The RX 7900 XTX can't even reach 20 fps, though it does just barely manage to beat Nvidia's first generation RTX 2080 Ti. The Arc A750 and A770, along with RX 6950 XT, RTX 2060, and other slower cards all fall into the single digits.
But why stop there? Looking at 1440p native performance, not even the RTX 4080 can break 30 fps without some form of AI enhancement. The RTX 4090 still manages a reasonably playable 44 fps, but nothing else will suffice. Even with Frame Generation, the RTX 4070 just barely squeaks past 30 fps, though the 4080 now reaches 55 fps and the 4090 sits at a comfortable 80 fps.
Finally, at 4K native, even the RTX 4090 manages just 21.4 fps. Frame Generation will push that up to 39 fps, an 82% improvement, but obviously more help is going to be needed.
While we're on the subject of native performance, though, notice a few things. First, performance scales almost perfectly with the number of pixels being rendered. The 4080 and 4090 deliver slightly better than linear scaling, but most of the remaining GPUs match up with the 78% and 125% increase in pixels when going from 1080p to 1440p, and from 1440p to 4K. Let's take two examples.
RX 7900 XTX got 16.7 fps at 1080p and 8.6 fps at 1440p, slightly worse than the target 9.4 fps of linear scaling (a 1.78X decrease) with pixels rendered. From 1440p to 4K, it dropped to 3.7 fps (3.8 fps would have been a 2.25X reduction). Nvidia's RTX 3090 went from 34.9 fps to 20.8 fps, a 1.68X reduction. Then moving to 4K dropped it to 9.2 fps, a 2.26X decrease. That bodes well for upscaling techniques, as Performance mode cuts the number of pixels that need to be path traced down to one-fourth.
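Put another way, with full path tracing you can get a decent first-order performance estimate from pixel counts alone. Here's a minimal sketch of that back-of-the-envelope math, using the RX 7900 XTX figures quoted above:

```python
def pixels(res):
    width, height = res
    return width * height

def estimate_fps(measured_fps, from_res, to_res):
    """Assume framerate scales inversely with pixels rendered (roughly true for path tracing)."""
    return measured_fps * pixels(from_res) / pixels(to_res)

# RX 7900 XTX, RT Overdrive native: 16.7 fps measured at 1080p, 8.6 fps at 1440p.
print(round(estimate_fps(16.7, (1920, 1080), (2560, 1440)), 1))  # 9.4 predicted vs. 8.6 measured
print(round(estimate_fps(8.6, (2560, 1440), (3840, 2160)), 1))   # 3.8 predicted vs. 3.7 measured
```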
The other thing we want to note is that Frame Generation only gets Nvidia so far. The RTX 4070 for example got 33 fps at 1440p with FG enabled, but it still feels and plays like ~19 fps. The same goes for 4K on the 4090: 39 fps "performance" that feels like 21 fps. Also, Frame Generation totally broke down on the RTX 4070 at 4K. While the average increased to 12.6 fps from 7.3 fps, there were severe rendering errors. Our best guess is that the OFA (Optical Flow Accelerator) just isn't cut out to interpolate between frames when there's too much change happening. Or maybe it's just a bug, but either way 13 fps isn't going to be usable.
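As a rough rule of thumb for decoding Frame Generation numbers: DLSS 3 interpolates one generated frame for every rendered frame, so only about half of the displayed frames actually sample your input. Dividing an FG result by two gives a ballpark for how responsive the game will feel (the true rendered rate with FG disabled is typically a touch higher, since Frame Generation adds some overhead of its own):

```python
def felt_fps_estimate(fg_fps):
    """Crude responsiveness estimate: roughly half of the displayed frames are interpolated."""
    return fg_fps / 2

# RTX 4090 at 4K native: 39 fps displayed with Frame Generation, 21.4 fps measured without it.
print(felt_fps_estimate(39))  # 19.5, in the same ballpark as the "feels like 21 fps" note above
```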
Cyberpunk 2077 RT Overdrive: Performance Upscaling
Going from native to Performance mode upscaling is a big jump. 4K should perform roughly the same as 1080p, for example (minus a bit due to the upscaling computations). That's not enough to push everything into the "playable" range, even at 1080p, but it's a start.
Obviously, Nvidia's 40-series parts with Frame Generation enabled look amazing if you're only seeing this chart. Again, those huge gains don't necessarily correlate directly with the feel of the game, but given we're seeing base performance (without FG) of 86 fps on the RTX 4070, you probably won't notice the slight increase in latency, and if you have a 144Hz or 240Hz monitor, you might even notice the slight benefit in higher frames to screen.
Without Frame Generation, the RTX 4090 now hits 142 fps, which is getting close to the CPU limit. The RTX 3090 and 3080 now easily break 60 fps, even on 1% lows, while the RTX 3070 just barely edges past the 60 fps mark. Note also that the RTX 4070, which was tied with the 3080 on average fps and slightly slower on 1% lows, now holds a slight performance advantage. That's likely thanks to the superior ray tracing hardware and tensor cores in the Ada Lovelace architecture.
AMD's RX 6800 XT and up are now technically playable at 30 fps or more, though really it's only the 7900 XT/XTX that truly reach acceptable levels of performance. Even an RTX 3060 comes out ahead of the RX 6950 XT, while the rather weak RTX 3050 nearly matches the RX 6800 XT. But there are plenty of GPUs that still need more help, even for 1080p gaming with path tracing.
Before we go there, let's quickly check the 1440p and 4K results. Again, 40-series with DLSS 3 Frame Generation enabled all reach very good performance levels — you can definitely play Cyberpunk 2077 with RT Overdrive settings at 1440p, provided you turn on upscaling and Frame Generation. The previous generation RTX 3090 does barely manage to break 60 fps, and again the RTX 4070 (without FG) comes in just a bit ahead of the RTX 3080. That's likely due to the SER and OMM features supported on Ada, which the RTX 30-series lacks.
AMD's RX 7900 XTX is the only non-Nvidia card to break 30 fps, while Intel's best solution can't even break into the teens. But let's quickly note that FSR2 provided better performance on the Arc A770 than XeSS 1.1. That's an interesting result, but it's tempered by the fact that XeSS looked better than FSR2: 4X upscaling without AI (meaning, FSR2's algorithm) results in more noticeable "sparkling" on the edges of objects. Not that it matters too much, since even Performance upscaling at 1080p wasn't playable on Arc GPUs.
And finally, at 4K with Performance upscaling, the RTX 4090 still averages 65 fps without Frame Generation — about 7% slower than the native 1080p result, if you're keeping track. FG boosts that by 58%, while the RTX 4080 sees a slightly higher 62% uplift and the RTX 4070 gains 61%. Basically, the OFA hardware is the same on all of the 40-series parts, so the potential gains at a given resolution are generally in the same ballpark. As for previous generation GPUs, the RTX 3090 lands at 32 fps, technically still playable but not a great result.
Cyberpunk 2077 RT Overdrive: Ultra Performance Upscaling
It's almost silly to talk about Ultra Performance upscaling (a 9X upscaling factor) as something you'd have to use, but here we are. The mode was originally intended "for 8K," yet many of the non-Nvidia GPUs (and even some of the slower Nvidia models) simply don't have the muscle to fully ray trace / path trace even a 1920x1080 frame. Chop that down to 640x360, on the other hand, and suddenly we're in business.
Image fidelity is not great with this level of upscaling, but at least it's a foot in the door if you want to see for yourself how Cyberpunk 2077 looks in RT Overdrive mode. It's also worth noting that FSR2 doesn't seem to look as good with this upscaling mode as DLSS. That's not particularly shocking, as tripling the horizontal and vertical resolutions of the source content will always prove challenging, especially if you want to do that in a very short amount of time (i.e. at 60 fps).
First, notice that the RTX 4080 and 4090 are basically maxed out for 1080p performance, but interestingly the 4090 still has higher Frame Generation performance. That suggests that maybe the OFA isn't the only thing involved with Frame Generation, and the extra shader cores on the 4090 (or tensor cores) still help.
With the base resolution now set to 640x360 (before 9X upscaling), everything we tested manages to hit 30 fps, though that's not saying much for the RTX 2060, RX 6700 XT, and Arc GPUs. Nvidia's slowest RTX card is definitely held back by only having 6GB of VRAM, as in other games it generally beats the RTX 3050. Then again, Ampere has some architectural updates that favor the 3050 over the Turing parts, and we might be seeing those here: The 3070 and 2080 Ti typically deliver similar levels of performance, but here the 3070 leads by 28%.
AMD's RX 6700 XT meanwhile represents about the lowest GPU from AMD we'd even consider using with complex ray tracing turned on; the RX 6650 XT and below still support the feature, but it's of little practical use (and the same goes for the Intel Arc A380). Also notice that Intel's two Arc GPUs outperform the 6700 XT in this particular workload, another indication of Intel's better RT support. We suspect Intel will be able to further improve performance in this mode as well, maybe not to RTX 3060 levels but probably at least placing ahead of the RTX 3050.
Moving up to 2560x1440, with a base resolution of 853x480, performance ends up being pretty similar to what we saw at 1080p with Performance mode upscaling (i.e. from 960x540). Most of the GPUs are slightly faster at 1440p with Ultra Performance upscaling, and the RTX 3050 now clears the 30 fps mark, but otherwise the standings remain the same.
Finally, 3840x2160 with Ultra Performance mode (upscaled from 1280x720) also looks nearly the same as 1440p with Performance mode (also upscaled from 1280x720). You'll still need an RTX 3070 or RX 7900 XTX to break the 30 fps barrier in either case, but the upscaling work ends up being a bit more demanding at the higher output resolution.
"Pure" Ray Tracing Performance
We noted that Cyberpunk 2077's Overdrive setting is our first look at how demanding future games might become when they implement "pure" ray tracing engines. This isn't the first game to use what Nvidia calls path tracing, but the previous attempts have all involved far less complex environments. Quake II RTX builds on the oldest game, with the original Quake II launching back in 1997. Minecraft RTX came next among the path traced releases, and while the original game launched in 2009, it was intentionally less demanding on the graphics side of the equation (prior to ray tracing). Even Portal RTX is built on a game that originally launched in 2007.
So making the jump to a game engine like Cyberpunk 2077, which came out in 2020 and was already one of the more demanding games around, represents a major step forward. And the graphics still don't look massively different, but it proves that we now have hardware (RTX 4080 and 4090 in particular) that can legitimately provide a good gaming experience via full "path tracing" in 2023.
But we also have at least one reasonable synthetic benchmark that provides a slightly different look at what full ray tracing performance might look like from the various GPUs. 3DMark Port Royal has a separate DXR Feature Test that runs at 2560x1440 and implements full ray tracing in real time, with 20 rays per pixel as the maximum setting. Here's how Cyberpunk 2077 Overdrive performance at native 1080p looks compared with the DXR Feature Test, on the same collection of GPUs.
If we're talking about hardware specific optimizations, we have to think that 3DMark is going to be far more agnostic than Cyberpunk 2077. Also, we've dropped the Frame Generation results from the charts, since we're interested in native full ray tracing performance. Still, there are a lot of differences between these two charts.
First, the gap between the 4090 and 4080 becomes quite a bit larger with the 3DMark DXR Feature Test. Curiously, raw fps on a lot of the Nvidia cards ends up being pretty close in the two benchmarks, but Cyberpunk 2077 has the RTX 4090 as one major exception, topping out at 70 fps compared with 85 fps in 3DMark. That's a 21% difference, where the other RTX cards range from almost no difference at all (1% on the 3090) to at most 11% (the 2060, which probably has issues with 6GB VRAM in Cyberpunk 2077).
But the differences for non-Nvidia hardware are quite a bit more dramatic. In the 3DMark test, AMD's RX 7900 XTX performs 80% better than it does in Cyberpunk 2077. Similarly large gaps appear on other AMD GPUs: 7900 XT is 86% faster in 3DMark, the 6950 XT is 142% faster, 6800 XT is 163% faster, and the 6700 XT is three times as fast. On the Intel Arc GPUs, the A770 16GB is 236% faster in 3DMark, while the A750 is 'only' 205% faster.
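For clarity, those figures are simply the relative gap between each card's native framerate in the two tests. A worked example using the RTX 4090 numbers quoted above:

```python
def pct_faster(fps_3dmark, fps_cyberpunk):
    """How much faster (in percent) a card runs the 3DMark DXR Feature Test than Cyberpunk 2077 Overdrive."""
    return (fps_3dmark / fps_cyberpunk - 1) * 100

# RTX 4090: 85 fps in the 3DMark DXR Feature Test vs. 70 fps in Cyberpunk 2077 Overdrive (native).
print(round(pct_faster(85, 70)))  # 21 (percent), matching the gap mentioned earlier
```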
Now, that doesn't necessarily mean Cyberpunk 2077 was intentionally optimized for Nvidia's hardware... but that's the most likely explanation for most of the difference. Driver optimizations, or rather the lack thereof, for Cyberpunk's Overdrive mode are another possibility, but regardless there are plenty of reasons to take the current AMD and Intel results in this particular game with a healthy dose of skepticism.
Cyberpunk 2077 RT Ultra: Performance Upscaling
Finally, just to provide a more realistic view of real-world gaming performance, here are results using Cyberpunk 2077's previous RT Ultra preset, which doesn't implement full path tracing. We've run this test a lot over the years, though unlike our usual GPU benchmarks, we're enabling Performance mode upscaling on everything.
If you've previously played Cyberpunk 2077 at maxed out settings with upscaling and managed to get far more playable results than what we're showing with RT Overdrive mode, that's because RT Ultra mode is far less taxing on the ray tracing hardware.
Intel's Arc GPUs run about 3X faster in RT Ultra mode at 1080p, and up to 4X faster at 1440p. AMD's RX 7900 cards are over twice as fast at 1080p, 3X faster at 1440p, and almost 4X faster at 4K. The same mostly applies with the RX 6000-series GPUs we've tested.
Cyberpunk 2077 Closing Thoughts
For what should be obvious reasons (see the "Pure RT" section), we're not convinced these performance results are fully representative of non-Nvidia GPUs. That's fine in some ways, as Nvidia likely helped out a lot with the programming side of things, not to mention this is labeled as a "Technology Preview." Still, previews can be used for marketing purposes, which shows once more that it's absolutely possible to wildly skew a game engine to favor one architecture over others.
Even disregarding the performance questions, I really do wish the visual difference between RT Overdrive and RT Ultra modes was more noticeable. Unfortunately, the built-in benchmark shows very little in the way of changes, other than cutting framerates in half (or far worse in some cases). For now, if you're not using an Nvidia GPU, the path tracing technology preview is more of a curiosity than anything you should take seriously.
Looking forward, though, if we can already get a full path tracing engine implemented in Cyberpunk 2077, certainly other games could try this as well. I'd love to see a developer try to create a game from the ground up with the goal of using full ray tracing, with no hybrid rendering support, and then hear some thoughts on how much that did or didn't help with the creation of art assets, level design, and such.
The bottom line is that we're still a solid generation or two away from full ray tracing being viable for even half of PC gamers. Yes, RTX 4090 can chew through Cyberpunk 2077 in RT Overdrive mode and provide pretty impressive performance even at 4K — provided you enable 4X upscaling and DLSS 3 Frame Generation. But the latest Steam Hardware Survey puts the number of gamers with an RTX 4090 at just a quarter of a percent.
In fact, if you sum up all of the RTX 40-series results, it's still less than 1% of gamers. (That's using the February 2023 results, as the March results look a bit mangled.) Take that a step further; at best, 44% of Steam users have a card with any ray tracing support. Drop everything below the RTX 2080 and RTX 3060 (because such cards are "too slow" for DXR), and only about a quarter of gamers have at least reasonably fast ray tracing support.
But we're clearly improving in performance and features, and as questionable as Frame Generation can be in some situations, such technologies aren't going away. That goes double if you're looking at fully ray traced games. Consider that the RTX 3060 performs about the same as an RTX 2070 Super and project that forward. Once we have "RTX 5060" GPUs performing at around RTX 3080 levels and selling for (hopefully / we're dreaming) $350, ray tracing could truly reach the tipping point.