Nvidia's Ampere based RTX 30 series cards may be getting choked by less powerful CPUs

Image via MAHSPOONIS2BIG (Reddit)

AMD Radeon has historically been infamous for CPU driver overhead problems when running last-gen DirectX 11 and OpenGL APIs. However, thanks to newer, lower-level APIs and improved micro-architecture design, the tables may have turned completely, if a test from the popular YouTube channel Hardware Unboxed is any indication.

According to the test, Nvidia's Ampere GPUs - which constitute the RTX 30 series of graphics cards - may be far more easily bottlenecked by less powerful CPUs in comparison to AMD's Radeon cards. Four processors and five graphics cards across multiple generations were used to conduct the test on several modern AAA titles to expose the driver overhead problem on Nvidia's Ampere cards.

The games tested were Guerrilla Games' Horizon Zero Dawn, Ubisoft's Watch Dogs: Legion, Square Enix's Shadow of the Tomb Raider, and more. Each of the three games mentioned here runs on the DirectX 12 API and is known to be very CPU-heavy. It is noteworthy that lower quality graphics presets and lower resolutions were used in these titles so as to ensure that the test wasn't bottlenecked by the GPU.

The hardware parts used in the test and their corresponding architectures are stated below:

  • CPUs:
    • Ryzen 5 1600X (Zen),
    • Ryzen 5 2600X (Zen+),
    • Ryzen 5 5600X (Zen 3),
    • Core i3-10100 (Skylake refresh).
  • GPUs:
    • RX 6900 XT & RX 6800 (RDNA 2),
    • RTX 3090 & RTX 3070 (Ampere),
    • RX 5700 XT & RX 5600 XT (RDNA 1).
Horizon Zero Dawn (Original Quality)

As mentioned earlier, lower presets were used for testing. This is indicated by the "Original Quality" preset for Horizon Zero Dawn (above) and the "Medium Quality" preset for Watch Dogs: Legion (below).

Watch Dogs: Legion (Medium Quality)

While the previous two games were used to measure the framerate differentials across the different architectures, Shadow of the Tomb Raider was instead used to compare CPU usage on two systems - one with an RTX 3070 and the other with an RX 6800, both running Intel's Core i3-10100 processor.

The framerate for this test was locked to 60fps on both systems, and it was observed that the Nvidia side exhibited significantly higher CPU usage to render the frames.

Shadow of the Tomb Raider

It seems pretty evident, at least from these numbers, that Nvidia's Ampere cards suffer from some sort of bottlenecking in CPU-demanding gaming scenarios, irrespective of whether the processor used is Intel or AMD. However, whether this is due to driver overhead or some other issue at play can only be determined by further testing.

It should also be kept in mind that this is a largely academic exercise, since most people running something as powerful as the RX 6900 XT or RTX 3090 probably won't be playing such games with lowered graphics settings. However, even with everything set to ultra quality, such CPU-bound situations can be experienced by people with older, less capable CPUs, and in that case, Radeon cards may be the better alternative for them.

Source and images: Hardware Unboxed (YouTube)
