Just how powerful are the next-gen consoles compared to PC? We started to answer this question by looking at Assassin's Creed Valhalla, but with the implementation of ray tracing, Call of Duty Black Ops Cold War is another fascinating test case - and the answers are surprising. Join Alex Battaglia for the full low-down.
Written version: https://www.eurogamer.net/articles/digitalfoundry-2021-call-of-duty-black-ops-cold-war-pc-vs-ps5
Laziness hit and I won't do a TL;DW, so here are some quotes:
The volumetric lighting setting controls the resolution of lit volumetric fog in the game, where PS5 is closest to PC's low setting. Water tessellation controls the displacement of water, offering up more detail and this is turned off on PlayStation 5. Other features are engaged though - motion blur on PS5 is equivalent to the PC game with its 'all' setting, yet has fewer samples in motion than the closest 'high' quality level. Meanwhile, texture quality on PS5 is equivalent to PC's maxed out setting, as long as the high quality texture pack is installed.
...ray tracing settings are intriguing. Ray tracing is supported on PS5 and Xbox Series X in the form of ray traced shadows, but I swiftly discovered that even when enabled, RT isn't present throughout the game. For example, it seems that the Fractured Jaw mission on consoles does not use ray tracing at all - presumably because the nature of the content is just too taxing to make the effect viable with a target 60fps. Memory usage of RT is also significant, which may be one reason why Xbox Series S does not feature ray tracing at all. Another aspect to factor in is the additional CPU burden - RT requires a BVH (bounding volume hierarchy), effectively a copy of the scene geometry to shoot rays at in order to calculate the relevant effects. Fractured Jaw is a very complex stage, so the cost of setting up this structure will not be insignificant. Consoles drop back to standard shadow maps here, and it's interesting to note that PS5 seems to possess shadow quality that's in excess of PC's ultra preset.
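To make the BVH point concrete, here's a minimal illustrative sketch of what such an acceleration structure looks like - a tree of axis-aligned bounding boxes built by median split. This is a generic textbook construction, not the engine's actual implementation, and the names (`AABB`, `BVHNode`, `build_bvh`) are my own; the point is that the whole tree has to be (re)built on the CPU from scene geometry, which is where the extra burden on complex stages comes from.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AABB:
    """Axis-aligned bounding box."""
    lo: tuple  # (x, y, z) minimum corner
    hi: tuple  # (x, y, z) maximum corner

def union(a: AABB, b: AABB) -> AABB:
    """Smallest box enclosing both a and b."""
    return AABB(tuple(min(p, q) for p, q in zip(a.lo, b.lo)),
                tuple(max(p, q) for p, q in zip(a.hi, b.hi)))

@dataclass
class BVHNode:
    box: AABB
    left: Optional["BVHNode"] = None
    right: Optional["BVHNode"] = None
    leaf_index: Optional[int] = None  # primitive index if this is a leaf

def build_bvh(boxes: List[AABB], indices: List[int]) -> BVHNode:
    """Recursively build a BVH by median split along the longest axis.

    Every primitive's bounding box is visited at each level, so build
    cost grows with scene complexity - the CPU burden mentioned above.
    """
    if len(indices) == 1:
        i = indices[0]
        return BVHNode(box=boxes[i], leaf_index=i)
    # Bound all primitives belonging to this node.
    bound = boxes[indices[0]]
    for i in indices[1:]:
        bound = union(bound, boxes[i])
    # Split along the longest axis, ordering by box centre.
    axis = max(range(3), key=lambda a: bound.hi[a] - bound.lo[a])
    indices = sorted(indices, key=lambda i: boxes[i].lo[axis] + boxes[i].hi[axis])
    mid = len(indices) // 2
    return BVHNode(box=bound,
                   left=build_bvh(boxes, indices[:mid]),
                   right=build_bvh(boxes, indices[mid:]))
```

Note also that every node stores a full bounding box alongside the geometry it references, which is why the memory overhead of RT is significant on top of the CPU cost.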
However, the air strip set piece in Turkey throws up an interesting anomaly: on PlayStation 5, it seems that DRS is disabled and all of my pixel counts suggest native 4K rendering throughout - and yes, performance can't sustain the 60fps target as a consequence, hitting a 45.6fps average across the sequence. This is our best shot at directly stacking up console vs PC, but there's a very important caveat to factor in - the lower precision effects buffer on PS5, which we can't replicate on PC, and where we can't even begin to measure the possible performance penalty on our GPUs. Put simply, ballpark is the best we're going to get.
Regardless, at the top end, an RTX 3090 delivers an 81.2 per cent boost to performance in this segment at equivalent settings, while RTX 3070 is just 8.6 per cent faster. An RTX 2070 Super can't match PlayStation 5 - in fact, it's 20 per cent slower. On the AMD side of things, I found the RX 6800 XT's result to be off-pace - it has 72 compute units vs the 36 inside PlayStation 5, it's based on the same architecture, and clock speeds are broadly equivalent, yet it delivered just 29.4 per cent extra performance. Whether it's an optimisation issue or a driver issue, I expected more.
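For clarity on how those figures relate to the PS5's 45.6fps average, the quoted numbers are simple percentage boosts over that baseline. A quick sketch - the per-GPU fps values here are back-calculated from the article's percentages for illustration, not independently measured:

```python
PS5_FPS = 45.6  # PS5 average across the air strip sequence, per the article

def percent_boost(gpu_fps: float, baseline_fps: float = PS5_FPS) -> float:
    """Performance advantage over the baseline, as a percentage.

    Positive means faster than PS5, negative means slower.
    """
    return (gpu_fps / baseline_fps - 1.0) * 100.0

# Back-calculated from the quoted boosts (illustrative values only):
rtx_3090_fps = PS5_FPS * 1.812    # +81.2% -> roughly 82.6fps
rtx_2070s_fps = PS5_FPS * 0.80    # -20%   -> roughly 36.5fps

print(round(percent_boost(rtx_3090_fps), 1))   # 81.2
print(round(percent_boost(rtx_2070s_fps), 1))  # -20.0
```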
It's an interesting exercise, but in this case mostly an academic one in several respects. Beyond the difficulty of precisely matching settings, and the fact that we can't access dynamic resolution scaling on PC (more's the pity!), it's not entirely representative of actual use-case scenarios. Take the RTX 2070 Super, for example.