Ok, so after the Xbox Series X announcement, I'm pretty worried about any PC GPU below an RTX 2080.
We've seen a Hellblade 2 trailer that seemingly sports full ray tracing while running at 4K and, frankly, puts every other game out there to shame. We also saw a PS5 title that supports fully ray-traced reflections AND runs at 4K60, while (and this is the important part) current games like Shadow of the Tomb Raider, BF5, Metro and Control look much worse than that and also take a big performance hit with RTX on (even if it's just reflections, like in that PS5 title). And those can't run at native 4K60 with RTX on even on a 2080 Ti, despite being much simpler games that use fewer RT techniques.
Furthermore, according to insiders, the leaked specs point to 12 RDNA2 TFLOPs, which puts the consoles in a league between the RTX 2080 and RTX 2080 Ti, which is just crazy.
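For anyone wondering where numbers like "12 TFLOPs" come from: peak FP32 throughput is just compute units × shaders per CU × 2 FLOPs per clock (fused multiply-add) × clock speed. A quick sketch (the CU count and clock below are hypothetical examples to show the math, not confirmed console specs):

```python
# Rough sketch of how peak FP32 TFLOPs figures are derived.
# The CU count and clock speed used below are hypothetical, not confirmed specs.

def tflops(compute_units, clock_ghz, shaders_per_cu=64, flops_per_clock=2):
    """Peak FP32 TFLOPs = CUs * shaders/CU * 2 FLOPs (FMA) * clock in GHz / 1000."""
    return compute_units * shaders_per_cu * flops_per_clock * clock_ghz / 1000

# Example: a hypothetical 52-CU RDNA2 part at 1.8 GHz lands right around 12 TFLOPs.
print(round(tflops(52, 1.8), 2))  # ~11.98
```

Keep in mind TFLOPs are a peak theoretical number, so cross-architecture comparisons (RDNA2 vs Turing) are rough at best.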
So is it safe to say that AMD's real-time RT solution puts the one in Turing to shame? The raw performance also destroys every card below a 2080.
Sure, Turing also supports VRS and mesh shaders, which could improve performance further, but is that really enough to keep up with the consoles?
Or maybe Microsoft and AMD found a way to drastically improve ray tracing performance in DXR, and Nvidia will accelerate that further with its RT cores, ending up with RT performance on par with RDNA2 or even better? There's also Crytek's version of RT, which is much more efficient than RTX and runs even without RT cores, so maybe next gen will be more like that, but with dedicated hardware?
What do you think?