Hardware Unboxed spent the better part of a year shitting on DLSS and claiming that AMD's image sharpening was a superior solution, even after the problems of the first few games were fixed.
Now whenever they benchmark games with RT support they usually say something along the lines of "well, NVIDIA is clearly ahead of AMD, but who really wants the performance hit of RT anyway?"
If I were NVIDIA, I wouldn't keep them on the list of outlets that get free cards either.
I don't really like the way they're framing this as being "banned" from receiving future samples. They just aren't on the list for getting a free GPU anymore. They aren't banned from anything.
Let me just remind you:
"In its eagerness to showcase just how important HDR (High Dynamic Range) support is for the image quality of the future, NVIDIA set up a display booth at Computex, where it showcased the difference between SDR (Standard Dynamic Range) and HDR images. However, it looks as if the green company was..." (www.techpowerup.com)
The source of this was a Hardware Canucks video where Dmitry claimed that NVIDIA must have intentionally made the SDR image look worse, because a factory reset of the monitor (i.e. taking it out of the calibrated sRGB mode) ended up making it look brighter and more vibrant than the HDR image.
On my ASUS monitor, even the sRGB mode is not entirely accurate: it locks the backlight at 50 when the correct setting for 100 nits is 30.
If you actually look at the comparison, the backlight is cranked up and the image lacks any depth after the factory reset.
An inaccurate SDR image can easily be made brighter and more vibrant than an accurate HDR one, and an HDR display lets you take SDR further out-of-spec.
You need to be comparing an accurate SDR image to an accurate HDR image in a side-by-side.
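To make the brightness argument concrete, here's a minimal sketch of the math involved. The backlight-to-nits mapping is a hypothetical linear model based on my monitor's numbers (real panels vary); the sRGB and PQ (SMPTE ST 2084) transfer functions are the standard ones.

```python
def srgb_to_linear(c: float) -> float:
    """sRGB EOTF: normalized signal (0-1) -> linear light (0-1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def pq_to_nits(e: float) -> float:
    """PQ (SMPTE ST 2084) EOTF: normalized signal (0-1) -> absolute nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

# Hypothetical panel: backlight 30 -> 100 nits white, assumed linear in the
# backlight value, so a factory reset to 50 pushes SDR white well past spec.
NITS_PER_BACKLIGHT = 100 / 30

for backlight in (30, 50, 100):
    peak = backlight * NITS_PER_BACKLIGHT
    mid_gray = peak * srgb_to_linear(0.5)  # 50% sRGB signal
    print(f"SDR @ backlight {backlight}: white {peak:.0f} nits, "
          f"50% gray {mid_gray:.0f} nits")

# Typical HDR scene content sits far below the format's 10,000-nit ceiling:
for signal in (0.5, 0.58):
    print(f"HDR PQ signal {signal:.2f}: {pq_to_nits(signal):.0f} nits")
```

A PQ signal of 0.5 decodes to roughly 92 nits, i.e. most of an HDR image lives well under SDR reference white. Crank the backlight to 50 or beyond and SDR white lands at 165+ nits, which is exactly how an uncalibrated SDR panel can "win" a booth comparison at first glance.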
The benchmark for Boundary, a game still in development, also had issues with reflections on AMD hardware.
I wonder if this is actually a bug, or another of their "optimizations", like having the driver cap tessellation levels back when their GPUs handled it far worse than NVIDIA's (which would often break effects in games, but ran better), or, going much further back, the similar things they did for Quake 3.