I still haven't seen any proof that DLSS is better than native 4k.
It's a slightly different image, sure. It's often sharper (sometimes to the point of being over-sharpened and showing ringing artifacts),
But I don't see how it's better, especially in the general case.
Maybe it's just me and I have a very different definition of image quality and what's "better".
That said, I've been a huge proponent of DLSS since day one even in the original form, and it's by far the most interesting feature of Turing to me.
Being better than checkerboard rendering isn't a high bar, though; checkerboarding was a messy hack, and I'm glad it died quickly.
The pain it caused during its short lifespan still gives many rendering engineers PTSD.