Reckheim

Avenger
Oct 25, 2017
9,374
It's a game changer for sure.

I'm blown away by what they've done with Death Stranding at the moment. 4K DLSS running at a locked 60fps, everything set to ultra.

what card?
 

Zedark

Member
Oct 25, 2017
14,719
The Netherlands
Yeah I don't get it, personally. I don't really understand how it's possible.

I play a few games with DLSS 2.0 that didn't support it initially (BFV), and the difference is stark.

Rarely does something affect performance as much as this has, especially with raytracing enabled.
The hidden magic is that, at some remote place and time, NVIDIA and the devs computed a sort of ground truth at a much higher resolution than 4K, which contains far more detail than a 4K image can hold. The DLSS algorithm then uses that information to reconstruct the image at a given resolution, drawing on detail deduced from a reference with more detail than a native 4K frame. The result is that certain aspects of the image end up enhanced beyond native 4K quality, because DLSS has more information about the detail than native rendering does, which is why it can look better than native in certain respects. It really is a wonderful application of machine learning imo.
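To make the idea concrete, here's a tiny, purely illustrative sketch of what "training against a higher-than-4K ground truth" could look like. This is PyTorch-style pseudocode where every name is hypothetical; it is not NVIDIA's actual model, just the general shape of training an upscaler offline against a supersampled reference:

```python
# Illustrative only: a toy upscaler trained against a reference rendered above
# the target resolution. All names are hypothetical; this is NOT NVIDIA's model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    """Toy convolutional upscaler: low-res frame in, higher-res frame out."""
    def __init__(self, scale: int = 2):
        super().__init__()
        self.scale = scale
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * scale * scale, 3, padding=1),
        )

    def forward(self, lowres):
        # pixel_shuffle rearranges channels into a (scale x) larger image
        return F.pixel_shuffle(self.body(lowres), self.scale)

def training_step(model, optimizer, lowres_frame, supersampled_reference):
    """One offline training step against a very-high-resolution reference.

    `supersampled_reference` stands in for the ground truth rendered far above
    the output resolution (then resized to the output size), which carries more
    detail than a native 4K render would.
    """
    optimizer.zero_grad()
    prediction = model(lowres_frame)
    loss = F.l1_loss(prediction, supersampled_reference)
    loss.backward()
    optimizer.step()
    return loss.item()
```

The point of the sketch is just that the network's "knowledge" of fine detail comes from that offline reference, not from the 4K frame it is reconstructing at runtime.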
 

Deleted member 14089

Oct 27, 2017
6,264
Indeed, it really feels like magic. Moreover, I've seen the difference between DLSS 1.x and DLSS 2.0 in Control, and it really makes a difference.
Wolfenstein Youngblood also looked amazing with DLSS, and now I pretty much play every game with it.

DirectML so far hasn't been demonstrated to run real-time on AMD hardware. Hopefully, that will be demonstrated soon and is also "baked-in" on the next-gen consoles (but I doubt it).
 

His Majesty

Member
Oct 25, 2017
12,171
Belgium
Damn, just realized Cyberpunk will have DLSS.



2070 Super.
Yeah, was announced last month.
www.nvidia.com

Cyberpunk 2077: Ray-Traced Effects Revealed, DLSS 2.0 Supported, Playable On GeForce NOW

See a new ray-traced trailer and screenshots, learn more about the PC ray tracing effects, and discover how you’ll be able to play Cyberpunk 2077 across all your devices with GeForce NOW.

I was hesitant on upgrading my 1080 Ti but now I feel basically obligated to buy an RTX card.
 

seroun

Member
Oct 25, 2018
4,464
What videos do you recommend to see the potential of DLSS? I've seen the feature discussed but I wanna see for myself.
 

Techno

The Fallen
Oct 27, 2017
6,409
Yeah, was announced last month.
www.nvidia.com

Cyberpunk 2077: Ray-Traced Effects Revealed, DLSS 2.0 Supported, Playable On GeForce NOW

See a new ray-traced trailer and screenshots, learn more about the PC ray tracing effects, and discover how you’ll be able to play Cyberpunk 2077 across all your devices with GeForce NOW.

I was hesitant on upgrading my 1080 Ti but now I feel basically obligated to buy an RTX card.

Yeah dude, it's basically a must buy now lol.
 

tokkun

Member
Oct 27, 2017
5,400
Yeah I don't get it, personally. I don't really understand how it's possible.

It took me a while to understand it as well. I don't think the media has done a good job of communicating it. Here is what I have concluded:

The "native resolution" results are using some kind of high-performance but low-quality anti-aliasing method. Typically something like TAA (temporal anti-aliasing). TAA introduces some unintended blurring of details.
DLSS 2.0 is actually very similar to TAA in concept, but it uses a neural network rather than a pre-baked algorithm. The neural network turns out to be far superior to the current TAA algorithm and causes much less unintended blurring of details. It is so much better than traditional TAA that the image may look better even when rendered at a lower resolution and upscaled.

Ultimately the reason DLSS 2.0 can sometimes look better than native is because of how bad TAA is. If you were to render the native resolution image using a higher quality form of anti-aliasing, such as SSAA, then the native resolution image would always win in quality. However the performance cost of SSAA is very high.

What DLSS 2.0 has exposed is that our existing algorithms for doing TAA are woefully sub-optimal. In theory we could figure out a different algorithm that worked as well as DLSS 2.0, but without a neural network. However finding this algorithm is easier said than done.
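For anyone curious what a hand-written temporal accumulation step actually looks like, here's a minimal sketch (NumPy, all names made up, not any engine's real TAA). The fixed blend heuristic below is exactly the kind of thing that causes TAA's blur and ghosting, and it's the decision DLSS 2.0 effectively replaces with a learned model:

```python
# Minimal, illustrative TAA-style resolve: blend a new jittered frame into the
# reprojected history with a fixed weight. Hypothetical names throughout.
import numpy as np

def taa_resolve(history, current, motion_reject, alpha=0.1):
    """history       : accumulated previous frames, reprojected to this frame (H, W, 3)
    current       : the new, aliased frame at this frame's jitter offset (H, W, 3)
    motion_reject : per-pixel mask in [0, 1]; 1 = history invalid (disocclusion, fast motion)
    alpha         : fixed blend weight; small alpha = smooth but blurry/ghosty
    """
    blend = alpha + (1.0 - alpha) * motion_reject  # fall back to current frame where history is bad
    return blend * current + (1.0 - blend) * history

# Toy usage on dummy buffers:
history = np.zeros((2160, 3840, 3), dtype=np.float32)
current = np.random.rand(2160, 3840, 3).astype(np.float32)
mask = np.zeros((2160, 3840, 1), dtype=np.float32)  # assume no disocclusions this frame
resolved = taa_resolve(history, current, mask)
```

The fixed `alpha` and the crude rejection mask are the hand-tuned heuristics; a neural network can make that accept/reject/blend decision per pixel far better, which is why it recovers detail the heuristic smears away.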
 

Zedark

Member
Oct 25, 2017
14,719
The Netherlands
Regarding DLSS on AMD: the algorithm itself isn't coming, as it's NVIDIA technology. However, we know from Microsoft that the Series X can perform, in its ALUs, computations similar to what NVIDIA GPUs do in their tensor cores. Of course, this means you lose general graphics compute (the work NVIDIA GPUs keep on their ALUs), but it does accelerate any machine learning algorithm, so it's a comparable if inferior solution. You can bet that Microsoft will use DirectML as a way of pushing a reconstruction similar to DLSS 2.0 in the coming years, and AMD GPUs should probably benefit from this. However, the NVIDIA solution is likely to be more efficient imo.
 

EduBRK

Member
Oct 30, 2017
981
Brazil
If the 3080 Ti / 3090 has more tensor cores (rumoured a lot more), does it make the implementation even more performant? Like, reconstruct from 720p to 4k? Or just better image quality?
 

z0m3le

Member
Oct 25, 2017
5,418
Indeed, it really feels like magic. Moreover, I've seen the difference between DLSS 1.x and DLSS 2.0 in Control, and it really makes a difference.
Wolfenstein Youngblood also looked amazing with DLSS, and now I pretty much play every game with it.

DirectML so far hasn't been demonstrated to run real-time on AMD hardware. Hopefully, that will be demonstrated soon and is also "baked-in" on the next-gen consoles (but I doubt it).
It won't be. Even the XSX at peak performance, using rapid packed math (assuming 8-bit is possible), would be roughly 4x slower at this than DLSS on the RTX 2060, which already needs about 3ms to create a 4K image. With a 16ms window for 60fps, that's not something the next-gen consoles could do at 60fps: the XSX would take over 10ms to produce a 4K reconstruction, and it might not even be possible at 30fps (33ms window). This really requires the fixed-function efficiency of tensor cores. A less effective DLSS 1.9 was done on CUDA cores, which are not fixed function, and the errors it makes are noticeable to the naked eye, but combined with sharpening, next-gen consoles might be able to reconstruct 1440p+ into 4K without it being too noticeable.
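Putting that frame-budget argument into numbers (all figures are the rough, unverified claims from the paragraph above, not measurements):

```python
# Back-of-envelope version of the argument above, using the post's own numbers.
rtx2060_dlss_ms = 3.0         # claimed cost to reconstruct a 4K frame on an RTX 2060
xsx_slowdown = 4.0            # claimed ratio without tensor cores (8-bit rapid packed math)
xsx_dlss_ms = rtx2060_dlss_ms * xsx_slowdown   # ~12 ms per reconstructed frame

budget_60fps_ms = 1000.0 / 60  # ~16.7 ms
budget_30fps_ms = 1000.0 / 30  # ~33.3 ms

print(f"XSX reconstruction: ~{xsx_dlss_ms:.0f} ms")
print(f"Left for rendering at 60fps: ~{budget_60fps_ms - xsx_dlss_ms:.1f} ms")  # ~4.7 ms -> not viable
print(f"Left for rendering at 30fps: ~{budget_30fps_ms - xsx_dlss_ms:.1f} ms")  # ~21.3 ms -> maybe
```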

As some others have mentioned, this gets really exciting with Nintendo: with a hybrid console using DLSS to 'catch up', built on 7nm+ or even 5nm+, we could see the successor to the Switch land inside the ballpark of the next-generation consoles. Probably closer than the XB1 was to the PS4.
 

J-Skee

The Wise Ones
Member
Oct 25, 2017
11,102
Yeah, was announced last month.
www.nvidia.com

Cyberpunk 2077: Ray-Traced Effects Revealed, DLSS 2.0 Supported, Playable On GeForce NOW

See a new ray-traced trailer and screenshots, learn more about the PC ray tracing effects, and discover how you’ll be able to play Cyberpunk 2077 across all your devices with GeForce NOW.

I was hesitant on upgrading my 1080 Ti but now I feel basically obligated to buy an RTX card.
DLSS IS SUPPORTED ON GEFORCE NOW?!?!?! 👀👀👀

Someone please stop me from cancelling my PS4 pre-order.
 

tokkun

Member
Oct 27, 2017
5,400
If the 3080 Ti / 3090 has more tensor cores (rumoured a lot more), does it make the implementation even more performant? Like, reconstruct from 720p to 4k? Or just better image quality?

We actually don't know. With more tensor cores you could use a more complicated neural network, but we don't know whether a more complicated network would produce noticeably better results than the current one.
 

Shake Appeal

Member
Oct 27, 2017
3,883
First game I played on my new PC was Control with DLSS and all the raytracing features on, and it was revelatory. And that was only DLSS 1.0.
 

Zedark

Member
Oct 25, 2017
14,719
The Netherlands
It took me a while to understand it as well. I don't think the media has done a good job of communicating it. Here is what I have concluded:

The "native resolution" results are using some kind of high-performance but low-quality anti-aliasing method. Typically something like TAA (temporal anti-aliasing). TAA introduces some unintended blurring of details.
DLSS 2.0 is actually very similar to TAA in concept, but it uses a neural network rather than a pre-baked algorithm. The neural network turns out to be far superior to the current TAA algorithm and causes much less unintended blurring of details. It is so much better than traditional TAA that the image may look better even when rendered at a lower resolution and upscaled.

Ultimately the reason DLSS 2.0 can sometimes look better than native is because of how bad TAA is. If you were to render the native resolution image using a higher quality form of anti-aliasing, such as SSAA, then the native resolution image would always win in quality. However the performance cost of SSAA is very high.

What DLSS 2.0 has exposed is that our existing algorithms for doing TAA are woefully sub-optimal. In theory we could figure out a different algorithm that worked as well as DLSS 2.0, but without a neural network. However finding this algorithm is easier said than done.
What you also need to take into account is that DLSS can be fundamentally better than native rendering in certain specific details (for example hair rendering) due to its ability to reconstruct detail from a much higher resolution ground truth image. However, the algorithm also has artifacts, especially in motion, so you don't get a super-resolution result overall, and it is typically quite comparable to native res on average.
 

Anddo

Member
Oct 28, 2017
2,854
Game changer for sure.
Currently playing Death Stranding at 1440P/144hz with zero framedrops. Insanity.
 
Nov 8, 2017
13,099
I think it's fantastic tech, but also that maybe we're currently experiencing a bit of a honeymoon with it, and that people are taking statements like "it looks better than native" as a generalisable universal truth when they probably shouldn't. I've noticed some artefacting with my own eyes that wasn't mentioned in some analyses - not that this makes the tech bad! Just that it's not totally free of drawbacks. The most concrete example I can point to is in Wolf YB: in the subway stations there's a kind of shimmering oddness on the staircases that doesn't exist at native res - and it still happened when I used DLSS + DSR to do my own homebrew version of DLSS 2x (native res -> supersample to 4x res -> downsample to screen).
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
It took me a while to understand it as well. I don't think the media has done a good job of communicating it. Here is what I have concluded:

The "native resolution" results are using some kind of high-performance but low-quality anti-aliasing method. Typically something like TAA (temporal anti-aliasing). TAA introduces some unintended blurring of details.
DLSS 2.0 is actually very similar to TAA in concept, but it uses a neural network rather than a pre-baked algorithm. The neural network turns out to be far superior to the current TAA algorithm and causes much less unintended blurring of details. It is so much better than traditional TAA that the image may look better even when rendered at a lower resolution and upscaled.

Ultimately the reason DLSS 2.0 can sometimes look better than native is because of how bad TAA is. If you were to render the native resolution image using a higher quality form of anti-aliasing, such as SSAA, then the native resolution image would always win in quality. However the performance cost of SSAA is very high.

What DLSS 2.0 has exposed is that our existing algorithms for doing TAA are woefully sub-optimal. In theory we could figure out a different algorithm that worked as well as DLSS 2.0, but without a neural network. However finding this algorithm is easier said than done.
brilliant post!
I think that's not the case anymore for DLSS 2.0, if I understood the Digital Foundry analysis correctly.

Indeed - it is a generalised model now, no per-game training.
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
Good point. If the next Nintendo console has DLSS, maybe they will be able to compete and get all the third-party games...

DLSS 2.0 will certainly allow them to have many more third party games on Switch 2.

I believe Nintendo are currently working with Nvidia on a whole new handheld architecture which will see them through the next 10+ years, much like the GameCube technology did in the early 2000s. It will include a much more performant ARM CPU as well as a ~3 TFLOP GPU with DLSS support, so that when docked, games will have 4K image quality. I expect the device to launch in Winter 2022.

I'm still torn on whether to build my new PC just now, with rumours of a new Ryzen line of CPUs, the new SSDs being so expensive, as well as the 3000 series GPUs coming this year, although on that front I think a 2080 using DLSS will be fine through next generation. That Intel CPU Alex always uses on DF seems like a beast, and it's only 6 cores?
 

tokkun

Member
Oct 27, 2017
5,400
What you also need to take into account is that DLSS can be fundamentally better than native rendering in certain specific details (for example hair rendering) due to its ability to reconstruct detail from a much higher resolution ground truth image. However, the algorithm also has artifacts, especially in motion, so you don't get a super-resolution result overall, and it is typically quite comparable to native res on average.

So can SSAA.

I think DLSS 1.0 demonstrated pretty clearly that simply doing upscaling using a NN trained on a high resolution image does not produce a better quality image. DLSS 1.0 even had the advantage of using game-specific model training, and the quality was still much worse. As AMD famously demonstrated, in many cases it was worse than even traditional upscaling with a sharpening filter.

It is really when used in place of lower quality anti-aliasing methods that DLSS shines.
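For reference, the "traditional upscaling with a sharpening filter" baseline is roughly the following. This is a quick Pillow sketch of the general technique (bicubic upscale plus unsharp mask), not AMD's actual RIS/CAS implementation, and the parameters are arbitrary:

```python
# Rough sketch of spatial upscale + sharpen (the non-ML baseline), using Pillow.
# Not AMD's RIS/CAS code; filter settings are illustrative only.
from PIL import Image, ImageFilter

def upscale_and_sharpen(path_in, path_out, target=(3840, 2160)):
    img = Image.open(path_in).convert("RGB")
    upscaled = img.resize(target, resample=Image.BICUBIC)  # spatial-only upscale, no temporal data
    sharpened = upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=2))
    sharpened.save(path_out)

# upscale_and_sharpen("frame_1440p.png", "frame_4k_sharpened.png")
```

Unlike DLSS 2.0, this has no temporal information to draw on, which is why it can only boost apparent edge contrast rather than recover actual detail.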
 

P40L0

Member
Jun 12, 2018
7,610
Italy
I actually cannot get over these Control and Death Stranding videos. Apparently DLSS not only runs at like up to 100% more FPS, it also looks as good or better than native 4K? This all sounds too good to be true. I got an AMD RX 5700 XT - what do I do? By the looks of it, there's no AMD alternative on the market or on the horizon for that matter. Assuming AMD were to develop an equally good equivalent down the line, would my 5700 XT even be able to run it?

Frankly, I'm looking at selling my 5700 XT off and buying one of the next gen Nvidia GPUs once they hit the market. I've not even had my GPU for a year and it's been good, but please, someone tell me if there's a catch with DLSS. At this rate, I'm inclined to hold off on Cyberpunk until I get a GPU with DLSS.

Is there a chance it won't get widespread support because the next gen consoles run on AMD hardware? Not that DLSS dying would be a good thing, but if it's not expected to make a breakthrough, then it's obviously not worth selling my 5700 XT off.

edit: This is what I'm talking about. UNREAL.




It really seems that miraculous.
And I hope both Xbox Series X and PlayStation 5 already have something comparable up their sleeves for developers to leverage.
 

Carn

Member
Oct 27, 2017
11,911
The Netherlands

twdnewh

Member
Oct 31, 2018
648
Sydney, Australia
When the RTX line first came out, we all thought RT effects would be the big thing, but actually it was DLSS.
It didn't get enough attention because pre-2.0 it was really still a work in progress. To me it really is the biggest GPU tech jump in a while, and one we really needed. It's already started to pick up, and I suspect it won't take long before we see it implemented in every new major release. The tech is just too good to pass up.
 

OmegaDL50

One Winged Slayer
Member
Oct 25, 2017
9,661
Philadelphia, PA
It's typically never a good idea to speak in absolutes.

I remember similar statements being made about how AMD would never have a CPU to compete with Intel ever again, and then Ryzen 3 came out and turned everyone's expectations on their head.

For the record I own an RTX 2080 and plan to upgrade to an RTX 3000 series card, but the world of computing technology is aggressive and things are constantly improving. It would be best to hope for AMD to come out with something impressive to keep Nvidia's pricing in check.
 

z0m3le

Member
Oct 25, 2017
5,418
It really seems that miraculous.
And I hope both Xbox Series X and PlayStation 5 already have something comparable up their sleeves for developers to leverage.
What you have to understand about DLSS is that until 2.0 was released just a handful of months ago, it was completely unproven. The PS5 and XSX were already designed, and they won't have tensor cores. Rapid packed math could be a slow substitute that might offer a 30fps version of DLSS-style upscaling, though the PS5 might still be too slow to do so. It's really borderline what they can squeeze out of these boxes, because even an RTX 2060 can perform the math between 4 and 8 times faster, depending on whether 8-bit or 16-bit precision is required.
 

P40L0

Member
Jun 12, 2018
7,610
Italy
What you have to understand about DLSS is that until 2.0 was released just a handful of months ago, it was completely unproven. The PS5 and XSX were already designed, and they won't have tensor cores. Rapid packed math could be a slow substitute that might offer a 30fps version of DLSS-style upscaling, though the PS5 might still be too slow to do so. It's really borderline what they can squeeze out of these boxes, because even an RTX 2060 can perform the math between 4 and 8 times faster, depending on whether 8-bit or 16-bit precision is required.
DirectML will at least be in for XSX, providing similar AI reconstruction of the image starting from much lower resolutions, but I also have my doubts it could ever reach current DLSS 2.0 benefits.
 

snausages

Member
Feb 12, 2018
10,337
It's not perfect tho, you can get weird ghosting against the edges of certain objects. It's not hugely prevalent but for instance you will see it if you look at a telephone wire while moving in Death Stranding. Apparently other objects exhibit similar weirdness.

I noticed it even worse in Control, even on DLSS 2.0
 

Oddhouse

Member
Oct 31, 2017
1,036
I need some DLSS advice.

I've got a gaming laptop; an i7-8750H with a 2070 Max-Q.
In Control, DLSS doubled my FPS when I turned it on.

In Death Stranding, however, I had 70-90fps without DLSS, but when I switched it on my fps didn't change at all. Tried every DLSS quality setting.

Anybody else had the same issue?
Wondering if it's because my laptop is CPU limited?
 

J_Viper

Member
Oct 25, 2017
25,715
This is a super dumb question: will the 2060 in my laptop support DLSS, or is that only a desktop feature?