
MrKlaw

Member
Oct 25, 2017
33,038
You will see in an upcoming video that DLSS 2.0, in a direct comparison with checkerboard rendering, does not produce nearly as many ghosts, instabilities, or artefacts, not by a long shot. It in fact crushes checkerboard rendering from a quality and stability standpoint, with fewer real pixels as well. Have you already done the comparison, that you can claim with such certainty and conviction that it is just as unstable, ghosty, or artefact-prone?


We have checkerboarding and other temporal reconstruction techniques on current-gen consoles: decent quality for relatively low computational cost. We have DLSS 2.0 on dedicated HW on PC: higher quality, but it requires custom HW.

Will be interesting to see what next-gen consoles come up with. They don't have anywhere near the performance of even the lowest RTX card for HW-accelerated ML, but they have way more performance than current gen. So will we see significant improvements in quality through evolution of current temporal solutions? Or a DLSS-lite, a similar ML approach tuned to the lower TOPS of RDNA2?
 

hob982

Member
Oct 27, 2017
321
I cannot stress this enough. DLSS 2.0 is black magic; it IS the secret sauce. For the first time in ages, there IS a secret sauce.
I really, really hope consoles can catch up, because this is a real game changer, like G-Sync was back in 2013 or so. It's the most important performance advance ever in the PC space imho, aside from G-Sync. Bless you Nvidia, hate you for your fuckin prices, but damn.
 

Isee

Avenger
Oct 25, 2017
6,235
When the first 3D accelerator cards arrived, they managed to give us better visuals while significantly boosting performance.
The 3dfx Voodoo especially blew me away back then, and it has a special place in my heart. It was clear this tech would evolve and become the standard at some point.

DLSS is setting a similar milestone imo. Give it ten years and AI upscaling will be the standard. It will be different; it will be improved and more advanced. But it began to take off here, just like "GPUs" began to take off with the Voodoo 1.

Inb4 the "Actually, the 3dfx Voodoo 1 was..." responses:
no, it's not a GPU, and no, it wasn't the first 3D acceleration card either. But it was what conquered the market and reached mass appeal and adoption.
 

impingu1984

Member
Oct 31, 2017
3,415
UK
The only reason I feel I need to upgrade my 1080ti is DLSS 2.0...

Waiting for the RTX 3080 Ti...

It's just a natural evolution of Nvidia's push into GPGPU machine/deep learning... They are all in on this and have been for a number of years now... That brings in the big money, and the tech is now trickling down to the consumer cards...


These specialised cores increase the cost significantly, but people are willing to pay, so why not?

Without DLSS 2.0 I might only get an overall ~15% increase over a 1080ti, but with DLSS that could be more like 50%+, and that makes a very expensive GPU like an RTX 3080 Ti feel slightly more justifiable.

This is why Nvidia is stomping AMD in the GPU space: AMD is basically a consumer-led GPU company, while Nvidia has diversified its technology for these other use cases and therefore diversified its revenue.

AMD has done exactly the same to Intel in the server CPU space with Epyc, BTW, so I'm not hating on AMD... the big loser right now is Intel.
 

Falus

Banned
Oct 27, 2017
7,656
I tried it on GeForce Now: 1080p60 with DLSS. It looks OK, but nothing like 4K at all. Maybe I'm being limited by the streaming technology, though. Anyway, Death Stranding at 60fps max settings is such a looker.
 

Arthands

Banned
Oct 26, 2017
8,039
I cannot stress this enough. DLSS 2.0 is black magic; it IS the secret sauce. For the first time in ages, there IS a secret sauce.
I really, really hope consoles can catch up, because this is a real game changer, like G-Sync was back in 2013 or so. It's the most important performance advance ever in the PC space imho, aside from G-Sync. Bless you Nvidia, hate you for your fuckin prices, but damn.

yah but Sony and developers say the SSD is the game changer, not DLSS
 

Orioto

Member
Oct 26, 2017
4,716
Paris
So, just to be sure, I have a little question here.

You can't force DLSS 2.0 if it's not in the game initially, right? Or in things like emulators?
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
I tried it on GeForce Now: 1080p60 with DLSS. It looks OK, but nothing like 4K at all. Maybe I'm being limited by the streaming technology, though. Anyway, Death Stranding at 60fps max settings is such a looker.
If you used 1080p DLSS, you were reconstructing from a lower resolution up to 1080p. So 540p or 720p depending on the setting. You need to select 4K and then enable DLSS to reconstruct up to 4K.
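To put rough numbers on "depending on the setting": here's a minimal sketch of the internal render resolution per DLSS 2.0 quality mode, assuming the commonly cited per-axis scale factors (Quality ≈ 2/3, Balanced ≈ 0.58, Performance = 1/2) rather than anything read from an official Nvidia API.

```python
# Minimal sketch: internal render resolution per DLSS 2.0 quality mode.
# The per-axis scale factors below are the commonly cited ones, assumed
# here for illustration (not read from any official Nvidia API).

DLSS_SCALE = {
    "Quality": 2 / 3,      # e.g. 4K output renders internally at ~1440p
    "Balanced": 0.58,
    "Performance": 1 / 2,  # e.g. 4K -> 1080p, 1080p -> 540p
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the game actually renders before DLSS reconstructs up."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(mode, internal_resolution(3840, 2160, mode),
          internal_resolution(1920, 1080, mode))
# Quality     (2560, 1440) (1280, 720)  <- 1080p output reconstructs from 720p
# Performance (1920, 1080) (960, 540)   <- ...or from 540p, as noted above
```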
 

Isee

Avenger
Oct 25, 2017
6,235
So, just to be sure, I have a little question here.

You can't force DLSS 2.0 if it's not in the game initially, right? Or in things like emulators?

As of right now it's not a "global" setting. Games need to explicitly support it.
Would be great if that'd change, but I'm not sure it will anytime soon.
 

hob982

Member
Oct 27, 2017
321
yah but Sony and developers say the SSD is the game changer, not DLSS
SSDs were needed, ofc. PCs have had SSDs for a long time now (though not at PS5 speeds), and that is a game changer too, but playing on PC one should already be accustomed to SSD speeds and minimal if any loading times.

The problem is that the next big graphical paradigm is lighting and RT, and that is a real hurdle for consoles, since there's no way to do it without a heavy performance penalty. That's why DLSS is the way to go: it basically more than doubles the frame rate at the same, or better, visual quality.

Sony had to resort to checkerboarding to give the PS4 Pro a "boost" to reach 4K, and it looks like it'll need some kind of trick again to get RT-enabled effects plus high resolutions and/or high frame rates; we're seeing some corners being cut already (lots of 30fps games, especially).

DLSS truly gives RTX cards an edge. Honestly, I think the only real problem is that not all games support it, because otherwise the anticipation for the RTX 3XXX series would be significantly lower lol, since a 2080 Ti would basically suffice for all next-gen games while still reaching higher frame rates than consoles. I have an RTX 2060 *laptop*, the lowest-performing RTX card on the market atm, and I'm constantly in awe at what it can accomplish in DLSS 2.0/RTX-enabled games.

DLSS 1.0 was bad: image quality took a hit and the final result was really dirty, unclean graphics. But DLSS 2.0..... maaaan :)
 

Techno

The Fallen
Oct 27, 2017
6,409
yah but Sony and developers say the SSD is the game changer, not DLSS

Imagine these new SSDs + DLSS 2.0.



I cannot stress this enough. DLSS 2.0 is black magic; it IS the secret sauce. For the first time in ages, there IS a secret sauce.
I really, really hope consoles can catch up, because this is a real game changer, like G-Sync was back in 2013 or so. It's the most important performance advance ever in the PC space imho, aside from G-Sync. Bless you Nvidia, hate you for your fuckin prices, but damn.

So true, it's magic sauce.
 

Arthands

Banned
Oct 26, 2017
8,039
I tried it on GeForce Now: 1080p60 with DLSS. It looks OK, but nothing like 4K at all. Maybe I'm being limited by the streaming technology, though. Anyway, Death Stranding at 60fps max settings is such a looker.

You should be choosing a higher resolution with DLSS. DLSS is designed to boost frame rates under high GPU workloads rather than to boost resolution.

A lower output resolution means you are reconstructing from fewer pixels, which gives a worse result.
 

bes.gen

Member
Nov 24, 2017
3,343
You should be choosing a higher resolution with DLSS. DLSS is designed to boost frame rates under high GPU workloads rather than to boost resolution.

A lower output resolution means you are reconstructing from fewer pixels, which gives a worse result.

I think GeForce Now still maxes out at 1080p streaming.
 
Oct 25, 2017
2,884
If the Switch Pro has this and can give you 4K from 1080p, then it's going to be a real game changer. They could have a cheaper console (potentially) with excellent next-gen quality graphics (possibly).
 

Trup1aya

Literally a train safety expert
Member
Oct 25, 2017
21,327
I'm more confused as to why DirectML is constantly brought up as an alternative to help Xbox, which is using AMD RDNA 2, when there don't seem to be any dedicated processors like the tensor cores in Nvidia cards that are required for DLSS 2.0. If one console had anything remotely similar, I am pretty damn sure they would be talking about it over the battle of I/O speeds.
The difference is that tensor cores are separate, additional hardware. RPM on AMD runs on the general compute units. Therefore, running a DLSS-alike will use up far more resources that then can't be used for the rest of rendering. It's analogous to running raytracing in software as opposed to on RT hardware.


DLSS has plenty of drawbacks; it generates just as many artifacts and temporal instabilities as other reconstruction techniques (more than some, less than others). Its usefulness derives not from absolute quality, but from the fact that it achieves that quality while running primarily on otherwise underused tensor cores. It thus has very little impact on general compute, meaning larger performance improvements versus other methods.

Running a DLSS-alike on AMD cards won't withhold resources used for rendering, because DLSS doesn't use any resources for rendering while the tensor cores are working. The rendering process stops while the tensor cores upscale, then starts again once the upscaling is complete.

Since it's a process that occurs in series with the rendering pipeline, not in parallel, there's no efficiency downside to performing this task on the CUs.

The advantage Nvidia has is that their cards have more computational power for the upscaling process, not that the hardware is separate from the CUs. So whatever solution arises for AMD cards will probably be slower, but the resolution and post-processing benefits have the potential to be very similar.
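A toy frame-time model of this serial-pipeline argument. All timings here are made-up illustrations (not benchmarks): `internal_1440p_ms`, `tensor_upscale_ms`, and `cu_upscale_ms` are assumptions chosen to show the shape of the trade-off.

```python
# Toy frame-time model of the serial pipeline described above. All numbers
# are made-up illustrations, not benchmarks.

def frame_time_ms(render_internal_ms: float, upscale_ms: float) -> float:
    # Serial pipeline: render the low-res frame, then reconstruct.
    return render_internal_ms + upscale_ms

native_4k_ms = 33.3        # ~30 fps rendering native 4K
internal_1440p_ms = 14.8   # the same scene at the lower internal resolution
tensor_upscale_ms = 1.5    # fast reconstruction on dedicated hardware (assumed)
cu_upscale_ms = 4.0        # slower reconstruction on general CUs (assumed)

for label, up in [("tensor cores", tensor_upscale_ms),
                  ("compute units", cu_upscale_ms)]:
    t = frame_time_ms(internal_1440p_ms, up)
    print(f"{label}: {t:.1f} ms/frame -> {1000 / t:.0f} fps"
          f" (native 4K: {1000 / native_4k_ms:.0f} fps)")
# Both beat native 4K; the CU version is simply a smaller win, which is
# exactly the claim above: slower, but the same kind of benefit.
```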
 

hob982

Member
Oct 27, 2017
321
I really REALLY wish Metro Exodus got a DLSS 2.0 update. The benefit would be HUGE.
Yup, I would replay it for sure, with RTX enabled this time. DLSS 1 was such a disappointment.
It's quite strange 4A isn't on this, tho. They've always been at the forefront of new tech, ever since PhysX. I'm starting to believe their engine is showing its age and/or limitations, since Exodus is really poor in terms of PC settings.
 

Isee

Avenger
Oct 25, 2017
6,235
If somebody wants to use DLSS for their game, does it need to be licensed from Nvidia?
Or can it be implemented for free?
 

Papacheeks

Banned
Oct 27, 2017
5,620
Watertown, NY
No idea why some still compare DLSS to RIS like it's the same thing, or like it's something new.
People have been using stuff like ReShade for years.
And NV also has image sharpening at the driver level.

Do you feel like DLSS 2 and FidelityFX give comparable results here?
[Image: DS-Comparison-scaled.jpg — Death Stranding, DLSS 2.0 vs. FidelityFX comparison]

Source: https://wccftech.com/death-stranding-and-dlss-2-0-gives-a-serious-boost-all-around/

Yes, because one is still new and the other has been out for a while. Radeon Image Sharpening will get updated. DLSS was invented more to give better frame rates while things like ray tracing are on.

And its usefulness, while really good in select games, is not something that has been applied to all engines, nor is it deployable via the Nvidia Control Panel; it needs to be coded into the game engine.

Radeon Image Sharpening isn't as advanced as DLSS yet; it literally came out last year.
 

Rykane

Prophet of Truth
Member
Nov 22, 2019
210
Sorry if this has been asked a bunch, but how do I actually get it working? I run at 1920x1080 144Hz. If I want to see any benefit, do I just turn on DLSS and it upscales to 1440p or 4K, or do I need to switch to 1440p or 4K and then turn on DLSS? I'm assuming the latter, right?
 

eonden

Member
Oct 25, 2017
17,078
Sorry if this has been asked a bunch, but how do I actually get it working? I run at 1920x1080 144Hz. If I want to see any benefit, do I just turn on DLSS and it upscales to 1440p or 4K, or do I need to switch to 1440p or 4K and then turn on DLSS? I'm assuming the latter, right?
You need to set your resolution to the output you want (so 4K in this case) and activate DLSS. DLSS will upscale from a good enough internal resolution without you having to pay attention (in most cases, for 4K it will be 1440p). So you are right.
 

Isee

Avenger
Oct 25, 2017
6,235
Sorry if this has been asked a bunch, but how do I actually get it working? I run at 1920x1080 144Hz. If I want to see any benefit, do I just turn on DLSS and it upscales to 1440p or 4K, or do I need to switch to 1440p or 4K and then turn on DLSS? I'm assuming the latter, right?

You just turn it on.

Your set resolution (1920x1080 in this case) is what DLSS will upscale to.
Quality/Performance will set the resolution it will internally render at.

Even at 1080p, DLSS will give you better anti-aliasing than the native TAA solution, despite running at a lower internal resolution.

If you want to downsample from a higher resolution, you need to enable DSR in the Nvidia Control Panel, set your desktop resolution to the higher resolution, and then change the resolution in game. Not all resolutions will work, but 2560x1440 and 3840x2160 will.
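For reference, a quick sketch of how those DSR resolutions fall out: the Nvidia Control Panel exposes DSR as pixel-count multipliers, and each axis scales by the square root of the factor. The factor list used here is an assumption for illustration.

```python
# Hedged sketch of the DSR resolutions mentioned above. DSR factors are
# pixel-count multipliers (list assumed here for illustration); each axis
# scales by the square root of the factor.
import math

def dsr_resolution(w: int, h: int, factor: float) -> tuple[int, int]:
    s = math.sqrt(factor)  # per-axis scale from an area multiplier
    return round(w * s), round(h * s)

for f in (1.78, 2.25, 4.00):
    print(f"{f}x ->", dsr_resolution(1920, 1080, f))
# 1.78x -> (2562, 1441), i.e. ~2560x1440; 4.00x -> (3840, 2160).
# This is why 1440p and 4K are the resolutions that map cleanly from 1080p.
```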
 

Afrikan

Member
Oct 28, 2017
16,968
Could an external HDMI device have DLSS-type features?

Non-RTX Nvidia cards/AMD cards manually set to output at a low resolution > HDMI AI upscaling box > TV?

Edit: not counting an eGPU with RTX cards.
 

SmartWaffles

Member
Nov 15, 2017
6,244
Could an external HDMI device have DLSS-type features?

Non-RTX Nvidia cards/AMD cards manually set to output at a low resolution > HDMI AI upscaling box > TV?

Edit: not counting an eGPU with RTX cards.
DLSS builds on a decade of Nvidia deep learning research and billions of transistors. No simple upscaler can be even remotely close. They are a generation ahead.
 

Zonal Hertz

Banned
Jun 13, 2018
1,079
God, I can't wait to upgrade to a 3080 Ti from my 1080 Ti.

If my feeble brain has this correct, in DLSS 2.0-supported games I'll literally get the frame rates of 1440p, but resolution essentially as good as (if not better in some places than?) native 4K.

I just hope every game supports it going forwards.
 

impingu1984

Member
Oct 31, 2017
3,415
UK
God, I can't wait to upgrade to a 3080 Ti from my 1080 Ti.

If my feeble brain has this correct, in DLSS 2.0-supported games I'll literally get the frame rates of 1440p, but resolution essentially as good as (if not better in some places than?) native 4K.

That is the aim, yes... and as of DLSS 2.0, it seems Nvidia has really delivered on that promise...

It's the only reason I can justify wanting to upgrade my 1080ti, honestly.
 

Afrikan

Member
Oct 28, 2017
16,968
DLSS builds on a decade of Nvidia deep learning research and billions of transistors. No simple upscaler can be even remotely close. They are a generation ahead.

I hear ya... I just figured they weren't the only company that's been working on deep learning research for a decade on a similar scale. If they were, then they deserve to be a generation ahead... and I'll upgrade my GTX 1080 next year.
 

hob982

Member
Oct 27, 2017
321
If my feeble brain has this correct, in DLSS 2.0-supported games I'll literally get the frame rates of 1440p, but resolution essentially as good as (if not better in some places than?) native 4K.
The frame rate gain is probably even higher than that. Look at Death Stranding with DLSS 2.0: 4K, maxed, 60-70 fps on an RTX 2060. Compared to running natively at 4K, the frame rate gain is more than double (30 vs 60-70). I don't think going from 1440p to 4K halves the frame rate, at least not usually.
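A quick back-of-the-envelope check of that claim (treating rendering cost as roughly proportional to pixel count when GPU-bound, which is a simplifying assumption):

```python
# Back-of-the-envelope check: rendering cost is treated as roughly
# proportional to pixel count when GPU-bound (a simplifying assumption).

pixels_4k = 3840 * 2160      # 8,294,400
pixels_1440p = 2560 * 1440   # 3,686,400
print(f"4K has {pixels_4k / pixels_1440p:.2f}x the pixels of 1440p")  # 2.25x

# Yet fps rarely drops by the full 2.25x, because some frame cost is
# resolution-independent. So ~30 fps at native 4K vs 60-70 fps with DLSS
# is consistent with rendering internally at ~1440p plus a small fixed
# reconstruction cost, i.e. a better-than-2x gain.
```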
 

CreepingFear

Banned
Oct 27, 2017
16,766
I'm running 3840x1600 @ 160Hz, 8-bit full RGB, over DP now... True, that is tapping out the bandwidth, but more DP bandwidth for higher resolutions/frame rates etc. isn't something I'm concerned about right now.
I'll be getting the new Vizio Quantum X TV with VRR 48-120Hz, so I will want HDMI 2.1 for trying out some PC gaming on the TV. I also found the 4K PC monitors lacking, due to not being able to do 4K, HDR, and high refresh rate at the same time; usually you have to lower or disable one of the features to get the other two. DP 2.0 should see future monitors without those compromises.
 

BeI

Member
Dec 9, 2017
5,974
There seem to be quite a few people on the AMD reddit pushing the idea that the FidelityFX alternative is just as good or better. That Digital Foundry video can't come soon enough.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
There seem to be quite a few people on the AMD reddit pushing the idea that the FidelityFX alternative is just as good or better. That Digital Foundry video can't come soon enough.
I won't cover FidelityFX in comparison to DLSS 2.0, because FidelityFX is not image reconstruction, and people on the AMD reddit generally need to take some time to think and actually look at what they are looking at if they think it needs to be compared to DLSS 2.0. CAS is just sharpening. Just sharpening. If I went to SIGGRAPH and compared image reconstruction like checkerboarding against some low-resolution image with sharpening, I would get extremely funny looks from the audience, because they are not the same thing at all.

I even made a video about RIS talking about this.
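To make the "just sharpening" point concrete, here's a minimal sketch of a sharpen filter: an unsharp-mask-style 3x3 kernel used as a stand-in for CAS (this is not AMD's actual shader). It can only redistribute contrast among pixels that already exist, whereas reconstruction combines extra samples across frames.

```python
# Minimal sharpen filter: an unsharp-mask-style 3x3 kernel, a stand-in
# for CAS (NOT AMD's actual shader). Note the output resolution equals
# the input resolution: no new pixels, no new information.
import numpy as np

SHARPEN = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

def sharpen(img: np.ndarray) -> np.ndarray:
    """Naive 3x3 convolution over a single-channel image in [0, 1]."""
    h, w = img.shape
    out = img.copy()
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = np.clip((img[y-1:y+2, x-1:x+2] * SHARPEN).sum(), 0.0, 1.0)
    return out

img = np.random.rand(8, 8)
print(sharpen(img).shape)  # (8, 8): same grid in, same grid out.
# Reconstruction (DLSS, checkerboarding) instead accumulates extra samples
# from previous frames to approximate a genuinely higher-resolution image.
```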
 

fluffy pillow

Member
Sep 12, 2018
154
As far as I'm concerned, DLSS will be magic as soon as it, or a similar technology, is available on graphics cards that are ~£200 new, i.e. the GPU tier that really needs this kind of technology to keep up with the XSX and PS5. Whoever can deliver that gets my (small amount of) money when it's time to replace my RX 580.

In the meantime I'll take FidelityFX (or rendering at 900p and slapping RIS on in the control panel) and try to ignore the sharpening artefacts.

Edit: uh, this comment reads more confrontational than I meant. Just saying I like the RIS-to-DLSS comparisons because it's all I have access to; it's helpful to see how far just sharpening can take me.
 

Klokwerk

Member
Oct 29, 2017
234
DLSS is the latest Nvidia buzzword after HairWorks and ray tracing, but it's hardly the revolution you think it is.

Reconstruction techniques are nothing new; we've had them for years (most recently checkerboard rendering, which achieves very similar results and has been used for years on the PS4 Pro).

There are still reconstruction artifacts that make it inferior to native and similar to other techniques like CB.

I use DLSS-like AI reconstruction every day on my Nvidia Shield, and the results are mixed. It's great on some images and not good on others; it depends on the content. It's nice but not a revolution, just like CB rendering. You can't tell which is which unless you stop the video and spend 10 minutes searching...

The reason it looks good is that everything looks good in the era of diminishing returns. If I gave you three videos with native / DLSS / CB rendering, you wouldn't be able to tell which is which.

The corollary is that we're wasting power when going 4K native instead of reconstructed.
 

Alexandros

Member
Oct 26, 2017
17,800
DLSS is the latest Nvidia buzzword after HairWorks and ray tracing, but it's hardly the revolution you think it is.

Reconstruction techniques are nothing new; we've had them for years (most recently checkerboard rendering, which achieves very similar results and has been used for years on the PS4 Pro).

There are still reconstruction artifacts that make it inferior to native and similar to other techniques like CB.

I use DLSS-like AI reconstruction every day on my Nvidia Shield, and the results are mixed. It's great on some images and not good on others; it depends on the content. It's nice but not a revolution, just like CB rendering. You can't tell which is which unless you stop the video and spend 10 minutes searching...

Your post is pretty much all wrong.
 

Orioto

Member
Oct 26, 2017
4,716
Paris
DLSS is the latest Nvidia buzzword after HairWorks and ray tracing, but it's hardly the revolution you think it is.

Reconstruction techniques are nothing new; we've had them for years (most recently checkerboard rendering, which achieves very similar results and has been used for years on the PS4 Pro).

There are still reconstruction artifacts that make it inferior to native and similar to other techniques like CB.

I use DLSS-like AI reconstruction every day on my Nvidia Shield, and the results are mixed. It's great on some images and not good on others; it depends on the content. It's nice but not a revolution, just like CB rendering. You can't tell which is which unless you stop the video and spend 10 minutes searching...

Oh my, thx for letting us know, cause with all those obvious comparison videos and explanations of what it is and why it's amazing, we almost fell for it. Damn buzzwords! My eyes were tricked into seeing a huge difference.
 

dgrdsv

Member
Oct 25, 2017
11,846
If somebody wants to use DLSS for their game, does it need to be licensed from Nvidia?
Or can it be implemented for free?
It's similar to any other GameWorks feature: you don't need to license or pay for anything, but you have to join NV's devrel program, which is basically an online registration these days.

developer.nvidia.com — Deep Learning Super Sampling (DLSS): boosts frame rates and generates sharp images.

There seem to be quite a few people on the AMD reddit pushing the idea that the FidelityFX alternative is just as good or better. That Digital Foundry video can't come soon enough.
Just like there were quite a few people everywhere saying RT is a gimmick and will die with Turing, yeah. Most people are clueless.