Sarcastico

Member
Oct 27, 2017
774
I understand and respect your point of view, but I believe that the overwhelming majority of gamers have vastly different priorities. If you are able to achieve 4K60 at maximum detail, then issues such as the ones you describe would matter. For people who don't have bleeding-edge hardware, the choices would be a) don't use DLSS and upscale from a lower resolution, b) don't use DLSS and drop settings, c) don't use DLSS and put up with bad performance, or d) use DLSS, avoid all that, and perhaps get a slightly altered image.

The choice becomes a true no-brainer if you're using low-end or mid-range hardware. It's no longer a question of how good the game looks and in what way; it's a question of whether the game is playable at decent framerates at all. For most people DLSS is a free graphics card upgrade that saves them a lot of money. They are not going to care about small details that DLSS doesn't get right.

Excellent post. If you value performance, DLSS is a godsend.
 

Sean Mirrsen

Banned
May 9, 2018
1,159
As someone who routinely tries to force games to run at resolutions as low as 540p, with no AA, just to see if they remain playable while maybe actually giving me above 20 FPS in most scenes, I empathize a lot with the notion. If I ever actually scrounge up the money to build a desktop PC to complement my little Win8 tablet, I am not even going to look at any GPU choices that won't get me DLSS. Like, to the point that if Nvidia somehow came up with an x86 CPU with integrated graphics and DLSS capability, I'd pick that over any discrete video card. It'd be the ultimate value proposition for low-spec gaming.
 

Laiza

Member
Oct 25, 2017
2,171
Eh, if Cyberpunk's rain won't show up due to DLSS, it's gonna be a big issue. Same for Watch Dogs 3. Hopefully Nvidia can do something about particles with the next iteration.
It's not DLSS's fault. That problem is exclusive to Death Stranding, due to its low-quality motion vectors, which exclude particle effects. Nvidia doesn't have to fix shit.
 

Li bur

Member
Oct 27, 2017
363
DLSS is that tech that sounds way too good to be real. I keep thinking that it's all bullshit, that there has to be some kind of catch that makes it worthless in a real-world situation, but it looks 100% legit.

I guess the caveat would always be that it needs to be programmed specifically for each game, no? I'm waiting for the day we can use DLSS in any game I want (though I don't know whether that'd be technically possible or not).
 

floridaguy954

Member
Oct 29, 2017
3,631
[Image: Death Stranding benchmarked - how does Hideo Kojima's game run at 8K?]
I'll post this whenever someone says "10 GB isn't enough" lol
 

laxu

Member
Nov 26, 2017
2,782
Conversely, would it be possible for DLSS to eventually become an engine feature that is just toggled on by the devs? Especially if they're already doing the work for TAA anyway.

I believe it is already becoming that with Unreal Engine. I certainly hope that's what ends up happening, so it becomes a standard feature. At the moment there seems to be a separate branch of UE that supports it, but eventually that will probably get merged into the main branch and become available without applying for access from Nvidia.
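For what it's worth, the reason this has to live in the engine (and is currently per game) is that the upscaler needs data only the renderer can produce. Here's a rough, hypothetical sketch in Python of those per-frame inputs; the names are made up for illustration and don't reflect the actual NGX or UE plugin API.

Code:
from dataclasses import dataclass
from typing import Tuple
import numpy as np

# Hypothetical illustration only -- not the real NGX / UE-plugin interface.
# A DLSS-style temporal upscaler needs all of this every frame, which is why
# it's an engine integration rather than a driver-level toggle.

@dataclass
class UpscalerFrameInputs:
    color_lowres: np.ndarray            # aliased, sub-pixel-jittered render (e.g. 1280x720)
    motion_vectors: np.ndarray          # per-pixel screen-space motion at the same resolution
    depth: np.ndarray                   # depth buffer, used to reject stale history
    jitter_offset: Tuple[float, float]  # camera jitter applied this frame
    output_size: Tuple[int, int]        # target resolution, e.g. (2560, 1440)

# An engine that already implements TAA produces nearly all of this, which is
# why exposing it as a toggleable engine feature (as the UE branch does) is natural.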
 

floridaguy954

Member
Oct 29, 2017
3,631
Native looks better than both IMO; look at the aliasing on the boxes, for example, it's really bad with DLSS.

In motion I imagine that's even worse. Is this the best DLSS can offer at 4K? Or are these just bad examples?

What does 1440p > 4K look like, for example?
You're exaggerating; the aliasing is negligible.

Anyways, be my guest if you want to take an over-50% performance hit with native 4K just to eliminate a small amount of aliasing on some boxes.
 

Nerun

Member
Oct 30, 2017
2,273
It's such a great feature, especially on my gaming laptop with its RTX 2060. I can play Control with everything maxed (even all RT options) at 40-50+ fps, and if I lower some RT settings I might even get a stable 60 fps out of it. Same goes for other games: Death Stranding with DLSS 2.0 in Quality Mode and everything maxed runs at 100 to 120 fps most of the time.
 

Darknight

"I'd buy that for a dollar!"
Member
Oct 25, 2017
22,836
Any reason why the 2080 Super has lower VRAM usage than the 2080 and 2070 Super when using DLSS?

That chart looks very odd.

Maybe I don't get how it works, but why wouldn't it be the same across the board if you're targeting the same resolution and settings on each GPU? Aren't you trying to recreate the same thing on each one, which would need the same assets, frame buffer, etc.?
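Setting the per-card weirdness aside, the reason DLSS changes VRAM usage at all is that the intermediate render targets are sized at the internal resolution, not the output resolution. A rough back-of-envelope sketch, with an assumed buffer budget rather than numbers measured from any game:

Code:
# Rough, illustrative estimate of render-target memory at DLSS's internal
# resolution vs. native 4K. The buffer count and byte sizes are assumptions.

def render_target_mib(width, height, bytes_per_pixel_total):
    return width * height * bytes_per_pixel_total / 2**20

# Say a renderer keeps roughly 40 bytes per pixel of full-screen targets
# (G-buffer, depth, motion vectors, HDR scene color, etc.).
BYTES_PER_PIXEL = 40

native_4k  = render_target_mib(3840, 2160, BYTES_PER_PIXEL)   # ~316 MiB
dlss_1440p = render_target_mib(2560, 1440, BYTES_PER_PIXEL)   # ~141 MiB

print(f"native 4K render targets:      {native_4k:.0f} MiB")
print(f"1440p internal (DLSS Quality): {dlss_1440p:.0f} MiB")

# Textures, geometry and the final 4K output buffer are unchanged, so the
# saving is real but modest -- and it should be roughly the same on every GPU
# at identical settings, which is why the per-card differences in that chart look odd.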
 

Detail

Member
Dec 30, 2018
2,947
You're exaggerating; the aliasing is negligible.

Anyways, be my guest if you want to take an over-50% performance hit with native 4K just to eliminate a small amount of aliasing on some boxes.

It's just my opinion; I don't believe I was exaggerating. It just looks bad to my eyes.

It wasn't meant as an attack on others for what they prefer. I understand why people would see the increased performance as more of a priority than IQ; it's all about preferences.

Aliasing is one of those things that really ruins image quality for me, and if it looks like that in a screenshot I imagine it looks worse in motion. To my eyes the aliasing looks very bad when viewing those screenshots full screen on my TV, compared to the native image.

Also, when talking about missing effects due to DLSS (such as the raindrops in Death Stranding), that to me isn't worth the extra performance, because you're then compromising the artists' vision and intention for the game's visuals. Again though, it's all opinions, and I understand why people would prefer performance over IQ and vice versa.
 

Kabukimurder

Banned
Oct 28, 2017
550
What bugs me about these compressed screenshot and video comparisons of DLSS 2 is that the differences between resolutions get lost in compression, and that people aren't viewing them fullscreen. When running on an actual screen it's very clear that the image isn't native and that it's running at a lower resolution. It's my biggest gripe with DLSS. You have to see it for yourself to realize that the magic is limited, to say the least.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
What bugs me about these compressed screenshot and video comparisons of DLSS 2 is that the differences between resolutions get lost in compression, and that people aren't viewing them fullscreen. When running on an actual screen it's very clear that the image isn't native and that it's running at a lower resolution. It's my biggest gripe with DLSS. You have to see it for yourself to realize that the magic is limited, to say the least.

I would argue the opposite is the case. You have to see it running on your screen to really get the magic.

Especially in Wolfenstein, Deliver Us The Moon and Minecraft, DLSS 2.0 looks clearly superior to native resolution on my screen. Image quality is very good: no flicker, no artifacts, very smooth edges, more detail in general.

Control looks good too with 2.0, but when you look closely you can see sharpening artifacts. Death Stranding looks good as well, but then there's that motion-vector issue. So those are not perfect.
 

Laiza

Member
Oct 25, 2017
2,171
It's just my opinion; I don't believe I was exaggerating. It just looks bad to my eyes.

It wasn't meant as an attack on others for what they prefer. I understand why people would see the increased performance as more of a priority than IQ; it's all about preferences.

Aliasing is one of those things that really ruins image quality for me, and if it looks like that in a screenshot I imagine it looks worse in motion. To my eyes the aliasing looks very bad when viewing those screenshots full screen on my TV, compared to the native image.

Also, when talking about missing effects due to DLSS (such as the raindrops in Death Stranding), that to me isn't worth the extra performance, because you're then compromising the artists' vision and intention for the game's visuals. Again though, it's all opinions, and I understand why people would prefer performance over IQ and vice versa.
Well first of all, you completely missed the point that it's upsampling from 720p, which is an extremely low resolution to be upsampling from. Obviously it looks significantly better at the intended base resolution (which is 1440p, literally four times the pixels). The results are incredible considering where it started; obviously if you want to prioritize image quality and stability you'd start from 1440p instead.

Secondly, we already established that Death Stranding is unique in having low-quality motion vectors that completely disregard particle effects for DLSS. Control doesn't have this problem, and I'm pretty sure the other DLSS 2.0 games don't, either. That's a Kojima Productions issue, not a DLSS issue.
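To make the motion-vector point concrete, here's a toy 1-D sketch (not DLSS itself, just the general temporal-accumulation idea it builds on) of what happens when a moving effect writes no motion vectors: the history gets reprojected to the wrong place, so the accumulated result fades and smears the effect even though the upscaler itself is working as designed.

Code:
import numpy as np

# Toy illustration: a bright "particle" moves 2 pixels per frame. We accumulate
# history by reprojecting the previous result with the reported motion vector,
# then blending in the current frame (as a TAA/DLSS-style accumulator would).

def accumulate(frames, reported_motion, blend=0.1):
    history = frames[0].astype(float)
    for cur in frames[1:]:
        reprojected = np.roll(history, reported_motion)   # reproject by reported motion
        history = (1 - blend) * reprojected + blend * cur
    return history

W = 20
frames = []
for t in range(8):
    f = np.zeros(W)
    f[2 + 2 * t] = 1.0          # the particle actually moves 2 px per frame
    frames.append(f)

good = accumulate(frames, reported_motion=2)   # correct motion vectors
bad  = accumulate(frames, reported_motion=0)   # particle pass writes no motion

print("with motion vectors:    peak", round(good.max(), 2))  # stays ~1.0, stays sharp
print("without motion vectors: peak", round(bad.max(), 2))   # ~0.48, a fading ghost at the old position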
 

Detail

Member
Dec 30, 2018
2,947
Well first of all, you completely missed the point that it's upsampling from 720p, which is an extremely low resolution to be upsampling from. Obviously it looks significantly better at the intended base resolution (which is 1440p, literally four times the pixels). The results are incredible considering where it started; obviously if you want to prioritize image quality and stability you'd start from 1440p instead.

Secondly, we already established that Death Stranding is unique in having low-quality motion vectors that completely disregard particle effects for DLSS. Control doesn't have this problem, and I'm pretty sure the other DLSS 2.0 games don't, either. That's a Kojima Productions issue, not a DLSS issue.

I am not saying it isn't impressive technology; it clearly is.

I am simply saying that to my eyes it looks worse than native resolution, and I would prioritise image quality over performance; that's a personal preference. Performance is important to me in multiplayer games, but I almost exclusively play single player, so as long as the framerate is playable I'd always take image quality over performance.

Again, I am not saying DLSS isn't impressive; all I said was that the aliasing looks bad and that native looks better to my eyes.

Also, in regard to Death Stranding, is there any proof behind the claim that it's the low-quality motion vectors causing the issue and not DLSS? I'm trying to get an impartial viewpoint on DLSS (not saying you aren't being impartial, btw; I just wanted to know if there's proof identifying the low-quality motion vectors as the issue).
 

RedShift_

Member
Jul 24, 2018
507
What bugs me about these compressed screenshot and video comparisons of DLSS 2 is that the differences between resolutions get lost in compression, and that people aren't viewing them fullscreen. When running on an actual screen it's very clear that the image isn't native and that it's running at a lower resolution. It's my biggest gripe with DLSS. You have to see it for yourself to realize that the magic is limited, to say the least.
Count me in the "I was completely sold on it until I bought an RTX card and saw it in person" camp. Compressed videos or zoomed stills hide a pretty strong "filter" you can definitely see in motion.
When I first started Control and turned DLSS on, I had to double-check that the game was updated, because I thought I had the crappy DLSS 1.9.

I hate the hyperbole that surrounds this tech now. It is impressive, but it's not magic by any means, and if you care about IQ or tend to notice stuff like shimmering, halos, or weird noise around things (especially noticeable if you use a monitor and not a TV), you'll probably hate it.
 

XaviConcept

Art Director for Videogames
Verified
Oct 25, 2017
4,907
I gotta be honest, threads like these make me worry about my eyes, because at first glance these screenshots look no different to me (and I get that's the point of upscaling from 720p), but it's also one of those "must be getting old" signs.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,640
I think we need to judge titles like Death Stranding?

Idk, I can definitely tell the difference between DLSS and native on Control at 1440p, but it's not an obvious jump; there's just something missing.
 

TSM

Member
Oct 27, 2017
5,823
Count me in the "I was completely sold on it until I bought an RTX card and saw it in person" camp. Compressed videos or zoomed stills hide a pretty strong "filter" you can definitely see in motion.
When I first started Control and turned DLSS on, I had to double-check that the game was updated, because I thought I had the crappy DLSS 1.9.

I hate the hyperbole that surrounds this tech now. It is impressive, but it's not magic by any means, and if you care about IQ or tend to notice stuff like shimmering, halos, or weird noise around things (especially noticeable if you use a monitor and not a TV), you'll probably hate it.

Control is actually a bad example. After DLSS 2.0 came out for it, they said there is a sharpness setting which they don't expose user controls for, and it's set to maximum by default. So it has bad ringing artifacts from the sharpening being cranked to the max. Nvidia is working on making it adjustable in future games.

Nvidia said:
We are currently hard at work calibrating the user-adjustable sharpness setting to combine well with the internal sharpness value produced by DLSS's deep neural networks, in order to consistently deliver a high-quality output while still giving the user a significant level of flexibility over the amount of sharpening they want applied. It is currently available as a debug feature in non-production DLSS builds, but is disabled by default.

Seems to just be part of the teething pains for a new technology.
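For anyone wondering what "ringing" means here: sharpening filters amplify the difference between the image and a blurred copy of it, and at a hard edge a large strength overshoots past the original black and white levels, which reads as halos on screen. A minimal generic unsharp-mask sketch (not Nvidia's actual filter) shows the effect:

Code:
import numpy as np

# Unsharp mask: sharpened = original + strength * (original - blurred).
# At an edge, a large strength pushes values past the original 0..1 range,
# which is the ringing / halo artifact.

def box_blur(x, radius=2):
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(x, kernel, mode="same")

def unsharp(x, strength):
    return x + strength * (x - box_blur(x))

edge = np.concatenate([np.zeros(10), np.ones(10)])   # hard black-to-white edge

mild    = unsharp(edge, strength=0.3)
cranked = unsharp(edge, strength=2.0)

print("mild:    min %.2f max %.2f" % (mild.min(), mild.max()))        # small over/undershoot
print("cranked: min %.2f max %.2f" % (cranked.min(), cranked.max()))  # large halos past 0 and 1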
 

DieH@rd

Member
Oct 26, 2017
10,567
When 10 GB is no longer enough, I will switch textures from Ultra to Very High and keep gaming without noticing any difference.
 

TSM

Member
Oct 27, 2017
5,823
When 10 GB is no longer enough, I will switch textures from Ultra to Very High and keep gaming without noticing any difference.

The problem is when textures below Ultra do not exist in decent quality. See the Red Dead Redemption 2 launch: people who didn't have enough VRAM to use Ultra textures ended up with really bad textures, because the Ultra textures were the smallest texture size created for the PS4/XB1 base consoles. I'm not sure how they derived the lower-quality textures for PC, but it was really bad. With the new consoles having 16 GB of RAM, the smallest textures created for console will likely be well above what is generally Ultra on PC now. Developers would have to specifically create lower-quality textures for lower-end PC graphics cards to avoid this problem moving forward. See the DF breakdown of RDR2 at launch.
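For a sense of scale (generic numbers, not measurements from RDR2): block-compressed textures cost roughly a fixed number of bytes per pixel, so each quality tier that halves the resolution on both axes roughly quarters the memory. A quick sketch, assuming a BC7-style format at about 1 byte per pixel and a full mip chain adding about a third:

Code:
# Back-of-envelope texture memory per quality tier. Format and sizes are
# generic assumptions, not tied to any specific game.

def texture_mib(size_px, bytes_per_pixel=1.0, mip_factor=4/3):
    return size_px * size_px * bytes_per_pixel * mip_factor / 2**20

for label, size in [("Ultra (4096^2)", 4096), ("High (2048^2)", 2048), ("Medium (1024^2)", 1024)]:
    print(f"{label:>16}: {texture_mib(size):5.1f} MiB per texture")

# Ultra ~21.3 MiB, High ~5.3 MiB, Medium ~1.3 MiB: each tier is roughly a
# quarter of the one above, which is why one settings notch frees a lot of
# VRAM -- provided the lower tiers are actually authored at decent quality.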

 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,931
Berlin, 'SCHLAND
The problem is when textures below Ultra do not exist in decent quality. See the Red Dead Redemption 2 launch: people who didn't have enough VRAM to use Ultra textures ended up with really bad textures, because the Ultra textures were the smallest texture size created for the PS4/XB1 base consoles. I'm not sure how they derived the lower-quality textures for PC, but it was really bad. With the new consoles having 16 GB of RAM, the smallest textures created for console will likely be well above what is generally Ultra on PC now. Developers would have to specifically create lower-quality textures for lower-end PC graphics cards to avoid this problem moving forward. See the DF breakdown of RDR2 at launch.


That is 100% true! But at the same time, RDR2 at launch had an absolutely bad texture-scaling implementation - something I state in that video. Other games do their texture settings much better; the vast majority do. They have since patched the game so that lower texture settings do not have that behaviour. Basically, it functions more like other games now.

With regard to "16 GB" of RAM in consoles, you are going to have ca. 13.5 GB of that for games, depending on the console, where realistically the Xbox Series X (the console that will probably have the largest total available RAM pool) will only be using 10 GB of its RAM pool for the GPU.
So I really think 10 or even 8 GB will be rather fine going into the next gen on PC - 8 GB especially, as on PC there is not the same silly pressure to push framebuffer size as there is on console. Console games sometimes go a bit silly with resolution, to their own detriment.

As I say all that, though, this is stuff we will investigate as the gen goes on.
 

Glimpse_Dog

Banned
Oct 29, 2017
1,770
Yeah, it's very exciting to think about the next Switch possibly utilizing DLSS.

What would be the minimum-level GPU Nintendo would need to pick to support DLSS 2.0 on a next-gen Switch? Nintendo are known for choosing price-sensitive components, but even they must see the business appeal of DLSS. It's actually the perfect tech for them - it stops them getting left in the dust power-wise without requiring them to chase the latest hardware.
 

Dizastah

Member
Oct 25, 2017
6,124
I thought that one of the PS5 hardware architects had confirmed no ML hardware in the console via a leaked conversation?

I also seem to remember reading somewhere that the XSX ML hardware was a request from the Azure team, so I'm unsure whether it can be utilised for game upscaling.
Do you have a source for the info regarding the PS5 not having it?
 

DieH@rd

Member
Oct 26, 2017
10,567
TSM There will always be weird PC ports and strange developer choices. I don't think that this particular bottleneck [lack of VRAM] will be as important or impactful.

The generation has not yet started and people are already freaking out about one very small bottleneck [for playing games at maximized visuals, a very PCMR thing to do]. :) IMO, a lot of people who get an 8-10 GB card now will most likely upgrade in the second half of this gen anyway, and by then we will have 3-5 nm cards, possibly with chiplets or stacked chips. Those products will blow away Ampere/RDNA2 in every way.
 

Deleted member 14927

User requested account closure
Banned
Oct 27, 2017
648
Do you have a source for the info regarding the PS5 not having it?

I checked, and it looks like he deleted it; a quick search shows he may have gotten into some trouble - NDAs and a furore created by it at the time.

It's unfair to link it publicly again on a forum, both to avoid any more blowback for him and to avoid unnecessary drama.

I'll ping it to you directly.
 

floridaguy954

Member
Oct 29, 2017
3,631
What would be the minimum-level GPU Nintendo would need to pick to support DLSS 2.0 on a next-gen Switch? Nintendo are known for choosing price-sensitive components, but even they must see the business appeal of DLSS. It's actually the perfect tech for them - it stops them getting left in the dust power-wise without requiring them to chase the latest hardware.
Probably a custom RTX 3050 or 3060, depending on the cooling requirements.

Nintendo hit the fucking jackpot going with Nvidia. Imagine all of their games running at 1440p/60fps or higher with DLSS 2.0 (or later).