I should, but a bad experience typically puts me off for good. That actually stopped me playing on PC until around 2014.
MS could do it using Azure for XSX/XSS, but yeah, AMD is kinda left to their own devices.

Control had a shader version of DLSS that didn't incur a massive performance hit, before the upgrade to DLSS 2.0. There's more than one way to do ML supersampling. It didn't look as good as 2.0, but it looked far better than simple upscaling.
I think the biggest hurdle will be training the models and Nvidia has a huge head start in terms of infrastructure.
The images in the OP show a lack of detail and more aliasing, and it's very clear (even to someone with poor eyesight like me). So is this a regular occurrence with DLSS, or just an anomaly, and DLSS really is this magic sauce that makes games look better than native 4K with double the performance?
It's also a problem entirely unique to Death Stranding, due to its use of compromised motion vectors.

All AA has its benefits and drawbacks, and Nvidia has benefited from DLSS 2.0 leveraging TAA and correcting some of its worst ghosting issues. But ghosting in motion is still better than outright missing visual details, and I can't think of another AA solution that literally omits details in its attempt to reconstruct.
...what?!
It's the best AA we have right now. Can't live without it.
I continue to struggle with DLSS enthusiasm, almost entirely because of how it fails in Death Stranding. Nvidia was very careful to use footage in the game's city zones to show off how DLSS 2.0 reconstructs content in those areas, and does so quite well. But as soon as you get into any predominantly organic landscape (which is a high percentage of the game), and then add rain or other particle effects (particularly in cut scenes), DLSS 2.0 loses a ton of detail. I just checked this again last night (for no reason at all) and it's startling how much detail is lost in an average rainstorm once DLSS is turned on.
I fully believe DLSS will be able to account for a wider range of 3D rendering scenarios. But for now, it's an uncanny valley situation for me. As soon as you lose *any* detail that a game creator intended to be visible for the sake of atmosphere, then I'm out--and impressively reconstructed textures and signs don't make up for this.
Ideally DirectML will be competitive, and that should be vendor-agnostic.
At least that way, even if it's not quite as good as DLSS, you get something rather than nothing at all, in AMD-sponsored games.
I'm not sure if 2.0 can officially scale above 4K currently - though there's not really any reason why it shouldn't be able to.
2.1 adds a 9x scaling option which makes that easier to render by enabling 1440p to 8K scaling rather than only 2160p to 8K.
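Worth noting those factors are total pixel counts, so "9x" means 3x per axis. A quick sanity check (my own sketch, nothing official):

```python
# DLSS scaling factors are quoted as total pixel counts, i.e. NxN per axis.
def output_res(internal, per_axis_scale):
    """Output resolution from an internal render resolution and a per-axis factor."""
    w, h = internal
    return (w * per_axis_scale, h * per_axis_scale)

# 2160p -> 8K is 4x total (2x per axis):
print(output_res((3840, 2160), 2))  # (7680, 4320)
# 2.1's 9x option: 1440p -> 8K is 3x per axis:
print(output_res((2560, 1440), 3))  # (7680, 4320)
```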
In the absolute most basic way, you train an AI on edge patterns.
It knows that when it sees a low resolution pattern which follows this pixel structure and brightness:
The original high resolution image it was derived from looked like this:
And then the AI tries to transform the top image to look more like the bottom one.
This works because the AI is trained on huge datasets which compare very high resolution images against the exact same image rendered at a low resolution.
So it learns the difference that resolution makes to an otherwise identical scene, and figures out the way to reverse it; starting with a low resolution input and turning it into a high resolution output.
Of course it is far more complicated than that, but that's the most basic way I can think to explain it.
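To put the same idea in (very toy) code: this is just my own illustration of learning an upscaler from paired low/high-res data, using a linear map and NumPy rather than anything resembling the actual DLSS network:

```python
import numpy as np

rng = np.random.default_rng(0)

def downscale(hr):
    """2x box downscale: average each 2x2 block (this makes the 'low-res' input)."""
    return hr.reshape(hr.shape[0] // 2, 2, hr.shape[1] // 2, 2).mean(axis=(1, 3))

# Build a training set of (low-res, high-res) patch pairs.
hr_patches = [rng.random((8, 8)) for _ in range(2000)]
lr_patches = [downscale(p) for p in hr_patches]

# "Training": fit a linear map from 4x4 low-res patches to 8x8 high-res patches.
# (A real network is nonlinear and vastly larger, but the objective is the same:
# learn what the high-res original looked like, given only the low-res version.)
X = np.stack([p.ravel() for p in lr_patches])   # (2000, 16)
Y = np.stack([p.ravel() for p in hr_patches])   # (2000, 64)
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def upscale(lr):
    """Apply the learned low-res -> high-res mapping to a new patch."""
    return (lr.ravel() @ W).reshape(8, 8)
```

On real image data the pairs have structure (edges, gradients), which is what the model actually exploits; swap the linear map for a network and you get the nonlinear pattern matching described above.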
DLSS runs at the end of the frame. You can't reconstruct an image before the low resolution input has been created first.
Because of that, I think the requirement for tensor cores is overstated. Tensor cores run the reconstruction faster, but running this type of reconstruction on shaders, as RDNA 2 is said to, isn't stealing resources away from the rest of the GPU - the shaders have already finished most of their work for the frame by then.
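The ordering being described looks roughly like this (function names are made up for illustration; this is just the dependency order, not a real renderer):

```python
log = []  # records the order each stage runs, for illustration

def rasterize_low_res(scene):
    """Stand-in for the bulk of the shader work: produces the low-res frame."""
    log.append("rasterize")
    return "lr_color", "motion_vectors", "depth"

def reconstruct(lr_color, motion_vectors, depth, history):
    """Stand-in for the DLSS-style step: needs the finished low-res frame."""
    log.append("reconstruct")
    return "hr_color"

def draw_ui(img):
    log.append("ui")
    return img

def render_frame(scene, history):
    # Reconstruction runs at the end of the frame: the low-res image (and its
    # motion vectors) must exist before there is anything to upscale.
    lr, mv, depth = rasterize_low_res(scene)   # bulk of the GPU's work
    hr = reconstruct(lr, mv, depth, history)   # upscaling step, near frame end
    history.append(hr)
    return draw_ui(hr)                         # UI composited at output res after
```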
The thing is that DLSS builds up the image over multiple frames - at least eight of them - so if you're standing still it can do a fantastic job reconstructing the image to look just like native.
The lower the base resolution is, the more resolution you lose as soon as things start moving.
Now in some respects this is ideal for modern displays, since they blur the image as soon as anything is moving. But I do wonder whether this aspect of DLSS would be far more noticeable on an OLED running at 120Hz with BFI enabled for example. That display would have significantly less motion blur, and be more revealing of this aspect of DLSS, while a sample-and-hold LCD monitor will blur the image so much you may not notice it.
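The motion trade-off can be sketched like this: a toy exponential history blend with motion-based history rejection. This is my own illustration of the principle, not how DLSS actually weighs its samples:

```python
import numpy as np

def temporal_accumulate(current, history, motion_px, alpha=0.2):
    """Blend the current frame with reprojected history. The larger the motion,
    the less the reprojected history can be trusted, so its weight drops --
    which is why effective resolution falls as soon as things start moving."""
    # Toy reprojection: shift history along the motion vector.
    reprojected = np.roll(history, shift=motion_px, axis=-1)
    # History-rejection stand-in: weight history down as motion grows.
    w = alpha + min(abs(motion_px), 8) / 8 * (1 - alpha)
    return w * current + (1 - w) * reprojected

# Standing still: repeated samples of the same content converge on the signal.
signal = np.ones(16)
accum = np.zeros(16)
for _ in range(24):
    accum = temporal_accumulate(signal, accum, motion_px=0)
print(accum.mean())  # approaches 1.0 as frames accumulate

# Fast motion: history is fully rejected, leaving only one frame's worth of data.
moving = temporal_accumulate(signal, accum, motion_px=8)
```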
What are your thoughts on anti-aliasing? Particularly TAA.
wait, what? EDIT: forget it, I see you responded already re how you formed this opinion
yep, 2160p, all max settings. here's a 1920*1080 crop of one kind of scene I'm talking about with DLSS off, TAA on, 100% resolution scaling:
and now the same portion cropped at 1920*1080 with DLSS on, using the in-game "quality" DLSS preset:
...ignore the RTSS readings. but notice the change in active rain detail. this is hard to convey in a single screenshot, it's even more noticeable in action. what was once discrete rain is now mere haze. I have other examples in my Ars piece from a while back about the game's PC version otherwise being a stunner: https://arstechnica.com/gaming/2020...of-death-stranding-is-the-definitive-version/
Forgive the noob question but, if I have a 1440p monitor, could I use DLSS to effectively upscale the res? So I could use it to scale up to 4k?
Isn't this just a matter of the rain particles not having the correct motion data? A lot of particles in Death Stranding have similar issues.
It's not normal, per se, but it depends on what your final resolution and quality settings are. I.e. if you have the game at 1080p running Performance mode, the internal resolution for it will be 540p - at which point even for DLSS some details might lose cohesion. Usually DLSS has the opposite effect on flicker/shimmering, especially in remote background details.
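For reference, the per-axis internal render scales usually cited for the DLSS 2.0 presets (exact values can vary by title and version, so treat these as approximate):

```python
# Commonly cited per-axis render scales for DLSS 2.0 presets (approximate).
PRESET_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 1 / 2,
    "ultra_performance": 1 / 3,
}

def internal_res(output, preset):
    """Internal render resolution for a given output resolution and preset."""
    w, h = output
    s = PRESET_SCALE[preset]
    return round(w * s), round(h * s)

print(internal_res((1920, 1080), "performance"))  # (960, 540) -- the 540p case
print(internal_res((3840, 2160), "quality"))      # (2560, 1440)
```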
My first impression of DLSS was a bit mixed. The first time I ever tried it was with Control just a week ago, and it just so happened that in the very first thing I experienced, there was a bunch of obvious shimmering that disappears when you play the game at native res. Here are a couple of examples:
I'm assuming this is a normal artifact with DLSS enabled in Control, right?
I was doing 1440p with the rendered resolution set to 1706x960, which is the highest you can do for 1440p, and all settings maxed out.
Getting this as well. Performed nicely, but that shimmer is weird.
And like you, I was also running 1440p with the resolution set to 1706x960.
Hm. Definitely some kind of issue, then. Well, thankfully they'll keep improving it. And hopefully not just for future titles. :)
The only game I've tried DLSS with is Anthem, and the result was disgusting. Barely better than straight-up upscaling, with added artifacts. I take it it's using the 1.0 version?
Is it possible in the future to have DLSS as a base nvidia feature that you can toggle off and on for any game?
Anything that is not Control or Wolfenstein Youngblood at the moment is using 1.0.
DLSS is a temporal reconstruction. While the AI "imagines" the final picture, it does so based on several frames' worth of data, so the amount of actual data behind the final frame isn't that low. Though of course, the lower you go, the more history is needed for accurate reconstruction, and the more prone the image is to temporal artifacts.

Although there will always be limits to the minimum viable base resolution (you can't enhance something that isn't there to begin with!)
Wait, when did this happen?
eh, if Cyberpunk's rain won't show up due to DLSS, it's gonna be a big issue. Same for Watch Dogs 3. Hopefully Nvidia can do something about particles with the next iteration.
OH WOW. The rain missing in THIS game is just funny. Yeah, this is not a good look. I was able to look past the black streaks caused by certain particles, but the rain missing is a no-go. I wish Digital Foundry had pointed this out. I feel like some people are too positive about DLSS. It is amazing, but the problems have to be pointed out in every game so we can make educated decisions. At this point it seems to be TAA on steroids, but with certain problems. I wish we would just get a better AA solution.

That's a noticeable artistic difference as well; it looks like the game is missing effects.
I have to agree with you.
This is very specifically an issue with motion vectors for those rain particles. This isn't a DLSS issue so much as it's a bug with this game and it's implementation with these particles.
Seems kind of silly to be so hung up on the technology when everything else about the shot is amazing. Obviously this is an issue that should be addressed, and could be addressed on another title.
In fact we just saw a trailer for Cyberpunk with RTX and DLSS AND lots of particle effects without any obvious issues.
and just taking a look on YouTube, I can clearly see that this might be something a lot more noticeable in a still screenshot, and why no one else was reporting it, because in motion, there doesn't seem to be an issue at all:
If it were an issue with motion vectors you'd expect TAA to eat those rain particles too.
Though I cannot say that it is an inherent problem with DLSS. There is not enough data, or enough differing implementations of DLSS in natural environments, to make any declarative statement one way or the other.
Well, we know those cryptobiotes are rendered using GPU particles, and they have that trailing issue due to the motion vector implementation in the game not being accurate. So it stands to reason that other particles in this game have the same issue with motion vectors, and rain is using GPU particles too.
Conversely, would it be possible for DLSS eventually to become an engine feature that is just toggled on by the devs? Especially if they're doing the work for TAA anyway.

Unlikely. Nvidia has said that motion vectors from the engine are needed, so it will be a feature game developers need to add. In theory, any game using temporal antialiasing is a candidate for DLSS 2.x support, but of course devs are not going to go back and implement it.
That doesn't mean Nvidia could not come up with some different tech that works in any game and gives similar results.
I don't believe that game has been upgraded, no.
I understand and respect your point of view but I believe that the overwhelming majority of gamers have vastly different priorities. If you are able to achieve 4K60 at maximum detail then issues such as the ones you describe would matter. For people who don't have bleeding-edge hardware the choices would be a) don't use DLSS and upscale from a lower resolution, b) don't use DLSS and drop settings, c) don't use DLSS and put up with bad performance and d) use DLSS, avoid all that and perhaps get a slightly altered image.
The choice becomes a true no-brainer if you're using low-end or mid-range hardware. It's no longer an issue of how good the game looks and in what way; it's an issue of whether the game is playable at decent framerates or not. For most people DLSS is a free graphics card upgrade, and it saves them a lot of money. They are not going to care about small details that DLSS doesn't get right.