i am telling you, dlss 2.0 is cheating lol. It is what keeps PC afloat this generation vs the monster consoles coming out.
It's a PS4 game, what else were you expecting?
Uh, what. That's... it?
I guess you can either run this game well or you can't. Bummer. Don't think my setup will manage in that case.
Aren't 2-year-old GPUs like the 2080 Ti still faster than the consoles?
I hope the fact that DLSS 2.0 does not need to be trained on a per-game basis means it will become the norm for demanding games. That would make the jump from my GTX 1080 to a (soon to be released?) GTX 3070/3080 even more impressive than what the raw numbers actually indicate.
Even better, Nvidia turned DLSS into a black box devs can slot in place of their TAA solution. So theoretically any game that supports TAA can use DLSS without a lot of work on the dev's part. Hopefully it becomes a common feature in future PC games.
But does DLSS 2.0 support HDR?
I kinda had the assumption that DLSS "broke" HDR. That's good to know
DLSS is still useful for 4K. I don't think it will be all that useful at 1080p or 1440p, considering that the cards that can use DLSS should have little issue pushing the game to high frame rates at native resolution.
Although I would say running a game at 720p and having the computer render out a better-than-native 1440p image is something I'd very much want, at least. Imagine how long a powerful GPU could last at that rate. Or even if you do 1080p -> 4K and render it out on a 1440p monitor...
Yes, but that's also the only GPU that's faster, and it's a 1k+ € card, not something the majority of PC gamers have.
Thanks for the reply.
They just added these categories referring to NVIDIA cloud gaming, but I'm not sure if this is a reliable indicator. https://steamdb.info/app/1190460/history/?changeid=8831613
Man, if Horizon Zero Dawn ships with DLSS 2.0 on the same engine, it's gonna look absolutely nuts. Drop that release date already, Guerrilla.
So DLSS is Nvidia's proprietary tech for their products. Does AMD have something similar for the next-gen consoles (PS5 and XSX)? Has anything been said?
Not just a "software" patent situation here. DLSS 2.0 uses the tensor cores in RTX cards, originally developed for HPC and AI computing, to reconstruct the image.
AMD released their own FidelityFX reconstruction with RDNA cards, which is a form of content-aware sharpening with decent results. In fact, people preferred it over DLSS 1.0 in the early days, while the PS4 Pro uses a method called checkerboard rendering to reconstruct higher resolutions.
DLSS 2.0 seems to be vastly superior to both methods though, and it's currently aging like fine wine (to use GPU slang terms). It would surely be possible for AMD to develop and use their own AI-powered reconstruction tech, but the PS5 and Series X do not have dedicated hardware with that function in mind.
Edit: Example
Control - 1440p - maxed out
Control - 1440p/DLSS - maxed out (better IQ and Performance)
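For anyone curious what checkerboard rendering actually does: the GPU shades only half the pixels each frame in a checker pattern and fills in the holes from their neighbours (real console implementations also reuse pixels from the previous frame). A minimal, spatial-only toy sketch:

```python
# Toy checkerboard fill: pixels at (x + y) odd are "unshaded" this frame
# and get filled with the average of their shaded neighbours.
# Real implementations also blend in the previous frame's pixels.
def fill_checkerboard(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == 1:  # hole: not shaded this frame
                neighbours = [img[ny][nx]
                              for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                              if 0 <= ny < h and 0 <= nx < w]
                out[y][x] = sum(neighbours) / len(neighbours)
    return out

grid = [[4, 0, 8],
        [0, 6, 0],
        [2, 0, 4]]  # zeros mark the unshaded half of the checker pattern
print(fill_checkerboard(grid))
```

So half the shading work per frame, at the cost of reconstructed pixels being a guess based on their surroundings.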
But I just bought a PC1 two months ago!
PC will always be a step ahead. But just wait for the PC2 coming out next gen.
According to Ars Technica, it looks like AMD has a match for Nvidia's DLSS 2.0.
Why this month’s PC port of Death Stranding is the definitive version [Updated]
A major embargo is up, so we've added comparison images for anti-aliasing methods. (arstechnica.com)
I really want to see the comparison videos between the two methods and if FidelityFX CAS can get the same sort of performance boost as DLSS 2.0, then that's really good for next gen consoles. I guess that means there is no real advantage to having the tensor cores on the RTX cards.
The problem is you can only do so much before you need a neural network and/or dedicated hardware to run it.
I fully expect some really nice reconstruction techniques from Sony first parties, and hopefully AMD too. We had some really good solutions on current consoles, and the potential available performance next gen means they should only get better.
So I have no real use for it since I always play at 1080p, I guess? I don't really care for 4K; at the distance I'm playing from my TV, even though I can see the difference, it's not worth the performance hit.
The problem is you can only do so much before you need a neural network and/or dedicated hardware to run it.
Reconstruction is basically creating new information based on existing information, and a neural network is always going to be far superior at producing that new information than an ad hoc method without one. And while you can run a neural network solution in software, it's always going to run worse than a hardware-based approach like tensor cores. Since DLSS has both of those advantages, running a neural network and doing so on dedicated hardware, it's always going to do better than a solution that has neither of them, or even a solution that has just one of them.
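To make "creating new information based on existing information" concrete, here's the crudest possible ad hoc method: plain linear interpolation, which invents in-between values by averaging neighbours. A trained network instead predicts the missing values from patterns it learned, which is why it can recover detail that averaging never will:

```python
# Toy 1D "upscale": double the sample count by averaging neighbours.
# This is the ad hoc end of the spectrum; a neural network would instead
# infer the missing samples from patterns learned during training.
def upscale_2x(samples):
    out = []
    for a, b in zip(samples, samples[1:]):
        out.append(a)
        out.append((a + b) / 2)  # invented in-between value
    out.append(samples[-1])
    return out

print(upscale_2x([0, 10, 20]))  # [0, 5.0, 10, 15.0, 20]
```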
It would absolutely be useful to you. You could go for 1080p DLSS Quality Mode, which renders at 720p before doing the reconstruction and would give you a nice performance boost or more room to boost graphical effects. Or, you could do 4K DLSS Performance Mode, which renders at 1080p initially, and would give you a great boost in detail with no drop in performance.
Honestly, you should always use DLSS if you can.
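For reference, the internal render resolution per mode follows fixed per-axis scale factors (Quality at 2/3, Performance at 1/2; the Balanced value below is an approximation). A quick sketch that reproduces the numbers above:

```python
# Rough sketch of DLSS internal render resolutions per quality mode.
# Scale factors are per-axis: Quality = 2/3, Performance = 1/2;
# the Balanced factor here (~0.58) is an approximation.
DLSS_SCALES = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 1 / 2,
}

def internal_resolution(width, height, mode):
    """Return the (width, height) DLSS renders at before reconstruction."""
    s = DLSS_SCALES[mode]
    return round(width * s), round(height * s)

# 1080p Quality mode renders at 720p before reconstruction...
print(internal_resolution(1920, 1080, "quality"))      # (1280, 720)
# ...and 4K Performance mode renders at 1080p.
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```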
This made my decision to upgrade my 1060 to whatever Nvidia delivers in the next 2 years extremely easy, I must say.
I hope AMD can deliver similar or equivalent functionality with RDNA2, including on console. This tech is becoming more important, and if a large number of games adopt it then it might become decisive.
Yea, but you still need hardware to understand those generic instructions and compute the information, and the tensor cores are built to crunch these numbers quickly. It's kind of like how you can do ray tracing on any hardware, but because the RTX cards have dedicated RT cores which specialise in crunching numbers for RT, they can accelerate that process considerably, leading to superior results.
This is where my ignorance comes in (and hopefully some learning) - isn't a neural net usually trained off-board, with the results then executed locally on weaker hardware?
With DLSS 1, with each game needing to be 'taught', that would seem to follow that model. But DLSS 2 is a 'generic' model, so that would suggest to me, as a layperson, that it is more of a jack of all trades (albeit a great one) and that the approach could be reusable (with changes perhaps) on other platforms?
Wow, that sounds really promising for the FidelityFX CAS version, which would apparently even work on 1xxx-series Nvidia cards. Sounds like performance/image quality is similar, although I could have sworn a different article said that CAS had more noticeable issues.
Basically, it's a post-processing filter that does similar work as DLSS, only you don't need an RTX card. The problem is that — at least in the preview build — the sharpening and upscaling causes some visible shimmer. It's not terrible, and it's a way to boost framerates that some people will undoubtedly appreciate, but the effect was certainly noticeable when moving around.
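The sharpening half of that filter is conceptually an unsharp-mask-style operation: push each pixel away from its local average so edges gain contrast. A simplified 1D sketch (note this is not AMD's actual CAS algorithm, which adapts the strength per pixel based on local contrast, hence "contrast-adaptive"):

```python
# Simplified unsharp-mask-style sharpen on a 1D signal: push each value
# away from the average of its two neighbours. AMD's real CAS algorithm
# is contrast-adaptive, varying the amount per pixel; this fixed-amount
# version just shows the basic mechanism.
def sharpen(signal, amount=0.5):
    out = signal[:]
    for i in range(1, len(signal) - 1):
        local_avg = (signal[i - 1] + signal[i + 1]) / 2
        out[i] = signal[i] + amount * (signal[i] - local_avg)
    return out

print(sharpen([10, 10, 20, 20]))  # edge contrast increases: [10, 7.5, 22.5, 20]
```

Over-applying a fixed-strength filter like this is exactly where the shimmer the article mentions comes from, which is why the real thing clamps and adapts per pixel.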
Is DLSS available on any Nvidia GPU? For example, would it work on my 1080?
Only available on the RTX series for 2.0.
If AMD can't find a competing technology soon, is there even a point in buying AMD cards over Nvidia? The kind of performance benefit DLSS 2.0 gives in AAA games is just bonkers. Especially with the upcoming next-gen only games, most of which will probably support DLSS 2.0, it's really difficult to see AMD competing. And I say that as an owner of an AMD CPU and GPU.