Did you read the comment I was responding to? If not, why respond without context? The topic is about dev tools. Bringing the player audience into this is beside the point.
> I love how quickly DLSS is becoming a ubiquitous feature. Easier access for UE4 developers is only going to further accelerate that. Something like a 2060 Super should age incredibly well if it becomes a "default" feature in new games.

I don't think it will. It's becoming the default needed option. Something like Control you run at 960p-1080p internal res using DLSS with a beefed-up 3070. They seem to be using it more to push the graphical bar even higher, not to give overhead. Like, it would be nice to take something at 1440p and push it to 4K easily. But it seems they're going the other way: sub-full-HD internal, using DLSS to give you something more like 1440p.
> I don't think it will. It's becoming the default needed option. Something like Control you run at 960p-1080p internal res using DLSS with a beefed-up 3070. [...]

For 90+% of people, 4K is too much. 1080p or 1440p at higher framerates seems to be the sweet spot for most, and DLSS is absolutely great for that purpose.
" One Network For All Games - The original DLSS required training the AI network for each new game. DLSS 2.0 trains using non-game-specific content, delivering a generalized network that works across games. This means faster game integrations, and ultimately more DLSS games."
NVIDIA DLSS 2.0: A Big Leap In AI Rendering (www.nvidia.com)
> For 90+% of people, 4K is too much. 1080p or 1440p at higher framerates seems to be the sweet spot for most, and DLSS is absolutely great for that purpose.

I feel like you missed the point. DLSS is being used to drive resolutions down, not maintain them and pump them up like we would want. Having to use DLSS to get to 1080p, on any card that supports the feature, with any game that's currently out, is absurd.
> I feel like you missed the point. DLSS is being used to drive resolutions down, not maintain them and pump them up like we would want. [...]

Oh, I see what you meant. While I agree that maybe it's making games more computationally expensive, it's still an option, and the more options, the better.
Interesting. Do devs still need to train, or is it all done by NVidia and embedded at the driver/plug-in level? I was always under the impression that for an ML model, the computationally intensive work is mostly at the training stage.
> Did you read the comment I was responding to? If not, why respond without context?

I got lost in my quoting spree. Calm down son, I removed it.
I'm calm, was just confused. I'm glad you seem to have calmed down, though. Kneejerk reactions can be stressful.
> Interesting. Do devs still need to train, or is it all done by NVidia and embedded at the driver/plug-in level? [...]

The training and the NN are maintained by Nvidia; you add the plugin to your project, set it up, and you're done. The plugin probably just hooks into the driver so Nvidia can keep prying eyes at bay as much as possible.
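For anyone curious what "set it up" amounts to in practice, here's a rough C++ sketch. This is my own illustration, not NVIDIA's documented API: the r.NGX.DLSS.* console variable names come from the plugin, but the exact names and value mappings may differ between plugin versions, so treat them as assumptions.

```cpp
#include "HAL/IConsoleManager.h"

// Hedged sketch: once the DLSS plugin is enabled for the project, the
// game-side work is mostly flipping console variables the plugin registers.
// Cvar names/values below are assumptions and may vary by plugin version.
static void EnableDLSS()
{
    if (IConsoleVariable* CVarEnable =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.NGX.DLSS.Enable")))
    {
        CVarEnable->Set(1); // turn DLSS on
    }
    if (IConsoleVariable* CVarQuality =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.NGX.DLSS.Quality")))
    {
        CVarQuality->Set(1); // pick a quality preset (assumed value mapping)
    }
}
```

Point being, the model itself never shows up in your code; you only toggle the feature and let the plugin/driver do the rest.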
Seems like preparing for more DLSS-compatible hardware. Wonder if the rumored Switch with DLSS is driving this timeline as well.
> I feel like you missed the point. DLSS is being used to drive resolutions down, not maintain them and pump them up like we would want. [...]

Not necessarily. There's a use case for people with 1080p 144Hz/165Hz/180Hz panels who would benefit from scaling up from 540p/720p/900p, even with a 3000-series card.
> Not necessarily. There's a use case for people with 1080p 144Hz/165Hz/180Hz panels who would benefit from scaling up from 540p/720p/900p, even with a 3000-series card.

High frame rates are a bit of a different story. I'm thinking more in line with 60 FPS, though I would think you would shut off ray tracing before you went very low-res for a high frame rate. Ultimately, I think that maybe the hardware really isn't quite there yet. I would guess maybe the 5xxx series. Have you seen the Digital Foundry review of The Medium yet? Internal 1080p on a 3090 can't maintain 60 FPS even without RTX on. That's... something.
So is this the new salt now that the "no games support it" argument is sinking? Don't worry: if you're building a game using UE with a Radeon GPU, you can still add the DLSS plugin and code it in so the majority of your PC customers can use it.
> Radeon users can use this too? I didn't see this specifically mentioned, but if that's the case - that's awesome!

They can't. You can probably add the option to the settings menu, though.
Currently, Nvidia basically gives you the trained model and you hook it into your game. Developers and consumers just do inference on the model, which is less intensive than training it, yes.
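To make the training-vs-inference split concrete, here's a toy C++ sketch. None of these types or functions are NVIDIA's actual API; the point is just that the shipped weights are read-only and the game only ever runs a forward pass:

```cpp
#include <cstddef>
#include <vector>

// Toy illustration only, not NVIDIA's API. The weights are baked offline
// by NVIDIA's training runs; the game ships them fixed and just evaluates
// the network ("inference") once per frame.
struct Image { int Width = 0, Height = 0; std::vector<float> Pixels; };
struct PretrainedNetwork { std::vector<float> Weights; }; // fixed at ship time

Image UpscaleFrame(const PretrainedNetwork& Net, const Image& LowResColor,
                   const Image& MotionVectors, const Image& History)
{
    // Forward pass only: no gradients, no weight updates, far cheaper
    // than training. (Stub body; the real pass runs on tensor cores.)
    (void)Net; (void)MotionVectors; (void)History;
    Image Out = LowResColor;
    Out.Width *= 2; Out.Height *= 2; // e.g. 1080p internal -> 4K-ish target
    Out.Pixels.resize(static_cast<std::size_t>(Out.Width) * Out.Height * 3);
    return Out;
}
```

Inference still isn't free per frame, which is exactly why dedicated hardware for it matters; it's just nothing like the cost of the offline training runs.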
> "EVERYONE*****"

Everyone being UE4 devs that couldn't implement DLSS in their games before.
*With an RTX GPU, aka the same people that could use it before.
Too late for that; it's on an older version of the engine, and it's probably not a demanding game anyway.
> So devs can use it; does that mean they can also release their games with the feature, or does that require licensing?

No licensing required.
> Too late for that; it's on an older version of the engine, and it's probably not a demanding game anyway.

SE has the staff to fit the plugin to their version of UE4. Not to mention Nvidia would be more than ready to jump in to assist with DLSS being in such a major game as KH3.
Wait. They said you can download it through the Marketplace, so I thought it would be a direct integration into the engine. The Marketplace page directs to an external link from Nvidia, and you have to download the file from there. I hope this doesn't mean you have to use the source version of UE4 and compile the plugin with it. It's gonna take an eternity, it will overheat the PC, and installing two instances of the same engine isn't practical.
> I love how quickly DLSS is becoming a ubiquitous feature. Easier access for UE4 developers is only going to further accelerate that. [...]

DLSS gave me like 5-10 frames on Control with full ultra.
> No licensing required.

I agree, but DLSS is only really needed in taxing games. A 2060 could probably do 4K60 just fine.
> I always have the suspicion that there is nothing special on RTX 20xx and 30xx that makes them unique for DLSS, since all the heavy lifting is at the training stage, and what the PC GPU really does is not much more than any modern AA solution. (Plus the fact that Control's DLSS 2.0 implementation doesn't require tensor cores.) Of course, there is no way for me to prove it. It's just a hunch.

Where is it stated that Control isn't using the tensor cores for acceleration?
> Where is it stated that Control isn't using the tensor cores for acceleration?
1.0 uses the tensor cores; it's just that it did an entirely different image treatment and maths to get its result than 2.0.
1.9 was the one that did not use the tensor cores.
I'm not really trusting Wikipedia when every other source I find says Control DLSS 2.0 is using the cores.
> I'm not really trusting Wikipedia when every other source I find says Control DLSS 2.0 is using the cores.

Well, since the wiki's source is actually Alex Battaglia, I guess you have to trust him on this one.
> Well, since the wiki's source is actually Alex Battaglia, I guess you have to trust him on this one.

Dictator, can you comment on this?
Control on release was using version 1.9, which was a version running on normal compute on the SMs. It did not use any AI network at all, but was a simple history-buffer upscaler using TAA. Much like... Spider-Man: Miles Morales, or what some UE4 titles have, like Gears 5.
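For anyone wondering what a "history buffer upscaler using TAA" boils down to, here's a very rough C++ sketch of the core accumulation step. This illustrates the general class of technique only; it is not Remedy's or NVIDIA's actual code:

```cpp
// Rough sketch of the heart of a TAA-style history-buffer upscaler, the
// kind of technique DLSS 1.9 reportedly ran on regular shader cores.
struct Color { float R, G, B; };

static Color Lerp(const Color& A, const Color& B, float T)
{
    return { A.R + (B.R - A.R) * T,
             A.G + (B.G - A.G) * T,
             A.B + (B.B - A.B) * T };
}

// Per output pixel, per frame: the caller reprojects last frame's
// accumulated result via motion vectors, then we blend in a small amount
// of the new jittered low-resolution sample.
static Color TemporalUpscalePixel(const Color& ReprojectedHistory,
                                  const Color& CurrentLowResSample,
                                  float BlendFactor = 0.1f) // low = trust history
{
    // Real implementations also clamp the history against the current
    // frame's local neighborhood to reject stale samples and reduce
    // ghosting; that step is omitted here for brevity.
    return Lerp(ReprojectedHistory, CurrentLowResSample, BlendFactor);
}
```

No neural network anywhere in that loop, which is the whole point of the 1.9-vs-2.0 distinction being discussed here.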
Looking at the Wikipedia page, it seems the confusion comes from the page listing DLSS 1.9 as 2.0. Nvidia was the one who called it 1.9, so I don't know where Wikipedia got "first iteration" from.
> Today's article is going to cover everything. We'll be looking at the latest titles to use DLSS, focusing primarily on Control and Wolfenstein: Youngblood, to see how Nvidia's DLSS 2.0 (as we're calling it) stacks up.
Seems pretty easy to install from their docs: https://nvdam.widen.net/s/68llfltprt/dlss_plugin_installation_guide
> IMO this is one of the primary reasons to be a PC gamer. I don't use it much since I spent way too much on a new PC, but the way this can make your hardware go so, so, so much further with little degradation in image quality is crazy. I hope consoles get it too, as I always want to have as many options available as possible - but the ability to play many games at 144Hz at 1440p or higher with DLSS is so nice. And then you take a game like Control, which is so demanding even on maxed-out hardware, and add in DLSS for full RTX support, and it's amazing. I can't believe I flipped so quickly into becoming a PC gamer again the last year or so, but here we are.

More info on AMD's Super Sampling solution can't come soon enough.
> Looking at the Wikipedia page, it seems the confusion comes from the page listing DLSS 1.9 as 2.0. [...]

I figured there was some confusion in play.
EDIT: Found the issue. From one of the linked sources:
Nvidia DLSS in 2020: Stunning Results (www.techspot.com)
This was before "DLSS 2.0" was actually named; there were just two implementations at the time, and no one knew this.
> Now DLSS can be implemented without NVidia's support, and all training is done on devs' computers? How does it work exactly?

Training was DLSS 1.0. No training is required for 2.0 and beyond, as they have a more general-purpose algorithm that works better.
> I always have the suspicion that there is nothing special on RTX 20xx and 30xx that makes them unique for DLSS [...]

You need the tensor cores on those cards for DLSS to work. It's not software-driven as you would assume.
> DLSS is life. Allows me to play the most demanding games maxed or nearly maxed at 3440x1440 and 60+ FPS. Curious, are there any non-RT games that use it?

Yeah, off the top of my head: MechWarrior 5, Edge of Eternity, CoD Warzone.
> I always have the suspicion that there is nothing special on RTX 20xx and 30xx that makes them unique for DLSS [...]

There would be no reason for them to waste die space on tensor cores if it wasn't to use them. Even if we were to think "they don't want older cards to also be able to use DLSS so as to sell more new cards", they could simply do that at the driver level.
I have the plug-in installed; however, I'm not getting the DLSS option in the drop-down menu. Anyone know what's up with that?
Do not underestimate how much resource is still needed for deep-learning stuff past the training stage; even something many would perceive as "simple", like generating text, is far from instantaneous when using something like GPT-3.
> More info on AMD's Super Sampling solution can't come soon enough.

Until we know more about it, UE4.26 has an experimental version of the new temporal AA and upsampling.
> This happened to me and the log said my driver was out of date. You'll need the latest one.

Was just about to post that this fixed it.
> I think the topic title is a little misleading. Good news for the devs, but I came to this topic with the wrong idea.

Did you think this means my PS5 will be able to take advantage of DLSS if games use Unreal Engine, like I am currently thinking? If so, darn. Was hoping that was the case!
> Did you think this means my PS5 will be able to take advantage of DLSS if games use Unreal Engine [...]? If so, darn. Was hoping that was the case!

If Nvidia allows DLSS to run on shader cores and on AMD, they could. But no, this is about the UE4 plug-in. The title isn't misleading.