Deleted member 19533

User requested account closure
Banned
Oct 27, 2017
3,873
I love how quickly DLSS is becoming a ubiquitous feature. Easier access for UE4 developers is only going to further accelerate that.

Something like a 2060 Super should age incredibly well if it becomes a "default" feature in new games.
I don't think it will. It's becoming the option you need by default. In something like Control you run at a 960p-1080p internal resolution using DLSS even with a beefed-up 3070. They seem to be using it to push the graphical bar even higher, not to give you overhead. It would be nice to take something at 1440p and push it to 4K easily, but it seems they're going the other way: a sub-full-HD internal resolution, with DLSS giving you something more like 1440p.
 

Dekuman

Member
Oct 27, 2017
19,026
Seems like they're preparing for more DLSS-compatible hardware. Wonder if the rumored Switch with DLSS is driving this timeline as well.
 

kami_sama

Member
Oct 26, 2017
6,993
I don't think it will. It's becoming the option you need by default. In something like Control you run at a 960p-1080p internal resolution using DLSS even with a beefed-up 3070. They seem to be using it to push the graphical bar even higher, not to give you overhead. It would be nice to take something at 1440p and push it to 4K easily, but it seems they're going the other way: a sub-full-HD internal resolution, with DLSS giving you something more like 1440p.
For 90%+ of people, 4K is too much. 1080p or 1440p at higher framerates seems to be the sweet spot for most, and DLSS is absolutely great for that purpose.
 

luoapp

Member
Oct 27, 2017
505
" One Network For All Games - The original DLSS required training the AI network for each new game. DLSS 2.0 trains using non-game-specific content, delivering a generalized network that works across games. This means faster game integrations, and ultimately more DLSS games."

NVIDIA DLSS 2.0: A Big Leap In AI Rendering (www.nvidia.com): "Through the power of AI and GeForce RTX Tensor Cores, NVIDIA DLSS 2.0 enables a new level of performance and visuals for your games - available now in MechWarrior 5: Mercenaries and coming this week to Control."

Interesting. Do devs still need to train, or is it all done by NVIDIA and embedded at the driver/plug-in level? I was always under the impression that for an ML model, the computationally intensive work is mostly at the training stage.
 

Deleted member 19533

User requested account closure
Banned
Oct 27, 2017
3,873
For 90%+ of people, 4K is too much. 1080p or 1440p at higher framerates seems to be the sweet spot for most, and DLSS is absolutely great for that purpose.
I feel like you missed the point. DLSS is being used to drive resolutions down, not maintain them and pump them up like we would want. Having to use DLSS to get to 1080p on any card that supports the feature, with any game that's currently out, is absurd.
 

kami_sama

Member
Oct 26, 2017
6,993
I feel like you missed the point. DLSS is being used to drive resolutions down, not maintain them and pump them up like we would want. Having to use DLSS to get to 1080p on any card that supports the feature, with any game that's currently out, is absurd.
Oh, I see what you meant. While I agree that it may be making games more computationally expensive, it's still an option, and the more options, the better.
 

GhostofWar

Member
Apr 5, 2019
512
Interesting. Do devs still need to train, or is it all done by NVIDIA and embedded at the driver/plug-in level? I was always under the impression that for an ML model, the computationally intensive work is mostly at the training stage.

The training and the neural network are maintained by NVIDIA; you add the plugin to your project, set it up, and you're done. The plugin probably just hooks into the driver so NVIDIA can keep prying eyes at bay as much as possible.
 

luoapp

Member
Oct 27, 2017
505
The training and the neural network are maintained by NVIDIA; you add the plugin to your project, set it up, and you're done. The plugin probably just hooks into the driver so NVIDIA can keep prying eyes at bay as much as possible.

"maintain" is the keyword here. I believe the bulk of the training is done, maybe as early as they announced DLSS 2.0.
 

Rpgmonkey

Member
Oct 25, 2017
1,347
Interesting. Do devs still need to train, or is it all done by NVIDIA and embedded at the driver/plug-in level? I was always under the impression that for an ML model, the computationally intensive work is mostly at the training stage.

Currently, NVIDIA basically gives you the trained model and you hook it into your game. Developers and consumers just run inference on the model, which is less intensive than training it, yes.

There are some settings to tweak, but from what I've seen in the brief hour I've put into the plugin so far, there isn't anything like training a new model entirely.

I assume that as they make improvements to the model they'll push them as updates to the plugin and developers won't have to do too much beyond making sure nothing broke.
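For context, a minimal sketch of what the integration surface looks like from the game side, assuming the plugin exposes its settings through console variables. The cvar names below (r.NGX.DLSS.Enable, r.NGX.DLSS.Quality) are assumptions based on NVIDIA's NGX branding, not confirmed from the plugin docs:

```cpp
// Hypothetical sketch: toggling DLSS at runtime from a UE4 game module.
// The cvar names are assumptions; check the plugin's documentation for
// the real ones. No training happens here - the shipped network weights
// are loaded by the plugin/driver, and the game only runs inference.
#include "HAL/IConsoleManager.h"

static void EnableDLSS(bool bEnable, int32 QualityMode)
{
    // Master toggle for the upscaler (assumed name).
    if (IConsoleVariable* Enable =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.NGX.DLSS.Enable")))
    {
        Enable->Set(bEnable ? 1 : 0);
    }

    // Quality / Balanced / Performance / Ultra Performance preset (assumed name).
    if (IConsoleVariable* Quality =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.NGX.DLSS.Quality")))
    {
        Quality->Set(QualityMode);
    }
}
```

Which matches what's described above: the heavy lifting (training) is already baked into the distributed network, so the per-title work is mostly wiring up inputs like motion vectors and exposing a quality toggle.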

Seems like they're preparing for more DLSS-compatible hardware. Wonder if the rumored Switch with DLSS is driving this timeline as well.

There are two generations of RTX GPUs available, and they've been building on it in a publicly visible way in several high-end game engines for nearly three years now, probably longer behind the scenes.

I think they've just gotten enough data on how it behaves in various game engines and reached a state where there isn't as much risk of a random amateur dev getting their hands on it and getting awful results, so access to it doesn't need to be curated anymore. I don't think Unity has (widely accessible) support yet but I can see that happening within the year.
 

PLASTICA-MAN

Member
Oct 26, 2017
23,496
Wait. They said you can download it through the Marketplace, so I thought it would be a direct integration into the engine. The Marketplace page directs to an external link from NVIDIA and you have to download the file from there. I hope this doesn't mean you have to use the source version of UE4 and compile the plugin with it. That's gonna take an eternity and overheat the PC, and installing two instances of the same engine isn't practical.
 

SnazzyNaz

The Wise Ones
Member
Nov 11, 2019
1,870
I feel like you missed the point. DLSS is being used to drive resolutions down, not maintain them and pump them up like we would want. Having to use DLSS to get to 1080p on any card that supports the feature, with any game that's currently out, is absurd.
Not necessarily. There's a use case for people with 1080p 144Hz/165Hz/180Hz panels who would benefit from scaling up from 540p/720p/900p, even with a 3000-series card.
 

Deleted member 19533

User requested account closure
Banned
Oct 27, 2017
3,873
Not necessarily. There's a use case for people with 1080p 144Hz/165Hz/180Hz panels who would benefit from scaling up from 540p/720p/900p, even with a 3000-series card.
High frame rates are a bit of a different story. I'm thinking more in line with 60FPS, though I would think you would shut off ray tracing before you went very low-res for a high frame rate. Ultimately, I think the hardware really isn't quite there yet. I would guess maybe the 5xxx series. Have you seen the Digital Foundry review of The Medium yet? Internal 1080p on a 3090 can't maintain 60FPS even without RTX on. That's.... something.
 

ShinUltramanJ

Member
Oct 27, 2017
12,949
So is this the new salt, seeing as the "no games support it" argument is sinking? Don't worry, if you're building a game using UE with a Radeon GPU, you can still add the DLSS plugin and code it in so the majority of your PC customers can use it.

Radeon users can use this too? I didn't see this specifically mentioned, but if that's the case - that's awesome!
 

luoapp

Member
Oct 27, 2017
505
Currently, NVIDIA basically gives you the trained model and you hook it into your game. Developers and consumers just run inference on the model, which is less intensive than training it, yes.

I've always had a suspicion that there's nothing special about RTX 20xx and 30xx that makes them unique for DLSS, since all the heavy lifting is at the training stage and what the PC GPU really does is not much more than any modern AA solution (plus the fact that Control's DLSS 2.0 implementation doesn't require tensor cores). Of course, there is no way for me to prove it. It's just a hunch.
 

AndyD

Mambo Number PS5
Member
Oct 27, 2017
8,602
Nashville
So devs can use it; does that mean they can also release their games with the feature, or does that require licensing?
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
So devs can use it; does that mean they can also release their games with the feature, or does that require licensing?
no licensing required

Too late for that; it's on an older version of the engine and it's probably not a demanding game anyway
SE has the staff to fit the plugin to their version of UE4. Not to mention NVIDIA would be more than ready to jump in and assist with getting DLSS into such a major game as KH3.
 

julia crawford

Took the red AND the blue pills
Member
Oct 27, 2017
35,065
Wait. They said you can download it through the Marketplace, so I thought it would be a direct integration into the engine. The Marketplace page directs to an external link from NVIDIA and you have to download the file from there. I hope this doesn't mean you have to use the source version of UE4 and compile the plugin with it. That's gonna take an eternity and overheat the PC, and installing two instances of the same engine isn't practical.

Seems pretty easy to install from their docs: https://nvdam.widen.net/s/68llfltprt/dlss_plugin_installation_guide
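To the worry about compiling from source: a prebuilt UE4 plugin normally just gets dropped into the project, with no engine source build or second engine install required. A rough sketch of what that layout typically looks like, assuming the NVIDIA download ships prebuilt binaries as the linked guide suggests (the folder and file names here are illustrative, follow the guide for the real ones):

```
MyProject/
  MyProject.uproject       <- then enable the plugin via Edit > Plugins
  Plugins/
    DLSS/                  <- unzipped NVIDIA download
      DLSS.uplugin
      Binaries/            <- prebuilt, so no compiling from source
```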
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,196
Dark Space
I've always had a suspicion that there's nothing special about RTX 20xx and 30xx that makes them unique for DLSS, since all the heavy lifting is at the training stage and what the PC GPU really does is not much more than any modern AA solution (plus the fact that Control's DLSS 2.0 implementation doesn't require tensor cores). Of course, there is no way for me to prove it. It's just a hunch.
Where is it stated that Control isn't using the tensor cores for acceleration?
 

luoapp

Member
Oct 27, 2017
505
Where is it stated that Control isn't using the tensor cores for acceleration?

1.0 uses the tensor cores; it's just that it did an entirely different image treatment and maths to get its result than 2.0.

1.9 was the one that did not use the tensor cores.

It's also on wiki:
First 2.0 version, also referenced as version 1.9, using an approximated AI of the in-progress version 2.0 running on the CUDA shader cores and specifically adapted for Control

Control may have been updated by now, though.
 

Calvin

Member
Oct 26, 2017
1,580
IMO this is one of the primary reasons to be a PC gamer. I don't use it much since I spent way too much on a new PC, but the way this can make your hardware go so, so, so much further with little degradation in image quality is crazy. I hope consoles get it too, as I always want to have as many options available as possible, but the ability to play many games at 144Hz at 1440p or higher with DLSS is so nice. And then you take a game like Control, which is so demanding even on maxed-out hardware, and add in DLSS for full RTX support and it's amazing.

I can't believe I flipped so quickly into becoming a PC gamer again the last year or so, but here we are.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,928
Berlin, 'SCHLAND
It's also on wiki:

Control may have been updated by now, though.
Dictator can you comment on this?

I am known to be wrong, notoriously.
Control on release was using version 1.9, which was a version running on normal compute on the SMs. It did not use an AI network at all, but was a simple history-buffer upscaler using TAA. Much like... Spider-Man: Miles Morales, or what some UE4 titles have, like Gears 5.

Then Control, post-release, was updated to version 2.0 which uses a neural network to decide how to integrate past frames. That neural network is running on the tensor cores (it would be very slow on the normal compute SMs).

Then Control was updated to 2.1 most recently, adding support for Ultra Low Resolution DLSS (Ultra Performance mode) and, perhaps, some quality improvements.
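To make the 1.9-vs-2.0 distinction concrete, here is a heavily simplified sketch of a history-buffer temporal upscaler of the kind described above. This is illustrative pseudocode only, not NVIDIA's implementation; the essential difference between the two approaches is who decides the blend weight (a hand-tuned heuristic vs. a neural network on the tensor cores):

```cpp
// Illustrative sketch only, not NVIDIA's code: one step of a
// history-buffer temporal upscaler, single-channel for brevity.
// The current frame is rendered at low resolution; the history
// buffer accumulates detail at output resolution.
#include <algorithm>
#include <vector>

struct Frame {
    int width = 0, height = 0;
    std::vector<float> color;  // output-res accumulation buffer
};

// DLSS 1.9-style: a fixed, hand-tuned blend weight. DLSS 2.0's key
// change is replacing this heuristic with a per-pixel weight decided
// by a neural network running on the tensor cores.
float BlendWeightHeuristic() { return 0.9f; }

void TemporalUpscaleStep(Frame& history,
                         const std::vector<float>& currentLowRes,
                         int lowW, int lowH,
                         const std::vector<float>& motionX,  // per output pixel
                         const std::vector<float>& motionY)
{
    const std::vector<float> prev = history.color;  // double-buffer the history

    for (int y = 0; y < history.height; ++y) {
        for (int x = 0; x < history.width; ++x) {
            const int idx = y * history.width + x;

            // 1. Reproject: fetch where this pixel was last frame.
            const int px = std::clamp(x - int(motionX[idx]), 0, history.width - 1);
            const int py = std::clamp(y - int(motionY[idx]), 0, history.height - 1);
            const float reprojected = prev[py * history.width + px];

            // 2. Point-sample the low-res render for this frame's new sample.
            const int lx = std::min(x * lowW / history.width, lowW - 1);
            const int ly = std::min(y * lowH / history.height, lowH - 1);
            const float current = currentLowRes[ly * lowW + lx];

            // 3. Blend: history dominates, so detail accumulates over frames.
            const float w = BlendWeightHeuristic();
            history.color[idx] = w * reprojected + (1.0f - w) * current;
        }
    }
}
```

The blend weight is exactly where a learned model earns its keep: a fixed weight ghosts on disocclusions, while a network can reject stale history per pixel.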
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Dictator can you comment on this?

I am known to be wrong, notoriously.
Looking at the Wikipedia page, it seems the confusion comes from the page listing DLSS 1.9 as 2.0. NVIDIA was the one who called it 1.9, so I don't know where Wikipedia got "first iteration" from.

EDIT: found this issue.

from one of the linked sources:
Today's article is going to cover everything. We'll be looking at the latest titles to use DLSS, focusing primarily on Control and Wolfenstein: Youngblood, to see how Nvidia's DLSS 2.0 (as we're calling it) stacks up.
Nvidia DLSS in 2020: Stunning Results (www.techspot.com): "We've been waiting to reexamine Nvidia's Deep Learning Super Sampling (DLSS) for a long time and after a thorough new investigation we're glad to report that DLSS..."

This was before "DLSS 2.0" was actually named; there were just two implementations at the time and no one knew this.
 

brain_stew

Member
Oct 30, 2017
4,720
I agree, but DLSS is only really needed in taxing games. A 2060 could probably do 4K60 just fine.

DLSS can give better image quality alongside increased performance; that's never not useful. Even in games that some don't perceive as demanding, it's still beneficial.

Hitting a completely locked 4K60, without a single frame exceeding 16.6ms, is a lot more demanding than many would lead you to believe. Even in games that are seen as less demanding, it's not exactly a cakewalk on my 3070. We'll start to see DLSS hit lower price points as the year progresses, making it even more useful.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,196
Dark Space
IMO this is one of the primary reasons to be a PC gamer. I don't use it much since I spent way too much on a new PC, but the way this can make your hardware go so, so, so much further with little degradation in image quality is crazy. I hope consoles get it too, as I always want to have as many options available as possible, but the ability to play many games at 144Hz at 1440p or higher with DLSS is so nice. And then you take a game like Control, which is so demanding even on maxed-out hardware, and add in DLSS for full RTX support and it's amazing.

I can't believe I flipped so quickly into becoming a PC gamer again the last year or so, but here we are.
More info on AMD's Super Sampling solution can't come soon enough.

Looking at the Wikipedia page, it seems the confusion comes from the page listing DLSS 1.9 as 2.0. NVIDIA was the one who called it 1.9, so I don't know where Wikipedia got "first iteration" from.

EDIT: found this issue.

from one of the linked sources:

Nvidia DLSS in 2020: Stunning Results (www.techspot.com): "We've been waiting to reexamine Nvidia's Deep Learning Super Sampling (DLSS) for a long time and after a thorough new investigation we're glad to report that DLSS..."

This was before "DLSS 2.0" was actually named; there were just two implementations at the time and no one knew this.
I figured there was some confusion in play.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
I have the plug-in installed, however I'm not getting the DLSS option in the drop-down menu. Anyone know what's up with that?

 

SiG

Member
Oct 25, 2017
6,485
Now DLSS can be implemented without NVIDIA's support, and all the training is done on devs' computers? How does it work exactly?
Per-game training was DLSS 1.0. No per-game training is required for 2.0 and beyond, as they have a more general-purpose network that works better.
I've always had a suspicion that there's nothing special about RTX 20xx and 30xx that makes them unique for DLSS, since all the heavy lifting is at the training stage and what the PC GPU really does is not much more than any modern AA solution (plus the fact that Control's DLSS 2.0 implementation doesn't require tensor cores). Of course, there is no way for me to prove it. It's just a hunch.
You need the tensor cores on those cards for DLSS to work. It's not software-driven as you assume.
 

Deleted member 12833

User requested account closure
Banned
Oct 27, 2017
10,078
DLSS is life. It allows me to play the most demanding games maxed or nearly maxed at 3440x1440 and 60+ FPS.

Curious, are there any non-RT games that use it?
 

elyetis

Member
Oct 26, 2017
4,547
I've always had a suspicion that there's nothing special about RTX 20xx and 30xx that makes them unique for DLSS, since all the heavy lifting is at the training stage and what the PC GPU really does is not much more than any modern AA solution (plus the fact that Control's DLSS 2.0 implementation doesn't require tensor cores). Of course, there is no way for me to prove it. It's just a hunch.
There would be no reason for them to waste die space on tensor cores if they weren't going to use them. Even if we think "they don't want older cards to also be able to use DLSS, so as to sell more new cards," they could simply do that at the driver level.

Do not underestimate how many resources are still needed for deep learning past the training stage; even something many would perceive as 'simple', like generating text, is far from instantaneous when using something like GPT-3.
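To put some rough numbers on that point, here is a sketch of the arithmetic for a small, entirely hypothetical convolutional network run at 4K every frame. NVIDIA hasn't published DLSS's actual architecture, so the layer count and channel widths below are illustrative only:

```cpp
// Rough arithmetic for why real-time inference is non-trivial: FLOPs per
// frame for a small hypothetical CNN at 4K. The architecture (five 3x3
// conv layers, 32 channels) is made up for illustration.
#include <cstdio>

int main() {
    const double W = 3840, H = 2160;  // output pixels
    const double k = 3, cin = 32, cout = 32, layers = 5;

    // ~2 FLOPs per multiply-accumulate; MACs per conv layer: H*W * k*k * cin * cout.
    const double flopsPerLayer = 2.0 * W * H * k * k * cin * cout;
    const double flopsPerFrame = flopsPerLayer * layers;
    const double flopsPerSec60 = flopsPerFrame * 60.0;

    std::printf("per frame: %.2f TFLOPs, at 60fps: %.1f TFLOPs/s\n",
                flopsPerFrame / 1e12, flopsPerSec60 / 1e12);
    // ~0.76 TFLOPs/frame, ~46 TFLOPs/s: even this toy network approaches
    // the FP16 shader throughput of a big GPU, which is the usual argument
    // for dedicated tensor hardware.
    return 0;
}
```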
 

luoapp

Member
Oct 27, 2017
505
There would be no reason for them to waste die space on tensor cores if they weren't going to use them. Even if we think "they don't want older cards to also be able to use DLSS, so as to sell more new cards," they could simply do that at the driver level.

Do not underestimate how many resources are still needed for deep learning past the training stage; even something many would perceive as 'simple', like generating text, is far from instantaneous when using something like GPT-3.

NVIDIA may be the company that has profited most from the AI gold rush (by selling shovels, smart), and they have tens of billions of reasons to push their AI solution whenever they can. Since DLSS is a closed algorithm, there really isn't a way for an outsider to prove or disprove it. But my hunch is not completely pulled out of thin air; remember RTX Voice? NVIDIA claimed you needed an RTX card to make it work because of the "AI capabilities," but it turns out you can trick the program into running on a GTX card. Of course, this is circumstantial evidence, so as I said, the DLSS thing stays a hunch for now.
 

Cloud-Strife

Alt-Account
Banned
Sep 27, 2019
3,140
I think the topic title is a little misleading.

Good news for the devs, but I came into this topic with the wrong idea.
 

Deleted member 17184

User-requested account closure
Banned
Oct 27, 2017
5,240
I don't imagine every upcoming UE4 game will have it, though, just those that either start development with this or can afford to upgrade to 4.26 if they weren't using that version before (which can be a very risky move in some scenarios).
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Does this mean my PS5 will be able to take advantage of DLSS if games use Unreal Engine, like I am currently thinking? If so, darn. Was hoping that was the case!
If NVIDIA allowed DLSS to run on shader cores and on AMD hardware, they could. But no, this is about the UE4 plug-in; the title isn't misleading.

Did a really small test. Gonna test out that new Medieval Scene Epic/Quixel put out a couple of days ago next.



RT GI/AO/Reflections (samples at 32) on

DLSS Off - 24fps
Quality - 44fps
Performance - 64fps
Ultra Performance - 95fps
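
A quick sanity check on those numbers: frame times follow directly from the fps, and the internal resolutions below assume a 4K output (the test's actual output resolution isn't stated) together with the commonly cited DLSS 2.x per-axis scale factors of roughly 67% (Quality), 50% (Performance), and 33% (Ultra Performance):

```cpp
// Back-of-the-envelope check on the figures above: frame time from the
// reported fps, plus the internal render resolution each DLSS mode would
// use at an assumed 4K output.
#include <cstdio>

int main() {
    struct Mode { const char* name; double fps; double scale; };
    const Mode modes[] = {
        { "DLSS Off",          24.0, 1.0       },
        { "Quality",           44.0, 2.0 / 3.0 },
        { "Performance",       64.0, 0.5       },
        { "Ultra Performance", 95.0, 1.0 / 3.0 },
    };
    const int outW = 3840, outH = 2160;  // assumed 4K output

    for (const Mode& m : modes) {
        std::printf("%-17s %5.1f ms/frame, internal %4d x %4d\n",
                    m.name, 1000.0 / m.fps,
                    int(outW * m.scale), int(outH * m.scale));
    }
    // e.g. DLSS Off: 41.7 ms/frame; Ultra Performance: 10.5 ms/frame
    // at a 1280x720 internal resolution.
    return 0;
}
```

So the jump from 24fps to 95fps lines up with shading roughly a ninth of the pixels and letting the upscaler reconstruct the rest.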