
Csr

Member
Nov 6, 2017
2,031
DLSS is magic, but I think it's becoming somewhat exaggerated.

The screenshots you linked are not that close if you have good eyesight and are looking at details (surely this is most of the point of 4K; at least it is for me). The "emergency light" text is hugely blurrier on the 720p upscale (which isn't surprising, but needs to be said). Ditto for the '6 persons' text on the left of the screen, and the seams on the leather jacket are indistinct and soft. Without high-clarity fine details, the point of calling it "4K" becomes questionable. The aliasing reduction is great, though TAA is alright at this anyway.

Control is the perfect use case, because the RT effects are very expensive and the image was never pin-sharp to begin with. 4K Control has very even image quality, but I'd not describe it as 'sharp'. All those post-processing effects and TAA (including DLSS) leave it looking soft. Nice, but soft.

These are the first differences someone has mentioned that I can actually see, but there are also some details that are better with DLSS: see Jesse's outline on the right, on her hair and arm. The outline is blurred at native 4K, which isn't the case in the DLSS screens.

Edit: After zooming in, it's very obvious that the native 4K screenshot is much better, but I'm not sure how noticeable that would be while playing.
 

FluffyQuack

Member
Nov 27, 2017
1,353
I think it's only been added to games with TAA because of what it needs, and, well, "sharp" and TAA is an oxymoron. That's why it gets labelled as better than native: you can't turn TAA off in those titles, so native essentially has all the TAA blur.
Ah, I see. I didn't know it requires TAA.

Image quality has become a weird thing in gaming these past few years. TAA, sharpening, dithering to represent alpha, resolution scaling, chromatic aberration, lens distortion, film grain... I feel it's becoming rarer and rarer to see a game with crystal-clear pixels. We're getting to the point where supersampling is the only way to get the image quality we saw in many games 10 years ago. I'm not really complaining, since games look great overall (well, except for chromatic aberration; kill that with fire, pls), but I do miss super crystal-clear image quality.
 

dm101

Member
Nov 13, 2018
2,184
I'm a noob and I've just gotten my first PC after 10 years of console gaming. Well, I used to PC game a decade ago. Anyway, this Aorus laptop has DLSS and a 1080p monitor. It's a 2070 Super Max-Q. Should I be using the DLSS option in games, or is that only if I'm using my 4K monitor via the HDMI out?
 

.exe

Member
Oct 25, 2017
22,230
I was wondering: can you also pick a higher output resolution than your monitor's so it then downsamples? So, for example, instead of 720p to 1440p native, doing 720p to 4K, which is then downsampled to 1440p. Is there even a benefit to doing that, or will it be blurrier than aiming for native-resolution output?
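For scale, here is the raw arithmetic of the chain described above: pixel counts only, with no claim about how DLSS or the display scaler would weight the samples (the resolutions are the ones named in the post):

```python
# Pixel counts for the chain: 720p internal -> DLSS to 4K -> downsample to 1440p.
resolutions = {
    "720p internal":  (1280, 720),
    "4K DLSS output": (3840, 2160),
    "1440p display":  (2560, 1440),
}

for name, (w, h) in resolutions.items():
    print(f"{name:15s} {w}x{h} = {w * h / 1e6:.2f} MPix")

# Going from the 4K DLSS output down to a 1440p display is a 1.5x linear
# downsample, so each display pixel averages 1.5 * 1.5 = 2.25 output pixels,
# i.e. a mild supersample stacked on top of the reconstruction.
print("samples per display pixel:", (3840 / 2560) * (2160 / 1440))
```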
 

shinken

Member
Oct 27, 2017
1,917
Consoles should have come out a year later to make use of that technology.
DLSS uses tensor cores. Does AMD have something similar? Nvidia has had tensor cores for two years now, since the RTX 2000 series.

Why would consoles have this tech a year from now? It really depends on what AMD has to offer. Nvidia has already shown all their cards (at least for the higher/high-end market: 3070/3080/3090). I guess Nvidia is waiting for AMD to show its cards before announcing the price of the 3060. I have no idea what AMD is waiting for, though. They really have to show the PC crowd what tech they have to combat Nvidia's tensor cores, DLSS, RT, and pricing.
 

Kabukimurder

Banned
Oct 28, 2017
550
Yeah, I feel like a crazy person reading these threads. I've tried DLSS 2.0 in Control (540p to 1080p and 720p to 1440p) and the inconsistent visuals during motion and other graphical quirks were very distracting.

It's cool tech but I don't buy that it's better than native.

I agree 100%. Everyone has their own opinion of how much they want to sacrifice to use ray tracing... but when you're playing on a 1080p display with a blurry DLSS 2 image, I don't even think the ray tracing looks that good. Too much gets lost in the low-resolution blur.

It's not a big deal for me; I didn't buy a GPU just for DLSS 2. But I feel that people who do buy a certain GPU just for DLSS are gonna be pretty pissed about threads like these when they realize that if you care even slightly about image quality and resolution, you're probably gonna want DLSS off, at least in its current state.
 

Vexii

Member
Oct 31, 2017
2,386
UK
[image: wkNtt0u.png]


The fact that I can see that individual hair strand at 720p is madness
 

convo

Member
Oct 25, 2017
7,377
Ah, I see. I didn't know it requires TAA.

Image quality has become a weird thing in gaming these past few years. TAA, sharpening, dithering to represent alpha, resolution scaling, chromatic aberration, lens distortion, film grain... I feel it's becoming rarer and rarer to see a game with crystal-clear pixels. We're getting to the point where supersampling is the only way to get the image quality we saw in many games 10 years ago. I'm not really complaining, since games look great overall (well, except for chromatic aberration; kill that with fire, pls), but I do miss super crystal-clear image quality.
As DF once discussed, supersampled 4K vs. DLSS 4K would be the better comparison to make. I don't know how well post-processing effects factor into DLSS.
In terms of image quality, I do like sharp images. Going overboard with effects might serve certain types of games, but it always seems to come at a performance cost on consoles, like with Bloodborne; thankfully they dialed it back with Sekiro.
 

Uhtred

Alt Account
Banned
May 4, 2020
1,340
As long as it's a PC-only feature, it will never become a breakthrough in the industry, as most devs will not utilize it.

What an utterly clueless statement, especially given that we're already seeing it implemented or announced in some of the biggest games of the year.

It's not fucking 2002 anymore. PC gaming is HUGE today.
 

Darktalon

Member
Oct 27, 2017
3,266
Kansas
As long as it's a PC-only feature, it will never become a breakthrough in the industry, as most devs will not utilize it.
Seriously: Cyberpunk 2077, Call of Duty: Black Ops, The Witcher 3's next-gen enhancement, Fortnite for god's sake, the entire Unreal Engine. If that doesn't legitimize DLSS in your mind, I don't know what will.
 

DarthBuzzard

Banned
Jul 17, 2018
5,122
Hang on, buying 16K TV right now.

I joke, but it sounds like with DLSS gaming could easily make the leap to 8K within a year or two, provided 8K monitors and TVs are widely available. Like you said, why stop at 8K? We could easily be seeing 16K.
Because you can't tell any difference between 8K and 16K. That's outside the range of human acuity, even for those with the most gifted vision.

Its only use would be zooming in for photography and visual-effects work.
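For context on the acuity claim, a rough back-of-the-envelope check. It assumes the common ~60 pixels-per-degree rule of thumb for 20/20 vision and a hypothetical 65" TV viewed from 2.5 m; both numbers are illustrative, and exceptional vision may resolve roughly double that:

```python
import math

# Rough visual-acuity check: how many pixels fall within one degree of
# visual angle for a 65" 16:9 TV (~1.44 m wide) viewed from 2.5 m?
# ~60 pixels per degree is a common 20/20 rule of thumb; "gifted"
# vision may reach roughly twice that.
ACUITY_PPD = 60

def pixels_per_degree(h_pixels: int, screen_width_m: float, distance_m: float) -> float:
    pixels_per_m = h_pixels / screen_width_m
    meters_per_degree = 2 * distance_m * math.tan(math.radians(0.5))
    return pixels_per_m * meters_per_degree

for name, h_pixels in [("4K", 3840), ("8K", 7680), ("16K", 15360)]:
    ppd = pixels_per_degree(h_pixels, 1.44, 2.5)
    verdict = "beyond acuity" if ppd > ACUITY_PPD else "resolvable"
    print(f"{name}: {ppd:.0f} pixels/degree -> {verdict}")
```

Under those assumptions even 4K already exceeds the 60 ppd threshold at couch distance, which makes the poster's point even more strongly for 8K vs. 16K.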
 

metalslimer

Avenger
Oct 25, 2017
9,565
A fascinating application will be seeing how this scales down to the next chip Nvidia makes for Nintendo. Obviously it's not going to be a 3090, but it won't need to be, since I doubt Nintendo puts anything above a 1080p screen on the actual device. If they could just scale modern games to 1440p, they'd be set.


On another note, this push into 8K seems pretty ridiculous, but I guess I'm biased since native 4K doesn't seem like that big of a leap from 1440p to me.
 

GhostofWar

Member
Apr 5, 2019
512
I am eager to see what Microsoft can achieve with their ML upscaling tech for the next-generation consoles. This could be especially big for Lockhart.
Agreed. Considering DLSS has only really just become viable with 2.0, I'm sure MS's own stuff will see plenty of growth in a short amount of time.

You guys mean the upscaling that Nvidia did using Microsoft's API? Because that's all DirectML is. It's like saying Unreal Engine uses DirectX, so Microsoft will be whipping up their own game engine in six months, because hey, DirectX.

" This technique has also been showcased during one of Microsoft's SIGGRAPH 2018 tech talks. This talk, which is entitled "Deep Learning for Real-Time Rendering: Accelerating GPU Inferencing with DirectML and DirectX 12" showcases Nvidia hardware upscaling Playground Games' Forza Horizon 3 from 1080p to 4K using DirectML in real-time. DirectML has the potential to improve the graphical fidelity of future console and PC games."

Source: "Microsoft's DirectML is the next-generation game-changer that nobody's talking about" (www.overclock3d.net)

I'm not saying they can't do it, but the assumption that Microsoft can push out a version of this easily and quickly because Nvidia used their API at SIGGRAPH is a bit optimistic.
 

catpurrcat

Member
Oct 27, 2017
7,790
Ray tracing is a longer-term technology; we are in its early days. New optimizations and hardware will continue to push it forward. Keep in mind that ray tracing in modern games is more of an additional feature than a core part of the game. The end goal is fully path-traced visuals, which will result in incredibly realistic graphics that can also be fully dynamic. We're still a ways away from that, as a lot of software and hardware work needs to be done, but even the improvements made with the 30-series RTX cards seem like an amazing second step for the technology.
DLSS, on the other hand, is more immediately useful because it allows games to look higher-res at a lower performance cost, which makes it the more impressive technology for current use.

All good points, but I think this poster sums it up well:

It's a nice thing, yes, although when I'm playing Control I don't notice a lot of the nuances of ray tracing until I stop and look around; it's subtle compared to the major performance gains from DLSS. I'm able to get a rock-solid 60 fps with RT on high in Control at 4K output with a 720p internal resolution. If I bump that up to 863 (or whatever it is) or 1080p, I'm in the mid-50s / mid-40s, which is still pretty impressive. Instead, I'll play at 4K output with a 1080p internal resolution and RT at either medium or off to lock it at 60 fps. It's a different story on my 1440p/144Hz G-Sync monitor, but there's something nice about seeing the game on a large 4K set running with that IQ and framerate.

+1
 

catpurrcat

Member
Oct 27, 2017
7,790
DLSS is magic, but I think it's becoming somewhat exaggerated.

The screenshots you linked are not that close if you have good eyesight and are looking at details (surely this is most of the point of 4K; at least it is for me). The "emergency light" text is hugely blurrier on the 720p upscale (which isn't surprising, but needs to be said). Ditto for the '6 persons' text on the left of the screen, and the seams on the leather jacket are indistinct and soft. Without high-clarity fine details, the point of calling it "4K" becomes questionable. The aliasing reduction is great, though TAA is alright at this anyway.

Control is the perfect use case, because the RT effects are very expensive and the image was never pin-sharp to begin with. 4K Control has very even image quality, but I'd not describe it as 'sharp'. All those post-processing effects and TAA (including DLSS) leave it looking soft. Nice, but soft.

I thought it was exaggerated too, until I played Death Stranding on PC. That is absolute DLSS magic and an entirely different, better experience.
 

Deleted member 20297

User requested account closure
Banned
Oct 28, 2017
6,943
You guys mean the upscaling that Nvidia did using Microsoft's API? Because that's all DirectML is. It's like saying Unreal Engine uses DirectX, so Microsoft will be whipping up their own game engine in six months, because hey, DirectX.

" This technique has also been showcased during one of Microsoft's SIGGRAPH 2018 tech talks. This talk, which is entitled "Deep Learning for Real-Time Rendering: Accelerating GPU Inferencing with DirectML and DirectX 12" showcases Nvidia hardware upscaling Playground Games' Forza Horizon 3 from 1080p to 4K using DirectML in real-time. DirectML has the potential to improve the graphical fidelity of future console and PC games."

Source: "Microsoft's DirectML is the next-generation game-changer that nobody's talking about" (www.overclock3d.net)

I'm not saying they can't do it, but the assumption that Microsoft can push out a version of this easily and quickly because Nvidia used their API at SIGGRAPH is a bit optimistic.
Holy implications... no. We know Microsoft has specific support for AI acceleration and is researching upscaling technology, which, given the DLSS results, seems worthwhile. I don't know what else there is to say and see for now.
 

tuxfool

Member
Oct 25, 2017
5,858
Do we know the difference in model-inference performance (not training, but the use of an already-trained model) between the tensor cores and the shader cores within the same Nvidia GPU hardware?

Edit: I mean, from what I'm able to research online, it seems like about 1/5 of the wattage is used for doing the same job on a TPU vs. a GPU (shader cores). My point being: taking an Nvidia GPU as an example, these devices feature way more shader cores than tensor cores. So even though the tensor cores do the job with less wattage (1/5), there's nothing necessarily stopping the GPU from dedicating 10% of the much larger array of shader cores to the same job. Yes, it will take 5x the wattage, but it is possible. I'm just trying to figure out how efficient (or inefficient) it would be for consoles and AMD hardware to run model inference like DLSS purely on GPU shader cores.
It is also a question of which instructions a shader core is able to execute. Inference is typically done with very low-precision ops (it does benefit from higher precision), but it does require more ops.
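To make that trade-off concrete in frame-time terms, a toy calculation. Every throughput and cost number below is a hypothetical placeholder, not a measurement; the 5x gap is modeled as a throughput difference at equal power, matching the "5x the wattage for the same job" estimate above:

```python
# Toy frame-budget comparison: the same inference pass run on tensor
# cores vs. a slice of shader cores. All numbers are hypothetical
# placeholders; only the ~5x efficiency ratio echoes the post above.
FRAME_BUDGET_MS = 1000 / 60       # 16.7 ms per frame at 60 fps

upscale_cost_tflop = 0.1          # hypothetical cost of one upscaling pass
tensor_tflops = 100.0             # hypothetical tensor-core throughput
shader_tflops = 20.0              # hypothetical shader-core throughput (1/5)

for name, tflops in [("tensor cores", tensor_tflops), ("shader cores", shader_tflops)]:
    ms = upscale_cost_tflop / tflops * 1000
    print(f"{name}: {ms:.1f} ms ({ms / FRAME_BUDGET_MS:.0%} of a 60 fps frame)")

# On shader cores the pass also competes with rendering for the same ALUs,
# so the effective cost is those milliseconds *plus* the shading work that
# can no longer run during them.
```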
 

ika

Member
Oct 27, 2017
1,154
MAD, Spain
Man, the quality of 720p in Control upscaled to 4K via DLSS bodes very well for Switch 2.
But 4K would only be important for docked mode; in tabletop and handheld it's a waste of processing, as the screen would be way too small to appreciate 4K, wouldn't it?

For people with good knowledge of DLSS: could it be used in the next iteration of Switch to improve image quality, effects, and such without pushing the resolution to 4K? For example, internally rendering a 540p or 720p image with advanced effects/graphics techniques and then using DLSS to output 720p or 1080p (30 or 60 fps) in handheld and docked mode respectively? 1440p would be pretty neat, but 1080p60 could be pretty good if IQ and effects are near or on par with more modern consoles and PCs...

Could it be possible? Or is DLSS just intended for 4K?

Thank you!

Being able to render games at 720p and upscale to 4K with DLSS would certainly be very helpful in porting PS5/XSX games to Switch 2. I hope they update some OG Switch games with it too (assuming Switch 2 does have DLSS). Imagine playing Breath of the Wild and Mario Odyssey in DLSS 4K.
Just yesterday I saw this video of Breath of the Wild in 4K, and it's frankly amazing even though I'm watching it at only 1080p:

 
Nov 8, 2017
13,109
But 4K would only be important for docked mode; in tabletop and handheld it's a waste of processing, as the screen would be way too small to appreciate 4K, wouldn't it?

For people with good knowledge of DLSS: could it be used in the next iteration of Switch to improve image quality, effects, and such without pushing the resolution to 4K? For example, internally rendering a 540p or 720p image with advanced effects/graphics techniques and then using DLSS to output 720p or 1080p (30 or 60 fps) in handheld and docked mode respectively? 1440p would be pretty neat, but 1080p60 could be pretty good if IQ and effects are near or on par with more modern consoles and PCs...

Could it be possible? Or is DLSS just intended for 4K?

Thank you!

DLSS is resolution-independent. Yes, you can go 540p -> 1080p or whatever.
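For reference, a small sketch of the internal resolutions this implies. The per-axis scale factors below are NVIDIA's published figures for DLSS 2.0's quality modes; which modes a given game actually exposes varies:

```python
# Internal render resolution implied by DLSS 2.0's quality modes.
# Per-axis scale factors per NVIDIA's published figures: Quality 66.7%,
# Balanced 58%, Performance 50%, Ultra Performance 33.3%.
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    scale = MODES[mode]
    return round(out_w * scale), round(out_h * scale)

for out_name, (w, h) in [("1080p", (1920, 1080)), ("4K", (3840, 2160))]:
    for mode in MODES:
        iw, ih = internal_res(w, h, mode)
        print(f"{out_name} {mode}: renders internally at {iw}x{ih}")
```

So the 540p -> 1080p case above is simply Performance mode (960x540) at a 1080p output.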
 

Zojirushi

Member
Oct 26, 2017
3,297
Just yesterday I saw this video of Breath of the Wild in 4K, and it's frankly amazing even though I'm watching it at only 1080p:



Wait, how do you get ray-traced Zelda? Is this an emulator feature now, or can you somehow implement it at the driver level? Because I played Zelda in Cemu and it did not look like this lol
 

Zojirushi

Member
Oct 26, 2017
3,297
If this turns into some sort of GPU brand feature war, it'll just SUCK. Like, this game has AMD marketing, so you'll only be able to use the (inevitable) AMD equivalent of DLSS with your AMD card and get x-times the performance compared to an Nvidia card... On the other hand, this other game has Nvidia marketing...
 

Zomba13

#1 Waluigi Fan! Current Status: Crying
Member
Oct 25, 2017
8,934
Wait, how do you get ray-traced Zelda? Is this an emulator feature now, or can you somehow implement it at the driver level? Because I played Zelda in Cemu and it did not look like this lol

There's a ReShade add-on or something that simulates ray tracing. I've seen a vid where someone added it to Halo 3.
 

Anastasis

Teyvat Traveler
Member
Oct 25, 2017
3,603
What's the minimum RTX GPU needed for 4K60 with DLSS?

I was thinking about a 3070 Founders Edition because I only have one 8-pin connector, but maybe I should hold off for a 3060 so I have more options, or go with a 2070 Founders (and save some money).
 

Pachinko

Member
Oct 25, 2017
958
Canada
DLSS in Control really is astounding. Certainly there are some temporal artifacts if you know where to look, and a lot of fast motion can break the reconstruction, but man, 90% of the time it's pretty seamless. As one might expect, 1080p internal vs. 4K internal gives a fairly substantial performance boost. Looking forward to way more titles supporting it in the future.
 

bionic77

Member
Oct 25, 2017
30,894
This is mostly a software solution that Nvidia came up with that runs on their tensor cores, right? There's no reason it couldn't run on some other hardware with a dedicated chip too, right?

I wonder how long it will take for this to be copied or improved upon by others.

Seems like a legit revolutionary type of technology. I wonder how much things will change as a result of it.
 

Cow Mengde

Member
Oct 26, 2017
12,715
I'm a noob and I've just gotten my first PC after 10 years of console gaming. Well, I used to PC game a decade ago. Anyway, this Aorus laptop has DLSS and a 1080p monitor. It's a 2070 Super Max-Q. Should I be using the DLSS option in games, or is that only if I'm using my 4K monitor via the HDMI out?

As far as I know, DLSS doesn't have to mean 4K. You can easily run games at DLSS 1080p and achieve 60 fps. So essentially, your video card could last longer even as newer, more intensive games come out (provided they implement DLSS), since you save performance by rendering at a lower resolution and upscaling back to 1080p.
 

tuxfool

Member
Oct 25, 2017
5,858
This is mostly a software solution that Nvidia came up with that runs on their tensor cores, right? There's no reason it couldn't run on some other hardware with a dedicated chip too, right?
It doesn't even need dedicated hardware; it will run on general-purpose compute. The issue, however, is the computational cost of achieving it.

The tensor cores are just a lot more efficient at the task.
 

PennyStonks

Banned
May 17, 2018
4,401
DLSS is magic, but I think it's becoming somewhat exaggerated.

The screenshots you linked are not that close if you have good eyesight and are looking at details (surely this is most of the point of 4K; at least it is for me). The "emergency light" text is hugely blurrier on the 720p upscale (which isn't surprising, but needs to be said). Ditto for the '6 persons' text on the left of the screen, and the seams on the leather jacket are indistinct and soft. Without high-clarity fine details, the point of calling it "4K" becomes questionable. The aliasing reduction is great, though TAA is alright at this anyway.

Control is the perfect use case, because the RT effects are very expensive and the image was never pin-sharp to begin with. 4K Control has very even image quality, but I'd not describe it as 'sharp'. All those post-processing effects and TAA (including DLSS) leave it looking soft. Nice, but soft.
Keep in mind that 720p base to 2160p would be DLSS's lowest-quality setup. The results get much better with a 1080p or 1440p base.
 

Stalker

The Fallen
Oct 25, 2017
6,733
Wait how do you get Ray Traced Zelda? Is this an emualtor feature now or can you somehow implement it on driver level? Because I played Zelad in CEMU and it did not look like this lol
It's using the McFly ReShade shader. That thing is SUPER spotty, so don't expect amazing results with it, and it adds a load of noise.
 

bionic77

Member
Oct 25, 2017
30,894
It doesn't even need dedicated hardware; it will run on general-purpose compute. The issue, however, is the computational cost of achieving it.

The tensor cores are just a lot more efficient at the task.
That's what I thought. So you'd just need an extra chip to perform this.

If they could put this in a Switch 2, it would be bananas if it allowed PS5 ports on hardware with a fraction of the power. Hopefully this takes off in everything.
 

KanameYuuki

Member
Dec 23, 2017
2,650
Colombia
Is there any downside to DLSS, like added lag, or is it "free" extra performance that sometimes even looks better than native? I think the only thing people were "complaining" about was that some screenshots look worse.
 

tuxfool

Member
Oct 25, 2017
5,858
That's what I thought. So you'd just need an extra chip to perform this.

If they could put this in a Switch 2, it would be bananas if it allowed PS5 ports on hardware with a fraction of the power. Hopefully this takes off in everything.
It's not an extra chip; it's just an execution block inside the GPU. Doing this off-die would absolutely kill any efficiency gains for anything but pure inference compute.

They could put it in a Switch, but it scales with die size. Sticking something this capable (or even half as capable) in a Switch SoC would make it highly unbalanced.
 

Diablos

has a title.
Member
Oct 25, 2017
14,591
I can't even understand how this works so well. It doesn't make sense that you can have parity with an internal resolution that's not even Full HD. Can someone explain this in the simplest terms possible?
 

TeenageFBI

One Winged Slayer
Member
Oct 25, 2017
10,240
Is there any downside to DLSS, like added lag, or is it "free" extra performance that sometimes even looks better than native? I think the only thing people were "complaining" about was that some screenshots look worse.
There are some minor visual artifacts on a game-by-game basis, and it's possible to distinguish native from DLSS if you look carefully, especially when there's a lot of motion. The massive performance boost more than makes up for it, though. Being able to max out ray-tracing effects while keeping high framerates is just incredible.
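On Diablos's question above (how parity from a sub-HD internal resolution is even possible): the core intuition is jittered temporal accumulation. The camera is offset by a sub-pixel amount each frame, so across several frames the GPU genuinely sees more unique sample positions than any single low-res frame contains; a trained network then decides per pixel how much of that history to keep. The toy sketch below shows only the accumulation part for a static scene; it is not NVIDIA's actual algorithm, which adds motion vectors, history rejection, and a learned blend:

```python
import numpy as np

# Toy jittered temporal accumulation: splat sub-pixel-jittered low-res
# samples into a high-res buffer over several frames. Static scene, so
# all history stays valid; real DLSS must also reproject with motion
# vectors and reject stale history.

def scene(u, v):
    # Hypothetical ground-truth image: a fine-detail pattern that a
    # single 128x128 frame would visibly undersample.
    return 0.5 + 0.5 * np.sin(40 * u) * np.cos(40 * v)

HI, LO = 256, 128                 # high-res output grid, low-res internal grid
accum = np.zeros((HI, HI))
weight = np.zeros((HI, HI))
rng = np.random.default_rng(0)

for frame in range(16):
    jx, jy = rng.random(2) / LO   # one sub-pixel camera jitter per frame
    ys, xs = np.meshgrid(np.arange(LO), np.arange(LO), indexing="ij")
    u = xs / LO + jx              # jittered sample positions in [0, 1)
    v = ys / LO + jy
    samples = scene(u, v)
    hi_x = np.clip((u * HI).astype(int), 0, HI - 1)   # nearest high-res pixel
    hi_y = np.clip((v * HI).astype(int), 0, HI - 1)
    np.add.at(accum, (hi_y, hi_x), samples)
    np.add.at(weight, (hi_y, hi_x), 1.0)

reconstructed = accum / np.maximum(weight, 1)
print(f"high-res pixels covered after 16 frames: {(weight > 0).mean():.0%}")
```

With 16 jittered frames, the 128x128 samples cover most of the 256x256 grid, which is why reconstruction from a "not even Full HD" internal resolution can approach native detail as long as the history remains valid.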