
Vuze

Member
Oct 25, 2017
4,186
I thought 1080p DLSS would take 1080p to 1080p++++ quality. I don't get it. So should I play at 1080p with or without DLSS, knowing the stream is capped at 1080p anyway, and even without DLSS the frame rate is always over 60?
Nah, DLSS takes a lower input resolution and creates a higher-resolution output (4K DLSS Quality mode = 1440p internal; hence the performance boost); what you're referring to would be akin to supersampling (which would decrease performance). I get it, it's a little confusing: DLSS produces better-than-native image quality in certain aspects (hair, image stability, etc.) but has some shortcomings, as seen in the video. For gaming on a PC with a higher-end display the choice is simple, because you get a massive performance boost and only have to deal with the minor negatives of DLSS. When you're hard-capped at a comparatively low 1080p60, I guess it boils down to a matter of preference.

(Somebody more knowledgeable, please feel free to correct me.)
(Also, I'm confused why the tech is called deep learning super sampling, given that it takes a lower-resolution input?)
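To make the numbers concrete, here's a minimal sketch of the input/output relationship (Python; the per-mode scale factors are the commonly reported ones and may vary by title, so treat them as assumptions):

```python
# Commonly reported DLSS 2.0 internal-resolution scale factors (per axis).
# These figures are assumptions based on public coverage, not an official API.
DLSS_SCALE = {
    "quality": 2 / 3,            # 4K Quality -> 2560x1440 internal
    "balanced": 0.58,            # ~2227x1253 internal at 4K
    "performance": 1 / 2,        # 4K Performance -> 1920x1080 internal
    "ultra_performance": 1 / 3,  # 4K Ultra Performance -> 1280x720 internal
}

def internal_resolution(out_w, out_h, mode):
    """Return the resolution DLSS actually renders at for a given output."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

As for the name: the usual explanation is that the network is trained against heavily supersampled ground-truth frames, so the output is meant to *look* supersampled, not to render above native resolution.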
 

RivalGT

Member
Dec 13, 2017
6,397
DLSS is like magic; more games need to support this magic. The RTX 2080 Ti has been out for over two years now, and very few games support the DLSS 2.0 implementation. Excellent video comparing DLSS and checkerboard rendering: even though the two are very different, they're both trying to accomplish the same thing, DLSS just achieves that goal way better. Hopefully next-gen systems can improve their upscaling techniques, as native 4K will always be a waste of GPU power.
 

Iron Eddie

Banned
Nov 25, 2019
9,812
So frustrating that it seems like we won't really get this for the upcoming consoles. A massive game changer for frame rate potential that could really drive a huge uptick in visuals over the next few years. Definitely good for the longer view, though: by PS6 or whatever, these techniques will be widespread, built in, and constantly improving in their implementation.
This is exactly what consoles should be striving for instead of just trying to get to native 4K. How awesome would it be if we could use DLSS as an option to increase frame rates?
 

Deleted member 18161

User requested account closure
Banned
Oct 27, 2017
4,805
This is exactly what consoles should be striving for instead of just trying to get to native 4K. How awesome would it be if we could use DLSS as an option to increase frame rates?

Developers that aren't already targeting 60 on console would simply use all that saved performance from a DLSS-like solution to push even more visual eye candy into their 33ms frame budget and stay at 30fps lol. Ubisoft, I'm looking at you!
 

Iron Eddie

Banned
Nov 25, 2019
9,812
Developers that aren't already targeting 60 on console would simply use all that saved performance from a DLSS-like solution to push even more visual eye candy into their 33ms frame budget and stay at 30fps lol. Ubisoft, I'm looking at you!
You can play those games on PC with higher frame rates; that should make it easier to at least give the option on consoles to scale back the graphics and have a performance mode, no?
 

Pwnz

Member
Oct 28, 2017
14,279
Places
The idea that upscaling gives you a better picture than a native image is just crazy to me. Really, it's sounding like true 4K may end up being the exception next gen rather than the standard.

Aye, native 4K takes a lot of horsepower. If you can put more compute into other effects at 1440p and upscale, it'll look a lot better than native 4K with low settings.
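A quick back-of-the-envelope sketch of the pixel budget involved (Python; assuming shading cost scales roughly linearly with pixel count, which ignores fixed per-frame costs):

```python
# Rough pixel-count comparison; assumes shading cost scales ~linearly
# with the number of pixels shaded.
pixels_4k = 3840 * 2160      # 8,294,400
pixels_1440p = 2560 * 1440   # 3,686,400

print(pixels_4k / pixels_1440p)  # 2.25
```

Native 4K shades about 2.25x the pixels of 1440p; that surplus budget is what can go into richer effects instead.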
 

TheRealTalker

Member
Oct 25, 2017
21,480
Yes, this was the video I was anticipating after some comments about it in the last video.

Going to watch it right now.
 

zombiejames

Member
Oct 25, 2017
11,930

DrowsyJungle

Member
Oct 25, 2017
912
Had to google this one and this is the first thing that came up.

[image: vUPDs8C.jpg]


LMAO.
Dude is a hardcore AMD shill. Had to quit watching his stuff.
 

Truant

Member
Oct 28, 2017
6,759
I've posted this before, but I'm worried that the lack of HW-based upscaling tech will widen the gap between PC and console graphics and performance further than ever before this coming generation. TAA was in many ways a revolution for people playing on TVs this generation. DLSS 2.0 and beyond will be even more important going forward, it seems. Having Halo Infinite run natively at 4K seems like such a huge waste.
 

denx

Prophet of Truth
Member
Oct 27, 2017
6,321
I feel like Microsoft and Sony backed the wrong horse going with AMD this gen. Hopefully we will see some version of DLSS on the Switch 2.
 
OP
ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
I wonder what technique Epic used for the UE5 demo? If I remember the DF write-up, it defied all of their pixel counting.

edit:

Inside Unreal Engine 5: how Epic delivers its generational leap (www.eurogamer.net): "Epic's reveal of Unreal Engine 5 running in real-time on PlayStation 5 delivered one of the seismic news events of the …"
It helps there were few thin objects in the demo; that might have given it away immediately, as things like grass and hair can't take advantage of Nanite yet.
 

Fall Damage

Member
Oct 31, 2017
2,058
DLSS is exciting tech. Just thinking a scrub like me will be able to take advantage of the newest demanding titles on a 4K monitor without having to buy a 3080 or better is really something. It really needs to make its way to VR.

Engineer: "Oooh, look Kojima-san, we need to add better vectors for these particles in order for the DLSS black magic stuff to work well with them. Let me just start..."

Kojima: "No! .... These trails plus ultrawide will get us even closer to our true cinematic vision on PC!"

:p

lol
 

Deleted member 420

User requested account closure
Banned
Oct 25, 2017
7,056
Any word if Horizon will support this? I'm kinda fearing it won't due to the AMD promos. Also fearing that Sony may want to let their consoles be the exclusive systems to support upscaling for their first party games, but not sure if I'm just being a nut on that one. The engine supports it though so I am hopeful.
 

Gitaroo

Member
Nov 3, 2017
8,000
Any word if Horizon will support this? I'm kinda fearing it won't due to the AMD promos. Also fearing that Sony may want to let their consoles be the exclusive systems to support upscaling for their first party games, but not sure if I'm just being a nut on that one. The engine supports it though so I am hopeful.
I don't think it does, but I think it supports dynamic res in one of the preview videos.
 

Falus

Banned
Oct 27, 2017
7,656
Nah, DLSS takes a lower input resolution and creates a higher-resolution output (4K DLSS Quality mode = 1440p internal; hence the performance boost); what you're referring to would be akin to supersampling (which would decrease performance). I get it, it's a little confusing: DLSS produces better-than-native image quality in certain aspects (hair, image stability, etc.) but has some shortcomings, as seen in the video. For gaming on a PC with a higher-end display the choice is simple, because you get a massive performance boost and only have to deal with the minor negatives of DLSS. When you're hard-capped at a comparatively low 1080p60, I guess it boils down to a matter of preference.

(Somebody more knowledgeable, please feel free to correct me.)
(Also, I'm confused why the tech is called deep learning super sampling, given that it takes a lower-resolution input?)
Got it, so I guess I shouldn't use it on GeForce Now.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,931
Berlin, 'SCHLAND
I wonder what technique Epic used for the UE5 demo? If I remember the DF write-up, it defied all of their pixel counting.

edit:

Inside Unreal Engine 5: how Epic delivers its generational leap (www.eurogamer.net): "Epic's reveal of Unreal Engine 5 running in real-time on PlayStation 5 delivered one of the seismic news events of the …"
The reason this defies pixel counting is that it has no straight edges; it does not look like native 4K or anything like that.
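For context on why edges matter, here's a minimal sketch of the idea behind pixel counting (Python; `infer_render_height` is a hypothetical helper, and the real process is manual and far more careful):

```python
def infer_render_height(step_lengths, output_height=2160):
    """Simplified pixel counting: on an upscaled image, a near-straight
    edge shows stair steps, and the average step length in output pixels
    approximates the upscale factor."""
    avg_step = sum(step_lengths) / len(step_lengths)
    return round(output_height / avg_step)

# Steps averaging 1.5 output pixels on a 2160p frame suggest a ~1440p render.
print(infer_render_height([1, 2, 1, 2, 1.5]))  # 1440
```

With no straight edges to measure, as in the UE5 demo, there are no stair steps to count, which is exactly the problem described above.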
 

azfaru

Member
Dec 1, 2017
2,273
I feel so left out not getting an Nvidia card when I got my PC. I was choosing between a 5700 XT and a 2070 Super. Arghhh
 

brain_stew

Member
Oct 30, 2017
4,731
That was brilliant, and when you consider that checkerboarding was pretty much best-in-class upscaling previously, the fact that DLSS can deliver better results at half the resolution is truly astounding. I viewed the video at 4K on my OLED, which is where I play all my games (PC and console), and the image quality of DLSS Performance was fantastic at a normal viewing distance; I would be more than happy to settle for that image quality. For this reason, I won't be considering an AMD card until they have Tensor cores, and I wouldn't be comfortable recommending them to anyone unless they were at a very steep discount.

It's the image stability and antialiasing that gets me. I could see reconstructing detail from a 1080p image with motion vectors, but to then have better antialiasing and temporal stability than TAA at twice the resolution is really incredible. Reconstructing single-pixel-level detail at 4K that doesn't exist even when rendered at higher resolution is super impressive.

The performance of the upcoming RTX 3070 at 4K DLSS Performance is going to be incredible for the money. We just need to hope that developers adopt it, as they're leaving a ridiculous amount of performance on the table without it. If we get the improvement to RT performance we're expecting from the RTX 3000 series, then something like an RTX 3060 becomes viable for next-generation games with ray-traced global illumination at 4K/60 by using DLSS Performance.
 

AegonSnake

Banned
Oct 25, 2017
9,566
I feel like Microsoft and Sony backed the wrong horse going with AMD this gen. Hopefully we will see some version of DLSS on the Switch 2.
Nah. DLSS isn't some magic tech. These RTX graphics cards have extra hardware in them to process this stuff, and that extra hardware takes up space on the silicon die. That's expensive and would pose a cooling challenge in a small form factor. Switch 2 will likely never have this, since the chips need to be small and any extra die space would likely go towards making the GPU more powerful.

Besides, 4K CB is almost as good, and the best implementations from Sony don't even use the hardware Cerny put in there for it. PCs will always have the best tech, but it comes at a price.

The PS5 and XSX are roughly as powerful as a 2070 Super and 2080 respectively. Those are $500-600 cards in a $400-500 next-gen console with Zen 2 CPUs and fancy SSDs. You wouldn't get that if they had gone with Nvidia; their GPUs are too big and too expensive.
 

Deusmico

Banned
Oct 27, 2017
1,254
Yeah, consoles cost as much as a graphics card, and they have to include the CPU, storage, controller, etc. Price is a thing console makers have to consider.
 

Schlomo

Member
Oct 25, 2017
1,133
I wonder if 8K TVs will actually make sense as displays for the next Nvidia generation thanks to DLSS.
 

Dan Thunder

Member
Nov 2, 2017
14,049
I know some may have responded to this, but folks in the next gen speculation threads have been saying this for years...

Knowledgeable folks in general over the years have.
Yeah, I'd always believed it'd be something done for the 'big' games, as they'll be pushing the hardware the most, but it sounds like we're not far from the point where it's just not worth the effort for any dev to try and push for native.
 

Genio88

Banned
Jun 4, 2018
964
It's so cool to see a company like Nvidia still trying to improve and set new standards despite being in a clearly dominant position compared to AMD; with ray tracing in 2018 and DLSS 2.0's AI approach now, they are really doing awesome things.
I don't get why they're criticized for their prices; their technology and R&D department are among the best in this field, and I think their prices are in line with what they offer.
 

medyej

Member
Oct 26, 2017
6,437
It's so cool to see a company like Nvidia still trying to improve and set new standards despite being in a clearly dominant position compared to AMD; with ray tracing in 2018 and DLSS 2.0's AI approach now, they are really doing awesome things.
I don't get why they're criticized for their prices; their technology and R&D department are among the best in this field, and I think their prices are in line with what they offer.
Agreed. I've been on Nvidia for so long now, not just because of their hardware but because of the technologies they bring to stay ahead. In the past decade they brought us 120Hz+ screens, G-Sync, ShadowPlay and DLSS.
 
Nov 2, 2017
2,275
Nah. DLSS isn't some magic tech. These RTX graphics cards have extra hardware in them to process this stuff, and that extra hardware takes up space on the silicon die. That's expensive and would pose a cooling challenge in a small form factor. Switch 2 will likely never have this, since the chips need to be small and any extra die space would likely go towards making the GPU more powerful.

Besides, 4K CB is almost as good, and the best implementations from Sony don't even use the hardware Cerny put in there for it. PCs will always have the best tech, but it comes at a price.

The PS5 and XSX are roughly as powerful as a 2070 Super and 2080 respectively. Those are $500-600 cards in a $400-500 next-gen console with Zen 2 CPUs and fancy SSDs. You wouldn't get that if they had gone with Nvidia; their GPUs are too big and too expensive.
Die space for Tensor cores is only 10-15%. I'd say that's worth the space: you can gain much more performance with that 10-15% spent on Tensor cores than with extra FP32 power.

I wouldn't say CB is almost as good as DLSS. Temporal supersampling/injection, like Insomniac uses, is better than CB anyway, so I doubt many devs, if any, are going to be using CB. Most will just take the easy way and go native.
 

MIMF

Member
Nov 23, 2017
146
I find DLSS 2.0 a superior reconstruction tech, despite the dedicated hardware required to make it feasible.

But it's also true that the stability comparison is not fair: DLSS is analyzed from a 60fps+ render output, compared to 30fps for the PS4 Pro version of the game, which has an obvious impact on any kind of analysis dependent on previous frames.

At 60fps the checkerboarded method would surely yield better results in this regard.
 

Carn

Member
Oct 27, 2017
11,918
The Netherlands
Hopefully we will see some version of DLSS on the Switch 2.

A modified Xavier would be a nice Tegra chip for that, but I wonder what the current price is on those. If you want a Jetson Xavier NX it will cost you around $399; I can imagine Nintendo can get a much better deal, but it's still quite expensive at the moment.
 

Alexandros

Member
Oct 26, 2017
17,811
So, with DLSS being a clearly incredible feature, the onus is on Nvidia to push it as hard as possible. It needs to ensure that it becomes a standard feature for most, if not all, big releases.
 

Deleted member 10234

User requested account closure
Banned
Oct 27, 2017
2,922
Nice video, the trails on some objects are quite a significant drawback to DLSS in this game.

Has Digital Foundry (or anyone else for that matter) ever looked at how DLSS works at 30 fps instead of 60+? It would be interesting to see if this causes much more obvious artifacts.
 

Corralx

Member
Aug 23, 2018
1,176
London, UK
Nice video, the trails on some objects are quite a significant drawback to DLSS in this game.

Has Digital Foundry (or anyone else for that matter) ever looked at how DLSS works at 30 fps instead of 60+? It would be interesting to see if this causes much more obvious artifacts.

It does.
The less temporally correlated the frames in the history are (a lower framerate means more time has passed between each frame), the harder it is to produce a "visually correct" frame. It wouldn't necessarily cause artifacts like ghosting though; it might just increase aliasing, i.e. give you a lower perceived reconstructed resolution.
Same as dropping the base resolution: when resolution drops, each pixel is less likely to carry the correct information needed to reconstruct the signal appropriately, because the values change with a higher frequency.
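For intuition, here's a minimal sketch of the reprojection-and-blend loop that temporal reconstruction techniques build on (Python/NumPy; the function names, the blend weight, and the nearest-neighbour warp are all illustrative assumptions, and DLSS's actual network is proprietary):

```python
import numpy as np

def reproject(history, motion_vectors):
    """Warp the previously accumulated frame to the current frame using
    per-pixel motion vectors (nearest-neighbour for brevity)."""
    h, w = history.shape[:2]
    ys, xs = np.indices((h, w))
    src_x = np.clip((xs - motion_vectors[..., 0]).round().astype(int), 0, w - 1)
    src_y = np.clip((ys - motion_vectors[..., 1]).round().astype(int), 0, h - 1)
    return history[src_y, src_x]

def accumulate(current, history, motion_vectors, alpha=0.1):
    """Blend the new frame into the warped history. The lower the
    framerate, the larger the motion between frames, the worse the
    warped history matches, and the less detail can be reconstructed."""
    warped = reproject(history, motion_vectors)
    return alpha * current + (1 - alpha) * warped
```

The same logic explains the base-resolution point: fewer, sparser samples per frame mean the history carries less of the information needed to lock onto fine detail.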
 

Mr Swine

The Fallen
Oct 26, 2017
6,040
Sweden
A modified Xavier would be a nice Tegra chip for that, but I wonder what the current price is on those. If you want a Jetson Xavier NX it will cost you around $399; I can imagine Nintendo can get a much better deal, but it's still quite expensive at the moment.

Why go with Xavier when Nvidia can create a customized Ampere Tegra that can be put in their future Nvidia Shield lineup?
 

Carn

Member
Oct 27, 2017
11,918
The Netherlands
Why go with Xavier when Nvidia can create a customized Ampere Tegra that can be put in their future Nvidia Shield lineup?

Sure, but why not use/modify an existing Tegra SoC like Xavier? It has Tensor cores on board.
Tegra Orin is next in line; that uses an Ampere-based GPU IIRC, but I don't think it's available for the mass market anytime soon.
 

eso76

Prophet of Truth
Member
Dec 8, 2017
8,119
I really like how DLSS takes care of pixel crawling and shimmering in the video.
A consistent image across frames is the most important aspect of IQ for me.
 
OP
ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Sure, but why not use/modify an existing Tegra SoC like Xavier? It has Tensor cores on board.
Tegra Orin is next in line; that uses an Ampere-based GPU IIRC, but I don't think it's available for the mass market anytime soon.
From what I hear, Xavier's Tensor cores aren't tuned for stuff like DLSS. And then there's the CPU. Orin is probably using the same Ampere variant as the A100, which is a no-go for any kind of gaming (I guess you could use it for gaming, but Nvidia made two variants for a reason).
 

Carn

Member
Oct 27, 2017
11,918
The Netherlands
From what I hear, Xavier's Tensor cores aren't tuned for stuff like DLSS. And then there's the CPU. Orin is probably using the same Ampere variant as the A100, which is a no-go for any kind of gaming (I guess you could use it for gaming, but Nvidia made two variants for a reason).

Makes sense, thanks for the clarification. Nvidia's product lines can get convoluted at times, heh.
 

Bearly_There

Member
Mar 16, 2020
30
Game graphics have entered a new and very dynamic stage of computer-science problem-solving. The first problem is increasing resolution on the cheap, computationally speaking, because 4K isn't worth it in terms of computational resources; it just doesn't look that much better than 1440p, or perhaps even 1080p, to be worth calculating in the traditional manner. So things like temporal upscaling, CBR and DLSS are all about coming up with cleverer, quicker ways to extrapolate a larger image from the image you have already calculated at great computational expense. It's a great compsci challenge, and the field seems to be evolving very quickly. Guerrilla woke people up with their temporal upscaling for the 60fps multiplayer in Killzone when the PS4 released.

DLSS is interesting in that it essentially takes advantage of offline computation. In the same way that pre-baked lighting gives you far better lighting than you could do in real-time, DLSS takes advantage of having the tensor cores "trained" by running the game over and over at high res on a powerful PC. The tensor cores can extrapolate a high res image because that training teaches them what to expect.
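As a rough illustration of that offline-training idea, here's a generic super-resolution training step (Python/PyTorch; the network, loss, and data pipeline of DLSS itself are proprietary, so every name here is a stand-in):

```python
import torch
import torch.nn as nn

class UpscaleNet(nn.Module):
    """Tiny hypothetical 2x upscaler; the real DLSS model is far larger
    and also consumes motion vectors and previous frames."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),
            nn.PixelShuffle(2),  # rearranges channels into 2x spatial resolution
        )

    def forward(self, low_res):
        return self.body(low_res)

def train_step(model, optimizer, low_res, high_res_target):
    """One offline step: learn to map low-res renders to ground-truth
    high-res renders captured ahead of time on powerful hardware."""
    optimizer.zero_grad()
    loss = nn.functional.l1_loss(model(low_res), high_res_target)
    loss.backward()
    optimizer.step()
    return loss.item()
```

All the expensive learning happens offline; at runtime only the cheap forward pass runs, accelerated by the tensor cores.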

I wonder how much, like pre-baked lighting, you lose flexibility with DLSS. Pre-baked lighting can't light objects that weren't in the original scene (moving NPCs, for example), and I wonder if DLSS's visual flaws are due to situations the tensor cores haven't been trained on. Perhaps rare and inherently unpredictable visual situations (rain, dynamic weather, dynamic and destructible environments) give rise to DLSS hiccuping. What are the limits on how much they can be trained?

Another big area of comp sci innovation is obviously ray-tracing, denoising, and so on.

Ultimately the more dynamic and flexible the compsci solution, the better for devs, because it frees them and makes things cheaper. Pre-baked lighting is expensive and time-consuming in terms of the production process, and I'd imagine that training machine-learning cores might be as well. Device-specific solutions, like Nvidia-only solutions, are less likely to be embraced widely if they add to production time and cost.

But there's great progress on this being made every day, and the new consoles have a good amount of computational power to go around, so there is plenty of scope for smart programmers to come up with good solutions that don't rely on tensor cores specifically.
 

LCGeek

Member
Oct 28, 2017
5,857
I always said it... 4K is a gimmick. DLSS just proves it.

It's not a gimmick.

We always benefit from pixel density: go back to 480p or less, then come back, and it's clear as day what the benefit is.

I'd rather do 4K with reconstruction than kill performance, though.

DLSS just makes the IQ worth it while keeping the performance.

DLSS just makes the whole resolution question easier, since we get massive fps gains with it. I'm still gonna use it to get an awesome downsampling effect in time as well.
 

PlayerOne

Member
Apr 16, 2018
1,710
I wonder how it would compare to checkerboarding at 60fps. I remember the DF guys being really impressed with the DMC V implementation at 60fps; I think they even mistook it for native. Maybe it would be a lot closer to DLSS.
 

Techno

The Fallen
Oct 27, 2017
6,412
Agreed. I've been on Nvidia for so long now, not just because of their hardware but because of the technologies they bring to stay ahead. In the past decade they brought us 120Hz+ screens, G-Sync, ShadowPlay and DLSS.

Yup, and also Ansel. It isn't a game changer like some of the other stuff, but it's a cool bonus to have.

The photo mode for a game like Tekken 7 only exists because of Ansel. I also saw some game filters Nvidia was showing off the other day which you can apply to your games in real time during actual gameplay (Witcher 3), which is really cool as well.