
Deleted member 2834

User requested account closure
Banned
Oct 25, 2017
7,620
Guess there will be a PS5 Pro with tensor cores....
This but unironically. I think there's a good chance DLSS will turn into an absolute game changer. I'm confident I'm better off spending my PS5 money on an Nvidia 3000 series this holiday season and just waiting for a PS5 Pro with some DLSS equivalent down the line. That had better happen.
 

Dinjoralo

Member
Oct 25, 2017
9,137
I kinda wish checkerboarding was available in the PC version. My non-RTX-having self would take that over AMD's CAS, which doesn't look too hot.
 

DrDeckard

Banned
Oct 25, 2017
8,109
UK
I just went on Twitter... and saw some responses to Alex and John about these videos and zooming in. Man... I am so glad I do not have to deal with that stuff; Twitter is absolutely horrible.
 

Sia

Attempted to circumvent ban with alt account
Banned
Jun 9, 2020
825
Canada
I wonder if Cerny is working on checkerboarding 2.0, which would improve checkerboard rendering, or whether they've completely tabled it and are aiming for native 4K.
 

Gitaroo

Member
Nov 3, 2017
7,985
This but unironically. I think there's a good chance DLSS will turn into an absolute game changer. I'm confident I'm better off spending my PS5 money on an Nvidia 3000 series this holiday season and just waiting for a PS5 Pro with some DLSS equivalent down the line. That had better happen.
Depends on your current PC setup; if you're on a mid-range 1000 series then sure. The next couple of years will be a cross-gen period, and DLSS, as amazing as it is, still isn't widely adopted. Waiting for the 4000 series, which should have a much bigger performance overhead than the PS5 and XSX and more DLSS-supported games, is more logical, and hopefully AMD will also have a true high-end competitor by then. Remember, DLSS needs to be supported per game; it's not a standard feature you can expect out of most PC games.
 

FoolsMilky

Member
Sep 16, 2018
485
Great stuff as always from Digital Foundry.

I think there's still a lot of demystifying to be done as it seems people don't quite understand the rendering pipeline from that "2.5ms" graph that keeps being posted.
Here's the slide before and the two slides after it from the GTC 2020 talk from Nvidia.
[Four slides from Nvidia's GTC 2020 DLSS talk: the slide before the "2.5ms" graph, the graph itself, and the two slides after it]

The current maximum added time on the second-lowest card (the 2060 Super, with the original 2060 likely being the "worst" card) is approximately 2.5ms. That time gets added on top of the cost of rendering a lower-resolution frame. The difference between rendering 540p and 1080p, 1080p and 1440p, or 1440p and 4K is demonstrably massive; it's the whole reason rendering at native 4K is avoided, because it's incredibly costly.

So that massive 2.5ms bar representing DLSS cost is not really massive at all.
Again for reference, this is how much time a frame normally gets:
30 fps = 33.33ms / frame
60 fps = 16.67ms / frame
100 fps = 10ms / frame
144 fps = 6.94ms / frame
240 fps = 4.17ms / frame
360 fps = 2.78ms / frame

tl;dr: The second slide is confusing people and making DLSS look "costly", when in reality it costs very little time relative to normal rendering. If it were that costly, it literally wouldn't improve performance. DLSS will only get better, further reducing that 2.5ms (it's already down to 1.5ms on a 2080 Ti, as can be seen on that graph).
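To put the same point in numbers, here's a quick back-of-the-envelope sketch (plain Python; the 2.5ms figure is the worst case quoted above, the rest is just arithmetic on the frame budgets):

```python
# How big is a ~2.5 ms DLSS pass relative to a full frame budget?
DLSS_COST_MS = 2.5  # upper bound quoted from the GTC slide (2060-class card)

for fps in (30, 60, 100, 144, 240, 360):
    budget_ms = 1000.0 / fps
    share = DLSS_COST_MS / budget_ms * 100
    print(f"{fps:>3} fps -> {budget_ms:5.2f} ms/frame, DLSS pass = {share:4.1f}% of the budget")
```

At 60fps the pass eats roughly 15% of the frame, which the much cheaper internal render resolution more than pays back; that's the whole point of the post above.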

Also someone had the rendering pipeline picture which broke down what a "frame" is made out of in normal rendering versus DLSS, but I can't find it. Would be helpful to repost.
 

Ploid 6.0

Member
Oct 25, 2017
12,440
Wow, the framerates for 1080p to 4K on Control. I remember why I didn't touch the first gen of RTX and especially 4K monitors; RT still has a way to go, even with DLSS.
 

Xun

Member
Oct 25, 2017
4,316
London
DLSS looks fantastic.

Hopefully we see an implementation of it in the Switch 2, but we shall see.
 

wafflebrain

Member
Oct 27, 2017
10,198
So what game has some of the biggest perf improvements to show off DLSS 2.0? Thankfully PC Game Pass has a handful that use it (FFXV, Deliver Us The Moon, Youngblood). Tempted to snag Control on EGS; with the discount plus a $10 coupon it's only $20 there right now. A little iffy on that due to hearing how demanding the RT is, since I only have a 2060S for RT. Would love to get Death Stranding but don't really want to double dip at the current price.
 
OP
ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
So what game has some of the biggest perf improvements to show off DLSS 2.0? Thankfully PC Game Pass has a handful that use it (FFXV, Deliver Us The Moon, Youngblood). Tempted to snag Control on EGS; with the discount plus a $10 coupon it's only $20 there right now. A little iffy on that due to hearing how demanding the RT is, since I only have a 2060S for RT. Would love to get Death Stranding but don't really want to double dip at the current price.
FF15 uses 1.0, so Control is the best usage. It's also coming to Steam in a couple of months.
 

Gitaroo

Member
Nov 3, 2017
7,985
I wonder if Cerny is working on checkerboarding 2.0, which would improve checkerboard rendering, or whether they've completely tabled it and are aiming for native 4K.
I think the PS5 will need the ID buffer hardware to emulate the PS4 Pro unless they have a workaround. But a higher frame rate should help with the artifacts created by checkerboard rendering, similar to how TAA shows less ghosting at higher frame rates because of all the frame blending. So if the same games, like GoW, run with the same checkerboard 4K but at 60fps on PS5, artifacts will be less noticeable. If all the PS5 first-party games are targeting native 4K at 30fps, they could offer a checkerboard 1800p mode at 60fps.
 

blitzblake

Banned
Jan 4, 2018
3,171
Is DLSS on amd cards? I've got a feeling it's not and I hate that people give PC a pass for this exclusive bullshit. If you bought the wrong card you just miss out on an amazing piece of tech...

never buying amd again, at this point you're a complete sucker if you do.
 

TeenageFBI

One Winged Slayer
Member
Oct 25, 2017
10,226
Is DLSS on amd cards? I've got a feeling it's not and I hate that people give PC a pass for this exclusive bullshit. If you bought the wrong card you just miss out on an amazing piece of tech...
DLSS 2.0 is computed on the Tensor Cores that are only found on RTX cards. Even without patents or whatever, it simply couldn't run on current AMD hardware.

Not to mention Nvidia's research into AI reconstruction. They spent a *lot* of time and money on RTSS.
 

CreepingFear

Banned
Oct 27, 2017
16,766
Is DLSS on amd cards? I've got a feeling it's not and I hate that people give PC a pass for this exclusive bullshit. If you bought the wrong card you just miss out on an amazing piece of tech...

never buying amd again, at this point you're a complete sucker if you do.
For pretty much every technology that AMD matches, Nvidia has a better version. G-Sync is superior to FreeSync, etc. It's unfortunate. I want AMD to compete.
 

Nzyme32

Member
Oct 28, 2017
5,245
Is DLSS on amd cards? I've got a feeling it's not and I hate that people give PC a pass for this exclusive bullshit. If you bought the wrong card you just miss out on an amazing piece of tech...

never buying amd again, at this point you're a complete sucker if you do.

This is pretty ridiculous.
"Exclusivity" has nothing to do with it. Nvidia decided to go run with this machine learning / AI effort for years, invested in it beyond gaming products - then decided to implement tensor cores into the new Geforce lineup to support ray tracing and efforts like DLSS, when other companies did not take those risks. People critisised the company for it at the time, particularly with increased pricing vs performance and minimal ray tracing support off the bat.
And they sure screwed up a bunch along the way in terms of pricing etc, but they walked that back with the "super cards", and eventually support for Ray Tracing has improved and been way more accepted since launch of the RTX cards, and DLSS has been iterated on to remove so many of the issues it had in 1.0 to this point.

AMD have there own efforts going on I'm sure, and as usual they'll be informed by what they see as working for other companies, but it was their decision not to take this path in favour of their own ideas, that had less associated risks. Both companies have their disadvantages and advantages. Clearly though, the risks Nvidia has taken have been paying off.

As Alex says in the video, this may well be the defacto way of up-scaling at least for PC in the near future. Both companies will continue to compete on their own ideas to compete for users, just as MS and Sony will dream up alternatives for their consoles. On the hardware side for the consoles, they are set in stone now and XSX is the only one with something similar. But there will be updated consoles that may have it. Maybe they have some other solution. Who knows.

"Never buying" from a company again because of something like this, where they made a long term bet, is pointless. I'll keep watching them compete and jump between which ever suits my interests and use cases
 

Spark

Member
Dec 6, 2017
2,538
Is DLSS on amd cards? I've got a feeling it's not and I hate that people give PC a pass for this exclusive bullshit. If you bought the wrong card you just miss out on an amazing piece of tech...

never buying amd again, at this point you're a complete sucker if you do.
People should applaud NVIDIA for pushing this tech into consumer products, and pressure AMD to compete. So far AMD just seems content with securing console contracts and playing catch-up in the GPU space.
 

Gitaroo

Member
Nov 3, 2017
7,985
People should applaud NVIDIA for pushing this tech into consumer products, and pressure AMD to compete. So far AMD just seems content with securing console contracts and playing catch-up in the GPU space.
It has been that way for a very long time, I remember they were cutting edge in the ATi days.... 9700pro...
 
OP
ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Is DLSS on amd cards? I've got a feeling it's not and I hate that people give PC a pass for this exclusive bullshit. If you bought the wrong card you just miss out on an amazing piece of tech...

never buying amd again, at this point you're a complete sucker if you do.
IMO, DLSS is probably one of the most important leaps in game rendering. 4K with the performance of 1080p?

At least there will be an equivalent from Microsoft, at some point. And Facebook is working on AI upscaling too.
 
Oct 25, 2017
9,872
I finally know why hair in so many PS4 games looks grainy. I thought that they had some hair engine and it was meant to look that way or something.
 

Lump

One Winged Slayer
Member
Oct 25, 2017
15,960
The idea that upscaling gives you a better picture than a native image is just crazy to me. Really, it's sounding like true 4K may end up being the exception next gen rather than the standard.

The secret sauce is that the AI upscaling techniques use 16K target baselines with their algorithm before scaling down to a much lower output resolution.
It has been that way for a very long time, I remember they were cutting edge in the ATi days.... 9700pro...

Still the GOAT video

 

seldead

Member
Oct 28, 2017
453
Game graphics have entered a new and very dynamic stage of computer science problem-solving. The first problem is increasing resolution on the cheap, computationally speaking, because 4K isn't worth it in terms of computational resources: it just doesn't look enough better than 1440p, or perhaps even 1080p, to be worth calculating in the traditional manner. So things like temporal upscaling, CBR, and DLSS are all about coming up with cleverer, quicker ways to extrapolate a larger image from the image you have already calculated at great computational expense. It's a great compsci challenge and the field seems to be evolving very quickly. Guerrilla woke people up with their temporal upscaling for the 60fps multiplayer in Killzone when the PS4 released.

DLSS is interesting in that it essentially takes advantage of offline computation. In the same way that pre-baked lighting gives you far better lighting than you could do in real-time, DLSS takes advantage of having the tensor cores "trained" by running the game over and over at high res on a powerful PC. The tensor cores can extrapolate a high res image because that training teaches them what to expect.

I wonder how much, like pre-baked lighting, you lose flexibility on DLSS. Pre-baked lighting can't light objects that weren't in the original image (moving NPCs for example), and I wonder if DLSS' visual flaws are due to situations that the tensor cores haven't been trained on. Perhaps rare and inherently unpredictable visual situations (rain, dynamic weather, dynamic and destructible environments) give rise to DLSS hiccuping. What are the limits on how much they can be trained?

Another big area of comp sci innovation is obviously ray-tracing, denoising, and so on.

Ultimately the more dynamic and flexible the compsci solution, the better for devs, because it frees them and makes things cheaper. Pre-baked lighting is expensive and time-consuming in terms of the production process, and I'd imagine that training machine-learning cores might be as well. Device-specific solutions, like Nvidia-only solutions, are less likely to be embraced widely if they add to production time and cost.

But there's great progress on this being made every day, and the new consoles have a good amount of computational power to go around, so there is plenty of scope for smart programmers to come up with good solutions that don't rely on tensor cores specifically.

This is a slightly misspecified appraisal of DLSS. Supervised machine learning aims to fit a parameterised function to a set of training data, with the goal of the learnt function then generalising to perform well on unseen data. The generalisation is achievable due to the probabilistic nature of the learnt function. It isn't hard-coding discrete rules for every specific instance (which would be impossible to do for unseen instances); instead, ML techniques attempt to learn a general functional form for the underlying data process and then perform inference on unseen data with it. As a result, the ability of ML to succeed at tasks on unseen information far exceeds that of completely offline techniques such as pre-baked lighting (which does in fact define a rule for every seen instance in offline rendering and therefore can't generalise).

This ability of ML to perform well on unseen data is further improved with modern deep learning techniques such as transfer learning, meta-learning, and domain adaptation. DLSS 2.0 takes particular advantage of transfer learning in that it doesn't need to be trained on the specific games that implement it. Instead, the model is trained on a large amount of data for an upstream task that Nvidia has hypothesised will generalise to all games that use DLSS. The quality of the learnt parameters from the upstream task is then evaluated on the primary task of decoding the high-resolution images of the video games.

So, while not perfect yet, with the maturation of techniques and increase in available data there is no reason why DLSS or a competitor couldn't improve to the point where no perceivable flexibility is lost.
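For anyone who hasn't met the jargon, here's a toy illustration of that "fit a parameterised function on training data, then generalise to unseen data" idea (plain numpy; a deliberately tiny stand-in, nothing to do with Nvidia's actual network or training pipeline, which isn't public in detail):

```python
import numpy as np

rng = np.random.default_rng(0)

# "Training data": noisy samples of some underlying process (here y = sin(x)).
x_train = rng.uniform(0, 2 * np.pi, 200)
y_train = np.sin(x_train) + rng.normal(0, 0.1, x_train.shape)

# Fit a parameterised function (a degree-7 polynomial) to the training set.
coeffs = np.polyfit(x_train, y_train, deg=7)

# Evaluate on *unseen* inputs. The model generalises because it learnt the
# general shape of the process, not a lookup table of the training points.
x_test = rng.uniform(0, 2 * np.pi, 50)
prediction = np.polyval(coeffs, x_test)
print("mean abs error on unseen data:", np.mean(np.abs(prediction - np.sin(x_test))))
```

Scale that idea up to a network trained on pairs of low-resolution frames and very high-resolution reference frames and you get the gist of why it can handle content it never saw during training.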
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
Is DLSS on amd cards? I've got a feeling it's not and I hate that people give PC a pass for this exclusive bullshit. If you bought the wrong card you just miss out on an amazing piece of tech...

never buying amd again, at this point you're a complete sucker if you do.

Exclusive bullshit? What?
How is it NV's fault that AMD doesn't have anything comparable? (for the love of god people don't bring up RIS/fidelityFX)
Should they send some engineers AMD's way to help them on their ML implementation?
 

Alexandros

Member
Oct 26, 2017
17,800
Is DLSS on amd cards? I've got a feeling it's not and I hate that people give PC a pass for this exclusive bullshit.

I think that you are looking at this the wrong way. Nvidia developing DLSS and adding hardware to its graphics cards to accelerate it is the sort of technological innovation that pushes the industry forward. This isn't a case of something available to both companies where Nvidia keeps it exclusive through deals and moneyhats; it's technology that Nvidia developed and AMD doesn't yet have. If and when AMD does, and developers don't support it because of deals with Nvidia, then it would be a case of exclusive bullshit.
 

blitzblake

Banned
Jan 4, 2018
3,171
Exclusive bullshit? What?
How is it NV's fault that AMD doesn't have anything comparable? (for the love of god people don't bring up RIS/fidelityFX)
Should they send some engineers AMD's way to help them on their ML implementation?
I think that you are looking at this the wrong way. Nvidia developing DLSS and adding hardware to its graphics cards to accelerate it is the sort of technological innovation that pushes the industry forward. This isn't a case of something available to both companies where Nvidia keeps it exclusive through deals and moneyhats; it's technology that Nvidia developed and AMD doesn't yet have. If and when AMD does, and developers don't support it because of deals with Nvidia, then it would be a case of exclusive bullshit.
I'm just salty that I have an AMD card. G-Sync, ray tracing and now DLSS. I don't think I'll be buying AMD again.
 

Alexandros

Member
Oct 26, 2017
17,800
I'm just salty that I have an AMD card. G-Sync, ray tracing and now DLSS. I don't think I'll be buying AMD again.

Never say never. I'm sure many people thought that AMD wouldn't be able to keep up with Intel, yet here we are. Also, DLSS would have to gain mass adoption in order to become a serious competitive advantage.
 
Oct 27, 2017
6,348
Weird, I recently started the game on PC and thought the trails behind black particles were on purpose. What a strange side effect. But it works with the look of the game at least.
 
Jan 21, 2019
2,902
I'm not the only one that wants to see what DLSS AA does to a native 4K image, right? Instead of upscaling, I want to see it used specifically for AA.
 

dstarMDA

Member
Dec 22, 2017
4,289
Honestly the compromise is obviously worth it for now, but I'm starting to get tired of reading "4K for 1080p rendering cost" on every page of this thread. There are still very obvious artifacts, and a native 4K image is definitely better, even if most people don't notice or don't care.

That said, the temporal nature of many rendering techniques and post-process effects in modern games introduces artifacts of some kind most of the time anyway, so we're still one or two generations away from having pristine IQ, I guess, and in the meantime reconstruction techniques are the way to go.
 

nelsonroyale

Member
Oct 28, 2017
12,124
Tried this out on PC...the framerate and visual refinement makes quite a big difference with this game. It looked great on PS4 Pro, but there was some annoying aliasing. Now that is taken care of, it is pretty obvious this game kind of highlights the value of console optimisation, or maybe it is how excellent the Decima engine is. I am playing this alongside Gears 4 on PC at ultra 4k, and DS looks significantly more impressive.
 

Nooblet

Member
Oct 25, 2017
13,622
Tried this out on PC...the framerate and visual refinement makes quite a big difference with this game. It looked great on PS4 Pro, but there was some annoying aliasing. Now that is taken care of, it is pretty obvious this game kind of highlights the value of console optimisation, or maybe it is how excellent the Decima engine is. I am playing this alongside Gears 4 on PC at ultra 4k, and DS looks significantly more impressive.
Well, Gears is also a superbly optimised game on both console and PC.
And while DS may look more impressive than Gears 4, Gears 5 is several orders of magnitude more impressive than Gears 4 and can stand toe to toe with Death Stranding and other games.
 

Cyberclops

Member
Mar 15, 2019
1,439
I could see a scenario where Microsoft/AMD could implement a similar technique on the Series X. They've already touted the machine learning capabilities of it. If the Series X can run a game at 1080p at around 85 fps, it can use 5 ms per frame to upscale to 4k running at 60fps.

If you're CPU-bound at 60 fps, does running DLSS lower your overall frame rate?
 

MrKlaw

Member
Oct 25, 2017
33,038
I could see a scenario where Microsoft/AMD could implement a similar technique on the Series X. They've already touted the machine learning capabilities of it. If the Series X can run a game at 1080p at around 85 fps, it can use 5 ms per frame to upscale to 4k running at 60fps.

If you're CPU-bound at 60 fps, does running DLSS lower your overall frame rate?

It would mostly run on the GPU, so it shouldn't. But 5ms out of 16 is a huge amount. I guess technically, if you can render 1080p/60 in about 11ms as you say, then it would be possible. But realistically it seems like a huge amount of frame-time budget for reconstruction. I think it is more likely developers will come up with alternative solutions that fit more in the 1.5-2.5ms area, either regular GPU-compute based or leveraging RPM if it speeds things up (limited/more optimised versions).

What is the ms cost of the 'performance' DLSS rather than quality? Maybe that'd be more in the ballpark for XSX?
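Rough numbers for that scenario (all hypothetical, following the figures in the posts above; the 5ms reconstruction cost is the assumption being debated, not a measured number):

```python
# Does "1080p at ~85 fps plus a 5 ms reconstruction pass" fit a 60 fps budget?
BUDGET_60FPS_MS = 1000.0 / 60      # ~16.67 ms per frame
render_1080p_ms = 1000.0 / 85      # ~11.76 ms if the game really does hit 85 fps
upscale_ms = 5.0                   # hypothetical reconstruction cost

total_ms = render_1080p_ms + upscale_ms
verdict = "fits" if total_ms <= BUDGET_60FPS_MS else "just misses"
print(f"{render_1080p_ms:.2f} ms render + {upscale_ms:.2f} ms upscale = "
      f"{total_ms:.2f} ms vs a {BUDGET_60FPS_MS:.2f} ms budget -> {verdict}")
```

At exactly 85fps it lands right on the edge (about 0.1ms over), so you'd want closer to 86fps of raw headroom, or a cheaper pass in the 1.5-2.5ms range as suggested above.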
 

nelsonroyale

Member
Oct 28, 2017
12,124
Well, Gears is also a superbly optimised game on both console and PC.
And while DS may look more impressive than Gears 4, Gears 5 is several orders of magnitude more impressive than Gears 4 and can stand toe to toe with Death Stranding and other games.

Looking forward to trying out Gears 5 after I complete 4. I find the quality and complexity of geometry in Gears 4 pretty disappointing to be honest, although as you say performance is very good. I am also playing it alongside Quantum Break on PC at 4k, etc, and that looks damn good. Definitely more impressive than Gears 4. That game was obviously held back massively on Xbone by IQ.
 

Ganado

▲ Legend ▲
Member
Oct 25, 2017
2,176
Did you ever notice how AO, SSR, and hair never render right?
They just render as little flicker spots, more or less, without TAA.
Oh yeah, sure. I just don't know if I would say that the game breaks without TAA. While not optimal, I'd personally rather go for full sharpness with aliasing on hair than some kind of blur/ghosting. I just tested with TAA on and off, and while I agree that the picture is overall better with it, camera movement, for example, doesn't feel as snappy with a mouse. Also, as a side note, when I use TAA my character's costume flickers like crazy.
 

dsk1210

Member
Oct 25, 2017
2,389
Edinburgh UK
Oh yeah, sure. I just don't know if I would say that the game breaks without TAA. While not optimal, I'd personally rather go for full sharpness with aliasing on hair than some kind of blur/ghosting. I just tested with TAA on and off, and while I agree that the picture is overall better with it, camera movement, for example, doesn't feel as snappy with a mouse. Also, as a side note, when I use TAA my character's costume flickers like crazy.
I don't know how you can play it without the TAA; it's a shimmering mess if you turn it off.
 

Pargon

Member
Oct 27, 2017
11,992
Is DLSS on amd cards? I've got a feeling it's not and I hate that people give PC a pass for this exclusive bullshit. If you bought the wrong card you just miss out on an amazing piece of tech...
never buying amd again, at this point you're a complete sucker if you do.
It's not available on AMD GPUs; but it's not like NVIDIA has bought up exclusivity and prevented AMD from using it.
It's a technology that they have developed themselves, for their own specific hardware that AMD does not have an equivalent for even if it was not restricted to NVIDIA GPUs.
And I've been telling people here ever since the RTX cards launched, that no-one should buy an AMD GPU when they lack equivalents to the RTX or Tensor cores.

DirectML is probably going to be the nearest competitor, since it's developed by Microsoft and should be vendor-agnostic (but still accelerated by the Tensor cores).
While RDNA2 may not have an equivalent to the Tensor cores, with a discrete hardware block just for AI acceleration, that doesn't mean they won't be able to run anything equivalent. They just don't have the speed to run NVIDIA's exact DLSS 2.0 implementation.

Look at how post-process AA has evolved from MLAA to FXAA, SMAA, TAA, checkerboard rendering.
We have seen dramatic improvements across a generation, running on the same hardware.
There's no reason to think that it won't be possible to have something rivaling or surpassing DLSS just because the hardware is weaker.
In the short term, sure, there's certainly nothing else like it; but I hope that it does get more competitive.
 

mhayze

Member
Nov 18, 2017
555
Beyond DLSS, inference (calculating results based on a learned model) is a very useful workload across the board. It's something that tensor cores accelerate on newer Nvidia GPUs, but it's not exclusive to Nvidia or proprietary (outside of the particular implementation of DLSS). Lots of phone SoCs have inference-acceleration cores too, and in data centers there are new startups making very large inference- and learning-focused ASICs (not traditional GPUs).
Nvidia has been working on and investing in this space heavily for a while now, for a variety of use cases including self-driving-car computation, and has other ML-workload-based solutions, like using the GPU to clean up microphone audio (isolating the speaker from background noise), which is pretty cool, plus other projects like their random face generator that show where things could go.
Anyway, I don't think DLSS is something that no one else can do, and the current DirectML video-based upscaler is a related but not that ideal technique; I think other temporal/inference reconstruction techniques could do well on other GPUs. Having hardware to accelerate inference would definitely help, and it remains to be seen what the new AMD GPUs can do here, but there are hints that the XSX and PS5 have optimisations for faster low-precision work like inference, and hopefully PC cards have that too.
Besides hardware, pre-packaged libraries could help too, so hopefully MS and AMD are working on that as well.
 

Cow Mengde

Member
Oct 26, 2017
12,697
What people here don't realize is that Nvidia is a decade ahead of AMD in developing AI. Even if AMD develops their own solution to DLSS, they would still be playing catch up with Nvidia.
 

Alastor3

Attempted to circumvent ban with alt account
Banned
Oct 28, 2017
8,297
Sorry to bump an old thread, but do we know how well DS did on Steam?
 

Galava

▲ Legend ▲
Member
Oct 27, 2017
5,080
Death Stranding was updated with DLSS 2.1 Ultra Performance mode. That's for 1440p -> 8K, right?