
Plax

Member
Nov 23, 2019
2,820
I don't mean to come off as rude, but did you not do any research before dropping that kind of cash?

I definitely did my research and managed to order a 6800 XT for a fair bit less than a 3080 would cost me. I was aware of DLSS of course, but this thread is full of positive impressions that make me want AMD to release their version.

I hope you didn't buy the AIB models that are actually more expensive than the 3080.

No. My order is for a stock card at MSRP. No idea if the order will be fulfilled with all of the stock issues. But fingers crossed.
 

Bonfires Down

Member
Nov 2, 2017
2,816
I'd like to say no but I might have to say yes. AMD is just going to have to lower their prices. The extra VRAM on AMD cards is very nice, but DLSS'd games also use less VRAM than native.
 

marcbret87

Member
Apr 20, 2018
1,367
I'm going against the grain here and saying that DLSS doesn't matter unless you mostly play AAA games.

For me, I rarely touch AAA games (mostly Total War), so I don't think DLSS matters at all. Unless DLSS starts to be available in almost every small game, like Project Wingman and Euro Truck Simulator, it will never be an important feature.

Don't misunderstand me, it is a fucking impressive piece of tech. However, like most of Nvidia's proprietary features, it seems to only get added to big titles.

If you don't play big AAA games then you likely don't need a super high-end GPU to begin with. Which is fine, of course, but the key thing is that DLSS is supported in the games that really need it.
 

p3n

Member
Oct 28, 2017
650
Radeon cards lack the hardware components to benefit from a DLSS-like approach. Calculations could be run through the CUs and the resulting image quality improvements would be identical but at a loss of performance instead of a gain. The algorithms/cascades used in DLSS are nothing new or groundbreaking - just having dedicated hardware on a GPU to run them at a net performance gain is. There will be no DLSS-equivalent on PS5/XSX or R6000 cards. AMD's solution is already available: a filtered upscaler with a sharpness slider and it is called CAS (formerly RIS).
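A CAS-style pass like the one described above can be sketched in a few lines. This is a rough illustrative guess at the technique, not AMD's actual shader — the function name, weights, and adaptive term are all mine:

```python
import numpy as np

def cas_sharpen(img, sharpness=0.5):
    """Toy contrast-adaptive sharpening on a 2D grayscale image in [0, 1].

    Local contrast from each 3x3 neighborhood scales the strength of a
    negative-lobe kernel: flat regions get little sharpening, detailed
    regions get more — the "adaptive" part of CAS.
    """
    padded = np.pad(img, 1, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            n = padded[y:y + 3, x:x + 3]       # 3x3 neighborhood
            mn, mx = n.min(), n.max()
            # More headroom above/below the neighborhood -> stronger effect.
            amp = np.sqrt(min(mn, 1.0 - mx) / max(mx, 1e-6))
            w_neg = -amp * sharpness / 4.0     # weight of the 4 cross neighbors
            cross = n[0, 1] + n[1, 0] + n[1, 2] + n[2, 1]
            out[y, x] = np.clip((n[1, 1] + w_neg * cross) / (1.0 + 4.0 * w_neg),
                                0.0, 1.0)
    return out
```

Nothing here needs tensor cores; a pass like this runs fine on ordinary compute units, which is exactly why AMD could ship CAS without DLSS-style hardware.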
 

Spoit

Member
Oct 28, 2017
3,989
And your point is? Should I be happy with the few select games in that quote?

I really like DLSS and would love to have it in more games. Look around this thread. A lot of people agree the support is limited and they want DLSS in more games.
Hopefully adoption will be much wider in the future now that it's part of the main UE4 branch, instead of being something Nvidia needed to work on with developers on a per-game basis.
 

Prefty

Banned
Jun 4, 2019
887
It's extremely important... and that is why the industry needs an open-source solution, free for everyone to use. It will come, just give it time.

(it will happen, just like Gsync/freesync)
 
Oct 27, 2017
3,894
ATL
DLSS is incredible tech. Even if AMD produces a solution that's similar, there's still the issue of the fact that DLSS is hardware accelerated via tensor cores.

I am curious whether the image reconstruction quality of DLSS will ever improve to the point where it can produce results similar to DLSS Quality from a quarter of the output resolution or lower. It's hard to imagine where the tech could go other than getting better quality from even lower source resolutions, so that greater performance can be eked out of existing hardware without visual quality loss.
 

King_Moc

Member
Oct 25, 2017
4,126
Yep, we went from ray tracing being a pipe dream, to me playing cyberpunk with rtx reflections and lighting, at 4k, 40 - 70fps. All because of DLSS.

2.5 years ago you'd have been laughed at for suggesting this would happen.
 

pg2g

Member
Dec 18, 2018
4,811
And perhaps that's the way forward. It hasn't been so far, though; it's always the same cycle: new games and features when the new consoles hit, and a year or two in, the PC cards are strong enough to brute-force everything for the next couple of years. Of course the performance can always be used elsewhere, but will it? Feels like a solution to a made-up problem.

Real-time rendering is all about doing as little as you can to achieve the desired result. That is why techniques like occlusion culling and variable rate shading exist. DLSS is just another one of those techniques. Advancements come from innovation on both the hardware and software fronts.
 

Deleted member 49611

Nov 14, 2018
5,052
I was tempted to get a 6800 XT, but until AMD can offer something to rival DLSS I'm not touching any of their cards. I don't care about the raytracing performance. DLSS helps either way, whether you're running RTX or not. Every game that supports it, I enable it.
 

bes.gen

Member
Nov 24, 2017
3,353
Repeating the whole thread: yeah, it's a game changer.
You basically set your resolution pretty low and get high-resolution image quality (think 1080p giving a 4K image), plus all the free performance boost that entails.
It's like the holy grail of graphics settings.
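The arithmetic behind that "free performance" is mostly just pixel counts; a quick sketch, using the commonly cited 1080p internal resolution for 4K Performance mode:

```python
# Shading work scales roughly with pixels rendered, so dropping the
# internal resolution is where the DLSS performance headroom comes from.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)        # what the monitor displays
internal_1080p = pixels(1920, 1080)   # what 4K DLSS Performance actually renders
print(native_4k / internal_1080p)     # -> 4.0, i.e. ~4x fewer pixels shaded
```

The real-world speedup is less than 4x in practice, since only part of the frame cost scales with pixel count and the DLSS pass itself isn't free.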
 

Spoit

Member
Oct 28, 2017
3,989
Can't train DLSS 1.0 on a modded game; it's going to be trained on vanilla. 2.0 is better, though modders probably won't have the extra time, or perhaps the ability, to code for DLSS 2.0. Truth is we don't know, but it's mostly for the biggest releases so far. Someone brought up G-Sync and FreeSync; in my mind, until DLSS goes platform-agnostic it's not worth thinking about except in a very limited sense.
What are you even talking about? Who's even using 1.0 any more? And when have modders ever dug deep enough into the code to be able to program their own TAA solution?
 

jakershaker

Member
Oct 28, 2017
202
Real-time rendering is all about doing as little as you can to achieve the desired result. That is why techniques like occlusion culling and variable rate shading exist. DLSS is just another one of those techniques. Advancements come from innovation on both the hardware and software fronts.

Could be. Something tells me that there will be cards that can run Cyberpunk with raytracing at 4K without using DLSS. People will then choose image quality over extra frames.

What are you even talking about? Who's even using 1.0 any more? And when have modders ever dug deep enough into the code to be able to program their own TAA solution?

Who knows. From what I've read and seen, it seems like DLSS is something new that high-budget games will use. And tbh I didn't say people were still using 1.0; I used it as an example. 2.0 is better, but still not easily adopted into a mod afaik.

For some weird reason people are getting hyped up over this. Go on, do your thing. My recommendation is still just get whatever card you can; most of them are queued up until forever here.
 

dimb

Member
Oct 25, 2017
737
Never said it was 4K exclusive. Just not worth it at all for laptop gamers, given the resolutions we play at, because the returns are poor. As for it potentially being a better AA: being generous, it is a game-by-game scenario applied to a pool of supported games that, as of now, is insignificant quantity-wise.

Although I guess it's worth it if you want to spend (waste, imo) money on a laptop to play at 4K?
What is "worth it"? You can still render at resolutions below the native res of the monitor/screen and upscale with DLSS for a performance bump. What is the "waste" of money even supposed to refer to? Nvidia is so pervasive across the laptop sector that you would have to go extremely far out of your way to avoid them.
 

Galava

▲ Legend ▲
Member
Oct 27, 2017
5,080
DLSS is just too good to pass on. It lowers VRAM usage, increases framerate, and makes RT playable, all while looking better than TAA or similar.
 

Ada

Member
Nov 28, 2017
3,736
Finished Control last week with all the RTX goodies on and it was amazing. DLSS made it possible.
 

Alvis

Saw the truth behind the copied door
Member
Oct 25, 2017
11,231
Spain
I don't think you understand how DLSS works, cuz it is not to be used on 1080p or 720p displays.
It only really starts to shine at 1440p/4K/8K and actively makes shit worse when going from 720p to 1080p.
Not true. I used it on Control on my 1080p display before I upgraded to 1440p. It allowed me to play it with RT enabled at very high framerates on my 3070 (90-100 FPS). It looked almost as good as a native 1080p image.

On Death Stranding, DLSS Quality at 1080p straight up looked better than native 1080p.

As for how good it would work to reconstruct from a lower resolution to 720p in handheld mode, no idea, but I bet it's doable too.
 
Oct 27, 2017
5,618
Spain
To be fair, consoles have been using image reconstruction techniques for years that produce very good looking results with huge performance savings, without the need for AI. Not just checkerboarding, but also stuff like Insomniac's temporal reconstruction method that games like Spider-Man/Miles Morales/Demon's Souls use. That method produces almost indistinguishable results between native 4K and reconstructing from 1440p.
 

Shiz Padoo

Member
Oct 13, 2018
6,117
I've seen great things on Digital Foundry about DLSS and it sure is tempting, but I'm not ready to upgrade just yet. So, while not exactly the same thing, I'll use this FidelityFX upscaling and sharpening thing where possible and where necessary until I do. For example, my 1660 Ti runs Monster Hunter World at pretty much 1080/60 with highest settings. 1440p is around 40fps. Enable FidelityFX and 1440p is 60fps and looks very nice indeed. Literally cannot argue with that.

So for me, the choice is difficult as my GFX card is still fairly capable. But for anyone struggling to reach 30fps on lowest settings today, which for me personally is upgrade territory, DLSS is a no-brainer.
 

Pwnz

Member
Oct 28, 2017
14,279
Places
If the next switch has DLSS built-in, it's going to be a game changer

Yup. Next-gen consoles could do 8K DLSS in 6 years. 8K native would take nearly triple that timeline, assuming we haven't made a huge advancement to replace silicon with something capable of massively higher frequencies.
 

Jroc

Banned
Jun 9, 2018
6,145
As a 3080 owner I've got to call DLSS out a bit. The performance benefit is great, but it still looks noticeably worse than a native image. Cyberpunk at 1440p looks a lot more detailed than 1440p Quality DLSS to me. The quality also varies when the game is in motion, so the still image comparisons you see online don't tell the whole story.

It's the best way to game if you want to game at sub-native res, but at the end of the day it still looks like sub-native res. I like having the option though since a blurry game with RT can often look better than a clearer game with no RT.
 

Freshmaker

Member
Oct 28, 2017
3,928
As a 3080 owner I've got to call DLSS out a bit. The performance benefit is great, but it still looks noticeably worse than a native image. Cyberpunk at 1440p looks a lot more detailed than 1440p Quality DLSS to me. The quality also varies when the game is in motion, so the still image comparisons you see online don't tell the whole story.
Turning off film grain and chromatic aberration (why are these "features" ever turned on?) helped DLSS resolve that game quite a bit closer to native for me. Lets me hit 60fps at 1440p with all the RTX stuff cranked, at least.
 
Last edited:

the botanist

Member
Jun 18, 2018
19
Cyberpunk at 1440p looks a lot more detailed than 1440p Quality DLSS to me. The quality also varies when the game is in motion, so the still image comparisons you see online don't tell the whole story.

You could also look at it from another perspective: compare native 1440p with 4K DLSS where the internal res is 1440p (that's the Quality setting, right?). If you downsample the 4K DLSS image to 1440p and then compare it to native 1440p, the image quality should be drastically better. In this comparison the gain is in image quality rather than performance.
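For reference, the internal render resolutions per DLSS mode work out as below. The per-axis scale factors (Quality 2/3, Balanced ~0.58, Performance 1/2) are the commonly reported ones, and `internal_res` is just an illustrative helper:

```python
# Per-axis scale factor applied to the output resolution for each mode.
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 1 / 2}

def internal_res(out_w, out_h, mode):
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Quality"))  # -> (2560, 1440)
```

So 4K Quality does render internally at 1440p, which is what makes comparing it against native 1440p an apples-to-apples comparison on render cost.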
 

mztik

Member
Oct 25, 2017
3,274
Tokyo, Japan
You asked me what my point was and I gave it to you.

Except that I asked what the point was of you quoting my original post with that stupid quote about "here comes someone complaining bububububut about 0.0003% of games supporting DLSS". Instead of answering my bewilderment at such an insulting reply, you went on explaining why there isn't that much DLSS support. I didn't ask, but thanks, I'm already aware of the situation.

My original post to the OP states that the technology is amazing, but there aren't enough games supporting it, with a link to a list of supported games. Not really a complaint, more like a wish for more.

If you're going to try to turn this into snarky ad hominem

And yet, this is exactly what you did with my original post.

 

karnage10

Member
Oct 27, 2017
5,505
Portugal
If you don't play big AAA games then you likely don't need a super high-end GPU to begin with. Which is fine, of course, but the key thing is that DLSS is supported in the games that really need it.
Obviously I don't need a super high-end GPU, but if I wanted to max games out at 4K I'd need a relatively good GPU. If more games had DLSS then I could use a weaker GPU.
For example, I have a 1070 and I can't play the following games at max settings @ 1440p:
  • Total War (the new ones)
  • Battlefleet Gothic: Armada 2
  • Call to Arms
  • Transport Fever 2
  • Planet Zoo
I imagine that if I were playing games @ 4K the list would be even longer. Personally I'd love to have DLSS in the games I play; I'd upgrade my GPU quickly if that happened.
DLSS feels like unrealistic sci-fi tech, and it is a shame it is only being used in AAA games, which, I'll be blunt, are IMO the games that need it the least, because the difference between high and max is usually something I have a hard time seeing.

Looking at Total War, going between medium, high, and max unit detail is the difference between a murky unit and an FPS-quality model in battle. I mean, this is the quality of the Warhammer 2 models:
[image: Warhammer 2 character model render]

Can you imagine if the GPU could handle texturing 10000 models of this quality?
 

mephixto

Member
Oct 25, 2017
306
As a 3080 owner I've got to call DLSS out a bit. The performance benefit is great, but it still looks noticeably worse than a native image. Cyberpunk at 1440p looks a lot more detailed than 1440p Quality DLSS to me. The quality also varies when the game is in motion, so the still image comparisons you see online don't tell the whole story.

It's the best way to game if you want to game at sub-native res, but at the end of the day it still looks like sub-native res. I like having the option though since a blurry game with RT can often look better than a clearer game with no RT.

I noticed a little blur too, but try using Image Sharpening in the Nvidia Control Panel; it worked for me.


 
Oct 25, 2017
6,086
IMO unless you know you only care about rasterization and will never ever touch raytracing (less and less likely, as even the new consoles support it to an extent), AMD is a non-player this gen.

Outside of the 6900 XT, AMD's offerings aren't insanely better than the competition for the price, and once you turn on DXR they pale in comparison to even the 3060 Ti, which has no AMD competition in its price tier. This is before even mentioning DLSS, which pretty much ups your card's performance by a price tier or more depending on the implementation.

I'm sure the 6000 refresh or 7000 series of Radeon will catch up in RT performance, but with no DLSS analogue out now, there's no real reason to go for the current GPUs; you should just go Nvidia or wait until AMD releases one. Yes, they're working on something, but it isn't ready now, and gamers of all people should know never to purchase a product on the promise of future updates; only spend the money now if you're happy with what you'd get now.
 

Puggles

Sometimes, it's not a fart
Member
Nov 3, 2017
2,871
Yes. It's already necessary.
As a 3080 owner I've got to call DLSS out a bit. The performance benefit is great, but it still looks noticeably worse than a native image. Cyberpunk at 1440p looks a lot more detailed than 1440p Quality DLSS to me. The quality also varies when the game is in motion, so the still image comparisons you see online don't tell the whole story.

It's the best way to game if you want to game at sub-native res, but at the end of the day it still looks like sub-native res. I like having the option though since a blurry game with RT can often look better than a clearer game with no RT.

Have you tried DLSS in Control? I think it looks better than native, so it really depends on the game. Nvidia's sharpening filter helps a lot in some games too.
 

Cleve

Member
Oct 27, 2017
1,022
It feels like a must for me. Not right this second, but I like to make my video cards last at least two years, and I have no doubt the adoption rate for DLSS 2+ will be huge now that it's in the base build of Unreal and only likely to grow from there.
 

Daeoc

Member
Oct 27, 2017
184
MA
Make sure you also consider OpenGL performance, AMD's OpenGL drivers are terrible, especially in emulation performance.

After seven years of being AMD only, I switched to NVIDIA(3070) and it feels nice to not go "well I guess this won't perform well" whenever OpenGL pops up or having to completely ignore many features(NVIDIA tech like HairWorks, DLSS, NVDEC, assembly shaders on Yuzu).

Although DLSS is one of the big reasons that should make you want to switch to NVIDIA, there are other reasons to consider it.
 
OP
OP
Grifter

Grifter

Member
Oct 26, 2017
2,573
Make sure you also consider OpenGL performance, AMD's OpenGL drivers are terrible, especially in emulation performance.

After seven years of being AMD only, I switched to NVIDIA(3070) and it feels nice to not go "well I guess this won't perform well" whenever OpenGL pops up or having to completely ignore many features(NVIDIA tech like HairWorks, DLSS, NVDEC, assembly shaders on Yuzu).

Although DLSS is one of the big reasons that should make you want to switch to NVIDIA, there are other reasons to consider it.
I'm having trouble recalling the last time I ran OpenGL. Is that common for current emulators?

Now I'm curious about HairWorks.
 
Oct 27, 2017
3,894
ATL
Using DLSS to supersample at higher performance is really good. If you have great 1440p performance on a 1440p native monitor, running a game at 4K Quality DLSS will provide a pristine image with minor performance loss.
 

Daeoc

Member
Oct 27, 2017
184
MA
I'm having trouble recalling the last time I ran OpenGL. Is that common for current emulators?
Yes, OpenGL is the best API for most Nintendo emulation. There is the option of Vulkan in some of the newer ones, but it is still not as stable as OpenGL since Vulkan was implemented later on.

What I've noticed when it comes to emulation is that you either have the option of DirectX or OpenGL, or only have OpenGL until they add Vulkan.

Outside of emulation, AMD GPUs are likely to underperform in native OpenGL PC games when compared to their NVIDIA counterpart. For example, Wolfenstein The New Order is built on the idTech 5 engine which uses OpenGL. The newer idTech 6 and 7 seen in Doom and Doom Eternal do use Vulkan though.

OpenGL is more of a concern when it comes to older games and emulation.
 
Last edited:

the botanist

Member
Jun 18, 2018
19
Yes, OpenGL is the best API for most Nintendo emulation. There is the option of Vulkan in some of the newer ones, but it is still not as stable as OpenGL since Vulkan was implemented later on.

Don't wanna derail the thread into some graphics API war nonsense but I disagree with this statement.

Dolphin works like a charm with basically every graphics API, including Vulkan, DirectX 11 and 12, as well as OpenGL. Then there's Cemu, where the Vulkan backend has gotten quite capable with features like async shader compilation. As for Switch emulation I don't have much experience, but seeing that yuzu has a Vulkan backend, I guess it should also work well?

Anyway, the point is, Vulkan is supported in most major emulators and works quite well even today. Seeing most emulator folks, and also a significant part of the gaming industry, moving in this direction shows that Vulkan is the cross-platform graphics API of the future.

AMD's bad OpenGL drivers on Windows are odd though. I know they simply don't care, but they should be ashamed, with the third-party Linux drivers being just so much better.
 

Deleted member 17289

Account closed at user request
Banned
Oct 27, 2017
3,163
DLSS is good if tuned properly by devs; CP2077's DLSS looks blurry, and you have to separately add sharpening for it to look nice. It is definitely the defining feature and selling point of Nvidia GPUs.
 

Dreamwriter

Member
Oct 27, 2017
7,461
DLSS allowed me to get a solid 60fps in Cyberpunk with almost all effects (including Ray Traced everything but shadows) on Ultra at 1440p. Without DLSS, the framerate was far, far worse and I would have had to sacrifice more effects.
 

Daeoc

Member
Oct 27, 2017
184
MA
Don't wanna derail the thread into some graphics API war nonsense but I disagree with this statement.

Dolphin works like a charm with basically every graphics API, including Vulkan, DirectX 11 and 12, as well as OpenGL. Then there's Cemu, where the Vulkan backend has gotten quite capable with features like async shader compilation. As for Switch emulation I don't have much experience, but seeing that yuzu has a Vulkan backend, I guess it should also work well?

Anyway, the point is, Vulkan is supported in most major emulators and works quite well even today. Seeing most emulator folks, and also a significant part of the gaming industry, moving in this direction shows that Vulkan is the cross-platform graphics API of the future.

AMD's bad OpenGL drivers on Windows are odd though. I know they simply don't care, but they should be ashamed, with the third-party Linux drivers being just so much better.
I did say most, as Dolphin is one of the best emulators ever made. Cemu has come a long way with Vulkan, but it took a while to get there, with the only viable API for a long time being OpenGL. Switch emulation is still pretty early, and so far OpenGL is what you want to be playing on. When you go to the DS and 3DS, there is really only OpenGL. Go back further and, although it doesn't matter much since those systems are so old, many emulators are superior when you use OpenGL.

I do agree that Vulkan is the way to go and should be the future for emulators, but so far it has not been used as the base API. I've experienced this, and having to wait longer than a year for a Vulkan backend to release, and then wait again for it to actually be playable, is not a fun time.

AMD's decision to abandon OpenGL in its poor state is the biggest reason why I recently switched over to NVIDIA.

(I did get a Ryzen 5800x though, their CPUs are really where they shine now)
 

the botanist

Member
Jun 18, 2018
19
I see where you're coming from. But then the only platform where bad OpenGL performance still matters is Switch emulation. Yes, DS and 3DS emulators may only support OpenGL, but I'd argue it doesn't really matter due to the comparably low hardware requirements.

And if (Switch) emulation on OpenGL/AMD really matters to you, there's still the option to run either yuzu or Ryujinx on Linux with Mesa drivers.
 

TheMadTitan

Member
Oct 27, 2017
27,246
To be fair, consoles have been using image reconstruction techniques for years that produce very good looking results with huge performance savings, without the need for AI. Not just checkerboarding, but also stuff like Insomniac's temporal reconstruction method that games like Spider-Man/Miles Morales/Demon's Souls use. That method produces almost indistinguishable results between native 4K and reconstructing from 1440p.

Another reason why I'm not too concerned, even though my next card will probably be an RTX card. Developers have plenty of ways to upscale/faux render that look good, so the need for DLSS isn't necessarily there. If Insomniac can get that kind of performance out of a PS5, Microsoft can do the same thing on Windows, and AMD can do the same on Windows and Linux.

I'm having trouble recalling the last time I ran OpenGL. Is that common for current emulators?

Now I'm curious about HairWorks.
It was mainly an issue for RPCS3 and Cemu these days, but both emulators support Vulkan now.

Unless you're trying to get 70+ fps in BoTW, the AMD/Nvidia conversation in regards to emulators and OpenGL isn't really necessary. Maybe for Yuzu and Ryujinx, but I think they support Vulkan too.

edit: Yup, Yuzu supports Vulkan. Looks like Ryujinx only supports OpenGL.
 

ShinUltramanJ

Member
Oct 27, 2017
12,950
Correct me if I'm wrong, but Nvidia and AMD have to throw money at these publishers to get them supporting their specific features, right? Developers don't just decide to throw their weight behind AMD or Nvidia; there are deals being struck, right?

While AMD doesn't have a DLSS answer at the moment, I wonder how things will go once they do? Will there be a battle to get publishers supporting one over the other? Will Microsoft and Sony being in AMD's corner have any effect on future DLSS support?

Just wondering.
 

Spoit

Member
Oct 28, 2017
3,989
Correct me if I'm wrong, but Nvidia and AMD have to throw money at these publishers to get them supporting their specific features, right? Developers don't just decide to throw their weight behind AMD or Nvidia; there are deals being struck, right?

While AMD doesn't have a DLSS answer at the moment, I wonder how things will go once they do? Will there be a battle to get publishers supporting one over the other? Will Microsoft and Sony being in AMD's corner have any effect on future DLSS support?

Just wondering.
If AMD's solution is actually good, I imagine it'd be used widely on both next gen consoles, and thus would be pretty widely available
 

the botanist

Member
Jun 18, 2018
19
Yeah, that's why they are aiming for a generic cross-platform solution. As long as it works everywhere and helps a game perform/look significantly better with minimal additional development effort, publishers and developers will be all ears.

Does anyone know how DLSS (or any other proprietary game tech) licensing works? Do studios implement it just because it makes their game better or is Nvidia pulling some strings in the background? Maybe something in between?
 

TheMadTitan

Member
Oct 27, 2017
27,246
Yeah, that's why they are aiming for a generic cross-platform solution. As long as it works everywhere and helps a game perform/look significantly better with minimal additional development effort, publishers and developers will be all ears.

Does anyone know how DLSS (or any other proprietary game tech) licensing works? Do studios implement it just because it makes their game better or is Nvidia pulling some strings in the background? Maybe something in between?
Of the ones that have it, I'm sure it's Nvidia kicking some money over as a marketing deal. We won't know about games that adopt it just because it's useful for a while yet. And by that time, Microsoft and AMD will likely have finalized their own stuff that developers will also leverage.
 

LCGeek

Member
Oct 28, 2017
5,857
To be fair, consoles have been using image reconstruction techniques for years that produce very good looking results with huge performance savings, without the need for AI. Not just checkerboarding, but also stuff like Insomniac's temporal reconstruction method that games like Spider-Man/Miles Morales/Demon's Souls use. That method produces almost indistinguishable results between native 4K and reconstructing from 1440p.

It's not as good, and it doesn't give you an insane fps boost, which matters in an RT situation.

I have no doubt that if the performance gains were good, Miles Morales could do 1440p with RT and upscaling at high fps; it can't. This is where DLSS really helps a developer.
 

inner-G

Banned
Oct 27, 2017
14,473
PNW
Does anyone know how DLSS (or any other proprietary game tech) licensing works? Do studios implement it just because it makes their game better or is Nvidia pulling some strings in the background? Maybe something in between?
I heard it was recently added to Unreal Engine 4 so hopefully more games will use it soon.

I know I am more likely to pay more $ for a newer game with RTX/DLSS instead of waiting for a super sale or bundle.
 

Jhey Cyphre

Member
Oct 25, 2017
3,089
I only wish serious effort were put into getting DLSS into some older games. It truly is amazing. Its utility is amazing as well... you can leverage it for whatever you prioritize.
 

Haint

Banned
Oct 14, 2018
1,361
If DLSS runs on dedicated hardware and no longer needs per-title training, why is there no universal driver-level toggle that works with everything (games, videos/media, desktop), à la the Shield TV? Why are there no DLSS hooks that can tie into PC media players and perform ridiculous upscales on videos, à la all the Nvidia-specific decoders/encoders/hardware accelerators?
 
Last edited: