
Duxxy3

Member
Oct 27, 2017
21,929
USA
There are quite a few reasons for me to avoid AMD cards right now. Not having DLSS is definitely near the top of the list. Ray tracing performance is significantly behind Nvidia's. Pricing really isn't that good. And then I have to remember how many issues AMD has had with their drivers in the past. It seems OK now, but it's a pretty long history. Nvidia is the safer bet, and they're not gouging people with the 3000 series like they did with the 2000 series. That said, if AMD dropped the price considerably I'd have to consider it.
 

OldDirtyGamer

Member
Apr 14, 2019
2,485
Cyberpunk is the first time I really used DLSS, and yeah, without it I'd have to play the game at 30fps or (maybe) at low settings. Now I can play at default High at 60fps with DLSS. I have a 2060, btw, so yeah... I'm with DLSS.
 

TheZynster

Member
Oct 26, 2017
13,291
It's literally allowed me to hold onto my 2070 a lot longer without using ray tracing, keeping my frames up and my resolution at native.

It's fucking black magic.

Thank god for it, since it's literally impossible to find a new graphics card at the moment, and the games I wanted to play support it.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,650
how many issues AMD has had with their drivers in the past.
I don't know how valid this is, I've had a ton of issues with Nvidia drivers, most recently a known bug since April that affects performance on Steam VR and some crashes on Gears of War 5.

I've owned a 1060 -- 2080 -- 3080 so I've only ever had Nvidia on the GPU side.
 

Bonezz

Prophet of Truth
Member
Oct 25, 2017
597
Pennsylvania
I would say yes. DLSS is the future and if AMD doesn't have anything to compete with it there's no reality where I ever consider purchasing their product.
 

Truant

Member
Oct 28, 2017
6,774
Let me just put it this way: The PS5 Pro is gonna have hardware support for AI supersampling.
 

Duxxy3

Member
Oct 27, 2017
21,929
USA
I don't know how valid this is, I've had a ton of issues with Nvidia drivers, most recently a known bug since April that affects performance on Steam VR and some crashes on Gears of War 5.

I've owned a 1060 -- 2080 -- 3080 so I've only ever had Nvidia on the GPU side.

Reviewers wouldn't recommend any Navi series card from September of last year until roughly March of this year. That's not something that should be ignored.
 

BreakAtmo

Member
Nov 12, 2017
12,962
Australia
It's supported by a small list of games (still more than RTX though) and AMD is working on their own solution, so... just wait and see. It's not like you'd be able to buy a GPU at a reasonable price right now anyway.

People keep saying this, but that "small list" continues to grow exponentially, and we've gone from it only being in a couple of big-ish games and some indies in March to being in Fortnite, the new CoD, and Cyberpunk in December. They've now updated it to work with VR and dynamic resolution, and it's been added to Unreal Engine as something devs can just enable. It's plainly obvious that DLSS adoption is going to keep growing, and anyone buying a card now to use for the next 3 or 4 years needs to take that into account.

Also, AMD is very, very unlikely to be able to match DLSS 2.0 - it uses tensor cores that the RX 6000 series simply doesn't have. Nvidia tried DLSS on shader cores - it was called DLSS 1.9, and it couldn't touch 2.0 on any level. People call DLSS 2.0 black magic, but it's really just a smart use of machine-learning hardware that normally goes unused in games. AMD matching it on shader cores would be actual black magic.
 

bill crystals

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
1,079
It feels like once they get some kind of DLSS implementation working on the consoles it'll be a total revolution. The masses of console gamers are simply not going to be able to tell the difference, and devs will be able to get away with really pushing the DLSS in some instances, therefore freeing up resources for all kinds of other crazy shit like ray tracing. I think it's going to be HUGE when it hits the mainstream.
 
Apr 4, 2018
4,554
Vancouver, BC
If you want to play Cyberpunk with Ray Tracing, then Nvidia is the only option right now. Even the 3070 gets pulverized by it.

As someone who just had both a 3070 and an RX 6800 and tested both quite a bit, though, I will say that in my experience the 6800 will still run much better in the vast majority of your games, but won't be as good in ray-traced titles. DLSS is an amazing feature, and it will be a big benefit in the small handful of titles that use it.

On the other hand, I don't think DLSS will be a huge win for Nvidia in the long term. It sounds like both AMD and Microsoft are working hard on their own DLSS-type solutions, so one year from now I expect AMD to be catching up on that end, and possibly exceeding Nvidia in compatibility due to the open-source nature of AMD's Super Resolution feature.
 

dadoes

Member
Feb 15, 2018
462
All the people who say DLSS is not important must already own an AMD GPU and be trying to justify their purchase.

LOL

Cognitive dissonance at its best.
 

medyej

Member
Oct 26, 2017
6,492
DLSS is amazing and absolutely the killer GPU feature of the moment, just as G-Sync was 5 years ago.
 

Freshmaker

Member
Oct 28, 2017
3,946
All the people who say DLSS is not important must already own an AMD GPU and be trying to justify their purchase.

LOL

Cognitive dissonance at its best.
Own a 3080 and two 2080's. It can be nice, but it also introduces its own set of issues, and actual support is still super limited.
 

Pipyakas

Member
Jul 20, 2018
549
Yes. The entire software stack for Nvidia GPUs is more enticing than Radeon's, unless you use a Hackintosh or Linux. DLSS is just one of the gaming parts; NVENC, RTX Broadcast, etc. are all dark magic that Radeon GPUs can't match.
 

leng jai

Member
Nov 2, 2017
15,131
As usual once it comes to consoles it's going to be hyped up way more and be the best thing since sliced bread.

Own a 3080 and two 2080's. It can be nice, but it also introduces its own set of issues, and actual support is still super limited.

Of course there are issues; it's obviously not magic that gives you a massive performance boost for free. From what I've seen, the massive gain in performance is worth the IQ hit, and in some cases it even looks better than native in games that have atrocious AA. Support hopefully gets much better going forward, but every game that uses RT seems to have it, and those are the ones that need it most.

I see no reason to downplay it at all, and I certainly wouldn't spend a ton of cash on something that doesn't support it. DLSS 2.0 has certainly boosted the usability of RT on my 2070 Super; without it I wouldn't be able to get decent performance in Cyberpunk or Control at all.
 

BreakAtmo

Member
Nov 12, 2017
12,962
Australia
It feels like once they get some kind of DLSS implementation working on the consoles it'll be a total revolution. The masses of console gamers are simply not going to be able to tell the difference, and devs will be able to get away with really pushing the DLSS in some instances, therefore freeing up resources for all kinds of other crazy shit like ray tracing. I think it's going to be HUGE when it hits the mainstream.

Actual DLSS in the Switch 2 would be especially insane. Right now the Switch in its handheld form has similar power to a 360 and usually targets a standard resolution of 720p to match its screen. Imagine a Switch 2 that went to a 1080p screen and raw graphics power similar to a PS4, but also had DLSS and didn't need to increase its resolution target (720p or DLSS Quality Mode). Combined with a much better CPU than the PS4 Jaguar (not difficult), it could visually match the PS4 on the average game while doubling the framerate. Then you consider the benefits of going down to 540p (Performance Mode) or even 360p (Ultra Performance Mode, though this may be pushing it) and suddenly next-gen games would be doable with less performance loss if the CPU, SSD and RAM can hold up.

I could see it happening if the Switch 2 chip was like a 2060 but ARM/2 gens newer/massively shrunk from 12nm to 3nm/had its RT cores replaced with CPU cores.
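The resolution math above is easy to sanity-check. A small sketch (not from the thread; the per-axis scale factors are the commonly cited ones for DLSS 2.0's quality modes, and the Switch 2 itself is of course hypothetical):

```python
# Internal render resolutions DLSS would use for a 1080p output target,
# using the commonly cited per-axis scale factor of each quality mode.
DLSS_SCALE = {
    "Quality": 2 / 3,            # 1920x1080 -> 1280x720
    "Performance": 1 / 2,        # -> 960x540
    "Ultra Performance": 1 / 3,  # -> 640x360
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution DLSS actually renders at before upscaling to the target."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(1920, 1080, mode)
    print(f"{mode}: renders at {w}x{h}")
```

So a hypothetical 1080p handheld in Quality mode would render internally at 720p, the same pixel count the current Switch already targets.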
 

dadoes

Member
Feb 15, 2018
462
DLSS means you can get an RTX 3060 and get the equivalent performance of an AMD 6800XT for a fraction of the cost, if the game you run supports DLSS.

Yes, currently not many games support DLSS, but going forward I expect most games that aren't AMD-sponsored to support it, as Nvidia makes it easier and easier to integrate DLSS into games.
 

PapaGoob

Member
Oct 27, 2017
190
North Carolina
Based on the responses in this thread, I'll go ahead and raise my hand and say I'm the "fool" who sold his 3080 when I managed to snag a 6800XT.

I've always done Intel/Nvidia builds and wanted to do a complete AMD build this time around, since I also snagged a 5800X.

My answer is: it depends on what games you play, how important DXR is to you, and to a lesser extent how sensitive you are to artifacts.

I'm pretty sensitive to artifacts, and DLSS in motion looks like crap to me. I much prefer native resolution. The performance boost is nice, though.

I only think the performance boost is worth it if you have DXR enabled, or if you're on a low-tier RTX GPU and need the extra boost to get playable frame rates.

I personally don't care for DXR yet. It looks great, but after the first 30 minutes of appreciating a game's graphics I start not to care as I get engrossed in the gameplay instead.

Once GPU tech gets good enough to use DXR without a performance hit, that's when I'll jump on the wagon.

The other reason is just what games you play. I only play a handful of triple-A titles a year, and those are the majority of games that will have DXR and DLSS. I mainly play WoW and FIFA with the occasional triple-A title thrown in, so my gaming habits don't really require DLSS.

That's my two cents.
 

dadoes

Member
Feb 15, 2018
462
Based on the responses in this thread, I'll go ahead and raise my hand and say I'm the "fool" who sold his 3080 when I managed to snag a 6800XT.

I've always done Intel/Nvidia builds and wanted to do a complete AMD build this time around, since I also snagged a 5800X.

My answer is: it depends on what games you play, how important DXR is to you, and to a lesser extent how sensitive you are to artifacts.

I'm pretty sensitive to artifacts, and DLSS in motion looks like crap to me. I much prefer native resolution. The performance boost is nice, though.

I only think the performance boost is worth it if you have DXR enabled, or if you're on a low-tier RTX GPU and need the extra boost to get playable frame rates.

I personally don't care for DXR yet. It looks great, but after the first 30 minutes of appreciating a game's graphics I start not to care as I get engrossed in the gameplay instead.

Once GPU tech gets good enough to use DXR without a performance hit, that's when I'll jump on the wagon.

The other reason is just what games you play. I only play a handful of triple-A titles a year, and those are the majority of games that will have DXR and DLSS. I mainly play WoW and FIFA with the occasional triple-A title thrown in, so my gaming habits don't really require DLSS.

That's my two cents.

I don't get it. It's not like you're forced to use DLSS. If you already have a 3080, you can continue to use it without DLSS.

And if Nvidia comes out with DLSS 3.0 and improves it, you can enable it.

With the 6800XT you're basically stuck unless AMD comes out with their own implementation.
 

Mass One

Member
Oct 28, 2017
1,227
I don't get it. It's not like you're forced to use DLSS. If you already have a 3080, you can continue to use it without DLSS.

And if Nvidia comes out with DLSS 3.0 and improves it, you can enable it.

With the 6800XT you're basically stuck unless AMD comes out with their own implementation.
Whatever happened to RIS? I feel like I saw one Hardware Unboxed video and then it vanished from public discussion.
 

scabobbs

Member
Oct 28, 2017
2,109
Honestly, I can't believe AMD priced these cards as high as they did when they don't have DLSS or the baseline RT performance Nvidia offers. I can't recommend AMD to anyone interested in RT even a little bit. It's really not a competition at this point.
 
Oct 28, 2017
4,970
Honestly, I can't believe AMD priced these cards as high as they did when they don't have DLSS or the baseline RT performance Nvidia offers. I can't recommend AMD to anyone interested in RT even a little bit. It's really not a competition at this point.

That's the real problem with the high-end AMD cards right now. They're not undercutting the competition by enough, and they're nowhere to be found. So there's basically no reason NOT to get the RTX 3080 when it's comparable in performance and has much better features and RT performance.
 

Skyebaron

Banned
Oct 28, 2017
4,416
If AMD had come in with a price tag $150 less than Nvidia's on their 6000 cards, I would've thought about going red. Right now DLSS is the real deal, but it's still not in many games. I don't care much for ray tracing because of its enormous framerate cost.

I just hope DLSS becomes a driver-level setting someday.
 

Plax

Member
Nov 23, 2019
2,826
Just bought a 6800XT, and then I see this thread. Oh boy...

Hopefully AMD brings their version out soon.
 

leburn98

Member
Nov 1, 2017
1,637
At the end of the day it all comes down to support and the games you play. For me personally, by the time DLSS is widely available in the majority of games I play, I will likely be in the market for a new GPU anyway.
 

OmegaDL50

One Winged Slayer
Member
Oct 25, 2017
9,743
Philadelphia, PA
Reading comprehension is tough.

My reading comprehension is perfectly fine, thanks very much.

You asked me what my point was and I gave it to you. If you're going to turn this into snarky ad hominem, you'll only be wasting your time.

DLSS is proprietary Nvidia tech, which means it will only appear in games with Nvidia sponsorship, so wider saturation of similar technology across more games is only a matter of time, nothing more. Eventually there might be a platform-agnostic version, although whether it would also be integrated into existing games that currently only support DLSS is debatable.

There's also the fact that, at least with DLSS 2.1 and beyond, in theory any game that supports TAA should also be DLSS-capable, provided it's enabled on the developer side. Although it remains to be seen whether even Nvidia would be willing to let non-Nvidia-sponsored titles enable the feature without some caveat attached.
 

Babadook

self-requested ban
Banned
Nov 11, 2017
192
I'm hoping ML upscaling arrives on RDNA 2/3 soon. We know it's going to be available on some AMD products, but it's still under wraps.
Let me just put it this way: The PS5 Pro is gonna have hardware support for AI supersampling.
Most likely both the PS5 and XSX will use it well before then.
 
OP

Grifter
Member
Oct 26, 2017
2,596
The nvidia control panel is way better than the crap AMD has. For that reason alone I would go with nvidia.
In what ways? Last I looked (years, admittedly), it hadn't evolved in design and feel from its Windows '95 styling.

AMD now has a (relatively) usable panel that integrates an in-game UI for on-the-fly tweaks, they've added nifty options like anti-lag and image sharpening, and opened those options up to years-old cards.
 

OmegaDL50

One Winged Slayer
Member
Oct 25, 2017
9,743
Philadelphia, PA
I'm hoping ML upscaling arrives on RDNA 2/3 soon. We know it's going to be available on some AMD products, but it's still under wraps.

Any solution that enables AI supersampling, with the benefit of near-native (or better) upscaling and the associated performance gains, is a win for everyone involved.

A pure software solution that works for everyone would be nice, but I think for the time being it might require dedicated hardware on the GPU to offset any potential performance cost. Although a one-size-fits-all solution would most certainly be welcome.
 

Babadook

self-requested ban
Banned
Nov 11, 2017
192
Any solution that enables AI supersampling, with the benefit of near-native (or better) upscaling and the associated performance gains, is a win for everyone involved.

A pure software solution that works for everyone would be nice, but I think for the time being it might require dedicated hardware on the GPU to offset any potential performance cost. Although a one-size-fits-all solution would most certainly be welcome.
I think all current AMD hardware can compute 4-bit ints at very high throughput. That's likely to be pretty competitive for ML.
 

Vex

Member
Oct 25, 2017
22,213
If Cyberpunk is any indication of the quality of DLSS going forward, it's severely underwhelming, IMO. It seems to degrade your image quality too much to be worth it. I'd rather just disable the RTX features, have a sharper image, and run the game at higher framerates, to be honest. I hear Death Stranding did it better, but that game didn't have any ray-tracing features. Control did it well, but that environment wasn't open; it was a corridor shooter.

Not a very promising outlook.

It's nice to have, BUT I've wound up changing my settings to turn DLSS off in Cyberpunk because the quality loss is more noticeable than advertised, IMO.

Even on "quality" settings.
This.
 

dadoes

Member
Feb 15, 2018
462
If Cyberpunk is any indication of the quality of DLSS going forward, it's severely underwhelming, IMO. It seems to degrade your image quality too much to be worth it. I'd rather just disable the RTX features, have a sharper image, and run the game at higher framerates, to be honest. I hear Death Stranding did it better, but that game didn't have any ray-tracing features. Control did it well, but that environment wasn't open; it was a corridor shooter.

Not a very promising outlook.


This.

I personally think Cyberpunk's DLSS, along with Nvidia sharpening, looks great on my LG 65" C9. At least with an Nvidia RTX card you get a choice of enabling or disabling it; there's no such option on an AMD card.

And it's not like Nvidia isn't going to improve upon it. DLSS 2.0 is already a massive improvement over DLSS 1.0. There's no reason to think they won't improve the image quality with 3.0.

Given a choice between a 3080 and a 6800XT, it's a no-brainer. Even if you don't like DLSS, I'd rather have the choice of enabling or disabling it as I see fit.
 

MechaJackie

Banned
Oct 25, 2017
2,032
Brazil
It's the defining thing for me in wanting to buy an RTX-series card; AMD cards are not that interesting to me if they won't offer something similar. I can see DLSS giving whatever card I end up buying a much longer life, simply making its price-to-performance overwhelmingly better over time for me. And the titles that already support it are nearly all games I'm very interested in playing; hopefully I'll be upgrading my machine by mid next year to get around to playing Control and Death Stranding.
 

mordecaii83

Avenger
Oct 28, 2017
6,878
Control and DS, nope, at least for me they look like blurry messes at 1080p
Is your monitor native 1080p? Digital Foundry and others have shown DLSS is actually sharper than native in Control and DS, so I don't know why you'd claim they're a "blurry mess". DLSS has other issues at times, like occasional artifacts, but (especially in those two games) blurriness is not one of them.
 

dadoes

Member
Feb 15, 2018
462
My reading comprehension is perfectly fine, thanks very much.

You asked me what my point was and I gave it to you. If you're going to turn this into snarky ad hominem, you'll only be wasting your time.

DLSS is proprietary Nvidia tech, which means it will only appear in games with Nvidia sponsorship, so wider saturation of similar technology across more games is only a matter of time, nothing more. Eventually there might be a platform-agnostic version, although whether it would also be integrated into existing games that currently only support DLSS is debatable.

There's also the fact that, at least with DLSS 2.1 and beyond, in theory any game that supports TAA should also be DLSS-capable, provided it's enabled on the developer side. Although it remains to be seen whether even Nvidia would be willing to let non-Nvidia-sponsored titles enable the feature without some caveat attached.

Why wouldn't Nvidia allow non-sponsored titles to use DLSS? As it is, it's already getting integrated into Unreal Engine. It's in Nvidia's best interest to have DLSS supported in as many games as possible, as it's a differentiator that will let them sell more RTX cards.

The only games I don't think will have DLSS are AMD-sponsored games, and even then, I think that would be just for a limited time; after that time is up, DLSS will get added if Nvidia makes it easy to integrate into games.
 

Geinrendour

Member
Jun 3, 2018
362
Overall it's a neat technology. It will offer better performance, but because of its novelty people overreact to it way too much. Besides, its quality varies on a game-by-game basis: I think Death Stranding has the best results, but people aren't convinced by the quality in Cyberpunk, for example.

In the end it all depends on adoption, and betting on the future is flipping a coin. For now, the only games where I think it's a major thing are Control and Cyberpunk (where, quality aside, it's the only thing that allows the game to be played), and really only if you're going to play at 4K. In sum, only Nvidia-sponsored/marketed games have it, so although it's cool, much remains to be seen. Also, it's only really important if you're going to play with ray tracing in those high-res scenarios.

I game on a laptop, so DLSS doesn't do it for me, even if it were applied to a wide range of games, which is far from the case.
 

OmegaDL50

One Winged Slayer
Member
Oct 25, 2017
9,743
Philadelphia, PA
Why wouldn't Nvidia allow non-sponsored titles to use DLSS? As it is, it's already getting integrated into Unreal Engine. It's in Nvidia's best interest to have DLSS supported in as many games as possible, as it's a differentiator that will let them sell more RTX cards.

That's a fair point, but I'm just reminded of the divide between G-Sync and FreeSync. Nvidia got a benefit for every G-Sync display sold, but there was no guarantee that a G-Sync-capable display also allowed FreeSync: basically Nvidia trying to lock down the adaptive-sync market.

Still, the difference in this case is that, in order to take advantage of DLSS, you'd need to already be in the Nvidia ecosystem, owning an RTX 2000- or 3000-series card. I just want the option to support whatever AMD's solution ends up being, as well as Nvidia's.

Nothing is healthier for the consumer than a competitive market that continues to drive newer and better technology.
 

dadoes

Member
Feb 15, 2018
462
That's a fair point, but I'm just reminded of the divide between G-Sync and FreeSync. Nvidia got a benefit for every G-Sync display sold, but there was no guarantee that a G-Sync-capable display also allowed FreeSync: basically Nvidia trying to lock down the adaptive-sync market.

Still, the difference in this case is that, in order to take advantage of DLSS, you'd need to already be in the Nvidia ecosystem, owning an RTX 2000- or 3000-series card. I just want the option to support whatever AMD's solution ends up being, as well as Nvidia's.

Nothing is healthier for the consumer than a competitive market that continues to drive newer and better technology.

I'm all for an open-source DLSS solution, but I have no problem with Nvidia's proprietary solution at the moment. Nvidia is not a charity; they're a business whose goal is to make money.

Nvidia spent millions of dollars in R&D to come up with DLSS; they should be able to profit from it. In the end, their goal is to get you to buy their products, and if they can do it by providing a better product than their competitors, and not via some underhanded scheme, then more power to them.

I agree, competition is great. If AMD comes up with a better solution, then Nvidia will have to respond. The consumer wins in the end.
 

jakershaker

Member
Oct 28, 2017
203
What does DLSS have to do with modding?

You can't train DLSS 1.0 on a modded game; it's going to be trained on vanilla. 2.0 is better, though modders probably won't have the extra time, or perhaps even the ability, to code for DLSS 2.0. The truth is we don't know, but so far it's mostly been in the biggest releases. Someone brought up G-Sync versus FreeSync; in my mind, until DLSS goes platform-agnostic it's not worth thinking about except in a very limited sense.

Because you can combine smart solutions with better hardware and get an even better result. We're spending roughly 4x the GPU performance solely on going from 1080p to 4K; if you don't have to do that, that's performance that can be used elsewhere.

And perhaps that's the way forward. It hasn't been so far, though; it's always the same cycle: new games and features when the new consoles hit, and a year or two in, PC cards are strong enough to brute-force everything for the next couple of years. Of course the performance can always be used elsewhere, but will it be? It feels like a solution to a made-up problem.
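The "roughly 4x" figure above is just the pixel-count ratio between the two resolutions; a quick illustrative check (real GPU cost only scales approximately linearly with pixel count):

```python
# 4K (3840x2160) pushes exactly four times the pixels of 1080p
# (1920x1080), which is where the "roughly 4x the GPU performance"
# figure comes from, assuming cost scales with pixels rendered.
def pixels(width: int, height: int) -> int:
    return width * height

ratio = pixels(3840, 2160) / pixels(1920, 1080)
print(ratio)  # 4.0
```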
 

dimb

Member
Oct 25, 2017
737
In the end it all depends on adoption, and betting on the future is flipping a coin. For now, the only games where I think it's a major thing are Control and Cyberpunk (where, quality aside, it's the only thing that allows the game to be played), and really only if you're going to play at 4K. In sum, only Nvidia-sponsored/marketed games have it, so although it's cool, much remains to be seen. Also, it's only really important if you're going to play with ray tracing in those high-res scenarios.

I game on a laptop, so DLSS doesn't do it for me, even if it were applied to a wide range of games, which is far from the case.
DLSS is not exclusive to 4K; it can be used to reduce performance overhead at a variety of resolutions targeting all sorts of framerate goals, and there are potentially situations where it proves a better anti-aliasing solution than the alternatives.
 

OmegaDL50

One Winged Slayer
Member
Oct 25, 2017
9,743
Philadelphia, PA
You can't train DLSS 1.0 on a modded game; it's going to be trained on vanilla. 2.0 is better, though modders probably won't have the extra time, or perhaps even the ability, to code for DLSS 2.0. The truth is we don't know, but so far it's mostly been in the biggest releases. Someone brought up G-Sync versus FreeSync; in my mind, until DLSS goes platform-agnostic it's not worth thinking about except in a very limited sense.



And perhaps that's the way forward. It hasn't been so far, though; it's always the same cycle: new games and features when the new consoles hit, and a year or two in, PC cards are strong enough to brute-force everything for the next couple of years. Of course the performance can always be used elsewhere, but will it be? It feels like a solution to a made-up problem.

I mean, the RTX 3080 was basically advertised as a 4K card, able to hit 60fps in most titles at that resolution. However, I personally don't see the point of 4K at this point in time, which is why I've been targeting 1440p. Even in Cyberpunk 2077, which is the most demanding game I own and the visual/performance benchmark for future PS5 / Series X third-party ports to PC, I use Ultra settings but turn Volumetric Resolution and Clouds down to Low; according to Digital Foundry, the visual difference between Low and High volumetrics is negligible, but the performance difference is about 10fps.

I don't think it's a solution to a made-up problem. There are clear net benefits to using DLSS, even without ray tracing enabled. If you can't get above 30fps at native 4K with RTX on, but DLSS Quality rendering at 1440p gets you a roughly-4K image at 60fps+ with minimal loss in visual fidelity, that's a net benefit.

Of course newer GPUs will be able to brute-force everything; that has always been the case. But I see DLSS as a means to extend the life of your existing hardware even further before you need to replace it just to stay competitive with the latest games. I'll probably stick with my RTX 3080 until at least the RTX 5000 series comes out; the going rate for new GPU generations seems to be every 2 years. If I can get 4 years out of my RTX 3080 before my next PC build in 2025, give or take, I think that's more than sufficient.
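For reference, "DLSS Quality rendering at 1440p giving roughly a 4K image" works out because Quality mode's commonly cited ~2/3 per-axis scale maps a 4K output target to exactly 2560x1440 internally. A sketch under that assumption:

```python
# DLSS Quality uses a ~2/3 per-axis scale, so a 3840x2160 (4K) output
# target renders internally at 2560x1440 -- only 4/9 (~44%) of the
# pixels, which is where the performance headroom comes from.
SCALE = 2 / 3  # commonly cited per-axis scale for DLSS Quality mode

out_w, out_h = 3840, 2160
in_w, in_h = round(out_w * SCALE), round(out_h * SCALE)
pixel_fraction = (in_w * in_h) / (out_w * out_h)

print(f"internal resolution: {in_w}x{in_h}")          # 2560x1440
print(f"fraction of 4K pixels: {pixel_fraction:.3f}")  # 0.444
```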
 

Geinrendour

Member
Jun 3, 2018
362
DLSS is not exclusive to 4K; it can be used to reduce performance overhead at a variety of resolutions targeting all sorts of framerate goals, and there are potentially situations where it proves a better anti-aliasing solution than the alternatives.

I never said it was 4K-exclusive. It's just not worth it at all for laptop gamers, given the resolutions we play at, because the returns are poor. As for potentially being a better AA: being generous, it's a game-by-game scenario, applied to a pool of supported games that, as of now, is insignificant quantity-wise.

Although I guess it's worth it if you want to spend (waste, IMO) money on a laptop to play at 4K?
 

Thretau

Member
Oct 31, 2017
43
It's very nice, but the list of supported games is just way too small. 99% of the time I don't benefit from it.