Aztechnology

Community Resettler
Avenger
Oct 25, 2017
14,145
When is AMD having their show/announcements? This all-quiet-on-the-front approach makes me a bit worried.
 

Tovarisc

Member
Oct 25, 2017
24,440
FIN
I think people have a bit too high expectations for DLSS 2.0; for now its adoption rate is very low. For it to be a significant player for NV, the adoption rate has to ramp up a lot.

When is AMD having their show/announcements? This all-quiet-on-the-front approach makes me a bit worried.

Have there even been rumors about an AMD event for RDNA 2? No way they have one before the 3000 series starts shipping; at least, I think the window is too small now, or they'd have to start the lead-up by the end of this week.
 

Xyber

Member
Oct 27, 2017
1,298
I think people have a bit too high expectations for DLSS 2.0; for now its adoption rate is very low. For it to be a significant player for NV, the adoption rate has to ramp up a lot.

You'd think every big studio would love to add it; it's basically free performance with very few downsides.

Since 2.0 launched, interest in adding it to games seems to have increased, and I hope that trend continues. It's the most game-changing feature we've gotten in the GPU space in a long time.
 

TC McQueen

Member
Oct 27, 2017
2,592
Have there even been rumors about an AMD event for RDNA 2? No way they have one before the 3000 series starts shipping; at least, I think the window is too small now, or they'd have to start the lead-up by the end of this week.
They might be doing a combined event for RDNA 2 and Zen 3 (which is coming out in October), so an announcement could come any time now.
You'd think every big studio would love to add it; it's basically free performance with very few downsides.
I've heard Nvidia is hard to work with and many devs don't want to lock themselves into a proprietary solution.

That said, I heard that Nvidia was looking to turn DLSS into a driver-level function that used TAA pipelines, so you might get your wish down the line.
 

MrKlaw

Member
Oct 25, 2017
33,073
Without this, RDNA2 is dead in the water, as far as PC adoption goes.


There are literally only a small handful of games that use DLSS right now. And they need an RTX card, which almost nobody owns (compared to the entire gaming PC market).

DLSS is amazing but has tiny adoption. If RDNA 2 has no DLSS but competitive rasterisation performance and a good price, it'll do just fine. And this place will still shit on it because it doesn't have DLSS.
 

TSM

Member
Oct 27, 2017
5,823
There are literally only a small handful of games that use DLSS right now. And they need an RTX card, which almost nobody owns (compared to the entire gaming PC market).

DLSS is amazing but has tiny adoption. If RDNA 2 has no DLSS but competitive rasterisation performance and a good price, it'll do just fine. And this place will still shit on it because it doesn't have DLSS.

Did you see the Steam survey thread? AMD is pretty much a non-factor on PC. If having comparable performance were good enough, there would be a much larger presence in the top 20.
 

eonden

Member
Oct 25, 2017
17,091
There are literally only a small handful of games that use DLSS right now. And they need an RTX card, which almost nobody owns (compared to the entire gaming PC market).

DLSS is amazing but has tiny adoption. If RDNA 2 has no DLSS but competitive rasterisation performance and a good price, it'll do just fine. And this place will still shit on it because it doesn't have DLSS.
If you look at those analytics, you'll also see that cheaper cards predominate... and even then Nvidia wins. In the end, if people are shown two similarly performing cards with a small difference in price, one AMD and the other Nvidia, they will seemingly choose Nvidia because of brand recognition and trust (which AMD has barely started to recover).

There is also the problem of the higher-end cards. That is where extra functionality such as well-implemented ray tracing (or DLSS) matters even more, as you are asking people to accept something seemingly a bit worse at exactly what they would buy the card for (extra shinies in ray tracing, or better performance at "4K"). DLSS only needs to be present in a small subset of the high-profile games of the year to create the sensation that you are losing out by not getting an Nvidia card (as Nvidia will have a big advantage in the games where you will want to use those tricks).

Did you see the Steam survey thread? AMD is pretty much a non-factor on PC. If having comparable performance were good enough, there would be a much larger presence in the top 20.
Yeah, AMD has started to catch up on drivers and user experience (with their competitor to the GeForce control panel), but the new extra shinies Nvidia added are going to be even harder to compete with. Nvidia just has so much brand power in GPUs that AMD needs a consistent advantage for years, similar to what happened in CPUs... but that is way harder against Nvidia (and even more so as GPUs move into a field where AMD has no experience and Nvidia dominates even more, such as GPU ML).
 

Polyh3dron

Prophet of Regret
Banned
Oct 25, 2017
9,860
There are literally only a small handful of games that use DLSS right now. And they need an RTX card, which almost nobody owns (compared to the entire gaming PC market).

DLSS is amazing but has tiny adoption. If RDNA 2 has no DLSS but competitive rasterisation performance and a good price, it'll do just fine. And this place will still shit on it because it doesn't have DLSS.
A small handful that is soon to include Cyberpunk 2077, along with most other marquee PC titles moving forward.

Almost nobody owns RTX cards because the bang-to-buck ratio wasn't great for Pascal owners. This new gen changes that. The 3000 series is going to sell like hotcakes, and it offers DLSS, while the competitor does not. Any PC gamer looking to buy a new GPU will care about DLSS once they understand what it is.

If AMD's new GPUs can't compete with DLSS, they will be dead in the water, because the Nvidia GPUs, if similarly priced, offer a huge advantage in the increasing number of notable titles that support it.

This is not about what the entire Steam user base owns in terms of GPUs, it is about which new GPUs will sell.
 

Darkstorne

Member
Oct 26, 2017
6,828
England
DLSS 2.0 still looks like ass compared to hard downsampling though :(
Well... yeah? I don't understand your point of comparison. Downsampling to 4K would look absolutely amazing, but what sort of hardware can do that reliably at 60fps? =P

Almost nobody owns RTX cards because the bang-to-buck ratio wasn't great for Pascal owners. This new gen changes that. The 3000 series is going to sell like hotcakes, and it offers DLSS, while the competitor does not. Any PC gamer looking to buy a new GPU will care about DLSS once they understand what it is.
I completely agree. It's not so much that the feature set is or isn't desirable; it's that the price and price/performance ratio were a big turn-off for many of us. The vast majority of PC gamers run 1080p displays, which Pascal was still absolutely crushing. Going into next gen, with landmark titles like Cyberpunk and a fantastic price/performance ratio on the new GPUs, I think a lot more people are about to fall in love with DLSS.
 

Oticon

Member
Oct 30, 2017
1,446
DLSS is the new GPU-accelerated PhysX. Until they can implement it in a way that every game can use it without doing it on a game-by-game basis, I don't care. I don't make purchasing decisions based on features that are vendor-locked. If AMD can be competitive and have good price/performance, they get my money.
 

disparate

Banned
Oct 25, 2017
7,904
If you want DLSS to look better, downsample from a DSR resolution to your native one while also using DLSS.

As an example: 1080p screen. 4K DSR. 4K DLSS -> 1080p downsample.
I'm not entirely sure I need it in the picture at all. Granted, I'd also rather spend $3,900 on an R5 than AI-upscale, then downscale back down, an EOS 6D photo. Maybe I've spent too much time pixel-peeping photography that was AI-upscaled to compensate for the vertical resolution lost to Dual ISO to appreciate, or even like, the results.

I guess in the above scenario the value is: "my GPU is old as shit, DLSS will let me run games better", but I'm not spending $1000+ on a new GPU to AI-upscale my picture.
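For anyone trying to picture that chain, here's roughly how the resolutions work out. A minimal sketch, assuming DLSS Quality renders at ~66.7% of the output resolution per axis (the commonly cited ratio; treat the exact internal resolution as an assumption):

```python
# Sketch of the DSR + DLSS chain from the quoted post.
# Assumption: DLSS Quality renders at ~2/3 of the output resolution per axis.

native = (1920, 1080)        # physical 1080p screen
dsr_target = (3840, 2160)    # 4K DSR resolution the game "sees"
dlss_scale = 2 / 3           # DLSS Quality ratio (assumed)

# 1) DLSS renders internally below the DSR target...
internal = (round(dsr_target[0] * dlss_scale),
            round(dsr_target[1] * dlss_scale))
# 2) ...reconstructs up to the 4K DSR target...
# 3) ...and DSR downsamples that back to the native screen.

print(f"internal render: {internal[0]}x{internal[1]}")      # 2560x1440
print(f"DLSS output:     {dsr_target[0]}x{dsr_target[1]}")  # 3840x2160
print(f"shown on screen: {native[0]}x{native[1]}")          # 1920x1080
```

So you pay roughly the cost of a 1440p render, get a 4K reconstruction out of DLSS, and DSR averages that down to the 1080p panel; that averaging step is where the extra image quality comes from.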
 
Oct 27, 2017
4,928
A small handful that is soon to include Cyberpunk 2077, along with most other marquee PC titles moving forward.

Almost nobody owns RTX cards because the bang-to-buck ratio wasn't great for Pascal owners. This new gen changes that. The 3000 series is going to sell like hotcakes, and it offers DLSS, while the competitor does not. Any PC gamer looking to buy a new GPU will care about DLSS once they understand what it is.

If AMD's new GPUs can't compete with DLSS, they will be dead in the water, because the Nvidia GPUs, if similarly priced, offer a huge advantage in the increasing number of notable titles that support it.

This is not about what the entire Steam user base owns in terms of GPUs, it is about which new GPUs will sell.
The cheapest 3000 series card is still $500; that's like twice what most PC gamers have historically seemed comfortable spending on a GPU, and now there's a global recession on top of that. Yes, the performance upgrade may be worth it from a $/fps standpoint, but it's like comparing a Corvette to a Miata. They appeal to customers with completely different budgets.

If you took the 50 most popular PC games of 2020, I'm guessing almost all of them could run at High/1080p/60fps on a $200 GPU. I think the big thing holding developers back from making more demanding games is that we're still firmly in the PS4/X1 generation and not enough people are upgrading their displays to go above 1080/60. Maybe in a couple of years that will all change.
 

Buggy Loop

Member
Oct 27, 2017
1,232
A small handful that is soon to include Cyberpunk 2077, along with most other marquee PC titles moving forward.

Almost nobody owns RTX cards because the bang-to-buck ratio wasn't great for Pascal owners. This new gen changes that. The 3000 series is going to sell like hotcakes, and it offers DLSS, while the competitor does not. Any PC gamer looking to buy a new GPU will care about DLSS once they understand what it is.

If AMD's new GPUs can't compete with DLSS, they will be dead in the water, because the Nvidia GPUs, if similarly priced, offer a huge advantage in the increasing number of notable titles that support it.

This is not about what the entire Steam user base owns in terms of GPUs, it is about which new GPUs will sell.

I can't imagine buying a new card this year for Cyberpunk and not caring about DLSS. Why would I not want this witchcraft to run high FPS with RT?

AMD is always late to the party feature-wise, and it seems, going by the rumours, that their RT and resolution scaling are again lagging behind.
 

Polyh3dron

Prophet of Regret
Banned
Oct 25, 2017
9,860
The cheapest 3000 series card is still $500; that's like twice what most PC gamers have historically seemed comfortable spending on a GPU, and now there's a global recession on top of that. Yes, the performance upgrade may be worth it from a $/fps standpoint, but it's like comparing a Corvette to a Miata. They appeal to customers with completely different budgets.

If you took the 50 most popular PC games of 2020, I'm guessing almost all of them could run at High/1080p/60fps on a $200 GPU. I think the big thing holding developers back from making more demanding games is that we're still firmly in the PS4/X1 generation and not enough people are upgrading their displays to go above 1080/60. Maybe in a couple of years that will all change.
So therefore AMD's high-end Big Navi, without DLSS, has a chance against Nvidia's entire range of GPUs that support DLSS?
 

Pargon

Member
Oct 27, 2017
12,030
DLSS is amazing but has tiny adoption. If RDNA 2 has no DLSS but competitive rasterisation performance and a good price, it'll do just fine. And this place will still shit on it because it doesn't have DLSS.
If they don't have anything comparable to DLSS they need more than "competitive rasterization performance" to compete - they need much higher rasterization performance to be competitive because NVIDIA can produce comparable/superior image quality while rendering half the resolution.


[Embedded video comparison: Source]

Look at how much worse the aliasing (flickering) is in Death Stranding when rendering Native 4K + TAA compared to 4K DLSS Quality (1440p).
On top of that, it runs worse.

DLSS 2.0 still looks like ass compared to hard downsampling though :(
Of course downsampling from 2880p to 1440p should produce better image quality than DLSS.
But DLSS exists because high-end games are too demanding to be rendered at native resolution on 4K displays - or even 1440p ultrawides - let alone be downsampled from even higher resolutions like 5K.

Downsampling is not a cure-all for aliasing either - especially if your target is only 1440p.
If you look at an older game with really bad aliasing, like Alien: Isolation, injecting TAA at 1440p does a much better job for aliasing than downsampling from 2880p to 1440p.
The best results are achieved with a combination of downsampling and the improved anti-aliasing.

In a similar vein, you would surely have better results by combining DLSS with downsampling, rather than downsampling without it.
Instead of rendering native 5K with TAA and downsampling to 1440p, it would surely be better to use DLSS at 5K (1920p) and downsample that to 1440p.
You'd have better temporal stability, and better performance.
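To put rough numbers on that, here's a quick pixel-count comparison of the scenarios above. Caveat: the internal resolutions assume the commonly cited ~66.7%-per-axis DLSS Quality ratio, and frame times won't scale perfectly linearly with pixel count:

```python
# Pixel-count comparison for the scenarios in this post.
# Assumption: DLSS Quality renders at ~2/3 of output resolution per axis.

native_4k = 3840 * 2160   # native 4K + TAA
dlss_4k   = 2560 * 1440   # 4K DLSS Quality (1440p internal)
native_5k = 5120 * 2880   # native 5K + TAA, downsampled to 1440p
dlss_5k   = 3413 * 1920   # 5K DLSS Quality (~1920p internal)

print(f"4K DLSS vs native 4K: {dlss_4k / native_4k:.0%} of the pixels")  # 44%
print(f"5K DLSS vs native 5K: {dlss_5k / native_5k:.0%} of the pixels")  # 44%
print(f"5K DLSS vs native 4K: {dlss_5k / native_4k:.0%} of the pixels")  # 79%
```

In other words, DLSS at 5K shades about 79% of the pixels of a native 4K render while sampling a 5K grid, which is why downsampling that output to 1440p should beat native 5K + TAA on both performance and stability.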
 

PLASTICA-MAN

Member
Oct 26, 2017
23,638
Will AMD hold a conference this year, or won't the new GPUs release this year? The status of everything is mysterious right now. The only confirmed RDNA2 GPUs releasing this year are the ones in the next-gen consoles, so far.
 

Caz

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
13,055
Canada
Will AMD hold a conference this year, or won't the new GPUs release this year? The status of everything is mysterious right now. The only confirmed RDNA2 GPUs releasing this year are the ones in the next-gen consoles, so far.
They literally confirmed their Big Navi GPUs are coming before the Navi SoCs powering the next-gen home consoles.

At this stage, I'm expecting an October event from AMD where they announce their next CPU and GPU lineups.
 

Aztechnology

Community Resettler
Avenger
Oct 25, 2017
14,145
AMD not overhyping a new GPU is probably for the best
Not hyping, but releasing some info, or attractive pricing info. They're going to have to compete on price. The pure performance alone from Nvidia is frightening, but DLSS performance is a game changer, especially with 2.0; before that it was still niche. But the titles coming, plus their new machine learning on top of that, is like beating a dead horse at this point.
 

Isee

Avenger
Oct 25, 2017
6,235
I see rasterization performance being close enough; 10% up or down isn't that big of a deal. Especially if AMD is going to offer more memory in the corresponding tier.
The deciding factor will be pure ray tracing performance, not DLSS, though. Should AMD lag behind in this category in significant ways, AMD is in big trouble. There will still be the usual "it doesn't matter, not worth it" crowd, but I think ray tracing is going to be the GPU battlefield of tomorrow.

DLSS will still be a rare sight, but it will make every Nvidia-sponsored game an automatic win for team green, both from a visual and a performance POV. In contrast, AMD-sponsored games will have this aura of disappointment for many people, because they exclude the possibility of something that is perceived to be good.
That has the potential to influence how people view Nvidia and AMD as brands.
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
If they don't have anything comparable to DLSS they need more than "competitive rasterization performance" to compete - they need much higher rasterization performance to be competitive because NVIDIA can produce comparable/superior image quality while rendering half the resolution.


[Embedded video comparison: Source]

Look at how much worse the aliasing (flickering) is in Death Stranding when rendering Native 4K + TAA compared to 4K DLSS Quality (1440p).
On top of that, it runs worse.


Of course downsampling from 2880p to 1440p should produce better image quality than DLSS.
But DLSS exists because high-end games are too demanding to be rendered at native resolution on 4K displays - or even 1440p ultrawides - let alone be downsampled from even higher resolutions like 5K.

Downsampling is not a cure-all for aliasing either - especially if your target is only 1440p.
If you look at an older game with really bad aliasing, like Alien: Isolation, injecting TAA at 1440p does a much better job for aliasing than downsampling from 2880p to 1440p.
The best results are achieved with a combination of downsampling and the improved anti-aliasing.

In a similar vein, you would surely have better results by combining DLSS with downsampling, rather than downsampling without it.
Instead of rendering native 5K with TAA and downsampling to 1440p, it would surely be better to use DLSS at 5K (1920p) and downsample that to 1440p.
You'd have better temporal stability, and better performance.

Or just go ham with the 1440p-to-8K DLSS mode then downsample. Unless that's exclusive to the 3090, I can't remember.
 

Polyh3dron

Prophet of Regret
Banned
Oct 25, 2017
9,860
No, I think they're both going to struggle to get most of the market off of the Pascal/Polaris-based cards.
I'm talking about the new graphics cards that will be sold by AMD and Nvidia. I am saying that AMD's offerings will not stand a chance against Nvidia's because of DLSS, and are basically DOA if they don't have it. You're coming back with "but Polaris and Pascal!", but those are not currently in production and are not being sold. I've repeated this, and you are not understanding what I'm saying for some reason.

Too much Brawndo, perhaps?
 

Laiza

Member
Oct 25, 2017
2,171
DLSS 2.0 still looks like ass compared to hard downsampling though :(
It's nice to want things.

In this world where we don't have literal petaflops of computing power in our machines, however, playing smart is infinitely preferable to brute force any day of the week. I'll absolutely take DLSS over any other option when computing power is as limited as it is.
 

dgrdsv

Member
Oct 25, 2017
11,886
The deciding factor will be pure ray tracing performance, not DLSS, though. Should AMD lag behind in this category in significant ways, AMD is in big trouble. There will still be the usual "it doesn't matter, not worth it" crowd, but I think ray tracing is going to be the GPU battlefield of tomorrow.
Keyword - tomorrow. Ampere and RDNA2 won't be running too many RT titles; a large number of them will come later.

DLSS will still be a rare sight
I dunno about that; it's orders of magnitude easier to integrate than RT and results in huge performance gains on everything from the 2060 to the 3090. It'll be rather hard to say no to this going forward, especially if you consider that Turing cards will pretty much need it to run RT at 30+ fps a year or so from now.
 

z0m3le

Member
Oct 25, 2017
5,418
When Cyberpunk 2077 launches and DLSS is used, people are going to look at DLSS very differently. Call of Duty also has DLSS, so it's going to be a big feature going forward. FF15 had DLSS too; it wouldn't surprise me at all if FF7R has it next spring, as well as FF16 whenever that comes out.

It's not going to stop there. All an engine needs to make it work is TAA support, which is the industry standard; DLSS plugs in there and replaces TAA. It's going to be really bad for AMD, not because they didn't bring something to compete with RTX, but because they didn't see DLSS 2.0+ happening, at least not so soon. 2.0 launched in March this year; it was a blindside punch from Nvidia. Not to mention Nvidia doubling shader core counts by turning their INT cores into combined INT/FP ALUs: leakers expected each Nvidia SM to have 64 CUDA cores, but instead it has 128. Yes, performance per flop is lower because the rest of the SM didn't double, but overall performance looks about 50% higher as a result. The RTX 3070 would have had about two thirds of its current performance if Nvidia hadn't doubled the CUDA core count; because they did, the 20 TFLOPS GPU outperforms the 14 TFLOPS 2080 Ti while consuming only 220 watts. This is again something AMD probably wasn't aware of. And to top it off, insiders expected both the 3070 and the 3080 to be priced $100 higher...
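For what it's worth, those TFLOPS figures fall straight out of the standard formula (FP32 cores x boost clock x 2 ops per fused multiply-add). A quick sanity check using the published reference specs; real boost clocks vary, so treat the results as ballpark:

```python
# Ballpark FP32 throughput: cores * boost clock (GHz) * 2 ops per FMA.
# Core counts and boost clocks are the published reference specs;
# actual boost behavior varies, so these are estimates.

def tflops(cores, boost_ghz):
    return cores * boost_ghz * 2 / 1000.0

rtx_3070   = tflops(5888, 1.725)  # Ampere: 46 SMs * 128 FP32 cores
rtx_2080ti = tflops(4352, 1.545)  # Turing: 68 SMs * 64 FP32 cores

print(f"RTX 3070:    {rtx_3070:.1f} TFLOPS")    # ~20.3
print(f"RTX 2080 Ti: {rtx_2080ti:.1f} TFLOPS")  # ~13.4
```

The raw TFLOPS gap overstates the real-world gap for exactly the reason above: the doubled FP32 units share the rest of the SM, so per-flop performance is lower.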

AMD has to beat the RTX 3070 at $499, and people have to not care about the free performance boosts from DLSS in COD and Cyberpunk this year, as well as in prior and future DLSS games; not to mention that ray tracing is probably a lot better on Nvidia's cards than on AMD's. To further put AMD's position into perspective: even if they beat the RTX 3070 at $499, Nvidia has an RTX 3070 Ti waiting to drop into the fold with 16GB of RAM, probably 15-20% faster than this card, for $599.

Next year isn't going to look much better either. More DLSS adoption (especially on the off chance a Nintendo Switch Pro offers DLSS) will make the RTX 3060 and even the RTX 3050 competitive at the $349 and $249 price points. If Nvidia wants to, they can keep scaling performance down while supporting the RTX features too: I would expect the 3060 at ~15 TFLOPS and the 3050 at ~12 TFLOPS, and they can realistically go down to a 6 TFLOPS Ampere part that supports every RTX feature at about the same performance as their 2060 Max-Q, which can run Control at 1080p on Ultra with ray tracing on while maintaining mid-to-high-60s fps via DLSS. This means an RTX 3040 and even an RTX 3030 could exist, IMO. Below that there isn't much of a point; entry-level laptops maybe, but nothing too exciting.
 

MrKlaw

Member
Oct 25, 2017
33,073
With the 'doubling' of CUDA cores: is that something that may provide additional benefits with driver and/or game engine updates? E.g., right now they may be balanced for FP/INT and might be adaptable to take advantage of more FP?
 

Nooblet

Member
Oct 25, 2017
13,637
I keep wondering about this a lot. People speak as if everyone plays the same two dozen games.
The one thing is that the games where you really do need performance are the ones most likely to implement DLSS.
Since these are the games pushing the tech, it's also likely that they will implement it.

For games that don't push the tech, well, you are getting a good framerate anyway. So it's the notion of DLSS helping you out when you really need it that adds to the perceived advantage.
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
The one thing is that the games where you really do need performance are the ones most likely to implement DLSS.
Since these are the games pushing the tech, it's also likely that they will implement it.

For games that don't push the tech, well, you are getting a good framerate anyway. So it's the notion of DLSS helping you out when you really need it that adds to the perceived advantage.

I actually do hope to see more simple games adding it. Like, I know Fortnite is probably going to market it as a way to keep performance when using ray tracing, but I expect it will also be a major boon for players using 2060 Max-Q laptops with 144-240Hz displays. This will continue to apply if Nvidia releases 3050, 3030 and 3010 GPUs and laptops get Max-Q versions. We could very well see some surprisingly low-power RTX GPUs over the next year that punch well above their weight with DLSS.
 

Isee

Avenger
Oct 25, 2017
6,235
Keyword - tomorrow. Ampere and RDNA2 won't be running too many RT titles, a large number of them will come later.

Sure, but most people seem to sit on GPUs for longer than a year. This year's RT performance will be significant for many people, for a long time. Not taking it into account would be a mistake.


Especially if you consider that Turing cards will pretty much need it to run RT at 30+ fps a year or so from now.

Considering that a €520 RTX 3070 is supposed to be equivalent to an RTX 2080 Ti, an RTX 3060 could be equivalent to a 2070S/2080, and an RTX 3050 equivalent to an RTX 2060S/2070; factoring in that most people buy €200-300 cards, I think it is unlikely that Turing will look that bad in a year, or the vast majority of the Ampere audience would have a bad experience too.
Unless we are solely talking about ultra RT settings, but then anything but a 3080 or 3090 will look bad anyway.
We are still a couple of GPU generations away from fully ray-traced games, after all. The Marbles at Night demo runs at 1440p with DLSS on a 3090 at 30 fps. While it looks awesome, I think it's an indicator of where the tech currently is and how far it needs to go.

Games will be a mix and match with adjustable options for quite some time.

In the end it comes down to how RT is going to be implemented in the future, and that will depend on how strong the lowest common denominator is: the PS5, or maybe the XSX.
Though they, of course, have the 30fps advantage.

We'll see soon enough. But I think RT will see faster and wider adoption than DLSS, just because it will be available on every tier of GPU, from both GPU brands and on both console brands.
DLSS, meanwhile, currently has a very small user base; one that will expand, but I'm not sure devs see enough value in implementing it yet. I imagine they have their hands full with a lot of new tech and other challenges.

It may sound like I don't want DLSS to succeed. Quite the opposite; I just don't think it has development priority.
 

dgrdsv

Member
Oct 25, 2017
11,886
Sure but most people seem to sit longer than a year on GPUs. This years RT performance will be significant for many people, for a long time. Not taking it into account would be a mistake.
These people usually expect their GPUs to start struggling a couple of years after launch. So for them it'll be the norm no matter the delta between AMD's and NV's respective hardware.

I think it is unlikely that Turing will look that bad in a year, or the vast majority of the Ampere audience would have a bad experience too
Ampere will be significantly faster than Turing in RT: what is some +50% on average will be +100% in RT workloads. Which means that Turing will start to struggle with 2021 RT titles at resolutions above 1080p.

In the end it comes down to how RT is going to be implemented in the future and that will depend on how strong the lowest, common denominator is: PS5, or maybe XSX.
Consoles will provide a baseline, but a) it will be fairly low, so even Turing will likely deal with it with ease, and b) not all next-gen games will have RT; I expect many to opt for no RT in favor of higher resolutions and/or framerates.

But I think RT will see faster and wider adoption than DLSS.
Nothing points to this so far. DLSS 2.0+ is too good a tech to just pass on for whatever reason.
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
Nothing points to this so far. DLSS 2.0+ is too good a tech to just pass on for whatever reason.

People seem convinced that the adoption rate of DLSS 2.0 right now can be extrapolated to make assumptions about the next few years, despite it only being introduced (as a beta!) in March. We're already seeing CoD and Fortnite using it, and isn't it also being built right into Unreal? I wouldn't be surprised if a year from now the majority of major PC games (so over 50%) will use it.
 

Isee

Avenger
Oct 25, 2017
6,235
These people usually expect their GPUs to start struggling a couple of years after launch. So for them it'll be the norm no matter the delta between AMD's and NV's respective hardware.

This norm doesn't mean that they would leave potential performance behind.
Being on a tighter budget doesn't mean they are stupid or unreasonable.
Quite the opposite: they are more likely to chase the best bang for their money.

Consoles will provide a baseline, but a) it will be fairly low, so even Turing will likely deal with it with ease, and b) not all next-gen games will have RT

That's what I said. Not sure what we are arguing about here. You said Turing will be obsolete a year from now, and I brought consoles into play because I think Turing has a good chance of staying quite competitive for longer than a year.

Ampere will be significantly faster than Turing in RT: what is some +50% on average will be +100% in RT workloads. Which means that Turing will start to struggle with 2021 RT titles at resolutions above 1080p.

100% more RT performance doesn't necessarily mean 100% more fps in RT games, though.
In the end, we need to see benchmarks. Time will tell soon enough. No sense in speculating.

You also can't have it both ways: either consoles are setting a baseline and Turing won't struggle with that, or it will be outdated sooner rather than later. There is of course no doubt that Ampere is better tier for tier; one would have to be stupid to assume anything else. But I think you are not taking the complete Ampere lineup into account. A 3060 for €350 will do fine enough on reasonable RT settings, and so will most of the Turing lineup.

Nothing points to this so far. DLSS 2.0+ is too good a tech to just pass on for whatever reason.

I mean, sure. We are both in deep speculative territory here. RT is good tech, and if its future availability on consoles and on every GPU tier doesn't convince you that it will see broad adoption ASAP, then I have no idea why DLSS being just a good tech, without a very broad user base, makes you think it will have an even faster adoption rate.
Other than you wishing for DLSS to be everywhere. Which is a reasonable and understandable wish, but hardly an argument.
 

Iron Eddie

Banned
Nov 25, 2019
9,812
A small handful that is soon to include Cyberpunk 2077, along with most other marquee PC titles moving forward.

Almost nobody owns RTX cards because the bang-to-buck ratio wasn't great for Pascal owners. This new gen changes that. The 3000 series is going to sell like hotcakes, and it offers DLSS, while the competitor does not. Any PC gamer looking to buy a new GPU will care about DLSS once they understand what it is.

If AMD's new GPUs can't compete with DLSS, they will be dead in the water, because the Nvidia GPUs, if similarly priced, offer a huge advantage in the increasing number of notable titles that support it.

This is not about what the entire Steam user base owns in terms of GPUs, it is about which new GPUs will sell.
Ray tracing is what I, and likely others looking to future-proof themselves, are focusing on. Some used to promote the 2070 Super over the 5700 XT (and probably still do) because it could do ray tracing. Well, at the time games ran like shit until DLSS 2.0 came out, and even then I didn't think the card did ray tracing any justice. The new 30 series, to me, is the game changer I was looking for. AMD has their hands full now, but $699 is still quite a premium price to me (I'm not that interested in the 3070, but rather the 3080).
 

tokkun

Member
Oct 27, 2017
5,413
People seem convinced that the adoption rate of DLSS 2.0 right now can be extrapolated to make assumptions about the next few years, despite it only being introduced (as a beta!) in March. We're already seeing CoD and Fortnite using it, and isn't it also being built right into Unreal? I wouldn't be surprised if a year from now the majority of major PC games (so over 50%) will use it.

If your definition of "major" is billion-dollar franchises like Call of Duty and Fortnite, then sure. Not only can they afford to throw whatever engineering resources they want at implementing new features, they probably had Nvidia engineers breaking down their doors offering to do it for them. I wouldn't even be surprised if Nvidia paid them to do it, given how valuable those games are as a marketing tool.

Personally I will be more interested in seeing what adoption looks like for games that are not high profile enough to get direct support from Nvidia.
 

eonden

Member
Oct 25, 2017
17,091
If your definition of "major" is billion-dollar franchises like Call of Duty and Fortnite, then sure. Not only can they afford to throw whatever engineering resources they want at implementing new features, they probably had Nvidia engineers breaking down their doors offering to do it for them. I wouldn't even be surprised if Nvidia paid them to do it, given how valuable those games are as a marketing tool.

Personally I will be more interested in seeing what adoption looks like for games that are not high profile enough to get direct support from Nvidia.
The thing is that for DLSS to be a real technical advantage, it only needs to be in a majority of those games, as they are the ones that push graphics to the max and where DLSS will have an even greater impact on the customer (as running them at native resolution might be a problem). It trickling down to less intensive games (as is currently happening) is just something that takes the technical advantage from good to great.
 

tokkun

Member
Oct 27, 2017
5,413
The thing is that for DLSS to be a real technical advantage, it only needs to be in a majority of those games, as they are the ones that push graphics to the max and where DLSS will have an even greater impact on the customer (as running them at native resolution might be a problem). It trickling down to less intensive games (as is currently happening) is just something that takes the technical advantage from good to great.

There are graphically intensive games that aren't also mega-budget AAA productions. VR is a good example, where the high resolution and framerate requirements mean you need powerful hardware, but the audience is smaller, so the games are lower-budget. It was intriguing to see Nvidia announce DLSS support for VR, but I am curious whether we'll see much adoption unless Valve or Facebook are able to build it in at a system level like they do with reprojection.
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
If your definition of "major" is billion-dollar franchises like Call of Duty and Fortnite, then sure. Not only can they afford to throw whatever engineering resources they want at implementing new features, they probably had Nvidia engineers breaking down their doors offering to do it for them. I wouldn't even be surprised if Nvidia paid them to do it, given how valuable those games are as a marketing tool.

Personally I will be more interested in seeing what adoption looks like for games that are not high profile enough to get direct support from Nvidia.

I think you're greatly overestimating the difficulty of adding DLSS to a game. As has been said before, most of these games use TAA, and if you already have that, then adding DLSS is apparently very simple; there's no more need to train the AI on your specific game or anything.
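To illustrate why TAA support is the key prerequisite: DLSS 2.0 consumes essentially the same per-frame inputs a TAA resolve already has on hand. The sketch below is illustrative pseudocode only; every name in it is invented (this is not the actual NGX/DLSS SDK API), it just shows the shape of the integration:

```python
# Illustrative pseudocode; all names are invented, NOT the real NGX/DLSS API.
# Point being: a TAA-ready engine already produces every input DLSS needs.

def render_frame(engine, upscaler):
    # A TAA pipeline already jitters the camera sub-pixel each frame...
    jitter = engine.next_jitter_offset()
    engine.set_projection_jitter(jitter)

    # ...and already renders color, depth, and per-pixel motion vectors
    # at the (lower) internal resolution.
    color, depth, motion = engine.render_at_internal_resolution()

    # Swap the TAA resolve for the upscaler: same inputs plus history in,
    # a full-resolution frame out.
    output = upscaler.evaluate(
        color=color,
        depth=depth,
        motion_vectors=motion,
        jitter=jitter,
        history=upscaler.history,  # previously accumulated frame
    )
    upscaler.history = output
    return output
```

That's also why the per-game training requirement went away: the 2.0 network is generalized, so the engine-side work mostly reduces to wiring up buffers the engine already produces.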