Don't fixate on the word "efficiency"... That just means they over-specced the power supply.
It has nothing to do with inefficiency.
For all we know, that could be where its peak efficiency is.
When is AMD having their show/announcements? This all-quiet-on-the-front approach makes me a bit worried.
> I think people have a bit too high expectations of DLSS 2.0; for now, its adoption rate is very low. For it to be a significant factor for NV, the adoption rate has to ramp up a lot.

I keep wondering about this a lot. People speak as if everyone plays the same two dozen games.
> Has there even been rumors about an AMD event for RDNA 2? No way they have one before the 3000 series starts shipping; I think the time window is too small now, or they'd have to start the lead-up by the end of this week.

They might be doing a combined event for RDNA 2 and Zen 3 (which is coming out in October), so they might announce their event any time now.
> You'd think every big studio would love to add it; it's basically free performance with very few downsides.

I've heard Nvidia is hard to work with, and many devs don't want to lock themselves into a proprietary solution.
> There are literally a small handful of games that use DLSS right now. And they need an RTX card, which almost nobody owns (compared to the entire gaming PC market).
> DLSS is amazing but has tiny adoption. If RDNA has no DLSS but competitive rasterisation performance and a good price, it'll do just fine. And this place will still shit on it because it doesn't have DLSS.

Without this, RDNA2 is dead in the water as far as PC adoption goes.
DLSS 2.0 still looks like ass compared to hard downsampling though :(
> There are literally a small handful of games that use DLSS right now.

If you also look at those analytics, you'll see that cheaper cards predominate... and even then Nvidia wins. In the end, if people are shown two similarly performing cards with a small difference in price, one AMD and one Nvidia, they will seemingly choose Nvidia because of brand recognition and trust (which AMD has barely started to recover).
> Did you see the Steam survey thread? AMD is pretty much a non-factor on PC. If having comparable performance was good enough, there would be a much larger presence in the top 20.

Yeah, AMD has started to catch up on drivers and experience (with AMD's competitor to the GeForce panel), but the extra new shinies Nvidia added are going to be even harder to compete with. Nvidia just has too much brand power in GPUs; AMD needs a consistent advantage for years, similar to what happened in CPUs... but that is way harder against Nvidia (and even more so as GPUs move into a field where AMD has no experience and Nvidia dominates even more, such as GPU ML).
> DLSS 2.0 still looks like ass compared to hard downsampling though :(

Well... yeah? I don't understand your point of comparison. Downsampling to 4K would look absolutely amazing, but what sort of hardware can do that reliably at 60fps? =P
> Almost nobody owns RTX cards because the bang-to-buck ratio wasn't great for Pascal owners.

I completely agree. It's not so much that the feature set is or isn't desirable; it's that the price and price/performance ratio was a big turn-off to many of us. The vast majority of PC gamers run 1080p displays, at which Pascal was still absolutely crushing it. Going into next gen, with landmark titles like Cyberpunk and a fantastic price/performance ratio on new GPUs, I think a lot more people are about to fall in love with DLSS.
I'm downsampling from 5K to 3440×1440.
If you want DLSS to look better, downsample from a DSR resolution to your native resolution while also using DLSS.
> If you want DLSS to look better, downsample from a DSR resolution to your native resolution while also using DLSS.

I'm not entirely sure I need it in the picture at all. Granted, I'd also rather spend $3,900 on an R5 than AI-upscale and then downscale an EOS 6D photo. Maybe I've spent too much time pixel-peeping AI-upscaled photography (to compensate for the loss of vertical resolution from using Dual ISO) to appreciate or even like the results.
As an example: 1080p screen → 4K DSR → 4K DLSS → downsample to 1080p.
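For anyone trying to follow the resolution juggling in that chain, here's a minimal Python sketch of the math. The 2/3 per-axis factor for DLSS Quality matches the "4K DLSS Quality (1440p)" figure cited later in the thread; the Balanced and Performance factors are commonly cited values I'm assuming, not something this thread confirms.

```python
# Sketch of the "DSR + DLSS" chain: 1080p screen -> 4K DSR -> 4K DLSS -> 1080p downsample.
from fractions import Fraction

# Per-axis render-scale factors. Only "quality" (2/3) is backed by the
# resolutions quoted in this thread; the other two are assumptions.
DLSS_MODES = {
    "quality": Fraction(2, 3),
    "balanced": Fraction(58, 100),
    "performance": Fraction(1, 2),
}

def dsr_plus_dlss(native=(1920, 1080), dsr_factor=2, mode="quality"):
    """Return (internal render res, DLSS output res, final display res)."""
    # DSR multiplies each axis: 2x per axis turns 1080p into 4K.
    output = (native[0] * dsr_factor, native[1] * dsr_factor)
    scale = DLSS_MODES[mode]
    # DLSS shades a lower internal resolution and reconstructs to the output;
    # the driver then downsamples the output back to the native resolution.
    internal = (int(output[0] * scale), int(output[1] * scale))
    return internal, output, native

print(dsr_plus_dlss())
# ((2560, 1440), (3840, 2160), (1920, 1080))
```

So the net effect of the suggestion: you shade 2560×1440 pixels, but your 1080p panel gets a frame filtered down from 4K worth of samples.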
> Small handful of games that is soon to include Cyberpunk 2077, along with most other marquee PC titles moving forward.

The cheapest 3000-series card is still $500; that's roughly twice what most PC gamers have historically seemed comfortable spending on a GPU, and now there's a global recession on top of that. Yes, the performance upgrade may be worth it from a $/fps standpoint, but it's like comparing a Corvette to a Miata. They appeal to customers with completely different budgets.
> There are literally a small handful of games that use DLSS right now.

Small handful of games that is soon to include Cyberpunk 2077, along with most other marquee PC titles moving forward.
Almost nobody owns RTX cards because the bang-to-buck ratio wasn't great for Pascal owners. This new gen changes that. The 3000 series is going to sell like hotcakes, and it offers DLSS while the competitor does not. Any PC gamer looking to buy a new GPU will care about DLSS once they understand what it is.
If AMD's new GPUs can't compete with DLSS, they will be dead in the water, because similarly priced Nvidia GPUs offer a huge advantage in the increasing number of notable titles that support it.
This is not about what the entire Steam user base owns in terms of GPUs; it is about which new GPUs will sell.
> The cheapest 3000 series card is still $500, that's like twice what most PC gamers seem comfortable with spending historically on a GPU.
> If you took the 50 most popular PC games of 2020, I'm guessing almost all of them could run at High/1080p/60fps on a $200 GPU. I think the big thing holding back developers from making more demanding games is that we're still firmly in the PS4/X1 generation, and not enough people are upgrading their displays to go above 1080/60. Maybe in a couple of years that will all change.

So therefore AMD's high-end Big Navi without DLSS has a chance against Nvidia's entire range of GPUs that support DLSS?
> When is AMD having their show/announcements?

AMD not overhyping a new GPU is probably for the best.
> So therefore AMD's high-end Big Navi without DLSS has a chance against Nvidia's entire range of GPUs that support DLSS?

No, I think they're both going to struggle to get most of the market off of the Pascal/Polaris-based cards.
> Will AMD hold a conference this year, or won't the new GPUs release this year? The status of things is mysterious right now. The only confirmed RDNA2 GPUs releasing this year are in the next-gen consoles so far.

They literally confirmed their Big Navi GPUs are coming before the Navi SoCs powering the next-gen home consoles.
At this stage, I'm expecting an October event from AMD where they announce their next CPU and GPU lineups.
Not hyping, but releasing some info, or attractive pricing details. They're going to have to compete on price. The pure performance alone from Nvidia is frightening, but DLSS performance is a game changer, especially with 2.0; before that, it was still niche. And with the titles coming and their new machine learning on top, it's like beating a dead horse.
> Yeah, everyone is saying that AMD wants to announce their Zen 3 CPUs and RDNA2 GPUs together.

It seems perfectly reasonable.
> DLSS is amazing but has tiny adoption. If RDNA has no DLSS but competitive rasterisation performance and a good price, it'll do just fine.

If they don't have anything comparable to DLSS, they need more than "competitive rasterization performance" to compete; they need much higher rasterization performance, because NVIDIA can produce comparable or superior image quality while rendering at half the resolution.
Source
Look at how much worse the aliasing (flickering) is in Death Stranding when rendering native 4K + TAA compared to 4K DLSS Quality (1440p internal).
On top of that, it runs worse.

> DLSS 2.0 still looks like ass compared to hard downsampling though :(

Of course downsampling from 2880p to 1440p should produce better image quality than DLSS.
But DLSS exists because high-end games are too demanding to be rendered at native resolution on 4K displays, or even 1440p ultrawides, let alone downsampled from even higher resolutions like 5K.
Downsampling is not a cure-all for aliasing either, especially if your target is only 1440p.
If you look at an older game with really bad aliasing, like Alien: Isolation, injecting TAA at 1440p does a much better job on aliasing than downsampling from 2880p to 1440p.
The best results are achieved with a combination of downsampling and improved anti-aliasing.
In a similar vein, you would surely get better results by combining DLSS with downsampling rather than downsampling without it.
Instead of rendering native 5K with TAA and downsampling to 1440p, it would surely be better to run DLSS at 5K (1920p internal) and downsample that to 1440p.
You'd have better temporal stability and better performance.
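To put rough numbers on that last suggestion, here's a minimal sketch (Python/NumPy). It assumes "5K" means 5120×2880 and reuses the 2/3 per-axis DLSS Quality factor implied by the resolutions above (5K output → 1920p internal); the box filter is just the simplest form of "hard" downsampling, not what any particular driver actually does.

```python
import numpy as np

def box_downsample_2x(img: np.ndarray) -> np.ndarray:
    """Naive 2x2 box-filter downsample, e.g. 5120x2880 -> 2560x1440."""
    h, w = img.shape[0] // 2, img.shape[1] // 2
    return img[: h * 2, : w * 2].reshape(h, 2, w, 2, -1).mean(axis=(1, 3))

# Shaded-pixel cost of the two pipelines feeding a 1440p display:
native_5k = 5120 * 2880                          # render every pixel, then downsample
dlss_at_5k = (5120 * 2 // 3) * (2880 * 2 // 3)   # DLSS Quality internal res (~3413x1920)

print(f"native 5K + TAA shades {native_5k / dlss_at_5k:.2f}x more pixels")
# -> native 5K + TAA shades 2.25x more pixels

frame = np.random.rand(2880, 5120, 3)    # stand-in for a rendered 5K frame
print(box_downsample_2x(frame).shape)    # (1440, 2560, 3)
```

Same display pixels in the end, but less than half the shading work, which is the whole performance argument.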
> No, I think they're both going to struggle to get most of the market off of the Pascal/Polaris-based cards.

I'm talking about the new graphics cards that will be sold by AMD and Nvidia. I am saying that AMD's offerings will not stand a chance against Nvidia's because of DLSS, and are basically DOA if they don't have it. You're coming back with "but Polaris and Pascal!", but those are not currently in production and are not being sold. I've repeated this, and you are not understanding what I'm saying for some reason.
> DLSS 2.0 still looks like ass compared to hard downsampling though :(

It's nice to want things.
> The deciding factor will be pure ray tracing performance and not DLSS, though. Should AMD lag behind in this category in significant ways, AMD is in big trouble. There will still be the usual "it doesn't matter, not worth it" crowd, but I think ray tracing is going to be the GPU battlefield of tomorrow.

Keyword: tomorrow. Ampere and RDNA2 won't be running too many RT titles; a large number of them will come later.
I dunno about that; DLSS is orders of magnitude easier to integrate than RT and results in huge performance gains on everything from a 2060 to a 3090. It'll be rather hard to say no to going forward, especially considering that Turing cards will pretty much need it to run RT at 30+ fps a year or so from now.
> I keep wondering about this a lot. People speak as if everyone plays the same two dozen games.

The one thing is that the games where you really do need performance are the ones most likely to implement DLSS.
Since these are the games pushing the tech, it's also likely that they will implement it.
For games that don't push the tech, well, you're getting a good framerate anyway. So it's the notion of DLSS helping you out when you really need it that adds to the perceived advantage.
> Below this there isn't that much of a point; entry-level laptop maybe, but nothing too exciting here.

Maybe they can revisit x86 Tegra? Make CPUs with iGPUs and Tensor cores? Integrated graphics that can upscale 540p to 1080p would be an interesting proposition.
> Sure, but most people seem to sit on GPUs for longer than a year. This year's RT performance will be significant for many people, for a long time. Not taking it into account would be a mistake.

These people usually expect their GPUs to start struggling a couple of years after launch. So for them it'll be the norm no matter the delta between AMD's and NV's respective hardware.
> I think it is unlikely that Turing will look that bad in a year, or the vast majority of the Ampere audience would have a bad experience too.

Ampere will be significantly faster than Turing in RT: what is some +50% on average will be +100% in RT workloads. That means Turing will start to struggle with 2021 RT titles at resolutions above 1080p.
> In the end it comes down to how RT is going to be implemented in the future, and that will depend on how strong the lowest common denominator is: PS5, or maybe XSX.

Consoles will provide a baseline, but a) it will be fairly low, so even Turing will likely deal with it with ease, and b) not all next-gen games will have RT; I expect many to opt for no RT in favor of higher resolutions and/or framerates.
Nothing points to this so far. DLSS 2.0+ is too good a tech to just pass on for whatever reason.
> Small handful of games that is soon to include Cyberpunk 2077, along with most other marquee PC titles moving forward.

Raytracing is what I (and likely others looking to future-proof) am focusing on. Some used to promote the 2070 Super over the 5700 XT (and probably still do) because it could do raytracing. Well, at the time games ran like shit until DLSS 2.0 came out, and I still didn't think the card did raytracing justice. The new 30 series, to me, is the game changer I was looking for. AMD has their hands full now, but $699 is still quite a premium price to me (I'm not that interested in the 3070, but rather the 3080).
People seem convinced that the adoption rate of DLSS 2.0 right now can be extrapolated to make assumptions about the next few years, despite it only being introduced (as a beta!) in March. We're already seeing CoD and Fortnite using it, and isn't it also being built right into Unreal? I wouldn't be surprised if, a year from now, the majority of major PC games (so over 50%) use it.
If your definition of "major" is billion-dollar franchises like Call of Duty and Fortnite, then sure. Not only can they afford to throw whatever engineering resources they want at implementing new features, they probably had Nvidia engineers breaking down their doors offering to do it for them. I wouldn't even be surprised if Nvidia paid them to do it, given how valuable those games are as a marketing tool.
Personally, I will be more interested in seeing what adoption looks like for games that are not high-profile enough to get direct support from Nvidia.

> If your definition of "major" is billion-dollar franchises like Call of Duty and Fortnite, then sure.

The thing is that for DLSS to be a good technical advantage, it only needs to be in a majority of those games, as they are the ones that push graphics to the max and where DLSS will have an even greater impact on the customer (since running them at native resolution might be a problem). It trickling down to less-intensive games (as is currently happening) is just what takes the technical advantage from good to great.