> Can AMD offer me DLSS-type technology to let me play games in 8K but with the performance of 1080p?

Yes.
> Well, they're not going to make many non-PC-exclusive titles that won't run on consoles. Most of the most demanding AAA games will probably run at 1440p 30 FPS on the next-gen consoles by mid-to-late gen, and if so, these GPUs should be able to run them at 4K 30 or 4K 60 with DLSS, and 1440p 60+, etc., at the very least.
>
> Now yeah, if you want to run them not at console settings but on Ultra instead, you'd probably need a GPU that comes out toward the middle of the generation, but that's always been the case.
>
> A 700-series card still runs Red Dead Redemption 2 at PS4 settings, but you won't be running it at Ultra until you get to the 1000 series.

People seem to have the idea stuck in their head that every game needs to be maxed out at 4K and run over 60 FPS to be considered a 4K card. Even today that's not possible with the 3080. Linus showed that Crysis Remastered runs like crap with everything cranked up at 4K.
> …these GPUs should be able to run them at 4K 30 or 4K 60 with DLSS, and 1440p 60+, etc., at the very least.

The games won't be GPU-bound for doubling the frame rate, but they might be CPU-bound, as I don't think PCs will have CPUs with double the single-thread performance of the PS5/XSX.
Navi 22 is 40 CUs, a Navi 10 replacement.
Navi 23 is presumably 20 CUs.
I'll wait to see what they have to offer, but Nvidia has always had better GPUs, and this is coming from someone who has owned mostly AMD GPUs.
I'd be shocked if the 3080's VRAM becomes an actual issue. Performance is where AMD needs to compete.
> And that's an excellent illustration of why, too. Oftentimes "Ultra" is just "we are throwing everything at this with minimal or no optimization; good luck to you!"
>
> With the Crysis remaster it's literally turning off the LOD system entirely, something utterly unoptimized by its very nature. You literally cannot discern all of the extra detail of the best LOD model from even a few meters away from the camera.
>
> A lot of Ultra features offer only minimal improvements for a LOT of performance trade-off. High is usually where it's at. Of course there are exceptions, but this is usually what happens.

Yeah, I don't care about Ultra settings anymore; I stopped caring about them years ago. I usually do High and sometimes Medium, since I value framerate over graphics. Heck, for me it's Framerate > Resolution > Graphics.
> This is objectively not true. There have been several generations where AMD/ATI had better raw performance, better power efficiency, or both. They haven't been on top of the performance charts since the 290X, but that doesn't change the facts.

You are probably right, but this was always my impression, probably because Nvidia products were always more expensive.
> You are probably right, but this was always my impression, probably because Nvidia products were always more expensive.

Intel does the same thing with its CPUs, despite them only being better in (some) games, and only by a few percent at that, while losing soundly in all other categories.
> Even with the knowledge that AMD has planned to undercut Ampere via pricing (whatever the performance ends up being, all evidence shows they will be more power-efficient GPUs), $499 seems absurdly too good to be true if this rumor reflects actual performance.

$499 isn't the top card, so I wonder what they'll price that one at. $600, maybe?
> $499 isn't the top card, so I wonder what they'll price that one at. $600, maybe?

Pretty sure the 6900 XT is the top-end card, aka Navi 21.
> AMD's new Radeon RX 6900 XT should be powered by the Navi 21 GPU with 16GB of GDDR6 memory on a 256-bit memory bus, and could cost $499.

I am skeptical of that being the price, in any case, especially if they're using HBM2. That's pretty expensive compared to GDDR6, last I recall.
> Pretty sure the 6900 XT is the top-end card, aka Navi 21.

The last rumor about this said $499 is the cut-down Navi 21; I don't know how TweakTown came to the conclusion it was the 6900 XT. It seems to not be the case.
> I see a lot of people on the Internet (not necessarily in this thread) cheer on AMD and hope they bring out incredible new products, and then immediately follow that up with "so it will force Nvidia to lower prices; I would never buy an AMD card personally" or some variation of that thought. Oftentimes when pushed they will cite bad drivers as the reason why.
>
> It seems to me that AMD has a severe image problem that they need to overcome with regard to their software. Even if they were somehow able to deliver a compelling card this time, it appears to me that a lot of people would not give an actual purchase a second glance. Do you think that is mostly hot air, and people would buy an AMD GPU if it were to somehow best Nvidia on price/performance?

Maybe? It depends on how the drivers work out this time and whether supply issues remain for Ampere come October/November. The 5700 XT is arguably the best AMD launch since the RX 400/500 series, and even it took a few months to work out the driver issues (I haven't had any issues with mine, but there are plenty of stories of people suffering black screens or having to switch from HDMI to DisplayPort and vice versa because one or the other was having compatibility problems). To say their GPU division has an image problem is an understatement, but the same could have been said for their CPUs during the Bulldozer era. Personally, I'm going for whichever one gives me the best bang for the buck at the $1,000 CAD price point (I don't care for ray tracing, and I'm assuming AMD will have their own DLSS equivalent). Assuming this performance is reflective of Navi 21, the main issue I can see for consumers is that the quality partner cards (Sapphire, PowerColor) might not be out until 2021.
> The last rumor about this said $499 is the cut-down Navi 21; I don't know how TweakTown came to the conclusion it was the 6900 XT. It seems to not be the case.

Do you have a link for the latter, i.e. cut-down vs. the full Navi 21?
> Maybe? It depends on how the drivers work out this time and whether supply issues remain for Ampere come October/November. […]

I will look for it, but rogame (the person the article is based on) even says he's not sure which variants those VRAM makeups are for.
I know DLSS is a huge initial advantage, but I'm not confident counting AMD out yet, especially considering how rushed the Ampere launch was. Waiting isn't going to cost me anything. There will be other machine-learning temporal AA solutions in the future; Nvidia isn't the only one with a huge stake in that venture. AMD's cards tend to age pretty well, with the 9-year-old 7970 still capable of running Doom Eternal at 1080p/60 FPS. Strong Linux/Vulkan performance is a nice plus. And calling the match before the fight has even started is super boring and makes for boring threads.
But if AMD can't beat the 3080 at a lower price then that's probably game over.
> I see a lot of people on the Internet (not necessarily in this thread) cheer on AMD and hope they bring out incredible new products, and then immediately follow that up with "so it will force Nvidia to lower prices; I would never buy an AMD card personally" or some variation of that thought. Oftentimes when pushed they will cite bad drivers as the reason why. […]

AMD is going to have to pull a Ryzen with RDNA2: be competitive and a decent amount cheaper this gen. If they are only competitive and similar in price, no one will switch. Well, not many.
> Doesn't seem entirely correct according to the other rumours I've heard.

Can you specify?
Edit: Why would "Navi 22" have less VRAM than "Navi 21"? Shouldn't that be the other way around?
> Wouldn't that be a bit too low to compete against the RTX 3070?

Depends on how high its clocks will be, really.
> Is there any chance RDNA2 will be similar to Ampere in regards to the "2x" shader-core count thing?

No. The RDNA architecture doesn't have such an option.
I've now got confirmation for both:

> Navi21 16GB VRAM
> Navi22 12GB VRAM

I have no idea if these are full-die or cut-down SKUs.
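For what it's worth, those capacities are consistent with simple GDDR6 bus-width arithmetic. A sketch, assuming standard 2GB (16Gb) GDDR6 chips and a hypothetical 192-bit bus for Navi 22; neither the chip density nor Navi 22's bus width is confirmed by the leak:

```python
# Each GDDR6 chip exposes a 32-bit interface, so chip count = bus width / 32.
# The 2GB-per-chip density is an assumption (16Gb parts are common, not confirmed).
def vram_gb(bus_width_bits: int, gb_per_chip: int = 2) -> int:
    chips = bus_width_bits // 32  # one chip per 32-bit channel
    return chips * gb_per_chip

print(vram_gb(256))  # 16 -- matches the rumored 256-bit Navi 21 config
print(vram_gb(192))  # 12 -- would fit Navi 22 if it uses a 192-bit bus
```

So a 12GB Navi 22 next to a 16GB Navi 21 would simply mean a narrower bus on the smaller die, not a contradiction.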
Is there any chance RDNA2 will be similar to Ampere with regard to the "2x" shader-core count thing? I still don't know why Ampere is like that, but I don't recall anything like that showing up in rumors leading up to the reveal.
Which it is lagging behind on based on these specs, even: GDDR6 has a lower clock speed, and the memory bus is much narrower. As has been explained many times, it'll be a while before 10GB of VRAM capacity becomes an issue, much less so if you're rendering at less than 4K.
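The clock-speed and bus-width gap translates directly into peak bandwidth. A back-of-the-envelope sketch; the per-pin data rates here are assumptions (roughly 16 Gbps GDDR6 for Navi 21, 19 Gbps GDDR6X on the 3080), since the leak only gives the bus width:

```python
# Peak theoretical bandwidth: (bus width in bits / 8 bits per byte) * per-pin rate in Gbps.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256, 16.0))  # 512.0 GB/s -- assumed Navi 21 config
print(bandwidth_gbs(320, 19.0))  # 760.0 GB/s -- RTX 3080
```

Under those assumptions the rumored card would have roughly two thirds of the 3080's raw memory bandwidth.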
> Man, as excited as I am for an actual competitor to Nvidia, I can't shake that feeling of "no DLSS, no buy" currently :/

This would be a legit worry if DLSS were available by default in all games. By the time this becomes the norm, I'm sure AMD will have a similar option.
Please be good!
I see a lot of people on the Internet (not necessarily in this thread) cheer on AMD and hope they bring out incredible new products, and then immediately follow that up with "so it will force Nvidia to lower prices. I would never buy an AMD card personally" or some variation of that thought. Oftentimes when pushed they will cite bad drivers as the reason why.
It seems to me that AMD has a severe image problem that they need to overcome with regards to their software. Even if they were somehow able to deliver a compelling card this time, it appears to me that a lot of people would not give an actual purchase a second glance. Do you think that is mostly hot air and people would buy an AMD GPU if it were to somehow best Nvidia on price/performance?
We've seen AMD cards with VRAM advantages get smoked by Nvidia cards in actual performance in the past. I'm gonna need to see a lot more...
> See my post earlier.

So it will take time. It's better to go Nvidia.
Makes no sense for AMD to compete against Microsoft on a DirectML model for upscaling.
Yeah if true it's a little bad.
They are rumored to have some super-mondo 128MB on-die GPU cache breakthrough tech to mitigate that, but still.
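Rough intuition for how a big on-die cache could offset a narrower bus; both the 40% hit rate and the 512 GB/s DRAM baseline below are made-up illustrative numbers, not anything AMD has confirmed:

```python
# If hit_rate of memory requests are served from on-die cache, DRAM only sees
# the misses, so effective bandwidth scales as dram_gbs / (1 - hit_rate).
def effective_bandwidth_gbs(dram_gbs: float, hit_rate: float) -> float:
    return dram_gbs / (1.0 - hit_rate)

print(round(effective_bandwidth_gbs(512.0, 0.4)))  # 853 GB/s effective
```

In other words, even a moderate hit rate could make a 256-bit card behave like a much wider one, which is presumably the point of the rumored cache.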
IF true...
2070 Super to 2080 performance for less than 300€ and I might consider going AMD.