
DreamRunner

Banned
Sep 14, 2020
934
I'll wait to see what they have to offer, but Nvidia has always had better GPUs, and this is coming from someone who has owned mostly AMD GPUs.
 

asmith906

Member
Oct 27, 2017
27,465
Well, they're not going to make many non-PC-exclusive titles that won't run on consoles. Most of the most demanding AAA games will probably run at 1440p/30fps on the next-gen consoles by mid-to-late gen, and if so, these GPUs should be able to run them at 4K/30 or 4K/60 with DLSS, and 1440p/60+ at the very least.

Now yeah, if you want to run them not at console settings but on Ultra instead, you'd probably need a GPU that comes out toward the middle of the generation, but that's always been the case.

A 700 series card still runs Red Dead Redemption 2 at PS4 settings, but you will not be running it at ultra until you get to the 1000 series.
People seem to have the idea stuck in their head that every game needs to be maxed out at 4K and run over 60fps to be considered a 4K card. Even today that's not possible with the 3080. Linus showed that Crysis Remastered runs like crap with everything cranked up at 4K.
 

myzhi

Member
Oct 27, 2017
1,650
EVERYONE SHOULD WAIT FOR AMD BENCHMARKS BEFORE BUYING A 3080 / 3090.


So, I can have a chance at a 3080 or 3090
 

Uhtred

Alt Account
Banned
May 4, 2020
1,340
People seem to have the idea stuck in their head that every game needs to be maxed out at 4K and run over 60fps to be considered a 4K card. Even today that's not possible with the 3080. Linus showed that Crysis Remastered runs like crap with everything cranked up at 4K.

And that's an excellent illustration of why, too. Oftentimes "Ultra" just means "we're throwing everything at this with minimal or no optimization, good luck to you!"

With the Crysis remaster, it's literally turning off the LOD system entirely. Something utterly unoptimized by its very nature. You literally cannot discern all of the extra detail of the best LOD model from even a few meters away from the camera.

A lot of Ultra features only offer minimal improvements for a LOT of performance trade off. High is usually where it's at. Of course there are exceptions, but this is usually what happens.
 

Tallshortman

Member
Oct 29, 2017
1,649
I'll wait to see what they have to offer, but Nvidia has always had better GPUs, and this is coming from someone who has owned mostly AMD GPUs.

This is objectively not true. There have been several generations where AMD/ATI had better raw performance, better power efficiency, or both. They haven't been on top of the performance charts since the 290X, but that doesn't change the facts.
 

Zoon

Member
Oct 25, 2017
1,397
2070s-2080 performance for less than 300€ and I might consider going AMD.
 

CrunchyFrog

Member
Oct 28, 2017
2,462
I'd be shocked if the 3080's VRAM becomes an actual issue. Performance is where AMD needs to compete.

Which it is lagging behind on, based even on these specs: the GDDR6 has a lower clock speed and the memory bus is much narrower. As has been explained many times, it'll be a while before 10GB of VRAM capacity becomes an issue, much less so if you're rendering at less than 4K.
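For anyone curious, the back-of-envelope math on why the bus width matters. The 3080 figures are its public specs; the 16 Gbps GDDR6 data rate for Navi 21 is my own assumption, since the rumor only lists capacity and bus width:

```python
# Theoretical peak bandwidth = (bus width in bits / 8) * effective data rate in Gbps.

def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_3080 = bandwidth_gbps(320, 19.0)  # 320-bit GDDR6X at 19 Gbps (public spec)
navi_21 = bandwidth_gbps(256, 16.0)   # 256-bit GDDR6 at 16 Gbps (assumed)

print(f"3080:    {rtx_3080:.0f} GB/s")  # 760 GB/s
print(f"Navi 21: {navi_21:.0f} GB/s")   # 512 GB/s
```

Even with a generous 16 Gbps assumption, a 256-bit bus leaves a sizable raw-bandwidth gap, which is presumably why people expected 384-bit for the top chip.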
 

Timu

Member
Oct 25, 2017
15,635
And that's an excellent illustration of why, too. Oftentimes "Ultra" just means "we're throwing everything at this with minimal or no optimization, good luck to you!"

With the Crysis remaster, it's literally turning off the LOD system entirely. Something utterly unoptimized by its very nature. You literally cannot discern all of the extra detail of the best LOD model from even a few meters away from the camera.

A lot of Ultra features only offer minimal improvements for a LOT of performance trade off. High is usually where it's at. Of course there are exceptions, but this is usually what happens.
Yeah, I don't care about Ultra settings anymore; I stopped caring about them years ago. I usually do High and sometimes Medium, since I value framerate over graphics. Heck, for me it's framerate > resolution > graphics.
 

SapientWolf

Member
Nov 6, 2017
6,565
I know DLSS is a huge initial advantage but I'm not confident counting AMD out yet. Especially considering how rushed the Ampere launch was. Waiting isn't going to cost me anything. There will be other machine learning temporal AA solutions in the future. Nvidia isn't the only one with a huge stake in that venture. AMD's cards tend to age pretty well, with the 9 year old 7970 still capable of running Doom Eternal at 1080p/60fps. Strong Linux / Vulkan performance is a nice plus. And calling the match before the fight even started is super boring and makes for boring threads.

But if AMD can't beat the 3080 at a lower price then that's probably game over.
 

DreamRunner

Banned
Sep 14, 2020
934
This is objectively not true. There have been several generations where AMD/ATI had better raw performance, better power efficiency, or both. They haven't been on top of the performance charts since the 290X, but that doesn't change the facts.
You are probably right, but this was always my impression, probably because Nvidia products were always more expensive.
 

Herne

Member
Dec 10, 2017
5,326
Edit: Why would "Navi 22" have less VRAM than "Navi 21"? Shouldn't that be the other way around?

AMD are giving the chips codenames based on which came first. The most powerful chip was developed first, so it is Navi 21. The less powerful variants came later, so you have Navi 22 and 23.
 

Siresly

Prophet of Regret
Member
Oct 27, 2017
6,588
If it's likely to be $499 I expect it to compete with cards at that price point, which will be the 3070.
They will apparently have the same memory and memory bus, so I believe myself even more now.
 

Raydonn

One Winged Slayer
Member
Oct 25, 2017
919
Doesn't seem entirely correct according to the other rumours I've heard.
 

Caz

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
13,055
Canada
Even with the knowledge that AMD has planned to undercut Ampere via pricing (whatever the performance ends up being, all evidence shows they will be more power efficient GPUs), $499 seems too absurdly good to be true if this rumor reflects actual performance.
You are probably right, but this was always my impression, probably because Nvidia products were always more expensive.
Intel does the same thing with its CPUs despite them only being better in (some) games, and only a few percentages at that, while losing soundly in all other categories.
 
Last edited:

MrAlderson

Member
Apr 19, 2018
646
Even with the knowledge that AMD has planned to undercut Ampere via pricing (whatever the performance ends up being, all evidence shows they will be more power efficient GPUs), $499 seems too absurdly good to be true if this rumor reflects actual performance.
$499 isn't the top card, so I wonder what they'll price that one at. $600 maybe?
 

Caz

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
13,055
Canada
$499 isn't the top card, so I wonder what they'll price that one at. $600 maybe?
Pretty sure the 6900XT is the top-end card, aka. Navi 21.
AMD's new Radeon RX 6900 XT should be powered by Navi 21 GPU, 16GB of GDDR6 memory on a 256-bit memory bus, could cost $499.
I am skeptical of that being the price, in any case, especially if they're using HBM2. That's pretty expensive compared to GDDR6, last I recall.
 

Uhtred

Alt Account
Banned
May 4, 2020
1,340
The games won't be GPU bound for doubling the frame rate but they might be CPU bound as I don't think PCs will have CPUs with double the single-thread performance of PS5/SX

True. As long as you are pushing graphics settings and resolution, you should comfortably beat out a console. As for CPU performance, I would not say never over the next 7 years.
 

Dr. Zoidberg

Member
Oct 25, 2017
5,267
Decapod 10
I see a lot of people on the Internet (not necessarily in this thread) cheer on AMD and hope they bring out incredible new products, and then immediately follow that up with "so it will force Nvidia to lower prices. I would never buy an AMD card personally" or some variation of that thought. Oftentimes when pushed they will cite bad drivers as the reason why.

It seems to me that AMD has a severe image problem that they need to overcome with regards to their software. Even if they were somehow able to deliver a compelling card this time, it appears to me that a lot of people would not give an actual purchase a second glance. Do you think that is mostly hot air and people would buy an AMD GPU if it were to somehow best Nvidia on price/performance?
 

MrAlderson

Member
Apr 19, 2018
646
Pretty sure the 6900XT is the top-end card, aka. Navi 21.

I am skeptical of that being the price, in any case, especially if they're using HBM2. That's pretty expensive compared to GDDR6, last I recall.
The last rumor about this said $499 is the cut-down Navi 21. I don't know how TweakTown came to the conclusion it was the 6900 XT; it seems to not be the case.
 

BeI

Member
Dec 9, 2017
5,996
Is there any chance RDNA2 will be similar to Ampere in regards to the "2x" shader cores count thing? Still don't know why Ampere is like that, but I don't recall seeing anything like that showing up in rumors leading up to the reveal.
 

Caz

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
13,055
Canada
I see a lot of people on the Internet (not necessarily in this thread) cheer on AMD and hope they bring out incredible new products, and then immediately follow that up with "so it will force Nvidia to lower prices. I would never buy an AMD card personally" or some variation of that thought. Oftentimes when pushed they will cite bad drivers as the reason why.

It seems to me that AMD has a severe image problem that they need to overcome with regards to their software. Even if they were somehow able to deliver a compelling card this time, it appears to me that a lot of people would not give an actual purchase a second glance. Do you think that is mostly hot air and people would buy an AMD GPU if it were to somehow best Nvidia on price/performance?
Maybe? It depends on how the drivers work out this time and whether supply issues remain for Ampere come October/November. The 5700XT is arguably the best AMD launch since the RX 400/500 series, and even it took a few months to work out the driver issues (I haven't had any issues with it, but there are plenty of stories of people suffering black screens or having to switch from HDMI to DisplayPort and vice-versa because one or the other was having compatibility problems). To say their GPU division has an image problem is an understatement, but the same could have been said for their CPUs during the Dozer era. Personally, I'm going for whichever one gives me the best bang for the buck at the $1,000 CAD price point (I don't care for raytracing, and I'm assuming AMD will have their own DLSS equivalent). Assuming this performance is reflective of Navi 21, the main issue I can see for consumers is that the quality partner cards (Sapphire, PowerColor) might not be out until 2021.
The last rumor about this said $499 is the cut-down Navi 21. I don't know how TweakTown came to the conclusion it was the 6900 XT; it seems to not be the case.
Do you have a link for the latter i.e. cutdown vs. the full Navi 21?
 

MrAlderson

Member
Apr 19, 2018
646
Maybe? It depends on how the drivers work out this time and whether supply issues remain for Ampere come October/November. The 5700XT is arguably the best AMD launch since the RX 400/500 series, and even it took a few months to work out the driver issues (I haven't had any issues with it, but there are plenty of stories of people suffering black screens or having to switch from HDMI to DisplayPort and vice-versa because one or the other was having compatibility problems). To say their GPU division has an image problem is an understatement, but the same could have been said for their CPUs during the Dozer era. Personally, I'm going for whichever one gives me the best bang for the buck at the $1,000 CAD price point (I don't care for raytracing, and I'm assuming AMD will have their own DLSS equivalent). Assuming this performance is reflective of Navi 21, the main issue I can see for consumers is that the quality partner cards (Sapphire, PowerColor) might not be out until 2021.

Do you have a link for the latter i.e. cutdown vs. the full Navi 21?
I will look for it, but rogame, the person the article is based on, even says he's not sure which variant those VRAM makeups are for.
 

Buggy Loop

Member
Oct 27, 2017
1,232
I know DLSS is a huge initial advantage but I'm not confident counting AMD out yet. Especially considering how rushed the Ampere launch was. Waiting isn't going to cost me anything. There will be other machine learning temporal AA solutions in the future. Nvidia isn't the only one with a huge stake in that venture. AMD's cards tend to age pretty well, with the 9 year old 7970 still capable of running Doom Eternal at 1080p/60fps. Strong Linux / Vulkan performance is a nice plus. And calling the match before the fight even started is super boring and makes for boring threads.

But if AMD can't beat the 3080 at a lower price then that's probably game over.

I think they only have a chance to compete on pure rasterization performance. The thing people don't quite grasp yet about the RDNA 2 architecture is that rasterization, RT, and ML all compete for operations on the CUs. Nvidia's solution is to have dedicated RT and ML hardware.

The other thing with DirectML is that it provides an API for implementing and running neural network models. It won't provide the models.

DirectML gives you a way to stack conv2d layers with ReLU layers to form a net, and that's it. You have to find a model that achieves good upsampling, and someone has to provide the weights the AI has learned. You'll get as good an upscaler as the model allows, and that can easily range from DLSS 1.0 (horrible) to, less likely, DLSS 2.0 (god-tier). But it's a recipe not many people in this industry are familiar with, and Nvidia has been at the top of the AI game with a head start of some 13 years now, with almost all deep learning in industry and universities being done on CUDA, and it still took them about 2 years to find this recipe. They have been in close collaboration with Microsoft since 2017 on DirectML, and most of DX12U's ML logic is straight out of Nvidia papers; DirectML is RTX codified into an API.

Also, since DirectML is an API regulated by Microsoft, it makes more sense for Microsoft to provide a base model that developers can use to upscale their games than for AMD to provide a solution, because the advantage of the API is that it removes the need to write vendor-specific code paths. DirectML will be hardware-agnostic; even a CPU can run DirectML, as can any DX12 GPU. I can't even recall the oldest supported GPU series' name, it's so far back. RDNA 2, of course, has accelerated ML compared to these older GPUs.

So yeah, Microsoft will probably be the one proposing a model, but when? It might take a long time. I'm pretty sure AMD will have checkerboard + AI sharpening (à la FidelityFX) upscaling in the meantime, but I seriously doubt they'll invest a lot of R&D and time in a model that would be vendor-specific, right before Microsoft provides a universal one. What would be the point?
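To make the "API, not models" point concrete, here's a toy sketch in plain Python of the conv2d + ReLU stacking described above. This is not actual DirectML code (DirectML is a C++/COM API), and the kernel weights are arbitrary, not a trained upscaler; the API gives you the building blocks, but the weights are the part you have to supply:

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (really cross-correlation, as in most ML APIs)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

def relu(feature_map):
    """Element-wise max(0, x)."""
    return [[max(0.0, v) for v in row] for row in feature_map]

# A 4x4 "image" and an arbitrary 3x3 kernel (made-up weights).
image = [[1.0, 2.0, 0.0, 1.0],
         [0.0, 1.0, 3.0, 1.0],
         [2.0, 0.0, 1.0, 0.0],
         [1.0, 1.0, 0.0, 2.0]]
kernel = [[1.0, 0.0, -1.0],
          [1.0, 0.0, -1.0],
          [1.0, 0.0, -1.0]]

layer1 = relu(conv2d(image, kernel))  # one conv -> ReLU "layer"; stack more for a net
print(layer1)
```

Stacking these layers is mechanical; whether the result upscales anything well depends entirely on the weights, which is exactly the part DirectML doesn't give you.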
 
Last edited:

MrBob

Member
Oct 25, 2017
6,670
I see a lot of people on the Internet (not necessarily in this thread) cheer on AMD and hope they bring out incredible new products, and then immediately follow that up with "so it will force Nvidia to lower prices. I would never buy an AMD card personally" or some variation of that thought. Oftentimes when pushed they will cite bad drivers as the reason why.

It seems to me that AMD has a severe image problem that they need to overcome with regards to their software. Even if they were somehow able to deliver a compelling card this time, it appears to me that a lot of people would not give an actual purchase a second glance. Do you think that is mostly hot air and people would buy an AMD GPU if it were to somehow best Nvidia on price/performance?
AMD is going to have to pull a Ryzen with RDNA2: be competitive and a decent amount cheaper this gen. If they are only competitive and similar in price, no one will switch. Well, not many.

AMD really needs to go big to try and gain market share. Nvidia already has Watch Dogs: Legion and Cyberpunk 2077 locked up with RTX features.
 
Nov 8, 2017
957
We've seen AMD cards with vram advantages get smoked by Nvidia cards in actual performance in the past. I'm gonna need to see a lot more...
 

Duxxy3

Member
Oct 27, 2017
21,804
USA
I guess the upside of the 30xx series selling out in .1 seconds is that I'll probably get to see the RDNA 2 cards before I'm able to buy anything new.
 

Gay Bowser

Member
Oct 30, 2017
17,731
They're actually making a 6900? I thought they would stop at 6800 this gen, just for...reasons.

Edit: Why would "Navi 22" have less VRAM than "Navi 21"? Shouldn't that be the other way around?

The numbers are just codenames. Generally the first on a new tens digit is the full version and later numbers are cut-down versions.

Navi 14 is smaller than Navi 10, etc.
 

shark97

Banned
Nov 7, 2017
5,327
Tweaktown seems to be underplaying these as the source tweet says


I've now got confirmation for both : > Navi21 16GB VRAM > Navi22 12GB VRAM I have no idea if these are full die or cut down SKUs

Also, I thought Navi 21 was expected to be at least 384-bit (and even that was cutting it close); wouldn't 256 be a disappointment?

But yeah 499 for ~3080 at least sounds nice if it pans out. Still over a month to learn the truth :(
 

Atolm

Member
Oct 25, 2017
5,831
For $499, if it doesn't have RTX or some sort of DLSS, it should have a big advantage in traditional performance over the 3070. I'd buy it if that were the case.
 

Buggy Loop

Member
Oct 27, 2017
1,232
Is there any chance RDNA2 will be similar to Ampere in regards to the "2x" shader cores count thing? Still don't know why Ampere is like that, but I don't recall seeing anything like that showing up in rumors leading up to the reveal.

Console TFLOPs count does not indicate this. No reason to believe that RDNA 2 from consoles to PC will drastically differ in architecture.

In fact, it seems even the AIBs for the 3080 did not know about the 2x shader cores, as some of them had half the core count on their leaked specs before the Nvidia conference. It was a wild card.
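For what it's worth, the console numbers line up with the standard FP32 formula (2 FLOPs per shader core per clock), so an Ampere-style doubling would show up in the math if it were there. The figures below are the publicly stated PS5 and Series X specs:

```python
def tflops(shader_cores: int, clock_ghz: float) -> float:
    """FP32 TFLOPs: 2 floating-point ops per shader core per clock."""
    return shader_cores * 2 * clock_ghz / 1000

ps5 = tflops(2304, 2.23)        # 36 CUs x 64 shaders, 2.23 GHz -> ~10.28 TF
series_x = tflops(3328, 1.825)  # 52 CUs x 64 shaders, 1.825 GHz -> ~12.15 TF

print(f"PS5:      {ps5:.2f} TF")
print(f"Series X: {series_x:.2f} TF")
```

Those match the advertised 10.28 and 12.15 TF figures with 64 shaders per CU, i.e., no hidden 2x.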
 

shark97

Banned
Nov 7, 2017
5,327
Which it is lagging behind on, based even on these specs: the GDDR6 has a lower clock speed and the memory bus is much narrower. As has been explained many times, it'll be a while before 10GB of VRAM capacity becomes an issue, much less so if you're rendering at less than 4K.


I'm not too worried about 10GB, but in general I can't agree. It sneaks up on you. A 6GB card right now, IMO, would be problematic.

One of the best purchasing decisions I ever made was to buy an HD 4890 with 1GB of RAM rather than the 512MB everybody online told me was fine at the time ("by the time you need more, the card will be too weak anyway" was the common argument). The 1GB framebuffer made the card age so much better. Hell, I tried to dig it out not too long ago; I had no GPU (sold mine to miners for a nice price) and needed something to play Destiny 2 on PC, and it would have still been great for that! Turned out I forgot I had actually thrown it away :(

My mindset is to never buy a card anywhere close to RAM-limited. Again, though, subjectively, 10GB feels OK. But then again, there are the next-gen consoles...

If I was spending $700 on a card I'd probably want more than 10GB, but that's me. Some people buy cards a lot more often.
 

Deleted member 17289

Account closed at user request
Banned
Oct 27, 2017
3,163
man, as excited as I am for an actual competitor to Nvidia, I can't shake that feeling of "no DLSS, no buy" currently :/
Please be good!
This would be a legit worry if DLSS were available by default in all games; by the time that becomes the norm, I'm sure AMD will have a similar option.
 

TSM

Member
Oct 27, 2017
5,830
I see a lot of people on the Internet (not necessarily in this thread) cheer on AMD and hope they bring out incredible new products, and then immediately follow that up with "so it will force Nvidia to lower prices. I would never buy an AMD card personally" or some variation of that thought. Oftentimes when pushed they will cite bad drivers as the reason why.

It seems to me that AMD has a severe image problem that they need to overcome with regards to their software. Even if they were somehow able to deliver a compelling card this time, it appears to me that a lot of people would not give an actual purchase a second glance. Do you think that is mostly hot air and people would buy an AMD GPU if it were to somehow best Nvidia on price/performance?

The thing is that Nvidia isn't just offering a video card. They are offering services specific to their hardware, like G-Sync, DLSS, Broadcast, Reflex, Ansel, etc., as well as game-ready drivers day one on big releases. They also partner with many big games to add Nvidia-specific features like RT and DLSS. If you look at the latest Steam survey, it's very clear that AMD offering comparable hardware at a comparable price isn't working for them. AMD needs to either step up their services or offer significantly better performance at a much better price point than Nvidia to make any headway. As long as Nvidia is seen as the premium brand, most consumers won't see a reason to buy a comparable AMD card over its Nvidia counterpart.
 
Last edited:

shark97

Banned
Nov 7, 2017
5,327
We've seen AMD cards with vram advantages get smoked by Nvidia cards in actual performance in the past. I'm gonna need to see a lot more...


Maybe at the top end. In general, AMD is never behind in price/performance (usually they are a little better at it, because people prefer the Nvidia brand and need an incentive to buy AMD). See, e.g., the 5700XT currently. Well, in the pre-3000-series landscape...
 

Revan Ren

Member
Jan 11, 2018
270
I'm hoping the rumors that the 6900XT is somewhere between the 3080 and the 3090 are true. Also hoping I can actually buy one at retail instead of eBay.
 

SharpX68K

Member
Nov 10, 2017
10,522
Chicagoland
256-bit bus for Navi 21 ?????????



:/
 

SolarPowered

Member
Oct 28, 2017
2,215
The VRAM amounts are enticing, but those memory bus sizes do seem a tad on the small side. Hopefully AMD really puts the pedal to the metal with RDNA2. They really need to be competing with the RTX 3080 and not just the RTX 3070.