
dgrdsv

Member
Oct 25, 2017
11,879
Oct 25, 2017
1,170
Wakayama
That 6800XT looks sexy. Thinking of going AMD this gen, if for no other reason than to give them the support to keep competing.

Assuming they have stock of course.
 

Alyssa

Member
Oct 27, 2017
291
Belgium
I don't understand their pricing: they have a way cheaper RTX 3090 alternative, but a more expensive RTX 3070 alternative? Or am I missing something? :/
 

closure

Alt account
Banned
Jul 21, 2020
15
I don't understand their pricing: they have a way cheaper RTX 3090 alternative, but a more expensive RTX 3070 alternative? Or am I missing something? :/
mXV6H8X.png


This is from MLID's latest video. His rumours have been pretty accurate about the rest of this launch, but they're still just rumours.

The 3070 kind of has its own lane, but I'd still rather get the surrounding AMD cards over it. 8GB of VRAM just seems like a joke in a $499 card, and only suitable if you're sticking with 1080p.
 

Alyssa

Member
Oct 27, 2017
291
Belgium
mXV6H8X.png


This is from MLID's latest video. His rumours have been pretty accurate about the rest of this launch, but they're still just rumours.

The 3070 kind of has its own lane, but I'd still rather get the surrounding AMD cards over it. 8GB of VRAM just seems like a joke in a $499 card, and only suitable if you're sticking with 1080p.

Well, one is 16 GB GDDR6 and the other one is 8 GB GDDR6X. I'm not sure if it's fair to compare the two. I thought that after watching the presentation I would make up my mind whether to buy an RTX or an AMD card in the future, but all these numbers only confuse me even more.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,639
I don't understand their pricing: they have a way cheaper RTX 3090 alternative, but a more expensive RTX 3070 alternative? Or am I missing something? :/
The 6800 seems to be better than a 3070 by a considerable margin though, and it has VRAM that won't make you worry in any game for the next 7 years either.
 

closure

Alt account
Banned
Jul 21, 2020
15
Well, one is 16 GB GDDR6 and the other one is 8 GB GDDR6X. I'm not sure if it's fair to compare the two. I thought that after watching the presentation I would make up my mind whether to buy an RTX or an AMD card in the future, but all these numbers only confuse me even more.
Nah, the 3070 is just using standard GDDR6. But yeah, I'm in the same position where the more info that's released, the more indecisive I become lol. There are rumours of a 10GB 3070 Ti, which might be my ideal card...
 

shadow2810

Member
Oct 25, 2017
3,244
If you have money to spend on a 6800 or 3070, chances are you will do your next upgrade in 4-5 years anyway, so VRAM should not be an issue.
 

Nooblet

Member
Oct 25, 2017
13,632
Well, one is 16 GB GDDR6 and the other one is 8 GB GDDR6X. I'm not sure if it's fair to compare the two. I thought that after watching the presentation I would make up my mind whether to buy an RTX or an AMD card in the future, but all these numbers only confuse me even more.
3070 is GDDR6 actually.
 

Alyssa

Member
Oct 27, 2017
291
Belgium

dgrdsv

Member
Oct 25, 2017
11,879
If you have money to spend on a 6800 or 3070, chances are you will do your next upgrade in 4-5 years anyway, so VRAM should not be an issue.
If you don't plan to upgrade the GPU again until the end of 2022 (meaning the 40 and 7000 series), then a 16GB 6800/XT would be the obviously better choice.
 
Nov 2, 2017
2,275
I just think VRAM won't matter at all for the lifespan of these cards. None of these cards are going to last very long by PC standards. It's very different from a 290X or 970; these cards will be the 7950/7970 and 670/680 of this generation. If you really care so much about having to drop down to high/medium textures, then you're going to want to upgrade anyway in 2-3 years.

Right now is probably the worst time to buy a GPU for futureproofing, as these cards are "barely" faster than next gen consoles. The 7970 was actually faster relative to the current gen consoles than the 6900XT is to next gen consoles. These cards are okay to bridge the crossgen period, but once next gen starts in full swing these cards just won't be enough unless you're willing to drop settings or play at 1080p/1440p. In such a scenario VRAM doesn't matter as much anyway.
Buying a $500+ card right now with the intent of futureproofing is just a terrible idea imo, and you're better off with a next gen console or a $300-400 card to last until we actually get some cards that are significantly faster than next gen consoles.

Now if you have two equal cards at the same price and one has more VRAM, then the one with more VRAM is the obvious choice. I personally think raytracing performance is more important than VRAM, as you can actually benefit from that day 1 rather than something that might only matter in a couple of years.
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
I think 8GB is too small for a $500 card, so I personally would automatically rule out Ampere's 3070.
 

Jroc

Banned
Jun 9, 2018
6,145
mXV6H8X.png


This is from MLID's latest video. His rumours have been pretty accurate about the rest of this launch, but they're still just rumours.

The 3070 kind of has its own lane, but I'd still rather get the surrounding AMD cards over it. 8GB of VRAM just seems like a joke in a $499 card, and only suitable if you're sticking with 1080p.

Honestly that chart makes the 6700XT look like the true 3070 competitor. $100 cheaper, 13% weaker with 50% more VRAM. If overclocking can close the gap at all then it seems like a great choice. The 3070 would be hands down THE card to get if it had 12GB of VRAM imo.

I hope at least one of these cards ends up overclocking well, similar to the 5700 and the Vega 56. It's incredibly boring when both companies have things perfectly segmented so that performance scales almost linearly with price.

If the 6800XT turns into a 3070 with RT on then I'm really curious how the $330 6700 will perform. Would it be worse than the 6GB 2060? Maybe it won't even have RT.
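The value comparison above can be sanity-checked with quick arithmetic. This is a sketch using the rumoured chart figures only (3070 as the 100% baseline at $499, 6700 XT "13% weaker, $100 cheaper"), which are leaks, not confirmed specs.

```python
# Performance-per-dollar from the rumoured figures in the quoted chart.
cards = {
    "RTX 3070":   (100, 499),  # relative perf (%), price (USD)
    "RX 6700 XT": (87, 399),   # "13% weaker, $100 cheaper"
}
for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price:.3f} perf points per dollar")

p3070 = 100 / 499
p6700 = 87 / 399
print(f"6700 XT: {(p6700 / p3070 - 1) * 100:.0f}% more performance per dollar")
```

On those numbers the 6700 XT comes out roughly 9% ahead in perf/$ before any overclocking, which is why it reads as the closer 3070 competitor.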
 

dgrdsv

Member
Oct 25, 2017
11,879
I just think VRAM won't matter at all for the lifespan of these cards.
Again, depends on how you define "lifespan".
If it's until the next generation of GPUs (early to mid 2022), then yeah, 8GB won't be any kind of an issue.
If it's beyond that though, through 2022-24, then there will most certainly be games and modes where 8GB will be a limiting factor; not a breaking one, as you can always lower texture quality and such, but still.
 
Nov 2, 2017
2,275
Again, depends on how you define "lifespan".
If it's until the next generation of GPUs (early to mid 2022), then yeah, 8GB won't be any kind of an issue.
If it's beyond that though, through 2022-24, then there will most certainly be games and modes where 8GB will be a limiting factor; not a breaking one, as you can always lower texture quality and such, but still.
That's why I somewhat clarified what I meant by "lifespan". I meant it in the sense of running 60fps at console resolution. None of these cards are fast enough for that, even at console settings. You're already going to have to run at lower resolutions and/or settings, which will save VRAM.

I think for the people who want to run games at ultra settings with 60fps+ performance, none of these cards are going to have a very long "lifespan". They're good cards for current gen games, but they're going to struggle hard with next gen, so if you really care that much about dropping your settings, then you're not going to be satisfied and you're going to be upgrading the first chance you get.
 

dgrdsv

Member
Oct 25, 2017
11,879
Just curious: is there a specific reason to expect next gen GPUs to land in early/mid 2022, or is this extrapolation from the normal cadence?
Extrapolation, taking into account what we know of upcoming process advancements and the fact that competition in the GPU space is heating up again. That means launches from all IHVs will likely happen a bit faster over the upcoming several years and can become a bit messier, with new chips filling in parts of already existing lineups instead of being a whole lineup overhaul (think closer to G92 vs G80 type of changes).
 

cjn83

Banned
Jul 25, 2018
284
Extrapolation, taking into account what we know of upcoming process advancements and the fact that competition in the GPU space is heating up again. That means launches from all IHVs will likely happen a bit faster over the upcoming several years and can become a bit messier, with new chips filling in parts of already existing lineups instead of being a whole lineup overhaul (think closer to G92 vs G80 type of changes).

Well thought through as always. Thanks, dgrdsv.
 

Graven

Member
Oct 30, 2018
4,102
Nvidia is the more accessible path nowadays, it seems. I assume the 3070 will be the overall vanilla choice for next gen PC players.
 

asmith906

Member
Oct 27, 2017
27,392
mXV6H8X.png


This is from MLID's latest video. His rumours have been pretty accurate about the rest of this launch, but they're still just rumours.

The 3070 kind of has its own lane, but I'd still rather get the surrounding AMD cards over it. 8GB of VRAM just seems like a joke in a $499 card, and only suitable if you're sticking with 1080p.
Do we have any idea when something like the 6700 might come out?
 

Gusy

Member
Oct 27, 2017
1,070
I'm interested in whatever gets me more performance in Flight Sim 2020 and VR games. Seems to me that Zen 3 is showing better performance for single-core bottlenecked games, right?

I also wonder if the 6900XT could have similar raytracing performance at 1440p to a 3090 at 4K. If that's the case, a 5800 CPU with a 6900 GPU would be my ideal mix for 1440p gaming and VR.
 

Firefly

Member
Jul 10, 2018
8,633
That's why I somewhat clarified what I meant by "lifespan". I meant it in the sense of running 60fps at console resolution. None of these cards are fast enough for that, even at console settings. You're already going to have to run at lower resolutions and/or settings, which will save VRAM.

I think for the people who want to run games at ultra settings with 60fps+ performance, none of these cards are going to have a very long "lifespan". They're good cards for current gen games, but they're going to struggle hard with next gen, so if you really care that much about dropping your settings, then you're not going to be satisfied and you're going to be upgrading the first chance you get.
Do we know the equivalent AMD GPU in the Series X and PS5?
 

gimbles123

Member
Oct 27, 2017
296
I just think VRAM won't matter at all for the lifespan of these cards. None of these cards are going to last very long by PC standards. It's very different from a 290X or 970; these cards will be the 7950/7970 and 670/680 of this generation. If you really care so much about having to drop down to high/medium textures, then you're going to want to upgrade anyway in 2-3 years.

Right now is probably the worst time to buy a GPU for futureproofing, as these cards are "barely" faster than next gen consoles. The 7970 was actually faster relative to the current gen consoles than the 6900XT is to next gen consoles. These cards are okay to bridge the crossgen period, but once next gen starts in full swing these cards just won't be enough unless you're willing to drop settings or play at 1080p/1440p. In such a scenario VRAM doesn't matter as much anyway.
Buying a $500+ card right now with the intent of futureproofing is just a terrible idea imo, and you're better off with a next gen console or a $300-400 card to last until we actually get some cards that are significantly faster than next gen consoles.

Now if you have two equal cards at the same price and one has more VRAM, then the one with more VRAM is the obvious choice. I personally think raytracing performance is more important than VRAM, as you can actually benefit from that day 1 rather than something that might only matter in a couple of years.

The GPUs you mentioned released at drastically different times relative to when the current consoles released. Lifespan is going to vary depending on a consumer's priorities, and buying tech like this concurrently with the next-gen release forgoes generational benefits. Futureproofing is antithetical to anyone considering a new GPU in 2 years or less. Your post is confusing and misleading for all the folks here not well versed in the technology.

Do we know the equivalent AMD GPU in the Series X and PS5?

We do not, in the sense that the consumer GPU equivalent has not been revealed. However, based on the info AMD has released and reputable leak sources, there will be 6600(XT)/6700(XT) GPUs with smaller Navi dies that are effectively what's in the XSX/PS5.
 

Firefly

Member
Jul 10, 2018
8,633
We do not, in the sense that the consumer GPU equivalent has not been revealed. However, based on the info AMD has released and reputable leak sources, there will be 6600(XT)/6700(XT) GPUs with smaller Navi dies that are effectively what's in the XSX/PS5.
I figured when the poster said these cards are "barely" faster, that the consoles would be 6800 or close to it. This is a huge difference.
 
Nov 2, 2017
2,275
The GPUs you mentioned released at drastically different times relative to when the current consoles released. Lifespan is going to vary depending on a consumer's priorities, and buying tech like this concurrently with the next-gen release forgoes generational benefits. Futureproofing is antithetical to anyone considering a new GPU in 2 years or less. Your post is confusing and misleading for all the folks here not well versed in the technology.
It doesn't really matter when they released, just how much faster they were compared to the consoles of their time. A 7970 was in fact faster relative to its consoles than a 6900XT is relative to these. You could use the GPUs that released at the end of 2013 and it's more or less the same story; those were actually a lot faster relative to consoles than what we have now, though, so I didn't use them.

My post is there to warn people not to spend too much expecting their cards to last very long by PC standards. Sure, they'll be fine if they play at 1080p/1440p or are fine with 30fps. Most people don't buy $500+ cards for that, though. I think I'm going to hear the word "unoptimized" a lot in the next two years.

I figured when the poster said these cards are "barely" faster, that the consoles would be 6800 or close to it. This is a huge difference.
The "barely" was in the context of consoles vs PCs. I think a 30-40% difference is pretty close in this context; that doesn't get you very much, and most people don't play on PC for 30fps. In theoretical TFLOPS there is a 33% difference between the XSX and the 6800. The difference in actual performance might vary because of bandwidth differences or the way the 6800's boost works.
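For reference, the TFLOPS gap cited in this exchange can be sanity-checked from the advertised theoretical specs. This is just a rough sketch: the CU counts and clocks below are the publicly listed figures (XSX at 1.825 GHz, RX 6800 at its ~2.105 GHz boost clock), and theoretical TFLOPS don't map directly onto game performance.

```python
# Theoretical FP32 TFLOPS = CUs * 64 lanes * 2 FLOPs/cycle * clock.
xsx = 52 * 64 * 2 * 1.825e9 / 1e12      # Xbox Series X: 52 CUs @ 1.825 GHz
rx6800 = 60 * 64 * 2 * 2.105e9 / 1e12   # RX 6800: 60 CUs @ ~2.105 GHz boost

print(f"XSX:     {xsx:.2f} TFLOPS")                               # ~12.15
print(f"RX 6800: {rx6800:.2f} TFLOPS")                            # ~16.17
print(f"XSX has {(1 - xsx / rx6800) * 100:.0f}% fewer TFLOPS")    # ~25%
print(f"6800 has {(rx6800 / xsx - 1) * 100:.0f}% more TFLOPS")    # ~33%
```

Both numbers in the thread are consistent: "25% fewer" and "33% more" describe the same gap from opposite directions.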
 

gimbles123

Member
Oct 27, 2017
296
It doesn't really matter when they released, just how much faster they were compared to the consoles of their time. A 7970 was in fact faster relative to its consoles than a 6900XT is relative to these. You could use the GPUs that released at the end of 2013 and it's more or less the same story; those were actually a lot faster relative to consoles than what we have now, though, so I didn't use them.

My post is there to warn people not to spend too much expecting their cards to last very long by PC standards. Sure, they'll be fine if they play at 1080p/1440p or are fine with 30fps. Most people don't buy $500+ cards for that, though. I think I'm going to hear the word "unoptimized" a lot in the next two years.

It does matter if you're trying to compare them 1:1. Where are you sourcing your data? We have very few benchmarks for XSX/PS5 performance, and only the AMD slides for the 6900XT. You have no solid comparison and are conflating speculation with facts.

PC standards? If someone is expecting to be on top for the next 5 years, they know nothing about the progression of microchip technology and should educate themselves, because spending several hundred dollars or more on a PC component without a clue of what you're getting out of it is reckless. Not sure what optimization has to do with hardware, but in the previous years when "unoptimized" was spammed people didn't know what they were talking about then either.
 
Nov 2, 2017
2,275
It does matter if you're trying to compare them 1:1. Where are you sourcing your data? We have very few benchmarks for XSX/PS5 performance, and only the AMD slides for the 6900XT. You have no solid comparison and are conflating speculation with facts.
We have DF saying the XSX is on par with a 2080 in Gears 5. Based on what we know about RDNA2, we can also make a pretty accurate estimate: the XSX has 25% fewer TFLOPS than the 6800 on the same architecture. Like I said, that doesn't give you an exact performance difference between them, but it gives you a fairly confident estimate of what to expect.

Not sure what optimization has to do with hardware, but in the previous years when "unoptimized" was spammed people didn't know what they were talking about then either.
That's kind of my point yeah. These cards are going to struggle next gen and people will throw out the term "unoptimized" when their 3080/6800xt isn't able to hit 60fps at 1440p at console settings without really knowing what they're talking about.
 

Tacitus

Member
Oct 25, 2017
4,038
That's kind of my point yeah. These cards are going to struggle next gen and people will throw out the term "unoptimized" when their 3080/6800xt isn't able to hit 60fps at 1440p at console settings without really knowing what they're talking about.

With the consoles targeting native 4k, I think some complaining might be warranted.
 
Nov 2, 2017
2,275
With the consoles targeting native 4k, I think some complaining might be warranted.
AAA games aren't really going to target native 4K. It's going to be mostly dynamic or some form of upscaling like CB. It seems even cross gen games aren't hitting native 4K. Someone pixel counted the WD XSX footage and found 1440p, so it's probably a dynamic res that drops down to 1440p. We already know how it performs on a 3080, and if you take into account that a 2080-tier console runs this at 1440p/30fps in some locations, it isn't really surprising that the 3080 struggles.

You can argue it doesn't look good enough for how it performs, but it's not unoptimized relative to consoles. You can't expect a 3080 to hit 60fps even at 1440p in a game where the XSX runs it at 1440p/30fps.
 

Firefly

Member
Jul 10, 2018
8,633
We have DF saying the XSX is on par with a 2080 in Gears 5. Based on what we know about RDNA2, we can also make a pretty accurate estimate: the XSX has 25% fewer TFLOPS than the 6800 on the same architecture. Like I said, that doesn't give you an exact performance difference between them, but it gives you a fairly confident estimate of what to expect.
Gears 5 turned out to be dynamic, ranging from 1080p to 4K on Series X. 2080 can do much better than that.
 

Jroc

Banned
Jun 9, 2018
6,145
Heard only good things about their cards. Currently running an XFX card and it's running hot, too hot for my liking.

My Sapphire RX480 Nitro+ has been hot and loud since day one with some intense coil noise. Maybe I just got a bad model, but I'll definitely think twice before spending a lot of money on a "high-end" partner card in the future.
 

Tacitus

Member
Oct 25, 2017
4,038
AAA games aren't really going to target native 4K. It's going to be mostly dynamic or some form of upscaling like CB. It seems even cross gen games aren't hitting native 4K.

Huh. Ain't that some shit. I don't intend to buy the new consoles so I didn't know their claims were already broken.