
GravaGravity

Member
Oct 27, 2017
4,223
What were the comparisons between the 980 and the 980 Ti when they launched? I've been rocking my 980 Ti for about 5 years, and I'm kind of tempted by the 3080 Ti if I could get some similar longevity out of it.
 

Havel

Member
Oct 25, 2017
490
What were the comparisons between the 980 and the 980 Ti when they launched? I've been rocking my 980 Ti for about 5 years, and I'm kind of tempted by the 3080 Ti if I could get some similar longevity out of it.
Think the Ti was 30% faster than the 980. I upgraded from a Gigabyte G1 Gaming 980 Ti (which was roughly equal to a 1070 Ti, so heavily overclocked compared to the base 980 Ti) to a 3090 and I couldn't be happier. I imagine the 3080 Ti will be basically the same as a 3090, just with slightly less VRAM.
 

scabobbs

Member
Oct 28, 2017
2,103
What were the comparisons between the 980 and the 980 Ti when they launched? I've been rocking my 980 Ti for about 5 years, and I'm kind of tempted by the 3080 Ti if I could get some similar longevity out of it.
The 3080 is the pick this time around. The 3090 is about 10% faster than the 3080, so the 3080Ti will be almost exactly the same.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,201
Dark Space
Nvidia pretty much fucked up the product stack this time. The 3090 and 3080 Ti are pointless if they are minimal raster improvements, and just dangle higher VRAM for the fish to bite.

Should've just dropped a 3080 20GB, then waited until you could release a higher tier SKU that would offer at least a 20% performance boost along with whatever VRAM brings the boys and girls to the yard.

Having three cards within 15% of each other, but with a price variance of $800, is just... amateurish. This is why I still kinda don't believe these 3080 Ti rumors. Again, how can you release a better-value 3090 with 3090 orders still pending in the channel? This makes no sense from a business or PR perspective. You knew all along that AMD was going to compete and undercut your SKU, so why make the blunder of pricing at $1,499 in the first place?

Respond to AMD? You didn't forecast this from the beginning? Did someone fuck up?

Take the L.
 

dgrdsv

Member
Oct 25, 2017
11,846
Should've just dropped a 3080 20GB
That would be a lot more "fucked up" than what they plan currently.
A 3080 20GB wouldn't be able to land at $700 and would likely cost around $900 while providing about zero performance advantage over the 10GB model (has anyone seen even one benchmark where the 6800/XT gets ahead of the 3080 due to its VRAM size?)
A 3080 Ti with 12 or 20GB (my money is on 12 btw, but we'll see) makes more sense, as it will be faster than the 3080 because it will have more SMs enabled. This at least somewhat justifies the price hike over the 3080 10GB for those who are willing to pay for it.

Respond to AMD? You didn't forecast this from the beginning? Did someone fuck up?
The only way to not let AMD use the gaps in your lineup would be to launch after AMD. They chose to go first because they were confident enough in what they had on hand. And the results so far show that they were right.
 

Roytheone

Member
Oct 25, 2017
5,140
The non-Ti 3060 is also rumored for January, right? They will have quite a few cards out around January then, and probably none will have stock :(
 

Azai

Member
Jun 10, 2020
3,958
The 3080 is the pick this time around. The 3090 is about 10% faster than the 3080, so the 3080Ti will be almost exactly the same.

I don't think the 3080 Ti will be the same in performance. With the rumored increase of almost 2,000 CUDA cores it should see quite a jump. I agree that placing it between the 3090 and the 3080 is pretty weird, but either the selling point is 20GB VRAM, or it's more performance plus more VRAM. So if they really only give it 12GB with no noticeable performance increase, I would think it's pointless. Paying probably 300€ more just for 2 more GB of VRAM?! Especially since AMD offers more for a better price.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Nvidia pretty much fucked up the product stack this time. The 3090 and 3080 Ti are pointless if they are minimal raster improvements, and just dangle higher VRAM for the fish to bite.

Should've just dropped a 3080 20GB, then waited until you could release a higher tier SKU that would offer at least a 20% performance boost along with whatever VRAM brings the boys and girls to the yard.

Having three cards within 15% of each other, but with a price variance of $800, is just... amateurish. This is why I still kinda don't believe these 3080 Ti rumors. Again, how can you release a better-value 3090 with 3090 orders still pending in the channel? This makes no sense from a business or PR perspective. You knew all along that AMD was going to compete and undercut your SKU, so why make the blunder of pricing at $1,499 in the first place?

Respond to AMD? You didn't forecast this from the beginning? Did someone fuck up?

Take the L.
They're gonna flood the market like they did with Turing. The 16 series didn't have much variance once the Supers were out. The 20 series had shit like the 2060 Super and the 2070 on the shelf at the same time despite being functionally the same
 

Azai

Member
Jun 10, 2020
3,958
They're gonna flood the market like they did with Turing. The 16 series didn't have much variance once the Supers were out. The 20 series had shit like the 2060 Super and the 2070 on the shelf at the same time despite being functionally the same

Didn't the 2060 only have 6GB VRAM?
But yeah, although the 2070 had slightly better performance, the 2060 Super was basically just a cheaper 2070.
But they shot themselves in the foot by pricing the 20 series way too high. I remember the reports of Nvidia losing millions because no one bought them.
Now they've priced them cheaper, or at least for a better value, but no one can actually get them because there are none.
Really curious about the sales numbers.
Just because they are sold out everywhere doesn't mean they sell better than the 20 series.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Didn't the 2060 only have 6GB VRAM?
But yeah, although the 2070 had slightly better performance, the 2060 Super was basically just a cheaper 2070.
But they shot themselves in the foot by pricing the 20 series way too high. I remember the reports of Nvidia losing millions because no one bought them.
Now they've priced them cheaper, or at least for a better value, but no one can actually get them because there are none.
Really curious about the sales numbers.
Just because they are sold out everywhere doesn't mean they sell better than the 20 series.
The 2060 had 6GB, the 2060 Super had 8GB.

We don't have numbers, but their gaming hardware division posted an increase in revenue thanks to Ampere's launch, so they must have sold quite a few of them.
 

dgrdsv

Member
Oct 25, 2017
11,846
Really curious about the sales numbers.
Just because they are sold out everywhere doesn't mean they sell better than the 20 series.
They sell way better at launch than the 20 series did.

There's also this:

[image: GPU shipment chart]


Q3 2020 didn't see many Ampere shipments, as it ended on the 30th of September. But there's still no apparent change in shipment numbers here, which means Ampere must have been shipping in the same numbers as Turing had previously in the same niches.
 

Readler

Member
Oct 6, 2018
1,972
That would be a lot more "fucked up" than what they plan currently.
A 3080 20GB wouldn't be able to land at $700 and would likely cost around $900 while providing about zero performance advantage over the 10GB model (has anyone seen even one benchmark where the 6800/XT gets ahead of the 3080 due to its VRAM size?)
A 3080 Ti with 12 or 20GB (my money is on 12 btw, but we'll see) makes more sense, as it will be faster than the 3080 because it will have more SMs enabled. This at least somewhat justifies the price hike over the 3080 10GB for those who are willing to pay for it.
ComputerBase attributes the worse frametimes in Ghost Recon at 4k to precisely that; the 3070 becomes completely unplayable (<30 FPS).
www.computerbase.de — AMD Radeon RX 6800 und RX 6800 XT im Test: Die Taktraten, Benchmarks in Full HD, WQHD sowie Ultra HD und Raytracing

Still can't believe people are defending this.
 
Nov 2, 2017
2,275
And the 6800, which costs $80 more than the 3070, gets to 34 fps thanks to its 16GB of VRAM then? Wow, what a difference.
In this case you should be looking at the 0.2% minimums. The 3070 falls much further behind in these. On average the 6800 is 9% faster while it's 26% faster in minimums in Ghost Recon. The same happens to a lesser extent in the 3080 vs 6800 XT comparison.

It's likely that this is related to VRAM as it doesn't happen at 1080p.
 

dgrdsv

Member
Oct 25, 2017
11,846
In this case you should be looking at the 0.2% minimums. The 3070 falls much further behind in these. On average the 6800 is 9% faster while it's 26% faster in minimums in Ghost Recon.
Both are at 20-30 fps in the 0.2% minimums there, which is unplayable IMO, and in fact highlights that 16GB won't help N21 much since it doesn't have enough processing power to make full use of it anyway.

The same happens to a lesser extent in the 3080 vs 6800 XT comparison.
It's hard to say if that happens due to VRAM differences or renderer differences. This applies to both comparisons there: 6800 vs 3070 and XT vs 3080.
Note that AMD gets relatively higher 0.2% results in this benchmark even on cards which have the same VRAM capacities - like V64 vs 1080 or 5700XT vs 2070S. Thus it looks more like the renderer itself is doing something on NV cards which leads to lower minimum fps, and this isn't necessarily due to VRAM differences.
Another interesting example there is the RVII, which goes from being on par with the 1080 Ti in averages to being behind it in the 0.2% lows. That's a 16GB card vs an 11GB one, so clearly this isn't a VRAM deficit.
People are too quick to assign blame to VRAM whenever they see such results. But the truth is always a bit more complex and is usually a combination of many things, drivers included.
 

Readler

Member
Oct 6, 2018
1,972
And the 6800, which costs $80 more than the 3070, gets to 34 fps thanks to its 16GB of VRAM then? Wow, what a difference.
I must not have seen you talking about the price.
And I was talking about the frametimes. Nice goalposts though.

Both are at 20-30 fps in the 0.2% minimums there, which is unplayable IMO, and in fact highlights that 16GB won't help N21 much since it doesn't have enough processing power to make full use of it anyway.


It's hard to say if that happens due to VRAM differences or renderer differences. This applies to both comparisons there: 6800 vs 3070 and XT vs 3080.
Note that AMD gets relatively higher 0.2% results in this benchmark even on cards which have the same VRAM capacities - like V64 vs 1080 or 5700XT vs 2070S. Thus it looks more like the renderer itself is doing something on NV cards which leads to lower minimum fps, and this isn't necessarily due to VRAM differences.
Another interesting example there is the RVII, which goes from being on par with the 1080 Ti in averages to being behind it in the 0.2% lows. That's a 16GB card vs an 11GB one, so clearly this isn't a VRAM deficit.
People are too quick to assign blame to VRAM whenever they see such results. But the truth is always a bit more complex and is usually a combination of many things, drivers included.
None of the RDNA cards are between 20-30 FPS. And the debate is not about making full use of 16GB, rather that 10GB appears not to be sustainable for the next 4ish years.

Quite frankly I trust CB's judgement a lot more than yours.
And what do you even mean by "the renderer does something different on NV cards"? You're talking about like three completely different architectures there.
AMD got higher frametimes in the cases where it also had higher average FPS, so I don't know if that explanation holds up. The only exception to that is the RVII, which is definitely the odd one out.
The fact of the matter is that in no other case are the relative differences as large as they are here.
 

Kalik

Banned
Nov 1, 2017
4,523
Nvidia screwed up their launch lineup...releasing all these new/better cards so soon after the original cards screams desperation...the 3080 Ti sounds better than a 3090...plus the eventual Super variants possibly using TSMC sound like they'll be the definitive versions of the Ampere cards...
 

mephixto

Member
Oct 25, 2017
306
These websites create their own rumors just to debunk them a week later.

I don't see a 3080 Ti before June/July 2021
 

scabobbs

Member
Oct 28, 2017
2,103
I don't think the 3080 Ti will be the same in performance. With the rumored increase of almost 2,000 CUDA cores it should see quite a jump. I agree that placing it between the 3090 and the 3080 is pretty weird, but either the selling point is 20GB VRAM, or it's more performance plus more VRAM. So if they really only give it 12GB with no noticeable performance increase, I would think it's pointless. Paying probably 300€ more just for 2 more GB of VRAM?! Especially since AMD offers more for a better price.
What I was saying is that if the 3090 is 10% faster than the 3080... by simple math it would fall somewhere around 10% faster as well. The 3090 and 3080 Ti have the same CUDA core count. At best you're looking at a 10% increase in performance for $300 more MSRP.
 

Azai

Member
Jun 10, 2020
3,958
What I was saying is that if the 3090 is 10% faster than the 3080... by simple math it would fall somewhere around 10% faster as well. The 3090 and 3080 Ti have the same CUDA core count. At best you're looking at a 10% increase in performance for $300 more MSRP.

Yeah. Either way, the 3080 Ti looks like it will be in a tough spot for Nvidia to position.
 

Brandson

Member
Oct 26, 2017
2,219
I'd be fine with any of a 3080Ti, 3080, or 3070 as long as I can buy it at MSRP. I don't know that launching new models should be the focus though. Get way more cards into retail first.
 

Deleted member 35478

User-requested account closure
Banned
Dec 6, 2017
1,788
I have no idea how or why Nvidia is releasing another model. Nvidia can't make enough 3070/3080/3090s, same goes for AIBs, and now they're adding a 4th SKU, and worst of all it's an upper-tier model instead of a 3060, which at least would have made some sense lol. Wtf is even going on with this launch lol.
 

scitek

Member
Oct 27, 2017
10,054
Yeah. Either way, the 3080 Ti looks like it will be in a tough spot for Nvidia to position.

If I can find a 3080Ti at MSRP, and it's still hard to find a 3080, I'll take the Ti lol.

I edit video for a living, though, so the extra VRAM would be useful to me for more than just gaming.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
I have no idea how or why Nvidia is releasing another model. Nvidia can't make enough 3070/3080/3090s, same goes for AIBs, and now they're adding a 4th SKU, and worst of all it's an upper-tier model instead of a 3060, which at least would have made some sense lol. Wtf is even going on with this launch lol.
The 3060 Ti is coming next week.

The 3080 Ti, 3060, and 3050 Ti are coming in January, and the 3050 is coming later than that.

And then there are rumors of some kind of refresh next year, probably in H2.
 

dgrdsv

Member
Oct 25, 2017
11,846
I must not have seen you talking about the price.
And I was talking about the frametimes. Nice goalposts though.
You can't assess the performance without looking at its cost. So I'm not sure what you were talking about, but the price is always relevant.

None of the RDNA cards are between 20-30 FPS.
[screenshot: ComputerBase benchmark results]


Shall we argue how much of a difference there is between 30 and 30.8?

And the debate is not about making full use of 16GB, rather that 10GB appears not to be sustainable for the next 4ish years.
You weren't talking about 10GB either; you were talking about the 3070 while I was talking specifically about the 3080.
I had let this one slide, but since we're discussing the difference between 30 and 30.8 fps now, maybe we should get back to this?
There are no apparent VRAM-related issues observed on the 3080 in your link.

Quite frankly I trust CB's judgement a lot more than yours.
And what do you even mean "the renderer does something different on NV cards" - you are talking about like three completely different architectures there .
Exactly, and all of them are showing signs of worse frametime stability on NV h/w, no matter how much VRAM these cards have. What does that tell you?
There's even a 3090 there, and it too loses more performance in the 0.2% lows than the 6800 XT. Shall we attribute this one to VRAM as well?
Judgement and all is nice, but I'm just looking at the data provided by them, nothing more.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
It seems pointless given the supply issues with the existing cards.
Probably banking on "more SKUs = more people talking about Nvidia at any given time". Of course, when supply is good, they'll have a lot of options to go around at many different price points, gambling on AMD not having as many options.
 

asmith906

Member
Oct 27, 2017
27,358
Nvidia pretty much fucked up the product stack this time. The 3090 and 3080 Ti are pointless if they are minimal raster improvements, and just dangle higher VRAM for the fish to bite.

Should've just dropped a 3080 20GB, then waited until you could release a higher tier SKU that would offer at least a 20% performance boost along with whatever VRAM brings the boys and girls to the yard.

Having three cards within 15% of each other, but with a price variance of $800, is just... amateurish. This is why I still kinda don't believe these 3080 Ti rumors. Again, how can you release a better-value 3090 with 3090 orders still pending in the channel? This makes no sense from a business or PR perspective. You knew all along that AMD was going to compete and undercut your SKU, so why make the blunder of pricing at $1,499 in the first place?

Respond to AMD? You didn't forecast this from the beginning? Did someone fuck up?

Take the L.
It's the early adopter tax. They sold every 3090 they could produce for over $1,500 before AMD could get their cards out. Now that AMD has a card that can roughly compete on performance, they can release the 3080 Ti for a similar price and get people who might have considered switching. It's just a way to squeeze as much money as possible from their consumers.
 

SuperBoss

Member
Oct 25, 2017
6,520
It better fucking not lmao.

Couldn't find a 3080 for the life of me and just managed to get a 3070 with the intention of upgrading to a 3080 Ti. January is too soon, especially when you still can't get a 3080!
 

-Tetsuo-

Unlimited Capacity
Member
Oct 26, 2017
12,560
ComputerBase attributes the worse frametimes in Ghost Recon at 4k to precisely that; the 3070 becomes completely unplayable (<30 FPS).
www.computerbase.de — AMD Radeon RX 6800 und RX 6800 XT im Test: Die Taktraten, Benchmarks in Full HD, WQHD sowie Ultra HD und Raytracing

Still can't believe people are defending this.

But the 3080 has a better avg framerate than the 6800 XT while having basically the same 0.2% low?
 

Carbon

Deploying the stealth Cruise Missile
Member
Oct 27, 2017
10,846
It better fucking not lmao.

Couldn't find a 3080 for the life of me and just managed to get a 3070 with the intention of upgrading to a 3080 Ti. January is too soon, especially when you still can't get a 3080!
Don't worry, no one will be able to get 3080 Tis for months either. Even if the 3080 wasn't officially a paper launch, it's clear they're having severe production issues with some parts that will prevent them from filling the distribution channels with cards.

And the 3070 is a pretty great card unless you're pushing 4K.
 

Readler

Member
Oct 6, 2018
1,972
You can't assess the performance without looking at its cost. So not sure what you were talking about but the price is always relevant.
I was talking about how Ampere is already struggling at 4K due to its VRAM, as that is literally what you asked for in your original post. I was merely offering proof that the oft-repeated argument that the VRAM is enough (which is arguable for the 3080 with its wider bus and faster memory, but definitely not the case for the 3070) is already not necessarily true; I didn't even mention the 6800 lol.
If the 3070 is struggling right now, that's not a sign of good things to come. Nobody knows the future, but I thought it was obvious that I wasn't advocating for buying the 6800 only to gain 4 frames in that one specific game.

[screenshot: ComputerBase benchmark results]


Shall we argue how much of a difference there is between 30 and 30.8?
What? You said 20 to 30 yourself, saying that anything in that range is unplayable, which I agree with.

You weren't talking about 10GB either; you were talking about the 3070 while I was talking specifically about the 3080.
I had let this one slide, but since we're discussing the difference between 30 and 30.8 fps now, maybe we should get back to this?
There are no apparent VRAM-related issues observed on the 3080 in your link.
This is what I said:
"ComputerBase attributes the worse frametimes in Ghost Recon at 4k to precisely that; the 3070 becomes completely unplayable (<30 FPS)."
Those are two separate statements. The first one specifically addresses your point about there not being a game where RDNA2 gets ahead of the 3080 due to VRAM limitations, and the second one further shows that the 3070 with its 8 gigs is already outdated to a certain degree.
I was talking, and I'm sorry if that wasn't clear, about the VRAM situation in general for Ampere.

Exactly, and all of them are showing signs of worse frametime stability on NV h/w, no matter how much VRAM these cards have. What does that tell you?
There's even a 3090 there, and it too loses more performance in the 0.2% lows than the 6800 XT. Shall we attribute this one to VRAM as well?
Judgement and all is nice, but I'm just looking at the data provided by them, nothing more.
Alright, I actually didn't take a look at the 3090, but here's what I'm seeing:
3090 vs. 6800 XT: +25%/+20% at avg/0.2% - meaning a relative loss of 25%
3080 vs. 6800 XT: +12%/+5% at avg/0.2% - meaning a relative loss of ~60%
6800 vs. 3070: +9%/+26% at avg/0.2% - meaning a relative gain of ~190%

For the sake of the argument, let's assume that you're right about the worse stability on NV hardware and let's use that to explain the drop in performance for the 3090 (which obvs isn't VRAM bound) - aren't those results still odd, especially since the whole point of percentiles is to ignore single outliers?
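
To make the arithmetic behind those figures explicit, here's a rough sketch; the leads plugged in are the rounded percentages from the list above, so the outputs won't line up exactly with numbers derived from the raw fps values:

```python
# Back-of-envelope sketch: how much of an average-fps lead survives in the 0.2% lows.
# The leads used are the rounded percentages quoted above, not the raw fps data.

def lead_change(avg_lead_pct, low_lead_pct):
    """Relative change of one card's lead when going from average fps to 0.2% lows."""
    return (low_lead_pct - avg_lead_pct) / avg_lead_pct * 100.0

print(f"3090 vs 6800 XT: {lead_change(25, 20):+.0f}%")   # lead shrinks
print(f"3080 vs 6800 XT: {lead_change(12, 5):+.0f}%")    # lead shrinks by roughly 60%
print(f"6800 vs 3070:    {lead_change(9, 26):+.0f}%")    # lead grows by roughly 190%
```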

But the 3080 has a better avg framerate than the 6800 XT while having basically the same 0.2% low?
Yes, but whereas the relative performance stays the same between the cards, it drops for Ampere (see above).
 

dgrdsv

Member
Oct 25, 2017
11,846
For the sake of the argument, let's assume that you're right about the worse stability on NV hardware and let's use that to explain the drop in performance for the 3090 (which obvs isn't VRAM bound) - aren't those results still odd, especially since the whole point of percentiles is to ignore single outliers?
Not really. A percentile will show you the amount of time a card spends at an interval of fps, ignoring the lowest spikes.
The fact that all NV GPUs there suffer from worse low fps - not only relative to their averages but sometimes actually worse in absolute numbers too - than AMD GPUs hints at there being an issue with either the renderer or the driver.
Basically, I would be cautious about saying that this is a result of VRAM size differences here. The actual data they've provided doesn't back this up - at least not clearly.
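
For anyone unfamiliar with how these percentile lows are typically derived, here's a minimal illustration; whether ComputerBase averages the slowest 0.2% of frames or takes a single percentile value is an assumption here, not something stated in their review:

```python
# Illustrative only: one common way to compute a "0.2% low" fps figure from frametime samples.
import random

def low_fps(frametimes_ms, percent=0.2):
    """Average fps over the slowest `percent` of frames."""
    worst_n = max(1, int(len(frametimes_ms) * percent / 100))
    slowest = sorted(frametimes_ms)[-worst_n:]   # longest frametimes = worst frames
    return 1000.0 / (sum(slowest) / len(slowest))

# Made-up frametimes: mostly ~16.7 ms (60 fps) with a few 40-55 ms spikes
frames = [random.gauss(16.7, 1.0) for _ in range(5000)] + [42, 45, 50, 48, 55]
avg_fps = 1000.0 / (sum(frames) / len(frames))
print(f"avg fps: {avg_fps:.1f}, 0.2% low: {low_fps(frames):.1f}")
```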

I was talking, and I'm sorry if that wasn't clear, about the VRAM situation in general for Ampere.
The VRAM situation in general for Ampere is really simple: if you care about a card's longevity then the 6800/XT is the obviously better choice, as they are what the 7900 series was back at the last console generation's launch, and there's a good amount of confidence that they will age well. VRAM is just one part of why this is true.

The 30 series is a bit of uncharted waters here though, as generally speaking, sure, they are highly likely to run into their VRAM size limitations during this console generation. But it's hard to say how this alone will affect their longevity - will it make them actually unusable? Or will it be a case where you will be able to offset the lack of VRAM with better overall RT and shading performance (they do have more shading power - i.e. flops)? How will DirectStorage play into this once it comes to PC? Will next-gen games actually use more than 8-10GB coming straight from consoles with their 16GB total, or will this be limited to "PC-exclusive hi-res texture packs" where you have to zoom into screenshots to see the differences? Will DLSS help here, and how often, since it lowers overall VRAM usage when applied? Etc.
If NV had just gone with twice the VRAM sizes for the 30 series at the same prices then it would be a slam dunk, really. Why would you even buy a 6800 XT over a 3080 in that case? But it's also rather unlikely that a 30 series with double the VRAM would be able to launch at the same prices, so this should also be clear: you wouldn't be able to get 3080 performance at $700 right now if it came with 20GB. Would that additional cost and the resulting hit to perf/price be worth it for the majority of buyers? Doubtful.
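
As a rough illustration of that perf/price point (the $900 figure is just the guess from earlier in this thread, and identical performance between the two configurations is assumed):

```python
# Rough perf-per-dollar comparison of a hypothetical 3080 20GB vs the 10GB model.
# Assumes identical raster performance today and the ~$900 price guessed earlier
# in this thread; both are assumptions, not announced specs.
relative_perf = 100                 # index both cards to the same performance
price_10gb, price_20gb = 700, 900

value_10gb = relative_perf / price_10gb
value_20gb = relative_perf / price_20gb

hit = (1 - value_20gb / value_10gb) * 100
print(f"perf/$ hit from the extra VRAM: ~{hit:.0f}%")    # roughly 22% worse value
```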
So it remains to be seen how NV's gamble with VRAM size will play out. Saying that it's just not enough isn't true at the moment.
 

Readler

Member
Oct 6, 2018
1,972
Not really. A percentile will show you the amount of time a card spends at an interval of fps, ignoring the lowest spikes.
The fact that all NV GPUs there suffer from worse low fps - not only relative to their averages but sometimes actually worse in absolute numbers too - than AMD GPUs hints at there being an issue with either the renderer or the driver.
Basically, I would be cautious about saying that this is a result of VRAM size differences here. The actual data they've provided doesn't back this up - at least not clearly.
Agree to disagree then. I think the behaviour of the 3090 vs that of the 3080/70 shows (or at least implies) otherwise.
Obviously we need a larger sample size to back these claims up, so it'll be interesting to see things in a year or so from now.

The VRAM situation in general for Ampere is really simple: if you care about a card's longevity then the 6800/XT is the obviously better choice, as they are what the 7900 series was back at the last console generation's launch, and there's a good amount of confidence that they will age well. VRAM is just one part of why this is true.
Basically all I've been saying. Ampere is the best option for right now, but things might change two years down the line. And I know some people always get the latest and greatest, but I keep my cards for ~4 years at least so longevity definitely is a concern.

The 30 series is a bit of uncharted waters here though, as generally speaking, sure, they are highly likely to run into their VRAM size limitations during this console generation. But it's hard to say how this alone will affect their longevity - will it make them actually unusable? Or will it be a case where you will be able to offset the lack of VRAM with better overall RT and shading performance (they do have more shading power - i.e. flops)? How will DirectStorage play into this once it comes to PC? Will next-gen games actually use more than 8-10GB coming straight from consoles with their 16GB total, or will this be limited to "PC-exclusive hi-res texture packs" where you have to zoom into screenshots to see the differences? Will DLSS help here, and how often, since it lowers overall VRAM usage when applied? Etc.
If NV had just gone with twice the VRAM sizes for the 30 series at the same prices then it would be a slam dunk, really. Why would you even buy a 6800 XT over a 3080 in that case? But it's also rather unlikely that a 30 series with double the VRAM would be able to launch at the same prices, so this should also be clear: you wouldn't be able to get 3080 performance at $700 right now if it came with 20GB. Would that additional cost and the resulting hit to perf/price be worth it for the majority of buyers? Doubtful.
So it remains to be seen how NV's gamble with VRAM size will play out. Saying that it's just not enough isn't true at the moment.
All great questions.
I did a little write-up in one of the other threads (could we have like one central GPU OT? Please?) about how different and intriguing the respective approaches are this time around, and why it is so difficult to just recommend one card over the other (...a 20GB 3080 at the same price would have been the definitive choice though, agreed).

And no, my statement isn't true at the moment but I was always talking (or at least meaning to talk) about the future in my claims, with Ghost Recon possibly being a sign of things to come.

That is a good point, but would you really call that benchmark showcasing the 6800XT getting ahead of the 3080 because of VRAM, if at all?
Nope, it's not about getting ahead (so in that regard I may have worded myself poorly/incorrectly), but rather about a diminishing lead caused by VRAM usage.
Poor analogy time: buying a Ferrari and only using it on highways is a great idea, but if you intend to also use it on a dirt road it's dumb, even if it's still gonna be faster than going on foot.
 

dgrdsv

Member
Oct 25, 2017
11,846
And no, my statement isn't true at the moment but I was always talking (or at least meaning to talk) about the future in my claims, with Ghost Recon possibly being a sign of things to come.
We've had such "signs" for years now, with games requiring more than 8GB with high-res texture packs and such. Did they make a 1080 or 5700 unusable? Nope. Will this turn out to be different now because of a switch to a new console generation with double the RAM capacity? Nobody knows for sure. There are more changes at play now than just the VRAM size.
Personally, I think that unless you're aiming at native 4K with a 3070 - which is a tall order for the card regardless of its VRAM size - both the 3070 and 3080 will be fine at least until the next gen of NV GPUs is released. "Fine" as in there won't be more than a handful of games where they will suffer from a lack of VRAM - and all of these games will probably be AMD-sponsored to boot. IMO this isn't any different from how there will be a handful of NV-sponsored games over the same period where the 6800 series will plummet due to heavy use of ray tracing, for example.
Also, while the 3070's 8GB does feel a bit lower than the card should have had, the 10GB on the 3080 will likely be okay for a much longer time, to the point where it will be hard to find a game that has issues running on it solely due to its VRAM size until 2025 or so.
But again, nobody knows for sure, we are all just guessing.
 

p3n

Member
Oct 28, 2017
650
if you care about a card's longevity then the 6800/XT is the obviously better choice

If you care about a card's longevity, you don't buy a card that can run old games fast but craps itself once RT is enabled. The 6000 series is a very odd product, aimed at running old games at higher resolutions with higher framerates, but with zero viability in future games.
 

dgrdsv

Member
Oct 25, 2017
11,846
If you care about a card's longevity, you don't buy a card that can run old games fast but craps itself once RT is enabled. The 6000 series is a very odd product, aimed at running old games at higher resolutions with higher framerates, but with zero viability in future games.
Nobody expects RT to become a requirement to run a game in the next couple of years. It will remain a feature which you will be able to disable to get better performance - this seems to be the case even on consoles now. And even if some games require RT to be always on, they will likely use it lightly, to a point where both consoles and RDNA2 PC GPUs will be able to deal with it.
It's kinda similar to how you will always be able to lower settings or drop the resolution to free up VRAM on a 3070.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,201
Dark Space
That would be a lot more "fucked up" than what they plan currently.
A 3080 20GB wouldn't be able to land at $700 and would likely cost around $900 while providing about zero performance advantage over the 10GB model (has anyone seen even one benchmark where the 6800/XT gets ahead of the 3080 due to its VRAM size?)
A 3080 Ti with 12 or 20GB (my money is on 12 btw, but we'll see) makes more sense, as it will be faster than the 3080 because it will have more SMs enabled. This at least somewhat justifies the price hike over the 3080 10GB for those who are willing to pay for it.
I find it difficult to believe that people are seriously looking forward to a 3080 Ti that's ~10% faster and $300 more expensive.

Turing is called a failure by many and Nvidia didn't even pull that kinda shit in that lineup.

This is madness.