What were the comparisons between the 980 and the 980ti when they launched? I've been rocking my 980ti for about 5 years and I'm kind of tempted by the 3080ti, if I could get some similar longevity out of it.
Think the Ti was 30% faster than the 980. I upgraded from a Gigabyte G1 Gaming 980 Ti (which was roughly equal to a 1070 Ti, so heavily overclocked compared to the base 980 Ti) to a 3090 and I couldn't be happier. I imagine the 3080 Ti will be basically the same as a 3090, just with slightly less VRAM.
The 3080 is the pick this time around. The 3090 is about 10% faster than the 3080, so the 3080Ti will be almost exactly the same.
That would be a lot more "fucked up" than what they plan currently.
The only way to not let AMD use the gaps in your lineup would be to launch after AMD. They've chosen to go first because they were confident enough in what they had on hand. And the results so far show that they were right.
Nvidia pretty much fucked up the product stack this time. The 3090 and 3080 Ti are pointless if they are minimal raster improvements, and just dangle higher VRAM for the fish to bite.
Should've just dropped a 3080 20GB, then waited until you could release a higher tier SKU that would offer at least a 20% performance boost along with whatever VRAM brings the boys and girls to the yard.
Having three cards within 15% of each other, but a price variance of $800, is just... amateurish. This is why I still kinda don't believe these 3080 Ti rumors. Again, how can you release a better value 3090, with 3090 orders still pending in the channel? This makes no sense from a business or PR perspective. You knew all along that AMD was going to compete and undercut your SKU, so why make the blunder of pricing at $1,499 in the first place?
Respond to AMD? You didn't forecast this from the beginning? Did someone fuck up?
Take the L.
They're gonna flood the market like they did with Turing. The 16 series didn't have much variance once the Supers were out. The 20 series had shit like the 2060 Super and the 2070 on the shelf at the same time despite being functionally the same.
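For what it's worth, the "three cards within 15% of each other with an $800 price spread" complaint is easy to put in numbers. A quick sketch using launch MSRPs and the rough relative-performance figures claimed in this thread (the 3080 Ti price and performance are rumors, not confirmed):

```python
# Rough perf-per-dollar comparison. Relative performance numbers are the
# rough claims from this thread (3090 ~10% faster than the 3080, 3080 Ti
# rumored just below the 3090); the $999 for the 3080 Ti is also a rumor.
cards = {
    "3080":    {"price": 699,  "rel_perf": 1.00},
    "3080 Ti": {"price": 999,  "rel_perf": 1.09},  # rumored
    "3090":    {"price": 1499, "rel_perf": 1.10},
}

for name, c in cards.items():
    value = c["rel_perf"] / c["price"] * 1000  # perf points per $1000
    print(f"{name}: {value:.2f} perf per $1000 spent")
```

On those numbers the value roughly halves from the 3080 to the 3090, which is the "amateurish" stack being complained about.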
Didn't the 2060 only have 6GB VRAM?
The 2060 had 6GB, the 2060 Super had 8GB.
But yeah, although the 2070 had slightly better performance, the 2060 Super was basically just a cheaper 2070.
But they shot themselves in the foot by pricing the 20 series way too high. I remember the reports of Nvidia losing millions because no one bought them.
Now they've priced them cheaper, or at least at a better value, but no one can actually get them because there are none.
Really curious about the sales numbers.
Just because they are sold out everywhere doesn't mean they sell better than the 20 series.
They sell way better at launch than the 20 series did.
ComputerBase attributes the worse frametimes in Ghost Recon at 4k to precisely that; the 3070 becomes completely unplayable (<30 FPS).
A 3080 20GB wouldn't be able to land at $700 and would likely cost around $900, while providing about zero performance advantage over the 10GB model (has anyone seen even one benchmark where the 6800/XT gets ahead of the 3080 due to its VRAM size?).
A 3080Ti with 12 or 20 GBs (my money is on 12 btw but we'll see) makes more sense, as it will be faster than the 3080 because it will have more SMs enabled. This at least somewhat justifies the price hike over the 3080 10GB for those who are willing to pay for it.
And the 6800, which costs $80 more than the 3070, gets to 34 fps thanks to its 16GB of VRAM then? Wow, what a difference.
In this case you should be looking at the 0.2% minimums. The 3070 falls much further behind in these. On average the 6800 is 9% faster while it's 26% faster in minimums in Ghost Recon. The same happens to a lesser extent in the 3080 vs 6800XT comparison.
Both are at 20-30 fps 0.2% minimums there, which is unplayable IMO, and in fact highlights that 16GBs won't help N21 much since it doesn't have enough processing power to make full use of them anyway.
It's hard to say if that happens due to VRAM differences or renderer differences. This applies to both comparisons there, 6800 vs 3070 and XT vs 3080.
I must not have seen you talking about the price.
None of the RDNA cards are between 20-30 FPS. And the debate is not about making full use of 16GBs, rather that 10GBs appear to not be sustainable for the next 4ish years.
Note that AMD gets relatively higher 0.2% results in this benchmark even on cards which have the same VRAM capacities - like V64 vs 1080 or 5700XT vs 2070S. Thus it looks more like the renderer itself is doing something on NV cards which leads to lower minimum fps, and this isn't necessarily due to VRAM differences.
Another interesting example there is the RVII, which goes from being on par with the 1080Ti in averages to being behind it in 0.2%. That's a 16GB card vs an 11GB one, so clearly this isn't a VRAM deficit.
People are too quick to assign blame to VRAM whenever they see such results. But the truth is always a bit more complex and is usually a combination of many things, drivers included.
I don't think the 3080Ti will be the same in performance. With the rumored almost 2000 CUDA core increase it should see quite a jump. I agree that placing it between the 3090 and the 3080 is pretty weird, but either the selling point is 20GB VRAM or more performance plus more VRAM. So if they really only give it 12GB with no noticeable performance increase I would think it's pointless. Paying probably 300€ more just for 2 more GB of VRAM?! Especially since AMD offers more for a better price.
What I was saying is: if the 3090 is 10% faster than the 3080, then by simple math the 3080Ti would fall somewhere around 10% faster as well. The 3090 and 3080Ti have roughly the same CUDA core count. At best you're looking at a 10% increase in performance for $300 more MSRP.
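The "simple math" here can be sketched with the core counts floating around this thread (8704 for the 3080, a rumored 10240 for the 3080 Ti, 10496 for the 3090 - and note that real game performance scales well below this naive upper bound):

```python
# Naive scaling estimate from CUDA core counts. The 3080 Ti figure is
# the rumored one from this thread; actual performance scales sublinearly
# with core count, so these ratios are upper bounds, not predictions.
cores = {"3080": 8704, "3080 Ti (rumored)": 10240, "3090": 10496}

base = cores["3080"]
for name, n in cores.items():
    print(f"{name}: {n / base:.1%} of the 3080's cores")
```

Even the naive ceiling puts the rumored 3080 Ti within ~2.5% of the 3090's core count, which is why "basically a 3090 with less VRAM" is a reasonable read.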
Yeah, either way the 3080Ti looks like it will be in a tough spot.
I have no idea how or why Nvidia is releasing another model. Nvidia can't make enough 3070/3080/3090s, same goes for AIBs, and now they're adding a 4th SKU - and worst of all it's an upper tier model; a 3060 at least would have made some sense lol. Wtf is even going on with this launch lol.
You can't assess the performance without looking at its cost. So I'm not sure what you were talking about, but the price is always relevant.
And I was talking about the frametimes. Nice goalposts though.
You weren't talking about 10GBs either; you were talking about the 3070 while I was talking about the 3080 precisely.
Quite frankly I trust CB's judgement a lot more than yours.
Exactly, and all of them are showing signs of worse frametime stability on NV h/w, no matter how much VRAM these cards have. What does that tell you?
And what do you even mean by "the renderer does something different on NV cards"? You're talking about like three completely different architectures there.
the 3060 Ti is coming next week
the 3080 Ti, 3060, 3050 Ti are coming in January. and the 3050 is coming later than that.
and then there's rumors of some kind of refresh next year, probably in H2
Does this mean I can finally get my hands on a 3080? I kind of stopped looking a couple weeks ago.
It seems pointless given the supply issues with the existing cards.
Probably banking on "more SKUs = more people talking about Nvidia at any given time". Of course when supply is good, they'll have a lot of options to go around at many different price points, gambling on AMD not having as many options.
It's the early adopter tax. They sold every 3090 they could produce for over $1,500 before AMD could get their cards out. Now that AMD has a card that can roughly compete on performance, they can release the 3080 Ti for a similar price and get people who might have considered switching. It's just a way to squeeze as much money as possible from their consumers.
Got it, so glad I did not end up selling my 2070 Super. Hope it gets me through Cyberpunk at 1440p/60 on mixed settings.
AMD Radeon RX 6800 and RX 6800 XT review: clock rates, benchmarks in Full HD, WQHD and Ultra HD, and raytracing - www.computerbase.de
Still can't believe people are defending this.
Yep, pretty much, it's a shame really as I wanted to get a new gpu from them but who knows when that would happen now.
It better fucking not lmao.
Don't worry, no one will be able to get 3080tis for months either. Even if the 3080 wasn't officially a paper launch, it's clear they're having severe production issues with some parts that will prevent them from filling the distribution channels with cards.
Couldn't find a 3080 for my life and just managed to get a 3070 with the intention of upgrading to 3080ti. January is too soon, especially when you still can't get a 3080!
I was talking about how Ampere is already struggling at 4k due to its VRAM, as that is literally what you asked about in your original post. I was merely offering proof of how the oft-repeated argument that the VRAM is enough (which is arguable for the 3080 with its wider bus and faster memory, but definitely not for the 3070) is already not necessarily true. I didn't even mention the 6800 lol.
What? You said 20 to 30 yourself, saying that anything in that range is unplayable, which I agree with.
Shall we argue how much of a difference there is between 30 and 30.8?
This is what I said:
I had let this one slide but since we're discussing the difference between 30 and 30.8 fps now maybe we should get back to this?
There are no apparent VRAM related issues observed on 3080 in your link.
Alright, I actually didn't take a look at the 3090 but here's what I'm seeing:
There's a 3090 there even and it too loses more performance in 0.2% than 6800XT. Shall we attribute this one to VRAM as well?
Judgement and all are nice but I'm just looking at the data provided by them, nothing more.
But the 3080 has a better avg framerate than the 6800XT while having basically the same 0.2% low?
For the sake of the argument, let's assume that you're right about the worse stability on NV hardware and let's use that to explain the drop in performance for the 3090 (which obviously isn't VRAM bound) - aren't those results still odd, especially since the whole point of percentiles is to ignore single outliers?
Not really. A percentile will show you the amount of time a card spends at an interval of fps while ignoring the lowest spikes.
I was talking, and I'm sorry if that wasn't clear, about the VRAM situation in general for Ampere.
The VRAM situation in general for Ampere is really simple: if you care about a card's longevity then the 6800/XT is the obviously better choice, as they are what the 7900 series was back at the last console generation's launch, and there's a good amount of confidence that they will age well. VRAM is just one part of why this is true.
Yes, but whereas the relative performance stays the same between the cards, it drops for Ampere (see above).
Agree to disagree then. I think the behaviour of the 3090 vs that of the 3080/70 shows (or at least implies) otherwise.
The fact that all NV GPUs there suffer from worse lower fps - not only in relation to averages but sometimes actually worse in absolute numbers too - than AMD GPUs hints at there being an issue with either the renderer or the driver.
Basically, I would be cautious of saying that this is a result of VRAM size differences here. The actual data they've provided doesn't back this up - at least clearly.
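For context on the metric being argued over: "0.2% lows" are typically derived from a recorded list of per-frame times, roughly like this (a sketch of the general approach, not ComputerBase's exact methodology; the sample capture is made up):

```python
# Average fps over the slowest pct% of frames in a capture.
def percentile_low(frametimes_ms, pct):
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * pct / 100))      # e.g. 0.2% of all frames
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms                       # convert ms back to fps

# Hypothetical capture: ~60 fps frames plus a handful of 40 ms stutters.
frames = [16.7] * 995 + [40.0] * 5
print(round(percentile_low(frames, 0.2), 1))     # prints 25.0
```

This also shows why the 0.2% figure is so sensitive: with a 1000-frame capture it averages just the 2 slowest frames, so a few stutters (from VRAM swapping, the driver, or the renderer) dominate it even when the average fps looks fine.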
Basically all I've been saying. Ampere is the best option for right now, but things might change two years down the line. And I know some people always get the latest and greatest, but I keep my cards for ~4 years at least, so longevity definitely is a concern.
The 30 series is a bit of uncharted waters here though. Generally speaking, sure, they are highly likely to run into their VRAM size limitations during this console generation. But it's hard to say how this alone will affect their longevity - will it make them actually unusable? Or will it be a case where you can offset the lack of VRAM with better overall RT and shading performance (they do have more shading power - i.e. flops)? How will DirectStorage play into this once it comes to PC? Will next gen games actually use more than 8-10 GBs coming straight from consoles with their 16 GBs total, or will this be limited to "PC exclusive hi-res texture packs" where you have to zoom into screenshots to see the differences? Will DLSS help here, and how often, since it lowers overall VRAM usage when applied? Etc.
If NV would just go with twice the VRAM sizes for the 30 series at the same prices then it would be a slam dunk really. Why would you even buy a 6800XT over a 3080 in this case? It's also rather unlikely that a 30 series with double the VRAM would be able to launch at the same prices, so this should also be clear: you wouldn't be able to get 3080 performance at $700 right now if it came with 20GBs. Would that additional cost and the resulting hit to perf/price be worth it for the majority of buyers? Doubtful.
So it remains to be seen how NV's gamble with VRAM size will play out. Saying that it's just not enough isn't true at the moment.
All great questions.
That is a good point, but would you really call that benchmark showcasing the 6800XT getting ahead of the 3080 because of VRAM, if at all?
Nope, it's not about getting ahead (so in that regard I may have worded myself poorly/incorrectly), but rather about a diminishing lead caused by VRAM usage.
And no, my statement isn't true at the moment, but I was always talking (or at least meaning to talk) about the future in my claims, with Ghost Recon possibly being a sign of things to come.
We've had such "signs" for years now, with games requiring more than 8GBs with hi-res texture packs and such. Did they make a 1080 or 5700 unusable? Nope. Will this turn out to be different now because of a switch to a new console generation with double the RAM capacity? Nobody knows for sure. There are more changes at play now than just the VRAM size.
if you care about a card's longevity then 6800/XT is the obviously better choice
If you care about a card's longevity you don't buy a card that can run old games fast but craps itself once RT is enabled. The 6000 series is a very odd product, aimed at running old games at higher resolutions with higher framerates but with zero viability in future games.
Nobody expects RT to become a requirement to run a game in the next couple of years. It will remain a feature which you will be able to disable to get better performance - this seems to be the case even on consoles now. And even if some games require RT to be always on, they will likely use it lightly, to a point where both consoles and RDNA2 PC GPUs will be able to deal with it.
I find it difficult to believe that people are seriously looking forward to a 3080 Ti that's ~10% faster and $300 more expensive.
Sure. Which is why I don't really understand why people are waiting for this card.