
dgrdsv

Member
Oct 25, 2017
11,846
because the clock speeds on thermally constrained consoles like the PS5 are insane for RDNA 2
Are they? RDNA1 in the form of Navi 10 hits 2.1 GHz pretty consistently.
I mean, sure, there must be _some_ perf/watt gain or it would be nearly impossible to have what is claimed for PS5/XSX in a realistic power budget.
But don't let the slides fool you - anything any vendor says about their own products isn't proof of anything. We'll have to wait until actual h/w is available to know for sure.
 

eonden

Member
Oct 25, 2017
17,078
I mean, a 50% increase in performance per watt from AMD would put them close to Nvidia's current position... before Nvidia moves to a smaller node (which should be more power efficient). Yes, Nvidia is destroying AMD in power efficiency on a much bigger node.

www.tomshardware.com

Graphics Card Power Consumption and Efficiency Tested

We tested over fifty graphics cards to see which give the best performance per watt.

In the overall standings, where less power is better, it's pretty easy to see how far AMD fell behind Nvidia prior to the Navi generation GPUs. The various Vega and Polaris AMD cards use significantly more power than their Nvidia counterparts. Even now, with 7nm Navi GPUs going up against 12nm GPUs, AMD is only roughly equal to Nvidia. How things will change with upcoming Nvidia Ampere and AMD Big Navi launches is something we're definitely looking forward to seeing, what with AMD's claims of 50% improvements in performance per watt with RDNA 2.

I also have my doubts about AMD having a better ray tracing solution than gen 1 RTX, given the gigantic lead in GPU machine learning that Nvidia has over AMD.
 

Luigi87

One Winged Slayer
Member
Oct 25, 2017
5,102
Since the 3080 and up will be out of my price range, I'll be curious to see the 3070. I plan to go from a 1070 to it.
 

Komo

Info Analyst
Verified
Jan 3, 2019
7,110
30% gains for Turing prices? So much for the hype.

Get the fuck out of here Nvidia.
I mean, this probably runs RT much faster and has a massive number of other improvements, because 3DMark scores don't mean everything. Plus this is probably the stock 3080 card, and it will only be improved by aftermarket clocks.

I'd wait until it actually comes out before accusing Nvidia of over-hyping their cards.
 

Nine_Ball

Member
Jun 12, 2020
116
So, let's assume the best-case scenario (that 50% power efficiency jump): now AMD is finally on par... with a two-year-old Nvidia architecture, without counting the tensor and ray tracing cores of the latter. Congrats?


Let's assume hypothetically that the 50% performance-per-watt increase figure is true for RDNA2: it would give Big Navi (with 80 compute units and a 2 GHz base clock) around 30-40% more performance than the 2080 Ti.

Obviously we are not sure about anything with Big Navi; even the 80 compute unit figure is only a rumor.

Anyway, like I said before, I'm pretty sure Nvidia has something great in the works with Ampere, and I think they'll keep their performance crown with this new generation of cards.
But if AMD can offer more competitive cards, NVIDIA will have to align their pricing with AMD's, and that's what we want as consumers.
 

Midgarian

Alt Account
Banned
Apr 16, 2020
2,619
Midgar
Me with my three-month-old 2080 Ti:

 

Sqrt

Member
Oct 26, 2017
5,880
Let's assume hypothetically that the 50% performance-per-watt increase figure is true for RDNA2: it would give Big Navi (with 80 compute units and a 2 GHz base clock) around 30-40% more performance than the 2080 Ti.

And now shrink the 2080 Ti to 7nm, which is the least we can expect from Ampere.

Obviously we are not sure about anything with Big Navi; even the 80 compute unit figure is only a rumor.

Anyway, like I said before, I'm pretty sure Nvidia has something great in the works with Ampere, and I think they'll keep their performance crown with this new generation of cards.
But if AMD can offer more competitive cards, NVIDIA will have to align their pricing with AMD's, and that's what we want as consumers.

Having AMD compete with Nvidia again would only be good for us consumers, but that is far from a given, sadly.
 

Isee

Avenger
Oct 25, 2017
6,235
Me with a 1.5-year-old 2080 Ti, waiting for a 3080 Ti.



But in all honesty, you are worrying too much. The 2080 Ti won't suddenly become irrelevant from a performance perspective.
 

Midgarian

Alt Account
Banned
Apr 16, 2020
2,619
Midgar
If you don't need top of the line all the time, then you can ride it out until Hopper (the 4000 series).
It was my first ever Gaming PC build, so I think I'll be content.

My intention behind going for the Ti was to absolutely future-proof it for a good five or so years, so I don't have to worry about settings for as long as possible and can just crank everything to max on a 1440p/144 Hz monitor.
 

Nine_Ball

Member
Jun 12, 2020
116
And now shrink the 2080 Ti to 7nm, which is the least we can expect from Ampere.

I'm not bringing up AMD to diminish what NVIDIA has achieved so far. They are clearly the leader when it comes to GPUs.


Having AMD compete with Nvidia again would only be good for us consumers, but that is far from a given, sadly.

From what I've seen of the next-gen consoles, I'm pretty sure RDNA2 is the real deal. Not necessarily the next leader in the GPU world, but at least enough to bring competition back to the high-end market.
 

Spehornoob

Member
Nov 15, 2017
8,938
Bloody hell I only just bought a 2070 Super :( Oh well, time to start saving again.
Me too!

I don't intend to play above 1080p for a while. I really don't aspire to 4K right now. For anyone who knows more about hardware than I do: should a 2070 Super keep me playing at 1080p/60 FPS for a good couple of years at least?
 

Spoit

Member
Oct 28, 2017
3,976
Let's assume hypothetically that the 50% performance-per-watt increase figure is true for RDNA2: it would give Big Navi (with 80 compute units and a 2 GHz base clock) around 30-40% more performance than the 2080 Ti.
How does that work out? The 5700xt is only about as strong as a 2070 super, which is like a good 20-30% weaker than a 2080ti. I don't see how a 50% boost on top of that gets to 30-40% better than one? Unless you're expecting it to draw significantly more power?
 

jett

Community Resettler
Member
Oct 25, 2017
44,653
Do some of you really think AMD is going to even attempt to compete in the high end market again? They've left that to Nvidia for so long I don't see the point. They'll continue to offer cheaper alternatives for mid and low-end and that's frankly fine to me.
 

Phinor

Member
Oct 27, 2017
1,236
I'd bet a dollar on this being the Ti/3090 model if the result is real and the recent rumours are somewhat accurate. Going to 7nm by itself does nothing for performance. What it does is allow you to put more stuff into that same amount of space (or alternatively go for decreased power consumption). The rumour is that the 3090 has around 5248 CUDA cores and was benchmarked at ~1900MHz; the 2080 Ti FE has 4352 cores and typically benchmarks at around 1800MHz. ~20% more cores, a bit more MHz, small IPC improvements, and pre-release drivers: around 30-35% seems very much in the ballpark. It's not like the Turing architecture is suddenly poor and they can magically squeeze 30% extra just by architecture improvements at this point. Maybe AMD could do that because they were well behind in terms of high-end performance.

Everyone is expecting big improvements with ray tracing, so they also need to have room for that on the board/die, but the die size of the 2080 Ti was already massive; I doubt they are going for another die of that size. Definitely not bigger than that. Also, while the node change helps with power consumption, there's now so much new stuff being added that whatever breathing room they gained by going to 7nm is suddenly all gone. It's not like they can go for increased power consumption vs the 2080 Ti either; that thing was already at the limit of what's reasonable for consumer cards.

So add some extra MHz on core/memory (which isn't nearly as easy as it used to be with node changes; these things are so complex) and small driver improvements, and that 30-35% is suddenly in the 40-45% range, which to me seems pretty good for 2020, all things considered. I am, however, expecting much bigger improvements with ray tracing.
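
As a rough sanity check, here's a back-of-the-envelope sketch of where that 30-35% figure falls out of the leaked numbers; the core counts, clocks, and IPC uplift below are rumored or assumed values, not confirmed specs:

```python
# Naive scaling estimate: performance ~ cores * clock, plus a small assumed IPC uplift.
# All "rumored_" values come from the leak discussed above and may well be wrong.

rumored_cores = 5248      # rumored CUDA core count for the benchmarked card
rumored_clock = 1900      # MHz, clock reportedly seen in the benchmark
cores_2080ti = 4352       # 2080 Ti FE CUDA cores
clock_2080ti = 1800       # MHz, typical 2080 Ti FE benchmark clock

ipc_uplift = 1.05         # assumed small architectural (IPC) improvement, ~5%

core_clock_scaling = (rumored_cores / cores_2080ti) * (rumored_clock / clock_2080ti)
estimate = core_clock_scaling * ipc_uplift

print(f"core*clock scaling vs 2080 Ti: {core_clock_scaling:.2f}x")  # ~1.27x
print(f"with ~5% IPC uplift:           {estimate:.2f}x")            # ~1.34x, i.e. roughly 30-35% faster
```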
 

Nzyme32

Member
Oct 28, 2017
5,245
The 2070S has been really good so far. At the moment I'm playing at 1080p, so it just eats up most games easily at max settings. I'm getting RDR2 soon to see how it performs.

But yeah, I hope the 3080 Ti is a big jump.

Yeah, the 2070S is pretty overkill for 1080p, but it's well suited for downsampling!

I've been playing at 1440p with a variable refresh monitor, generally targeting the 100-144 Hz range, which I absolutely love. So I'm looking to bolster that a bit as we transition to the new console gen, which should bring more capable games across both standard and VR titles. Then hopefully I won't need to change anything up for 3 or 4 years.
 

Nine_Ball

Member
Jun 12, 2020
116
How does that work out? The 5700xt is only about as strong as a 2070 super, which is like a good 20-30% weaker than a 2080ti. I don't see how a 50% boost on top of that gets to 30-40% better than one? Unless you're expecting it to draw significantly more power?


The 5700 XT has 40 compute units, whereas many rumors state that Big Navi should have 80 compute units. I also use 2 GHz as the base clock because I think AMD can hit this target with sufficient cooling.
Again, AMD could have problems attaining those frequencies with the final retail unit, and "Big Navi" could also ship with "only" 64 compute units if the rumors aren't true.
So I'm just presenting the very best-case scenario here.

And the 30-40% improvement in performance over the 2080 Ti here is only an estimate based on those numbers.
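
For what it's worth, here's a minimal sketch of that best-case arithmetic; the 80 CU count, the 2 GHz clock, the CU scaling efficiency, and the ~1.3x gap between a 2080 Ti and a 5700 XT are all rumors or rough ballpark assumptions:

```python
# Best-case Big Navi estimate from CU count and clock, relative to a 5700 XT,
# then relative to a 2080 Ti. Every "rumored_" or "assumed_" value is speculative.

cus_5700xt, clock_5700xt = 40, 1.9        # 5700 XT: 40 CUs, ~1.9 GHz game clock
rumored_cus, rumored_clock = 80, 2.0      # rumored Big Navi: 80 CUs at 2 GHz

assumed_scaling = 0.85                    # doubling CUs rarely doubles real-world performance
gap_2080ti_vs_5700xt = 1.3                # rough ballpark: 2080 Ti is ~30% faster than a 5700 XT

vs_5700xt = (rumored_cus / cus_5700xt) * (rumored_clock / clock_5700xt) * assumed_scaling
vs_2080ti = vs_5700xt / gap_2080ti_vs_5700xt

print(f"~{vs_5700xt:.2f}x a 5700 XT")   # ~1.79x
print(f"~{vs_2080ti:.2f}x a 2080 Ti")   # ~1.38x, i.e. in the 30-40% range
# The claimed 50% perf/watt gain is what would make this plausible within a sane power budget.
```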
 

UF_C

Member
Oct 25, 2017
3,347
I just don't see how this is the 3080. Unless Nvidia continues to make these cards on huge dies (like they did with the 2000 series, which also made them more expensive), I don't see the 3080 Ti being more than a 30% improvement.

A 30% increase is pretty typical for Nvidia. Only a few times have they gone beyond that (the 1080 Ti being the most recent... that was 60%).

Yes, we have a die shrink, but if Nvidia is going back to its typical ~550mm² size, the benefit of the shrink will be negligible. Adding more transistors and shaders does not scale linearly, especially if you aren't willing to, or can't, increase clock speeds.

The real gains from this card will be improved ray tracing. I think we will see a huge upgrade in that area, and that is what Nvidia will be pushing come the reveal.

While I hope this is the 3080, my guess is this is the 3080 Ti or 3090 or whatever you want to call it.
 

Rpgmonkey

Member
Oct 25, 2017
1,348
A 2080Ti, give or take, at the 3070 level with further increases from there seems decent.

I'm feeling like we're at a point in the overall gaming cycle where a 2080 Ti is still going to be really good for pretty much everything released between now and Hopper's launch in 2022 or so, and one of the high-end cards then will probably be enough to comfortably glide through the rest of next-gen, so I'm going to hold off personally.

Really curious to see what kind of performance increase they can get with Ray Tracing though.
 

Flash

Member
Oct 27, 2017
377
You're being weirdly hostile about this. We'll all find out when the cards appear but the consensus seems to be this is indeed the 3080. But it really doesn't matter...
I guess sarcasm isn't allowed anymore :(

Also, I would be the first one jumping up and down if it's a 3080 that outperforms a 2080 Ti by that much. I am getting the 3080 Ti or the 3090 Ti/Super regardless of which card is in the benchmarks. It just seems people's expectations are out of whack, lol. Expecting the 3070 to outperform the 2080 Ti and cost half as much is unrealistic, and there is no harm in calling it out for its ridiculousness.
 

Bosch

Banned
May 15, 2019
3,680
I'd bet a dollar on this being the Ti/3090 model if the result is real and the recent rumours are somewhat accurate. Going to 7nm by itself does nothing for performance. What it does is allow you to put more stuff into that same amount of space (or alternatively go for decreased power consumption). The rumour is that the 3090 has around 5248 CUDA cores and was benchmarked at ~1900MHz; the 2080 Ti FE has 4352 cores and typically benchmarks at around 1800MHz. ~20% more cores, a bit more MHz, small IPC improvements, and pre-release drivers: around 30-35% seems very much in the ballpark. It's not like the Turing architecture is suddenly poor and they can magically squeeze 30% extra just by architecture improvements at this point. Maybe AMD could do that because they were well behind in terms of high-end performance.

Everyone is expecting big improvements with ray tracing, so they also need to have room for that on the board/die, but the die size of the 2080 Ti was already massive; I doubt they are going for another die of that size. Definitely not bigger than that. Also, while the node change helps with power consumption, there's now so much new stuff being added that whatever breathing room they gained by going to 7nm is suddenly all gone. It's not like they can go for increased power consumption vs the 2080 Ti either; that thing was already at the limit of what's reasonable for consumer cards.

So add some extra MHz on core/memory (which isn't nearly as easy as it used to be with node changes; these things are so complex) and small driver improvements, and that 30-35% is suddenly in the 40-45% range, which to me seems pretty good for 2020, all things considered. I am, however, expecting much bigger improvements with ray tracing.
Lol, a 3090 with just a 30% increase over a two-year-old GPU?

Turing hurt Nvidia users badly.

We are looking at a real jump here. It's a 3080.
 

PHOENIXZERO

Member
Oct 29, 2017
12,065
Some of you really need to get over the Turing hang-ups; the situation in 2020 is going to be significantly different from 2018.

I just don't see how this is the 3080. Unless Nvidia continues to make these cards on huge dies (like they did with the 2000 series, which also made them more expensive), I don't see the 3080 Ti being more than a 30% improvement.

A 30% increase is pretty typical for Nvidia. Only a few times have they gone beyond that (the 1080 Ti being the most recent... that was 60%).

Yes, we have a die shrink, but if Nvidia is going back to its typical ~550mm² size, the benefit of the shrink will be negligible. Adding more transistors and shaders does not scale linearly, especially if you aren't willing to, or can't, increase clock speeds.

The real gains from this card will be improved ray tracing. I think we will see a huge upgrade in that area, and that is what Nvidia will be pushing come the reveal.

While I hope this is the 3080, my guess is this is the 3080 Ti or 3090 or whatever you want to call it.
The 980 Ti was generally around 50% faster than the 780 Ti too, if not more in some instances, wasn't it? The 970 in a lot of instances was pretty much neck and neck with it. These cards are supposedly going to be capable of going well over 2GHz. Engineering samples are clocked lower and obviously aren't using finished drivers.

How does that work out? The 5700xt is only about as strong as a 2070 super, which is like a good 20-30% weaker than a 2080ti. I don't see how a 50% boost on top of that gets to 30-40% better than one? Unless you're expecting it to draw significantly more power?
The 5700 XT is a ~$400 card with 40 CUs, and its successor will probably be another ~$400 card. Why wouldn't a 50% performance-per-watt increase reach 2080 Ti levels, probably at lower power usage, if the 5700 XT was ~30% weaker?
I guess sarcasm isn't allowed anymore :(

Also, I would be the first one jumping up and down if it's a 3080 that outperforms a 2080 Ti by that much. I am getting the 3080 Ti or the 3090 Ti/Super regardless of which card is in the benchmarks. It just seems people's expectations are out of whack, lol. Expecting the 3070 to outperform the 2080 Ti and cost half as much is unrealistic, and there is no harm in calling it out for its ridiculousness.
The consensus is within ~5% of a 2080 Ti, which is roughly in line with how things were prior to the 20-series, but with the benefit of significantly improved ray tracing, which is the area where it'd probably beat it. The 970 had an MSRP (and power draw) of less than half the 780 Ti's, and the 1070 was only about $100 over half the price of a 980 Ti. So why is it unrealistic for a 3070 to be within around 5% of a 2080 Ti, while more capable with ray tracing, for less than half the price, given there's actually going to be some competition this time around, unlike 2018?
 

UF_C

Member
Oct 25, 2017
3,347
The 980 Ti was generally around 50% faster than the 780 Ti too, if not more in some instances, wasn't it? The 970 in a lot of instances was pretty much neck and neck with it. These cards are supposedly going to be capable of going well over 2GHz. Engineering samples are clocked lower and obviously aren't using finished drivers.
But wasn't the 980 Ti the first real introduction of Maxwell?

I think Nvidia has pretty much maxed out the Maxwell-esque design. Unless Ampere has improved its transistors, I just don't see the similarity between the 900 generation and the 3000 series.
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
But wasn't the 980 Ti the first real introduction of Maxwell?

I think Nvidia has pretty much maxed out the Maxwell-esque design. Unless Ampere has improved its transistors, I just don't see the similarity between the 900 generation and the 3000 series.

No.
The 750/750 Ti were 1st-gen Maxwell.
The 900 series + Titan X were 2nd-gen Maxwell.

We're finally getting a node jump, and even if Ampere is just an optimized Turing like Pascal was to Maxwell, why shouldn't we expect a healthy jump from last gen?

I mean even the 2080 Ti was 30+% faster than the 1080 Ti and that was basically on the same process while introducing RT+Tensor cores.

You're expecting the same kind of gains with a node jump?
 

PHOENIXZERO

Member
Oct 29, 2017
12,065
But wasn't the 980 Ti the first real introduction of Maxwell?

I think Nvidia has pretty much maxed out the Maxwell-esque design. Unless Ampere has improved its transistors, I just don't see the similarity between the 900 generation and the 3000 series.
IIRC the 980 Ti came out about nine months after the 980. I don't know what similarities to Maxwell's architecture there are in Turing. Pascal was the one that was effectively a die shrink of Maxwell. The relatively minor jump from Pascal to Turing is attributed to real estate being taken up by RT and Tensor cores on a relatively minor die shrink; 12nm was an enhanced version of 16nm. The jump to 7nm should be much larger.
 

scitek

Member
Oct 27, 2017
10,054
How much are these cards going to cost ffs?

I was watching the latest news rundown from GamersNexus earlier, and they mentioned that the cooling in the leaked pics of the Founders Edition 3000 series cards apparently costs Nvidia around $150 per card. For reference, he mentioned that the more traditional cooling you'll usually find on high-end cards (Strix, FTW, etc.) costs manufacturers around $50.

 

j^aws

Member
Oct 31, 2017
1,569
UK
A $150 cooler sounds way overpriced. Might as well get a waterblock and go watercooling. Maybe it's time to sell these high-end GPUs as plain PCBs, so you can choose a third-party cooler/waterblock of your choice?
 

Canklestank

Member
Oct 26, 2017
762
I guess sarcasm isn't allowed anymore :(

Also, I would be the first one jumping up and down if it's a 3080 that outperforms a 2080 Ti by that much. I am getting the 3080 Ti or the 3090 Ti/Super regardless of which card is in the benchmarks. It just seems people's expectations are out of whack, lol. Expecting the 3070 to outperform the 2080 Ti and cost half as much is unrealistic, and there is no harm in calling it out for its ridiculousness.

I mean, I own a 1070 that outperformed the 980 Ti before it. Technology advances and becomes cheaper. That depreciation is the cost of having the absolute top-end card.
 

LavaBadger

Member
Nov 14, 2017
4,986
How much are these cards going to cost ffs?

I was watching the latest news rundown from GamersNexus earlier, and they mentioned that the cooling in the leaked pics of the Founders Edition 3000 series cards apparently costs Nvidia around $150 per card. For reference, he mentioned that the more traditional cooling you'll usually find on high-end cards (Strix, FTW, etc.) costs manufacturers around $50.



I know people have high hopes about pricing (and a lot of other people in these threads are apparently willing to drop any amount), but I fear Nvidia is just going to stick with the 20XX series pricing as the new normal, or bump things down a hundred or so dollars here and there. I'm on a 1080, but that was a stretch to buy at the time, and I'm not buying a mid-tier card for the same price.

If Nvidia wants to convince me to buy a console instead this fall, all they have to do is keep their prices the way they are.
 

Sanctuary

Member
Oct 27, 2017
14,203
Me with a 1.5-year-old 2080 Ti, waiting for a 3080 Ti.


This is me with my soon-to-be three-and-a-half-year-old 1080 Ti. While a 30% jump from the 2080 Ti to the 3080 Ti (or 3090, if that's what this is) would be garbage at that price bracket, it would actually be huge for me and the rest who held off upgrading last gen. Even better if this benchmark is the 3080, because I might not even need to go beyond that to have a 4K/60 card (locked 60, mind you), which should also let me screw around with higher frame rates at 1440p if I want.

Plus, when Turing initially released, ray tracing support was pretty damn sparse and way too taxing. With the launch of the new consoles, where developers will be using it more (even if applied to a lesser extent per game, at least on the consoles), the RT cores will be more valuable than they previously were, and if the new cards have also upped the efficiency in that regard, then all the better. If this is just the 3080, and the next card or two above it has another 20% lead, I may just grab it when it comes out and be set for the next five years.

I'm not bringing up AMD to diminish what NVIDIA has achieved so far. They are clearly the leader when it comes to GPUs.

Not holding my breath, since we see this same song and dance each time AMD is about to release their "super ultra powerful amazing new" flagship, but I do hope this is a return to form for them. I can remember two distinct periods where ATI (and then later just AMD) were very competitive with their offerings, which also forced Nvidia to remain in check as well as innovate further, and not just with their "value" cards either.

Do some of you really think AMD is going to even attempt to compete in the high end market again? They've left that to Nvidia for so long I don't see the point. They'll continue to offer cheaper alternatives for mid and low-end and that's frankly fine to me.

Not bothering to change the previous status quo is all well and good if that actually works for them and you personally don't want more than a $400 card, but it would be nice for them to actually be able to flex again. High end competition is good for everyone, because it also leads to better mid-range cards.
 