Status
Not open for further replies.

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
I think Nvidia would know the performance already and will launch according to their schedule, and if AMD do end up beating them, well Nvidia could just launch another card and call it a Super or whatever.

If any of this pans out (RDNA 2 50% faster than a 2080 ti, and in general RDNA2 in the range of 200% faster than RDNA 1), it might be tough for NV to match for once even with their full fat chips/super/ti versions.

We'll see I guess. Would actually be a great thing for the market if AMD actually could topple Nvidia performance wise in each segment for once. Last time it was even really close was 290x.
 

1-D_FE

Member
Oct 27, 2017
8,310
Why do people worry about the PSU? If you get the new Ti, a $100 new PSU is nothing compared to the GPU's cost. Also, many people already have an oversized PSU, and then there's undervolting. My 1080 Ti and 3700X undervolted need just 220W at max load; add the rest and it's still below 350W.

The bigger issue is the heat. In the video they said it was using 300 watts at stock settings and 400 watts when overclocked (and overpowering the AC and still heating up the room). That's the real issue, IMO, with these mega cards.

It'll be interesting to see what AMD has on tap, because it's sounding like I'm out on these Nvidia cards. Way too hot for my liking. I'm upgrading solely for VR, so I'm looking for solid performance/watt gains in traditional rendering situations.

I also remember somebody in here referencing Nvidia's 4XX series cards. That may not be a bad analog either. Those were expensive and hot cards. Then the GTX 460 came out, whooped the GTX 465, and did it at roughly half the wattage of the older cards. I could totally see these cards being replaced fairly quickly if Nvidia has accidentally forced itself into being stuck with Samsung's older node.
 

Jimrpg

Member
Oct 26, 2017
3,280
If any of this pans out (RDNA 2 50% faster than a 2080 ti, and in general RDNA2 in the range of 200% faster than RDNA 1), it might be tough for NV to match for once even with their full fat chips/super/ti versions.

We'll see I guess. Would actually be a great thing for the market if AMD actually could topple Nvidia performance wise in each segment for once. Last time it was even really close was 290x.

There was a lot of talk in the last Moore's Law Is Dead video about how Nvidia is willing to really push the TDP to 300W and possibly 400W after removing the power limits, just to get the performance needed.

Like you said, it's great for the market in general. I hope it leads to lower costs; I really think what was once a mid-tier card, like the GTX xx70 series, needs to be $300-400 again.
 

scabobbs

Member
Oct 28, 2017
2,110
If any of this pans out (RDNA 2 50% faster than a 2080 ti, and in general RDNA2 in the range of 200% faster than RDNA 1), it might be tough for NV to match for once even with their full fat chips/super/ti versions.

We'll see I guess. Would actually be a great thing for the market if AMD actually could topple Nvidia performance wise in each segment for once. Last time it was even really close was 290x.
When's the last time Nvidia couldn't match AMD/ATi performance? People are setting themselves up for a huge disappointment lol
 

mordecaii83

Avenger
Oct 28, 2017
6,880
If any of this pans out (RDNA 2 50% faster than a 2080 ti, and in general RDNA2 in the range of 200% faster than RDNA 1), it might be tough for NV to match for once even with their full fat chips/super/ti versions.

We'll see I guess. Would actually be a great thing for the market if AMD actually could topple Nvidia performance wise in each segment for once. Last time it was even really close was 290x.
Unfortunately I'm locked in to Nvidia's ecosystem due to my monitor being G-Sync but not FreeSync, so whatever Nvidia comes out with I'll end up getting.
 

eonden

Member
Oct 25, 2017
17,161
When's the last time Nvidia couldn't match AMD/ATi performance? People are setting themselves up for a huge disappointment lol
Just because Intel fucked up and let AMD catch up in CPUs, they believe it can happen in the graphics card market too, except Nvidia hasn't failed at all.

When was the last time one of them launched with a notable node disadvantage?
What node disadvantage? Nvidia has been on 12nm and handily beating AMD's 7nm in both performance AND performance per watt.
 

1-D_FE

Member
Oct 27, 2017
8,310
Isn't RDNA2 supposed to be on TSMC's 7nm+ (or at least an improved version of the 7nm process they used for RDNA1)?
Wouldn't there be a pretty sizeable difference in transistor densities if NV went with Samsung's 8nm?

The one link was claiming that Samsung's 8nm is more akin to TSMC's 10nm process. So there's definitely a sizeable difference in production nodes, and it explains why these Nvidia cards look like they're going to consume massive wattage (when we were led to believe power consumption was going to be reduced. Remember the hilarity when people actually believed Ampere would be 50% more powerful AND consume 50% less wattage. You couldn't talk sense into some of these people, because people love fairy tales and get angry if you try to expose the truth).
 

dgrdsv

Member
Oct 25, 2017
12,105
- it runs in serial so the CUs are stalled while it is running. This is ok because it's really fast at doing what it does - but it's still a few ms per frame out of your budget
This obviously isn't an issue, otherwise you wouldn't get any speed-ups with DLSS.

- the tensor cores also take up silicon area so for any given sized GPU you could argue you'd get more CUs for the same size chip if you didn't have tensor cores.
Which would push your power consumption up (more active SIMDs = more power), leading to the chip clocking lower, which would likely even out the additional SIMDs and net you more or less the same performance - but without the new ML h/w, and without DLSS.

Does it increase latency in rendering frames? I don't have a DLSS card, but how's the performance when it makes up for the textures?
Any additional part of rendering a frame increases latency, as latency is the inverse of performance. So if you add DLSS on top of your regular frame rendering, you'll get slower performance and higher latency. But since DLSS runs on top of rendering at lower-than-native resolution, the cost of DLSS itself is well below the advantage you get from rendering at the lower resolution - so latency is obviously lower than without DLSS, as it would be with anything that significantly increases performance.
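To put rough numbers on that argument (these frame times are made up for illustration, not measured on any card):

```python
# Toy numbers only -- illustrating why DLSS lowers frame latency overall.
native_4k_ms = 20.0        # hypothetical frame time rendering natively at 4K
internal_1440p_ms = 11.0   # hypothetical frame time at the lower internal resolution
dlss_pass_ms = 1.5         # hypothetical fixed cost of the DLSS pass itself

dlss_total_ms = internal_1440p_ms + dlss_pass_ms  # DLSS adds work on top of the internal render
print(f"Native 4K:    {native_4k_ms:.1f} ms (~{1000 / native_4k_ms:.0f} fps)")
print(f"1440p + DLSS: {dlss_total_ms:.1f} ms (~{1000 / dlss_total_ms:.0f} fps)")
# The total stays well below the native frame time, so latency ends up lower
# with DLSS even though the DLSS pass itself is extra work per frame.
```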

I think Nvidia would know the performance already and will launch according to their schedule, and if AMD do end up beating them, well Nvidia could just launch another card and call it a Super or whatever.
Launch a new card in place of an older one a month after launching that older one? Not a good business practice.

Isn't RDNA2 supposed to be on TSMC's 7nm+ (or at least an improved version of the 7nm process they used for RDNA1)?
Wouldn't there be a pretty sizeable difference in transistor densities if NV went with Samsung's 8nm?
RDNA2 is N7 "enhanced" which is basically N7 with tweaks it got since it launched in 2018. N7+ is EUV and so far it doesn't look like any GPU or CPU vendor is going to use that over N7.
Density is a product of design as much as of the production process, so it's hard to say what will happen on two different production lines with two vastly different designs.
What does it matter anyway? Even if AMD has a density advantage, I doubt that N7 wafer pricing will allow them to take advantage of it by building a gaming chip as huge as GA100, for example - they'd need to sell it for thousands of dollars. And with the processes being different - Samsung's 8nm being potentially cheaper per transistor than TSMC's N7 - even the size difference won't matter much: cost per transistor is what matters for margins, along with comparative design complexity at the same performance level.
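A back-of-the-envelope sketch of that cost-per-transistor point; every wafer price and density figure below is a made-up placeholder, not an actual quote from either foundry:

```python
# Toy cost-per-transistor comparison; all inputs are hypothetical placeholders.
def cost_per_billion_transistors(wafer_cost_usd, mtr_per_mm2, usable_mm2=60000):
    # usable_mm2 approximates yielded silicon per 300mm wafer (also an assumption)
    transistors_billion = mtr_per_mm2 * usable_mm2 / 1000.0  # MTr -> billions
    return wafer_cost_usd / transistors_billion

tsmc_n7 = cost_per_billion_transistors(wafer_cost_usd=9000, mtr_per_mm2=65)
samsung_8nm = cost_per_billion_transistors(wafer_cost_usd=6000, mtr_per_mm2=45)
print(f"TSMC N7 (hypothetical):     ${tsmc_n7:.2f} per billion transistors")
print(f"Samsung 8nm (hypothetical): ${samsung_8nm:.2f} per billion transistors")
# A less dense but cheaper node can still come out ahead on cost per transistor,
# which is the margin-relevant metric, not raw density.
```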

That being said, I do think that AMD has a chance of competing with top end Ampere cards, for the first time since, I dunno, Tahiti vs GK104? Hawaii was way too power inefficient against GM204, and Fury never really managed to beat GM200. And top end Pascals and Turings were just completely out of reach.
 
Last edited:

Gohlad

Avenger
Oct 28, 2017
1,077
What really will make the difference in the end (and I mean long(er) term) is the strategy AMD and Nvidia are executing regarding graphics cards.

We saw that AMD is going for a split approach with gaming cards (RDNA) and compute cards (CDNA). This could lead to their gaming cards not carrying features for compute workloads that games don't use, leaving more room to focus on gaming-related features.

Compare that to Nvidia having their cards serve both compute and gaming-related workloads. This has led to advantages for Nvidia, like being able to use ML features for gaming, but has also ultimately made their cards massive in terms of size and power consumption (and means you may have features on the card that you will never use, if you only use it for gaming).

In the end we will see which approach will prove to be more successful.
 

StrangeADT

Prophet of Truth
Member
Oct 25, 2017
2,093
Unfortunately I'm locked in to Nvidia's ecosystem due to my monitor being G-Sync but not FreeSync, so whatever Nvidia comes out with I'll end up getting.
This is my situation as well. I'm still rooting hard for AMD because the better they do the less I'm gonna have to pay. My GTX970 is *very* long in the tooth especially for an ultrawide monitor. I did not realize how poorly my 970 would perform before I bought this monitor. I don't think I would have bothered if I did. These days I tend to bump the resolution on my monitor down to 2560x1440 , sacrificing the ultra wide aspect ratio, just to get acceptable FPS. I have a 120hz ultrawide and I just can't drive it. Definitely poor usage of money on my part.
 

mordecaii83

Avenger
Oct 28, 2017
6,880
This is my situation as well. I'm still rooting hard for AMD because the better they do the less I'm gonna have to pay. My GTX970 is *very* long in the tooth especially for an ultrawide monitor. I did not realize how poorly my 970 would perform before I bought this monitor. I don't think I would have bothered if I did. These days I tend to bump the resolution on my monitor down to 2560x1440 , sacrificing the ultra wide aspect ratio, just to get acceptable FPS. I have a 120hz ultrawide and I just can't drive it. Definitely poor usage of money on my part.
That's exactly why I went with 2560x1080 on my ultrawide; I want to keep it at 120+ fps as much as possible while running ultra settings.
 

Bosch

Banned
May 15, 2019
3,680
When's the last time Nvidia couldn't match AMD/ATi performance? People are setting themselves up for a huge disappointment lol
I don't think so... AMD have a node advantage and are using a refined 7nm node (7nm+).

It is really possible they catch Nvidia for real at least in raw performance.

The 5700 XT beat a 2070 SUPER in many games...

But when you think about DLSS and ray tracing, we need to wait and see what AMD is going to show us.
 

Readler

Member
Oct 6, 2018
1,975
I don't think so... AMD have a node advantage and are using a refined 7nm node (7nm+).

It is really possible they catch Nvidia for real at least in raw performance.

The 5700 XT beat a 2070 SUPER in many games...

But when you think about DLSS and ray tracing, we need to wait and see what AMD is going to show us.
I'd be very surprised if AMD has an answer to DLSS by the time the cards after RDNA2 release.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
I don't believe for a second that a 225% increase will happen but I really hope it's true. That would be amazing for competition.

5700 XT x2, plus a little more from node enhancement and arch enhancement, putting it quite a ways ahead of a 2080 Ti, doesn't seem thaaat crazy a thing for me to believe, but we'll see. More realistically I'd expect 30-40% better than a 2080 Ti, which I'd think would compare favorably with a 3080 Ti, maybe getting beaten by 5-10% for less money.

Even if they only manage that, it would be great for the market; if they actually manage to beat Nvidia's halo card, it would be amazing for the market.
 
Oct 25, 2017
2,960
RDNA 1 stopped at the ~$450 price range. 'Big Navi' is supposed to be a XX80 Ti killer. I would hope it would be 225% as fast.
 
Last edited:
Oct 25, 2017
2,960
As an aside, the author of that Tweaktown article citing MLID did a podcast with MLID last week.



The meat of the convo is how tech journalism has changed over the past 10 years - they do get into the rise of both tech-based clickbait and a newer generation of 'tech leak' channels (of which MLID is a part; he's only been at it for a year).
 

Teiresias

Member
Oct 27, 2017
8,282
Yeah, between my G-Sync monitor and my LG C9 at this point only supporting Nvidia for "G-Sync/VRR", I'm stuck with Nvidia, but I'd love to see AMD claw back some market share and mindshare, if only to affect the pricing.
 

dragn

Attempted to circumvent ban with alt-account
Banned
Oct 26, 2017
881
Can only hope for AMD to get foveated rendering for Pimax VR working; only then can I switch back :/
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
RDNA 1 stopped at the ~$450 price range. 'Big Navi' is supposed to be a XX80 Ti killer. I would hope it would be 225% as fast.

Right, 2x the bones for 2x the perf, plus arch improvement and node enhancement. It's not that crazy to believe. It'll be interesting to see whether that beats Nvidia's 3080 Ti or not, or at least how close it gets, say in the $800 range vs the $1200 expected for the 3080 Ti, and whether it forces Nvidia to bring the 3080 Ti under $1k.

I'll also be interested to see what a cut-down Big Navi 6800 XT would look like in price/perf vs the 3070/3080.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Yeah, between my G-Sync monitor and my LG C9 at this point only supporting Nvidia for "G-Sync/VRR", I'm stuck with Nvidia, but I'd love to see AMD claw back some market share and mindshare, if only to affect the pricing.

I like that my PC setup is now one G-Sync 27-inch 1440p monitor and one FreeSync HDR 27-inch 1440p monitor.

It allows me to not really have a preference either way. I still kind of prefer AMD, as the one HDR monitor only works with AMD for HDR - it only does HDR over HDMI, and Nvidia can't do G-Sync and HDR simultaneously over HDMI on older monitors, which sucks. The HDR monitor is also IPS, which I vastly prefer over the TN G-Sync monitor. That one's an Acer Predator, so it's a good TN and not the end of the world, and it also does 165Hz vs 144Hz, so I'm OK with it staying the primary monitor as well.
 

Tovarisc

Member
Oct 25, 2017
24,537
FIN
If AMD has "3080Ti Killer" there is no way they undercut 3080Ti pricing in any big way, they will want cash in on that.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
If AMD has "3080Ti Killer" there is no way they undercut 3080Ti pricing in any big way, they will want cash in on that.

Yeah, if it actually beats the 3080 Ti by a meaningful percentage, sure, but they will also likely have an inferior DLSS equivalent (or none at all) and likely inferior RT, so they may need to price a bit lower even if they do match or beat it. Also, battling uphill for market share, they may just undercut Nvidia.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
If they don't deliver a DLSS solution and a good RT solution, $999 is doable

I think if the RT solution is much worse than Nvidia's, they have to go even lower than $999, otherwise you lose the value prop. Unless the 3080 Ti is $1500 MSRP or something. You can actually find 2080 Tis for $999 occasionally (the actual MSRP). It's possible the 3080 Ti is the same, or like $1200 MSRP.
 

Fatmanp

Member
Oct 27, 2017
4,441
A 225% perf improvement on the same node doesn't sound right unless they are doing x2 models.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
A 225% perf improvement on the same node doesn't sound right unless they are doing x2 models.

This is taking into account that they are doubling the CUs, so this is specifically only talking about Big Navi. IPC improvement is expected to be around 10%, and on top of that, higher clocks.

So specifically for the Big Navi chip, expect roughly 2x better than the 5700 XT plus a bit more. A 6700 XT is then probably looking at a more typical 25% boost over the direct previous model, plus added RT features etc. So the 6700 XT will probably line up against the 3060, then likely a cut-down Big Navi to line up against the 3070 and 3080, and then the full-fat chip, a 6900 XT, to line up against the 3080 Ti.
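Rough napkin math behind that estimate, using the rumoured factors (CU doubling, ~10% IPC) plus an assumed clock bump and an assumed 2080 Ti baseline - none of this is measured:

```python
# Napkin math for the Big Navi estimate; real scaling won't be perfectly linear.
cu_scaling = 80 / 40    # doubled CUs vs the 5700 XT (per the rumour)
ipc_gain = 1.10         # ~10% IPC improvement (per the rumour)
clock_gain = 1.05       # assumed modest clock bump on the enhanced node

vs_5700xt = cu_scaling * ipc_gain * clock_gain
print(f"~{vs_5700xt:.2f}x a 5700 XT")  # ~2.3x before any scaling losses

ti_vs_5700xt = 1.5                     # assumed: a 2080 Ti is ~1.5x a 5700 XT at 4K
print(f"~{(vs_5700xt / ti_vs_5700xt - 1) * 100:.0f}% faster than a 2080 Ti")
```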

These rumors point to the first Big Navi chip being 72 CUs, so with improved yields AMD may have an 80 CU 6950 XT of sorts a bit further out as well.

It would actually be kind of cool to see a horse race like that, where Nvidia drops the 3080, AMD drops the 6900 XT and beats it, then Nvidia drops the 3080 Ti and matches/beats it, then AMD hits again with the 6950, then maybe an Nvidia 3090. Like the old 7950 days lol.
 
Last edited:

eonden

Member
Oct 25, 2017
17,161
I see how this thread has gone from "50% increase in performance for Nvidia is impossible" to "225% increase in performance for AMD is totally possible!" in the span of a few pages lmao.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
I see how this thread has gone from "50% increase in performance for Nvidia is impossible" to "225% increase in performance for AMD is totally possible!" in the span of a few pages lmao.

The AMD number is mainly due to almost doubling the CUs, so it makes sense.

It's kind of a unique situation, with AMD's current top card being a small-die chip. The same article posted says 40-50% faster than a 2080 Ti, which lines up when you factor in the IPC improvement and faster clocks.
 
Last edited:

eonden

Member
Oct 25, 2017
17,161
The AMD number is mainly due to almost doubling the CUs, so it makes sense.
And Nvidia is getting a big node shrink while still having a better overall architecture than AMD, and has a fuck ton more knowledge of ML in graphics cards than AMD.
Nvidia has all the cards in hand to trounce AMD again.

Uhh *Looks at Ryzen and what it forced Intel to do*
Intel had problems for years before AMD even managed to get close to them (and again, that was mainly from getting stuck in a node change). Nvidia has been going from strength to strength.

but someone made a youtube video.
I swear to god I want AMD to be good, but some AMD fans really are in the "overhype the AMD product" cycle all the time. And it makes them really annoying.
 

Vex

Member
Oct 25, 2017
22,213
Omg, every time I see this thread get bumped I freak out and think something new happened. I don't even know why I'm getting hyped like this. I never get new GPUs at launch anyways. The hell is wrong with me?
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
And Nvidia is getting a big node shrink while still having a better overall architecture than AMD, and has a fuck ton more knowledge of ML in graphics cards than AMD.
Nvidia has all the cards in hand to trounce AMD again.


Intel had problems for years before AMD even managed to get close to them (and again, that was mainly from getting stuck in a node change). Nvidia has been going from strength to strength.


I swear to god I want AMD to be good, but some AMD fans really are in the "overhype the AMD product" cycle all the time. And it makes them really annoying.

Sure, it's absolutely an uphill battle for AMD. I'm just saying that if this leak turns out to be accurate, it's at least encouraging, as we've seen a supposed 3080 (maybe even the Ti chip) beating a 2080 Ti by 30%. So even if it's just the 3080 and not the Ti, there still seems to be a possibility of AMD at least matching or being very close to Nvidia in raw performance. But yeah, likely disadvantaged in RT and in a DLSS equivalent, if there even is one. Still, IMO it's good for the market if AMD is able to compete with an Nvidia halo chip, especially if they manage to do it with less wattage, and the leaks about the power Nvidia requires seem to suggest they might have a hint that what AMD is cooking is pretty powerful. Competing at the top end and then cutting down to compete with the rest of Nvidia's offering is good as well.

Maybe it doesn't happen and AMD doesn't really catch anything.

IMHO, even the 5700 XT being quite close to the 2070S/2080 is about as well as AMD has done for quite some time, so yeah, just hoping they're able to compete well across the segment.

I don't have any horses in this (in fact I own Nvidia shares, so maybe I do lol) other than that competition is good.
 

Nothing

Member
Oct 30, 2017
2,095
I swear to god I want AMD to be good, but some AMD fans really are in the "overhype the AMD product" cycle all the time. And it makes them really annoying.
I know what you mean. Especially when it comes to CPUs. A lot of people speak like Intel isn't even an option, whereas they are still better performers for gaming-only setups. We are also about to become less GPU-bound with the new graphics cards.

I want AMD GPUs to be good too. Competition is great for consumers. But I will still likely buy Nvidia bc the driver support is better, meaning you will have fewer crashes, plus I have a G-sync (module) monitor.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
I know what you mean. Especially when it comes to CPUs. A lot of people speak like Intel isn't even an option, whereas they are still better performers for gaming-only setups. We are also about to become less GPU-bound with the new graphics cards.

I want AMD GPUs to be good too. Competition is great for consumers. But I will still likely buy Nvidia bc the driver support is better, meaning you will have fewer crashes, plus I have a G-sync (module) monitor.

Yeah, there's something to be said for a ~$300 launch 8700K from years ago still being as good as AMD's chips purely in gaming.

The hype is just an "I want to believe" lol. Hopefully it eventually pans out. You do have to go back to, like, Tahiti for when AMD was actually competitive in the enthusiast segment, and it just kinda sucks.
 

asmith906

Member
Oct 27, 2017
27,631
I think if RT solution is much worse than Nvidia's they have to go lower than $999 even otherwise you lose the value prop. Unless 3080ti is $1500msrp or something. You can actually find 2080tis for $999 occasionally (the actual MSRP). Possible 3080 ti is the same or like $1200 MSRP.
I don't believe I've ever seen a 2080ti for $999 outside of some crappy blower card. $1200 seems to be the real price of a 2080ti.
 

dgrdsv

Member
Oct 25, 2017
12,105
We saw that AMD is going for a split approach with gaming cards (RDNA) and compute cards (CDNA). This could lead to their gaming cards not carrying features for compute workloads that games don't use, leaving more room to focus on gaming-related features.
CDNA is just a GCN evolution right now, and it will likely move to the RDNA execution core in CDNA2 already. It makes little sense to have two vastly different generic multiprocessor designs in your products - from a s/w support perspective, for the ability to reuse gaming products in pro markets, and because you'd need to fund R&D for both for them to stay relevant. Compute features which aren't needed for gaming can be removed from the gaming versions of the same architecture.

Compare that to Nvidia having their cards serve both compute and gaming-related workloads.
GA100 isn't even equipped with video outputs anymore. NV has split the top HPC card off from the gaming lineup completely - both GP100 and GV100 were capable of being video cards, GA100 isn't anymore. So it's actually NV who got here first; Arcturus will follow, but on an old GCN base.

This has led to advantages for Nvidia, like being able to use ML features for gaming, but has also ultimately made their cards massive in terms of size and power consumption
Not sure where you're getting this. Turing size is very close to that of Navi 1x without any ML features and it has lower power consumption while being one node behind.

(and means you may have features on the card that you will never use, if you only use it for gaming)
All PC h/w has such features. Why would it matter to the end user?
 

GameAddict411

Member
Oct 26, 2017
8,613
If any of this pans out (RDNA 2 50% faster than a 2080 ti, and in general RDNA2 in the range of 200% faster than RDNA 1), it might be tough for NV to match for once even with their full fat chips/super/ti versions.

We'll see I guess. Would actually be a great thing for the market if AMD actually could topple Nvidia performance wise in each segment for once. Last time it was even really close was 290x.
I find this post funny because I seriously doubt that AMD will match Nvidia, let alone make them worry. They have been so far ahead for the past 8 years or so that I doubt AMD will catch up in such a short amount of time.
 
Last edited:

GameAddict411

Member
Oct 26, 2017
8,613
Sure, it's absolutely an uphill battle for AMD. I'm just saying that if this leak turns out to be accurate, it's at least encouraging, as we've seen a supposed 3080 (maybe even the Ti chip) beating a 2080 Ti by 30%. So even if it's just the 3080 and not the Ti, there still seems to be a possibility of AMD at least matching or being very close to Nvidia in raw performance. But yeah, likely disadvantaged in RT and in a DLSS equivalent, if there even is one. Still, IMO it's good for the market if AMD is able to compete with an Nvidia halo chip, especially if they manage to do it with less wattage, and the leaks about the power Nvidia requires seem to suggest they might have a hint that what AMD is cooking is pretty powerful. Competing at the top end and then cutting down to compete with the rest of Nvidia's offering is good as well.

Maybe it doesn't happen and AMD doesn't really catch anything.

IMHO, even the 5700 XT being quite close to the 2070S/2080 is about as well as AMD has done for quite some time, so yeah, just hoping they're able to compete well across the segment.

I don't have any horses in this (in fact I own Nvidia shares, so maybe I do lol) other than that competition is good.
The 2070S is on a much larger node, has much more hardware integrated inside, such as the tensor cores and the RT cores, and still manages a lower TDP than the 5700 XT at 7nm. I can't even imagine what the improvements from going to 7nm will be like for Nvidia.
 
Nov 2, 2017
2,275
This is taking into account that they are doubling the CUs, so this is specifically only talking about Big Navi. IPC improvement is expected to be around 10%, and on top of that, higher clocks.

So specifically for the Big Navi chip, expect roughly 2x better than the 5700 XT plus a bit more. A 6700 XT is then probably looking at a more typical 25% boost over the direct previous model, plus added RT features etc. So the 6700 XT will probably line up against the 3060, then likely a cut-down Big Navi to line up against the 3070 and 3080, and then the full-fat chip, a 6900 XT, to line up against the 3080 Ti.

These rumors point to the first Big Navi chip being 72 CUs, so with improved yields AMD may have an 80 CU 6950 XT of sorts a bit further out as well.

It would actually be kind of cool to see a horse race like that, where Nvidia drops the 3080, AMD drops the 6900 XT and beats it, then Nvidia drops the 3080 Ti and matches/beats it, then AMD hits again with the 6950, then maybe an Nvidia 3090. Like the old 7950 days lol.
200% faster is tripling performance. There's 0% chance of that happening on the same node. That would be double the 2080Ti and not 40-50% faster like the site you linked is claiming. I'll just assume the rumour guy and the people writing the article fail at percentages. That does not bode well for the validity of the rumour if they can't even do basic math.
 

1-D_FE

Member
Oct 27, 2017
8,310
The 2070S is on a much larger node, has much more hardware integrated inside, such as the tensor cores and the RT cores, and still manages a lower TDP than the 5700 XT at 7nm. I can't even imagine what the improvements from going to 7nm will be like for Nvidia.

You do realize it's extremely likely Ampere isn't going to be on this node, right? It's seemingly going to be on Samsung's significantly inferior node. All because Nvidia played a game of chicken with TSMC and lost (which nobody should be happy about, because those savings wouldn't have been passed on to us, the consumers, anyway).
 

GameAddict411

Member
Oct 26, 2017
8,613
You do realize it's extremely likely Ampere isn't going to be on this node, right? It's seemingly going to be on Samsung's significantly inferior node. All because Nvidia played a game of chicken with TSMC and lost (which nobody should be happy about, because those savings wouldn't have been passed on to us, the consumers, anyway).
It's still a big improvement over Turing's 12nm.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
200% faster is tripling performance. There's 0% chance of that happening on the same node. That would be double the 2080Ti and not 40-50% faster like the site you linked is claiming. I'll just assume the rumour guy and the people writing the article fail at percentages. That does not bode well for the validity of the rumour if they can't even do basic math.

I'm not actually sure that they fucked it up. "225% the performance of" isn't the same thing as "225% faster." Even if they said it the wrong way with the percentages, it's clear elsewhere in the same article that they mean roughly 2x better, which is accurate to what they're talking about.

Edit: So basically TweakTown fucked it up with their headline, but their source did not:

"... major leap forward for them 195% to 225% of the current available cards. But their internal estimates are still projecting the performance per dollar cards up to the upper end of the range".

So that means 95-125% faster, which does make sense if you are nearly doubling the CUs.
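Spelled out, since the "% of" vs "% faster" mix-up is exactly what tripped up the headline (trivial arithmetic, nothing more):

```python
# "X% of" vs "X% faster than" -- the source quote uses the former.
pct_of = 2.25              # "225% of the currently available cards"
pct_faster = pct_of - 1.0  # i.e. 125% faster, a bit over 2x

print(f"225% of the baseline  -> {pct_of:.2f}x ({pct_faster * 100:.0f}% faster)")
print(f"225% faster than it   -> {1.0 + 2.25:.2f}x (what the headline implied)")
```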
 
Last edited:

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
The 2070s is on much larger node, has much more integrated hardware inside such as the the tensor cores and the RT cores, and still manages a lower TDP then the 5700 xt at 7nm. I can't even imagine the improvements going to 7nm will be like for Nvidia.

They apparently aren't going to 7nm, though, but rather Samsung's 8nm, which is really just an improved 10nm.
 