Status
Not open for further replies.

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
The 2070S is on a much larger node, has much more integrated hardware inside such as the tensor cores and the RT cores, and still manages a lower TDP than the 5700 XT at 7nm. I can't even imagine what the improvements of going to 7nm will be like for Nvidia.

I've had both a 2070S and a 5700 XT and they draw roughly the same power. Either way, reliable sources say NV is pushing up wattage, so I dunno.
 

RedSwirl

Member
Oct 25, 2017
10,102
Might not be the right place to ask, but some discussion here confused me about bottlenecks:
If I have a CPU that would be too weak for a newly upgraded GPU, creating a bottleneck, do I just lose out on the potential performance that upgrade could have brought, or does it negatively affect the base performance of the separate components? Like, make the CPU perform worse than it would with a weaker GPU?

Not sure I put that in way that can be understood.
Bottleneck means there is a component that caps the performance. If you change every component except the one that bottlenecks, you should get more or less the same performance. So yeah, if you are CPU-bottlenecked and upgrade your GPU, you only "lose out" on potential performance from the new GPU; you don't make your CPU perform worse.
The first situation. The GPU is still just as capable and you could even max out its full potential if you keep increasing resolution or settings, but it has to wait for the CPU if what you want is high refresh rates. If the CPU can't do 120fps in a certain game, then it is limiting the maximum fps you can reach regardless of the GPU.
My experience with Arma 3 illustrates this pretty well. That game is highly dependent on single-core CPU clock speeds so in many situations it'll always run like shit, but upgrading to a better GPU allowed me to push some effects like total draw distance and shadow draw distance further than before without the performance getting any worse. Having something like a CPU bottleneck may cap your performance, but it doesn't necessarily mean pushing the GPU harder will immediately drop your fps further.
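The posts above can be sketched as a toy model: each frame needs both CPU and GPU work, so the displayed frame rate is just the minimum of what either side can sustain. The numbers below are made up for illustration, not measurements from any real game.

```python
# Toy model of a CPU bottleneck: the displayed frame rate is capped
# by whichever component is slower. All numbers are illustrative.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Each frame needs both CPU and GPU work, so the slower side wins."""
    return min(cpu_fps, gpu_fps)

cpu_cap = 90.0  # hypothetical CPU-side limit for some game

# Raising resolution/settings lowers the GPU-side number, but the
# displayed fps only drops once the GPU falls below the CPU cap.
for gpu_fps in (240.0, 150.0, 95.0, 60.0):
    print(f"GPU good for {gpu_fps:5.1f} fps -> displayed {effective_fps(cpu_cap, gpu_fps):.1f} fps")
```

This is why cranking draw distance "for free" works under a CPU bottleneck: until the GPU-side number drops below the CPU's cap, the displayed fps doesn't move.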
If the new GPUs have high power draw at stock speeds, they might draw even more if they have any substantial overclocking headroom. I would expect that anyone with a 650-750W PSU is perfectly fine, but those on 500-600W might at least hear the PSU fan come into play more often. Most people don't want to buy a new PSU if they can help it; it's an extra cost. I recently got an SFX-size PSU and went with a 750W model just in case, even though for my 3700X + 2080 Ti the 600W Platinum model would have been enough. The good thing is the new PSU reduced my 2080 Ti's coil whine, so it bothers me less.
When I went from a 760 to a 1070 I kept the same 550W PSU, which was originally from around 2011, because I found the 1070 actually drew less wattage than the 760. Even today, as this mid-tier PSU comes up on 10 years old, it's still perfectly fine.

If I can build a new rig with a 3070 though I'll move up to 650W or something.
 

ugoo18

Member
Oct 27, 2017
150
No one can really predict what's going to happen to the gaming industry in 5 years. I bought my machine, an i5-4690K with a GTX 970, in 2015, and it's fine for 1080p, but at 1440p it'd be chugging. I replaced the 970 with a 1070 when that came out. Even if I'd got the i7 it'd be better, but there's a huge gap between the i7-4790K and, say, the latest CPUs. At a guess, I'd say you'd be fine if your specs are better than the consoles' in general. But PC devs develop with the market in mind, so if you buy a mid-tier card, or your high-end card becomes mid-tier, you'll get that tier of performance. That's why I'm probably going with a 3080 over the 3070; that little bit extra gives some breathing room in terms of performance.

Another thing: the PS5 is looking like it's going to be around $499. That is serious value, given that building an equivalent PC would cost at least $1200-$1500.

Oh, I plan to get a PS5. However, I've done all my multiplatform gaming on PC, simply because of (more or less) guaranteed backwards and forwards compatibility, plus the price proposition is generally better on PC storefronts imo. My consoles (Switch and PS4 Pro, then PS5) have been purely for their exclusives, with the occasional multiplat on Switch for the portability, though that's always been a double dip. I've been going the gaming laptop path so far (on my second one now, which I got mid PS4 gen after my initial one had motherboard problems), and for this upgrade I felt that building a PC outright was a better idea: it gives me a greater performance ceiling for the price than a third gaming laptop would. In terms of choice, I would likely bite the financial bullet for, say, the 3080 over the 3070, because as I see it, a bit of extra financial strain now could save me a whole lot of pain later, with that additional performance carrying me a bit longer down the line.

The first thing I will grab beforehand, though, will be the Ryzen 9 3900X, and then I'll slowly build up the components I want till reveal day. Well, the components that couldn't be completely thrown into flux by said GPU reveals, anyway.

And to Australians: yes, $1000+; if the 3070 fits under that number I'll be surprised

Well, $1000+ is less daunting than $2000+ I suppose. I'll still be over the moon if the 3080 is closer to $1k than $2k lol.
 

Jimrpg

Member
Oct 26, 2017
3,280
Launch a new card in place of an older one in a month from launching that older one? Not a good business practice.

Yes, true, but I was thinking that it hasn't stopped Nvidia from releasing a 1080, then a Titan, and then a 1080 Ti in the past. There were plenty of people who went from a 1080 to a Titan, or from a 1080 to a 1080 Ti. I don't think they would release new cards within a month, but within 6 months I could see it happening, especially now that they're happy to release a variety of cards with their Ti/Super or third-party OC branding.

Oh, I plan to get a PS5. However, I've done all my multiplatform gaming on PC, simply because of (more or less) guaranteed backwards and forwards compatibility, plus the price proposition is generally better on PC storefronts imo. My consoles (Switch and PS4 Pro, then PS5) have been purely for their exclusives, with the occasional multiplat on Switch for the portability, though that's always been a double dip. I've been going the gaming laptop path so far (on my second one now, which I got mid PS4 gen after my initial one had motherboard problems), and for this upgrade I felt that building a PC outright was a better idea: it gives me a greater performance ceiling for the price than a third gaming laptop would. In terms of choice, I would likely bite the financial bullet for, say, the 3080 over the 3070, because as I see it, a bit of extra financial strain now could save me a whole lot of pain later, with that additional performance carrying me a bit longer down the line.

The first thing I will grab beforehand, though, will be the Ryzen 9 3900X, and then I'll slowly build up the components I want till reveal day. Well, the components that couldn't be completely thrown into flux by said GPU reveals, anyway.

Well, $1000+ is less daunting than $2000+ I suppose. I'll still be over the moon if the 3080 is closer to $1k than $2k lol.

To your original point about whether a PC could last 5 years: I think there's a fair bit of uncertainty in the market right now in a lot of areas. I need to buy a new PC too, so I guess I'm going to just close my eyes and pick the best parts I can, but imo it's a really bad time to buy from an objective point of view. I'm almost prepared for the fact that I might have to build another PC, say, 3 years down the track, one with perhaps SSD improvements, better 4K, a CPU with more than 16 cores, etc. The PC I get at the end of the year (maybe a Ryzen 4900X, an RTX 3080, and a 1TB SSD with reads/writes of 3.5-5 GB/s) will be great, but PCs will have jumps every year, while the consoles just took their big jump.
 

low-G

Member
Oct 25, 2017
8,144
Zen's design made it cheaper to manufacture compared to Intel. In the gpu space it's not the same thing. The 7nm node is dense but not necessarily much cheaper per transistor, so Navi 1 was barely able to undercut nvidia while lacking several features of Turing.

It is cheaper per transistor or they wouldn't do it (this is a fundamental 'law' of the business) - it wouldn't make business sense. Simply by shrinking the node you get more chips per wafer, which directly reduces cost proportional to the shrink, provided your yields are decent. Even if yields start so-so, they'll improve to where they'll make greater profit. Otherwise you just stick with whatever node you're on.

Navi 1 wasn't able to undercut Nvidia because their GPU design isn't as advanced.
 
Nov 8, 2017
13,245
It is cheaper per transistor or they wouldn't do it (this is a fundamental 'law' of the business) - it wouldn't make business sense. Simply by shrinking the node you get more chips per wafer, which directly reduces cost proportional to the shrink, provided your yields are decent. Even if yields start so-so, they'll improve to where they'll make greater profit. Otherwise you just stick with whatever node you're on.

Navi 1 wasn't able to undercut Nvidia because their GPU design isn't as advanced.

The reduction in cost per transistor has been slowing. 7nm cost per transistor was estimated at roughly 70% of what it was on "10nm", but this also comes with added thermal considerations, and the cost per transistor on a larger chip would have been even worse than that because this is an averaged value that doesn't scale linearly with arbitrarily sized dies. AMD had public slides showing that the cost per mm^2 (which is a correlate of cost per transistor, although they obviously aren't identical values) on 7nm was not very favourable.

The economic downsides of using 7nm for large dies were a major factor in why the Radeon 7 (331mm^2) was unable to meaningfully undercut the RTX 2080 (545mm^2). It's a much smaller die, so usually that would imply big savings, but 7nm was just a much more expensive process, even at that scale. Later, the 251mm^2 Navi was only able to undercut Nvidia by a modest margin, even without dedicated RT and AI acceleration on the wafer.

This is not just technology of AMD versus Nvidia, this is also the reality of using modern processes to produce large dies. It used to be that every shrink was denser, cheaper, faster, and more power efficient. They're still getting denser, but the other three factors are in constant contention with each other.
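The wafer economics being argued about here can be sketched with back-of-envelope numbers. The die areas, wafer costs, and yields below are purely hypothetical, chosen only to show how a denser node can fail to be cheaper per die when the wafer itself costs much more.

```python
import math

# Standard dies-per-wafer approximation for a circular wafer, plus a
# simple cost-per-good-die figure. All inputs are hypothetical.

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Usable whole dies on a circular wafer (classic edge-loss approximation)."""
    radius = wafer_diameter_mm / 2.0
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def cost_per_good_die(wafer_cost: float, die_area_mm2: float, yield_rate: float) -> float:
    """Wafer cost spread across the dies that actually work."""
    return wafer_cost / (dies_per_wafer(die_area_mm2) * yield_rate)

# Halving the die area roughly doubles dies per wafer...
old_node = cost_per_good_die(wafer_cost=6000, die_area_mm2=500, yield_rate=0.80)
new_node = cost_per_good_die(wafer_cost=10000, die_area_mm2=250, yield_rate=0.60)

# ...but a pricier wafer and early yields can eat the whole gain.
print(f"old node: ${old_node:.0f} per good die")
print(f"new node: ${new_node:.0f} per good die")
```

With these made-up inputs the shrink doubles the dies per wafer yet still costs slightly more per good die, which is the dynamic both posters are circling: density keeps improving, but cost per transistor no longer falls automatically with each node.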
 

felixdat

Alt account
Banned
Jul 2, 2020
145
I am wondering if my 2080 Ti can run games like the new Watch Dogs and Cyberpunk at ultra settings, 60-70fps @ 1440p, with some sort of RT.
 

kami_sama

Member
Oct 26, 2017
7,045
Make your bets, people, make your bets!
I think we might get some info on the cards in September and a paper launch later in the month.
Only for the 3070 and 3080, however. The 3060 and 3080 Ti (or equivalent) will come later in the year, maybe in December.
The 12-pin power connector is real, but like the strange cooler we saw earlier in the year, it's a prototype, and maybe they won't put it in the final cards.
The 3070 is within striking distance of the 2080 Ti: a little lower in raster-only workloads, but better in ray tracing. I hope it ends up being $500, but I think it will end up higher than that.
And them being made on Samsung's 8nm means they're gonna run hot.
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
I'm listening to rumours Ampere is a power hog (3080 Ti 300W and 400W overclocked) and 'only' offers 40% more performance over 2080 Ti. Yet the rumours of AMD's big Navi are 50% perf over 2080 Ti.

Could it be happening? Has Nvidia dropped the ball going with Samsung 8nm?
 

kami_sama

Member
Oct 26, 2017
7,045
I'm listening to rumours Ampere is a power hog (3080 Ti 300W and 400W overclocked) and 'only' offers 40% more performance over 2080 Ti. Yet the rumours of AMD's big Navi are 50% perf over 2080 Ti.

Could it be happening? Has Nvidia dropped the ball going with Samsung 8nm?
Probably. If raytracing and dlss weren't a thing this generation would go to AMD 100%. (well there's also the issue with the drivers).
I have a 750w psu, but even I don't want to think about the power consumption of the new cards.
They played too hard with TSMC and they lost. I sure hope they go with them for the cards after these ones. Or Samsung gets a good node.
 

NightmareT

Banned
Aug 21, 2018
116
Probably. If raytracing and dlss weren't a thing this generation would go to AMD 100%. (well there's also the issue with the drivers).
I have a 750w psu, but even I don't want to think about the power consumption of the new cards.
They played too hard with TSMC and they lost. I sure hope they go with them for the cards after these ones. Or Samsung gets a good node.
I think almost the same... but how many AA/AAA games have DLSS? 2? 3?

And ray tracing always gets turned off for the fps.

But I don't know. If AMD doesn't launch something good, I will go with a 3070/3080.
 

seroun

Member
Oct 25, 2018
4,490
I'm listening to rumours Ampere is a power hog (3080 Ti 300W and 400W overclocked) and 'only' offers 40% more performance over 2080 Ti. Yet the rumours of AMD's big Navi are 50% perf over 2080 Ti.

Could it be happening? Has Nvidia dropped the ball going with Samsung 8nm?

As far as I know, the rumours put Big Navi at the level of the 2080 Ti, not 50% above it. I still think AMD needs a bit more time, and there's also the issue of the drivers.

And at least for people who render through GPU, AMD is still not an option.
 

Serious Sam

Banned
Oct 27, 2017
4,354
I'm listening to rumours Ampere is a power hog (3080 Ti 300W and 400W overclocked) and 'only' offers 40% more performance over 2080 Ti. Yet the rumours of AMD's big Navi are 50% perf over 2080 Ti.

Could it be happening? Has Nvidia dropped the ball going with Samsung 8nm?
My 860W Platinum PSU could use a workout, but I doubt these rumors are true.
 
Oct 25, 2017
2,950
The 12-pin connector is targeted at OEMs/system integrators and reference cards. It has been in the works for years. No, you don't need a new PSU.
 

dgrdsv

Member
Oct 25, 2017
12,024
It is cheaper per transistor or they wouldn't do it
What options did they have? Navi 10 consumes ~225W as a 251mm^2 chip on N7. It just wouldn't have been able to compete with Turing at all on a less advanced process; going with N7 was the only viable option for RDNA1. And it certainly wasn't cheaper per transistor than the older 16nm-class node Turing uses.

Simply by shrinking the node you get more chips per wafer, which directly reduces cost proportional to the shrink, provided your yields are decent.
Too bad wafer prices are completely different for different processes, so you can't directly compare the price per mm^2 of chips made on different processes.
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
As far as I know the rumours of big Navi are of it being on the level of 2080 Ti, not 50% more. I still think AMD needs a bit more time and there's also the issue of the drivers.

And at least for people who render through GPU, AMD is still not an option.

There's being pessimistic and then there's this... lol! Seriously though, I'm about 95% sure the top Big Navi card will be faster than a 2080 Ti; it would be pretty disastrous if not. The next-gen consoles' performance level tells us the ballpark.

Probably. If raytracing and dlss weren't a thing this generation would go to AMD 100%. (well there's also the issue with the drivers).
I have a 750w psu, but even I don't want to think about the power consumption of the new cards.
They played too hard with TSMC and they lost. I sure hope they go with them for the cards after these ones. Or Samsung gets a good node.

Rumour is Nvidia will do a TSMC 7nm 'refresh' of the Samsung 8nm Ampere cards next year.
 

seroun

Member
Oct 25, 2018
4,490
There's being pessimistic and then there's this... lol! Seriously though, I'm about 95% sure the top Big Navi card will be faster than a 2080 Ti; it would be pretty disastrous if not. The next-gen consoles' performance level tells us the ballpark.



Rumour is Nvidia will do a TSMC 7nm 'refresh' of the Samsung 8nm Ampere cards next year.

I'm just sad because I can't render with AMD :( And to be fair at this point I've read so many rumours that I don't know what to believe anymore. @.@

Still, I hope they do extremely well. Anything that makes NVIDIA falter is good.
 

Patitoloco

Banned
Oct 27, 2017
23,714
Rumour is Nvidia will do a TSMC 7nm 'refresh' of the Samsung 8nm Ampere cards next year.
Oh god, the wait continues

 

kiguel182

Member
Oct 31, 2017
9,473
No new PSU is good news but the power consumption jump from my 1660 Super to whatever card I get is going to be big.
 

kiguel182

Member
Oct 31, 2017
9,473
My 620W PSU is barely going to get it done with the 3070 or the 3060.

EDIT: I was looking at PCPartPicker and my components use like 180W excluding the GPU. Still, it seems this gen is going to be the ceiling for this PSU. And no OC.
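A rough way to sanity-check a PSU like this is to add a transient factor on top of the GPU's rated power, since real cards spike above their TDP. The GPU wattages and the factor below are assumptions for illustration, not specs for any real card.

```python
# Back-of-envelope PSU headroom check. The GPU wattages and the
# transient factor are assumptions, not specs for any real card.

def psu_headroom(psu_watts: float, rest_of_system_watts: float,
                 gpu_watts: float, transient_factor: float = 1.3) -> float:
    """Fraction of PSU capacity left after a worst-case GPU power spike."""
    peak_draw = rest_of_system_watts + gpu_watts * transient_factor
    return (psu_watts - peak_draw) / psu_watts

rest = 180.0  # CPU, board, drives, fans (PCPartPicker-style estimate)

for label, gpu_w in [("~220W card", 220.0), ("~320W card", 320.0)]:
    print(f"{label}: {psu_headroom(620.0, rest, gpu_w):.0%} headroom on 620W")
```

With these assumptions a ~220W card leaves comfortable margin on a 620W unit, while a ~320W card leaves almost none, which matches the "ceiling this gen, and no OC" conclusion.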
 

tuxfool

Member
Oct 25, 2017
5,858
My 620W PSU is barely going to get it done with the 3070 or the 3060.

EDIT: I was looking at PCPartpicker and my components use like 180W excluding GPU. Still, the ceiling for this PSU is going to be this gen it seems. And no OC.
I mean you're worried about your PSU, but it's probably the cheapest discrete part of your build (maybe with the exception of your case).
 

Bosch

Banned
May 15, 2019
3,680
As far as I know the rumours of big Navi are of it being on the level of 2080 Ti, not 50% more. I still think AMD needs a bit more time and there's also the issue of the drivers.

And at least for people who render through GPU, AMD is still not an option.
The 5700 XT is already on the level of a 2070 Super.

So you believe they are launching a new card at the end of the year that only matches the 2080 Ti? Do you think that makes sense?

The Xbox Series X sits between a 2080 and a 2080 Super, and you think their big chip is only 15% faster than that? Really?
 

Hong

Member
Oct 30, 2017
777
Out of interest, I looked up the largest-capacity consumer PSU you can get. Seems like 2000W is the limit, but you can also get this 9000W bad boy. Guess I'll go for that one, just to be safe.

I have a RM550x PSU and I'm sticking to it. Nothing can go wrong, right?
 

Sqrt

Member
Oct 26, 2017
5,918
So, the 8nm thing is almost set in stone, then? Pity, but I hope it translates into better prices at least...
 

kiguel182

Member
Oct 31, 2017
9,473
Yup, I also don't really want to deal with taking it out and connecting everything back up.

I had to switch a cooler the other day and I'm good with banging my head against this PC for the next year or two.

Anyway, seems like 620W will be fine and the same connectors.
 