
Veliladon

Member
Oct 27, 2017
5,578
OUCH!!! While my PSU is more than adequate lol, the estimated cost per month hurt my feelings!! $555 per year for this thing!
Exactly, people overestimate the power usage and underestimate the cost. I also run 24/7, and I'm rethinking that; it's just not friendly for the environment.

It's also not realistic to draw that much power 24/7. My machine idles at <100W most of the time; it jumps to maybe 400W while gaming. Even at the ridiculous price of US$0.20/kWh, that's only $175/year idling.
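For reference, a minimal back-of-the-envelope sketch of that math in Python, assuming the ~100W idle / ~400W gaming figures and the US$0.20/kWh rate quoted above (the 4-hours-a-day gaming split is a made-up example):

```python
# Rough annual electricity cost, using the figures quoted above:
# ~100 W at idle, ~400 W while gaming, and a deliberately high US$0.20/kWh rate.
IDLE_WATTS = 100
GAMING_WATTS = 400
PRICE_PER_KWH = 0.20          # USD per kWh (worst-case assumption)
HOURS_PER_YEAR = 24 * 365

def annual_cost(watts: float, hours: float, price: float = PRICE_PER_KWH) -> float:
    """Convert an average draw (W) over a number of hours into a yearly dollar figure."""
    kwh = watts * hours / 1000
    return kwh * price

# 24/7 idle, matching the ~$175/year estimate above:
print(f"Idle 24/7: ${annual_cost(IDLE_WATTS, HOURS_PER_YEAR):.0f}/yr")

# Hypothetical split: 4 hours of gaming per day, idle the rest of the time (~$263/yr):
gaming_hours = 4 * 365
mixed = (annual_cost(GAMING_WATTS, gaming_hours)
         + annual_cost(IDLE_WATTS, HOURS_PER_YEAR - gaming_hours))
print(f"4h/day gaming, idle otherwise: ${mixed:.0f}/yr")
```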
 

Nikokuno

Unshakable Resolve
Member
Jul 22, 2019
767
I want a new GPU, and I want to enjoy the summer; at the same time, I'd like to know what Nvidia and AMD have to offer.

Also, all the rumours about Ampere's power consumption are a bit scary, and also weird coming from Nvidia's recent standards.
Looking at my 550W Gold PSU, I hope I won't have to buy a new PSU too.
 

nitewulf

Member
Nov 29, 2017
7,294
It's also not realistic to draw that much power 24/7. My machine idles at <100W most of the time; it jumps to maybe 400W while gaming. Even at the ridiculous price of US$0.20/kWh, that's only $175/year idling.
Yeah, absolutely, it's not running full blast all the time. IMO we should all still try to be more sustainable.
 

PotionBleue

Member
Nov 1, 2017
470
Is Ampere expected to bring improvements to hardware video encoding/NVENC?

I found it surprisingly good with my RTX card compared with x264 or Intel QuickSync.
 

asmith906

Member
Oct 27, 2017
27,706
The really important part is the pricing. If the pricing tiers are kept the same, then a 50%-odd performance increase over the 2080 for the 3080 is great (plus, it would give us 1080Ti owners a viable upgrade). If they decide to increase the prices again, then it might not be so good. Not that I think they're going to pull a Turing and shift the prices to the extent of the 3080 costing the same as a 2080Ti, but I wouldn't be surprised in the slightest if there was a $100-$200 hike.
I fully expect Nvidia to keep the same prices from the 20 series.

Nvidia's MSRP prices are BS though, so expect to pay at least $100-$200 over whatever they announce.
 

Flandy

Community Resettler
Member
Oct 25, 2017
3,445
Does the rumored 300W+ TDP mean these cards are gonna run stupid hot? My 180W GTX 1080 already increases my room temperature a few degrees when under heavy load, so I'm really worried the 3000 series is gonna make my room unbearable :/
 

Telaso

Member
Oct 25, 2017
1,684
Does the rumored 300W+ TDP mean these cards are gonna run stupid hot? My 180W GTX 1080 already increases my room temperature a few degrees when under heavy load, so I'm really worried the 3000 series is gonna make my room unbearable :/

Should probably wait for the partner cards with extra cooling to come out, then. That's what I'll probably do.

Edit: reading comprehension is not my strong suit....
 
Last edited:
Oct 25, 2017
9,872
Another 750W Gold here, only because it was available. For 13 years I've had 600W-650W for my computers.
I bought my Seasonic 750W platinum just because it was the first PSU that I found that was available at MSRP and was highly rated. There were some 500W options and non-modular options but I didn't want to deal with the wires. The PSU shortage right now is very real. Also it came with a 10-year warranty so I figured that the extra cost would be made up if I use it in at least one more build in the future.
 

low-G

Member
Oct 25, 2017
8,144
Does the rumored 300W+ TDP mean these cards are gonna run stupid hot? My 180W GTX 1080 already increases my room temperature a few degrees when under heavy load, so I'm really worried the 3000 series is gonna make my room unbearable :/

I'm worried about fan noise if it's that high. My 2080 with a 3-slot HSF can already easily run at 80°C depending on my fan curve, and it's already noisy at that level.
 

TheNerdyOne

Member
Oct 28, 2017
521
Even in that situation it's not drawing 250w when playing a game.
Absolutely correct, I was just correcting the statement that it's not happening *at all*. If you have an HEDT CPU like a third-gen Threadripper, then yeah, 250W will be the norm... during multithreaded productivity loads. Gaming power draw on basically all of these CPUs is like 50-90W; it's not even worth caring about in this context.
 

Telaso

Member
Oct 25, 2017
1,684
The amount of heat generated will be the same no matter what kind of cooler is used. A better cooler just moves more heat from the card into the ambient environment.

That's not how that works. The card will still be using the same amount of power. A better cooler will only make the card run cooler.

I read Aqua's post wrong. I took it as Aqua was worried about the computer getting too hot and not the room.

Derp.
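For context on the room-heating question above, a rough sketch of how system power translates into heat dumped into the room; it assumes essentially all drawn power ends up as heat, and the whole-system wattages are illustrative guesses, not measurements:

```python
# Essentially all power a PC draws ends up as heat in the room; a better cooler
# only moves heat off the GPU faster, it doesn't reduce the total heat output.
def watts_to_btu_per_hr(watts: float) -> float:
    """1 W of continuous dissipation is roughly 3.412 BTU/hr of heat output."""
    return watts * 3.412

# Hypothetical whole-system gaming loads (GPU + CPU + everything else):
systems = {
    "System with a 180W GTX 1080": 350,
    "System with a rumored 350W Ampere card": 550,
}
for name, watts in systems.items():
    print(f"{name}: ~{watts_to_btu_per_hr(watts):.0f} BTU/hr "
          f"(a small space heater on its low setting is roughly 2,500 BTU/hr)")
```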
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
If $699, that's not bad. Pretty much the same core count as the 2080 Ti, so basically you're getting a much improved 2080 Ti with better RT and faster memory (although losing 1GB) for quite a bit less money.

We'll see if they hit that price point. Hope the 3070 can hit $500 as well.
 

TheDutchSlayer

Did you find it? Cuez I didn't!
Member
Oct 26, 2017
7,092
The Hague, The Netherlands
If $699, that's not bad. Pretty much the same core count as the 2080 Ti, so basically you're getting a much improved 2080 Ti with better RT and faster memory (although losing 1GB) for quite a bit less money.

We'll see if they hit that price point. Hope the 3070 can hit $500 as well.
LOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOL in your dreams.
Mid-range cards at launch have not been around $500 for a few gens now, sadly :(
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
The 2070 was $599; I can't remember the Super price though.

That was for the FE.

[Images: RTX 20-series pricing charts]
 

Eternia

Member
Oct 25, 2017
501
Some people's expectations are delusional. Even the 1080 was less than 30% faster than the 980 Ti on a node shrink. If the 3080 is slightly above 20% over the 2080 Ti without a price increase from last generation, it seems fine.
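As a quick sanity check on how these percentages line up with the "50%-odd over the 2080" figure quoted earlier, here's the arithmetic; the 25% gap between the 2080 Ti and the 2080 is a ballpark assumption, not benchmark data:

```python
# If a 3080 were ~50% faster than a 2080, and a 2080 Ti is ~25% faster than a 2080
# (ballpark assumption), the implied gain over the 2080 Ti is only ~20%.
gain_over_2080 = 1.50   # hypothetical 3080 vs 2080
ti_over_2080 = 1.25     # assumed 2080 Ti vs 2080
gain_over_2080_ti = gain_over_2080 / ti_over_2080
print(f"Implied gain over the 2080 Ti: {(gain_over_2080_ti - 1) * 100:.0f}%")  # -> 20%
```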
 

Ionic

Member
Oct 31, 2017
2,742
Some people's expectations are delusional. Even the 1080 was less than 30% faster than the 980 Ti on a node shrink. If the 3080 is slightly above 20% over the 2080 Ti without a price increase from last generation, it seems fine.

I hate that these insane graphics cards prices have been normalized. I don't think the price of the 2080 Ti was fine to begin with.
 

mordecaii83

Avenger
Oct 28, 2017
6,880
Some people's expectations are delusional. Even the 1080 was less than 30% faster than the 980 Ti on a node shrink. If the 3080 is slightly above 20% over the 2080 Ti without a price increase from last generation, it seems fine.
Yep, exactly. The real question is going to be the pricing, I'd prefer they lower the pricing across the board but if not they at least need to not raise the price. With Turing they had an excuse since they had monster size chips due to not having a node shrink.
 

Eternia

Member
Oct 25, 2017
501
I hate that these insane graphics cards prices have been normalized. I don't think the price of the 2080 Ti was fine to begin with.
That's fair. I was really more focused on the performance gains between generations. As long as it's close to the gains of Pascal, I'd mark that down as a success. It would be nice for pricing to come down especially with Nvidia expressing some disappointment with initial RTX sales.
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
Absolutely correct, I was just correcting the statement that it's not happening *at all*. If you have an HEDT CPU like a third-gen Threadripper, then yeah, 250W will be the norm... during multithreaded productivity loads. Gaming power draw on basically all of these CPUs is like 50-90W; it's not even worth caring about in this context.

The problem for me is not the PSU failing or being on the edge, which would be unusual cases anyway.

It would be the heat, and thus noise, generated from a high-power card as rumoured, paired with a very hot and power-hungry CPU (9900K or 10900K). I am past dealing with a noisy system and I'm past dealing with my room heating up after a gaming session; I thought this was all in the past!

There is a certain joy from gaming on a cool and quiet system.
 

1-D_FE

Member
Oct 27, 2017
8,323
Would be funny a couple of years ago but with Navi 21 this may well be an option finally.
Which is also why people expecting the doubling of 2080 price or something are weird.

I definitely agree that doubling the price would be suicide. But beyond that, I have zero expectations on pricing. You can't tell me Jensen Huang and his distant relative, Lisa Su, haven't had conversations about this. The GPU market is a niche, hardcore market. Price wars don't lead to growing the pie; they may move market share a couple of percentage points here or there, but ultimately they destroy the margins for both without bringing in many new customers. I expect Nvidia will charge the most it thinks its customers are willing to pay, and AMD will undercut that pricing slightly.
 

1-D_FE

Member
Oct 27, 2017
8,323
The problem for me is not the PSU failing or being on the edge, which would be unusual cases anyway.

It would be the heat, and thus noise, generated from a high-power card as rumoured, paired with a very hot and power-hungry CPU (9900K or 10900K). I am past dealing with a noisy system and I'm past dealing with my room heating up after a gaming session; I thought this was all in the past!

There is a certain joy from gaming on a cool and quiet system.

I guess this is what the death of Moore's law looks like. We're still getting good gains in performance, but we're going Back to the Future in terms of power draw (and its ability to heat a room up).

What I wouldn't give to know the new Xbox's power draw. Have they truly made headway here? Or are the new consoles going to be power monsters? Digital Foundry would never do it (since it would kill their relationship with MS), but it would have been amazing if they had plugged their test machine into a Kill-a-Watt device. That would give us some indication of whether AMD's new GPU architecture has improved at all in this area.
 

Bosch

Banned
May 15, 2019
3,680
AMD it's always "wait for x..." where x has no real competition against Nvidia's top end.
It is not always like that; before the 900 series, AMD was a real competitor in this market. The 5700 XT showed they can compete again.

It was the same thing with CPUs before Ryzen, and look at how that market is now...

If they deliver a GPU 50% more powerful than the 2080 Ti they are in the game, even 40%.
 

Wowzors

The Wise Ones
Member
Oct 27, 2017
1,712
I'm not sure, but I think with the progression of cards today it's more of a "don't upgrade every generation, maybe every other generation" situation. I don't see the 20 series to 30 series being a substantial jump; however, if you are on the 10 series or lower, the 30 series should be a substantial upgrade.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360

Yeah, but because Nvidia prices $100 above MSRP, pretty much everyone else does too, and most aftermarket cards are actually nicer than the Founders Edition as far as PCB and power delivery go, and some have three fans. Why would they not charge the same price Nvidia is charging? The 2070 Super launching at $500, even for Nvidia's own model, was effectively a price drop.

You couldn't find a $500 2070 for months after launch outside of maybe a blower model, and even that I'm not sure on. Sometimes those aren't really even cheaper, because people sometimes seek them out for SFF cases. EVGA with their Black series is pretty much the only vendor that stays close to the MSRP, and those would sell out immediately for months.

The whole Founders thing is skeezy in general. Just launch with one actual MSRP for your own product like everyone else does, not "hey, MSRP is $499, but here is our $600 version; board partners, feel free to also charge $600!" Board partners can certainly charge more than MSRP, but a reference card should be priced at the MSRP.
 

Deleted member 2834

User requested account closure
Banned
Oct 25, 2017
7,620
Why're you all flexing your 750W PSUs? Was some news revealed that would lead you to believe you'd need gigantic PSUs? I got a 650W and I was always told it's oversized for single-GPU rigs.
 

SharpX68K

Member
Nov 10, 2017
10,615
Chicagoland
Would make about zero sense to do this cost-wise, so nah.

N5 quotas are already bought up, NV has their share too.

Also about zero chance of any "MCM" or "chiplet" GPUs reaching consumers any time soon from any IHV. Hopper is likely a 100% data center / AI / HPC oriented design. No clear info on if it's even a separate architecture or just Ampere chip(let)s in a multichip configuration.

I'd give about a 1% chance of the first happening with Navi 2x vs Ampere, and the second is basically guaranteed.

Thanks for the reply and especially thanks for pointing out what is very unlikely.

I guess it's going to take a few additional years before GPU MCM or GPU chiplets become useful for games and are made easy for developers to code for.
Having 2 or 4 separately manufactured pieces of silicon on a card, where they act as one huge GPU to developers, is not an easy thing, and it's not here yet, and probably won't be in 2022. Maybe 2024? These would also need to use HBM-type memory to save on power consumption.
 

dgrdsv

Member
Oct 25, 2017
12,181
I definitely agree that doubling the price would be suicide. But beyond that, I have zero expectations on pricing. You can't tell me Jensen Huang and his distant relative, Lisa Su, haven't had conversations about this. The GPU market is a niche, hardcore market. Price wars don't lead to growing the pie; they may move market share a couple of percentage points here or there, but ultimately they destroy the margins for both without bringing in many new customers. I expect Nvidia will charge the most it thinks its customers are willing to pay, and AMD will undercut that pricing slightly.
Oh, we definitely shouldn't expect either company to gift their new GPUs for peanuts or anything. But considering that NV's top end will launch with competition coming a couple of months after it, expecting them to increase prices is equally unrealistic. This upcoming gen will be a big fight between the two, which means that anything above $1000 is likely out, if only because either company would be forced to introduce a much cheaper card with essentially the same performance (think 1080 Ti vs GTX Titan and such). With NV opting for a supposedly cheaper Samsung 8nm process, they seem to be fully aware of what's coming and are trying to protect their margins this way, which means that prices won't go up and have a good chance of actually coming down from where they were with Turing and RDNA1.

AMD it's always "wait for x..." where x has no real competition against Nvidia's top end.
Navi 21 is top end though and it should launch this year so there's that.

I guess its going to take a few additional years before GPU MCM or GPU Chiplets will become useful for games, and made easy to code for, for developers.
It's... going to be more than some years, I think. Chiplet design for GPUs will prompt a total rebuild of how games are programmed and the likely dumping of rasterization h/w in favor of something more natively parallel. It won't happen at the beginning of a new console gen which is, for all intents and purposes, very old school in its GPU approach. By the end of this gen though, or at the start of the next one, when we'll be testing 3nm processes which are very likely to be the end of scaling on silicon - well, maybe. But that's 2025+ at best, I think.
There will likely be some HPC / DL testing of chiplet designs though; we've already seen research on that from several IHVs, and it's very probable that we'll see products aimed at pure GPU compute markets much sooner than anything like this comes to gaming.
 

Caz

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
13,055
Canada
Why're you all flexing your 750W PSUs? Was some news revealed that would lead you to believe you'd need gigantic PSUs? I got a 650W and I was always told its oversized for single GPU rigs.
Depending on which card you go with in the Ampere lineup, your PSU might need to supply your GPU alone with 300-350W, more if you overclock it.

This is the one area where I don't expect a major difference between Big Navi/RDNA 2 and Ampere, given how power hungry the RX 5000 series cards were, and I say that as someone who uses a 5700 XT in a mini build.
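For what it's worth, here's a rough rule-of-thumb sketch of why 750W units keep coming up in this thread; the 200W CPU figure, the 75W for everything else, and the 30% safety margin are assumptions for illustration, not a formal sizing guide:

```python
# Very rough PSU sizing: sum the steady-state component draws, then pad for
# transient spikes, capacitor aging, and staying in the efficient part of the load curve.
def suggested_psu_watts(gpu_w: float, cpu_w: float, other_w: float = 75,
                        margin: float = 1.3) -> float:
    return (gpu_w + cpu_w + other_w) * margin

# e.g. a rumored 350W Ampere card with a 9900K/10900K-class CPU under combined load:
print(f"Suggested PSU: ~{suggested_psu_watts(gpu_w=350, cpu_w=200):.0f} W")  # ~812 W
```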
 

SharpX68K

Member
Nov 10, 2017
10,615
Chicagoland
It's... going to be more than some years, I think. Chiplet design for GPUs will prompt a total rebuild of how games are programmed and the likely dumping of rasterization h/w in favor of something more natively parallel. It won't happen at the beginning of a new console gen which is, for all intents and purposes, very old school in its GPU approach. By the end of this gen though, or at the start of the next one, when we'll be testing 3nm processes which are very likely to be the end of scaling on silicon - well, maybe. But that's 2025+ at best, I think.
There will likely be some HPC / DL testing of chiplet designs though; we've already seen research on that from several IHVs, and it's very probable that we'll see products aimed at pure GPU compute markets much sooner than anything like this comes to gaming.

So maybe, just maybe, chiplet design for GPUs might be viable in time for the PS6 and the equivalent Xbox, assuming they are released Holiday 2027?

Or is even that too optimistic to think? (I know it's WAY too early to know.)
 

camelkong

Member
Feb 26, 2018
68
Ugh, am I gonna be good next gen with a 2070 Super or did I blow my money too early?
Edit: guess I didn't notice this was an older thread I already asked this on lol
 

Caz

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
13,055
Canada
Ugh, am I gonna be good next gen with a 2070 Super or did I blow my money too early?


You should be fine but I would have recommended waiting to see what NVIDIA and AMD have to offer in a few months time, if only to see how much better ray-tracing performance is on either company's cards.
 

LegendX48

Member
Oct 25, 2017
2,072
Ugh, am I gonna be good next gen with a 2070 Super or did I blow my money too early?
Depends on the resolution you game at. I have a 2070 Super as well and, at 1440p, certain titles from 2016 and 2018 are kind of tough. Barely getting 60fps in Assassin's Creed Odyssey, and going through areas in Deus Ex: Mankind Divided where the fps drops to the high 40s/low 50s, is kind of wild for a $560 2018/9 GPU imo.
 

camelkong

Member
Feb 26, 2018
68
You should be fine but I would have recommended waiting to see what NVIDIA and AMD have to offer in a few months time, if only to see how much better ray-tracing performance is on either company's cards.
I mean, I got it like a year ago (and then the first one died and I had to get a replacement, naturally), so I do think I did OK with the info I had at the time, but I guess I'm getting another one in like a year and a half, which blows.
 