Siresly

Prophet of Regret
Member
Oct 27, 2017
6,639
People say the 3080 is overkill for 1440p, that you won't be fully utilizing the GPU, and that seems fair and true.
But that won't continue being the case in perpetuity. I'm planning on still using this thing in 2025, where it might still be doing ok at 1440p.
What GPU makes sense depends on what upgrade cycle you subscribe to. I spend more but less often.
 

Samiya

Alt Account
Banned
Nov 30, 2019
4,811
Really appreciated Hardware Nexus saying that if you're still good with a 1080 Ti for your gaming needs, just wait and avoid contributing to more wasteful consumption. It's good to have tech dudes who are levelheaded and not hyping up hyper-consumerism (unlike Digital Foundry, for instance).
 

VZ_Blade

One Winged Slayer
Avenger
Oct 27, 2017
1,339
Is it going to be overkill if I game at 1440p and pair it with a Ryzen 2600? I plan to use ray tracing and DLSS whenever they're an option, so would my CPU be less of a bottleneck then?
 

iceblade

Member
Oct 25, 2017
4,295
Not much at all. Look at the Tom's Hardware CPU review in the OP.

They tested it with a 4770K and got only a minor dip in performance. And it held up strong at 4K as well.

6700K should be fine, as well as any other Haswell or newer quad-cores.

Looks like the diminishing returns correlate with monitor refresh rate, which in turn sets your FPS limit, assuming most people want to lock the two together to prevent screen tearing (Vsync or Gsync). So figure 4K/60fps is a solid sweet spot to target with your CPU.
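To put rough numbers on that (my own illustration, not from the Tom's review), the frame-time budget shrinks fast as refresh goes up, which is exactly where the CPU starts to matter:

```python
# Frame-time budgets at common refresh rates (my own illustration,
# not from the Tom's Hardware review). If you lock FPS to refresh
# with Vsync/Gsync, this is the budget the CPU has to hit per frame.
for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:5.2f} ms per frame")
# 60 Hz leaves a roomy 16.67 ms; at 144 Hz there's only 6.94 ms,
# which is where an older quad-core starts to become the limiter.
```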

Thanks for posting this. It's super helpful.
 

Fredrik

Member
Oct 27, 2017
9,003
I feel like AMD has an opening here. While the 3000 series cards are monsters in the end, the 3080's performance is nowhere near Nvidia's claimed 2x over the 2080, and the power usage is very high. According to Techspot they may have been designed a bit more as server chips. We'll see.

I didn't think Big Navi could compete with the 3090, but now I think it's easily possible (of course then we'll get a 3090 Ti and the rat race continues).

I felt the same way for years vs Intel. I saw Intel as potentially "wasting" all that die space by putting an IGP on every single CPU, even for discrete GPU users, for example.
Yeah, now I'm thinking the low prices on the 3080 and 3070 are a direct result of Nvidia finding out how powerful Big Navi is.
I'll still go with Nvidia, but it would be great for everyone if AMD got close.
DLSS is difficult to beat though.
 

strudelkuchen

Member
Oct 25, 2017
10,298
PSA for owners of modest PSUs.

The RTX 3080 can draw more than 350W at stock and more than 400W when overclocked. While average usage is of course much lower, these spikes are what make your system shut down or crash. I've already seen some reviewers have their systems shut down due to insufficient power.

[charts: power draw spikes at stock and when overclocked]
You can lower the power limit to ~270W instead, for a modest ~4% performance loss.
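If anyone would rather script that than click through Afterburner, here's a rough sketch using the nvidia-ml-py (pynvml) bindings; it needs admin/root, and the 270W figure is just ComputerBase's number, so treat it as a starting point. The one-liner equivalent is `nvidia-smi -pl 270`.

```python
# Rough sketch: cap the GPU power limit via NVML (pip install nvidia-ml-py).
# Needs admin/root; 270 W is just the figure from the ComputerBase test.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementLimit, nvmlDeviceSetPowerManagementLimit,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)              # first GPU in the system
    print(f"current limit: {nvmlDeviceGetPowerManagementLimit(gpu) / 1000:.0f} W")
    nvmlDeviceSetPowerManagementLimit(gpu, 270_000)  # NVML works in milliwatts
finally:
    nvmlShutdown()
```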

www.computerbase.de
Nvidia GeForce RTX 3080 FE im Test: Effizienz, PCIe 4.0 vs. 3.0, FE-Kühler, Fortnite RTX und Async DLSS / RTX 3080 vs. RTX 2080 Ti bei 270 Watt
There is also this undervolting video going around with similar results:
 

Yerffej

Prophet of Regret
Member
Oct 25, 2017
24,030
People say the 3080 is overkill for 1440p, that you won't be fully utilizing the GPU, and that seems fair and true.
But that won't continue being the case in perpetuity. I'm planning on still using this thing in 2025, where it might still be doing ok at 1440p.
What GPU makes sense depends on what upgrade cycle you subscribe to. I spend more but less often.
Precisely my case and I'm thinking along the same lines. For the type of user I am, it'll scale well as we go deeper into the generation.
 

WackoWambo

Member
Jan 11, 2018
1,331
Not much at all. Look at the Tom's Hardware CPU review in the OP.

They tested it with a 4770K and got only a minor dip in performance. And it held up strong at 4K as well.

6700K should be fine, as well as any other Haswell or newer quad-cores.

Looks like the diminishing returns correlate with monitor refresh rate, which in turn sets your FPS limit, assuming most people want to lock the two together to prevent screen tearing (Vsync or Gsync). So figure 4K/60fps is a solid sweet spot to target with your CPU.
These 4-core CPUs are going to be destroyed by next-gen games; if you want an example, try playing Microsoft Flight Simulator.

If you are building a new PC you need 6 cores minimum, or ideally 8. You might skate by until 2022, but after that you'll hit unplayable territory.
 

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
Yeah, can't say that I'm impressed by the reviews; I'll probably just wait it out on my 2080 till AIB models and Navi 21.

The card's performance is fine and the FE cooler does a good enough job keeping it cool and quiet, but it would be a lot more impressive at 220W instead of 320W.
I'm also wondering if 3x 8-pin would allow for a better clocking curve here, since the FE is 100% power limited, which results in some pretty wild boost clock swings.
So I'm fine with waiting a couple more months to see the full picture, thanks, NV.

Not really understanding your sort of 180 on these. You're too long in the tooth, as am I, to have bought into the Nvidia marketing. We've known for months that the power draw is crazy high, and the leaked reviews last week pegged the performance increase at 'just' 30%, which is consistent across all the reviews today. So why the disappointment?

Because overclocking headroom is low? The RT performance gains are small? The modest performance increase at 1440p?
 

PennyStonks

Banned
May 17, 2018
4,401
I am one of them. You shouldn't be disappointed by how others spend their money. We all live differently and have different setups in mind. I'm glad Nvidia has a good, versatile lineup that fits a lot of people's budgets and desires.
I'm not disappointed in people spending money. I'm surprised people thought they'd be getting a big upgrade over a 2080 Ti for $700.
 
Oct 27, 2017
6,906
These 4-core CPUs are going to be destroyed by next-gen games; if you want an example, try playing Microsoft Flight Simulator.

If you are building a new PC you need 6 cores minimum, or ideally 8. You might skate by until 2022, but after that you'll hit unplayable territory.

Out of curiosity: back when people kept insisting on using their dual-core CPUs, when did quad-cores start to become mandatory?

But yeah, people should really start looking into 8c/16t for their next build if they haven't already.
 

eonden

Member
Oct 25, 2017
17,152
On Linus Tech Tips: 1 FPS difference between a 3080 and a 2080 Ti in MS Flight Simulator. Ouch.
CPU-capped game...

These 4-core CPUs are going to be destroyed by next-gen games; if you want an example, try playing Microsoft Flight Simulator.

If you are building a new PC you need 6 cores minimum, or ideally 8. You might skate by until 2022, but after that you'll hit unplayable territory.
Other way around for MSFS: it really needs great single-core performance rather than more cores.
 

exodus

Member
Oct 25, 2017
9,963
People say the 3080 is overkill for 1440p, that you won't be fully utilizing the GPU, and that seems fair and true.
But that won't continue being the case in perpetuity. I'm planning on still using this thing in 2025, where it might still be doing ok at 1440p.
What GPU makes sense depends on what upgrade cycle you subscribe to. I spend more but less often.

It's only overkill if you're just interested in 60Hz. If you're aiming for 1440p144, that's roughly the same pixel throughput as going for 4K60.
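Quick back-of-the-envelope on that claim (assuming uniform per-pixel cost, which real games don't quite have):

```python
# Pixel-throughput comparison: 1440p144 vs. 4K60 (uniform per-pixel
# cost assumed, which real games don't have, so it's only a rough match).
modes = {"1440p144": (2560, 1440, 144), "4K60": (3840, 2160, 60)}
for name, (w, h, hz) in modes.items():
    print(f"{name}: {w * h * hz / 1e6:.0f} Mpix/s")
# 1440p144 ~531 Mpix/s vs. 4K60 ~498 Mpix/s: close enough to call equal.
```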
 

WackoWambo

Member
Jan 11, 2018
1,331
Out of curiosity: back when people kept insisting on using their dual-core CPUs, when did quad-cores start to become mandatory?

But yeah, people should really start looking into 8c/16t for their next build if they haven't already.
The transition started in the late 2000s, but by 2013 you were running into significant issues with dual-core CPUs.

With the consoles choosing 8 cores as the baseline, adoption will be much faster this time. 8 CPU cores and SSDs becoming requirements is something PC gamers aren't expecting, just like when the Xbox One and PS4 GPUs shipped with more VRAM than expected, which is what made new games like Shadow of Mordor require the newer GPUs on PC.
 
Oct 27, 2017
1,743
USA
I run 1440p at standard aspect ratio and am heavily considering an upgrade for the same reason (DSR and more stable 144Hz Gsync framerates in more demanding games), but I can't help but feel like I should maybe hold out for the inevitable 3080 Ti.

Also a question: with ultrawide gaming, do many games officially support it? In FPS games that have radars and HUD elements, aren't they commonly positioned so far into the corners that you have to completely scan your eyes or turn your head to see your radar/health/etc.? Back when I played Destiny 2 on PC, I had a friend I played with regularly who praised his ultrawide, but he said the one downside was that it hampered his ability to quickly see the radar, since the game didn't let you reposition it closer to the middle of the screen. That's just not something I think I could put up with, and it's been holding me back from feeling confident about picking one up.
I think the benefits of ultrawide outweigh the cons.

Sure, some games' HUD elements can't be moved and are a pain to see (I'm sure the BF V minimap being off in the corner has gotten me killed before), but the extra-wide view of the game world far outweighs that.

It's pretty standard for most new games these days to support 21:9, and if a game doesn't, someone will mod it in.
 

Kingpin Rogers

HILF
Banned
Oct 27, 2017
7,459
If I can't get the Founders Edition tomorrow, should I try and get one of the cheaper aftermarket cards? I'm reluctant because I've read mostly bad things about cheaper aftermarket cards: higher failure rates and noise/temp issues with a lot of them.
 

NaDannMaGoGo

Member
Oct 25, 2017
6,010
You can lower the power limit to ~270W instead, for a modest ~4% performance loss.

www.computerbase.de
Nvidia GeForce RTX 3080 FE im Test: Effizienz, PCIe 4.0 vs. 3.0, FE-Kühler, Fortnite RTX und Async DLSS / RTX 3080 vs. RTX 2080 Ti bei 270 Watt

There is also this undervolting video going around with similar results:


If the settings used in the video run stable under stress tests, then... yeah, honestly it looks like a terrible choice not to undervolt. I wonder if you could go down to 250W (roughly 20% less power draw) and still stay within single-digit performance loss. Just seems like a no-brainer to me for most situations. Whenever I did want those few extra FPS over the disproportionately higher heat and noise, I could just change settings real quick. A swift profile swap in MSI Afterburner is a non-issue.
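Back-of-the-envelope on the efficiency win; only the 320W and 270W numbers come from the ComputerBase test, the 250W row is pure speculation on my part:

```python
# Perf-per-watt at different power limits. Only the 320 W and 270 W rows
# come from the ComputerBase test; the 250 W row is speculation on my part.
points = [
    ("stock 320 W",          320, 1.00),  # baseline
    ("capped 270 W",         270, 0.96),  # ~4% loss per ComputerBase
    ("capped 250 W (guess)", 250, 0.93),  # hypothetical single-digit loss
]

base_watts, base_perf = 320, 1.00
for label, watts, perf in points:
    eff = (perf / watts) / (base_perf / base_watts)  # perf/W relative to stock
    print(f"{label:>21}: {watts / base_watts:4.0%} power, "
          f"{perf:4.0%} perf, {eff:4.0%} perf/W")
```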
 

WackoWambo

Member
Jan 11, 2018
1,331
CPU-capped game...


Other way around for MSFS: it really needs great single-core performance rather than more cores.
I switched from a 5GHz 7600K to a stock 3700X this month; I promise the core count going from 4 to 6 helped more than anything. You can watch the Digital Foundry video on it.

Allegedly MSFS cannot use more than 6 cores atm, which is why I said 6. I know my CPU has 8 cores lol.
 

Qudi

Member
Jul 26, 2018
5,350
This card is going to be a great 4K card right now and a fantastic 1440p card for the long term.
 

Gatti-man

Banned
Jan 31, 2018
2,359
Depends on which AIB reviews we are getting tomorrow. If it's the low-tier cards then that's not very interesting, as they have a lower power limit than the FE card (350W vs 370W). What really interests me is the STRIX OC and similar-caliber cards that come with 3x 8-pin and ample power headroom enabled in the BIOS.
Why are people so excited for these cards with OC room? I'm already concerned about heat output, and wouldn't more OC pump out considerably more heat for like 5% more performance?
 

dgrdsv

Member
Oct 25, 2017
12,091
Wonder what power consumption would have been like had they gone the TSMC route. Keeping the specs the same (talking 3080 here), would you have any idea how much power they could have saved?
We can only guess and somewhat estimate from GA100 results, which are hard to come by.
Personally I don't think the influence of 8N is that big; they may have lost some 10% or so of perf/watt here. If it were more, they wouldn't have gone with 8N over N7.

AIB models are indeed launching tomorrow, as well, with just a few exceptions (FTW3, for example)
I don't expect good models till November or so.
I'd say that a good custom 3080 is a 3x 8-pin card with a solid, huge cooler that won't try to go much higher than stock clocks by default.
Something like the Palit Jetstream should be good this time. But they won't launch in the first wave of AIB models.

Not really understanding your sort of 180 on these. You're too long in the tooth, as am I, to have bought into the Nvidia marketing. We've known for months that the power draw is crazy high, and the leaked reviews last week pegged the performance increase at 'just' 30%, which is consistent across all the reviews today. So why the disappointment?
I'm not disappointed, I just don't see many reasons to rush an upgrade from my 2080 to the 3080 just yet. Power is on the high side and I want to see how AIB models handle this first. It's also a good idea to watch where Navi 21 lands in relation to GA102.

Because overclocking headroom is low? The RT performance gains are small? The modest performance increase at 1440p?
I mean, looking at the results which matter, I see around +60-80% over the 2080 and +40-50% over the 2080 Ti. The gains are solid, and they will improve in next-gen titles which push math and RT even harder.
But yeah, right now and without a 4K display it seems like overkill for current-gen games, so again, I don't see any reason to rush out and get one. I'll wait.
 
Last edited:

StereoVSN

Member
Nov 1, 2017
13,620
Eastern US
I mean, looking at the results which matter, I see around +60-80% over the 2080 and +40-50% over the 2080 Ti. The gains are solid, and they will improve in next-gen titles which push math and RT even harder.
But yeah, right now and without a 4K display it seems like overkill for current-gen games, so again, I don't see any reason to rush out and get one. I'll wait.
I am sitting on a 1080 Ti, which is still a pretty good card but lacks DLSS and RTX. I kind of want those for Cyberpunk, but my brain is telling me to wait for AMD and third parties as well. Plus, giving that game a couple of months to get patched isn't a bad idea.
 

eonden

Member
Oct 25, 2017
17,152
I switched from a 5GHz 7600K to a stock 3700X this month; I promise the core count going from 4 to 6 helped more than anything. You can watch the Digital Foundry video on it.

Allegedly MSFS cannot use more than 6 cores atm, which is why I said 6. I know my CPU has 8 cores lol.
Oh that makes more sense XD
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
I don't expect good models till November or so.
I'd say that a good custom 3080 is a 3x 8-pin card with a solid, huge cooler that won't try to go much higher than stock clocks by default.
Something like the Palit Jetstream should be good this time. But they won't launch in the first wave of AIB models.

You don't expect the Strix, Trio X, or FTW3 to be good?
Because outside of very high-end models like the Matrix/Lightning/etc., those are the best customs those brands will offer.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Surprised so many folks with ~600W PSUs are even buying this card. Like, wait a sec for Big Navi and the 3070 instead of having to order a new PSU as well.
 

Slack Attack

Member
Oct 28, 2017
819
Surprised so many folks with ~600W PSUs are even buying this card. Like, wait a sec for Big Navi and the 3070 instead of having to order a new PSU as well.

I think the problem many of us face is that if we wait for the 3070 benchmarks, we won't be able to get our hands on a 3080 if we decide the 3070 won't cut it. I'm personally aiming for a 1440p/144Hz-type build.

Some of the components in my PC are ancient. I'm still rocking a GTX 970 paired with a 600W PSU. I'm absolutely going to buy either the 3070 or the 3080, I'll be upgrading my 1080p 60Hz monitor, and I'll likely need a new PSU no matter which GPU I go with. Unfortunately, if I hold out for the 3070 reviews, I may not have a new card until January/February, which would be super disappointing (though it's probably the smarter option).
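For anyone else sizing a PSU for this, here's the rough sanity check I've been doing. All the wattages are ballpark TDPs rather than measurements, and the headroom factor is just the usual rule of thumb for transient spikes:

```python
# Rough PSU sanity check. All wattages are ballpark TDPs, not measurements,
# and the headroom factor is just the usual rule of thumb for spikes.
build = {
    "RTX 3080":          320,  # can spike well past this, per the reviews
    "CPU (e.g. 3700X)":   90,
    "motherboard + RAM":  50,
    "drives/fans/USB":    40,
}

load = sum(build.values())
headroom = 1.4  # ~40% margin for transients and PSU aging
print(f"steady load ~{load} W -> suggested PSU >= {load * headroom:.0f} W")
# steady load ~500 W -> suggested PSU >= 700 W; a 600 W unit is cutting it close
```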
 

gozu

Banned
Oct 27, 2017
10,442
America
Even worse, of course. If you want to OC, you need a custom board that lets you throw tons of juice at it.

Thanks. With how easy it is to overclock GPUs these days, it's a no-brainer.

I was looking at a 3090 FE, but with this news, perhaps it is wise to pay a little more and get a card with more juice. Assuming THEY are overclockable.
 

PlayBee

One Winged Slayer
Member
Nov 8, 2017
5,613
So FE vs. <$750 AIB, which to pick? I'm not interested in paying a lot more for something that performs about the same.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
How much gain can we expect from a 3x 8-pin AIB vs the FE? 10% at least?

I think if you also win the silicon lottery, maaaaybe 10%. Depends on how much higher clocks actually scale with more power, and whether the cards are already pretty close to their clock limits with only 2x 8-pin. I'm guessing it will be similar to the last couple of gens: pretty much capping out at 2000MHz or so, where another 100MHz doesn't really give you more than a few percent in FPS.
So yeah ^ probably more like 5% comparing OC vs. OC against a 2x 8-pin card.

You can already extrapolate pretty well: if you cut the power limit to 270W you only lose ~4%, so adding more wattage I'd expect to gain you 5% or so, best case.
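To sketch that extrapolation (assuming perf scales roughly with the cube root of board power, a generic rule of thumb rather than a measured GA102 curve):

```python
# Crude scaling sketch: assume perf ~ power**(1/3), a generic rule of
# thumb, NOT a measured GA102 curve. The one real data point we have
# (270 W -> about -4%) lands close to what the model predicts.
BASE_WATTS = 320  # FE power limit

def perf_vs_stock(watts: float) -> float:
    return (watts / BASE_WATTS) ** (1 / 3)

for watts in (270, 320, 370, 400):
    print(f"{watts} W: {perf_vs_stock(watts) - 1:+.1%} vs. stock")
# 270 W models out to about -5% (measured: -4%), a 370 W AIB card to
# roughly +5%, and even 400 W only to about +8% before thermals bite.
```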
 

dgrdsv

Member
Oct 25, 2017
12,091
You don't expect the Strix, Trio X, or FTW3 to be good?
Because outside of very high-end models like the Matrix/Lightning/etc., those are the best customs those brands will offer.
I don't know, we need to see the cards.
All of them (with the possible exception of MSI, haven't looked too closely) have had their share of issues previously, and that was back when they were working with 200-300W GPUs.
It'll be interesting to see if they cut any corners with a 320W+ GPU this time. Asus specifically likes to do weird shit with VRM cooling from time to time, for example.
And as I've said, I have a feeling that with the 3080 a better AIB card shouldn't be too overclocked by default but should have three 8-pins to provide enough power even for stock clocks.
The only burning question I have left right now is how well their cooling solutions will deal with a 320W TDP. It's a bit concerning that we haven't seen too many two-fan designs for 3080 custom cards so far, and three-fan cards are typically a lot noisier than two-fan configs.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
New 5700 XT testing by Hardware Unboxed, plus a 3080 unboxing. The 5700 XT now seems to be on par with or even a bit faster than a 2070 Super with driver updates.

Bodes well for Big Navi, I think.

youtu.be

RTX 3080 Unboxing & RX 5700 XT Update vs. 2070 Super & 2060 Super

 

Darktalon

Member
Oct 27, 2017
3,291
Kansas

dgrdsv

Member
Oct 25, 2017
12,091
New 5700 XT testing by Hardware Unboxed, plus a 3080 unboxing. The 5700 XT now seems to be on par with or even a bit faster than a 2070 Super with driver updates.
Is it the exact same benchmarking suite as before? If not, then I'm not sure how anyone can arrive at the conclusion that the result is due to driver updates.