
Teiresias

Member
Oct 27, 2017
8,210
So I fired up my HTPC (where the 3080 would go), which is hooked up to a CyberPower UPS, to see what it was telling me wattage-wise. I kept the AV receiver turned off, but the TV itself (LG C9) is still included in the wattage reading from the UPS.

On my system:
SF600 PSU
6700k
16GB DDR3
GTX980

Running Heaven at 2560x1440 with everything at the highest settings, I saw jumps to 480W.
Running the FFXIV: Shadowbringers benchmark, I was seeing jumps to 550W with one spike at 580W.

All of these readings include the TV, and from what I found online that's conservatively about 150W to take off those numbers to get to just the PC.

So, worst case: 580W - 150W (TV) - 165W (GTX 980 rated wattage) = 265W for the PC minus the GPU.

The 3080 is rated at 320W in the specs, so 265W + 320W = 585W.

So it seems I MIGHT have headroom with the SF600 PSU and the 6700K, with a potential AMD upgrade netting a few more watts off the top, but it looks like it may be on the hairy edge if the current spec isn't a worst-case number.
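In case anyone wants to sanity-check that arithmetic, here's a minimal sketch of the same headroom estimate in Python, reusing the figures from the post; the 150W TV draw and the rated GPU wattages are estimates from above, not measurements:

```python
# Rough PSU headroom estimate, reusing the figures from the post above.
# The 150W TV draw and the rated GPU wattages are estimates, not measurements.

PSU_CAPACITY_W = 600    # Corsair SF600
PEAK_READING_W = 580    # worst spike seen at the UPS, TV included
TV_ESTIMATE_W = 150     # conservative guess for the LG C9
GTX980_RATED_W = 165    # rated wattage of the current GPU
RTX3080_RATED_W = 320   # rated wattage of the planned GPU

rest_of_system_w = PEAK_READING_W - TV_ESTIMATE_W - GTX980_RATED_W  # 265W
projected_load_w = rest_of_system_w + RTX3080_RATED_W               # 585W
headroom_w = PSU_CAPACITY_W - projected_load_w                       # 15W

print(f"PC minus GPU: {rest_of_system_w}W")
print(f"Projected load with a 3080: {projected_load_w}W")
print(f"Headroom on the SF600: {headroom_w}W")
```

On those numbers the SF600 is only about 15W clear of the projected peak, which is why it reads as "hairy edge" rather than comfortable headroom.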
 

brain_stew

Member
Oct 30, 2017
4,727
I think that 3070 price is too high. Usually you see a decreased price/performance ratio when going to higher tier cards, but the opposite is true here. The 3080 is the better buy.

You're getting the performance of a current £1200 GPU for £470 with a better featureset, I don't think anyone can be disappointed really. I agree, it makes sense to have the 3070 as the price:performance king but Nvidia clearly want to try and upsell users to the 3080. They know the pricing is good enough that those that stick with the 3070 are going to be more than happy with the performance they're getting.

I'd love to step up to a 3080, but I just can't justify spending £650 on a graphics card, especially when I can get over a 3x performance boost for around £200 less.
 

Aztechnology

Community Resettler
Avenger
Oct 25, 2017
14,131
Nvidia is quoting a 2x improvement vs a 2080 without RTX/DLSS, so it should be >2x improvement vs 1080 Ti.
If true, that is insanity. I expected the 2x claim to just be for RTX, because that's all I've seen in benchmarks so far. I want to see a 1440p performance comparison, or maybe even 1080p depending on the title (though that's likely to be CPU-bottlenecked).

That would 100% warrant me upgrading from a 2080.

What do we know about their cooling shroud? I'd expect AIB partners will probably be selling above MSRP due to limited supplies. I'm wondering if, this time around, Nvidia's cooler might actually be better than what AIBs are offering at launch.
 

Xiaomi

Member
Oct 25, 2017
7,237
Which one is "better"? Should I wait for an EVGA card, for example, or jump in on a Founders Edition?

We won't know until release, but AIBs are usually factory overclocked and offer better cooling, at the cost of taking up more space (and usually carrying a higher price). The FE coolers this time look pretty good, though.
 

crienne

Member
Oct 25, 2017
5,166
Currently have an i5 8600K and a 1070. Looking at the 3070 as I've always been a mid-range builder, but maybe I'll splurge for the 3080. Any risk of a huge CPU bottleneck there?
 

Darktalon

Member
Oct 27, 2017
3,265
Kansas
I want to get rid of my RTX 2080 and replace it with an RTX 3080.

Would $350 to $400 be a fair asking price? I don't want to rip people off, but considering I paid $700 for it, I want to get at least 50% back; that's not being greedy, right? What would you consider a fair price point?
Price it at whatever eBay or Facebook Marketplace listings are going for, not at what you think is fair. A lot of people will pay more than you think they should.
 

Arken

Member
Jan 14, 2018
370
Seattle
Thanks for the replies; I figured. Just trying to figure out which one to get. The 3090 just seems expensive, and I can't seem to find benchmarks for that card, only the 3080.
 

Cels

Member
Oct 26, 2017
6,772
They changed the tiers with the RTX cards. RTX '60 is the new replacement for GTX '70 - though the non-Super 2060 fell a bit short.
Generational changes have been ~35% on average for that tier:
  • 260 > 470: 56%
  • 470 > 570: 37%
  • 570 > 670: 29%
  • 670 > 770: 13%
  • 770 > 970: 43%
  • 970 > 1070: 47%
  • 1070 > 2060: 18% (32% for 2060 Super)

1070 to 2060 was definitely not worth it. Someone who paid $380 for their 1070 at launch was making a bad decision if they spent another $330 to buy a 2060 at launch... two and a half years later. Terrible value proposition. That's why I disagree that the 2060 is supposed to be the upgrade path for the 1070, even if it's in the same price bracket. We'll see what the 3060 is like, though.
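A quick sketch, just re-averaging the generational jumps listed above to check that ~35% figure (the percentages are the ones quoted in the post):

```python
# Average of the generational jumps listed above, to sanity-check the ~35% figure.

gains_pct = {
    "260 > 470": 56,
    "470 > 570": 37,
    "570 > 670": 29,
    "670 > 770": 13,
    "770 > 970": 43,
    "970 > 1070": 47,
    "1070 > 2060": 18,  # 32 if you count the 2060 Super instead
}

average = sum(gains_pct.values()) / len(gains_pct)
print(f"Average generational gain: {average:.0f}%")  # ~35%
```

It also makes the outliers obvious: the 1070 > 2060 jump sits at roughly half the historical average, which is the whole "not worth it" argument.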
 

Keywork

Member
Oct 25, 2017
3,125
One thing I was kind of confused about: will the new cards get even more of an advantage if I use a PCIe Gen 4 NVMe drive?
 

cakely

Member
Oct 27, 2017
13,149
Chicago
The 3080 at $700 sounds like something that I'll try to get by the end of the year.

I'm guessing availability will be dire?
 

Sqrt

Member
Oct 26, 2017
5,880
We won't know until release, but AIBs are usually factory overclocked and offer better cooling, at the price of taking up more space (and usually carrying a higher price). The FE coolers this time look pretty good, though.
Or the AIB cards will be even larger :O
 

Last_colossi

The Fallen
Oct 27, 2017
4,249
Australia
Relatively new PC owner here: I have a 2080 Super and that 3080 looks very tempting. My motherboard is an N7 Z390; would I just be able to upgrade to a 3080 with my current motherboard?

Yes, definitely. PCIe 4.0 cards are backwards compatible with PCIe 3.0 slots, but there may or may not be a small performance difference for the 3080 and 3090; we'll have to find out once reviews start coming in.
 

Combo

Banned
Jan 8, 2019
2,437
Remember how GPU prices went up in 2017 due to crypto mining? Any chance of that happening again, since another crypto boom seems to be coming?

In that case, is it worth buying these sooner, before those possible price hikes and shortages?
 
Oct 30, 2017
8,967


Anticipating the comparison videos.
 

AndyBNV

NVIDIA
Verified
Oct 28, 2017
68
Reiterating: we have a ton of extended information on GeForce.com. Many of the questions in this thread are answered there.


You can also check out the new GPU product pages for even more info on the GeForce RTX 30 Series:



If you have a question that isn't answered by the articles above, we have a panel of NVIDIA's top minds on hand to answer it in a Q&A.
 

UF_C

Member
Oct 25, 2017
3,346
75% to 100% performance gain over the 2080... holy shit! I was honestly expecting a 35-40% improvement, since usually when NVIDIA says "2x performance" they're telling a half-truth because the game is running with RTX and DLSS or something, but god damn, a 75-100% improvement in normal rasterisation at 4K is insane.
Anyone know the % difference between the 2080 and the 2080 Ti?
 

Max A.

Member
Oct 27, 2017
499
The 3080 just doesn't feel right for me, but the 3090 seems a bit too much. I really wish they had announced the inevitable in-between that we'll get next year.
 

Skittles

Member
Oct 25, 2017
8,256
The 3090 seems like an awful value for gaming, so I'll be going with a 3080 instead. Bout to get an amazingly fat boost from my base 2060; should be about 120% more performance-wise. My Index boutta be in heaven at 144Hz.
 

Lakeside

Member
Oct 25, 2017
9,212
Holy crap at the way 2080 Ti prices have collapsed.

It's pretty significant but oh well. I have several PCs in the house so usually do hand-me-down upgrades. In this case it would have been better to sell everything not in immediate use.

No biggie though, most users wouldn't notice the differences. Honestly I'll probably only really notice at 4K.
 

Dekevo

Member
Oct 27, 2017
189
I made $500 building PCs for my friends these past few weeks, so I'm ready for the 3080!
 

kami_sama

Member
Oct 26, 2017
6,998
Do we have more info about the memory? I remember Micron releasing some data, but I want to know why, if it's 2x per clock compared to GDDR6, Nvidia didn't multiply the bandwidth by two and instead quoted the "traditional" number of 760GB/s.
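For what it's worth, a rough sketch of where 760GB/s would come from, assuming the commonly reported 3080 memory specs (320-bit bus, 19Gbps effective per-pin GDDR6X rate); the per-pin rate already includes the PAM4 doubling, which would explain why it isn't doubled again:

```python
# Back-of-the-envelope bandwidth check, assuming the commonly reported 3080
# memory specs: 320-bit bus, 19 Gbps effective per-pin rate for GDDR6X.
# The per-pin rate already reflects PAM4 moving 2 bits per clock, so it
# doesn't get doubled a second time.

bus_width_bits = 320   # assumed memory bus width
per_pin_gbps = 19      # assumed effective data rate, post-doubling

bandwidth_gb_per_s = bus_width_bits / 8 * per_pin_gbps
print(f"{bandwidth_gb_per_s:.0f} GB/s")  # 760 GB/s, matching the quoted figure
```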