
Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Here's one of the best comparisons of the 5700 XT versus the FE edition Nvidia cards. It's pretty much in line with what I said: closer to the 2060S, and the gap between the 2070S and 2080S is about the same as the gap between the 5700 XT and the 2070S.

www.techpowerup.com

Sapphire Radeon RX 5700 XT Pulse Review

The Sapphire Radeon RX 5700 XT Pulse is equipped with a factory overclock and features a much better thermal solution than the AMD reference design. The card not only runs a lot quieter, its temperatures are better than on any other RX 5700 XT we've tested so far, and idle fan stop is included, too.

I don't really care to argue about this any longer but

youtu.be

Radeon RX 5700 XT vs. GeForce RTX 2070 Super, 2020 Update


The 5700 XT is 8% faster than the 2060S and 8% slower than the 2070S, a card that costs $100+ more. This is on PCIe 3.0. It sits right in between; the sacrifice is a small perf delta and the lack of RT/DLSS.

With PCIe 4.0, a 2100 MHz clock, and a memory OC, it's absolutely within striking distance of a $700 2080, but so is a good vanilla 2070 sample.

Imo the 5700, 2060S, 2070, 5700 XT, and 2070S are all in the same cross-shop area, and the 2080/2080S are just poor value: all those other cards cost much less money but are really very close performance-wise, especially if you're willing to do a little tweaking.

Anyway, it doesn't really matter, as new stuff is almost here.
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
Here's one of the best comparisons of the 5700 XT versus the FE edition Nvidia cards. It's pretty much in line with what I said: closer to the 2060S, and the gap between the 2070S and 2080S is about the same as the gap between the 5700 XT and the 2070S.

www.techpowerup.com

Sapphire Radeon RX 5700 XT Pulse Review

The Sapphire Radeon RX 5700 XT Pulse is equipped with a factory overclock and features a much better thermal solution than the AMD reference design. The card not only runs a lot quieter, its temperatures are better than on any other RX 5700 XT we've tested so far, and idle fan stop is included, too.

Not a 2060S, but I had both a vanilla 2070 (Asus 2070 Dual Evo) and a 5700 XT (Gigabyte Gaming OC) during lockdown, and plenty of time to compare them.
Both overclocked (2000/2000 for the 2070, 2100/1840 plus an undervolt for the 5700 XT), they're overall the same.
I sent them both back, but if I had kept one it would have been the 2070, for the DX12U perks.

I also had (briefly) a 2070S and, for four months, a 2080.
Both are clearly faster than the XT.
 

Deleted member 35478

User requested account closure
Banned
Dec 6, 2017
1,788
So is it true the 3080 will need to be fed 400W? I just built a new PC this spring with a 650W PSU; I'd hate having to upgrade to a new PSU lol.
 

Telaso

Member
Oct 25, 2017
1,682
If I wanted an EVGA card, would it probably be best to sit on the EVGA website for one? Or would Newegg/Amazon/B&H be better?

Also, people are saying AIC and AIB, I feel like I'm missing something.

I'd also love to know this.

I can add that when I bought my 1080 Ti at launch a few years ago, the only way I could get one was watching the EVGA site for the small daily stock to show up. I never had a chance to buy one at any of the other retailers.

I ran an auto-refresh addon that would highlight when the page text changed from "out of stock" to "add to cart" (a rough sketch of that idea is below). I also had NowInStock up in my browser at all times, trying to find the right time.

The worst part was that even if the card landed in your cart, you could still lose it if you didn't check out in time. So the next day I had my card info saved and rushed through the order to get it.
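(For the curious, here's a minimal sketch of that kind of stock watcher, assuming a plain HTML product page; the URL and the stock text are placeholders, not a real retailer page. A browser addon watching the live DOM, like I used, handles pages that render stock status client-side.)

```python
import time
import urllib.request

# Placeholder product page and marker text -- substitute the real ones.
URL = "https://www.example.com/product"
OUT_OF_STOCK_TEXT = "out of stock"

def in_stock() -> bool:
    """Fetch the page and check whether the out-of-stock marker is gone."""
    with urllib.request.urlopen(URL, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return OUT_OF_STOCK_TEXT not in html.lower()

while True:
    if in_stock():
        print("Possible stock -- go check the page!")
        break
    time.sleep(60)  # poll once a minute; be polite to the retailer
```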
 

piratepwnsninja

Lead Game Designer
Verified
Oct 25, 2017
3,811
I guess the question for me is: since I already have a 2080 Ti, and I won't be doing more than 4K60 for a while because my monitor situation is static for a bit, is it going to be worth upgrading to a 3090, especially if DLSS 3.0 still works with the 20xx series?
 

Darktalon

Member
Oct 27, 2017
3,291
Kansas
Well, we can get this thread to 10k posts then.
At the rate this is going, I don't think we will be able to.
I'm doing my part!
Sorry for the late response, but I was at work.

Here is the link (at 4:13):


Wow, I'm surprised Tech Jesus said that without it sounding too sarcastic. I believe they may be deciding it somewhere in the range of a day before, but an hour before? Dunno.
Please give the RTX 3070 more than 8GB VRAM.

My RX 480 from 2016 already had 8GB. I'm not sure how well that will age 3+ years from now, once games are targeting the PS5 and Series X.
If Tensor Memory Compression is real, 8GB would be similar to 11GB.
Do we know the length of the FE 3090 based on that leaked pic? (I know the width is 2.75 PCIe slots.)

I assume the 24GB of VRAM are there to increase bandwidth? More VRAM chips in parallel = higher speed.
That's not how memory bandwidth works; it's more memory controllers in parallel = higher speed. They are using 12 controllers with 24 1GB chips, 12 on each side of the PCB. 12GB and 24GB would have the same bandwidth (quick math below).
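(To put numbers on that, a quick back-of-the-envelope calculation; the 19.5 Gbps GDDR6X data rate is from the leaks, so treat it as an assumption.)

```python
# Bandwidth depends on bus width (controllers x 32 bits) and per-pin
# data rate, not on how many GB hang off each controller.
controllers = 12
bus_width_bits = controllers * 32        # 384-bit bus

data_rate_gbps = 19.5  # leaked GDDR6X per-pin speed (assumption)

bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
print(f"{bus_width_bits}-bit bus @ {data_rate_gbps} Gbps = {bandwidth_gb_s:.0f} GB/s")
# ~936 GB/s either way, whether each controller carries 1GB or 2GB.
```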
What time is the "presentation" tomorrow?
9am PST
It would be huge for AMD if their top RDNA 2 card can compete with the 3080. Especially if it's priced at $100 less.
They will be competitive in games that don't support DLSS, but in those that do, Nvidia will continue to wipe the floor. Cyberpunk 2077 supports DLSS, and I expect more and more big names to support it.
Imagine spending $800+ on a 3080 only for AMD Big Navi to be almost as good for half the price a couple months later. Really makes me hesitant on preordering at this time.
Hahah no. Half price? Cmon.
You can use Freesync on a G-Sync monitor, at least on the latest NVIDIA GPUs.
This isn't true. AMD cannot use G-Sync monitors. The newest G-Sync monitors support both FreeSync and G-Sync, which is where you're getting mixed up.
More than price I'm interested in power draw.
3080 and 3090 seem very power hungry.

I sold my 1080 and am running a 2GB 960 atm, so I'd like to get a GPU, but if I can wait another month or so for a comparable GPU that's not only cheaper but also draws less power, that'd be a good thing.
We've got the leaked data already: TGPs of 320W and 350W. And 375W is the max in-spec power draw for these cards, unless you get an AIB model with more than 2x 8-pins.
I'm sorry, but I don't understand. Your post suggests that I will need to stay with Nvidia regardless?
I'm saying that you should be able to enable FreeSync or G-Sync on your monitor regardless of which you go with; AMD's RDNA 2 GPUs can use G-Sync, and NVIDIA's GPUs can use FreeSync.
Wrong, wrong, wrong. Nvidia can use FreeSync; AMD can't use G-Sync. This will not ever change. The newest monitors can support both G-Sync and FreeSync, but a G-Sync-only monitor will NEVER work with AMD.
Reeeeaally starting to look towards the 3080 Ti in early '21 for the best high-end performance value.
As is tradition, waiting will always get you more value. But I need Cyberpunk right meow.
What do people think the odds of a 3090 Ti next year are? Late next year is when I'm planning my next build.
Zero.000%. The 3090 is already 97% of the full chip, and the full chip is the Titan.
Well, that's not how the 20 series panned out; the 2080 Ti was a different chip from the 2080, after all. I know we didn't get a chip above the 2080 Ti in the 20 series, so I'm not expecting we'll see something above the 3090, but I still have my fingers crossed. I really don't want to put a GPU that's a year or more old in a new build, nor do I want to wait until 2022.

I'll cope either way though. Hah.
You are looking at it backwards. The consumer chips start at their biggest at the Titan level, which is GA102, and the 3090 is 97% of the full GA102 (quick math below). The next card faster than the 3090 will be a 4090 in two years.
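(Where the 97% figure comes from, assuming the leaked SM counts: the full GA102 die has 84 SMs and the 3090 is said to enable 82 of them.)

```python
full_ga102_sms = 84   # full die (Titan-class), per the leaks
rtx_3090_sms = 82     # enabled on the 3090, per the leaks
print(f"{rtx_3090_sms / full_ga102_sms:.1%}")  # 97.6%
```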
I still say the wildcard is whether it's Samsung "7nm" or TSMC. If it's just Samsung 8nm with a couple of alterations to call it "7nm", then a refresh on TSMC could bring significant gains. If these cards actually are on TSMC, then Nvidia has to explain where all this power is going, because it doesn't make a whole lot of sense at the moment.
Samsung and TSMC 7nm are basically the same, except Samsung has had worse yields, meaning the parts are more expensive to make; but Samsung charges less than TSMC. If the 3090 is using Samsung 7nm, it just means the parts will be even rarer, but the performance would have been the same as on TSMC.
If I have an older G-Sync monitor (Acer Predator from several years ago), and if I get an AMD video card, will that support FreeSync?
No, it won't work; see above.
Isn't it basically confirmed by this point that it's Samsung?
Not confirmed. We just know it's 7nm. But Samsung and TSMC 7nm are basically equal besides yields.
The marketing stuff that's leaked out lists 7nm, while the rumors said 8nm Samsung, which is why people are speculating. I'm putting my money on it being some Samsung 8nm++ shenanigans, but we'll see.
See above.
[attached image: motherboard photo]


This is my motherboard, the MSI 470 Gaming Plus. Would I be able to fit a 3-slot card in my full-size case with this board if I'm using the M.2 drive towards the bottom of the board? Just wondering if a 3090 would even work logistically if I didn't want to upgrade my motherboard. Any help is greatly appreciated.
M.2 drives fit underneath the GPU; it doesn't matter.
I think I'm going to "refresh" my monitor before getting a new card in the spring. I was looking at the Odyssey G7 from Samsung, but I've read they have horrible flickering problems with G-Sync and FreeSync. Anyone have any other suggestions, either 34" widescreen or 32" badassery?
Get a 34-inch 3440x1440 ultrawide @ 144Hz. Look at LG and Alienware.
So, like, the 3000 series comes out this year. How long until the 4000 series comes out, 2-3 years? If it were to come out next year I would just go for a 3080, but if it's 2 to 3 years I would get a 3090.
Two years: Hopper in 2022, barring any unforeseen setbacks or delays.
A 20GB 3080 makes me wonder what a 3080 Ti would have.
12GB
Hope that lots of folks will buy the overpriced 20GB version of the 3080 so that I can snag one of the cheaper 10GB models and then just upgrade to a Hopper card in 2022 with 16GB of RAM at a reasonable price.
16GB on Hopper means either a 256-bit or a 512-bit bus width, and both are extremely unlikely (quick math below). It's gonna be 12 or 24 for a while.
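(The reasoning, as a quick sketch: GDDR6 chips sit on 32-bit controllers and come in 1GB and 2GB densities, so in a one-chip-per-controller layout, total capacity pins down the bus width. Clamshell mounting, two chips per controller like the 3090's 24GB, doubles capacity without changing the width.)

```python
# One chip per 32-bit controller: capacity implies bus width.
def bus_width_bits(total_gb: int, chip_gb: int) -> int:
    chips = total_gb // chip_gb   # one chip per controller
    return chips * 32             # each controller is 32 bits wide

for chip in (1, 2):
    print(f"16GB with {chip}GB chips -> {bus_width_bits(16, chip)}-bit bus")
# 16GB with 1GB chips -> 512-bit bus
# 16GB with 2GB chips -> 256-bit bus
# Compare: 12GB with 1GB chips -> 384-bit, the bus Ampere already uses.
```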
If there are 10GB and 20GB 3080 cards at launch, I'm hoping the rumored $800 price point is for the 20GB. If it's for the 10GB, I'd imagine the 20GB variant will be at least $100 more, at $900. $900 for a freaking x80 card, wew.

And with the 3090 again, if its MSRP is $1400, then, like the 2080 Ti, you won't be able to find one near that price anywhere near launch. You are more realistically talking $1500-1600. That price is INSANE to me. I can afford to pay it, but I don't think I could do it and exist in my own body afterwards lol.
GDDR6X is very, very expensive. No chance that 10GB more costs the consumer less than $200 extra.
Man, I hope they can at least give us a time frame for when these 3080s will be available.
The same week as any of the AIB cards, which is likely a week after the FE.
What's the best way to track when these go live, specifically the AIB cards?
Nowinstock.net
If I wanted an EVGA card, would it probably be best to sit on the EVGA website for one? Or would Newegg/Amazon/B&H be better?

Also, people are saying AIC and AIB, I feel like I'm missing something.
The EVGA website.
Shopping around for a power supply for a Ryzen 4000/RTX 3080 build: should I go 750W or 850W?
If it's $15-20 more, just get the 850W.
 

Nooblet

Member
Oct 25, 2017
13,717
We've got the leaked data already: TGPs of 320W and 350W. And 375W is the max in-spec power draw for these cards, unless you get an AIB model with more than 2x 8-pins.
I was talking about Big Navi there.
I already mentioned that the 3080 and 3090 are power hungry, so I'm aware of the data leak about the TGP.
 

Darktalon

Member
Oct 27, 2017
3,291
Kansas
So is it true the 3080 will need to be fed 400W? I just built a new PC this spring with a 650W PSU; I'd hate having to upgrade to a new PSU lol.
The 3080 is 320W TGP, and 375W is the spec limit for the power connectors in use (rough budget below). I highly doubt the rest of your computer uses 300W, unless you have an overclocked 10900K.
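(Rough math, assuming the standard connector limits of 75W from the PCIe slot and 150W per 8-pin; the 250W "rest of system" figure is just a pessimistic illustration, not a measurement.)

```python
# In-spec power available to a card with two 8-pin connectors.
pcie_slot_w = 75      # PCIe x16 slot limit
eight_pin_w = 150     # per 8-pin PEG connector limit
card_max_w = pcie_slot_w + 2 * eight_pin_w
print(f"Max in-spec card draw: {card_max_w}W")   # 375W

# Headroom check: 650W PSU, 320W-TGP 3080, pessimistic 250W for the rest.
psu_w, gpu_w, rest_w = 650, 320, 250
print(f"Headroom: {psu_w - gpu_w - rest_w}W")    # 80W to spare
```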
I guess the question for me is: since I already have a 2080 Ti, and I won't be doing more than 4K60 for a while because my monitor situation is static for a bit, is it going to be worth upgrading to a 3090, especially if DLSS 3.0 still works with the 20xx series?
Depends on how much you like ray tracing; RT will be a lot faster on the 3090 than on the 2080 Ti. Also, the 2080 Ti is not really hitting 4K60 in current titles as it is....
 

Darktalon

Member
Oct 27, 2017
3,291
Kansas
so those performance slides were fake right?

i'll make the OT for the actual thing btw
I think JaseC is making it.
I was talking about Big Navi there.
I already mentioned that the 3080 and 3090 are power hungry, so I'm aware of the data leak about the TGP.
My bad; yeah, not sure how RDNA 2 will stack up in efficiency. But since both RDNA 2 and Ampere are on 7nm....
Interesting. I guess that ends the speculation on why the power draw is so high.
Power draw is high because the 2080 Ti was limited by the power they gave it; with an unlocked power limit we could have gotten another 20% overclock out of it. Part of the reason they did that was so they could continue to scale up, and that's how we get to where we are today.
At 7nm and 350W TGP, it's going to be a fucking monster of a card. Also, the die size is considerably smaller than the 2080 Ti's; imagine what would have happened if they had stayed that big. *drool*
 

Nooblet

Member
Oct 25, 2017
13,717
I think JaseC is making it.

My bad; yeah, not sure how RDNA 2 will stack up in efficiency. But since both RDNA 2 and Ampere are on 7nm....

Power draw is high because the 2080 Ti was limited by the power they gave it; with an unlocked power limit we could have gotten another 20% overclock out of it. Part of the reason they did that was so they could continue to scale up, and that's how we get to where we are today.
At 7nm and 350W TGP, it's going to be a fucking monster of a card. Also, the die size is considerably smaller than the 2080 Ti's; imagine what would have happened if they had stayed that big. *drool*
Well, people are speculating that Ampere is not really 7nm but rather Samsung's 8nm, which is what's causing the heavy power draw. If that's the case and RDNA 2 is on TSMC's 7nm, then RDNA 2 may be quite a bit more efficient.
 
Oct 27, 2017
5,264
Going by the watt numbers, I think my 600W bronze PSU should be putting out enough to handle a 3080, but the only recommendation we got said 750W minimum. Can someone who knows what this shit means weigh in briefly so I can shut off that part of my brain?
 

Kayotix

Member
Oct 25, 2017
2,312
Glad I usually go 800W or so for the PSU in my builds. When I eventually get one of these cards I won't have to worry.
 

Vimto

Member
Oct 29, 2017
3,723
I wonder whether PCIe 4.0 will help boost NVLink performance or not.

I think for a dual 3090 setup you will need a 1500W PSU huh 😂
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
So many people with 650+W PSUs freaking out and already trying to buy bigger PSUs.
Calm down.
You already have the PSU, and if it's a good one it should be fine.
And what's the worst that could happen if it's not? You experience reboots/shutdowns?
Go buy a new one then.
 

Lys Skygge

Shinra Employee
Member
Oct 25, 2017
3,751
Arizona
I have a gold 750W PSU, so I should be good to go ☺️

I remember when I first built my rig I initially had a 500W PSU, but someone in the PC building thread recommended I go for the 750W. I'm glad I took their advice!
 

wwm0nkey

Member
Oct 25, 2017
15,676
So many people with 650+W PSUs freaking out and already trying to buy bigger PSUs.
Calm down.
You already have the PSU, and if it's a good one it should be fine.
And what's the worst that could happen if it's not? You experience reboots/shutdowns?
Go buy a new one then.
Yeah, 650W+ is fine, but people always freak out over this stuff.
 

Clay

Member
Oct 29, 2017
8,281
I'm curious to read impressions tomorrow. I need a new computer and am considering building a desktop instead of replacing my seven-year-old laptop. I'm planning on getting a PS5 and imagine I'll continue to prefer playing most games in the living room, but there's an increasing number of PC-only games I'd really like to check out, Disco Elysium and Fall Guys being two recent ones. I also used to be really into Civ, SimCity, and other management games, but most recent ones won't run on my laptop. VR on PC seems really exciting too.

It's going to be hard to justify spending hundreds of dollars on a graphics card alone, but I'm hoping the 3060 will be reasonable (sub-$400 would be great) and will handle simpler games at 1440p for the generation. Presumably there are lower-end cards coming too; I think I'd probably be fine with an xx50-level card, though I'm not sure that will be announced tomorrow.
 