Wow, the last couple of pages went to shit, I see. That Sapphire card looks nice, though!
That's what you get with people who can't check their tone before they post.
The Pulse looks alright, but that Nitro+ Special Edition is... really ugly. I expect the cards to run cool, but I didn't expect them to look this bad, considering Sapphire's last few cards have been very appealing.
No, we're not doing this again.
We're not having another crypto boom that drives up GPU prices.
I refuse.
Pretty fucking hateful to run this article lmao. Bad enough that the fuckwits who try to flip sneakers were flipping 3080s and 3070s, now this.
Me too. Just watching everything play out. This whole graphics card situation is a huge headache. I've resigned myself to not getting a card this year and just waiting for next year.
They went with a 256-bit bus, which is slower and cheaper. The only options they had were 8GB or 16GB. 8GB isn't enough, so it's ruled out.
I don't mean this as a gotcha, but here you're kinda arguing against yourself, I feel.
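For what it's worth, the 8-or-16 split falls straight out of the bus arithmetic: standard GDDR6 puts one chip on each 32-bit slice of the bus, and the chips come in 1GB or 2GB densities. A minimal sketch of that arithmetic (the bus widths in the examples are just illustrations):

```python
# One GDDR6 chip per 32-bit channel; chips come in 1GB or 2GB densities.
def vram_options(bus_width_bits, chip_densities_gb=(1, 2)):
    chips = bus_width_bits // 32  # number of 32-bit channels = number of chips
    return [chips * d for d in chip_densities_gb]

print(vram_options(256))  # -> [8, 16]  (6800-class 256-bit bus)
print(vram_options(320))  # -> [10, 20] (3080-class 320-bit bus)
```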
In my opinion, the 3070 is a bad buy for the price.
Nobody is arguing against your post, and nobody is disputing the facts (well... I guess some are, but you know lol) - the concern I and some others have voiced is that current games might not be the best indicator. In your own posts, Watch Dogs is already exceeding 9 GB of allocated memory at 4K.
You mentioned using a 2080 Ti until now, which, good for you, but my next card ought to last me at least 4 years when I'm spending 500€+ on it, and I don't see VRAM usage going down with actual next-gen games. People also vastly overestimate DirectStorage imo, but that remains to be seen.
We saw the 770 2GB struggle to stay relevant. We also saw that HBM didn't make a difference (so I dunno if the argument of G6 vs. G6X holds up; rough bandwidth numbers below). If a 500€ card like the 3070 is already outdated (according to yourself), is that not reason enough to call out Nvidia for being stingy af with their VRAM?
And keep in mind I'm still on the fence myself when it comes to 3080 vs 6800XT.
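On the G6 vs. G6X point, the raw bandwidth gap is just bus width times per-pin data rate. A minimal sketch with nominal per-pin rates (assumed round numbers; real cards vary):

```python
# Peak memory bandwidth in GB/s: bus width (bits) x data rate (Gbps per pin) / 8.
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    return bus_width_bits * gbps_per_pin / 8

print(bandwidth_gb_s(256, 16))  # -> 512.0 GB/s (256-bit GDDR6 @ 16 Gbps)
print(bandwidth_gb_s(320, 19))  # -> 760.0 GB/s (320-bit GDDR6X @ 19 Gbps)
```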
Oh the 3070 is shit in every possible way, there's literally no reason whatsoever to buy it (assuming the 6800 marketing isn't complete bollocks ofc).
It's just a 2080 Ti (which was horrendously overpriced) with less VRAM and slightly lower power draw, at a crap price due to stock shortages. I'd either wait for a 3070 Ti with 16GB that competes with the 6800, or just jump up a step and go for a 3080 or 6800 XT.
Agreed, I'm mixed on the 3080 because it's a fantastic card, but then so is the 6800 XT (from what we know so far), with a lot more VRAM for a cheaper price, and it's actually got strong legs long term.
That still doesn't protect the 3080 from criticism though wrt its VRAM.
Even the 6000 series will start to fall off in a few years as next gen actually gets going; rendering power just won't keep up, and VRAM won't help you there.
Here and now, that VRAM is a great marketing tool, and it's working for them, so props to AMD for that. Now, if performance is good in third-party benches, they'll have some good sellers on their hands.
This reminds me of the GTX 970 with 4GB (3.5?) of VRAM and AMD's R9 390 with 8GB of VRAM all over again lol.
Do they really drum up the VRAM that much in their marketing, like some say?
16GB figure is plastered all over their 6800/6900 slide deck so I dunno.
My theory is that Nvidia kept the VRAM as tight as possible in order to keep the cost down.
2GB G6X chips aren't available yet, which is the main reason really, as going with 1GB chips nets you a $1500 3090 with 24GB.
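To spell out the 24GB arithmetic: a sketch assuming the usual clamshell layout, since 2GB G6X chips don't exist yet:

```python
# 384-bit bus -> 12 x 32-bit channels; clamshell mode doubles up to two
# 1GB chips per channel, which is how you reach 24GB without 2GB chips.
channels = 384 // 32      # 12 channels
chips = channels * 2      # 24 chips in clamshell mode
print(chips * 1)          # -> 24 (GB, at 1GB per chip)
```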
The big difference is that those cards were significantly faster than the consoles, like 3x the PS4, so VRAM mattered. The current cards are not all that much faster and will struggle soon. Hell, they're already struggling in Ubisoft cross-gen games, and I can assure you most full next-gen games aren't going to run better.
RDNA 2 isn't even out and they're already talking about RDNA 3 xD. Stop giving Moore's Law Is Dead clicks.
Just goes to show how wccftech should be avoided at all costs.
Yes, wccftech is trash too.
"Talking" is a bit of a stretch. There's nothing new in what he has said.
I put a 5600 XT into my recent PC build, and now I'm considering a 4K television when the wife and I move into a new place soon (hopefully).
It's suddenly occurred to me that the 5600 XT won't cut it at 4K. Hopefully 1080p upscaled to 4K doesn't look too terrible.
Have any retailers posted when they're making these available for purchase? I thought I had one, but Newegg voided my order due to it being out of stock at the time it was placed.
It's likely they're going to be launched at the same time as the last few GPU launches have been, i.e. 9 A.M. EST.