Feb 11, 2019
166
The Pulse looks alright but that Nitro+ Special edition is...really ugly. I expect the cards to run cool but I didn't expect them to look so bad considering Sapphire's last few cards have looked very appealing.

No, we're not doing this again.

We're not having another crypto boom that drives up GPU prices.

I refuse.
Pretty fucking hateful to run this article lmao. Bad enough that the fuckwits who try to flip sneakers were flipping 3080s and 3070s, now this

Well, it looks like that was FUD anyway and the RX6K series isn't going to be that good at mining. Thank fucking God.
 

Polyh3dron

Prophet of Regret
Banned
Oct 25, 2017
9,860
I subscribed to a Discord notification bot for GPUs and landed a 3090 FE from Best Buy thanks to it. I figure it'll be better than the 6900 XT because of raytracing performance and NVENC. Also, I won't believe AMD can make decent Windows drivers for their cards until they actually do it; 5700 XT users reported tons of issues with theirs. These kinds of bots are plentiful, and if you do some googling it shouldn't be too hard to find one.
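For the curious, the core of these bots is just a poll-and-ping loop. A minimal sketch, assuming a hypothetical product page and a Discord webhook you control (real bots handle retailer APIs, rate limits, and bot detection far more carefully):

Code:
import time
import requests

# Hypothetical product page and webhook -- substitute your own.
PRODUCT_URL = "https://www.bestbuy.com/site/example-rtx-3090.p"
WEBHOOK_URL = "https://discord.com/api/webhooks/<id>/<token>"

def in_stock(html):
    # Naive heuristic: the buy button says "Sold Out" when unavailable.
    return "Sold Out" not in html

while True:
    resp = requests.get(PRODUCT_URL, headers={"User-Agent": "Mozilla/5.0"}, timeout=10)
    if resp.ok and in_stock(resp.text):
        # Discord webhooks accept a simple JSON payload with a "content" field.
        requests.post(WEBHOOK_URL, json={"content": "Possible restock: " + PRODUCT_URL})
        break
    time.sleep(60)  # poll gently; hammering the site gets you blocked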
 
Last edited:

Readler

Member
Oct 6, 2018
1,974
we know that 8GB is not enough for 4k.
They went with a 256-bit bus which is slower and cheaper. The only options they had were 8GB or 16GB. 8GB isn't enough, so it's ruled out.
I don't mean this as a gotcha, but here you're kinda arguing against yourself I feel.

Nobody is arguing against your post and nobody is disputing the facts (well...I guess some are, but you know lol) - the concern I and some others voice comes from the fact that using current games might not be the best indicator. In your own posts Watch Dogs is already exceeding 9 GBs of allocated memory at 4K.

You mentioned using a 2080 Ti until now, which, good for you, but my next card ought to last me at least 4 years when I'm spending 500€+ on it, and I don't see VRAM usage going down with actual next-gen games. People also vastly overestimate DirectStorage imo, but that remains to be seen.
We saw the 770 2GB struggling to stay relevant. We also saw that HBM didn't make a difference (so I dunno if the argument of G6 vs. G6X holds up). If a 500€ card like the 3070 is already outdated (according to yourself), is that not reason enough to call out Nvidia for being stingy af with their VRAM?

And keep in mind I'm still on the fence myself when it comes to 3080 vs 6800XT.
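For anyone wanting the bus-width arithmetic spelled out: capacity options follow directly from the bus, since each GDDR6/G6X chip sits on a 32-bit channel and currently ships in 1GB or 2GB densities. A rough back-of-the-envelope sketch (launch-spec clocks, figures approximate):

Code:
# Each GDDR6/G6X chip sits on a 32-bit channel, and chips ship in
# 1GB or 2GB densities, so the bus width fixes the capacity options.
def capacity_options_gb(bus_bits, densities=(1, 2)):
    chips = bus_bits // 32
    return [chips * d for d in densities]

def bandwidth_gb_s(bus_bits, data_rate_gbps):
    # bandwidth = bus width in bytes * per-pin data rate
    return bus_bits / 8 * data_rate_gbps

print(capacity_options_gb(256))   # [8, 16]  -> the 6800 XT's only real choices
print(capacity_options_gb(320))   # [10, 20] -> 3080 (10GB with 1GB G6X chips)
print(bandwidth_gb_s(256, 16.0))  # 512.0 GB/s for 16 Gbps GDDR6
print(bandwidth_gb_s(320, 19.0))  # 760.0 GB/s for 19 Gbps GDDR6X on the 3080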
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,650
I don't mean this as a gotcha, but here you're kinda arguing against yourself I feel.

Nobody is arguing against your post and nobody is disputing the facts (well...I guess some are, but you know lol) - the concern I and some others voice comes from the fact that using current games might not be the best indicator. In your own posts Watch Dogs is already exceeding 9 GBs of allocated memory at 4K.

You mentioned using a 2080 Ti until now, which, good for you, but my next card ought to last me at least 4 years when I'm spending 500€+ on it, and I don't see VRAM usage going down with actual next-gen games. People also vastly overestimate DirectStorage imo, but that remains to be seen.
We saw the 770 2GB struggling to stay relevant. We also saw that HBM didn't make a difference (so I dunno if the argument of G6 vs. G6X holds up). If a 500€ card like the 3070 is already outdated (according to yourself), is that not reason enough to call out Nvidia for being stingy af with their VRAM?

And keep in mind I'm still on the fence myself when it comes to 3080 vs 6800XT.
In my opinion, the 3070 is a bad buy for the price.

It's just a 2080ti (that was horrendously overpriced) with less VRAM and slightly less power draw for a crap price due to stock shortages. I'd either wait for a 3070ti with 16GB that competes with the 6800 or just jump up a step and go for a 3080 or 6800xt.
 

Readler

Member
Oct 6, 2018
1,974
In my opinion, the 3070 is a bad buy for the price.

It's just a 2080ti (that was horrendously overpriced) with less VRAM and slightly less power draw for a crap price due to stock shortages. I'd either wait for a 3070ti with 16GB that competes with the 6800 or just jump up a step and go for a 3080 or 6800xt.
Oh the 3070 is shit in every possible way, there's literally no reason whatsoever to buy it (assuming the 6800 marketing isn't complete bollocks ofc).

That still doesn't protect the 3080 from criticism though wrt its VRAM.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,650
Oh the 3070 is shit in every possible way, there's literally no reason whatsoever to buy it (assuming the 6800 marketing isn't complete bollocks ofc).

That still doesn't protect the 3080 from criticism though wrt its VRAM.
Agreed, I'm mixed on the 3080 because it's a fantastic card, but then, so is the 6800 XT (from what we know so far), with a lot more VRAM for a cheaper price, and it's actually got strong legs long term.
 

Tovarisc

Member
Oct 25, 2017
24,507
FIN
Agreed, I'm mixed on the 3080 because it's a fantastic card, but then, so is the 6800 XT (from what we know so far), with a lot more VRAM for a cheaper price, and it's actually got strong legs long term.

Even the 6000 series will start to fall off in a few years as next-gen actually gets going; rendering power just won't keep up. VRAM won't help you there.

Here and now, that VRAM is a great marketing tool and it's working for them, so props to AMD for that. Now, if performance is good in 3rd-party benches, then they have some good sellers on their hands.
 

cHinzo

Member
Oct 27, 2017
3,598
Even the 6000 series will start to fall off in a few years as next-gen actually gets going; rendering power just won't keep up. VRAM won't help you there.

Here and now, that VRAM is a great marketing tool and it's working for them, so props to AMD for that. Now, if performance is good in 3rd-party benches, then they have some good sellers on their hands.
This reminds me of the 970 GTX with 4(3.5?) GB VRAM and AMD's R9 390 with 8 GB VRAM situation all over again lol.
 

Mr Swine

The Fallen
Oct 26, 2017
6,081
Sweden
Even the 6000 series will start to fall off in a few years as next-gen actually gets going; rendering power just won't keep up. VRAM won't help you there.

Here and now, that VRAM is a great marketing tool and it's working for them, so props to AMD for that. Now, if performance is good in 3rd-party benches, then they have some good sellers on their hands.

The leap from PS4/Xbone to PS5/XBSX is smaller than going from PS360 to PS4/Xbone. I also think the relatively small memory upgrade, from 8GB to 16GB, will hamper the PS5 and XBSX in the future, despite both having incredibly fast SSDs.

The 3000 series will be OK thanks to DLSS, but its limited VRAM will be a tough sell. The AMD 6000 series has plenty of VRAM but lacks DLSS to keep its performance up compared to the 3000 series in the future.
 

Serpens007

Well, Tosca isn't for everyone
Moderator
Oct 31, 2017
8,135
Chile
My theory is that Nvidia kept the VRAM as tight as possible in order to keep costs down. Gamers Nexus reports that partners are having a hard time staying in line with the MSRP, so it seems that, with everything going on in the world, it's hard to stay within an acceptable pricing range. This is not to say that the VRAM is not enough. Will it keep being enough? We don't know, and can't really know for sure. AMD always aims for higher VRAM, which usually gives their cards a bit more longevity, but they'll all be fine.
 

dgrdsv

Member
Oct 25, 2017
12,028
Do they really drum up the VRAM even that much in their marketing like some say?
The 16GB figure is plastered all over their 6800/6900 slide deck, so I dunno.

My theory is that Nvidia kept the VRAM as tight as possible in order to keep costs down.
2GB G6X isn't available yet, which is the main reason really, as going with 1GB chips nets you a $1500 3090 with 24GB.
The 3070 (and below) could've gotten 16GB of G6 easily, but it can't, because it can't have more VRAM than the 3080.
So the main reason here is them going with G6X memory for Ampere.
It will resolve itself gradually, and at some point they'll be able to do a refresh with double the VRAM.
But I'd say that won't happen for another half a year from now.
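To put numbers on the chip-density constraint, here's a sketch of the arithmetic ("clamshell" means two chips share each 32-bit channel, one on each side of the board):

Code:
# VRAM = channels * chips per channel * chip density (GB).
# At Ampere's launch only 1GB G6X chips were shipping.
def vram_gb(bus_bits, density_gb=1, clamshell=False):
    channels = bus_bits // 32
    chips_per_channel = 2 if clamshell else 1
    return channels * chips_per_channel * density_gb

print(vram_gb(320))                  # 10 -> 3080
print(vram_gb(384, clamshell=True))  # 24 -> 3090, chips on both board sides
print(vram_gb(320, density_gb=2))    # 20 -> the doubled-VRAM refresh described above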
 

tusharngf

Member
Oct 29, 2017
2,288
Lordran

Chaosblade

Resettlement Advisor
Member
Oct 25, 2017
6,622
That presumed 3060 news is oof. Good variants are probably going to be at least $350 but it sounds like Nvidia wants them to be closer to $300 to compete with AMD's cards.
 
Nov 2, 2017
2,275
This reminds me of the 970 GTX with 4(3.5?) GB VRAM and AMD's R9 390 with 8 GB VRAM situation all over again lol.
The big difference is that those cards were significantly faster than the consoles, like 3x the PS4, so VRAM mattered. The current cards are not all that much faster and will struggle soon. Hell, they're already struggling in Ubisoft cross-gen games, and I can assure you most full next-gen games aren't going to run better.

Everyone is so concerned about the VRAM when the real concern should be about how close these cards are to consoles. They won't age well in general if you want to match a console at 60fps.
 
AMD Partner Showcase Videos
OP
Raydonn

One Winged Slayer
Member
Oct 25, 2017
919
Radeon™ RX 6000 Partner Showcase Ep. 1: DIRT 5 & Codemasters (www.youtube.com)
Radeon™ RX 6000 Partner Showcase Ep. 2: Godfall & Counterplay Games (www.youtube.com)
Radeon™ RX 6000 Partner Showcase Ep. 3: World of Warcraft®: Shadowlands & Blizzard Entertainment (www.youtube.com)
Radeon™ RX 6000 Partner Showcase Ep. 4: The Riftbreaker & EXOR Studios (www.youtube.com)

Might as well link these.
 

maximumzero

Member
Oct 25, 2017
23,007
New Orleans, LA
I put a 5600 XT into my recent PC build, and now I'm considering a 4K television when the wife and I move into a new place soon (hopefully).

It's suddenly occurred to me that the 5600 XT won't cut it at 4K. Hopefully 1080p upscaled to 4K doesn't look too terrible.
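One consolation: 4K is exactly 2x 1080p on each axis, so integer (nearest-neighbor) scaling maps every source pixel to a clean 2x2 block with no fractional-scale blur. A quick Pillow illustration (hypothetical file names):

Code:
# 3840x2160 is exactly 2x 1920x1080 per axis, so nearest-neighbor
# upscaling maps each source pixel to a clean 2x2 block with no blur.
from PIL import Image

frame = Image.open("screenshot_1080p.png")  # hypothetical 1920x1080 capture
assert frame.size == (1920, 1080)
upscaled = frame.resize((3840, 2160), Image.NEAREST)
upscaled.save("screenshot_4k.png")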
 

Mr Swine

The Fallen
Oct 26, 2017
6,081
Sweden
I put a 5600 XT into my recent PC build, and now I'm considering a 4K television when the wife and I move into a new place soon (hopefully).

It's suddenly occurred to me that the 5600 XT won't cut it at 4K. Hopefully 1080p upscaled to 4K doesn't look too terrible.

Depends on the TV you are getting; my Samsung Q80T upscales 1080p content very well.

Switch games look great on it, and PC 1440p content looks fantastic if you add a bit of sharpness to it.
 

syllogism

Member
Oct 25, 2017
88
I don't think any of the retailers over here had even a single card in stock, and the cheapest 6800 XT was 849€. The TUF Gaming OC is 959.90€. Quite a launch.
 