Status
Not open for further replies.

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
[Image: RX 6800]


A process node being "scheduled" for some time doesn't mean anything for the production timing of a large-die PC GPU.
Also note that they aren't even committing to the use of N5 for RDNA3 at this point, meaning it's likely more than a year away from now.


I don't need a new GPU now nor have I bought one since 2019.


8 and 10 GB won't be enough for the generation, but that hardly matters when we're talking about 2022. The generation will last till 2026-27.


Not going to happen due to the 384-bit bus. It's either 11 or 12 GB for a 3080 Ti, which isn't all that different from the 10 GB of the 3080 and won't be enough for the generation either.
Note that the RX 6800 won't be enough for the generation either, but for different reasons, so it's kind of a pointless comparison.

To me this is entirely dependent on your target resolution and how much you need to max out RT.

Consoles are not going to come close to matching PC RT settings.

If you plan on sticking with a 1440p monitor and don't need to max out RT, absolutely a 6800xt or 3080 will last you 5+ years.

If you are targeting no-compromises, maxed-out 4K60 gaming, meaning you never want to turn down any settings (which is a silly, silly standard to have), then sure, a 3080/6800xt will likely need to be upgraded around the mid-gen console refresh 3-4 years from now, but even then I don't see those refreshes meaningfully surpassing a 3080/6800xt.
 

Iron Eddie

Banned
Nov 25, 2019
9,812
I've narrowed my interest down to the 3080 (only 10 GB of memory, but it is GDDR6X) or the 6800XT, but I am in no hurry to upgrade. Since I am in no hurry, I think I may want to wait until next year to see how these cards evolve and what AMD's ray tracing results look like. Maybe the next round of new cards this time next year will be worth waiting for?
 

LCGeek

Member
Oct 28, 2017
5,890
I've narrowed my interest down to the 3080 (only 10 GB of memory, but it is GDDR6X) or the 6800XT, but I am in no hurry to upgrade. Since I am in no hurry, I think I may want to wait until next year to see how these cards evolve and what AMD's ray tracing results look like. Maybe the next round of new cards this time next year will be worth waiting for?

I wanted to wait it out, but on a 970 it's painful. If you can wait 1-3 years for the mid-gen pop, do so.
 

Convasse

Member
Oct 26, 2017
3,832
Atlanta, GA, USA
I've narrowed my interest down to the 3080 (only 10 GB of memory, but it is GDDR6X) or the 6800XT, but I am in no hurry to upgrade. Since I am in no hurry, I think I may want to wait until next year to see how these cards evolve and what AMD's ray tracing results look like. Maybe the next round of new cards this time next year will be worth waiting for?
What's the next round of expected cards? 30 series refresh, RDNA 3?
 

Iron Eddie

Banned
Nov 25, 2019
9,812
What's the next round of expected cards? 30 series refresh, RDNA 3?
I honestly don't know. We could keep waiting every year, since they always get better. I want my next card to last at least 4 years, but AMD's ray tracing efforts and a DLSS 2.0 equivalent are still unclear so far. With Nvidia I'm not sure 10GB on the 3080 is enough in the long run for 4K, and I don't want to spend $1,500 on a card.
 

dgrdsv

Member
Oct 25, 2017
12,024
If you plan on sticking with a 1440p monitor and don't need to max out RT, absolutely a 6800xt or 3080 will last you 5+ years.
This mostly depends on how often you are willing to upgrade your GPU.
I've had a GTX770 4GB which AFAIK works fine even these days in my relative's PC. He doesn't play anything from consoles though.
So while any of these cards could last you 5+ years - hell, even my current 2080 from 2018 will likely do that, which would make it last 7+ years already - I'm not willing to find that out, since I will most likely upgrade to a 3080/6800XT, then to a 4080/7800XT, and then to a 5080/8800XT in that time. Simply because it's more interesting that way than sitting on one card for 5+ years while the newer ones provide new features and performance gains every two years or so.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
This mostly depends on how often you are willing to upgrade your GPU.
I've had a GTX770 4GB which AFAIK works fine even these days in my relative's PC. He doesn't play anything from consoles though.
So while any of these cards could last you 5+ years - hell, even my current 2080 from 2018 will likely do that, which would make it last 7+ years already - I'm not willing to find that out, since I will most likely upgrade to a 3080/6800XT, then to a 4080/7800XT, and then to a 5080/8800XT in that time. Simply because it's more interesting that way than sitting on one card for 5+ years while the newer ones provide new features and performance gains every two years or so.

I frequently upgrade as well, and I try to maximize resale value too. I'd rather lose $300-400 over 5 years while always having a new card vs keeping one card for 5 years and having it be worth $200 or so. I'll spend $400-600 on a card, sell it for $50-100 less than that, and pick up a new card 2 years later for not much more than that resale, rinse and repeat. Versus spending $800 on a card and having it be worth $200 five years later. That's also why I think it's smart not to buy in the enthusiast/high-end segment, as you will lose a lot of that resale one gen later vs like $100.

I wouldn't buy a 2080 Ti right now for even $500, because it makes a lot more sense to buy a 3070 brand new (which I did). Two years from now a 2080 Ti will go for like $200, as a new 4060/6700 will wipe the floor with it for like $350.

Buying a halo product guarantees you are overpaying on the front end in $/perf because of diminishing returns, and also that you will lose 50%+ of that card's value in 2 years' time because the new cards in the mid-range will match it or beat it.

Waiting for a 6950XT or 3080 Ti is also a losing proposition imo. They will still be $1k cards and give you at most like 20% over what's been available for months. A year to 18 months later there will be a ~$500 card that beats them, and then yep, your resale is gone and you just lost $500. I can upgrade every gen in the mid-range and lose roughly that over 4-5 years vs 18 months-2 years.

Sure, I don't have THE FASTEST card, but I'm within 20-30% of it all the time, vs buying high end and being eclipsed by a $500 card within 2 years.
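A rough back-of-the-envelope version of the argument above, using the post's own ballpark figures (all prices here are illustrative assumptions, not market data):

```python
# Rough cost-of-ownership sketch over ~5 years, using the ballpark numbers
# from the post above. All figures are illustrative assumptions.

def net_cost(purchases, resale_values):
    """Total spent minus total recovered from selling the old cards."""
    return sum(purchases) - sum(resale_values)

# Strategy A: buy a ~$500 mid-range card every ~2 years, selling the previous
# one for ~$100 less than you paid (counting the last card's residual value).
midrange = net_cost(purchases=[500, 500, 500], resale_values=[400, 400, 400])

# Strategy B: buy one ~$800 high-end card and keep it ~5 years,
# selling it for ~$200 at the end.
highend = net_cost(purchases=[800], resale_values=[200])

print(f"Mid-range every ~2 years: ~${midrange} net over ~5 years")  # ~$300
print(f"One high-end card for 5 years: ~${highend} net")            # ~$600
```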
 
Last edited:

Deleted member 27751

User-requested account closure
Banned
Oct 30, 2017
3,997
I've currently got a 1070 and previously had the 4GB 770; I'm perfectly okay with having long waits between card upgrades. I personally don't see a point in chasing the tech dragon: RT is currently useless to me, and realistically I'm only just now upgrading to 1440p with the 6800XT.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Just makes me more curious to see 3rd party benches with these cards and games like this.

Yeah, afaik even if you hit VRAM limits, usually that only results in like a 5fps drop, and brute force can keep a card hitting VRAM limits competitive anyway. We'll see.

I'd also say, as much as people like to shit on AMD for not having a DLSS equivalent, it seems these cards are very fast in raster performance and overclock well (vs very limited OC headroom for Ampere).

So if a 6800 is 20% faster than a 3070 at stock, and a 6800xt is faster than a 3080 at stock, well, DLSS is kind of negated imo. Especially if you are willing to deploy RIS at 85% scale/Radeon Boost, where I challenge anyone to notice the difference in real time vs pixel-peeping screen grabs.
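As a side note on the 85% render scale: rendering at 85% of the output resolution only shades about 72% of the pixels, which is where most of the win comes from. A quick sketch of the arithmetic (pure pixel counting, not a benchmark):

```python
# Pixel-count arithmetic for an 85% render scale at common resolutions.
# This is pure arithmetic; actual performance scaling varies per game.

def scaled_pixels(width, height, scale):
    return int(width * scale) * int(height * scale)

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    native = w * h
    scaled = scaled_pixels(w, h, 0.85)
    print(f"{name}: {scaled / native:.0%} of native pixels "
          f"({int(w * 0.85)}x{int(h * 0.85)} vs {w}x{h})")
# Prints ~72% for both, which is why the fps gain is substantial while the
# difference is hard to spot in motion, especially with sharpening applied.
```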

I think it's still the case most folks would prefer a significantly faster raster card vs relying on dlss in order to "beat" a card that doesn't have to do that.

And, amd will have a machine learning solution as well. More details to come.

Depending on reviews I will swap out the 3070 I grabbed for a 6800/6800xt.
 

Tovarisc

Member
Oct 25, 2017
24,504
FIN
Yeah, afaik even if you hit VRAM limits, usually that only results in like a 5fps drop, and brute force can keep a card hitting VRAM limits competitive anyway. We'll see.

I'd also say, as much as people like to shit on AMD for not having a DLSS equivalent, it seems these cards are very fast in raster performance and overclock well (vs very limited OC headroom for Ampere).

So if a 6800 is 20% faster than a 3070 at stock, and a 6800xt is faster than a 3080 at stock, well, DLSS is kind of negated imo. Especially if you are willing to deploy RIS at 85% scale/Radeon Boost, where I challenge anyone to notice the difference in real time vs pixel-peeping screen grabs.

I think it's still the case most folks would prefer a significantly faster raster card vs relying on dlss in order to "beat" a card that doesn't have to do that.

And, amd will have a machine learning solution as well. More details to come.

Depending on reviews I will swap out the 3070 I grabbed for a 6800/6800xt.

We could also have a case where a game never actually needs, e.g., 10 GB of VRAM, but just allocates whatever it can get, like games tend to do on PC.
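On that allocation-vs-need point: most monitoring overlays report how much VRAM is allocated on the device, not how much the game actually needs to run cleanly, so a greedy allocator can make a game look like it "uses" 9-10 GB. A minimal sketch of what such a readout actually measures, assuming an NVIDIA GPU and the nvidia-ml-py (pynvml) package:

```python
# Minimal VRAM readout via NVML (pip install nvidia-ml-py).
# Note: this reports memory *allocated* on the device, which games tend to
# inflate by reserving whatever is available - it is not the amount a game
# strictly needs in order to avoid stutter.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
gib = 1024 ** 3
print(f"total: {info.total / gib:.1f} GiB, "
      f"allocated: {info.used / gib:.1f} GiB, "
      f"free: {info.free / gib:.1f} GiB")
pynvml.nvmlShutdown()
```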

In pure raster, Ampere cards are also really fast, and stuff like DLSS just lets them edge further ahead, but yeah, OC headroom is pretty much tapped out right from the factory.

Ampere 3rd-party reviews vs. RDNA 2's AMD-marketing-massaged charts don't put them that far apart as of now, so it will be interesting to see where the cards land in the rankings once everything has 3rd-party reviews.

There are still a lot of unknowns and murkiness, with a lot of hype driving it - for good reason, mind you, as AMD might be bringing the goods - but reviews are "soon"...
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
We could also have a case where a game never actually needs, e.g., 10 GB of VRAM, but just allocates whatever it can get, like games tend to do on PC.

In pure raster, Ampere cards are also really fast, and stuff like DLSS just lets them edge further ahead, but yeah, OC headroom is pretty much tapped out right from the factory.

Ampere 3rd-party reviews vs. RDNA 2's AMD-marketing-massaged charts don't put them that far apart as of now, so it will be interesting to see where the cards land in the rankings once everything has 3rd-party reviews.

There are still a lot of unknowns and murkiness, with a lot of hype driving it - for good reason, mind you, as AMD might be bringing the goods - but reviews are "soon"...

Right - right now, honestly, the big perf/$ card looks to be the vanilla 6800. Before, I was kind of head-scratching at that card being priced at $579. Well, with more stuff coming out, it seems like it obliterates the 3070 and you get 8GB more VRAM. 20% more perf over the Nvidia $500 card for $79 more, plus more VRAM.

Like, if it pans out and this thing is nearly a 3080 in raster (Time Spy leaks at like 17k), plus good OC headroom, that's a damn good deal.

RT looks to be equivalent to or better than a 3070 too.

We'll see though.
 

LCGeek

Member
Oct 28, 2017
5,890
Right - right now, honestly, the big perf/$ card looks to be the vanilla 6800. Before, I was kind of head-scratching at that card being priced at $579. Well, with more stuff coming out, it seems like it obliterates the 3070 and you get 8GB more VRAM. 20% more perf over the Nvidia $500 card for $79 more, plus more VRAM.

Like, if it pans out and this thing is nearly a 3080 in raster (Time Spy leaks at like 17k), plus good OC headroom, that's a damn good deal.

RT looks to be equivalent to or better than a 3070 too.

We'll see though.

Buildzoid feels the same if it ocs.
 

Readler

Member
Oct 6, 2018
1,974
We could also have a case where a game never actually needs, e.g., 10 GB of VRAM, but just allocates whatever it can get, like games tend to do on PC.

In pure raster, Ampere cards are also really fast, and stuff like DLSS just lets them edge further ahead, but yeah, OC headroom is pretty much tapped out right from the factory.

Ampere 3rd-party reviews vs. RDNA 2's AMD-marketing-massaged charts don't put them that far apart as of now, so it will be interesting to see where the cards land in the rankings once everything has 3rd-party reviews.

There are still a lot of unknowns and murkiness, with a lot of hype driving it - for good reason, mind you, as AMD might be bringing the goods - but reviews are "soon"...
FWIW I'd still opt for DLSS+Ampere rather than RDNA2+OC if given the choice. There is absolutely no way that RDNA2 will be faster than that.
The problem is that DLSS isn't widely supported and I don't see that changing either.
RT performance remains a mystery for now, but as long as it's not significantly worse than Turing, I don't see it as a problem personally.
 

GreyHand23

Member
Apr 10, 2018
413
Isn't this game coming out in a week? We don't really know much about it, wild.

The problem is the information we have about it is scattered all over the place. Even their official website barely has any information about the game on it. Obviously we should be getting a lot of info in the next 10 days, but it is a strange way to market a brand new IP from an unknown studio.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Buildzoid feels the same if it ocs.

This I assume?

[Video: Rambling about the AMD Radeon RX 6000 series announcement - youtu.be]

Yeah it makes sense.

Diminishing returns on more CUs, plus I bet the 6800 OCs better than a fuller die.

If the Time Spy leaks are in fact a 6800, yeah, the thing is a beast - and even more so if it has good OC headroom to match/beat a stock 3080 or 6800xt.

But a lot of ifs.

The 3070, which I have, is frankly disappointing as far as OC goes. It is damn near the full die, and even feeding it another 50W doesn't give you much.

I'm a tinkerer. I also have a feeling I'll be able to buy a 6800 vanilla pretty easily, either online or at MC.
 

tusharngf

Member
Oct 29, 2017
2,288
Lordran
FWIW I'd still opt for DLSS+Ampere rather than RDNA2+OC if given the choice. There is absolutely no way that RDNA2 will be faster than that.
The problem is that DLSS isn't widely supported and I don't see that changing either.
RT performance remains a mystery for now, but as long as it's not significantly worse than Turing, I don't see it as a problem personally.


I have been saying this from day 1. DLSS is something that shows up in maybe 1 big game out of 20. It requires effort and optimization that some developers don't want to do. Yes, it's great tech, but it's not common, and hardly 10 games support it at this moment. I wouldn't be worried about AMD cards not having anything similar to DLSS. I have seen DLSS running, and the image quality doesn't look great to me; something is weird about it. I will take 16GB of VRAM over 8-10GB for 80 bucks.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
FWIW I'd still opt for DLSS+Ampere rather than RDNA2+OC if given the choice. There is absolutely no way that RDNA2 will be faster than that.
The problem is that DLSS isn't widely supported and I don't see that changing either.
RT performance remains a mystery for now, but as long as it's not significantly worse than Turing, I don't see it as a problem personally.

The second half of your post doesn't match the first half lol
 

dgrdsv

Member
Oct 25, 2017
12,024

Shadows here and in Dirt5.
I've heard a theory that RDNA2 won't do well in RT workloads which result in lots of divergent rays.
Single-light-source shadows are an easier application of RT and the least visually noticeable - albeit still great in cases where shadow maps fail.
Perfect one-bounce reflections should also fit well into this theory.
GI, though, and stuff like AO/contact shadows would lead to performance issues.
We'll see soon enough.
 
Aug 30, 2020
2,171
Single-light-source shadows are an easier application of RT and the least visually noticeable - albeit still great in cases where shadow maps fail.

It's hard to tell if these shadows even have variable penumbra or if they don't and are just blurred a bit after the fact to simulate it.

It takes a lot more rays to resolve an area-light sun than a point-light sun. I'm not seeing anything in this footage that suggests otherwise.
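For anyone wondering why an area-light sun costs so many more rays than a point-light sun: a point light needs one shadow ray per shaded point, while an area light has to be sampled across its solid angle to resolve the penumbra, and the noise only falls off with the sample count. A toy sketch of the difference (illustrative only, not any particular engine's implementation):

```python
# Toy comparison: shadow rays needed per shaded point for a point-light sun
# vs. an area-light sun (the real sun subtends roughly 0.53 degrees).
# Illustrative only - real renderers use smarter sampling and denoisers.
import math
import random

def point_light_shadow(occluded):
    # One ray: the point is either fully lit or fully shadowed.
    return 0.0 if occluded(0.0, 0.0) else 1.0

def area_light_shadow(occluded, samples=16, angular_radius=math.radians(0.265)):
    # Many rays: jitter directions across the sun's disc and average,
    # which is what produces a soft penumbra (and noise at low sample counts).
    hits = 0
    for _ in range(samples):
        # Random offset within the sun's angular radius (uniform on a disc).
        r = angular_radius * math.sqrt(random.random())
        phi = 2.0 * math.pi * random.random()
        if not occluded(r * math.cos(phi), r * math.sin(phi)):
            hits += 1
    return hits / samples  # fractional visibility, 0..1

# Hypothetical occluder: blocks rays deflected to one side of the sun's disc.
occluder = lambda dx, dy: dx > 0.0
print("point light visibility:", point_light_shadow(occluder))   # hard 0 or 1
print("area light visibility: ", area_light_shadow(occluder))    # ~0.5, noisy
```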
 

dgrdsv

Member
Oct 25, 2017
12,024
[Screenshot]


Nice sharpening filter they got there.

It's hard to tell if these shadows even have variable penumbra or if they don't and are just blurred a bit after the fact to simulate it.

It takes a lot more rays to resolve an area-light sun than a point-light sun. I'm not seeing anything in this footage that suggests otherwise.
It's not even clear what shadows they are doing with RT. He's saying "character and foliage shadows in the distance", but it's a bit weird to put such hugely different things as character and foliage shadows into one bucket - unless they mean something like "dynamic objects far sun shadowing"? Ugh, we need more info.

Here the noise on the ground shadow suggests that it's done with RT: https://youtu.be/ppLFctc0iMU?t=160
But the self-shadowing on the shadow source on the left isn't noisy at all, which suggests that it's not RT.
 

LCGeek

Member
Oct 28, 2017
5,890
This I assume?

[Video: Rambling about the AMD Radeon RX 6000 series announcement - youtu.be]

Yeah it makes sense.

Diminishing returns on more CUs, plus I bet the 6800 OCs better than a fuller die.

If the Time Spy leaks are in fact a 6800, yeah, the thing is a beast - and even more so if it has good OC headroom to match/beat a stock 3080 or 6800xt.

But a lot of ifs.

The 3070, which I have, is frankly disappointing as far as OC goes. It is damn near the full die, and even feeding it another 50W doesn't give you much.

I'm a tinkerer. I also have a feeling I'll be able to buy a 6800 vanilla pretty easily, either online or at MC.

I'm glad the word IF is being stressed, especially post-Ampere.

I, just like other voices, am signing on to something I've been hinting at and hoping for since they talked about PS5 clock speeds.

6800 series performance from the leaks looks good, be it some of the RT discussion happening or the supposed clocks. If some samples are true, we have 2.3 to 2.5GHz in the wild. For a value purchase, the 6800 would be quite a steal if things hold up. Only a few more weeks before we know.
 

brain_stew

Member
Oct 30, 2017
4,754
So for those of you that have already used an AMD card, can you confirm whether AMD now has an easy way to enable half-rate Vsync? I game on an LG B8 OLED so I can't use VRR, and a locked framerate at 60fps is always my number one priority. However, sometimes it's impossible to escape bad PC ports like the recent Watch Dogs: Legion, where this is nigh on impossible on existing hardware. In these instances I'd usually prefer to drop into "console mode" by using half-rate Vsync and adaptive sync through Nvidia's drivers. This never used to be possible back when I had an R9 280X, but is it available now?

Please don't suggest other methods to lock a framerate; I've probably tried them all, and half-rate Vsync is the only way that gives me the same consistent motion I'd get from a 30fps console release.

Seeing how RT effects can really push up VRAM requirements, and seeing examples like Doom Eternal and Watch Dogs already push past the 8GB limit of the 3070, I'm just not confident in the longevity of the 3070. I could also do with the additional raw performance the 6800 brings in order to lock to 60fps at 4K. The fact that the card has more memory bandwidth than the 3070 and 128MB of Infinity Cache, as well as much more pixel and texture fill rate, has to be significant in legacy titles at 4K.

I just hope AMD don't artificially limit the overclocking headroom on the 6800. There's potentially 400MHz+ of headroom above the "game clock", but even just being able to raise the power limit and get it to lock to the default boost clock should leave the 3070 in the dust, as we know that card has barely any headroom.
 

brain_stew

Member
Oct 30, 2017
4,754
Yeah afaik even if you hit vram limits usually that only results in a like a 5fps drop and brute force can keep a card hitting vram limits competitive anyway. We'll see.

Average FPS is a fundamentally flawed measure of a game's smoothness. A 5fps drop can hide a significant increase in frame-time inconsistency, which is a very real possibility if you breach your card's VRAM budget.

I aim to lock my games to 60fps across the board. So if a drop in FPS moves my frame times from 12ms to 15ms, I don't care, as long as they don't ever breach 16.67ms. Alternatively, if running out of VRAM means I go from 11ms to 12ms on average but get some frames that spike up to 18ms as memory is swapped in and out of VRAM, then the latter is going to be a much worse experience, even if it technically has a much higher average FPS.
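A small sketch of the point being made here: two hypothetical frame-time traces can differ in average FPS in the "wrong" direction while only one of them blows the 16.67ms budget. The traces below are made-up numbers, not measurements:

```python
# Average FPS vs. frame-time consistency, using made-up traces.
# Budget for a locked 60 fps presentation is 1000/60 = 16.67 ms per frame.
BUDGET_MS = 1000.0 / 60.0

def summarize(name, frame_times_ms):
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = max(frame_times_ms)
    missed = sum(t > BUDGET_MS for t in frame_times_ms)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {worst:.1f} ms, "
          f"{missed} frames over the 16.67 ms budget")

# Steady card: frame times creep from 12 ms to 15 ms but never miss the budget.
steady = [12.0, 13.0, 14.0, 15.0] * 25

# VRAM-constrained card: ~11-12 ms most of the time, but occasional ~18 ms
# spikes while assets are swapped in and out of VRAM.
spiky = ([11.0, 12.0, 11.5, 12.0] * 24) + [18.0, 18.0, 11.0, 12.0]

summarize("steady", steady)  # lower average fps, but a locked 60
summarize("spiky ", spiky)   # higher average fps, yet visible hitches
```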
 

Readler

Member
Oct 6, 2018
1,974
The second half of your post doesn't match the first half lol
How so? If *every* game supported DLSS, it would be a no-brainer to go for Ampere, but since that isn't the case, stronger raster performance is still relevant.
It now depends on the impact RT is going to have on this, and I do expect more games to support RT than DLSS.
 

dgrdsv

Member
Oct 25, 2017
12,024
You'd know if you hit the vram ceiling. It's more like drops to 5 fps. Repeatedly.
This is only true when you run out of VRAM for render targets.
When you have to swap assets over PCIe during rendering, it's more like heavy stutters every couple dozen seconds, with the game running fine otherwise. Some games even manage to avoid stutters altogether and just run at lower fps. It's not a hard line where you start running out of VRAM; streaming systems adjust to the h/w resources.
 

WishIwasAwolf

Banned
Oct 24, 2020
260
Historically, it really depends on who you ask. If somebody just invested in an 8GB card, they are not going to tell you upcoming games will benefit from 10GB. If they have a 10GB card, however, then suddenly 10GB becomes the perfect balance, 11GB is wasted, and 16GB is an absolute waste of silicon.
 

Readler

Member
Oct 6, 2018
1,974
Historically, it really depends on who you ask. If somebody just invested in an 8GB card, they are not going to tell you upcoming games will benefit from 10GB. If they have a 10GB card, however, then suddenly 10GB becomes the perfect balance, 11GB is wasted, and 16GB is an absolute waste of silicon.
I love how that thread everyone loves to quote is saying you'll be fine with 10 GB for the next 2 years. That's not an accomplishment, that's like surviving one generation with the flagship.
Listen, I want this card to last at least four years, and no way 10 GB are gonna be fine in 4 years. It's like the GTX 770 2GB vs 280X 3GB all over again haha
 
Last edited:

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
I've not really tested this personally, but doesn't turning on RT increase VRAM usage further? Ultra textures, RT and playing in 4K as the 'norm' (because of new consoles) this gen would make me steer clear of 8GB premium cards for sure.
 

WishIwasAwolf

Banned
Oct 24, 2020
260
I love how that thread everyone loves to quote is saying you'll be fine with 10 GB for the next 2 years. That's not an accomplishment, that's like surviving one generation with the flagship.
Listen, I want this card to last at least four years, and no way 10 GB are gonna be fine in 2 years. It's like the GTX 770 2GB vs 280X 3GB all over again haha

I agree. If it's sold as a flagship today then it should last you more than 2 years.
And it's not like Nvidia will limit all their GPUs to 8GB, I expect them to announce 16GB models before the end of the year even. Especially after all the RDNA2 reviews and benchmarks hit
 

brain_stew

Member
Oct 30, 2017
4,754
I've not really tested this personally, but doesn't turning on RT increase VRAM usage further? Ultra textures, RT and playing in 4K as the 'norm' (because of new consoles) this gen would make me steer clear of 8GB premium cards for sure.

Typically an additional 1-2GB according to Nvidia:

[Link: Tips and Tricks: Ray Tracing Best Practices | NVIDIA Technical Blog - developer.nvidia.com]

With 8GB cards already being squeezed to the limit at 4K in a lot of existing games, and RT and assets built around the next generation of consoles set to increase that further, I'm no longer comfortable buying an 8GB card knowing I want it to last 3-4 years.

I'd have been happy with 12GB, maybe even 10GB at a push (as it basically matches the "fast" memory pool in the Series X), but 8GB just doesn't seem like it has legs. It could be similar to the GTX 770 and R9 280X back in the day, where the 2GB vs. 3GB really gave the AMD card a lot more longevity.
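A back-of-the-envelope version of that concern, combining a hypothetical 4K baseline with the 1-2GB RT overhead Nvidia quotes above (the baseline figure is an assumption for illustration, not a measurement):

```python
# Back-of-the-envelope VRAM headroom check.
# baseline_gb is an assumed figure for a demanding 4K ultra title today;
# the 1-2 GB ray tracing overhead is the range quoted by Nvidia above.
baseline_gb = 7.0            # assumption; varies a lot per game
rt_overhead_low, rt_overhead_high = 1.0, 2.0

for card_gb in (8, 10, 12, 16):
    best = card_gb - (baseline_gb + rt_overhead_low)
    worst = card_gb - (baseline_gb + rt_overhead_high)
    print(f"{card_gb} GB card: {worst:+.1f} to {best:+.1f} GB headroom "
          f"with RT at 4K, before next-gen asset growth")
```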

I think I'm pretty set on the 6800 now; I just need UK prices and to see how much overclocking headroom it has. If AMD really lets you go wild and overclock it to a 2.3-2.4GHz sustained boost, it could get awfully close to a 3080 in a lot of games.
 

Alvis

Saw the truth behind the copied door
Member
Oct 25, 2017
11,272
I agree. If it's sold as a flagship today then it should last you more than 2 years.
And it's not like Nvidia will limit all their GPUs to 8GB, I expect them to announce 16GB models before the end of the year even. Especially after all the RDNA2 reviews and benchmarks hit
I kinda feel bad for buying the 8 GB 3070 now lol. Then again, I want a PC now
 