
brain_stew

Member
Oct 30, 2017
4,727
I love how that thread everyone loves to quote is saying you'll be fine with 10 GB for the next 2 years. That's not an accomplishment, that's like surviving one generation with the flagship.
Listen, I want this card to last at least four years, and no way 10 GB are gonna be fine in 4 years. It's like the GTX 770 2GB vs 280X 3GB all over again haha

This forum has a warped perception of the average upgrade cycle. 3-4 years for a GPU upgrade is normal, not some unrealistic expectation. You get barely any value upgrading every generation and the leaps are tiny.

The 3080 and 3070 only being viable for high-end PC gaming for 2 years would be a complete disaster, and is no sort of endorsement. I do think the 10GB 3080 is in a much better spot than the 3070 due to the Series X memory allocation. We may end up with a scenario where the 3070 needs to use Series S assets which wouldn't be a good look for a $500-$600 GPU.

Meanwhile, I intend to go from a GTX 1060 to an RX 6800 and know I'll be getting a 3-4x performance improvement as well as lots of new features.
 

brain_stew

Member
Oct 30, 2017
4,727
So VRAM really is PC's SSD in this HW generation cycle, huh?

It's not exactly a new thing; many have been caught out and had the lifespan of their GPUs cut unexpectedly short by low VRAM in prior generations.

I'm thinking of the 320MB 8800 GTS and the 2GB GTX 770 as good examples here; they both aged terribly as their VRAM proved not to be enough for the generation.

I'm not convinced that the 10GB 3080 is going to run into too much trouble but I'm less confident about the 8GB 3070.

Of course, part of this is me trying to justify the AMD card after missing out on a 3070 at launch but there is something to it.
 

Readler

Member
Oct 6, 2018
1,972
This forum has a warped perception of the average upgrade cycle. 3-4 years for a GPU upgrade is normal, not some unrealistic expectation. You get barely any value upgrading every generation and the leaps are tiny.

The 3080 and 3070 only being viable for high-end PC gaming for 2 years would be a complete disaster, and is no sort of endorsement. I do think the 10GB 3080 is in a much better spot than the 3070 due to the Series X memory allocation. We may end up with a scenario where the 3070 needs to use Series S assets which wouldn't be a good look for a $500-$600 GPU.

Meanwhile, I intend to go from a GTX 1060 to an RX 6800 and know I'll be getting a 3-4x performance improvement as well as lots of new features.
If I weren't unemployed due to Rona, I'd be earning a pretty good salary, and even then I couldn't justify upgrading every gen.

If we assume that everything that AMD has said about the 6000 series is true (grain of salt and all), that'd make the 3070 a terrible value proposition: 8GB of GDDR6 (not GDDR6X like on the 3080) is just embarrassing.

I mean yeah, who knows what's gonna happen etc., but from what we know NOW, Nvidia is being stingy again when it comes to VRAM.
 

Serious Sam

Banned
Oct 27, 2017
4,354
User Banned (2 Weeks): Antagonizing Other Users and Inflammatory Commentary; Prior Infractions for the Same
I love how that thread everyone loves to quote is saying you'll be fine with 10 GB for the next 2 years. That's not an accomplishment, that's like surviving one generation with the flagship.
Listen, I want this card to last at least four years, and no way 10 GB are gonna be fine in 4 years. It's like the GTX 770 2GB vs 280X 3GB all over again haha
The 10GB VRAM thread is equivalent to this gif.

giphy.gif


Just a bunch of people swept up in 3080 hype trying to shut down legitimate criticisms and concerns, despite the obvious trend of VRAM usage skyrocketing. Just look at WD Legion and now Godfall. There is a reason why AMD didn't go under 16GB even for their cheapest announced card, the RX 6800. I completely agree that a 2-year upgrade cycle is absolute bullshit. How many people in the PC gaming community do that? 1% of 1%? I consider myself a PC enthusiast and yet I still skip 1-2 GPU gens, so more VRAM is a highly valuable feature to me.
 

Tovarisc

Member
Oct 25, 2017
24,404
FIN
^^ SSD is the SSD of this coming generation.

Fast storage will actually mean something now... hopefully.

My post was meta commentary on everyone losing their minds over it.

Has there ever been a HW generation cycle in which VRAM didn't at least double?

Technically 980Ti to 1080Ti didn't double (6GB vs. 11GB), but the 2080Ti rocks the same 11GB as the 1080Ti, and the 3080 "regressed" that to 10GB. A lot of the gains have come from pure performance improvements in the memory itself (GDDR5 to GDDR6 to GDDR6X...).

I'm not convinced that the 10GB 3080 is going to run into too much trouble but I'm less confident about the 8GB 3070.

In a few years that 8GB could get dicey, especially above 1080p, but at that point 2080Ti / 3070 levels of rendering power will most likely start to limit performance too. If a gamer is okay with lower settings etc., both aspects will be less of an issue and the card will hold out longer, but if they want to keep those settings up, there could be trouble.

Just look at WD Legion and now Godfall.

What is there to look at?

Legion doesn't even break 8 GB on V.High settings and RTX @ 3440 x 1440, and we have no clue what Godfall's real VRAM usage is. I'm amazed they didn't go all in on AMD marketing with the 16GB rec.
 

WishIwasAwolf

Banned
Oct 24, 2020
260
My post was meta commentary on everyone losing their minds over it.



Technically 980Ti to 1080Ti didn't double (6GB vs. 11GB), but the 2080Ti rocks the same 11GB as the 1080Ti, and the 3080 "regressed" that to 10GB. A lot of the gains have come from pure performance improvements in the memory itself (GDDR5 to GDDR6 to GDDR6X...).



In a few years that 8GB could get dicey, especially above 1080p, but at that point 2080Ti / 3070 levels of rendering power will most likely start to limit performance too. If a gamer is okay with lower settings etc., both aspects will be less of an issue and the card will hold out longer, but if they want to keep those settings up, there could be trouble.

I thought you meant HW generation based on (console) game design.
 

Serious Sam

Banned
Oct 27, 2017
4,354
Legion doesn't even break 8 GB on V.High settings and RTX @ 3440 x 1440, and we have no clue what Godfall's real VRAM usage is. I'm amazed they didn't go all in on AMD marketing with the 16GB rec.
People gaming at 1440p and below shouldn't even concern themselves with VRAM. 4K is an entirely different beast when it comes to VRAM usage.
 

brain_stew

Member
Oct 30, 2017
4,727
we have no clue what Godfall's real VRAM usage is. I'm amazed they didn't go all in on AMD marketing with the 16GB rec.

You forget the 6700 series will have 12GB VRAM ;).

12GB is the perfect AMD marketing value: even the 2080Ti misses out, and the only price point Nvidia can offer it at is $1500. Meanwhile AMD will have $400-$500 cards that meet the spec.

It's almost too perfect, so I do believe a lot of it has to do with AMD marketing.

That's not to say that there may not be some validity to it. If the game is throwing out 4096x4096 textures like they're going out of fashion, then yeah, an 8GB card may struggle with that.
 

Readler

Member
Oct 6, 2018
1,972
My post was meta commentary on everyone losing their minds over it.



Technically 980Ti to 1080Ti didn't double (6GB vs. 11GB), but the 2080Ti rocks the same 11GB as the 1080Ti, and the 3080 "regressed" that to 10GB. A lot of the gains have come from pure performance improvements in the memory itself (GDDR5 to GDDR6 to GDDR6X...).



In a few years that 8GB could get dicey, especially above 1080p, but at that point 2080Ti / 3070 levels of rendering power will most likely start to limit performance too. If a gamer is okay with lower settings etc., both aspects will be less of an issue and the card will hold out longer, but if they want to keep those settings up, there could be trouble.



What is there to look at?

Legion doesn't even break 8 GB on V.High settings and RTX @ 3440 x 1440, and we have no clue what Godfall's real VRAM usage is. I'm amazed they didn't go all in on AMD marketing with the 16GB rec.
I'd say it's less about losing minds than about being unhappy with how stingy Nvidia is being about it.

These cards are being advertised as 4K cards - 1080p shouldn't even be part of this discussion - and I definitely see the 3070 struggling to keep up with VRAM usage once the first actual next-gen games roll in, especially since the improvement in memory does not apply to it.
I just don't see why people are defending a billion-dollar company for their shitty decisions on already overpriced GPUs. The 3080 should ideally have had more VRAM, and the 3070 is definitely just bad value at this point if AMD delivers.
 

Serious Sam

Banned
Oct 27, 2017
4,354
You really believe we will be playing in native 4K on PC?
I don't. Not with current cards at least. And not because of VRAM.
Me and many others have been gaming in native 4K on PC for years. Although in very demanding games you may need to cap FPS to 30, you still need the same amount of VRAM in 4K; it doesn't really matter if you run at 30 or 60 or 120 FPS.
 
Last edited:

shinbojan

Member
Oct 27, 2017
1,101
Are you being deliberately obtuse? Me and many others have been gaming in native 4K on PC for years. Although in very demanding games you may need to cap FPS to 30, you still need the same amount of VRAM in 4K; it doesn't really matter if you run at 30 or 60 or 120 FPS.

I've been playing some games in 4k, but not all of them. Legion already shows what we can expect in the future.
And no, I don't upgrade my pc to play games in 30fps.
And please, avoid insults if possible (I know that it probably ain't).
 
Nov 8, 2017
13,097
Has there ever been a HW generation cycle in which VRAM didn't at least double?

PS2: 32MB RDRAM + 4MB eDRAM
PS3: 256MB XDR RAM + 256MB GDDR3
PS4: 8GB GDDR5 (~5GB available to games for all purposes)
PS5: 16GB GDDR6 (~13-13.5GB available to games for all purposes)

This generation is - no contest - the smallest increase in memory ever on the consoles. It's not usually a doubling, it's usually like an 8 or 16x increase.
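For reference, those generational jumps work out roughly like this (a quick back-of-the-envelope sketch; the memory totals are the rough figures listed above):

```python
# Generation-over-generation ratios for total PlayStation memory,
# using the rough figures quoted above (in MB).
gens = {
    "PS2": 36,         # 32MB RDRAM + 4MB eDRAM
    "PS3": 512,        # 256MB XDR + 256MB GDDR3
    "PS4": 8 * 1024,   # 8GB GDDR5
    "PS5": 16 * 1024,  # 16GB GDDR6
}

names = list(gens)
for prev, cur in zip(names, names[1:]):
    print(f"{prev} -> {cur}: {gens[cur] / gens[prev]:.1f}x")
# PS2 -> PS3: 14.2x, PS3 -> PS4: 16.0x, PS4 -> PS5: 2.0x
```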

People are theorycrafting that maybe we will have massively ballooning video memory requirements because of the SSDs but the hard reality is that most consumer GPUs will have much less than 8GB for years to come, and the most popular cards sold are below the tier that both AMD and Nvidia have announced right now - it will be the RX 6600s and the RTX 3060s that dominate the sales charts.
 
Nov 8, 2017
13,097
Are you being deliberately obtuse? Me and many others have been gaming in native 4K on PC for years. Although in very demanding games you may need to cap FPS to 30, you still need the same amount of VRAM in 4K; it doesn't really matter if you run at 30 or 60 or 120 FPS.

You're making appeals to popularity on the idea of 2-year upgrades, but gaming at 4K puts you in a tiny minority of PC users (<2.5%). Your use case is perfectly valid, but it isn't common, and it's totally OK to recommend different cards for different use cases.
 

Serious Sam

Banned
Oct 27, 2017
4,354
I've been playing some games in 4k, but not all of them. Legion already shows what we can expect in the future.
And no, I don't upgrade my pc to play games in 30fps.
And please, avoid insults if possible (I know that it probably ain't).
No insults intended (edited my post). I think Legion is a terrible example as far as general performance is concerned, but it's interesting to see the general VRAM trend going up. Game optimization usually isn't related to VRAM usage; you can have an insanely optimized game like DOOM that still loves to use heaps of VRAM in 4K.

And yes, of course the native 4K goal will be harder to achieve if GPU manufacturers continue to stagnate in the VRAM department. Luckily there is some competition and AMD is moving the ball forward. It will be really interesting to see how AMD exploits this strength in future marketing and AMD-sponsored titles; we saw a glimpse of it in the Godfall video.
 

WishIwasAwolf

Banned
Oct 24, 2020
260
PS2: 32MB RDRAM + 4MB eDRAM
PS3: 256MB XDR RAM + 256MB GDDR3
PS4: 8GB GDDR5 (~5GB available to games for all purposes)
PS5: 16GB GDDR6 (~13-13.5GB available to games for all purposes)

This generation is - no contest - the smallest increase in memory ever on the consoles. It's not usually a doubling, it's usually like an 8 or 16x increase.

People are theorycrafting that maybe we will have massively ballooning video memory requirements because of the SSDs but the hard reality is that most consumer GPUs will have much less than 8GB for years to come, and the most popular cards sold are below the tier that both AMD and Nvidia have announced right now - it will be the RX 6600s and the RTX 3060s that dominate the sales charts.

Look at PC GPUs at the start of a console generation and their ability to run mid-to-late-generation games. VRAM usually becomes an early bottleneck.
A recent example: a 770 could easily destroy whatever GPU the PS4 had. Its 2GB, however, cannot even run modern games like RDR2 above 20fps without excessive frame drops. At the beginning it ran everything at twice the frame rate of its console counterparts.
VRAM is future-proofing.
 
Nov 2, 2017
2,275
This forum has a warped perception of the average upgrade cycle. 3-4 years for a GPU upgrade is normal, not some unrealistic expectation. You get barely any value upgrading every generation and the leaps are tiny.

The 3080 and 3070 only being viable for high-end PC gaming for 2 years would be a complete disaster, and is no sort of endorsement.
That's the unfortunate reality, though, when you buy hardware at the start of a new gen. None of the new cards are all that much faster than the consoles, so by the standards of people who buy a GPU for $500+, I'd say these cards are not going to last much longer than 1-2 years if you want to match consoles at 60fps. You'd need at minimum 2x the power for that, and none of the new cards offer that. No amount of VRAM is going to make up for that. 3-4 year upgrade cycles are only realistic a couple of years into a console gen, when GPUs are already much faster than the GPU baseline of the consoles.

If you're uncomfortable with spending $500+ on a GPU that will only last for 2 years, then I suggest you cut your budget and buy a console or a $200-300 GPU to last through the next 2 years.

I do think the 10GB 3080 is in a much better spot than the 3070 due to the Series X memory allocation. We may end up with a scenario where the 3070 needs to use Series S assets which wouldn't be a good look for a $500-$600 GPU.

Once next gen starts you're going to have to drop settings or resolution anyway to get a decent framerate. Also, consoles have shared memory, so both the 3080 & 3070 have more available memory than the XSX & XSS respectively. Not that it will help them.

Either people overestimate the new GPUs or they underestimate the power of next gen. I had this same discussion last gen when people were contemplating whether to buy the 2GB or 4GB version of the 770/780. I argued that the 4GB version was a waste of money, as none of these cards would last long enough for it to matter with next gen, and I was right. The 4GB versions were trash 1-2 years into the gen too. Now, Kepler did age spectacularly badly, but the gap between high-end GPUs & consoles was much, much larger then. Even a 7970 was more than twice as fast as the consoles back then. The 6900XT isn't even that.

Are you being deliberately obtuse? Me and many others have been gaming in native 4K on PC for years. Although in very demanding games you may need to cap FPS to 30, you still need the same amount of VRAM in 4K; it doesn't really matter if you run at 30 or 60 or 120 FPS.

Hitting 4K for the past few years in current-gen games and hitting 4K in next-gen games are two different things. How do you think a 3080 is going to fare at 4K in a next-gen 30fps game with a dynamic res that drops to 1440p? It's not going to do well at all, not even if you're targeting 30fps. It'll be okay for 60fps 4K games and games that actually do target native 4K, but it's naive to think there won't be games that target 30fps at sub-4K resolution on consoles.
 
Last edited:

Simuly

Alt-Account
Banned
Jul 8, 2019
1,281
People are theorycrafting that maybe we will have massively ballooning video memory requirements because of the SSDs but the hard reality is that most consumer GPUs will have much less than 8GB for years to come, and the most popular cards sold are below the tier that both AMD and Nvidia have announced right now - it will be the RX 6600s and the RTX 3060s that dominate the sales charts.

What is your point? None of us here are talking about low-end or volume graphics cards and 1080p/30fps settings. This is about whether these premium graphics cards can handle settings at the highest end or not. It's irrelevant that the most popular card will be the 3060.
 

WishIwasAwolf

Banned
Oct 24, 2020
260
Excellent points napata.

I will add that this generation might be a bit weird because a lot of extremely taxing techniques are being introduced into games. With a focus from Nvidia and a promise from AMD on ML-based upscaling, some requirements might be offset. But then again, the lead platforms might use ML upscaling as well. Games are not developed in vacuums anymore, and PC is now often an important pillar instead of an afterthought. So a focus on the 3060, with the rest targeting ultra, could be possible for sure.
 
Nov 8, 2017
13,097
Look at PC GPUs at the start of a console generation and their ability to run mid-to-late-generation games. VRAM usually becomes an early bottleneck.
A recent example: a 770 could easily destroy whatever GPU the PS4 had. Its 2GB, however, cannot even run modern games like RDR2 above 20fps without excessive frame drops. At the beginning it ran everything at twice the frame rate of its console counterparts.
VRAM is future-proofing.

A 770 is a 7 year old GPU, not a 4 year old one. But you can also do the math here - PS4 had 5.5 GB of memory available to games, XSX has 13.5 (and XSS has 7.5).

The equivalent of 2GB in 2013, which is what the common 770 variants had, would be 4.9GB today. 10GB today would be more like a 2013 GPU having 4GB (~4.1 or something is the exact math, but you know, round numbers). To this day, 4GB can run games pretty well at 1080p (the most common resolution today - in 2013 you might have been on 1680x1050 or 1366x768, but 1080p wasn't rare). Your 2013 GPUs are not running modern games at ultra settings; they're compromising on every setting, usually quite heavily.
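Spelled out, the scaling used there is just a ratio of the "available to games" figures (a minimal sketch using the approximate ~5.5GB and ~13.5GB numbers from the post):

```python
# Scale 2013-era VRAM amounts by the growth in console memory available to games:
# PS4 ~5.5GB -> Xbox Series X ~13.5GB (approximate figures from the post).
ps4_game_mem = 5.5
xsx_game_mem = 13.5
scale = xsx_game_mem / ps4_game_mem      # ~2.45x

print(f"2GB in 2013 is roughly {2 * scale:.1f}GB today")    # ~4.9GB
print(f"10GB today is roughly {10 / scale:.1f}GB in 2013")  # ~4.1GB
```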

What is your point? None of us here are talking about low-end or volume graphics cards and 1080p/30fps settings. This is about whether these premium graphics cards can handle settings at the highest end or not. It's irrelevant that the most popular card will be the 3060.

It's relevant because obviously games will be able to scale down effectively to 8GB and 6GB cards for years to come; otherwise only cutting-edge enthusiasts would be able to play.

The importance of being able to not drop texture settings for some indefinite amount of time is massively overinflated. We will all be dropping tons of settings within 3-4 years in order to maintain performance targets. The drop from "ultra" to "very high" or (gasp) "high" will not be a significant weight on people's minds in 2023-2024.
 
Nov 8, 2017
13,097
Who said 770 is a 4 year old GPU?

It's not directly anything you said; I'm just situating it in the broader discussion going on - it's pretty different to wonder about timelines 3-4 years from now than to wonder about something 7 years out. Most of what I'm talking about in all my posts is a question of ~4 years or so.
 

WishIwasAwolf

Banned
Oct 24, 2020
260
But there are no data points for the past 4 years, so that would be mostly hypothetical, correct?
We have had 8GB for years now, and we are at the dawn of a new generation of games.
 

shinbojan

Member
Oct 27, 2017
1,101
but it's interesting to see general VRAM trend going up

I also fear that 10GB won't be enough.
I really want a new card, but I keep reminding myself that buying a GPU now is stupid, as we don't know how next-gen consoles will affect PC requirements.
Not being able to buy a 3080 for $700 might be a blessing in disguise.
 

Serious Sam

Banned
Oct 27, 2017
4,354
I also fear that 10GB won't be enough.
I really want a new card, but I keep reminding myself that buying a GPU now is stupid, as we don't know how next-gen consoles will affect PC requirements.
Not being able to buy a 3080 for $700 might be a blessing in disguise.
Yeah, not being able to buy RTX 30 cards at launch might be a blessing in disguise after all. Big Navi's power caught many off guard. Now I'm really curious to wait and see how the GPU market looks once the 6000 cards are out, as well as the rumored 3080Ti.

Radeon already has an answer to ray tracing, and a DLSS alternative is coming too. It's really hard to recommend the 3070 over the 6800 right now. Maybe if you require DLSS immediately and can't afford to wait for AMD's solution. But AMD's DLSS variant will come, and the 3070 will forever be stuck with 8GB.
 
Nov 2, 2017
2,275
Look at PC GPUs at the start of a console generation and their ability to run mid-to-late-generation games. VRAM usually becomes an early bottleneck.
A recent example: a 770 could easily destroy whatever GPU the PS4 had. Its 2GB, however, cannot even run modern games like RDR2 above 20fps without excessive frame drops. At the beginning it ran everything at twice the frame rate of its console counterparts.
VRAM is future-proofing.
A 4GB 770 doesn't do much better in RDR2. I said it in my example above but VRAM isn't the issue for the 770. In current games the 770 isn't actually much faster than the PS4, because Kepler is not suited for current gaming workloads.

My overall point is that at the start of the gen power is a bigger issue than VRAM and that changes as the gen progresses and GPUs become significantly more powerful than console GPUs. This is especially the case next gen where the gap between high end GPUs & consoles isn't really that big. The 390 vs 970 is a better example where VRAM does matter to a degree whereas it didn't really matter for the 770.

At the start of a gen is the worst time to futureproof and IMO it's better to spend as little as possible on hardware if this is a concern.
 

WishIwasAwolf

Banned
Oct 24, 2020
260
A 4GB 770 doesn't do much better in RDR2. I said it in my example above but VRAM isn't the issue for the 770. In current games the 770 isn't actually much faster than the PS4, because Kepler is not suited for current gaming workloads.

My overall point is that at the start of the gen power is a bigger issue than VRAM and that changes as the gen progresses and GPUs become significantly more powerful than console GPUs. This is especially the case next gen where the gap between high end GPUs & consoles isn't really that big. The 390 vs 970 is a better example where VRAM does matter to a degree whereas it didn't really matter for the 770.

At the start of a gen is the worst time to futureproof and IMO it's better to spend as little as possible on hardware if this is a concern.

I guess the 8800 was an exception, though it did come out a year after the 360 if I'm not mistaken. But yeah, best to wait a while if you are looking to future-proof anything.
(I'd take a bet on the 6800XT maybe, based on the similarities in architecture. Excluding VRAM, the 3080 is a powerful option as well.)
 

tokkun

Member
Oct 27, 2017
5,400
A 770 is a 7 year old GPU, not a 4 year old one. But you can also do the math here - PS4 had 5.5 GB of memory available to games, XSX has 13.5 (and XSS has 7.5).

The equivalent of 2GB in 2013, which is what the common 770 variants had, would be 4.9GB today. 10GB today would be more like a 2013 GPU having 4GB (~4.1 or something is the exact math, but you know, round numbers). To this day, 4GB can run games pretty well at 1080p (the most common resolution today - in 2013 you might have been on 1680x1050 or 1366x768, but 1080p wasn't rare). Your 2013 GPUs are not running modern games at ultra settings; they're compromising on every setting, usually quite heavily.

Since you are comparing the ratio of unified RAM in a console with the VRAM of a graphics card over time, the math only makes sense if your premise is that graphics and general-purpose RAM use in games grow at the same rate.
 
Nov 2, 2017
2,275
I guess the 8800 was an exception, though it did come out a year after the 360 if I'm not mistaken. But yeah, best to wait a while if you are looking to future-proof anything.
(I'd take a bet on the 6800XT maybe, based on the similarities in architecture. Excluding VRAM, the 3080 is a powerful option as well.)
The 6800XT & 3080 are only 60-70% faster than the XSX, and I just don't think that's enough for 4K in full next-gen games. I think they'll be fine for 1080p, and maybe 1440p if you lower settings, for the next 4-5 years if you aim for 60fps though. I think you need at least 2-3x the console GPU to comfortably play games at the same resolution as a console at 60fps, like the 390 & 970.
 
Nov 8, 2017
13,097
Since you are comparing the ratio of the unified RAM in a console with the VRAM of a graphics card over time, the math only makes sense if your premise is that graphics and general purpose RAM use in games grow at the same rate.

Do you have a basis to think it's significantly different? The XSX has 10GB accessible in its "fast" pool suitable for graphics work, so that represents something of an upper bound for what I'd expect developers to be optimizing around, but unless you have detailed knowledge you can share with us regarding what a typical graphics memory budget looked like this gen, I don't know if we have any point of comparison.

It varies, right? Ultra settings in Gears 5 (a 2019 game) run fine at 4GB VRAM / 1080p. But Wolfenstein 2 is heavier on its higher settings, and you want a lot more VRAM if you're maxing things. Not that you need to; the game only looks a little worse within a 4GB budget. eSports stuff is trivial. The concept of "maxing" and "ultra" is so variable game to game. There will always be boundary-pushing games that will grow into whatever the cards of the day allow them to do (and have settings - not VRAM-related - that bring systems to their knees and are just bad ideas all round).

I distinctly remember the VRAM panic when Shadow of Mordor had an optional texture pack that called for 6GB. It barely looked better, and it turns out that people on 4GB cards are fine to this day at console+ settings. Even 3GB lasted a while without awful compromises. 2GB was indeed low - but you could probably conservatively say you can play 90% of major releases to this day, though going below console texture settings in a fair few cases. But even when you do fit games in your VRAM budget, a 770 is a bad experience in 2020 for other reasons.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
The 6800XT & 3080 are only 60-70% faster than the XSX, and I just don't think that's enough for 4K in full next-gen games. I think they'll be fine for 1080p, and maybe 1440p if you lower settings, for the next 4-5 years if you aim for 60fps though. I think you need at least 2-3x the console GPU to comfortably play games at the same resolution as a console at 60fps, like the 390 & 970.

Those cards are slower than a One X though...
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
PS2: 32MB RDRAM + 4MB eDRAM
PS3: 256MB XDR RAM + 256MB GDDR3
PS4: 8GB GDDR5 (~5GB available to games for all purposes)
PS5: 16GB GDDR6 (~13-13.5GB available to games for all purposes)

This generation is - no contest - the smallest increase in memory ever on the consoles. It's not usually a doubling, it's usually like an 8 or 16x increase.

People are theorycrafting that maybe we will have massively ballooning video memory requirements because of the SSDs but the hard reality is that most consumer GPUs will have much less than 8GB for years to come, and the most popular cards sold are below the tier that both AMD and Nvidia have announced right now - it will be the RX 6600s and the RTX 3060s that dominate the sales charts.

Those cards will have 8GB.
 

Jroc

Banned
Jun 9, 2018
6,145
I also fear that 10GB won't be enough.
I really want a new card, but I keep reminding myself that buying a GPU now is stupid, as we don't know how next-gen consoles will affect PC requirements.
Not being able to buy a 3080 for $700 might be a blessing in disguise.

I think the 10GB 3080 might end up being like the 3GB 780Ti that released at the start of the PS4 era. Decently capable, but hamstrung by memory and blown away by the subsequent generation.
 

HanzSnubSnub

Member
Oct 27, 2017
917


So this popped into my recommended feed; it's from 6 years ago, when Dr. Su became CEO.

One thing she has been consistent on is that AMD is at its best when it's innovating.

They take calculated risks and have miraculously been able to practically close the gap with their competitors. It makes RDNA2 and future products even more exciting.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Yeah, it kinda feels like a 3GB 780 Ti situation. Also with AMD having a better card.

Funny though, I actually had a 6GB 780 for a while.
 

fluffy pillow

Member
Sep 12, 2018
154
2GB was indeed low - but you could probably conservatively say you can play 90% of major releases to this day, though going below console texture settings in a fair few cases. But even when you do fit games in your VRAM budget, a 770 is a bad experience in 2020 for other reasons.
2GB became a complete nightmare a few years ago. I had a 270X 2GB and a lot of games became that telltale "not enough VRAM" stuttery mess despite the card having enough grunt to manage 1080p/30 in every respect other than memory. The comparison between the same game on my 270X and my gf's 280 3GB was like night and day, far more so than the power disparity between the cards would suggest.

I'm never going to underbuy on VRAM ever again after that horrible experience. (I have an RX580 8GB right now and I'm eyeing a 6800/XT.)
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,638
I have the option to return my 3080 until March; I'd genuinely consider a 6800XT as a like-for-like swap just for that peace of mind with VRAM, and also because it's actually a bit of a beast.

Let's see how launch goes.
 

tokkun

Member
Oct 27, 2017
5,400
Do you have a basis to think it's significantly different? The XSX has 10GB accessible in its "fast" pool suitable for graphics work, so that represents something of an upper bound for what I'd expect developers to be optimizing around, but unless you have detailed knowledge you can share with us regarding what a typical graphics memory budget looked like this gen, I don't know if we have any point of comparison.

It varies, right? Ultra settings in Gears 5 (a 2019 game) run fine at 4GB VRAM / 1080p. But Wolfenstein 2 is heavier on its higher settings, and you want a lot more VRAM if you're maxing things. Not that you need to; the game only looks a little worse within a 4GB budget. eSports stuff is trivial. The concept of "maxing" and "ultra" is so variable game to game. There will always be boundary-pushing games that will grow into whatever the cards of the day allow them to do (and have settings - not VRAM-related - that bring systems to their knees and are just bad ideas all round).

I distinctly remember the VRAM panic when Shadow of Mordor had an optional texture pack that called for 6GB. It barely looked better, and it turns out that people on 4GB cards are fine to this day at console+ settings. Even 3GB lasted a while without awful compromises. 2GB was indeed low - but you could probably conservatively say you can play 90% of major releases to this day, though going below console texture settings in a fair few cases. But even when you do fit games in your VRAM budget, a 770 is a bad experience in 2020 for other reasons.

No, I don't have any insider information on what RAM target budgets look like at various game developers.

However, I do work as a systems engineer, and my experience is that it is pretty rare for various resource demands to all scale uniformly over time. We go through periods where demand is growing much faster for one of CPU / RAM / HDD / SSD / network than for the others. There is a degree to which the workload will grow to fit the hardware. If you give people a surplus of RAM and constrain them on CPU, they'll eventually switch to algorithms that trade off memory to save CPU. I'd be surprised if game developers are much different.
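As a generic illustration of that kind of memory-for-CPU trade-off (just a sketch, not anything specific to game engines or any particular developer's code):

```python
from functools import lru_cache

# Spend RAM to save CPU: cache the results of an expensive computation so
# repeated calls with the same input become a cheap dictionary lookup.
@lru_cache(maxsize=None)  # unbounded cache - more memory used, less CPU burned
def expensive(n: int) -> int:
    # stand-in for a costly computation
    return sum(i * i for i in range(n))

expensive(1_000_000)  # computed once...
expensive(1_000_000)  # ...then served from the cache
```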
 