Oct 25, 2017
4,750
Norman, OK
To those criticizing the OP for not using next-gen games... isn't MSFS already that?

I would argue that Control is a next-gen game on PC right now. As far as I'm concerned, DLSS 2.0 + Ray Tracing = next gen on PC (not saying those features are required for a game to be "next-gen", just saying they're sufficient). Cyberpunk, Watch Dogs: Legion, COD 2020: all of these games will be a noticeably better experience on PC with Turing and above than on either next-gen console, simply because of the presence of DLSS 2 and RTX features.

As for this ridiculous ongoing VRAM argument: what it really boils down to is that the people who are whining about 10GB in the 3080 are basically saying that Nvidia doesn't know what they're doing. That's just a goofy notion.
 

F34R

Member
Oct 27, 2017
11,989
Thank you for posting this. This is exactly why I'm worried. I have little doubt that within the next 4 years (how long I would plan to keep a 3080) there will be games released that are more than 10% more demanding than FS from a VRAM perspective.

While I appreciate the work put into the OP, and I mean no disrespect with this, I have serious concerns about his qualifications to speak with such certainty about what will be required years from now.

The fact that we're already so close to the limit leads me to believe that 10GB will likely be a bottleneck in a couple of years.
Of course, I, like the OP, cannot know for certain and am simply guessing based on attempted extrapolations using past experiences.
I'll fly around some more areas and see if I can get it to >10GB
 

Deleted member 17184

User-requested account closure
Banned
Oct 27, 2017
5,240
I would argue that Control is a next-gen game on PC right now. As far as I'm concerned, DLSS 2.0 + Ray Tracing = next gen on PC (not saying those features are required for a game to be "next-gen", just saying they're sufficient). Cyberpunk, Watch Dogs: Legion, COD 2020: all of these games will be a noticeably better experience on PC with Turing and above than on either next-gen console, simply because of the presence of DLSS 2 and RTX features.

As for this ridiculous ongoing VRAM argument: what it really boils down to is that the people who are whining about 10GB in the 3080 are basically saying that Nvidia doesn't know what they're doing. That's just a goofy notion.
Yeah, I thought about Control, too. I haven't tried it with ray-tracing and DLSS, but it's probably the best showcase of that right now. I just focused on MSFS in my post because it was the OP's main example.
 
Oct 25, 2017
4,750
Norman, OK
Thank you for posting this. This is exactly why I'm worried. I have little doubt that within the next 4 years (how long I would plan to keep a 3080) there will be games released that are more than 10% more demanding than FS from a VRAM perspective.

While I appreciate the work put into the OP, and I mean no disrespect with this, I have serious concerns about his qualifications to speak with such certainty about what will be required years from now.

The fact that we're already so close to the limit leads me to believe that 10GB will likely be a bottleneck in a couple of years.
Of course, I, like the OP, cannot know for certain and am simply guessing based on attempted extrapolations using past experiences.

I mean, if your position is that you need to be playing at native 4K or above with ultra/maxed textures 4 years down the road on this purchase, then you need to be looking at the 3090 anyways, and for more reasons than just the additional VRAM.
 
Nov 14, 2017
4,928
I would argue that Control is a next-gen game on PC right now. As far as I'm concerned, DLSS 2.0 + Ray Tracing = next gen on PC (not saying those features are required for a game to be "next-gen", just saying they're sufficient). Cyberpunk, Watch Dogs: Legion, COD 2020: all of these games will be a noticeably better experience on PC with Turing and above than on either next-gen console, simply because of the presence of DLSS 2 and RTX features.

As for this ridiculous ongoing VRAM argument: what it really boils down to is that the people who are whining about 10GB in the 3080 are basically saying that Nvidia doesn't know what they're doing. That's just a goofy notion.
No, I'm actually saying that Nvidia were constrained by the GDDR6X density currently available and are trying to sell this year's product. I think for most people it's better to wait until next year, unless you have a serious need to buy this year.
 

TSM

Member
Oct 27, 2017
5,821
I'm not sure I agree with the premise of this thread. I look back to when people were arguing just as hard as the OP that 2GB of RAM was enough at the start of last gen, and we eventually got games where 4GB wasn't enough. I personally think 10GB will be enough for the next couple of years, but I also think we'll see games pushing past the limits of 10GB before 2024. Of course, some of the new memory-hungry stuff will likely also want more powerful hardware as well. 10GB is great for Nvidia right now because it's more than enough in 2020, and it will give people a good reason to upgrade down the road.
 
Oct 26, 2017
3,327
No, I'm actually saying that Nvidia were constrained by the GDDR6X density currently available and are trying to sell this year's product. I think for most people it's better to wait until next year, unless you have a serious need to buy this year.

This will always be the case. If people think console multiplat games are gonna be more demanding than Flight Simulator...

 
Oct 25, 2017
4,750
Norman, OK
No, I'm actually saying that Nvidia were constrained by the GDDR6X density currently available and are trying to sell this year's product. I think for most people it's better to wait until next year, unless you have a serious need to buy this year.

It's always the case where it's better to wait until "next year" either for a new gen of cards or a mid-gen refresh. And for more reasons than just VRAM.
 

Tatsu91

Banned
Apr 7, 2019
3,147
8-10GB will likely be the minimum baseline if you're at 1440p or lower, but more will benefit 4K.
 

Lady Gaia

Member
Oct 27, 2017
2,478
Seattle
Lady Gaia, anexanhume, apologies for dragging you into another thread, but do you have any insight into the subject of how the Cell processor was used for what would've traditionally been GPU tasks?

The Cell's SPEs aren't really all that well suited to traditional rendering, so what was originally planned would likely have been a dramatic departure from the conventional rendering pipeline. In any event, that was completely scrapped. What they are very good at is DSP-style tasks, working on an isolated chunk of memory copied from system RAM in a very efficient manner, and then copying the result back while working on the next chunk. This would be best suited to post-processing effects that would otherwise be really bandwidth constrained on the GPU, like applying a gaussian blur for depth of field, or adding bloom to an otherwise complete image.

That all feels like something of an aside from the message you were replying to, though, which was focused on VRAM. The SPEs wouldn't act as an extension of VRAM but rather as a temporary copy, something like cache but with much more explicit management. It didn't amount to a ton in any case, as 256KiB per SPE added up to a grand total of 2MiB. I concur that the leap in RAM/VRAM, both in size and bandwidth, from PS3 to PS4 was huge and one of the bigger steps forward for the generation alongside the GPU itself.
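Purely as an illustration of that chunk-in, process, chunk-out pattern (a rough sketch in plain C++, with std::memcpy standing in for the SPE's asynchronous DMA and small scratch arrays standing in for its 256KiB local store; every name and size here is made up):

```cpp
// Rough sketch only: double-buffered chunk processing, the DSP-style pattern
// described above. std::memcpy stands in for the SPE's DMA engine and the
// scratch buffers stand in for its local store.
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <cstring>
#include <vector>

constexpr std::size_t kChunk = 64 * 1024;   // made-up "local store" chunk size in bytes

// Example kernel: a crude bloom/brighten pass over 8-bit pixel data.
static void process_chunk(std::uint8_t* data, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        data[i] = static_cast<std::uint8_t>(std::min<int>(255, data[i] + (data[i] >> 2)));
}

int main() {
    std::vector<std::uint8_t> frame(8 * 1024 * 1024, 128);  // "system RAM" image
    static std::uint8_t local[2][kChunk];                   // two buffers: work on one while the other "transfers"

    const std::size_t chunks = frame.size() / kChunk;
    std::memcpy(local[0], frame.data(), kChunk);            // prime the first "DMA in"

    for (std::size_t c = 0; c < chunks; ++c) {
        const std::size_t cur = c & 1, nxt = cur ^ 1;
        if (c + 1 < chunks)                                  // start "fetching" the next chunk
            std::memcpy(local[nxt], frame.data() + (c + 1) * kChunk, kChunk);
        process_chunk(local[cur], kChunk);                   // compute on the current chunk
        std::memcpy(frame.data() + c * kChunk, local[cur], kChunk);  // copy the result back
    }
    std::printf("processed %zu chunks of %zu KiB\n", chunks, kChunk / 1024);
}
```

On the real hardware the transfers overlap with the compute, which is what made the SPEs so effective at these DSP-style passes; the synchronous copies above are only a stand-in for that.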
 

Laiza

Member
Oct 25, 2017
2,170
That's absolutely false. Using the dev tools I show 8.5GB of VRAM being used with FS2020, Ultra settings, 2080 Ti.
Oh, I stand corrected. I still stand by my notion that no "next-gen" console game is going to be using more than FS2020, though.

I'm not sure I agree with the premise of this thread. I look back to when people were arguing just as hard as the OP that 2GB of RAM was enough at the start of last gen, and we eventually got games where 4GB wasn't enough. I personally think 10GB will be enough for the next couple of years, but I also think we'll see games pushing past the limits of 10GB before 2024. Of course, some of the new memory-hungry stuff will likely also want more powerful hardware as well. 10GB is great for Nvidia right now because it's more than enough in 2020, and it will give people a good reason to upgrade down the road.
The situation is completely different than last gen. Last gen we were pushing up both screen and texture resolution as fast as we possibly could, plus consoles were gaining something on the order of 20x as much memory as the generation before.

It'll be a while before we see games using more than 10GB, especially with Flight Simulator 2020 being the new benchmark game showing us the high end of what's possible while rendering the entirety of Planet Earth in a large, airborne sight radius.
 

F34R

Member
Oct 27, 2017
11,989
Oh, I stand corrected. I still stand by my notion that no "next-gen" console game is going to be using more than FS2020, though.


The situation is completely different than last gen. Last gen we were pushing up both screen and texture resolution as fast as we possibly could, plus consoles were gaining something on the order of 20x as much memory as the generation before.

It'll be a while before we see games using more than 10GB, especially with Flight Simulator 2020 being the new benchmark game showing us the high end of what's possible while rendering the entirety of Planet Earth in a large, airborne sight radius.
Keep moving those goalposts. ;) It's not rendering the entirety of Earth in a large airborne sight radius lol. I got up to 9GB of VRAM at an airport, while on the ground.
 

TSM

Member
Oct 27, 2017
5,821
The situation is completely different than last gen. Last gen we were pushing up both screen and texture resolution as fast as we possibly could, plus consoles were gaining something on the order of 20x as much memory as the generation before.

It'll be a while before we see games using more than 10GB, especially with Flight Simulator 2020 being the new benchmark game showing us the high end of what's possible while rendering the entirety of Planet Earth in a large, airborne sight radius.

Like I said, this sounds just like the start of last gen, when people were adamant that 2GB cards would be fine. They were good for a couple of years, which is likely to play out the same way this gen for 8GB cards. We'll see how long 10GB holds out for people in the long run. 2024 is a long way out, and there may be some cool new ways to utilize GPU resources that developers figure out, as they always do.
 

Spork4000

Avenger
Oct 27, 2017
8,488
This will always be the case. If people think console multiplat games are gonna be more demanding than Flight Simulator...


That's a bold prediction. If consoles have 13GB of usable VRAM, why wouldn't devs use it?

I'm on the side that 10GB is definitely enough for now, but it's odd to assume Flight Sim is the zenith when it's really just the beginning.
 

Korezo

Banned
Oct 25, 2017
1,145
I'll end up doing my own testing, because games like the RE2 remake tell you when you're going over your VRAM limit while changing graphics settings and give you a warning. I don't remember if I had the 1080 Ti or the 2080 Ti at the time. I remember other games do this too.
 

Darryl M R

The Spectacular PlayStation-Man
Member
Oct 25, 2017
9,721
I'm interested in how many frames you can achieve at 4K, max settings, with additional add-ons like HairWorks, in a game similar to Cyberpunk.
 

Nooblet

Member
Oct 25, 2017
13,624
I'll end up doing my own testing, because games like the RE2 remake tell you when you're going over your VRAM limit while changing graphics settings and give you a warning. I don't remember if I had the 1080 Ti or the 2080 Ti at the time. I remember other games do this too.
RE2 is actually a prime example of a game that doesn't use the VRAM stated.
So the in-game meter is actually even more inaccurate than RTSS, because RTSS at least shows you the allocated VRAM, whereas the in-game bar is just an arbitrary bar. There's zero difference in texture quality between the various "High" presets in RE2, be it 0.5GB or 8GB. And even when you go over the VRAM in that bar, the game does not stutter, which would be the first thing to happen when you approach or surpass the VRAM limit. Those two are prime evidence that it's just caching the textures and doesn't actually need that VRAM.

For more info on this, check out this video
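If anyone wants to poke at the allocation-versus-budget distinction themselves, here's a rough Windows-only sketch using DXGI's QueryVideoMemoryInfo (it reports the calling process's own residency, so an engine or in-process overlay is what would actually call it; everything beyond the DXGI calls is just illustrative):

```cpp
// Hedged sketch: query what a process actually has resident in local video
// memory vs. the OS-provided budget. Tools that report "VRAM used" often show
// allocations, which can be much larger than what a game truly needs.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;   // first adapter only, for brevity

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    // Budget: how much local video memory the OS will let this process use.
    // CurrentUsage: how much this process actually has resident right now.
    std::printf("Budget:       %.2f GiB\n", info.Budget / (1024.0 * 1024.0 * 1024.0));
    std::printf("CurrentUsage: %.2f GiB\n", info.CurrentUsage / (1024.0 * 1024.0 * 1024.0));
}
```

The gap between that CurrentUsage number, the budget, and whatever an in-game bar labels "VRAM" is exactly the allocation-versus-need gap described above.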
 

Laiza

Member
Oct 25, 2017
2,170
Keep moving those goalposts. ;) It's not rendering the entirety of Earth in a large airborne sight radius lol. I got up to 9GB of VRAM at an airport, while on the ground.
It still has to have everything loaded into memory regardless of whether or not you're on the ground. The fact that there are plenty of screenshots of the game running at significantly less VRAM usage in the air tells me that what's actually onscreen at any one point in time is less important than what assets are actually within range (and the data size/quality of those assets).

And while I'm at it, my goalposts didn't move; my goalpost this entire time has been "next-gen console games won't exceed 10GB of VRAM usage", and Flight Simulator is 100% a next-gen game, and possibly the most demanding next-gen game we'll see for quite some time. The fact that FS2020 can get close to 10GB of usage doesn't mean that there will suddenly be games that overtake it. It's a weird assumption to make when FS2020 has such a uniquely massive data set.
Like I said, this sounds just like the start of last gen, when people were adamant that 2GB cards would be fine. They were good for a couple of years, which is likely to play out the same way this gen for 8GB cards. We'll see how long 10GB holds out for people in the long run. 2024 is a long way out, and there may be some cool new ways to utilize GPU resources that developers figure out, as they always do.
Even the "next-gen" consoles don't have that much RAM to use in the first place (as the 13.5GB is used by both the CPU and GPU), so the only thing that should ever give a 3080 pause are things that are well above console settings. In other words, optional shit that you would want a 3090 for anyway.

Regardless, I'm perfectly 100% fine with 4K-level assets, and I really, really do not understand this insistence on texture resolutions too high to feasibly resolve on your screen. Get the 3090 if you really want that. It's not my or Nvidia's problem if your expectations are set too high.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,202
Dark Space
Great write-up, Darktalon

People need to stop and understand something. Saying "10GB VRAM isn't enough to last the entire console generation" is not the complete statement. What you're actually saying is, "10GB VRAM isn't enough to last the entire generation at 4K Ultra settings". The response is "Well, of course not." But that's only because the RTX 3080 itself isn't enough to last the entire console generation at 4K Ultra settings.

As the RTX 3080 ages, the compromises which will need to be made will in turn reduce the VRAM usage. Over the course of its existence, the RTX 3080 is going to transition from a native 4K card, to a 1440p card, to a 1080p card. The graphical settings it can handle will change over time as well.

Guess what all of this means? The card's 10GB VRAM is never going to be its primary bottleneck.

The video posted in this thread of the GTX 780 3GB trying to run Red Dead Redemption 2 is the perfect example. Its 3GB of VRAM is not its main issue with one of 2018's most graphically intensive games; it's the rasterization performance where it comes up short.

edit - and please keep in mind that the new consoles are not going to be running everything at native 4K. The assumption that every game built for them is going to utilize every bit of RAM isn't healthy either.
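Just to put rough numbers on the resolution point (back-of-the-envelope only; the render-target layout assumed below is illustrative, not any particular engine's actual setup):

```cpp
// Back-of-the-envelope only: how the per-frame, resolution-dependent slice of
// VRAM shrinks as a card steps down from 4K to 1440p to 1080p.
#include <cstdio>

int main() {
    struct Res { const char* name; int w, h; };
    const Res resolutions[] = { {"4K", 3840, 2160}, {"1440p", 2560, 1440}, {"1080p", 1920, 1080} };

    // Assumed bytes per pixel: 4x RGBA8 G-buffer planes (16) + D32 depth (4) + RGBA16F HDR color (8).
    const double bytes_per_pixel = 4 * 4 + 4 + 8;

    for (const Res& r : resolutions) {
        double mib = r.w * r.h * bytes_per_pixel / (1024.0 * 1024.0);
        std::printf("%-6s %4dx%-4d -> ~%6.0f MiB of render targets\n", r.name, r.w, r.h, mib);
    }
    // Textures and geometry don't scale with resolution the same way, but the
    // resolution-dependent portion of VRAM clearly shrinks along with the settings.
}
```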
 

Merv

Member
Oct 27, 2017
6,456
I still think the cards should have been:

3070 - 10GB

3080 - 12GB

3090 - 24GB

It would make them way more appealing, regardless of whether it's explicitly needed or not.

I still have my 1070, and upgrading to the same amount of VRAM after skipping a gen feels shitty. I understood the 2070 more, as it was within the same console generation.
 
Oct 25, 2017
41,368
Miami, FL
I still think the cards should have been:

3070 - 10GB

3080 - 12GB

3090 - 24GB

It would make them way more appealing, regardless of whether it's explicitly needed or not.
You're implying they're going to have trouble "appealing" with these cards? lol

Feel free to wait. There will be a long line ready to take your place. These cards will be slow and tired long before memory becomes an actual limiting factor.

Great write-up, Darktalon

People need to stop and understand something. Saying "10GB VRAM isn't enough to last the entire console generation" is not the complete statement. What you're actually saying is, "10GB VRAM isn't enough to last the entire generation at 4K Ultra settings". The response is "Well, of course not." But that's only because the RTX 3080 itself isn't enough to last the entire console generation at 4K Ultra settings.

As the RTX 3080 ages, the compromises which will need to be made will in turn reduce the VRAM usage. Over the course of its existence, the RTX 3080 is going to transition from a native 4K card, to a 1440p card, to a 1080p card. The graphical settings it can handle will change over time as well.

Guess what all of this means? The card's 10GB VRAM is never going to be its primary bottleneck.

The video posted in this thread of the GTX 780 3GB trying to run Red Dead Redemption 2 is the perfect example. Its 3GB of VRAM is not its main issue with one of 2018's most graphically intensive games; it's the rasterization performance where it comes up short.

edit - and please keep in mind that the new consoles are not going to be running everything at native 4K. The assumption that every game built for them is going to utilize every bit of RAM isn't healthy either.
ITT: people who have been gaming for most of their lives suddenly find out how PC GPUs (and by extension, PC gaming) evolve over the course of a decade and have to have it explained to them like toddlers.

To hear this thread tell it, people here would be of the mind that someone who bought a GPU in 2013 would have expected it to still be running 2020's latest and greatest at 1080p. Hint: they didn't. Power and performance became a hindrance for those PC gamers far sooner than memory did.
 

Jedi2016

Member
Oct 27, 2017
15,622
I had a feeling it was something like this. NVidia wouldn't have bottlenecked their own flagship card, and they probably know a lot more about actual usage than we do.

I said something similar about the Xbox Series S and how it could play games at Xbox One X levels despite being "less powerful", because Microsoft's engineers know a lot more about how all those games were actually running on the One X than we do, and probably came to a similar conclusion, that all that power and memory wasn't actually needed to run those games.
 

Merv

Member
Oct 27, 2017
6,456
You're implying they're going to have trouble "appealing" with these cards? lol

Feel free to wait. There will be a long line ready to take your place. These cards will be slow and tired long before memory becomes an actual limiting factor.
I mean, I said way more appealing, but I guess you had to get your snark on.

Paying $100 more for the same series with the same amount of VRAM feels shitty, that's all. Thank you for clarifying that there is a long line of people willing to take my place, though. It's very helpful and doesn't come off as defensive at all.
 

Deleted member 22585

User requested account closure
Banned
Oct 28, 2017
4,519
EU
I mean, I said way more appealing, but I guess you had to get your snark on.

Paying $100 more for the same series with the same amount of VRAM feels shitty, that's all. Thank you for clarifying that there is a long line of people willing to take my place, though. It's very helpful and doesn't come off as defensive at all.

It might be the same amount of VRAM, but it's better VRAM :-)
 

Nzyme32

Member
Oct 28, 2017
5,245
I still think the cards should have been:

3070 - 10GB

3080 - 12GB

3090 - 24GB

It would make them way more appealing, regardless of whether it's explicitly needed or not.

I still have my 1070, and upgrading to the same amount of VRAM after skipping a gen feels shitty. I understood the 2070 more, as it was within the same console generation.
You're implying they're going to have trouble "appealing" with these cards? lol

Feel free to wait. There will be a long line ready to take your place. These cards will be slow and tired long before memory becomes an actual limiting factor.


ITT: people who have been gaming for most of their lives suddenly find out how PC GPUs (and by extension, PC gaming) evolve over the course of a decade and have to have it explained to them like toddlers.

To hear this thread tell it, people here would be of the mind that someone who bought a GPU in 2013 would have expected it to still be running 2020's latest and greatest at 1080p. Hint: they didn't. Power and performance became a hindrance for those PC gamers far sooner than memory did.

Have to agree. As with anything, if you have no confidence, then wait. The rest of the folks who have no such concerns and are confident it will serve their purposes will be just fine. I'm confident I'll be fine with 4K and other DLSS/RTX solutions, flexibly adjusting as needed over the next several years.
 

vitormg

Member
Oct 26, 2017
1,928
Brazil
I have a question.

Do cards with different VRAM bandwidths result in different amounts of VRAM being used? I assumed the fast GDDR6X of the 3080 would result in less consumption compared to slower memory in identical scenarios. Am I wrong?
 

Merv

Member
Oct 27, 2017
6,456
It might be the same amount of VRAM, but it's better VRAM :-)

I'm sure it's fine, and I will end up with a 3070. It's just one of those things when you're looking at spending $500 on a GPU, which is about how much I spent on a 3700X, 32GB of 3600MHz RAM, and an X570. I've been waiting for RTX to play Control and Metro.
 
Oct 27, 2017
6,302
I don't think most PC gamers will want to limit themselves to console IQ and settings.

This is where stuff like "Era bubble" comes from.

You should check some stats on that use of "most", even against the current 2xxx cards, never mind people rushing for 3xxx. I think chasing raw power is substantially less important to the PC market in general than enthusiast communities think it is. There are other perks to this platform.
 

Laiza

Member
Oct 25, 2017
2,170
I have a question.

Do cards with different VRAM bandwidths result in different amounts of VRAM being used? I assumed the fast GDDR6X of the 3080 would result in less consumption compared to slower memory in identical scenarios. Am I wrong?
This is correct. Higher bandwidth means more RAM being used more effectively. For gaming, low-bandwidth high-capacity VRAM is actually completely useless in most real-world scenarios. Bandwidth has to keep up for that VRAM to be of any use.
 

Serious Sam

Banned
Oct 27, 2017
4,354
User Banned (1 Week): Antagonizing Other Users and Inflammatory Commentary; Prior Infractions for the Same
This thread is leaving a bad taste for me. I think the true intention here is to silence criticism and to make those who are about to buy an expensive thing feel good about buying that expensive thing. The RTX 3070 and 3080 did stagnate in VRAM capacity, and it is a hot topic all around the web. If this wasn't important and it didn't matter to people, it wouldn't be discussed here on ERA, as well as on all the hardware forums, reddit, twitter, etc.

Also, this Nvidia worshipping is cringe. There are way too many posts proclaiming that Nvidia is a multibillion-dollar company that knows their stuff and can do no wrong. Yeah, sure, let's just forget all the blunders Nvidia has made: the 970 3.5GB VRAM fiasco, the delayed DX12 adoption and then taking years to catch up to AMD, the delayed Vulkan adoption and again being outmatched by AMD until recently. Heck, even today AMD excels in both DX12 and Vulkan, beating more expensive Nvidia cards. And those are just recent events. There was a lot more Nvidia fuckery in the past.

Nvidia knows perfectly well that 10GB is sufficient for the cross-gen period until next summer. They also know full well that in 10 months they can release SUPER variants of these GPUs with double the VRAM and market them as being truly next-gen ready.
 
OP
Darktalon

Member
Oct 27, 2017
3,265
Kansas
This is correct. Higher bandwidth means more RAM being used more effectively. For gaming, low-bandwidth high-capacity VRAM is actually completely useless in most real-world scenarios. Bandwidth has to keep up for that VRAM to be of any use.
I have a question.

Do cards with different VRAM bandwidths result in different amounts of VRAM being used? I assumed the fast GDDR6X of the 3080 would result in less consumption compared to slower memory in identical scenarios. Am I wrong?
Yup, and this is why products like the 750 Ti 4GB and the 760 4GB were so ridiculous. Instead of putting that money towards a 770 2GB, they had all this extra, extremely slow VRAM. Kepler in general was just a bad time for Nvidia.
 

bobeth

Member
Oct 28, 2017
1,302
Without really being able to dive into it without NDAs, you're going to be pretty wrong here. Not entirely wrong, but mostly wrong. 😬
I owned a 2GB 670, a 3GB 780, and a 4GB 980 that all ended up being short on VRAM before their replacements were released; I'm going to believe that over any vague answer you could provide.
 

Nooblet

Member
Oct 25, 2017
13,624
Worth noting is that Flight Sim is a next-gen game with last-gen I/O and data management. That in itself means that if it were made with SSD speeds in mind, it wouldn't require as much VRAM. There's a toy sketch of that streaming idea below.

When I say next-gen I mean in terms of rendering and assets. Sure, it doesn't use RT, but plenty of next-gen games won't either. And while there will be other games that do use RT, I doubt they will do a ton of it, let alone do it alongside the kind of really complex simulation that Flight Sim is doing.
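Purely as an illustration of that streaming argument (a toy, budget-bounded LRU cache; every name and number below is made up, not from any real engine):

```cpp
// Hedged sketch: why fast storage lets a game keep a smaller resident set.
// A budget-bounded LRU cache evicts the least-recently-used assets, assuming
// a fast SSD can re-stream them quickly whenever they're needed again.
#include <cstdio>
#include <list>
#include <string>
#include <unordered_map>

class StreamingCache {
public:
    explicit StreamingCache(size_t budget_bytes) : budget_(budget_bytes) {}

    // "Touch" an asset: load it if absent, mark it most-recently-used, evict to fit.
    void request(const std::string& name, size_t bytes) {
        auto it = index_.find(name);
        if (it != index_.end()) {                       // already resident: just bump recency
            lru_.splice(lru_.begin(), lru_, it->second);
            return;
        }
        lru_.push_front({name, bytes});                 // pretend this is the SSD -> VRAM upload
        index_[name] = lru_.begin();
        used_ += bytes;
        while (used_ > budget_) {                       // over budget: evict the coldest assets
            const auto& victim = lru_.back();
            std::printf("evict %s (%zu MiB)\n", victim.name.c_str(), victim.bytes >> 20);
            used_ -= victim.bytes;
            index_.erase(victim.name);
            lru_.pop_back();
        }
    }

    size_t used() const { return used_; }

private:
    struct Entry { std::string name; size_t bytes; };
    size_t budget_;
    size_t used_ = 0;
    std::list<Entry> lru_;
    std::unordered_map<std::string, std::list<Entry>::iterator> index_;
};

int main() {
    StreamingCache cache(512ull << 20);                 // 512 MiB texture budget for the demo
    cache.request("runway_albedo", 200ull << 20);
    cache.request("terrain_tile_042", 200ull << 20);
    cache.request("cockpit_detail", 200ull << 20);      // forces eviction of the coldest entry
    std::printf("resident: %zu MiB\n", cache.used() >> 20);
}
```

The faster the storage, the cheaper a cache miss is, so the resident set in VRAM can shrink without hitching.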
 

Gitaroo

Member
Nov 3, 2017
7,987
Yeah... although DLSS is amazing tech and also saves VRAM, people shouldn't count on it too much, because not every single game will support it.
 

Deleted member 15395

Unshakable Resolve
Banned
Oct 27, 2017
3,145
I fully expect a 3080 Ti to launch at some point next year with significantly more VRAM. I agree with those who say it will be enough for the first batch of next-gen games, but it will probably start to fall short as the gen goes on.
 
OP
Darktalon

Member
Oct 27, 2017
3,265
Kansas
While I appreciate the work put into the OP, and I mean no disrespect with this, I have serious concerns about his qualifications to speak with such certainty about what will be required years from now.
True, I am not here to provide my credentials and say "believe what I say", but I have documented every one of my sources and laid out the logic and reasoning behind my conclusions. I have no problem with people disagreeing or presenting information that conflicts with mine; I welcome it! If people have evidence to present, I want them to! Discussion about this is very important!

But so far I haven't seen anyone present any conflicting evidence; the closest we've gotten is demonstrating that a next-gen game at 4K ultra is hitting 9GB. And that is without using any DLSS or RTX IO features, which would decrease the usage.
And FS2020 is absolutely a next-gen game, which is why it is such a good example. The trolls have been acting like this game runs on their laptops.
 
Jan 21, 2019
2,902
I would argue that Control is a next-gen game on PC right now. As far as I'm concerned, DLSS 2.0 + Ray Tracing = next gen on PC (not saying those features are required for a game to be "next-gen", just saying they're sufficient). Cyberpunk, Watch Dogs: Legion, COD 2020: all of these games will be a noticeably better experience on PC with Turing and above than on either next-gen console, simply because of the presence of DLSS 2 and RTX features.

As for this ridiculous ongoing VRAM argument: what it really boils down to is that the people who are whining about 10GB in the 3080 are basically saying that Nvidia doesn't know what they're doing. That's just a goofy notion.

Nvidia knows exactly what they are doing: release these RAM-starved cards first, before AMD, then release Super or Ti variants once AMD releases their cards, to counter AMD's cards with adequate VRAM. It's obvious.
 

BreakAtmo

Member
Nov 12, 2017
12,828
Australia
The Cell's SPEs aren't really all that well suited to traditional rendering, so what was originally planned would likely have been a dramatic departure from the conventional rendering pipeline. In any event, that was completely scrapped. What they are very good at is DSP-style tasks, working on an isolated chunk of memory copied from system RAM in a very efficient manner, and then copying the result back while working on the next chunk. This would be best suited to post-processing effects that would otherwise be really bandwidth constrained on the GPU, like applying a gaussian blur for depth of field, or adding bloom to an otherwise complete image.

That all feels like something of an aside from the message you were replying to, though, which was focused on VRAM. The SPEs wouldn't act as an extension of VRAM but rather as a temporary copy, something like cache but with much more explicit management. It didn't amount to a ton in any case, as 256KiB per SPE added up to a grand total of 2MiB. I concur that the leap in RAM/VRAM, both in size and bandwidth, from PS3 to PS4 was huge and one of the bigger steps forward for the generation alongside the GPU itself.

Ahhh, OK. I was under the impression that the Cell/XDR was being used for a bunch of tasks that would normally be handled by the GPU and VRAM, but now I see that's not exactly the case. Thank you.

And yeah, I remember when that 8GB was announced and blew everyone's minds. I'm glad they went for that.