
RSTEIN

Member
Nov 13, 2017
1,870
I have the Gears 5 4K ultra texture pack installed. My 3080 handles that no problem. I'm doubtful that Godfall--unless it's really poorly optimized--is going to blow through 10gb.
 

GreyHand23

Member
Apr 10, 2018
413
I have the Gears 5 4K ultra texture pack installed. My 3080 handles that no problem. I'm doubtful that Godfall--unless it's really poorly optimized--is going to blow through 10gb.

Not sure why a game that was released on current gen consoles is relevant. Standards might be shifting with games that are made purely for next gen.
 

Qudi

Member
Jul 26, 2018
5,318
10GB for 3080 will be fine. The 8GB of the 3070 when playing in 4k on the other hand...
 

kostacurtas

Member
Oct 27, 2017
9,060
4K textures are 4K textures. Are 4K textures going to be more 4K this coming generation?
I agree 4K textures are not something new at all for PC gaming, but 4K plus ray tracing is new, and Godfall has both.

It is still very new but we have to keep in mind that ray tracing will increase the VRAM requirements.
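Rough back-of-the-envelope on where the extra VRAM goes when RT is enabled (illustrative figures only; the BVH size in particular is an assumed ballpark, not a measurement from any game):

```python
# Illustrative estimate of the extra VRAM ray tracing tends to add at 4K.
# The render-target math is exact for the assumed formats; the
# acceleration-structure figure is an assumed ballpark, not measured.

PIXELS_4K = 3840 * 2160                 # ~8.3 million pixels

# Denoiser history / hit buffers: assume four extra full-res RGBA16F
# targets (8 bytes per pixel each).
extra_targets = 4
one_target_mib = PIXELS_4K * 8 / 2**20  # one RGBA16F target in MiB
denoise_mib = extra_targets * one_target_mib

# BVH acceleration structures (BLAS + TLAS): scene dependent; assume
# a few hundred MiB for a detailed scene.
bvh_mib = 500

print(f"extra render targets:              {denoise_mib:.0f} MiB")
print(f"acceleration structures (assumed): {bvh_mib} MiB")
print(f"rough RT overhead:                 {(denoise_mib + bvh_mib) / 1024:.1f} GiB")
```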
 

Dolce

Member
Oct 25, 2017
14,235

I mean, I suppose I could be wrong, but the Series X and PlayStation 5 GPUs are extremely powerful and fast compared to what came before, along with new feature sets. Gears of War 5 was ported to the PC, but the base game was an Xbox One title. They're of course updating it, but an update is rarely going to be the same as a game built from scratch with different expectations in mind. Even without ray tracing, they have a lot more base power to work with when it comes to lighting.
 

FuturaBold

Member
Oct 27, 2017
2,518
Can the PS5 handle the 4K-by-4K textures since it has crazy bandwidth? I thought there was something mentioned in the PS5 Unreal demo that movie-quality assets are possible; however, that was at 1440p.
 

dgrdsv

Member
Oct 25, 2017
11,846
I have the Gears 5 4K ultra texture pack installed. My 3080 handles that no problem. I'm doubtful that Godfall--unless it's really poorly optimized--is going to blow through 10gb.
There is a difference between "blowing through" and "allocating a crapload of VRAM just 'cause we can and AMD asked us to".
And the latter won't have any effect on performance on 8-10GB cards either. Hell, even 6GB cards will probably do fine VRAM wise in this one.

The Unreal Engine 5 tech demo that was running on PlayStation 5 had 8K textures.
It had 8K textures in storage, from which it streamed in portions and detail levels according to rendering needs.
And it was running fine on a Turing-class GPU with 8GB, according to Epic themselves.
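Rough numbers on why that works (a sketch of how a virtual-texturing budget behaves in general, with assumed tile and pool sizes, not Epic's actual figures):

```python
# Why "8K textures in storage" doesn't mean 8K textures in VRAM:
# with virtual texturing, only the currently visible tiles live in a
# fixed-size pool, so resident memory depends on the pool size, not on
# the source assets. Tile and pool sizes below are assumptions.

TILE = 128                       # tile edge in texels
BYTES_PER_TEXEL = 1              # BC7-class compression, ~8 bits/texel
tile_bytes = TILE * TILE * BYTES_PER_TEXEL

# One 8K x 8K source texture sitting on the SSD:
src_bytes = 8192 * 8192 * BYTES_PER_TEXEL
print(f"one 8K source texture: {src_bytes / 2**20:.0f} MiB in storage")

# A streaming pool of, say, 16k resident tiles:
pool_tiles = 16_384
pool_bytes = pool_tiles * tile_bytes
print(f"resident tile pool:    {pool_bytes / 2**20:.0f} MiB in VRAM")

# The pool stays the same size whether the source art is 4K, 8K or 16K;
# higher-res sources just give the streamer finer detail to pull from.
```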
 

RSTEIN

Member
Nov 13, 2017
1,870
Not sure why a game that was released on current gen consoles is relevant. Standards might be shifting with games that are made purely for next gen.

You're technically correct that it's a "last gen" game but the update came out about a year ago IIRC. At ultra with the texture pack a 2080ti struggles to get a constant 60fps. While it does lack RT, Gears 5 PC is still one of the best graphical showcases going with huge levels. At 4k with the texture pack we're talking 6gb of VRAM used. RTX will be expensive but I don't see anything in Godfall or any game on the horizon (including CP77) that would lead me to believe 10gb is going to be insufficient for 4k gaming. 2023 and beyond could be a different story but by then my 3080 will be long gone.
 
Oct 26, 2017
3,326
RTX 3090 purchase vindicated............. JK it was still one of the dumbest things I've ever done in my life haha.
[image: s3Adb1J.jpg]

PC looks sweet though.
 

Laiza

Member
Oct 25, 2017
2,170
What the fuck are you on about? We are talking about vRAM; that's the issue at hand. What in my post disparages any GPU or what makes them good, apart from saying devs may up vRAM requirements in the coming year or two? Are people so invested in their new purchase that they feel the need to talk trash about a post that doesn't fit what they want to hear? Your 3080 is fabulous, amazing with great RT and raster performance. Better? Did I really need to state that while making an observation on vRAM?

Absolutely pathetic retort to my post. Of course it goes without saying that any modern card has a plethora of great features that make games run. Here's some truth: one of those features is vRAM. Here's some more truth: that may not be the best amount going forward. I'm not saying the 3080 is suddenly shit, it's obviously a fantastic card, just that Nvidia has seemingly made a choice to meet a price point that devs may push beyond sooner rather than later. We'll see.

Read what the post says again and don't be so butthurt because I'm saying 10GB may not cut it for the optimal settings in 12 to 18 months.
The point is that you're likely going to have to turn down some settings anyway, regardless of whether or not it's because of VRAM count or simply because *the GPU isn't powerful enough to maintain high frame rates with every feature maxed out*. This entire thread is stupid and pointless because a TON of people are ignoring the fact that every GPU ages poorly past a certain point and the fact that VRAM is only one (frankly, relatively minor) aspect of that overall performance profile.

That's not even touching the fact that DLSS can easily allow the 3080 to completely wreck any AMD offering in any game that supports it, something that waaaay too many people are discounting just because it's not supported in 100% of games right now. The whole thing looks incredibly vapid from my perspective and, frankly, I'm tired of posts entertaining these narrow-minded views.

It was obvious that you needed at least 12 GB of VRAM to survive next-gen. Now even 8K textures are coming with UE5, and it will be even more problematic. :(
You mean that same UE5 demo that ran on a PS5 with less than 10GB of usable VRAM (remember, a bunch of the RAM is used for the CPU as well)?

I don't get where you people come from on things like this. I mean, other than pure ignorance.
 

amc

Member
Nov 2, 2017
241
United Kingdom
User Banned (5 Days) - Antagonizing Others
The point is that you're likely going to have to turn down some settings anyway, regardless of whether or not it's because of VRAM count or simply because *the GPU isn't powerful enough to maintain high frame rates with every feature maxed out*. This entire thread is stupid and pointless because a TON of people are ignoring the fact that every GPU ages poorly past a certain point and the fact that VRAM is only one (frankly, relatively minor) aspect of that overall performance profile.

That's not even touching the fact that DLSS can easily allow the 3080 to completely wreck any AMD offering in any game that supports it, something that waaaay too many people are discounting just because it's not supported in 100% of games right now. The whole thing looks incredibly vapid from my perspective and, frankly, I'm tired of posts entertaining these narrow-minded views.


You mean that same UE5 demo that ran on a PS5 with less than 10GB of usable VRAM (remember, a bunch of the RAM is used for the CPU as well)?

I don't get where you people come from on things like this. I mean, other than pure ignorance.


Oh, so you're just a 3080 owning fanboy who doesn't really know what he's talking about but must rebuff any discussion on vRAM amounts going forward because it upsets him. Got it. Narrow minded views lol.

Jog on.
 

Mathiassen

The Fallen
Oct 31, 2017
257
Man, technical specifications should be hidden from the user. It sucks that they're used for marketing.
 

Buggy Loop

Member
Oct 27, 2017
1,232
Yeah... next-gen console tech and game engines are not heading toward a brute-force approach where they go "oh wow, BIG memory, just make 8K x 8K textures" or some ridiculously high VRAM requirement. A few first-gen games will clumsily do that, but it's not where things are headed. The brute-force way of rendering at full resolution, storing as many assets as possible in VRAM, and using raw textures without compression is not where we are headed.

DirectML (I'll broadly refer to the DX12 Ultimate APIs rather than Nvidia's solutions, even if they are amazing with DLSS) means game engines won't have to brute force it anymore: ML upscaling, texture compression, an I/O system with DirectStorage, even AI that can drastically raise a texture's resolution on the fly from a lower-resolution one.

Showing a game that proudly boasts about brute-forcing its way into good (well, okay) graphics is kind of cute, because soon we'll be somewhere else entirely in rendering techniques.
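Just to put a number on the compression point (straightforward arithmetic, not figures from any particular engine):

```python
# VRAM for a single 4096x4096 colour texture, with a full mip chain
# (mips add roughly 1/3 on top of the base level).

texels = 4096 * 4096
mip_factor = 4 / 3

raw_rgba8 = texels * 4 * mip_factor   # uncompressed, 32 bits/texel
bc7       = texels * 1 * mip_factor   # BC7-class, 8 bits/texel

print(f"raw RGBA8: {raw_rgba8 / 2**20:.0f} MiB")  # ~85 MiB
print(f"BC7:       {bc7 / 2**20:.0f} MiB")        # ~21 MiB

# Block compression alone cuts a "4K texture" to a quarter of the raw
# size, before any of the streaming / ML tricks above even apply.
```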
 

dgrdsv

Member
Oct 25, 2017
11,846
The PS5 UE5 demo also only ran at 1440p.
Going to 4K won't have a huge impact on VRAM utilization. The difference between a 1440p and 4K set of render targets is around 250-500 MB usually and even with higher detailed MIPs and such it's hardly more than 1 GB. An illustration:

[image: vram.png]


It should also be noted that WDL is a current gen game at its base, meaning that it doesn't rely on SSD streaming as future titles will and likely tends to precache a lot more data into VRAM than would be necessary with NVMe storage.
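If anyone wants to sanity-check the 250-500 MB estimate above, the math is simple (a sketch assuming a fairly typical deferred render-target layout; actual layouts vary per engine):

```python
# Rough render-target footprint at 1440p vs 4K.
# Assumed layout: four G-buffer targets at 4 bytes/px, one HDR colour
# target at 8 bytes/px, depth at 4 bytes/px. Engines differ, but this
# is in the right ballpark.

def rt_mib(width, height, bytes_per_pixel_total):
    return width * height * bytes_per_pixel_total / 2**20

BPP = 4 * 4 + 8 + 4   # 28 bytes per pixel across all targets

mib_1440p = rt_mib(2560, 1440, BPP)
mib_4k    = rt_mib(3840, 2160, BPP)
print(f"1440p targets: {mib_1440p:.0f} MiB")   # ~98 MiB
print(f"4K targets:    {mib_4k:.0f} MiB")      # ~222 MiB
print(f"difference:    {mib_4k - mib_1440p:.0f} MiB")

# Even doubling all of this for temporal history buffers and the post
# stack keeps the 1440p -> 4K delta in the few-hundred-MiB range.
```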
 

Alexx

Member
Oct 27, 2017
237
Going to 4K won't have a huge impact on VRAM utilization. The difference between a 1440p and 4K set of render targets is around 250-500 MB usually and even with higher detailed MIPs and such it's hardly more than 1 GB. An illustration:

[image: vram.png]


It should also be noted that WDL is a current gen game at its base, meaning that it doesn't rely on SSD streaming as future titles will and likely tends to precache a lot more data into VRAM than would be necessary with NVMe storage.

So for PC games that are not utilizing SSD streaming, which Godfall presumably isn't since an SSD is not required, wouldn't utilizing more than 10GB be plausible? From this performance review, WDL utilizes 8925MB (~8.7GB) of VRAM at 4K max + RT and without HD textures. Considering we're talking about 4K textures in Godfall's case, exceeding 10GB of VRAM doesn't sound impossible.
 

Laiza

Member
Oct 25, 2017
2,170
Oh, so you're just a 3080 owning fanboy who doesn't really know what he's talking about but must rebuff any discussion on vRAM amounts going forward because it upsets him. Got it. Narrow minded views lol.

Jog on.
First off... my pronouns are right there, dude. Wtf? Funny thing is that "he" is just about the least accurate way to refer to me (I used to go by "she" before switching to the gender-neutral) so, um, good job?

Secondly, if you're going to contribute to this idiocy then I'm going to call you out. Start shit get hit, etc. You do realize that the entire reason this thing became an issue is because of people reaching super fucking hard for AMD to look competitive in any way, shape, or form, right?

Mind you, it's a good thing if AMD is competitive... but this isn't "competition", this is them spreading FUD and obfuscating reality to make themselves look good because they don't have any other ways to compete. I would be very happy if they were actually showing that they can keep up with Nvidia, but spreading misinformation to get people riled up is not the way to do it.

As long as they keep relying on peddling bullshit I'm not going to pretend to be happy about it, you can count on that.
 

Serious Sam

Banned
Oct 27, 2017
4,354
Going to 4K won't have a huge impact on VRAM utilization. The difference between a 1440p and 4K set of render targets is around 250-500 MB usually and even with higher detailed MIPs and such it's hardly more than 1 GB. An illustration:

[image: vram.png]


It should also be noted that WDL is a current gen game at its base, meaning that it doesn't rely on SSD streaming as future titles will and likely tends to precache a lot more data into VRAM than would be necessary with NVMe storage.
Hardware Unboxed observed reduced performance (stuttering) in Legion with 6GB cards at 1440p High textures and with 8GB cards at 1440p Ultra textures. Also should be noted, there's a big drop in quality once you go from Ultra to High.
 

mordecaii83

Avenger
Oct 28, 2017
6,858
Hardware Unboxed observed reduced performance (stuttering) in Legion with 6GB cards at 1440p High textures and with 8GB cards at 1440p Ultra textures. Also should be noted, there's a big drop in quality once you go from Ultra to High.
Couldn't that also be due to higher CPU load from needing to transfer more data to the GPU especially if it's compressed? I recall recent AC games having performance impacts from streaming in data.
 
Feb 11, 2019
166
Every single time this VRAM conversation comes up, the people who argue for more VRAM are vindicated at most a few years later, well within the lifetime of the cards.

Yet it happens every generation like clockwork.
 

Spoit

Member
Oct 28, 2017
3,976
There were many claims that 4GB cards would be more than fine for the whole of the PS4/Xbox One generation. They aged like milk.
My 4GB 290X was fine for the vast majority of last gen, even at 4K. I had to turn settings down due to lack of compute power way before I hit a RAM wall. The only game that really ran out of RAM was Alyx, which is kinda idiosyncratic and probably shouldn't be used to extrapolate any larger trends.
 

dgrdsv

Member
Oct 25, 2017
11,846
So for PC games that are not utilizing SSD streaming, which Godfall presumably isn't since an SSD is not required, wouldn't utilizing more than 10GB be plausible? From this performance review, WDL utilizes 8925MB (~8.7GB) of VRAM at 4K max + RT and without HD textures. Considering we're talking about 4K textures in Godfall's case, exceeding 10GB of VRAM doesn't sound impossible.
Anything is plausible; the question is how often that would actually happen. With 99% of GPUs currently available being 6 to 12 GB ones, how likely is it that devs will just say "eh, fuck it" and start requiring a 16GB one two months from now? A rhetorical question.

And "4K textures" doesn't mean much really. I wish people would stop using that as some indication of a huge VRAM requirement.

Hardware Unboxed observed reduced performance (stuttering) in Legion with 6GB cards at 1440p High textures and with 8GB cards at 1440p Ultra textures. Also should be noted, there's a big drop in quality once you go from Ultra to High.
Reduced compared to what exactly? What card do we have currently which is the same in all aspects but VRAM quantity?
 

Alexx

Member
Oct 27, 2017
237
Anything is plausible; the question is how often that would actually happen. With 99% of GPUs currently available being 6 to 12 GB ones, how likely is it that devs will just say "eh, fuck it" and start requiring a 16GB one two months from now? A rhetorical question.

And "4K textures" doesn't mean much really. I wish people would stop using that as some indication of a huge VRAM requirement.
I don't think anyone is saying games will require more than 10GB of VRAM. I do think it is possible for some games to exceed 10GB of VRAM at max settings. WDL for example is already getting close. The obvious solution would be to lower settings, which isn't really a big deal, but recent 3080 buyers probably weren't expecting to have to do this two months after launch.

4K textures require more memory than non-4K textures is all I'm saying.
 
Nov 14, 2017
2,322
4K textures require more memory than non-4K textures is all I'm saying.
It's entirely possible for a scene in a game to have lots of texture data (and high resolution individual textures) without the developers including an extra high res texture pack or talking about "4K textures" in their PR. If you're comparing the same scene, obviously the higher resolution the textures the more memory is required. But you can't compare between games (or even between different scenes in the same game) without knowing the details of all the relevant texture data.
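As a concrete illustration of why the label alone tells you very little (assumed counts and formats, just to show the arithmetic):

```python
# Total texture memory depends on how many unique textures are resident
# and how they're compressed, not just on the "4K textures" label.

def texture_mib(edge, bytes_per_texel, mips=True):
    base = edge * edge * bytes_per_texel
    return base * (4 / 3 if mips else 1) / 2**20

one_4k_bc7 = texture_mib(4096, 1)   # ~21 MiB per 4K texture
one_2k_bc7 = texture_mib(2048, 1)   # ~5 MiB per 2K texture

# Two hypothetical scenes that end up in the same place:
print(f"50  x 4K BC7 resident: {50 * one_4k_bc7 / 1024:.1f} GiB")
print(f"200 x 2K BC7 resident: {200 * one_2k_bc7 / 1024:.1f} GiB")
```

Both land at roughly a gigabyte, which is why comparing games on the texture-pack label alone goes nowhere.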
 

Alexx

Member
Oct 27, 2017
237
It's entirely possible for a scene in a game to have lots of texture data (and high resolution individual textures) without the developers including an extra high res texture pack or talking about "4K textures" in their PR. If you're comparing the same scene, obviously the higher resolution the textures the more memory is required. But you can't compare between games (or even between different scenes in the same game) without knowing the details of all the relevant texture data.

Agreed.
 

Darktalon

Member
Oct 27, 2017
3,265
Kansas
Digital Foundry is gonna have a field day with this one.
No they won't, because they, just like EVERY REVIEWER, still don't take per-process VRAM usage into account.
Hardware Unboxed, linked a few posts back? Not taking it into account.
TheFPSReview, linked a few posts back? Not even using ANY kind of external monitoring, just going off the built-in game one.

How can we have a conversation or a discussion about this when nobody is even using the right tools to begin with?
 

Jedi2016

Member
Oct 27, 2017
15,614
No they won't, because they, just like EVERY REVIEWER, still don't take per-process VRAM usage into account.
Hardware Unboxed, linked a few posts back? Not taking it into account.
TheFPSReview, linked a few posts back? Not even using ANY kind of external monitoring, just going off the built-in game one.

How can we have a conversation or a discussion about this when nobody is even using the right tools to begin with?
Ping Dictator, make sure he's aware of it. I don't see any reason he wouldn't use it; we know he uses RTSS in practically every PC performance video. Not every game will display it, but even that's something he could mention.
 

Darktalon

Member
Oct 27, 2017
3,265
Kansas
Ping Dictator , make sure he's aware of it. I don't see any reason he wouldn't use it, we know he uses RTSS in practically every PC performance video. Not every game will display it, but even that's something he could mention.
I've spoken to him about it on Discord; pretty sure it's just a case of him being extremely busy and it fell through the cracks.
He absolutely could start a new trend if he highlighted it though.
https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/
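If anyone wants a quick look at per-process numbers without an overlay, something like this works (a sketch assuming a Windows 10+ machine where the "GPU Process Memory" performance counters are available; it reports the same kind of per-process dedicated VRAM the Afterburner plugin above displays):

```python
# Dump per-process dedicated VRAM via Windows' "GPU Process Memory"
# performance counters (assumes Windows 10+ where these counters exist).
import subprocess

out = subprocess.run(
    ["typeperf", r"\GPU Process Memory(*)\Dedicated Usage", "-sc", "1"],
    capture_output=True, text=True, check=True,
).stdout

# typeperf prints a CSV header row (one column per process instance)
# followed by a row of byte values; show anything over ~100 MiB.
lines = [line for line in out.splitlines() if line.strip()]
header = lines[0].split('","')
values = lines[1].split('","')

for name, val in zip(header[1:], values[1:]):
    clean_name = name.strip('" ')
    val = val.strip('" ')
    if not val:
        continue
    mib = float(val) / 2**20
    if mib > 100:
        print(f"{clean_name}: {mib:.0f} MiB dedicated")
```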
 