yes, we don't know or can't tell how many are hooking PCs to 4k TVs for gaming
You can tell, Steam would be counting them as monitors
Game still gonna be ass. I haven't seen such desperate game marketing since Anthem.
I have the Gears 5 4K ultra texture pack installed. My 3080 handles that no problem. I'm doubtful that Godfall--unless it's really poorly optimized--is going to blow through 10gb.
10GB for 3080 will be fine. The 8GB of the 3070 when playing in 4k on the other hand...
That's not how it works. Have my PC hooked up to 3 "monitors" but the Steam survey would only count my first one. So nearly anyone who uses a 4k TV to game on PC wouldn't be counted. Only the main monitor they use would.
I do use my 4k TV or my monitor exclusively... dunno
Not sure why a game that was released on current gen consoles is relevant. Standards might be shifting with games that are made purely for next gen.
4K textures are 4K textures. Are 4K textures going to be more 4K this coming generation?
I agree 4K textures is not something new at all for PC gaming but 4K and ray tracing is new and Godfall has both.
If you use your 4k tv as your only monitor, then it'd be detected. Most don't though so they aren't.
The game will be more demanding in other ways is the thing. There may be a larger variety of textures. Lighting of course will be more complicated even without ray tracing.
Do you know how many unique 4K textures are in one game vs the other or what other effects are using memory? We should stop pretending we are developers.
Says the person who asserted the Gears 5 VRAM comparison was irrelevant? You're also telling yourself to shut up, you know.
Can the PS5 handle the 4K by 4K textures since it has crazy bandwidth? I thought there was something mentioned in the PS5 Unreal demo that movie quality assets are possible, however that was @ 1440p.
The Unreal Engine 5 tech demo that was running on PlayStation 5 had 8K textures.
There is a difference between "blowing through" and "allocating a crapload of VRAM just cause we can and AMD asked us to".
It had 8K textures in storage from which it streamed in parts and details according to rendering needs.The Unreal Engine 5 tech demo that was running on PlayStation 5 had 8K textures.
And it was running fine on a Turing class GPU with 8GB according to Epic themselves.
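To put a rough number on that streaming point (illustrative figures only, not anything published by Epic): having 8K source textures on disk doesn't mean 8K of texture data resident in VRAM, because a MIP/virtual-texture streamer only keeps the levels the current view can actually resolve. A minimal sketch of the arithmetic:

```python
# Why 8K source assets don't require 8K textures resident in VRAM:
# a streamer uploads only the MIP levels the current view needs.
# Assumes ~1 byte per texel (block compressed); numbers are illustrative.

def mip_chain_mb(top_size: int, bytes_per_texel: float = 1.0) -> float:
    """Memory of a full MIP chain whose largest level is top_size x top_size."""
    total, size = 0, top_size
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total / (1024 ** 2)

# Fully resident 8K chain vs what's needed when the textured object only
# covers ~1000 pixels on screen (levels above 1024 add nothing visible).
print(f"full 8K chain resident:   {mip_chain_mb(8192):.0f} MB")
print(f"streamed, 1024 and below: {mip_chain_mb(1024):.1f} MB")
```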
RTX 3090 purchase vindicated............. JK it was still one of the dumbest things I've ever done in my life haha.
What the fuck are you on about? We are talking about vRAM, that's the issue at hand. What in my post disparages any GPU and what makes them good, apart from saying devs may up vRAM requirements in the coming year or two? Are people that invested in their new purchase that they feel the need to talk trash about a post that doesn't fit into what they want to hear? Your 3080 is fabulous, amazing with great RT and raster performance. Better? Did I need to really state that while making an observation on vRAM?
The point is that you're likely going to have to turn down some settings anyway, regardless of whether or not it's because of VRAM count or simply because *the GPU isn't powerful enough to maintain high frame rates with every feature maxed out*. This entire thread is stupid and pointless because a TON of people are ignoring the fact that every GPU ages poorly past a certain point and the fact that VRAM is only one (frankly, relatively minor) aspect of that overall performance profile.
That's not even touching the fact that DLSS can easily allow the 3080 to completely wreck any AMD offering in any game that supports it, something that waaaay too many people are discounting just because it's not supported in 100% of games right now. The whole thing looks incredibly vapid from my perspective and, frankly, I'm tired of posts entertaining these narrow-minded views.
Absolutely pathetic retort to my post. Of course it goes without saying any modern card has a plethora of great features that make games run, here's some truth, one of those features is vRAM, here's some more truth, that may not be the best amount for going forward. Not saying that suddenly the 3080 is shit, it's obviously a fantastic card, just that Nvidia has seemingly made a choice to meet a price point that devs may push beyond sooner rather than later. We'll see.
Read what the post says again and don't be so butt hurt because I'm saying 10GB may not cut it for the optimal settings in 12 to 18 months.
It was obvious that you needed at least 12 GB of VRAM to survive next-gen. Now even 8K textures will come with UE5 and it will be even more problematic. :(
You mean that same UE5 demo that ran on a PS5 with less than 10GB of usable VRAM (remember, a bunch of the RAM is used for the CPU as well)?
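Rough arithmetic behind the "less than 10GB usable" point. The 16 GB total is the only hard number here; the OS reservation and the CPU-side share are assumptions (commonly cited ballparks, not official figures):

```python
# PS5 unified-memory budget, back of the envelope. Only the 16 GB total
# is official; the other two lines are rough, commonly cited estimates.
total_gddr6_gb   = 16.0
os_reserved_gb   = 2.5   # assumed OS/system reservation
game_cpu_side_gb = 4.0   # assumed game code, logic, audio, streaming buffers

gpu_budget_gb = total_gddr6_gb - os_reserved_gb - game_cpu_side_gb
print(f"roughly {gpu_budget_gb:.1f} GB left for GPU resources")  # ~9.5 GB
```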
I don't get where you people come from on things like this. I mean, other than pure ignorance.
Developers always say stuff. Let's wait for reviewers to benchmark the damn thing before jumping to conclusions, yea?
The people who actually coded the game and know its limits vs a bunch of online bloggers. Having a hard time choosing who would know best.
Going to 4K won't have a huge impact on VRAM utilization. The difference between a 1440p and 4K set of render targets is around 250-500 MB usually and even with higher detailed MIPs and such it's hardly more than 1 GB.
It should also be noted that WDL is a current gen game at its base, meaning that it doesn't rely on SSD streaming as future titles will and likely tends to precache a lot more data into VRAM than would be necessary with NVMe storage.
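To put rough numbers behind that 250-500 MB figure, here's a back-of-the-envelope sketch. The buffer layout below is an assumption (a generic deferred-rendering setup, not any particular game or engine), so treat the output as illustrative only:

```python
# Back-of-the-envelope render-target memory at 1440p vs 4K.
# The target list is a made-up but typical deferred setup; real games carry
# more intermediate buffers, which is how the gap lands in the few-hundred-MB
# range mentioned above rather than gigabytes.

RESOLUTIONS = {"1440p": (2560, 1440), "4K": (3840, 2160)}

# (name, bytes per pixel)
RENDER_TARGETS = [
    ("albedo (RGBA8)", 4),
    ("normals (RGB10A2)", 4),
    ("material params (RGBA8)", 4),
    ("HDR scene colour (RGBA16F)", 8),
    ("depth/stencil (D32S8)", 5),
    ("motion vectors (RG16F)", 4),
    ("TAA history (RGBA16F)", 8),
    ("post-process chain (approx.)", 16),
]

def rt_memory_mb(width: int, height: int) -> float:
    bytes_per_pixel = sum(bpp for _, bpp in RENDER_TARGETS)
    return width * height * bytes_per_pixel / (1024 ** 2)

for label, (w, h) in RESOLUTIONS.items():
    print(f"{label}: {rt_memory_mb(w, h):.0f} MB")
print(f"difference: {rt_memory_mb(3840, 2160) - rt_memory_mb(2560, 1440):.0f} MB")
```

The takeaway being that render targets scale with resolution while resident texture data mostly doesn't, so the jump from 1440p to 4K on its own doesn't blow the budget.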
Oh, so you're just a 3080 owning fanboy who doesn't really know what he's talking about but must rebuff any discussion on vRAM amounts going forward because it upsets him. Got it. Narrow minded views lol.
First off... my pronouns are right there, dude. Wtf? Funny thing is that "he" is just about the least accurate way to refer to me (I used to go by "she" before switching to the gender-neutral) so, um, good job?
Jog on.
Hardware Unboxed observed reduced performance (stuttering) in Legion with 6GB cards at 1440p High textures and with 8GB cards at 1440p Ultra textures. Also should be noted, there's a big drop in quality once you go from Ultra to High.
Couldn't that also be due to higher CPU load from needing to transfer more data to the GPU, especially if it's compressed? I recall recent AC games having performance impacts from streaming in data.
There were many claims of cards being more than fine for the whole of the PS4/One generation that had 4GB of vRAM. They aged like milk.
My 4gb 290x was fine for the vast majority of last gen, even at 4k. I had to turn settings down due to lack of compute power way before I hit a RAM wall. The only game that really ran out of RAM was Alyx, which is kinda idiosyncratic, and probably shouldn't be used to extrapolate any larger trends.
So for games on PC that are not utilizing SSD streaming, which Godfall presumably isn't as an SSD is not required, would utilizing more than 10GB not be plausible? From this performance review, WDL utilizes 8925MB (~8.7GB) of VRAM at 4K max + RT and without HD textures. Considering we're talking about 4K textures in Godfall's case, exceeding 10GB of VRAM doesn't sound impossible.
Anything is plausible, the question is how often would that actually happen? With 99% of GPUs available currently being 6 to 12 GB ones how likely is it that devs will just say "eh, fuck it" and start requiring a 16GB one in two months from now? A rhetorical question.
Reduced compared to what exactly? What card do we have currently which is the same in all aspects but VRAM quantity?
I don't think anyone is saying games will require more than 10GB of VRAM. I do think it is possible for some games to exceed 10GB of VRAM at max settings. WDL for example is already getting close. The obvious solution would be to lower settings, which isn't really a big deal, but recent 3080 buyers probably weren't expecting to have to do this two months after launch.
And "4K textures" doesn't mean much really. I wish people would stop using that as some indication of a huge VRAM requirement.
4K textures require more memory than non-4K textures is all I'm saying.
It's entirely possible for a scene in a game to have lots of texture data (and high resolution individual textures) without the developers including an extra high res texture pack or talking about "4K textures" in their PR. If you're comparing the same scene, obviously the higher resolution the textures the more memory is required. But you can't compare between games (or even between different scenes in the same game) without knowing the details of all the relevant texture data.
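To put numbers on that last point: the footprint of one texture is easy to work out, but the scene budget depends on how many unique textures are resident, not just their maximum resolution. A rough sketch, assuming BC7 block compression and full MIP chains (the texture counts are made up purely for illustration):

```python
# Footprint of a block-compressed texture: BC7 is 1 byte per texel,
# and a full MIP chain adds roughly a third on top of the base level.

def texture_mb(size: int, bytes_per_texel: float = 1.0) -> float:
    return size * size * bytes_per_texel * (4 / 3) / (1024 ** 2)

print(f"one 4096x4096 texture: {texture_mb(4096):5.1f} MB")
print(f"one 2048x2048 texture: {texture_mb(2048):5.1f} MB")

# Why "4K textures" alone says little about total VRAM use: a game with a
# modest number of 4K materials can need less memory than one that ships
# "only" 2K textures but keeps far more unique ones resident.
print(f"300 unique 4K textures:  {300 * texture_mb(4096) / 1024:.1f} GB")
print(f"1500 unique 2K textures: {1500 * texture_mb(2048) / 1024:.1f} GB")
```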
Wait, AC is AMD, while Watch Dogs is Nvidia?
Why
Is it AMD's raytracing, or is it Nvidia's raytracing?
No they won't, because they, just like EVERY REVIEWER, still don't take per-process VRAM usage into account.
Ping Dictator, make sure he's aware of it. I don't see any reason he wouldn't use it, we know he uses RTSS in practically every PC performance video. Not every game will display it, but even that's something he could mention.
Hardware Unboxed, linked a few posts back? Not taking it into account.
TheFPSReview, linked a few posts back? Not even using ANY kind of external monitoring, going off the built-in game one.
how can we have a conversation or a discussion regarding this when everyone isn't even using the right tools to begin with?
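For anyone who wants to sanity-check this themselves rather than relying on whatever a review screenshots: the NVIDIA driver exposes both board-wide allocation and a per-process breakdown. A minimal sketch using the pynvml bindings (assumes an NVIDIA card; on Windows the driver sometimes reports per-process usage as unavailable, which is exactly why per-process counters in tools like RTSS matter):

```python
# Board-wide VRAM allocation vs per-process usage via NVML.
# pip install nvidia-ml-py   (the module is imported as pynvml)
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"board-wide: {mem.used / 1024**2:.0f} / {mem.total / 1024**2:.0f} MB allocated")

    # The per-process number is the one that matters for "does game X need
    # more than 10 GB", since other apps and the OS also allocate VRAM.
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = proc.usedGpuMemory  # may be None if the driver won't report it
        print(f"pid {proc.pid}: " + (f"{used / 1024**2:.0f} MB" if used else "n/a"))
finally:
    pynvml.nvmlShutdown()
```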
Every single time this VRAM conversation comes up, the people who argue for more VRAM are vindicated tops a few years later, well within the lifetime of the cards.
Half Life: Alyx gives me a Low VRAM warning every time I run it.
Yet it happens every generation like clockwork.
I've spoken to him about it on Discord, pretty sure it's just a case of being extremely busy and it fell through the cracks.