Agreed. Outstanding work. Thank you for your great work, Darktalon!
Great informative posts
Thank you, truly thank you, this post brought a tear to my eye and was sorely needed and appreciated. Thank you, Darktalon, for the OG post and all the follow-ups.
It's a mighty interesting topic that doesn't get the broad attention it deserves.
Reported VRAM usage, among all the other aspects of graphics, has always puzzled and fascinated me,
if only because of the math itself, as graphics IS nothing but math resulting in pixels lighting up,
and I've seen it proven by real-life experience many times.
A lot of even older games report 8GB of VRAM usage (which is actually allocation) on my 1070 8GB (oops, what a coincidence *lol*),
even back when I was still on 1080p, while offering textures and geometry not even remotely as pleasant and crisp as newer titles.
As an old boy, coming from late-'80s consoles and my first 80386 with an amber 14" screen, and still an avid gamer to this day,
I found my "forever sweet spot" at 27"/1440p/144Hz, in the form of a marvelous Asus ROG G-Sync panel.
My personal perfect size and match in terms of POV, eye movement, viewing distance and overall feel. Anything smaller or bigger - meh, not for me.
27"/4K/144Hz maybe later down the line, when prices for high-end models with those specs get more reasonable...
And it is at this resolution where I can confirm your point:
Doom Eternal (a VRAM hog that works a bit differently from other titles) provably stays under 8GB of VRAM at 1440p on Ultra Nightmare.
So far, so good, but: my 1070 simply chokes on that setting and doesn't run it smoothly or pleasantly at high enough framerates.
Reducing the texture quality to Ultra, or even lower, does sh**-all to improve that.
Lowering all the other settings to Ultra and leaving the textures on Ultra Nightmare does - then it feels smooth as silk.
Put the other way around: even if the 1070 had 24GB of GDDR6X, or 96GB of GDDR17XXX, it would not run it smoothly at 1440p, because the GPU itself simply lacks the horsepower.
And that's exactly the point so many sadly miss:
CAD/render/pro work on workstations aside (that work, and those cards, are a different story and simply need a lot of VRAM for purposes other than running a game)...
You can't build or buy a GPU to be futureproof for 5 years.
No matter if it's my '96 Orchid Righteous 3D (3dfx Voodoo 1) with 4MB (which I still have lying around somewhere), or a 2020 RTX 3080 10GB.
A graphics card was, and always will be, an overall - yet optimal - compromise of architecture/design, compute/render power, VRAM, power requirements etc.,
for a certain lifespan and a certain range of applications (in this case, games up to a certain point).
And in 5 years, you build the next next-gen card - with and for the technology available then (and only then, not now).
It's the very same as simply fitting a car with bigger/wider rims and tyres (and no other adaptations): it looks nicer, but drives like sh***.
Car engineers are no idiots. Millions of test kilometers on roads and racetracks, driven by dozens of test and race drivers (at least that's how they do it here with our European premium cars), take a lot of knowledge, time and money during the development of a car, to finally find an optimal spec, "a best compromise" of looks, comfort, grip, handling and price, tailored to the specific model of the car (base/road/sport/track-oriented version).
But hell no - they are all idiots - I put on these rims and tires because a friend who knows somebody told him that Mr. Smith in his backyard garage knows it all better than all of them...
The graphics card equivalent is:
Hell no - slap 12, 16, 20, xy GB of VRAM on a card that will never be able to take any advantage of it, because it will be way too slow by the time that happens - but I don't care, it's more, so it's better, I want it, it will make my card futureproof, X or Y also said that, and those engineers from company XYZ have no clue...
A word of wisdom and encouragement at last, Darktalon:
Thanks for your efforts, and keep going.
But after 50 years of experience with those two-legged mammals, it's safe to say:
Don't let people and their grief affect you. You can't convince anybody who doesn't want to (listen, think, try things out themselves, etc.).
Let them be happy with it - and we've seen over 12 pages what that looks like.
There are so many things to consider on this topic - and most people mix it all up / don't get it:
how company X or Y approaches steady performance improvements with different methods (and often different means over the lifetime of GPU generation X compared to Y),
how game engines work and how well (or badly) they are optimised for which underlying hardware,
how the interplay with RAM, CPU and storage works (and how it will change/improve with the different, already mentioned I/O improvements at the hardware and software level),
how utterly poor unoptimised, large-file-size textures can look,
and how photorealistic highly optimised, smaller-file-size textures can look - where the key is brains & effort = time = money you have to invest,
biased tendencies or prejudice towards company X or Y in general,
general attitudes towards the (tech) press/articles/media/insider sources, ranging from religion-like fanaticism/belief to complete ignorance of anything,
how consoles and their shared memory work,
how all those algorithms (in hardware + drivers + OS) work,
and a hundred things more.
Life's too short to care about everybody and everything - time and effort are better invested in actually enjoying life. Which, of course, gaming is a part of.
That's why, knowing I don't need 10GB of VRAM at 1440p, but could sure use more GPU horsepower, I cancelled my 3080 preorder today to get rid of the fed-up feeling and the constant stock/mail checking.
I'll get an actually available 3070 8GB now instead, keep enjoying G-Sync, and get a huge upgrade at 1440p for little money (compared to a 2080 Ti. Or those new winter tyres for my BMW *ouch* :p), while putting the 4-year-old 1070 in my wife's rig to give her machine a nice upgrade ;-)
Just to confirm what Darktalon said, this benchmark video shows that Godfall uses around 6-6.5GB of VRAM at 4K Maximum settings.
And people wanted to pretend otherwise. So yeah, literally "AMD-sponsored game says only AMD's new card can run it correctly".
You're welcome, credit where credit is due. And to assure you that life forms other than angry keyboard warriors in all capital letters still exist.
(Says me, the noob who just registered here recently, with a post count of 1 *hehe*)
Ultrawide... hmmm... tried it on several friends' systems.
Cool at first look, great for desktop stuff, simulations and slower games, but somehow not the right thing for me,
because I tend to get dizzy with those in longer sessions (and with fast-paced shooters).
Various models, but it's not the monitors - it's just me :)
I'm just so perfectly comfy with the current G-Sync panel for fun, plus a vertical 24"/1080p office IPS panel beside it for work/list stuff - perfect for my needs.
Plus I still need to wring a lot more life and time out of that Asus ROG panel to self-justify the ~600€ purchase :D
I mean, sucks for 1060 owners, but I don't think any of them expected to play this game at 4k max settings. :v
In "defense" of AMD. And also nVidia, and basically all companies:
a) AMD trying to push, seeking coalitions and gain more marketshare - eligible, and in the end good for all.
AMD/nV push themselfes/can't sit still (intel anyone? :p) , we get more cool stuff sooner and not years later (intel anyone? :p), prices balance out, and there is no sole supremacy of one alone (which always goes tits up, mostly for those down in the foodchain - the average joe, not matter the continent or decade, no matter if a party, government or - gfx card company).
b) They - all of them - use marketing lingo and easy catchphrases. What else would they use?
If they talked like people here, or like that cool dude Steve from GamersNexus, 80% of average consumers (who aren't interested in anything more than using/enjoying something) would stand there staring with mouths open like Homer Simpson, unsure what to do next.
These days you can't even sell cars with hard facts and specs; instead you get some emotional mini-movie where a blonde bimbo shoves the child actors into some SUV and everyone dances around on the beach.
And even Rolls Royce, who don't need any advertising at all - because "you simply know", and the product does the talking for itself - still run commercials.
Yeah sorry, I'm too old for this sh** :D
So forgive them the trickery - they all pump out one-liners that are only true when put into the correct context, so not exactly wrong, but just a version of the truth.
If you think about it for a moment... no matter which economic, technological or societal phenomenon... people simply love one-liners, and love - better: outright beg - to be deceived and lied to. To feel comfy/easy/superior/taken-care-of/whatever.
Ultimately, I don't think this "game uses up to 12GB of VRAM" one-liner will scratch nVidia's global market share (~60-80%) much.
And that enormous market share means AAA PC games are much more likely to be optimised for nVidia's 10GB of GDDR6X (less, but faster, VRAM) than AMD's 16GB of GDDR6.*
The GPU wars are going to get interesting now that:
1. Apple is using a console-like unified SoC architecture with shared DRAM and a custom ARM CPU.
2. AMD is "cheating" by letting Zen 3 and RDNA2 whisper to each other in new secret ways.
3. nVidia likely bought ARM to move towards that tighter CPU/GPU integration.
4. Intel "got serious" (for what, the 4th time now? lol) about their GPU game as well with Xe.
* Much will depend on the quality of AMD's and nVidia's implementations of SSD-to-GPU direct I/O. This is a brand-new metric GPUs are going to start competing on, just like DXR (ray tracing). Performance will depend on VRAM + direct I/O performance instead of just VRAM like it does now.
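For what it's worth, the "less but faster" part is just bus width times per-pin data rate. A quick back-of-envelope sketch - the bus widths and data rates below are assumptions pulled from public spec sheets, not from this thread:

```python
# Rough peak-bandwidth math behind the "less but faster VRAM" point.
# Spec numbers below are assumptions (public spec sheets), not from this thread.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth = bus width (bits) * per-pin data rate (Gbit/s) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# RTX 3080: 10 GB GDDR6X, 320-bit bus, ~19 Gbps per pin
print(peak_bandwidth_gb_s(320, 19.0))   # ~760 GB/s

# RX 6800 XT: 16 GB GDDR6, 256-bit bus, ~16 Gbps per pin
print(peak_bandwidth_gb_s(256, 16.0))   # ~512 GB/s (plus Infinity Cache on top)
```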
Ultrawide... hmmm... tried it on several friends' systems.
Cool at first look, great for desktop stuff, simulations and slower games, but somehow not the right thing for me,
because I tend to get dizzy with those in longer sessions (and with fast-paced shooters).
Various models, but it's not the monitors - it's just me :)
That's strange - the effect is pretty much the opposite for a couple of friends with motion sickness issues. In fact, I converted them to ultrawide precisely because they were able to play more first-person games without getting dizzy on my setup.
As long as the frame rate is high and the FOV is properly widened, it has always helped them.
Nah, no motion sickness. Put me in a plane, a sports/track car, a rollercoaster or whatever, and I squeal like a happy pig.
I have good eyesight with an above-average field of view for my age, and only 0.7 diopters at distance, even less up close.
But sit me too close to a screen, or in front of too big (and especially too wide) a screen, and it simply strains me too much after a while (due to the constant eye/head movement),
leading to more or less the same dizziness that those affected by motion sickness feel.
So yes, the opposite, which equals "normal", I guess? ;-)
Lol, yeah, I guess so. Stillness sickness :)
But it's not so weird; my wife is like that. No motion sickness, but she can't stand video games at all, especially sitting close to my ultrawide.
First we must put things into context. The jump between the PS3 and PS4 was 256MB to 8GB - a 32x increase in VRAM!
Just to confirm what @Darktalon said, this benchmark video shows that Godfall uses around 6-6.5GB of VRAM at 4K Maximum settings.
*insert Pikachu shocked face*
I love it too, don't ever stop. It's very pleasant to read.
Thanks for the roses. Forgive my errors/strange words/grammar, should they happen.
There's a lot of cold water between us, and my native tongue is Austrian (the gentle, more subtle version of German).
And yes - my colleagues in the office also love (or outright hate with a death wish) my detailed replies/emails, which more often than not drift off into J.R.R. Tolkien length, or Borat 2 "subsequent moviefilm"-like territory... :p :D
New info came to light today on the "AMD is 'cheating' by letting Zen 3 and RDNA2 whisper to each other in new secret ways" point.
Apologies if this may have been addressed, but why are you comparing the VRAM in the PS3 to the total RAM in the PS4?
Ohhhh... interesting!
SAM is apparently just resizable BAR, which is part of the PCIe spec. If the motherboard, CPU, OS and GPU all support it, it will work.
Nvidia is already testing a release that updates Ampere to enable resizable BAR and thus get the benefits of "SAM" (AMD's marketing term for it) on Zen 3 and X570/B550.
See here for more details
https://twitter.com/GamersNexus/status/1327006795253084161?s=20
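For the curious, here's a minimal sketch of what resizable BAR means in practice, assuming a Linux box: every PCI device exposes its regions under /sys/bus/pci/devices/<address>/resource, one start/end/flags line per entry, so you can compute the aperture sizes yourself. With the classic setup the GPU's framebuffer BAR is only 256MB; with resizable BAR enabled it should cover (roughly) the whole VRAM. The device address below is just an example, not a real recommendation.

```python
# Sketch (Linux only, paths/address are assumptions): list a PCI device's BAR sizes.
# With resizable BAR / "SAM" active, the GPU's framebuffer BAR should roughly
# match the full VRAM size instead of the classic 256 MiB window.
from pathlib import Path

def bar_sizes(pci_addr: str) -> list[int]:
    """Each line of the sysfs 'resource' file is: start end flags (hex)."""
    lines = Path(f"/sys/bus/pci/devices/{pci_addr}/resource").read_text().splitlines()
    sizes = []
    for line in lines[:6]:  # the first six entries are BAR0..BAR5
        start, end, _flags = (int(x, 16) for x in line.split())
        sizes.append(end - start + 1 if end else 0)
    return sizes

# "0000:01:00.0" is a typical slot for a discrete GPU - adjust for your system.
for i, size in enumerate(bar_sizes("0000:01:00.0")):
    if size:
        print(f"BAR{i}: {size / 2**20:.0f} MiB")
```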
In the very next paragraph, I explained that the PS4 has 5GB usable for games, which is still a 20x increase.
What are you expecting me to compare? The PS3 had a split memory system, while the PS4 and PS5 have a unified one...
The point of the historical analysis is that the jump from gen 7 to 8 was very different from the jump from gen 8 to 9. Even if you argue that developers used ONLY 2GB of that RAM for the GPU (and that isn't an accurate argument), that would still be an 8x jump, compared to 2.7x.
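To make the arithmetic explicit: the 256MB, 8GB, 5GB and 2GB figures come from the posts above, while the ~13.5GB usable figure for the PS5 is my assumption, chosen only because it reproduces the ~2.7x mentioned.

```python
# Back-of-envelope version of the generational-jump argument (all values in MiB).
GIB = 1024

ps3_vram     = 256          # PS3 VRAM, per the post
ps4_total    = 8 * GIB      # PS4 total unified RAM
ps4_usable   = 5 * GIB      # PS4 RAM usable by games, per the post
ps4_gpu_only = 2 * GIB      # the "only 2 GB went to the GPU" counter-argument
ps5_usable   = 13.5 * GIB   # assumed PS5 usable RAM (gives the ~2.7x cited)

print(f"gen 7 -> 8 (total):    {ps4_total / ps3_vram:.1f}x")    # 32.0x
print(f"gen 7 -> 8 (usable):   {ps4_usable / ps3_vram:.1f}x")   # 20.0x
print(f"gen 7 -> 8 (GPU only): {ps4_gpu_only / ps3_vram:.1f}x") # 8.0x
print(f"gen 8 -> 9 (usable):   {ps5_usable / ps4_usable:.1f}x") # 2.7x
```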
Not sure what you think I was trying to imply; I was just wondering why you made what seemed like an apples-to-oranges comparison.
My apologies, I'm used to getting a lot of negative feedback on this particular topic.
Just to confirm what Darktalon said, this benchmark video shows that Godfall uses around 6-6.5GB of VRAM at 4K Maximum settings.
I am again reminded of the Evil Within 2 release at that sketchy flat we all used to hang around in. Sorry, but HAHAHAHAHAHAHA.
Marketing and all, but flubbing the numbers shouldn't double the VRAM figure - like, damn, you had to know you were going to be proven massively wrong by a huge margin.
Good question.
WTF is with that massive jump for the 3090, though? Usually it's only like 10-15% better, not like 30%.
This whole renderer gives weird results, and that's not at all surprising when someone is trying to prove a point artificially.
VRAM saturation will do that. It's mild here, though... in some productivity tools you can be looking at a 300% improvement.
Edited: thought I was in the 3080 megathread, ahaha; it had all been said here already.
Link? I don't see anything mentioned in the 3080 megathread.
I'm curious about why the 3090 has such a large advantage when the only difference between it and the 3080 is more CUDA cores and VRAM.
Godfall PC Performance Review and Optimisation Guide - OC3D (www.overclock3d.net)
Raytracing update is gonna easily add 1.5-2GB of VRAM on top, so you want 12GB of VRAM to be comfortable.
I doubt it will add that much in this case. But yeah, it will probably go above 10GB in 4K, if only by some megabytes.
If these numbers are accurate (there's gonna be a lot of variability throughout different levels), then the 12GB VRAM claim is not wrong.
Also, how did they get those numbers? Through MSI Afterburner? Not very accurate for actual VRAM usage, then.
Raytracing update is gonna easily add 1.5-2GB of VRAM on top, so you want 12GB of VRAM to be comfortable.
That said, I doubt having 10 or 8GB is gonna cause a massive drop in performance; most likely it just means those cards will underperform a bit compared to their theoretical maximum if they weren't VRAM-constrained.
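On the Afterburner point above: overlay tools generally report what the driver has allocated, not what the game actually touches each frame. A minimal sketch that pulls the same kind of allocation numbers - NVIDIA GPUs only, and it assumes the pynvml (nvidia-ml-py) package is installed:

```python
# Sketch: query VRAM *allocation* via NVML, the same kind of number overlays show.
# Note this is allocated/reserved memory, not what a game actively needs per frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total: {mem.total / 2**30:.1f} GiB, allocated: {mem.used / 2**30:.1f} GiB")

# Per-process allocations - how much a single game has reserved.
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = f"{p.usedGpuMemory / 2**30:.2f} GiB" if p.usedGpuMemory else "n/a"
    print(p.pid, used)

pynvml.nvmlShutdown()
```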
As someone who doesn't understand PCs very much, I appreciate you making this thread.
More allocation numbers.
Godfall PC Performance Review and Optimisation Guide - OC3D (www.overclock3d.net)