Status
Not open for further replies.

Razgriz417

Member
Oct 25, 2017
9,109
Why did Nvidia skimp out on the vram in the 3000 series? Did they think AMD would stick with using 8 GB of vram in Big Navi?
They're using new, faster (and more expensive) VRAM in the 3080 and 3090. Don't know what to tell you about the 3070 though; maybe they don't want their lower-end card having more VRAM than their high-end ones?
 

headspawn

Member
Oct 27, 2017
14,608
Oh no AMD sponsored game is said to have suspiciously high VRAM requirements, surely not an attempt to sway potential 3080 buyers

Seems more likely that they based it on PS5 tech, no? That is their other partner.

I don't get why anyone would fault a developer for providing extremely high quality textures for their game though, it's exceptional.
 

Grimminski

Member
Oct 27, 2017
10,130
Pittsburgh, Pennsylvania
I'm downloading and installing the Ultra texture pack on my 8GB 5500xt

 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,207
Dark Space
Don't forget how DLSS makes the image look weird.
RT cynicism I fully understand. It's a hefty cost and the implementations haven't matured far enough beyond "make the floor wet so they can tell it's on", as it's still early for developers. Metro Exodus is still the best glimpse of RT really changing the environment, the future.

DLSS slights are absolute madness though. Those people are on something. "It's not perfect." Yeah but neither is TAA, that's the whole point, without even getting into the performance boosts.

I'm really hoping AMD's Super Resolution and DirectML open the door for developers to support both companies' solutions across the board, regardless of marketing deals. We really don't want a future where it's one or the other when it's so beneficial to all gamers.
 

RivalGT

Member
Dec 13, 2017
6,395
Seems more likely that they based it on PS5 tech, no? That is their other partner.

I don't get why anyone would fault a developer for providing extremely high quality textures for their game though, it's exceptional.
That's one thing I noticed when watching footage of the game: all the textures are of very high quality. It'd be interesting if PS5 games allow 12 GB for VRAM usage.
 

Darktalon

Member
Oct 27, 2017
3,266
Kansas
I don't need to be tagged into every fanboi gpu war thread.

This is clearly AMD targeted marketing, of course they are going to say that.

Wake me up when we have benchmarks of games stuttering in godfall with a 3080, or God forbid show me even ONE youtube reviewer who actually uses per process vram monitoring instead of showing only the currently allocated amount.

And the OP talks a whole lot of shit about how I've provided no proof, yet I've provided pages of evidence and screenshots from a multitude of games, and written a guide on how any one of you can test it for yourself.

www.resetera.com

MSI Afterburner can now display per process VRAM!

Nov 17th 2020: Per Process VRAM Monitoring is now supported internally, and works in all games. Install MSI Afterburner 4.6.3 Beta 4 Build 15910 (MSIAfterburnerSetup463Beta4Build15910.rar) over here. Enter the MSI Afterburner settings/properties menu. Click the monitoring tab (should be...
 

GhostofWar

Member
Apr 5, 2019
512
Not a huge fan of sponsored game fuckery, so we can look forward to amd trying to push the nvidia cards out of vram and nvidia trying to make amd cards have the lowest cache hit rate possible. Fantastic.......... /s
 

Tovarisc

Member
Oct 25, 2017
24,425
FIN
Seems more likely that they based it on PS5 tech, no? That is their other partner.

I don't get why anyone would fault a developer for providing extremely high quality textures for their game though, it's exceptional.

So they would use 12 GB as VRAM and 1 GB as general RAM for the game, when 3 GB is reserved for the OS on PS5? That sounds, to me, like a very weird split and awfully little general RAM to run a modern AAA game out of.

There is a point after which there are diminishing returns on increasing texture resolution and detail. What is the point in using e.g. 10 GB of VRAM just to store some fancy textures with a lot of microdetail at 8K when basically no one will ever notice that detail?
 

headspawn

Member
Oct 27, 2017
14,608
So they would use 12 GB as VRAM and 1 GB as general RAM for the game, when 3 GB is reserved for the OS on PS5? That sounds, to me, like a very weird split and awfully little general RAM to run a modern AAA game out of.

There is a point after which there are diminishing returns on increasing texture resolution and detail. What is the point in using e.g. 10 GB of VRAM just to store some fancy textures with a lot of microdetail at 8K when basically no one will ever notice that detail?

I don't see a problem with future proofing games, nobody is going to be forced to use the highest preset or unable to attempt to.
 

Tovarisc

Member
Oct 25, 2017
24,425
FIN
I don't see a problem with future proofing games, nobody is going to be forced to use the highest preset or unable to attempt to.

My point is that the 12 GB VRAM requirement is very much a PC requirement, very likely driven by the AMD marketing deal. Allocating that much RAM as VRAM isn't very realistic for next-gen consoles, when they operate with 16 GB total RAM from which the OS chunks off 3-ish GB right off the bat.

If devs want to make some PC-specific Extreme HD Texture Pack for diminishing visual gains, go ahead; Ubi does those and CD did for Avengers. That also would make the 12 GB VRAM requirement soft / a mirage, or the game would go way past it when you put such a pack on top.
 

Kingpin722

Member
Oct 28, 2017
1,028
10GB of VRAM is more than enough. So what if you have to lower a few settings. Wish people would stop worrying about always having the absolute best hardware. Shit is toxic
 

Timu

Member
Oct 25, 2017
15,554
10GB of VRAM is more than enough. So what if you have to lower a few settings. Wish people would stop worrying about always having the absolute best hardware. Shit is toxic
I agree, you don't need to rely on ultra for everything, especially early on. High is usually enough in most cases, plus it usually has better performance.
 

ThreepQuest64

Avenger
Oct 29, 2017
5,735
Germany
After reading the title I immediately thought about people telling me directly that we won't need that much memory because recent games don't take as much as RTSS makes us believe and future games will "work differently" in regard to memory usage.

I mean, maybe they're right for future games further into the next-gen cycle, maybe that Godfall dude is talking BS, or Godfall is just an oddity.

In any case, I'm not interested in the title, I'm only mildly concerned, and I simply hope my 2070 Super will hold out for at least two years. I'm usually playing at 1440p anyway.
 

mario_O

Member
Nov 15, 2017
2,755
I would definitely wait for the 3080Ti and 3070Ti...or go AMD if their ray tracing + "dlss" performance is good enough.
 

amc

Member
Nov 2, 2017
241
United Kingdom
People can laugh all they want, and tbh this Godfall claim is obviously tied into some form of AMD hyperbole, but devs love RAM, and if devs now have cards and new base consoles that allow them to push RAM limits, they'll use it. Yeah, you can turn down settings and everyone gets to play, so why not up vRAM usage.

But anyone, including me, who claims to know what RAM usage is going to be going forward, at the start of a new generation, is talking shit. No soothsayers here, DF included.

There were many claims that cards with 4GB of vRAM would be more than fine for the whole of the PS4/One generation. They aged like milk.

No one here knows how vRAM usage will move forward; all we can say is devs now have new limits which they get to fill, and that 4K and its high-quality assets are going to be more of a factor than in any previous generation. Of course Godfall will look beautiful on a 3080, games will do for a good while, but that's not to say that playing games at their highest settings in a year or two won't require more vRAM.

Uncharted waters with consumer/gaming cards now offering more than 11GB of vRAM. Let's see what the devs do with that. If anything.
 
Last edited:

Pottuvoi

Member
Oct 28, 2017
3,062
4k by 4k, i.e. 4096x4096, doesn't sound that high for a texture these days?
Pretty sure that 8k textures are still quite rare; the memory use is quite prohibitive.
And yes, we have seen 4k textures for a long time.

As always with textures, it's how one uses them that matters. (For good or bad.)
 

Tovarisc

Member
Oct 25, 2017
24,425
FIN
People can laugh all they want, and tbh this Godfall claim is obviously tied into some form of AMD hyperbole, but devs love RAM, and if devs have cards and now base consoles that allow them to push RAM limits, they'll use it. Yeah, you can turn down settings and everyone gets to play.

But anyone, including me, who claims to know what RAM usage is going to be at the start of a new generation is talking shit. No soothsayers here, DF included. There were many claims that cards with 4GB of vRAM would be more than fine for the whole of the PS4/One generation. They aged like milk.

No one here knows how vRAM usage will move forward; all we can say is devs now have new limits which they get to fill, and that 4K and its high-quality assets are going to be more of a factor than in any previous generation. Of course Godfall will look beautiful on a 3080, games will do for a good while, but that's not to say that playing games at their highest settings in a year or two won't require more vRAM.

Uncharted waters with consumer cards now offering more than 11GB of vRAM. Let's see what the devs do with that. If anything.

Anyone who expects a dGPU released in 2020 to last them the whole console generation is either a fool or very naive. Also, chasing the highest settings no matter what is another fool's errand, just flushing performance down the toilet for 0.001% better-looking clouds or some shit.

Consoles have 16GB of RAM from which the OS takes 3GB for itself, at least that seems to be the most common expectation. That will leave 13GB of RAM to be shared between VRAM and general RAM. Games, especially open-world ones, will want a chunk of that, maybe even 5+ GB, for general RAM. So 7-8GB would be left as VRAM. In PC space devs can run wilder with specs, they always do, but consoles are still the baseline.

But then who knows, maybe games will use SSD speeds and move terabytes of data in and out of the SSD as you play, because nothing is really kept in RAM.
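For what it's worth, here's a back-of-the-envelope sketch of that split (the 3 GB OS reservation and the ~5 GB general-RAM budget are assumptions from this thread, not confirmed figures):

```python
# Rough console memory split, using the figures from the post above.
# The OS reservation and general-RAM budget are assumptions, not confirmed specs.
TOTAL_RAM_GB = 16          # unified memory pool on the new consoles
OS_RESERVED_GB = 3         # commonly assumed OS reservation
GAME_GENERAL_RAM_GB = 5    # guess: game logic, streaming buffers, audio, etc.

available_to_game = TOTAL_RAM_GB - OS_RESERVED_GB           # 13 GB
vram_like_budget = available_to_game - GAME_GENERAL_RAM_GB  # 8 GB

print(f"Available to the game: {available_to_game} GB")
print(f"Left for VRAM-style use: {vram_like_budget} GB")
```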
 
OP
OP
Jun 1, 2018
4,523
Seems Nvidia is prepping a 3080 20GB, which would be closer to the 3090.

www.resetera.com

NVIDIA GeForce RTX 3080 Ti rumored to feature 10496 CUDA cores and 20GB GDDR6X memory Rumor

https://videocardz.com/newz/kopite7kimi-nvidia-geforce-rtx-3080-ti-rumored-to-feature-10496-cuda-cores-and-20gb-gddr6x-memory After seeing the recommended settings for next gen games (in 4k with high res texture packs), I am more than ready! Of course you dont need 20GB if you only play in...
 

Laiza

Member
Oct 25, 2017
2,171
People can laugh all they want, and tbh this Godfall claim is obviously tied into some form of AMD hyperbole, but devs love RAM, and if devs now have cards and new base consoles that allow them to push RAM limits, they'll use it. Yeah, you can turn down settings and everyone gets to play, so why not up vRAM usage.

But anyone, including me, who claims to know what RAM usage is going to be going forward, at the start of a new generation, is talking shit. No soothsayers here, DF included.

There were many claims that cards with 4GB of vRAM would be more than fine for the whole of the PS4/One generation. They aged like milk.

No one here knows how vRAM usage will move forward; all we can say is devs now have new limits which they get to fill, and that 4K and its high-quality assets are going to be more of a factor than in any previous generation. Of course Godfall will look beautiful on a 3080, games will do for a good while, but that's not to say that playing games at their highest settings in a year or two won't require more vRAM.

Uncharted waters with consumer/gaming cards now offering more than 11GB of vRAM. Let's see what the devs do with that. If anything.
I like how this post completely discounts every other aspect of what makes a GPU good.

Like so many other posts in this thread...
 

KDR_11k

Banned
Nov 10, 2017
5,235
Pretty sure that 8k textures are still quite rare; the memory use is quite prohibitive.
And yes, we have seen 4k textures for a long time.

As always with textures, it's how one uses them that matters. (For good or bad.)
Are we sure that when they say "4k textures" they mean 4096x4096 rather than "textures with enough resolution to look good when playing at 4k"?
 

amc

Member
Nov 2, 2017
241
United Kingdom
I like how this post completely discounts every other aspect of what makes a GPU good.

Like so many other posts in this thread...

What the fuck are you on about? We are talking about vRAM; that's the issue at hand. What in my post disparages any GPU and what makes it good, apart from saying devs may up vRAM requirements in the coming year or two? Are people so invested in their new purchase that they feel the need to talk trash about a post that doesn't fit what they want to hear? Your 3080 is fabulous, amazing with great RT and raster performance. Better? Did I really need to state that while making an observation on vRAM?

Absolutely pathetic retort to my post. Of course it goes without saying that any modern card has a plethora of great features that make games run. Here's some truth: one of those features is vRAM. Here's some more truth: it may not be the best amount going forward. Not saying that suddenly the 3080 is shit, it's obviously a fantastic card, just that Nvidia has seemingly made a choice to meet a price point that devs may push beyond sooner rather than later. We'll see.

Read what the post says again and don't be so butthurt because I'm saying 10GB may not cut it for the optimal settings in 12 to 18 months.
 
Last edited:

Pottuvoi

Member
Oct 28, 2017
3,062
Are we sure that when they say "4k textures" they mean 4096x4096 rather than "textures with enough resolution to look good when playing at 4k"?
No.

I've always disliked the latter description though; 4096x4096 at least tells you something about the textures used. (If talking about object level.)
The latter could be using 256x256 textures, just brilliantly.

Would love to see and hear in more detail how they handle art and rendering.
 

Zoon

Member
Oct 25, 2017
1,397
Hmm, I was almost certain that I'd be getting the 3060ti for 1440p gaming. But now I'm thinking that its 8GB might be an issue after a couple of years. AMD's mid-range cards didn't have any leaks yet so I'm assuming that it'll be a while before they release. What sucks more is that I sold my 1060 and I'll be using a 9600gt until I get a new card.
 

Black_Stride

Avenger
Oct 28, 2017
7,388
Are we sure that when they say "4k textures" they mean 4096x4096 rather than "textures with enough resolution to look good when playing at 4k"?

4K textures means 4096 x 4096; no self-respecting artist would say "4K textures" means good enough to render the image at 4K.
Because that doesn't actually describe shit... a single hair card could be 256 x 256, but because there are lots of them it's enough to be rendered at 4K... so no.
When they say 4K textures they are talking about the texture's resolution.

And that is very high for anything rendered under, like, 8K resolution.

I know offline rendering artists who don't use 8K textures because their final renders are 6-8K and tiling is easy to mask these days.
It's very memory intensive past 5K with little benefit... unless you are planning on having a texture look absolutely amazing whether you're viewing it from 10 meters away or from 1 centimeter away, there is rarely a need to go that high.

In many cases, for say a ground texture, you can use 4K to cover 4 meters easily; depending on other elements you could probably go much, much higher and still have very high fidelity.

So 8K textures, except for certain situations, are still very rare.

Maybe this coming gen, with better culling and streaming feedback, we will have 8K textures that are just never fully loaded, so they don't eat VRAM but still chew into storage space.
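To put some rough numbers on that (my own illustrative math, not from the post above): the VRAM cost of a single texture at different resolutions, assuming uncompressed RGBA8 versus BC7 block compression and a full mip chain.

```python
# Per-texture VRAM footprint: uncompressed RGBA8 (4 bytes/texel) vs. BC7
# block compression (1 byte/texel), with ~33% extra for a full mip chain.
# Illustrative assumptions only; real games mix formats and compression.
def texture_mib(side, bytes_per_texel, mips=True):
    base = side * side * bytes_per_texel
    return base * (4 / 3 if mips else 1) / 2**20

for side in (2048, 4096, 8192):
    print(f"{side}x{side}: "
          f"RGBA8 ~ {texture_mib(side, 4):.0f} MiB, "
          f"BC7 ~ {texture_mib(side, 1):.0f} MiB")
# 2048x2048: RGBA8 ~ 21 MiB,  BC7 ~ 5 MiB
# 4096x4096: RGBA8 ~ 85 MiB,  BC7 ~ 21 MiB
# 8192x8192: RGBA8 ~ 341 MiB, BC7 ~ 85 MiB
```

So a handful of uncompressed 8K textures would already eat gigabytes on their own, which is why compression and streaming matter at least as much as raw resolution.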
 

brain_stew

Member
Oct 30, 2017
4,731
Hmm, I was almost certain that I'd be getting the 3060ti for 1440p gaming. But now I'm thinking that its 8GB might be an issue after a couple of years. AMD's mid-range cards didn't have any leaks yet so I'm assuming that it'll be a while before they release. What sucks more is that I sold my 1060 and I'll be using a 9600gt until I get a new card.

We've got leaks about the 6700 XT: 12GB GDDR6 on a 192-bit bus and 40 CUs. Should be right around PS5/2080 performance levels.
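A quick sanity check on what that bus would give you (the 16 Gbps per-pin speed here is my assumption; the leak doesn't specify it):

```python
# GDDR6 bandwidth from bus width and per-pin data rate.
# 16 Gbps is an assumed speed, not part of the leak quoted above.
bus_width_bits = 192
data_rate_gbps = 16  # per pin

bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # 384 GB/s
```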
 

hoserx

Member
Oct 27, 2017
1,172
Ohio
RTX 3090 purchase vindicated............. JK it was still one of the dumbest things I've ever done in my life haha.
 

Jokerman

Member
May 16, 2020
6,943
I'm not sure what the fuss is. You were never going to play this at 4K/60 with ultra textures on a 3080 anyway. I couldn't even manage that on bloody 'Little Hope'!
 

artsi

Member
Oct 26, 2017
2,686
Finland
I'll wait for benchmarks; so far the 3080 can chew through pretty much anything I throw at it.

There are so many 6GB or 8GB cards installed that developers can't be like "12GB is the minimum" if they want to sell games.

Unless AMD pays them for it ¯\_(ツ)_/¯
 

GreyHand23

Member
Apr 10, 2018
413
I'll wait for benchmarks; so far the 3080 can chew through pretty much anything I throw at it.

There are so many 6GB or 8GB cards installed that developers can't be like "12GB is the minimum" if they want to sell games.

Unless AMD pays them for it ¯\_(ツ)_/¯

It's not the minimum here either. You can always either turn down some settings slightly or lower resolution.
 

PLASTICA-MAN

Member
Oct 26, 2017
23,610
It was obvious that you needed at least 12 GB of VRAM to survive next-gen. Now even 8K textures will come with UE5 and it will be even more problematic. :(
 

Tovarisc

Member
Oct 25, 2017
24,425
FIN
It was obvious that you needed at least 12 GB of VRAM to survive next-gen. Now even 8K textures will come with UE5 and it will be even more problematic. :(

12 GB won't last you this generation, nor will the compute power of the GPUs that VRAM is soldered onto.

Uncompressed 8K textures are some clown-shoes stuff; they will take insane amounts of storage space and VRAM alike. 12 GB most likely won't help you there either.
 

Amauri14

Member
Oct 27, 2017
3,694
Danbury, CT, USA
I'm honestly still surprised that they put so little VRAM on those cards compared to their power. Like, my Vega 56 has 8GB of VRAM and that card was released three years ago.
 

FoZzI bEaR

Banned
Oct 30, 2017
35
England
As a 3440 X 1440 user I've always speculated that I'd run out of GPU horsepower long before I'd run out of VRAM. Even if it is an issue I can always turn textures down to High.
 