
nikos

Banned
Oct 27, 2017
2,998
New York, NY
Pretty obvious what's going on here. Spoiler: Marketing.

Counterplay Games partnered with AMD for this game. AMD is about to launch cards. NVIDIA's 3080 has 10GB of GDDR6X. [Insert 12GB bullshit marketing here.] Difference is the 3080 runs GDDR6X, not GDDR6. Counterplay never even indicated the type of memory they're referring to. The 3080 won't have an issue running this game unless it's poorly optimized.
 

Nooblet

Member
Oct 25, 2017
13,626
Because DLSS goes wrong with RT sometimes. In the hacker game or whatever, the new Ubisoft game with RT, it looked crazy off, and I mean not right at all, on some cars that were driving by. But yeah, the performance hit really isn't worth it in a lot of cases by itself.
I mean that's the game I'm talking about.

It doesn't look off; it basically looks as it should. What's happening is that it's simply doing a lower resolution reflection, and that's because DLSS cannot do anything about reconstructing reflection resolution. Watch Dogs' RT is half the res of the native/pre-DLSS resolution. So at 4K with Quality DLSS your RT reflections are something like quarter res of 4K rather than half res... which isn't really a bad thing, as a quarter of 4K for reflections is still pretty damn good and still quite a bit higher resolution than what the Xbox Series X seems to be doing.
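As a minimal sketch of that resolution math (the ~67% per-axis DLSS Quality scale and the half-res reflection buffer are assumptions based on the description above, not confirmed figures):

```cpp
// Sketch: effective RT reflection resolution under DLSS, with assumed scales.
#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160;   // 4K output
    const double dlssScale = 2.0 / 3.0;   // assumed DLSS Quality per-axis scale
    const double reflScale = 0.5;         // reflections at half internal res per axis

    const int renderW = static_cast<int>(outW * dlssScale);   // 2560
    const int renderH = static_cast<int>(outH * dlssScale);   // 1440
    const int reflW = static_cast<int>(renderW * reflScale);  // 1280
    const int reflH = static_cast<int>(renderH * reflScale);  // 720

    // With DLSS: reflections at half of the internal 1440p per axis (1280x720).
    // Without DLSS: half of native 2160p per axis would be 1920x1080.
    std::printf("internal %dx%d, reflections %dx%d\n", renderW, renderH, reflW, reflH);
    return 0;
}
```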
 
Last edited:

Shutts

Member
Oct 31, 2017
9
Your operating system (Windows) is still running in the background while you are playing.

No one expects the console operating systems to use so much RAM. The general expectation is around 13GB of VRAM for games on both PlayStation 5 and Xbox Series X.

We will know for sure in a few days.

That's just the WD:Legion process, fella... sitting big and fat in DRAM. Win10 is obviously on top of that.
 

MajesticSoup

Banned
Feb 22, 2019
1,935
Anyone know if or when Epic will be releasing that Lumen in the Land of Nanite tech demo? It'll be interesting to see what effect those 8K textures have on VRAM.
 
Oct 28, 2017
2,737
I think the 6800 XT will age much better than the 3080, unless the games you want to play happen to be the ones that support DLSS. The AMD 290X aged so well because it shared an architecture with the last-gen consoles and had 4GB of VRAM when many cards had 2GB or 3GB.
 

Onebadlion

Member
Oct 27, 2017
3,189
Pretty obvious what's going on here. Spoiler: Marketing.

Counterplay Games partnered with AMD for this game. AMD is about to launch cards. NVIDIA's 3080 has 10GB of GDDR6X. [Insert 12GB bullshit marketing here.] Difference is the 3080 runs GDDR6X, not GDDR6. Counterplay never even indicated the type of memory they're referring to. The 3080 won't have an issue running this game unless it's poorly optimized.

Completely agree
 

Tovarisc

Member
Oct 25, 2017
24,407
FIN
NVIDIA shouldn't be getting away with selling 8GB and 10GB GPUs going into 2021 tho if 4K gaming is going to be the norm

4K gaming won't be the norm for a long time to come.

[Steam Hardware Survey chart: 4K is ~2.3% of primary display resolutions]

Source: https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

4K gaming is still very niche in the PC space*

*Yes, we don't know and can't tell how many people are hooking their PCs up to 4K TVs for gaming.

The expectation is around 13GB of VRAM for games.

So out of 16 GB of total RAM in the whole system, 13 GB is pure VRAM, with the remaining 3 GB for the OS and general game data? 🤔

Sounds a bit... weird, to put it mildly. Next-gen games using less general-purpose RAM than current-gen...

Or do you mean 13 GB of RAM is dedicated to games, split between RAM and VRAM in whatever amounts the game needs?

game has very limited ray tracing, they played it safe

I hope games like Control work with AMD RT HW right out of the box, those could be interesting benches to see too.
 

Nooblet

Member
Oct 25, 2017
13,626
4K gaming won't be the norm for a long time to come.

[Steam Hardware Survey chart: 4K is ~2.3% of primary display resolutions]

Source: https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

4K gaming is still very niche in the PC space*

*Yes, we don't know and can't tell how many people are hooking their PCs up to 4K TVs for gaming.



So out of 16 GB of total RAM in the whole system, 13 GB is pure VRAM, with the remaining 3 GB for the OS and general game data? 🤔

Sounds a bit... weird, to put it mildly. Next-gen games using less general-purpose RAM than current-gen...

Or do you mean 13 GB of RAM is dedicated to games, split between RAM and VRAM in whatever amounts the game needs?



I hope games like Control work with AMD RT HW right out of the box, those could be interesting benches to see too.
That 13GB is for both VRAM and RAM, so the actual VRAM usage will vary, but you can be guaranteed that it's going to be less than 10GB, maybe even by a fair bit.
 

TinTuba47

Member
Nov 14, 2017
3,794
Not that I'm sold on Godfall, but if I pick it up I wouldn't bother with 4K anything. 1440p is where it's at as far as I'm concerned.
 

Tovarisc

Member
Oct 25, 2017
24,407
FIN
That 13GB is for both VRAM and RAM, so the actual VRAM usage will vary, but you can be guaranteed that it's going to be less than 10GB, maybe even by a fair bit.

It would make sense for games to carve out ~5 GB of that 13 GB as general RAM, which would leave ~8 GB as VRAM. General RAM needs may even go up, especially with open-world games, as the generation gets rolling.
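A trivial sketch of that arithmetic; note the 3 GB OS reservation and the 5/8 split are the thread's guesses, not published console specs:

```cpp
// Sketch: the speculative console memory split discussed above.
#include <cstdio>

int main() {
    const double totalGiB = 16.0;                     // total unified memory
    const double osReservedGiB = 3.0;                 // assumed OS reservation
    const double gameGiB = totalGiB - osReservedGiB;  // ~13 GB for the game

    const double generalGiB = 5.0;                    // assumed CPU-side data
    const double vramLikeGiB = gameGiB - generalGiB;  // ~8 GB left for GPU data

    std::printf("game budget %.0f GiB -> general %.0f GiB, VRAM-like %.0f GiB\n",
                gameGiB, generalGiB, vramLikeGiB);
    return 0;
}
```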
 

Vuze

Member
Oct 25, 2017
4,186
Oh no, an AMD-sponsored game is said to have suspiciously high VRAM requirements. Surely not an attempt to sway potential 3080 buyers.
 
Apr 4, 2018
4,509
Vancouver, BC
Pretty obvious what's going on here. Spoiler: Marketing.

Counterplay Games partnered with AMD for this game. AMD is about to launch cards. NVIDIA's 3080 has 10GB of GDDR6X. [Insert 12GB bullshit marketing here.] Difference is the 3080 runs GDDR6X, not GDDR6. Counterplay never even indicated the type of memory they're referring to. The 3080 won't have an issue running this game unless it's poorly optimized.

I think you are generally right about this being marketing, but if the game was optimized for 12GB of memory at Ultra and doesn't have a suitable option for 10GB cards, it's possible the 3080 could be forced to use High-setting textures, which could look noticeably lower res (probably closer to the PS5 textures).
 

Skyebaron

Banned
Oct 28, 2017
4,416
Godfall is that person everyone wants to shut the fuck up as soon as a syllable is uttered from their rotten mouth. You're this year's Battleborn/Anthem/failure, no one cares.
 

Uhtred

Alt Account
Banned
May 4, 2020
1,340

Tovarisc

Member
Oct 25, 2017
24,407
FIN
What do you reckon that graph looks like on consoles? I'm sure it's larger than 2.3%, but how much larger, I wonder?

Who was it from MS who talked about research conducted in the US on 4K TV adoption rates, and how low they were? Was it Spencer? It was when they were talking about the Series S and why it's targeting sub-4K output.

I tried to google it, but I couldn't find it now to see if there were any numbers.
 

Zutrax

Member
Oct 31, 2017
4,191
I guess I've never really had to think about this before, but are you able to dedicate system RAM to be used as VRAM, as a backup, if your video card runs out of available VRAM? I have 32GB of RAM on my machine, so plenty to spare, presuming that's a feasible task.
 

Nooblet

Member
Oct 25, 2017
13,626
I guess I've never really had to think about this before, but are you able to dedicate system RAM to be used as VRAM, as a backup, if your video card runs out of available VRAM? I have 32GB of RAM on my machine, so plenty to spare, presuming that's a feasible task.
The game does that automatically when it goes over VRAM. That's why the stuttering starts: system RAM is just not fast enough for real-time rendering, and there's a latency issue as well.
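A rough back-of-the-envelope illustration of why that spill stutters; the bandwidth figures (~760 GB/s for the 3080's GDDR6X, ~32 GB/s for a PCIe 4.0 x16 link) are typical published numbers used here as assumptions:

```cpp
// Sketch: fetching the same data from VRAM vs. over the PCIe bus.
#include <cstdio>

int main() {
    const double textureMiB = 64.0;  // e.g. one large streamed texture
    const double vramGBs = 760.0;    // GDDR6X bandwidth on a 3080, GB/s
    const double pcieGBs = 32.0;     // PCIe 4.0 x16 bandwidth, GB/s

    const double texGB = textureMiB / 1024.0;
    std::printf("from VRAM: %.3f ms\n", texGB / vramGBs * 1000.0);  // ~0.08 ms
    std::printf("over PCIe: %.3f ms\n", texGB / pcieGBs * 1000.0);  // ~2 ms

    // A 60 fps frame is ~16.7 ms, so a couple of ~2 ms PCIe fetches (plus
    // round-trip latency) eat a visible chunk of the frame budget: a stutter.
    return 0;
}
```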
 

Uhtred

Alt Account
Banned
May 4, 2020
1,340
Who was it from MS who talked about research conducted in the US on 4K TV adoption rates, and how low they were? Was it Spencer? It was when they were talking about the Series S and why it's targeting sub-4K output.

I tried to google it, but I couldn't find it now to see if there were any numbers.

I'm sure it's super low overall, but it's probably a bit larger among gamers... maybe? Especially since marketing for console games never mentions upscaling, just "4K".
 

fulltimepanda

Member
Oct 28, 2017
5,797
Probably marketing, but I wouldn't be surprised if it was legit. Saying that 10GB would be enough for 4K@Ultra (on PC, no less) over the next couple of years was, and still is, speculation.

Why did Nvidia skimp out on the VRAM in the 3000 series? Did they think AMD would stick with 8 GB of VRAM in Big Navi?

GDDR6X is still relatively new (and expensive). The 3080 has room on the PCB for 12 chips, with 10 in use at the moment, and 2GB GDDR6X chips are due out early next year.

So timelines and cost just didn't match up for a bigger model. They could have done what they did with the 3090 and put chips on the back of the card, but that would have upped the cost substantially.
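The capacity options fall straight out of chips × density; a trivial enumeration using the chip counts and densities mentioned above:

```cpp
// Sketch: GDDR6X capacity = populated chips x per-chip density.
#include <cstdio>

int main() {
    const int chipCounts[] = {10, 12};  // 3080 populates 10 of its 12 PCB slots
    const int densitiesGB[] = {1, 2};   // 1GB chips today, 2GB due next year

    for (int chips : chipCounts)
        for (int gb : densitiesGB)
            std::printf("%2d chips x %dGB = %2dGB\n", chips, gb, chips * gb);
    return 0;
}
```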
 
Jun 17, 2018
3,244
I mean, the game doesn't even look good so why are the requirements so high? There shouldn't be any game in development that recommends a 3080.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
Pretty obvious what's going on here. Spoiler: Marketing.

Counterplay Games partnered with AMD for this game. AMD is about to launch cards. NVIDIA's 3080 has 10GB of GDDR6X. [Insert 12GB bullshit marketing here.] Difference is the 3080 runs GDDR6X, not GDDR6. Counterplay never even indicated the type of memory they're referring to. The 3080 won't have an issue running this game unless it's poorly optimized.
I am going to say - this is probably, quite indeed, partner marketing at work.
I will test this when it comes out, of course. But really... I have been testing a lot of UE4 games and have never seen one use that much VRAM at 4K. I would like to be surprised here.

This just seems like GPU vendor wars to me till it comes out, and not anything to actually cry doom about.

As an example of similar weirdness of partner marketing and VRAM amounts: when the Radeon VII came out, AMD claimed games needed 16 GB of RAM at 4K... their test to prove this was Far Cry 5 at 4K with DRS set to on and with no fps cap. Which makes literally no sense.
Edit: spelling is hrd
 
Oct 27, 2017
9,420
As an example of similar wierness of parnet Marketing and VRAM amounts. When Radwon VII came out AMD claimed games needed 16 GB of RAM at 4K...their testto prove this was Far Cry 5 at 4K with DRS Set to on and with no fps cap. Which makes literally no sense.
All I know is that marketing fucked up somewhere. Radwon would have been a much better named series 😄
 

Deleted member 73264

User requested account closure
Banned
Jun 28, 2020
201
This is weird, they claim it needs that much VRAM due to the size of the textures used, but... why not just limit the texture memory used? You can stream in lower mip levels if there's not enough memory available, and prioritize based on distance to the camera.

It might make sense if they were doing deferred shading and using a bunch of HDR buffers... Forward+ seems more popular nowadays, at least partially due to exploding memory requirements for deferred in 4K. 12GB is still an insane amount, though.
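A minimal sketch of that streaming idea, as a generic distance-based mip picker that degrades under a memory budget (an assumed textbook approach, not Godfall's actual streamer):

```cpp
// Sketch: choose mips by camera distance, then degrade far textures to fit a budget.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct StreamedTexture {
    double distance;   // camera distance to the nearest user of the texture
    int mipCount;      // number of mip levels; mip 0 is full resolution
    double topMipMiB;  // size of mip 0; each further mip is ~1/4 the size
};

// Memory used when mips [firstMip, mipCount) are resident.
double residentMiB(const StreamedTexture& t, int firstMip) {
    double total = 0.0, cur = t.topMipMiB * std::pow(0.25, firstMip);
    for (int m = firstMip; m < t.mipCount; ++m) { total += cur; cur *= 0.25; }
    return total;
}

// Farther objects need fewer texels; log2 of distance is a common proxy.
int desiredMip(const StreamedTexture& t) {
    int mip = static_cast<int>(std::max(0.0, std::log2(t.distance / 4.0)));
    return std::min(mip, t.mipCount - 1);
}

int main() {
    std::vector<StreamedTexture> texs = {
        {2.0, 12, 64.0}, {30.0, 12, 64.0}, {200.0, 12, 64.0}};
    const double budgetMiB = 64.0;  // tiny toy budget

    std::vector<int> mips;
    double total = 0.0;
    for (const auto& t : texs) {
        mips.push_back(desiredMip(t));
        total += residentMiB(t, mips.back());
    }

    // Over budget? Drop one mip at a time on the farthest degradable texture.
    while (total > budgetMiB) {
        int victim = -1;
        for (int i = 0; i < static_cast<int>(texs.size()); ++i)
            if (mips[i] < texs[i].mipCount - 1 &&
                (victim < 0 || texs[i].distance > texs[victim].distance))
                victim = i;
        if (victim < 0) break;  // nothing left to degrade
        total -= residentMiB(texs[victim], mips[victim]);
        ++mips[victim];
        total += residentMiB(texs[victim], mips[victim]);
    }
    std::printf("resident %.1f MiB of %.0f MiB budget\n", total, budgetMiB);
    return 0;
}
```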
 

noeybys

Member
Aug 8, 2020
60
Out of the 16GB of RAM on consoles, I assume ~10GB is used as VRAM.

In which case, console settings and resolution will need 10GB to run well during the heaviest scenes.

I guess higher-than-console settings and resolution, with better ray tracing and higher asset quality, could require 12GB during the heaviest scenes?
 

GreyHand23

Member
Apr 10, 2018
413
This is weird, they claim it needs that much VRAM due to the size of the textures used, but... why not just limit the texture memory used? You can stream in lower mip levels if there's not enough memory available, and prioritize based on distance to the camera.

It might make sense if they were doing deferred shading and using a bunch of HDR buffers... Forward+ seems more popular nowadays, at least partially due to exploding memory requirements for deferred in 4K. 12GB is still an insane amount, though.

Why should they have to limit the texture memory? There are video cards with more than 10 gigs of VRAM, so why not use it? If you don't have that much, you can always lower settings, and it likely won't even look that much worse.
 

dgrdsv

Member
Oct 25, 2017
11,850
They will need to make huge strides in omitting all optimizations for that to be remotely close to reality. Not saying it's impossible, since it's an AMD-sponsored game and these are quite often fubared technically by marketing influence. Huge strides, though.
 

etta

Banned
Oct 27, 2017
3,512
I am going to say - this is probably, quite indeed, partner marketing at work.
I will test this when it comes out, of course. But really... I have been testing a lot of UE4 games and have never seen one use that much VRAM at 4K. I would like to be surprised here.

This just seems like GPU vendor wars to me till it comes out, and not anything to actually cry doom about.

As an example of similar weirdness of partner marketing and VRAM amounts: when the Radeon VII came out, AMD claimed games needed 16 GB of RAM at 4K... their test to prove this was Far Cry 5 at 4K with DRS set to on and with no fps cap. Which makes literally no sense.
Edit: spelling is hrd
Anyone who knows anything about VRAM in games would know this is bullcrap; you shouldn't need 12GB for any reason.
 

asmith906

Member
Oct 27, 2017
27,371
I am going to say - this is probably, quite indeed, partner marketing at work.
I will test this when it comes out, of course. But really... I have been testing a lot of UE4 games and have never seen one use that much VRAM at 4K. I would like to be surprised here.

This just seems like GPU vendor wars to me till it comes out, and not anything to actually cry doom about.

As an example of similar weirdness of partner marketing and VRAM amounts: when the Radeon VII came out, AMD claimed games needed 16 GB of RAM at 4K... their test to prove this was Far Cry 5 at 4K with DRS set to on and with no fps cap. Which makes literally no sense.
Edit: spelling is hrd
Is there a technical reason why games targeting next-gen consoles won't make VRAM usage go up if you're trying to play at the highest settings in 4K? I always figured VRAM usage was low in games since they still had to target the base consoles, which used the equivalent of a 7870 GPU.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,203
Dark Space
They will need to make huge strides in omitting all optimizations for that to be remotely close to reality. Not saying it's impossible, since it's an AMD-sponsored game and these are quite often fubared technically by marketing influence. Huge strides, though.
That would be some shit, sabotage your own game for the marketing deal.


Is there a technical reason why games targeting next-gen consoles won't make VRAM usage go up if you're trying to play at the highest settings in 4K? I always figured VRAM usage was low in games since they still had to target the base consoles, which used the equivalent of a 7870 GPU.
VRAM usage will go up. But please establish the baseline: where has it been, and by how much will it actually increase?

Now we get into allocation vs utilization, and people wrongly assuming how much VRAM we've been using for the last 5 years. Furthermore, what tools are the developers using and which number are they reporting in the system requirements?
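For what it's worth, here's a minimal sketch of where that allocation-vs-utilization gap shows up on Windows: DXGI reports the OS-granted budget and the process's current allocation, and neither number says how much memory the GPU actually touches per frame. (Win32-specific; link against dxgi.lib.)

```cpp
// Sketch: querying the VRAM budget vs. what a process has allocated (Windows/DXGI).
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    IDXGIAdapter1* adapter = nullptr;
    IDXGIAdapter3* adapter3 = nullptr;
    if (FAILED(factory->EnumAdapters1(0, &adapter)) ||
        FAILED(adapter->QueryInterface(IID_PPV_ARGS(&adapter3)))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    // Local segment = on-card VRAM. Budget is what the OS will let this process
    // keep resident; CurrentUsage is what it has allocated right now. Neither
    // is "how much the GPU touched this frame".
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    const double gib = 1024.0 * 1024.0 * 1024.0;
    std::printf("budget %.2f GiB, allocated %.2f GiB\n",
                info.Budget / gib, info.CurrentUsage / gib);

    adapter3->Release(); adapter->Release(); factory->Release();
    return 0;
}
```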
 

tusharngf

Member
Oct 29, 2017
2,288
Lordran
NVIDIA-sponsored game: wow, next gen is here, 12GB VRAM. What a time to be alive.
AMD-sponsored game: what a load of bullshit... let's wait for some benchmarks.
 

Xiaomi

Member
Oct 25, 2017
7,237
"Uses" is a vague term. It could mean "allocates" which is fine, a lot of games allocate the max vram a gpu has as "usable," or it could mean uses uses, which would be pretty unique.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,203
Dark Space
NVIDIA-sponsored game: wow, next gen is here, 12GB VRAM. What a time to be alive.
AMD-sponsored game: what a load of bullshit... let's wait for some benchmarks.
Yes, I'm sure this is how the discussion plays out when an NVIDIA-sponsored game dev is reading a teleprompter of PR jargon.

People never then shift the discussion to a touted feature like, uh, ray tracing stomping the performance by 70%.
 