
Zoyos

Banned
Oct 30, 2017
322
Your link doesn't work for me.

As someone who has spent a fair amount of time fixing code to try to reduce cache misses I hope you're right but you'll have to forgive me until I profile something and see for myself...

That's odd. Posted from mobile. This is from an electrical engineering perspective, so it will be an other-side-of-the-fence take.

Anyway, my main point there is that on-die cache throughput, across multiple levels, is equal to or even several multiples of the max theoretical throughput of off-die memory. That can only happen with incredibly high cache hit ratios, usually in the realm of 97% at L1 alone.

There is, theoretically, a lower bound for the average hit rate, proportional to the number of instructions in the set, the on-die logic size, and a logarithmic function.

Cache misses are extremely costly in terms of latency. Nothing wastes more modern processing capability than having the CPU idle for several hundred cycles while waiting for off-die data. That kind of optimization work is unending, and yet it really makes the most of these CPU designs.
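To put rough numbers on that, here is the standard average-memory-access-time calculation. All latency figures are illustrative round numbers, not measurements of any particular CPU:

```python
# Average memory access time (AMAT) as a function of cache hit rate.
# Latency figures are illustrative round numbers, not measurements.

L1_HIT_CYCLES = 4        # typical L1 load-to-use latency
MISS_CYCLES = 300        # off-die DRAM access: several hundred cycles

def amat(hit_rate: float) -> float:
    """Average cycles per memory access for a single-level cache model."""
    return hit_rate * L1_HIT_CYCLES + (1.0 - hit_rate) * MISS_CYCLES

# Even a few percent of misses dominates the average:
for rate in (0.99, 0.97, 0.90):
    print(f"hit rate {rate:.0%}: {amat(rate):.2f} cycles on average")
```

At a 97% hit rate the handful of misses already accounts for most of the average latency, which is why shaving even a point or two off the miss rate pays off so disproportionately.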
 

DuvalDevil

Member
Nov 18, 2020
4,176
I finally picked this up on Series X, and must say, I'm really impressed with how great this game looks, runs, and feels to play. It plays as smooth as silk.

It's actually probably one of the nicest looking games I've played across both systems. One really noticeable aspect is the incredibly high resolution textures, which is something I'm not seeing in any other titles, even Demon's Souls (which focuses on high poly counts).

Yep, it's great. That new version looks and plays really good.
 

Dartastic

One Winged Slayer
Banned
Oct 25, 2017
3,779
I just got to a city in the north of England. Bigger city. Frame rate is baaaaad on ps5.
 

Tragicomedy

Avenger
Oct 25, 2017
4,310
The reason it sucks is that the XSX, on paper, should not have to run at a lower res than the PS5.

I don't own a PS5, but the reviewer and others in here are mentioning tearing being an issue on the console. The lower resolution has made it all but disappear on the XSX. So it sounds like they should implement a similar lower threshold on the PS5 as well.

Having the resolution drop for a second during a more demanding scene is imperceptible to my eye, especially compared to tearing.
 

Absolute

Banned
Nov 6, 2017
2,090
I'm glad I didn't bite when I got offered a series s. These digital foundry articles have been pretty eye opening. Will wait for series x to become available.
 

TheHunter

Bold Bur3n Wrangler
Banned
Oct 25, 2017
25,774
Well, it's not a good start for MS, to be honest. MS cares less because for them it's not about hardware but Game Pass subs. But we like to compare hardware, and at the moment the XSX is looking a little impotent. MS didn't come out swinging and had no showcase or tentpole games.

Also the sales figures we're hearing have PS5 blitzing XSX again.


I'll take 1080@60 any day over 4K@30
Y'all need Jesus.
 

Spish!

Member
Oct 27, 2017
571
That DF video I watched this morning seems to dispel nearly all the hysteria we have spent 10 pages talking about re: PS5 performance or am I missing something? Seems a shame the XSX has a slightly lower minimum dynamic rez, but it also doesn't seem like a huge deal?
Yeah, it's not a huge deal perceptually. The resolution drop isn't constant and resolves itself quickly enough that people shouldn't notice it.

It's just for those that are into hardware where it matters. In that context it's a bigger deal since PS5's lowest resolution is 46% higher than the Series X's lowest for roughly identical performance.
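For reference, the 46% figure is a pixel-count comparison between the two dynamic-resolution floors. A quick sketch of the arithmetic, using placeholder resolutions rather than the actual measured floors from the analysis:

```python
# Percent difference in pixel count between two dynamic-resolution floors.
# The resolutions below are placeholder values chosen to show the
# arithmetic, not the measured floors from the Digital Foundry analysis.

def pixels(width: int, height: int) -> int:
    return width * height

def percent_higher(a: int, b: int) -> float:
    """How much larger pixel count a is than b, in percent."""
    return (a / b - 1.0) * 100.0

floor_a = pixels(2560, 1440)   # hypothetical console A floor (1440p)
floor_b = pixels(2119, 1192)   # hypothetical console B floor

print(f"{percent_higher(floor_a, floor_b):.0f}% more pixels")
```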
 

nib95

Contains No Misinformation on Philly Cheesesteaks
Banned
Oct 28, 2017
18,498
So I've played several hours on PS5 and it ran butter smooth. I don't understand the posts saying PS5 frame rate is bad.

disc version

I think there's some issue (some have theorised a memory leak) where for some the game needs a hard console and game restart after the patch, in order to get optimal performance.

Even John from Digital Foundry mentioned that till he restarted his system/game following the patch, he was getting worse performance. That only after the restart did he get performance that was pretty much as it was pre-patch, outside of a handful of drops in mostly cutscenes.
 

Calvin

Member
Oct 26, 2017
1,581
The reason it sucks is that the XSX, on paper, should not have to run at a lower res than the PS5.
Yeah, I get it completely. I am bummed in general - the XSX looks worse even on my OLED than my PC does at highest settings @1440p. Definitely not saying it's an amazing job, just kinda commenting on the story.
 

Calvin

Member
Oct 26, 2017
1,581
I think the hysteria is a reaction to the apparent misconception that XSX was supposed to show a clear performance advantage over PS5 even at launch. When that didn't manifest, and considering that many seem to see that multiplatform advantage as a defining feature of XB platforms, the community went off the deep end a bit in overreaction. With users needing to justify the purchase of their new $500 box, with no 1st party launch games, and only features and potential to fall back on, the reaction was understandable, but still definitely over the top.
Yeah, I guess I understand the reactions from everyone, I just think it's so early in the lifecycle that absent any specific optimizations (à la Sony first party titles) we are all just kind of screaming into the wind. I don't think I expressed my thoughts well - I understand why people are aggravated about it, and as someone with both consoles who would prefer to play multiplats on Xbox for a variety of reasons, I too wish it performed as the best version - which I would have expected it to. I just figured we were way too early to get too worried about these differences.
 

dgrdsv

Member
Oct 25, 2017
11,848
The next-gen promise is (X)fps and double (X)fps. So basically you either have games that run at 30fps and also have a 60fps mode, or games that run at 60fps and also have a 120fps mode.
I don't think that there ever was such a promise. The reason we have these modes now is the crossgen period. Games are targeting previous gen h/w, which allows the devs to run them at either higher res or higher fps on the new h/w. Once games stop targeting the XBO/PS4 baseline it won't be as easy to run them in these either-or modes.

And that has more to do with the CPU than the GPU.
Not really. Doing twice the fps will incur more or less the same load increase on the GPU as doing twice the resolution.
The CPU just doesn't scale as well as the GPU does, because with the GPU it's kinda easy to get to double fps by going with half the resolution.
With the CPU you need to do a hell of a lot of optimization of your rendering and game logic code to hit 16ms if your CPU target was 33ms previously.
So again, what we see now is the result of games being crossgen, targeting 33ms on old Jaguars. Bringing this same CPU code over to Zen 2 results in it running in 16ms without much optimization needed. Once games start targeting 33ms on the newer consoles' Zen 2 it won't be nearly as easy to get the same code to run at 16ms on the same CPU - similarly to how it was with PS4/XBO really.
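Putting rough numbers on the frame budgets involved (round targets from the discussion, not profiled data):

```python
# Frame-time budgets for common fps targets, plus the "halve the
# resolution to double the fps" shortcut that works for GPU-bound
# (but not CPU-bound) frames. All figures are illustrative.

def frame_budget_ms(fps: int) -> float:
    """Milliseconds available per frame at a given fps target."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")

# GPU cost scales roughly with pixel count, so halving the pixels
# roughly halves a GPU-bound frame time:
gpu_ms_at_full_res = 30.0            # hypothetical GPU-bound frame time
print(f"GPU: {gpu_ms_at_full_res} ms -> {gpu_ms_at_full_res / 2} ms at half the pixels")

# CPU cost is resolution-independent: game logic that takes 33 ms
# still takes ~33 ms at any resolution, capping the game at 30 fps
# until the code itself is optimized down toward 16 ms.
```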

and I don't get what you are really getting at... have you seen PC GPUs running RT-based games without DLSS?
Have you seen UE5 demo running in 1440p/30 on PS5 though? It didn't use any RT even. It's a good example of where things are heading.

Unless developers take a stance and focus on 60FPS, no amount of power will prevent this from happening. This is not a hardware problem. It's a developer's choice to go after 60, 120, 240FPS.
Of course. This has been the case with all console gens. The problem is that if you "take a stance" and target 60 for your next gen graphics you're basically staying at current gen graphics but with twice the fps. This isn't good enough for many devs and gamers.
 

arsene_P5

Prophet of Regret
Member
Apr 17, 2020
15,438
The problem is that if you "take a stance" and target 60 for your next gen graphics you're basically staying at current gen graphics but with twice the fps
Nah and GT7/FM for example will show that. We even have seen early GT7 gameplay and T10 already said the track detail shown in the in engine trailer is what we can expect. I think you are underestimating the consoles. It's difficult to compare GTS with GT7 due to the track not being available in GTS, but we can compare GTS (PS4 Pro) vs FM (in-engine).

ps_messages_20200919_1xklc.jpg

neuebitmap28wjoy.png
 

dgrdsv

Member
Oct 25, 2017
11,848
Nah and GT7/FM for example will show that. We even have seen early GT7 gameplay and T10 already said the track detail shown in the in engine trailer is what we can expect. I think you are underestimating the consoles. It's difficult to compare GTS with GT7 due to the track not being available in GTS, but we can compare GTS (PS4 Pro) vs FM (in-engine).

ps_messages_20200919_1xklc.jpg

neuebitmap28wjoy.png
GT Sport is a 60 fps game on PS4.
FM7 is a 60 fps game on XBO.
Of course you will get a graphical upgrade going to PS5/XS in these while staying at the same fps target.
 

arsene_P5

Prophet of Regret
Member
Apr 17, 2020
15,438
GT Sport is a 60 fps game on PS4.
FM7 is a 60 fps game on XBO.
Of course you will get a graphical upgrade going to PS5/XS in these while staying at the same fps target.
Yep, bad example then. But I think even former 30FPS games will look a lot better at 60FPS on next gen. AC never was that example, tho. Look at Black Flag on PS3 vs PS4 at launch. I think the game didn't even run at 60FPS due to the poor Jaguar on PS4/X1. It looked better, but just like Valhalla, not significantly so.
 

Argyle

Member
Oct 25, 2017
1,054
That's odd. Posted from mobile. This is from an electrical engineering perspective, so it will be an other-side-of-the-fence take.

Anyway, my main point there is that on-die cache throughput, across multiple levels, is equal to or even several multiples of the max theoretical throughput of off-die memory. That can only happen with incredibly high cache hit ratios, usually in the realm of 97% at L1 alone.

There is, theoretically, a lower bound for the average hit rate, proportional to the number of instructions in the set, the on-die logic size, and a logarithmic function.

Cache misses are extremely costly in terms of latency. Nothing wastes more modern processing capability than having the CPU idle for several hundred cycles while waiting for off-die data. That kind of optimization work is unending, and yet it really makes the most of these CPU designs.

I actually have an EE background as well, I just never really worked on hardware once I graduated...

I more or less agree as far as the instruction cache. CPUs these days have all kinds of crazy speculative execution and stuff, I don't think anyone tries to manually optimize for the instruction cache anymore.

I actually went and looked up what the current best practices are for optimizing for the data cache, and well...it's more or less the same as it's always been. Here's an AMD presentation covering Zen 2 from May of this year. (should be timestamped to the relevant part, if not, skip to 23:09)

 

dgrdsv

Member
Oct 25, 2017
11,848
Yep, bad example then. But I think even former 30FPS games will look a lot better at 60FPS on next gen. AC never was that example, tho. Look at Black Flag on PS3 vs PS4 at launch. I think the game didn't even run at 60FPS due to the poor Jaguar on PS4/X1. It looked better, but just like Valhalla, not significantly so.
It's hard to tell why AC4BF didn't at least try to run at 60 on PS4/XBO back at their launch. It is possible that the CPUs were too small of an upgrade over PS3/360, but it is also possible that the GPUs weren't enough for that either - at least not while increasing the resolution from 720p to 1080p as well. This gen transition is a bit different in how it handles resolutions, as can be seen from many crossgen titles not actually going above what Pro/XBX already had - which alone allows them to target a higher fps instead.
 

Jawmuncher

Crisis Dino
Moderator
Oct 25, 2017
38,397
Ibis Island
Multiple Videos for Assassin's Creed Valhalla but Digital Foundry could never give me a Resident Evil 5 & 6 retail switch comparison to the demo.
Still salty.
 

plow

Member
Oct 28, 2017
4,641
Game runs butter smooth for me on PS5 even after the update. No frame drops or anything.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
I don't think that there ever was such a promise. The reason we have these modes now is the crossgen period. Games are targeting previous gen h/w, which allows the devs to run them at either higher res or higher fps on the new h/w. Once games stop targeting the XBO/PS4 baseline it won't be as easy to run them in these either-or modes.
You don't know this. And there isn't even evidence to suggest otherwise. If anything, there is evidence to suggest what I am saying. Demon's Souls on the PS5 is exclusive to the PS5 - so not targeting last-gen hardware - and it does exactly what I am saying: a 30fps mode and a 60fps mode. And every game released so far except Watch Dogs, I think, has both performance targets, a fidelity and a performance mode. Hell, there's even a global setting for that in the PS UI. So surely there is more to support what I am saying, based on what is happening now, than what you are saying based on what you THINK will happen in the future.
Not really. Doing twice the fps will incur more or less the same load increase on the GPU as doing twice the resolution.
The CPU just doesn't scale as well as the GPU does, because with the GPU it's kinda easy to get to double fps by going with half the resolution.
With the CPU you need to do a hell of a lot of optimization of your rendering and game logic code to hit 16ms if your CPU target was 33ms previously.
So again, what we see now is the result of games being crossgen, targeting 33ms on old Jaguars. Bringing this same CPU code over to Zen 2 results in it running in 16ms without much optimization needed. Once games start targeting 33ms on the newer consoles' Zen 2 it won't be nearly as easy to get the same code to run at 16ms on the same CPU - similarly to how it was with PS4/XBO really.
All that just confirms what I said: that being able to double framerates has more to do with the CPU than the GPU. As you have clearly said, doubling fps as far as the GPU is concerned can be done by simply dropping the resolution. They can go from dynamic 4K with a floor of 1440p to dynamic 1440p with a floor of 1080p (just an example). But as you have said, more work has to be done on the CPU side of things. And for last-gen, that would've been impossible if the game logic was already taxing the CPU at 30fps, but that's not likely to be the case now. I simply can't see any kind of game taxing these current gen consoles to the point where they can't handle 16ms game logic. And that is primarily why we see fidelity and performance modes in everything so far except Watch Dogs.
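The pixel math behind that example backs it up - dropping one resolution tier more than halves the GPU's shading load (standard resolutions, nothing measured):

```python
# Pixel-count ratios for the resolution drop described above:
# dynamic 4K (floor 1440p) -> dynamic 1440p (floor 1080p).

RES = {
    "4K":    3840 * 2160,
    "1440p": 2560 * 1440,
    "1080p": 1920 * 1080,
}

# Dropping one tier more than halves the pixels the GPU must shade,
# which is what makes a GPU-bound 30 fps mode feasible at 60 fps:
print(f"4K / 1440p    = {RES['4K'] / RES['1440p']:.2f}x")
print(f"1440p / 1080p = {RES['1440p'] / RES['1080p']:.2f}x")
```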

Have you seen UE5 demo running in 1440p/30 on PS5 though? It didn't use any RT even. It's a good example of where things are heading.
Not at all man...come on, that's a demo, on unfinished hardware. Even the engine is not yet released. It's not a clear indication of where things are heading, just a clear indication of what is possible.
Of course. This has been the case with all console gens. The problem is that if you "take a stance" and target 60 for your next gen graphics you're basically staying at current gen graphics but with twice the fps. This isn't good enough for many devs and gamers.
This is just wrong. I won't consider Demon's Souls or SM as staying at current-gen graphics. Like come on, I know you know better than that. We have been doing these dances for almost a year lol.
 

dgrdsv

Member
Oct 25, 2017
11,848
I simply can't see any kind of game taxing these current gen consoles to the point where they can't handle 16ms game logic.
Really? Most people crave more interactive open world games with better AI, NPC animations, them being more lifelike, etc.
It will be really easy to spend all 8 (or 7, as 1 is reserved I think?) Zen 2 cores on game logic and on stuff which is related to rendering but mostly done on the CPU, like better physics and collision detection.
The fact that you're not seeing last gen games doing this isn't because it's somehow impossible for the devs to fill up these Zen 2 cores to make them spend 33ms on each frame. It's because these very same games must run on XBO/PS4 right now.

Not at all man...come on, that's a demo, on unfinished hardware. Even the engine is not yet released. It's not a clear indication of where things are heading, just a clear indication of what is possible.
What is possible is a clear indication of where things are heading.
Yes, it's a demo and there will likely be a lot of optimizations done to this renderer so it will probably be able to reach 4K or 60 fps or even both maybe.
It isn't really using any of the new GPU h/w tech which is present in new consoles though and this tech can make it run worse than this demo did.
So it is completely possible that the 1440p/30 target which they've shown will in fact remain - if only for the projects which will specifically want to hit some incredible graphics complexity level but still, this is usually what AAA games are aiming at.

I won't consider demon souls, SM as staying at current-gen graphics.
Can you give me an example of what these games do in their 60 fps modes which is next gen in graphics?
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
It's hard to tell why AC4BF didn't at least try to run at 60 on PS4/XBO back at their launch. It is possible that the CPUs were too small of an upgrade over PS3/360, but it is also possible that the GPUs weren't enough for that either - at least not while increasing the resolution from 720p to 1080p as well. This gen transition is a bit different in how it handles resolutions, as can be seen from many crossgen titles not actually going above what Pro/XBX already had - which alone allows them to target a higher fps instead.
Yep... it's pretty obvious to me why ACBF was what it was, and for the theories you have given: the CPU jump wasn't enough, and the GPU wasn't enough to go from 720p to 1080p and still double the framerate.

But I feel these current-gen consoles will not have that problem. At the very worst, we will have 4K@30fps (or dynamic 4K), but the CPUs in these consoles are more than capable of also running that at 60fps; the question is just by how much they would need to drop the resolution. To me, it's going to be a clear choice: dynamic 4K@30fps/60fps and dynamic 1440p@60fps/120fps respectively. The hardware in these consoles fits perfectly with some slight cuts or optimizations. We are already seeing something very like this with the Series S.

Unless you believe that the Zen 2 CPU in these consoles can't handle games above 30fps... if that's the case then even PCs will struggle too. Cause not everyone has 12-16 core CPUs, you know?


Really? Most people crave more interactive open world games with better AI, NPC animations, them being more lifelike, etc.
It will be really easy to spend all 8 (or 7, as 1 is reserved I think?) Zen 2 cores on game logic and on stuff which is related to rendering but mostly done on the CPU, like better physics and collision detection.
The fact that you're not seeing last gen games doing this isn't because it's somehow impossible for the devs to fill up these Zen 2 cores to make them spend 33ms on each frame. It's because these very same games must run on XBO/PS4 right now.


What is possible is a clear indication of where things are heading.
Yes, it's a demo and there will likely be a lot of optimizations done to this renderer so it will probably be able to reach 4K or 60 fps or even both maybe.
It isn't really using any of the new GPU h/w tech which is present in new consoles though and this tech can make it run worse than this demo did.
So it is completely possible that the 1440p/30 target which they've shown will in fact remain - if only for the projects which will specifically want to hit some incredible graphics complexity level but still, this is usually what AAA games are aiming at.


Can you give me an example of what these games do in their 60 fps modes which is next gen in graphics?
COD does RT in its 60fps mode? That's not next-gen? Better texture quality and all-round more complex geometry? That's not next-gen?

Anyways, let's just agree to disagree.

I believe fidelity and performance targets are going to be here to stay for the entirety of the generation. Even if at some point that means fidelity is 1440p native with TAA upsampling to 4K, and performance means 900p native with TAA upsampling to 1080p/1152p. I also believe that at no point this gen will the CPU in these new consoles be a bottleneck to such a degree that 30fps is the best they can do.

In time we will see which of us is right.
 

arsene_P5

Prophet of Regret
Member
Apr 17, 2020
15,438
So it is completely possible that the 1440p/30 target which they've shown will in fact remain - if only for the projects which will specifically want to hit some incredible graphics complexity level but still, this is usually what AAA games are aiming at.
For what it's worth, Epic wants to achieve 60FPS in the UE5 demo, which is impressive on a console imo.
 

Kinan

Banned
Oct 26, 2017
648
I'm bored and want to check out this game even though my PS5/SX consoles haven't even been shipped to me.

Can I buy the game on PS4 Pro and keep playing on PS5? Will I play the game in back-compat mode or will it be the 4k/60fps version? I read somewhere that there is cross-platform saves either way, but I guess PS+ covers that.

It worked for me, I started on PS4, transferred the saves to the cloud and then was able to continue on PS5. Reverse does not work though, PS5 saves get uploaded to the special PS5 folder in the cloud, which PS4 does not have access to, so you can not switch back.
 

dgrdsv

Member
Oct 25, 2017
11,848
Unless you believe that the Zen 2 CPU in these consoles can't handle games above 30fps...
Any CPU is able to handle games above 30 fps. It's the developer's choice which fps the game targets, since you can literally do twice as much work per frame if you target half the fps.

COD does RT in its 60fps mode?
COD runs at 60 fps on PS4/XBO. You can spend your next gen GPU power on adding RT to that - instead of increasing resolution or going to 120 fps, for example.
Also COD's RT is relatively "lite" on performance, as it's just shadows, and not even all of them. You do something more with RT and you will dip down to 30 very fast.

For what it's worth, Epic wants to achieve 60FPS in the UE5 demo, which is impressive on a console imo.
Sure. And as I've said, they will likely get to 1440p/60 or 4K/30 at the very least with this demo.
But the demo itself isn't using the h/w to its maximum yet either.
 

Zoyos

Banned
Oct 30, 2017
322
I actually have an EE background as well, I just never really worked on hardware once I graduated...

I more or less agree as far as the instruction cache. CPUs these days have all kinds of crazy speculative execution and stuff, I don't think anyone tries to manually optimize for the instruction cache anymore.

I actually went and looked up what the current best practices are for optimizing for the data cache, and well...it's more or less the same as it's always been. Here's an AMD presentation covering Zen 2 from May of this year. (should be timestamped to the relevant part, if not, skip to 23:09)

I wrote a bit identifying the difference between instruction and data caches in that other post I linked.

The power law of cache misses, and where it begins to fail, is the reason why caches are the size they are in relation to an average program's requirements for data and instructions. Hence why L4 cache, or equivalent on-die RAM, has been unsuccessful at delivering performance increases for most programs.

Data reuse leads to a very similar hit rate in higher level caches in many programs. Hit rates just plateau out beyond a certain size, offering no real benefit for the datatypes commonly used.
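A rough sketch of that power-law/plateau behavior. All constants here are purely illustrative (alpha = 0.5 is the classic rule-of-thumb exponent, not a measured value for any real CPU):

```python
# The power law of cache misses: miss rate falls roughly as a power of
# cache size (miss ~ size^-alpha, alpha often taken near 0.5, the classic
# "sqrt(2) rule"), until the working set fits and the curve plateaus.
# All constants are illustrative, not measurements of any real CPU.

def miss_rate(size_kb: float, base_miss: float = 0.10,
              base_size_kb: float = 32.0, alpha: float = 0.5) -> float:
    """Power-law model of miss rate at a given cache size."""
    return base_miss * (base_size_kb / size_kb) ** alpha

for size in (32, 64, 128, 256, 1024):
    print(f"{size:>5} KB cache -> {miss_rate(size):.3%} miss rate")

# Each doubling buys less: 32 -> 64 KB removes ~2.9 points of miss rate,
# while 512 -> 1024 KB removes well under one point. That diminishing
# return is why ever-larger caches (e.g. an on-die L4) often fail to pay
# for their cost.
```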
 
Apr 4, 2018
4,509
Vancouver, BC
I've been snapping pics of the Series X version. Still really impressed with how much better this game looks than I expected it to. Even if there are "cross-gen" aspects to it, it definitely has many next-gen traits, like some of the best and highest resolution texture work I've seen on a console, silky smooth 60fps, and some great use of lighting and screen-space reflections.

These were all taken in performance mode, so Dynamic 4K.
lgr6NE7.jpeg

bs0v3qm.jpeg

mkoc49K.jpeg

xn6Ekut.jpeg

t3WEY5q.jpeg

RQ0zRPd.jpeg
 

arsene_P5

Prophet of Regret
Member
Apr 17, 2020
15,438
They just edited the html element on Twitter to say what they wanted. People are absolutely insane.
Well, less insane than I thought. I expected that someone had hacked their account, which would be even worse. Anyway... Still bullshit behavior and crazy. Unbelievable what these guys have to deal with. That's why we can't have nice things :/
 

k0fighter

Member
Oct 25, 2017
2,330
It worked for me, I started on PS4, transferred the saves to the cloud and then was able to continue on PS5. Reverse does not work though, PS5 saves get uploaded to the special PS5 folder in the cloud, which PS4 does not have access to, so you can not switch back.
I think you have this wrong. I'm pretty sure saves can go back and forth. It just needs to have the cloud save icon within the save files in the ACV game, not PS+ saves, in case that's what you're talking about.

I double checked tonight and you can definitely go back and forth between PS4 & PS5 freely. Just make sure once you've saved that you see the cloud save icon in the lower right of your save file. If you don't see it, try saving over and over until you do, or exiting to title screen, going back in and saving again.
 