
Betty

Banned
Oct 25, 2017
17,604
So GitHub was right all along. If BC forced Sony into this design as suspected, that's both incredibly poor planning and ridiculous, given that AFAIK not all PS4 games are even compatible. With the ridiculously high GPU clocks, the same RAM, and the SSD, I can't imagine it's substantially cheaper to manufacture than the XSX.

This is an unmitigated disaster; honestly, they should have left off BC if this was the cost.

You're being a tad hyperbolic.
 

residentgrigo

Banned
Oct 30, 2019
3,726
Germany
The lack of pics is what it is, but the rest is what you usually get from a GDC talk. You now know whether GDC streams are for you or not. I give this presentation a C+. It just ends two-thirds of the way through. A bit
 

lupinko

Member
Oct 26, 2017
6,154
The GAME Watch article just references the slides and says the PS5 is PS4 backwards compatible. It doesn't mention 100 compatible games or 100 PS5-boosted games, which is still a matter of debate and confusion given the unclear messaging from both Cerny and the US blog.

twitter.com

GAME Watch on Twitter

“PlayStation 5 ensures compatibility with the PS4 https://t.co/diP9NZITh5 #PS5”
 

Respawn

Member
Dec 5, 2017
780
It seems like neither side fucked up this time around in terms of the hardware. We don't have a situation where Sony spent tons of money on a proprietary CPU and forced an expensive disc drive on us, or MS going for anti-consumer features and forcing an expensive camera on us. Both will bring a pretty powerful system to the table that provides great value compared to the current PC market.

MS just did it a little better, but I see no reason to trash the PS5 or demand a $400 price just because it's weaker. I think the PS5 will cost $500 and be worth every penny. I'm predicting the Xbox will cost $600, but if it's the same price then I applaud MS for providing even better value.
But the PS5 isn't weaker. This isn't what you think.
 

aevanhoe

Slayer of the Eternal Voidslurper
Member
Aug 28, 2018
7,316
I expect some form of DLSS for both consoles to reach 4K for most titles.

But why? I have a nice, big 4K OLED - I barely notice the difference between 1440p and 4K and can't notice it at all between checkerboard 4K and "real" 4K. Heck, 1080p with good AA can look just as good, sometimes. 4K is still very, very demanding and not that noticeable compared to 1440p or checkerboard 4K - that power could be used elsewhere.
 

GameAddict411

Member
Oct 26, 2017
8,509
But why? I have a nice, big 4K OLED - I barely notice the difference between 1440p and 4K and can't notice it at all between checkerboard 4K and "real" 4K. Heck, 1080p with good AA can look just as good, sometimes. 4K is still very, very demanding and not that noticeable compared to 1440p or checkerboard 4K - that power could be used elsewhere.
Because DLSS delivers incredible image quality for less performance impact? If Sony actually developed similar tech, the performance gap between the consoles would be diminished. Nvidia is also developing a more general DLSS that they implemented in Wolfenstein: Youngblood, and the image quality is excellent. We could see it at the driver level one day.
 

Adum

Member
May 30, 2019
922
Cerny was very specific about 825GB being the sweet spot for the SSD's performance. I wonder if that means they won't have a higher-capacity SKU down the line.
 

vivftp

Member
Oct 29, 2017
19,744
Cerny was very specific about 825GB being the sweet spot for the SSD's performance. I wonder if that means they won't have a higher-capacity SKU down the line.

It was the sweet spot while keeping the price of the console in mind. As time goes on and prices come down, there's no reason we wouldn't see higher-capacity SKUs eventually. Cerny even noted they could have gone for a larger capacity.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,929
Berlin, 'SCHLAND
I think the GPU and CPU clocking thing will work like this; I have it on a good hunch:

Devs will prioritise modes like a higher-clocked GPU with a lower-clocked CPU, or the other way around. That is basically what the presentation says: they will have trouble hitting full speeds on both parts simultaneously due to utilisation (which is a bit obvious), so one part being higher clocked requires an under-utilisation of the other. So a game that is mostly GPU limited will use a GPU mode, while a very intense open-world game or some other design requiring more CPU will use a CPU mode, underclocking the GPU there.

This of course assumes that games do not really heavily utilise both parts at the same time, which some do: a 60 fps Assassin's Creed game as we see on PC, a game with variable DRS and a lot of CPU work, or any ambitious game that wants to do both simulation and graphical things.

One part being higher clocked requires an under-utilisation of the other. Hence why free-floating resolution games with very precise dynamic resolution scaling, like Doom, Titanfall 2, Modern Warfare, etc., will all probably need to be in the mode for GPU power. They are already maxing the GPU as is due to their design.
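To make the "pick a mode" idea above concrete, here is a minimal sketch of what a developer-facing choice could look like. Every name and the lowered clock numbers are invented for illustration; only the 2.23 GHz GPU and 3.5 GHz CPU caps are public figures, and nothing here comes from Sony's actual SDK.

Code:
# Hypothetical sketch of the dev-selected power-priority idea described above.
# The reduced clocks are made-up placeholders; only the caps are public numbers.
GPU_PRIORITY = {"gpu_mhz": 2230, "cpu_mhz": 3200}  # GPU pinned at its cap, CPU concedes headroom
CPU_PRIORITY = {"gpu_mhz": 2000, "cpu_mhz": 3500}  # CPU pinned at its cap, GPU concedes headroom

def pick_profile(frame_is_gpu_bound: bool) -> dict:
    # A mostly GPU-limited renderer keeps the GPU mode; a simulation-heavy
    # open-world game would take the CPU mode instead.
    return GPU_PRIORITY if frame_is_gpu_bound else CPU_PRIORITY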
 

Raide

Banned
Oct 31, 2017
16,596
I think the GPU and CPU clocking thing will work like this; I have it on a good hunch:

Devs will prioritise modes like a higher-clocked GPU with a lower-clocked CPU, or the other way around. That is basically what the presentation says: they will have trouble hitting full speeds on both parts simultaneously due to utilisation (which is a bit obvious), so one part being higher clocked requires an under-utilisation of the other. So a game that is mostly GPU limited will use a GPU mode, while a very intense open-world game or some other design requiring more CPU will use a CPU mode, underclocking the GPU there.

This of course assumes that games do not really heavily utilise both parts at the same time, which some do: a 60 fps Assassin's Creed game as we see on PC, a game with variable DRS and a lot of CPU work, or any ambitious game that wants to do both simulation and graphical things.

One part being higher clocked requires an under-utilisation of the other. Hence why free-floating resolution games with very precise dynamic resolution scaling, like Doom, Titanfall 2, Modern Warfare, etc., will all probably need to be in the mode for GPU power. They are already maxing the GPU as is due to their design.
Thanks for this. I thought it would be for this reason. Many games this gen were CPU bound, so having a system where devs choose what to push depending on what they need makes sense. I guess this is great for first parties, who can tweak the settings to what they need, but will third parties really want to cater to both a fluid system and a fixed one? Also, I wonder how SMT will play into this.
 

DukeBlueBall

Banned
Oct 27, 2017
9,059
Seattle, WA
I think the GPU and CPU clocking thing will work like this; I have it on a good hunch:

Devs will prioritise modes like a higher-clocked GPU with a lower-clocked CPU, or the other way around. That is basically what the presentation says: they will have trouble hitting full speeds on both parts simultaneously due to utilisation (which is a bit obvious), so one part being higher clocked requires an under-utilisation of the other. So a game that is mostly GPU limited will use a GPU mode, while a very intense open-world game or some other design requiring more CPU will use a CPU mode, underclocking the GPU there.

This of course assumes that games do not really heavily utilise both parts at the same time, which some do: a 60 fps Assassin's Creed game as we see on PC, a game with variable DRS and a lot of CPU work, or any ambitious game that wants to do both simulation and graphical things.

One part being higher clocked requires an under-utilisation of the other. Hence why free-floating resolution games with very precise dynamic resolution scaling, like Doom, Titanfall 2, Modern Warfare, etc., will all probably need to be in the mode for GPU power. They are already maxing the GPU as is due to their design.

This requires developer input? Why not just have the chip automatically detect load and transfer power?

If it requires developer input, then third-party developers will always choose max CPU power to match the XSX CPU.
 

Straffaren666

Member
Mar 13, 2018
84
I think the GPU and CPU clocking thing will work like this; I have it on a good hunch:

Devs will prioritise modes like a higher-clocked GPU with a lower-clocked CPU, or the other way around. That is basically what the presentation says: they will have trouble hitting full speeds on both parts simultaneously due to utilisation (which is a bit obvious), so one part being higher clocked requires an under-utilisation of the other. So a game that is mostly GPU limited will use a GPU mode, while a very intense open-world game or some other design requiring more CPU will use a CPU mode, underclocking the GPU there.

That's not how I understand it. Power draw depends on the type of instructions the CPU/GPU is executing as well as the clock frequency. Different workloads consist of different instruction mixes, and downclocking will only occur when the aggregate power draw of the CPU and GPU exceeds the downclock threshold. The key is that those simultaneous worst-case power-draw workloads occur relatively infrequently, so the CPU/GPU will run at full speed most of the time.
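As a rough illustration of that model, here is a toy power-budget governor. All the wattages and the budget are invented; only the 3.5 GHz / 2.23 GHz caps are public figures. The point is just the shape of the logic: clocks only drop when the combined power estimate crosses the budget, and the estimate depends on how heavy the instruction mix is, not on raw utilisation.

Code:
# Toy sketch of a fixed-power-budget, variable-frequency governor.
# All wattages are invented; only the clock caps are public figures.
POWER_BUDGET_W = 230.0
CPU_MAX_MHZ, GPU_MAX_MHZ = 3500, 2230
CPU_MAX_W, GPU_MAX_W = 60.0, 180.0             # assumed worst-case draws

def unit_power(max_w, activity, mhz, max_mhz):
    # "activity" (0..1) stands in for how power-hungry the instruction mix is.
    return max_w * activity * (mhz / max_mhz)

def governed_clocks(cpu_activity, gpu_activity):
    cpu_mhz, gpu_mhz = CPU_MAX_MHZ, GPU_MAX_MHZ
    def total():
        return (unit_power(CPU_MAX_W, cpu_activity, cpu_mhz, CPU_MAX_MHZ) +
                unit_power(GPU_MAX_W, gpu_activity, gpu_mhz, GPU_MAX_MHZ))
    while total() > POWER_BUDGET_W:
        gpu_mhz *= 0.98                        # shave a couple of percent per step
    return cpu_mhz, gpu_mhz

print(governed_clocks(0.6, 0.7))   # typical mix: stays at 3500 / 2230
print(governed_clocks(1.0, 1.0))   # simultaneous worst case: GPU dips a few percent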
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,929
Berlin, 'SCHLAND
That's not how I understand it. Power draw depends on the type of instructions the CPU/GPU is executing as well as the clock frequency. Different workloads consist of different instruction mixes, and downclocking will only occur when the aggregate power draw of the CPU and GPU exceeds the downclock threshold. The key is that those simultaneous worst-case power-draw workloads occur relatively infrequently, so the CPU/GPU will run at full speed most of the time.
When I say I have it on a good hunch, I mean I have it on a well-informed hunch.

The whole thing is that some engines use intensive instructions and max out the GPU or CPU with no bubbles already. You bet your butt all those target-60 fps games on console have throttled the Jaguar and the GPUs as well: your Battlefields, Dooms, Call of Duties, etc.
Engines that gobble up any and all performance will have to make a choice as to which mode they want based on their game design. I would assume many transitional engines will choose to prioritise GPU clocks, since the Zen cores are just dancing around code optimised to utilise the Jaguar so well.
 

Raonak

Banned
Oct 29, 2017
2,170
I got the impression the power-draw balance can be automatic, in that GPU clocks will go down as CPU power load increases. It would work well because having more clock stability on your CPU is probably better, and GPU shortcuts like dynamic resolution and VRS would ideally cover the difference.

I'd imagine first-party devs would customise their power draw quite a bit, but third parties would go for auto.
 

aevanhoe

Slayer of the Eternal Voidslurper
Member
Aug 28, 2018
7,316
Because DLSS delivers incredible image quality for less performance impact? If Sony actually developed similar tech, the performance gap between the consoles would be diminished. Nvidia is also developing a more general DLSS that they implemented in Wolfenstein: Youngblood, and the image quality is excellent. We could see it at the driver level one day.

DLSS is not actually 4K resolution; it's a sort of trick to give the impression the resolution is higher with the help of deep-learning supersampling. This is a good thing, and if that's what you mean by "4K", I agree. But it's similar (not technically, but in its goals) to checkerboard 4K or temporal injection, meaning it's not real 4K but it's supposed to look 4K. Add full-4K UI elements (and, honestly, if you can notice lower resolution anywhere, it's on text and UI graphics) and you have a great presentation that runs well. But a lot of people (especially high-end PC owners) were claiming "it's not really 4K", which is annoying, considering these techniques are great for both PC and consoles.
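For scale, here are the raw pixel counts behind that trade-off. The checkerboard figure is only approximate, since implementations differ in how many pixels they actually shade per frame.

Code:
# Shaded pixels per frame at each target (checkerboard figure is approximate).
native_4k       = 3840 * 2160        # 8,294,400
qhd_1440p       = 2560 * 1440        # 3,686,400
checkerboard_4k = native_4k // 2     # roughly half the pixels shaded, rest reconstructed

print(native_4k / qhd_1440p)         # 2.25x the shading work of 1440p
print(native_4k / checkerboard_4k)   # ~2x the shading work of checkerboarded 4K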
 

Straffaren666

Member
Mar 13, 2018
84
When I say I have it on a good hunch, I mean I have it on a well-informed hunch.

The whole thing is that some engines use intensive instructions and max out the GPU or CPU with no bubbles already. You bet your butt all those target-60 fps games on console have throttled the Jaguar and the GPUs as well: your Battlefields, Dooms, Call of Duties, etc.
Engines that gobble up any and all performance will have to make a choice as to which mode they want based on their game design. I would assume many transitional engines will choose to prioritise GPU clocks, since the Zen cores are just dancing around code optimised to utilise the Jaguar so well.

My point is that it's not only about full utilization but the instruction mix as well. Even though the CPU and GPU are fully utilized, they don't necessarily execute power-hungry workloads all the time. Time will tell if Cerny was misleading when he said he expects the CPU/GPU to hit full speed most of the time. AFAIK, he doesn't have a track record of over-optimistic claims.
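A toy example of that distinction between utilisation and instruction mix; the per-instruction energy figures and issue rate below are completely made up, only the shape of the argument matters.

Code:
# Two workloads at the same "100% utilisation" can draw very different power,
# because power tracks the instruction mix. Energy-per-instruction values invented.
ENERGY_NJ = {"fma": 1.5, "int_alu": 0.8, "mem_stall": 0.3}

def avg_power_w(mix, instr_per_sec):
    nj = sum(frac * ENERGY_NJ[kind] for kind, frac in mix.items())
    return nj * instr_per_sec * 1e-9

dense_math   = {"fma": 0.8, "int_alu": 0.2}         # worst-case style workload
memory_bound = {"int_alu": 0.4, "mem_stall": 0.6}   # busy, but waiting a lot
print(avg_power_w(dense_math,   3e10))   # ≈ 41 W at the assumed issue rate
print(avg_power_w(memory_bound, 3e10))   # ≈ 15 W at the same issue rate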
 
Jan 3, 2019
3,219
Have there been any hints about the capabilities of PS5 raytracing today?
Cerny said there's a game using RT reflections in complex scenes without a major performance hit
 

Deleted member 46489

User requested account closure
Banned
Aug 7, 2018
1,979
When I say I have it on a good hunch, I mean I have it on a well-informed hunch.

The whole thing is that some engines use intensive instructions and max out the GPU or CPU with no bubbles already. You bet your butt all those target-60 fps games on console have throttled the Jaguar and the GPUs as well: your Battlefields, Dooms, Call of Duties, etc.
Engines that gobble up any and all performance will have to make a choice as to which mode they want based on their game design. I would assume many transitional engines will choose to prioritise GPU clocks, since the Zen cores are just dancing around code optimised to utilise the Jaguar so well.
This is a bit out of left field, but considering Cerny's talk about stuff like the Geometry Engine and other PS5-specific innovations, could it happen that multiplat games on both platforms will perform roughly similarly if they take full advantage of the PS5's features? Or will the CPU and GPU advantage of the Series X result in better framerates and resolutions regardless?
 

Segafreak

Member
Oct 27, 2017
2,756
Bet this is gonna be one of those Sony-certified SSDs.


6.5 GB/s read speed. They've only shown the 980 Pro, but looking at previous gens, the Evo won't be too far off from these speeds. Gonna put a stick in my PS5 for a total of ~2TB of blisteringly fast SSD.
 

Raide

Banned
Oct 31, 2017
16,596
This is a bit out of left field, but considering Cerny's talk about stuff like the Geometry Engine and other PS5-specific innovations, could it happen that multiplat games on both platforms will perform roughly similarly if they take full advantage of the PS5's features? Or will the CPU and GPU advantage of the Series X result in better framerates and resolutions regardless?
Third-party games very rarely make use of the various differing capabilities of a console. Everything tends to run fairly similarly, with maybe the more powerful one pushing a little higher, similar to the way the Pro and 1X are now. First-party titles will make the best use of newer system tech.
 
Oct 27, 2017
8,617
The World
This is a bit out of left field, but considering Cerny's talk about stuff like the Geometry Engine and other PS5-specific innovations, could it happen that multiplat games on both platforms will perform roughly similarly if they take full advantage of the PS5's features? Or will the CPU and GPU advantage of the Series X result in better framerates and resolutions regardless?

It's not like Microsoft hasn't built innovations into the XSX. It all depends on which third parties utilize which innovations.
 

Deleted member 46489

User requested account closure
Banned
Aug 7, 2018
1,979
Third-party games very rarely make use of the various differing capabilities of a console. Everything tends to run fairly similarly, with maybe the more powerful one pushing a little higher, similar to the way the Pro and 1X are now. First-party titles will make the best use of newer system tech.
It's not like Microsoft hasn't built innovations into the XSX. It all depends on which third parties utilize which innovations.
I get that. I'm just curious, because Cerny spent A LOT of time talking about these innovations. I do know that Series X will also have stuff like mesh shading and Xbox Velocity Architecture, but Microsoft didn't really spend a lot of time talking about these things. Of course, Cerny might have been stressing this stuff in order to de-emphasize the teraflop gap.

I guess we'll find out for sure only when the consoles release. I remember that the performance gap between PS4 and Xbox One was significant for multiplat games. I wonder if we'll see something similar but in Microsoft's favor this gen.
 

Straffaren666

Member
Mar 13, 2018
84
This is a bit out of left field, but considering Cerny's talk about stuff like the Geometry Engine and other PS5-specific innovations, could it happen that multiplat games on both platforms will perform roughly similarly if they take full advantage of the PS5's features? Or will the CPU and GPU advantage of the Series X result in better framerates and resolutions regardless?

The geometry engine and rasterization resources are most likely identical on the XSX and the PS5, so the PS5 doesn't have an advantage when it comes to features. However, the GPU clock frequency of the PS5 is about 20% higher than the XSX's, which means the geometry-engine and rasterizing throughput will probably be roughly 20% higher on the PS5, balancing out some of its TF deficit.
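For reference, the ~20% figure falls straight out of the published peak clocks, and, as the post above assumes, it only translates into throughput if both GPUs have the same per-clock front-end and rasteriser width.

Code:
# PS5 vs XSX published peak GPU clocks.
ps5_gpu_mhz, xsx_gpu_mhz = 2230, 1825
print(ps5_gpu_mhz / xsx_gpu_mhz)   # ≈ 1.22: ~22% more triangle setup, rasterisation
                                   # and fill per second, if the per-clock units match.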
 

NotUS

Member
Oct 27, 2017
135
Quite an interesting presentation, though I can't shake the feeling that the raw power difference will give the XSX the edge on frame rate and more effects on screen. The teraflop difference between the XSX and PS5 at peak frequency is essentially a whole PlayStation 4's worth of compute that the XSX can dedicate to on-screen effects.

That's quite a gap to put toward post-processing and other effects, regardless of load and streaming times due to the SSD differences.
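The arithmetic behind that comparison, using the publicly stated CU counts and clocks:

Code:
# FLOPs = CUs x clock x 64 shader ALUs x 2 ops (FMA) per clock.
def tflops(cus, mhz):
    return cus * mhz * 1e6 * 128 / 1e12

xsx = tflops(52, 1825)    # ≈ 12.15 TF
ps5 = tflops(36, 2230)    # ≈ 10.28 TF at the peak clock
ps4 = tflops(18, 800)     # ≈ 1.84 TF
print(xsx - ps5, ps4)     # gap ≈ 1.87 TF, i.e. roughly one whole PS4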

Still looking forward to getting both of them, next gen will be amazing.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,929
Berlin, 'SCHLAND
This is a bit out of left field, but considering Cerny's talk about stuff like the Geometry Engine and other PS5-specific innovations, could it happen that multiplat games on both platforms will perform roughly similarly if they take full advantage of the PS5's features? Or will the CPU and GPU advantage of the Series X result in better framerates and resolutions regardless?
The geometry engine and primitive shaders sound like a rename of the never-used Vega feature, which evolved into the more feature-complete mesh shaders that Turing and the Xbox Series X have. It has yet to be seen whether they are in fact one and the same.
So no, there is no secret equivalent-making sauce. The GPU and CPU are just better in the XSX, which makes sense, as MS went with an atypical console design that surprised everyone. It is a tall box.
Cerny said there's a game using RT reflections in complex scenes without a major performance hit
That is so context-free, and we have no idea what type of RT reflections (just coherent sharp reflections? how many rays per pixel?), with no context for the resolution or framerate. Very different from MS showing a fully path-traced game and giving a ballpark performance metric and resolution for it.

We still do not have a real idea of the, presumably, less performant RT that the PS5 will have.
 

Adum

Member
May 30, 2019
922
Yeah, on paper. But the truth is, I barely see any difference in gaming performance between my NVMe and my SATA drives, even though NVMe is supposed to be a lot faster on paper. So I'm a little skeptical about how fast it actually is compared to the XSX when it comes to real-life performance.
That's because the games you're playing currently are designed to run on mechanical HDDs. They can't take full advantage of NVMe drives. Next-gen games will be built from the ground up for drives with 2 GB/s+ speeds. That's when you'll notice a difference.
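Rough numbers behind that point. The 5.5 GB/s raw and roughly 9 GB/s compressed figures are from Cerny's talk; the hard-drive speed and the 8 GB chunk size are just ballpark assumptions for illustration.

Code:
# Time to stream a hypothetical 8 GB chunk of game data.
assets_gb       = 8
hdd_mb_s        = 80     # typical mechanical drive throughput, ballpark
ps5_raw_gb_s    = 5.5    # stated raw SSD read speed
ps5_kraken_gb_s = 9.0    # stated typical throughput with Kraken decompression

print(assets_gb * 1024 / hdd_mb_s)    # ≈ 102 s from an HDD
print(assets_gb / ps5_raw_gb_s)       # ≈ 1.5 s raw
print(assets_gb / ps5_kraken_gb_s)    # < 1 s with compression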

The geometry engine and primitive shaders sound like a rename of the never-used Vega feature, which evolved into the more feature-complete mesh shaders that Turing and the Xbox Series X have. It has yet to be seen whether they are in fact one and the same.
So no, there is no secret equivalent-making sauce. The GPU and CPU are just better in the XSX, which makes sense, as MS went with an atypical console design that surprised everyone. It is a tall box.

That is so context-free, and we have no idea what type of RT reflections (just coherent sharp reflections? how many rays per pixel?), with no context for the resolution or framerate. Very different from MS showing a fully path-traced game and giving a ballpark performance metric and resolution for it.

We still do not have a real idea of the, presumably, less performant RT that the PS5 will have.
Cerny did say that higher-clocked GPUs have certain advantages. How far do you think that ridiculously high GPU frequency in the PS5 will go to mitigate any performance gaps between the two consoles when it comes to third-party titles?
 

DarkSlayer

Member
Jan 18, 2018
45
Pakistan
I think the GPU and CPU clocking thing will work like this; I have it on a good hunch:

Devs will prioritise modes like a higher-clocked GPU with a lower-clocked CPU, or the other way around. That is basically what the presentation says: they will have trouble hitting full speeds on both parts simultaneously due to utilisation (which is a bit obvious), so one part being higher clocked requires an under-utilisation of the other. So a game that is mostly GPU limited will use a GPU mode, while a very intense open-world game or some other design requiring more CPU will use a CPU mode, underclocking the GPU there.

This of course assumes that games do not really heavily utilise both parts at the same time, which some do: a 60 fps Assassin's Creed game as we see on PC, a game with variable DRS and a lot of CPU work, or any ambitious game that wants to do both simulation and graphical things.

One part being higher clocked requires an under-utilisation of the other. Hence why free-floating resolution games with very precise dynamic resolution scaling, like Doom, Titanfall 2, Modern Warfare, etc., will all probably need to be in the mode for GPU power. They are already maxing the GPU as is due to their design.

I think it's more that a developer will see at what frequency their game hits the desired performance (4K@30, for example) and then tell the system to run at that fixed clock speed and not demand more (or maybe only demand more if performance drops during heavy scenes?). For more ambitious games, the developer will target max clock speeds, but the system will decide if thermals allow them to be sustained.
 

raygcon

Banned
Oct 30, 2017
741
So Series X has a faster CPU, a faster GPU, faster memory, and a larger SSD.....

And probably a higher price as well.

To all the folks here: if you care so much about specs, go PC. Not sure what you're trying to squeeze out of a $399 box lol. I mean, they could add 30 TFLOPS to this box too if people were willing to pay $1000. For me, as long as it's not too different from the Series X, I don't really give a fuck. We all play consoles for the exclusives, not state-of-the-art hardware.
 

gremlinz1982

Member
Aug 11, 2018
5,331
That's because the games you're playing currently are designed to run on mechanical HDDs. They can't take full advantage of NVMe drives. Next-gen games will be built from the ground up for drives with 2 GB/s+ speeds. That's when you'll notice a difference.


Cerny did say that higher-clocked GPUs have certain advantages. How far do you think that ridiculously high GPU frequency in the PS5 will go to mitigate any performance gaps between the two consoles when it comes to third-party titles?
Same way the Xbox One and Xbox One S had advantages over the PS4?
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,929
Berlin, 'SCHLAND
Cerny did say that higher-clocked GPUs have certain advantages. How far do you think that ridiculously high GPU frequency in the PS5 will go to mitigate any performance gaps between the two consoles when it comes to third-party titles?
We do not know the ROP or TMU counts for the various GPUs here; we need to know that to say anything. But even then, that assumes certain screen elements are limiting the framerate, which they might not be at all.
 

Adum

Member
May 30, 2019
922
What would stop MS from adding an "overclock" mode of their own?
My reply isn't solely for you, but for the tons of people here who apparently didn't watch the presentation and just blindly talk about the PS5 having a boost clock. That's not it at all. The 2.23 GHz GPU clock was very clearly implied to be the base GPU frequency. Cerny very clearly said that he doesn't expect the majority of games to change that, and if the frequency does dip, he stated it would be by a couple of percentage points. The 2.23 GHz clock isn't some boost clock they've achieved just for this presentation as a marketing stunt.
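For a sense of scale, here is what a dip of a couple of percent would cost in raw compute, using the published 36 CU / 2.23 GHz figures; the 2% dip itself is just the example value quoted above.

Code:
# Compute throughput at the cap vs after a 2% clock dip.
def tflops(mhz, cus=36):
    return cus * mhz * 1e6 * 128 / 1e12   # 128 FLOPs per CU per clock

print(tflops(2230))          # ≈ 10.28 TF at 2.23 GHz
print(tflops(2230 * 0.98))   # ≈ 10.07 TF after a 2% dip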
 

Yerffej

Prophet of Regret
Member
Oct 25, 2017
23,464
I'm trying to imagine dying in a game and coming back so quickly, with speeds like this, that it seems comical. Almost useless as a penalty. Like death is just a warp/rewind. Gonna take getting used to.
 

Adum

Member
May 30, 2019
922
Same way the Xbox One and Xbox One S had advantages over the PS4?
Sorry, not sure what you're asking here?

I'm trying to imagine dying in a game and coming back so quickly, with speeds like this, that it seems comical. Almost useless as a penalty. Like death is just a warp/rewind. Gonna take getting used to.
Ya know, I almost lost it when Cerny said that game loading was so fast that they might actually have to artificially slow down certain load screens lmao
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
What would stop MS from adding an "overclock" mode of their own?

Probably just real-life testing. They "released" some extra power to the Xbox One shortly after launch as the APIs for the OS and Kinect improved and the reservation wasn't needed so much.
They talked about how the One X reports back data about game-engine behaviour (which they use to feed the AI that drives Fast Start) and to inform future changes.
Once the machine is out in the real world and they get that "big data" view of running temperatures in various locations, they may choose to make changes like that, but any change would likely be very small.

They made a very specific point of talking about the CPU and how its clock speed was fixed at that high level, to make it clear that those top-line numbers aren't being deployed as a marketing technique. "Up to 3.8GHz" is not what they want to communicate. I presume they knew that Sony was going to be going for "up to 3.5GHz".