
Deleted member 20297

User requested account closure
Banned
Oct 28, 2017
6,943
You might be able to measure it off an RDNA 2 chip on PC to get an idea. It's usually very non-linear. In the presentation we only had Cerny's mention that 'a couple of %' frequency reduction buys back 10% power consumption.
Which means going higher than what the chip usually can do leads to disproportionately more power usage. That might be a challenge for the cooling, I can see that. No wonder the cooling is more expensive in the estimated BOM for PS5 than in previous generations.
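For intuition on how a couple of percent of frequency can buy back 10% of power: dynamic power in CMOS logic scales roughly as P ∝ C·V²·f, and near a chip's frequency ceiling the required voltage climbs steeply. With illustrative numbers (not from the talk), a 2% frequency cut that permits a 4% voltage cut gives:

```latex
% Dynamic CMOS power: P \propto C V^2 f (illustrative figures only).
\frac{P_\text{new}}{P_\text{old}}
  = \frac{f_\text{new}}{f_\text{old}}\left(\frac{V_\text{new}}{V_\text{old}}\right)^{2}
  = 0.98 \times 0.96^{2} \approx 0.90
```

i.e. roughly the 10% power saving Cerny described.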
 

gofreak

Member
Oct 26, 2017
7,736
Which means going higher than what the chip usually can do leads to disproportionately more power usage. That might be a challenge for the cooling, I can see that. No wonder the cooling is more expensive in the estimated BOM for PS5 than in previous generations.

Yeah, like I don't think it's the case that they're controlling the thermals with this setup in order to allow for a poor or low-capacity cooling system. They've seemingly put in as much as they can on the cooling side, and then are letting the clock float up to that limit where the workload allows that clock to fit into that thermal capacity - seemingly to the limit of the logic behaving properly on the chip. So it should be a beefy - or interesting - cooling setup I think.
 

Deleted member 20297

User requested account closure
Banned
Oct 28, 2017
6,943
Yeah, like I don't think it's the case that they're controlling the thermals in order to allow a poor or low-capacity cooling system. They've seemingly put in as much as they can on the cooling side, and then are letting the clock float up to that limit where the workload allows that clock to fit into that thermal capacity - seemingly to the limit of the logic behaving properly on the chip.
I wonder if they will go with an external power supply this gen.
 

endlessflood

Banned
Oct 28, 2017
8,693
Australia (GMT+10)
DF were shown the talk on the prior Monday, and had the chance to talk to Cerny afterward over the phone or whatever. There'll be another article on the latter coming up.
That's really stretching the definition of exclusive, isn't it? We got to watch the talk in advance but couldn't publish anything on it in advance, and we got to ask Mark Cerny questions? Those are great things but I don't know that "exclusive" is really the best way to characterise them.

Which means going higher than what the chip usually can do leads to disproportionately more power usage. That might be a challenge for the cooling, I can see that. No wonder the cooling is more expensive in the estimated BOM for PS5 than in previous generations.
From Cerny's talk, it sounded like cooling is actually what prompted the approach. Since you have a known thermal load it's much easier to reliably design a custom cooling solution for it.
 

gundamkyoukai

Member
Oct 25, 2017
21,136
Yeah, like I don't think it's the case that they're controlling the thermals with this setup in order to allow for a poor or low-capacity cooling system. They've seemingly put in as much as they can on the cooling side, and then are letting the clock float up to that limit where the workload allows that clock to fit into that thermal capacity - seemingly to the limit of the logic behaving properly on the chip. So it should be a beefy - or interesting - cooling setup I think.

Anex gave some nice info in the PlayStation 5 System Architecture Deep Dive thread on that stuff.


Side note: I think we need one PS5 tech thread; jumping between 4 threads is annoying lol
 

tzare

Banned
Oct 27, 2017
4,145
Catalunya
They're running at their fixed clock, yes.

They made the decision to fix their GPU clock at a lower rate to accommodate assumed 'worst cases' of workload, based on what their cooling could handle, so it will never vary - even where the clockspeed could potentially have gone higher with some of the workload patterns it will be running. That's a different set of tradeoffs.
The difference here is context: relying only on cooling means console placement and ambient temperature are an unknown part of the equation - a real-world issue; not everyone has their console in an adequate open space, for example.
Relying on power theoretically means that the system is less dependent on external factors (depending on the cooling system too, I guess - that's still unknown).
 

Deleted member 30987

Account closed at user request
Banned
Nov 5, 2017
301
Which means going higher than what the chip usually can do leads to disproportionately more power usage. That might be a challenge for the cooling, I can see that. No wonder the cooling is more expensive in the estimated BOM for PS5 than in previous generations.
They'd better have - the PS4 was not great in terms of the noise generated by its cooling solution. That's my big and only problem with the machine.
 

BradleyLove

Member
Oct 29, 2017
1,464
My understanding, from what Cerny was saying, is that the level of power consumed by the chip isn't merely a function of clockspeed, but also depends on what is actually being processed: certain instructions, certain types of work, consume more power, independent of the clockspeed the chip is running at.

Hence, you could have times where both the CPU and GPU are, effectively, running at their max clocks, and other times where they have to vary.

Usually clocks are varied with thermals. Or cooling is varied with thermals. The latter is what often happens in consoles - even though, in PS4, the clockspeeds were fixed, the power consumption varied with workload, and thus so did the thermals - which is why your fan kicks up more in some games, or at some times in some games, than in others.

PS5 is looking at the workload it's currently processing and what it will mean for power consumption, and if the workload's power consumption at max clocks comes in under the PS5's power and thermal dissipation budget, then max clocks will be maintained. Cerny, it seems, expects that 'most' workloads will fit that criterion.

Their alternative was to try to guess the worst-case workloads, and then always have the chip running at a lower fixed clockspeed to accommodate that. That's what they did with their consoles previously. Why do that, why fix around the assumed worst case, if the workload and its power demands will actually change, and in some, or perhaps many, cases you could get away with a higher clock? That was the question they appear to have started with.
Thanks for sharing. It's getting really frustrating seeing the clock-trade theory treated as gospel.
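A minimal sketch of the scheme described in the quote, with hypothetical names and numbers (not Sony's implementation): the clock is chosen from a deterministic model of the current workload's power draw rather than from temperature sensors, so every console behaves identically.

```python
# Hypothetical model-based frequency governor (illustrative only).
POWER_BUDGET_W = 200.0   # assumed fixed SoC power budget
MAX_CLOCK_GHZ = 2.23     # GPU frequency ceiling
IDLE_W = 20.0            # illustrative static/idle power
PER_GHZ_W = 85.0         # illustrative dynamic-power coefficient

def modeled_power(activity: float, clock_ghz: float) -> float:
    """Power depends on the workload: the same clock draws more power
    when the work keeps more of the chip busy (activity in 0..1)."""
    return IDLE_W + activity * PER_GHZ_W * clock_ghz

def pick_clock(activity: float) -> float:
    """Return the highest clock whose modeled power fits the budget."""
    clock = MAX_CLOCK_GHZ
    while clock > 0.0 and modeled_power(activity, clock) > POWER_BUDGET_W:
        clock -= 0.01    # back off in small steps
    return round(clock, 2)

print(pick_clock(activity=0.90))  # 2.23 -- a typical load holds max clock
print(pick_clock(activity=1.00))  # 2.11 -- a worst case drops ~5%
```

Because the mapping from workload to clock is fixed in the model, any downclock is the same on every unit regardless of ambient temperature - which is the determinism the quote describes.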
 
Jan 21, 2019
2,902
I wonder if his assumption holds true that Sony will give devs more of the RAM, since the OS can just be loaded back in when needed. I get the idea, but this also means pushing parts of the game aside to get the OS into RAM, and then getting them back when the player wants to resume.
 

Fafalada

Member
Oct 27, 2017
3,066
What I meant was that as resolution goes up as a function of CU number, the rays per pixel don't go up even faster. The two values move in lockstep.
XSX 380b intersections/s.
45 intersections/pixel @ 4k.
183 @ 1080p.
It scales like everything else. With 4x faster hw, if you pick 4x resolution there's no 4x left over for "improved" RT.

I understand that BVH intersection tests aren't the only step to raytracing, but I'd have thought with dedicated hardware for that part, RT would use little of the normal compute hardware. Is this incorrect?
That's separate - AMD doesn't accelerate search-data-structure traversal, so yes, there are compute costs in addition to intersection tests. But the math for that will not (ever?) be 1:1 comparable, since each platform will have its own software stack for it, and eventually devs will come up with custom stuff as well.
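For anyone checking the arithmetic in the quoted figures: dividing the stated 380 billion intersections/s by the pixel counts gives values in the thousands per second per pixel (the quoted 45 and 183 look like those numbers with the thousands dropped), but the 4x ratio is the point:

```latex
% Per-pixel intersection budget from the quoted 380e9 intersections/s:
\frac{380\times10^{9}}{3840\times2160} \approx 45{,}800 \text{ per pixel per second at 4K},
\qquad
\frac{380\times10^{9}}{1920\times1080} \approx 183{,}300 \text{ at 1080p}
```

At 60 fps that is roughly 760 vs 3,050 intersections per pixel per frame - exactly a 4x difference, matching the point that RT throughput scales with resolution like everything else.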
 

CrispyGamer

Banned
Jan 4, 2020
2,774
Thanks for sharing. It's getting really frustrating seeing the clock-trade theory treated as gospel.

All that matters is that the PS5 is a super efficient machine that will hit at or close to 10.3 TF the majority of the time, according to Mark Cerny. He even mentions that in those extreme worst-case situations, if the CPU load causes it to drop a few percentage points, they could just feed the extra power to the GPU with SmartShift. So it's a brilliant design and I'm excited to see the possibilities...
 

Ploid 6.0

Member
Oct 25, 2017
12,440
I can't wait to see exclusive games on this puppy, and I hope PC games follow suit. No more design based on HDDs. Before this SSD talk I had no idea why the mounts in Dragon Age Inquisition were so slow; I tried finding a mod to speed them up but got nothing. Recently I tried BDO on PC again, and you can zoom all over that world, and it looks so friggin good. Too bad about the pay-to-win, tedious game design situation that puts me off it (it has cell phone game design on just about everything in it - you can't sneeze without it being monetized in some way).
 

marecki

Member
Aug 2, 2018
251
I'm seeing a lot of people posting that the likely scenario is we will see native 4K on Series X and 1800p on PS5. If this is due to the difference in the TF number between the two (ignoring literally everything else and assuming a linear relationship between TF and the number of pixels drawn), the gap is much smaller than that, and my calculation actually produces something like 1999p. Here's my math; ofc I'm open to being corrected.
[image: resolution calculation]
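The attached image isn't reproduced here, so here is a sketch of the kind of calculation presumably being shown, assuming the commonly quoted 10.28 vs 12.15 TF figures and a linear TF-to-pixel relationship:

```latex
% Hypothetical reconstruction of the poster's math (assumed TF figures):
% scale the vertical resolution by the square root of the TF ratio.
2160 \times \sqrt{\tfrac{10.28}{12.15}} \approx 2160 \times 0.92 \approx 1987\text{p}
```

That lands near the ~1999p figure mentioned above (the exact result depends on which TF numbers you plug in), and well above 1800p.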
 

tzare

Banned
Oct 27, 2017
4,145
Catalunya
I'm seeing a lot of people posting that the likely scenario is we will see native 4K on Series X and 1800p on PS5. If this is due to the difference in the TF number between the two (ignoring literally everything else and assuming a linear relationship between TF and the number of pixels drawn), the gap is much smaller than that, and my calculation actually produces something like 1999p. Here's my math; ofc I'm open to being corrected.
[image: resolution calculation]
I think XSX is 12.1, but yeah, the difference exists and will be hard to spot, especially with dynamic resolution. And maybe both will be 4K but with minor graphical differences.
 

joeblow

Member
Oct 27, 2017
2,930
Laker Nation
Hey NXGamer, I wanted to add my appreciation of the content and analysis in the video you posted. It was the first one of yours I've ever watched, and I am now a YT subscriber. Keep up the good work!
 

III-V

Member
Oct 25, 2017
18,827
I agree, NXGamer - you did a nice job getting content out quickly that was both clear-minded in perspective and informative in content.
 

cooldawn

Member
Oct 28, 2017
2,449
I watched it yesterday and yeah, NXGamer did an excellent job of breaking this thing down. Looking forward to more details from you fella.
 

Liabe Brave

Professionally Enhanced
Member
Oct 27, 2017
1,672
XSX 380b intersections/s.
45 intersections/pixel @ 4k.
183 @ 1080p.
It scales like everything else. With 4x faster hw, if you pick 4x resolution there's no 4x left over for "improved" RT.


That's separate - AMD doesn't accelerate search-data-structure traversal, so yes, there are compute costs in addition to intersection tests. But the math for that will not (ever?) be 1:1 comparable, since each platform will have its own software stack for it, and eventually devs will come up with custom stuff as well.
Thanks so much for the explanation and correction. You can definitely tell I'm a layman here. Your expertise is much appreciated.
 

rntongo

Banned
Jan 6, 2020
2,712
You mean they can prioritise the frequency to the GPU or CPU IF it becomes constrained beyond the ceiling level of the capped rates. This is exactly what happens now; having a dual-mode option for developers to choose from (a la Switch) is not new, but I will be interested in seeing this option and just what the deltas are. Thanks for sharing this, which I assume came from a dev source then?
I look forward to your next video, but I think you should clarify this in it, considering that your PS5 video may leave the viewer with the assumption that the APU can maintain both max CPU and GPU clock speeds at the same time.
 

gundamkyoukai

Member
Oct 25, 2017
21,136
I look forward to your next video, but I think you should clarify this in it, considering that your PS5 video may leave the viewer with the assumption that the APU can maintain both max CPU and GPU clock speeds at the same time.

It can most of the time - that was what Cerny was saying.
How true that is will be seen when we get software and people test it out.
 

zombiejames

Member
Oct 25, 2017
11,933
I look forward to your next video, but I think you should clarify this in it, considering that your PS5 video may leave the viewer with the assumption that the APU can maintain both max CPU and GPU clock speeds at the same time.
I'll have to rewatch that section of the presentation but I'm pretty sure it can maintain those speeds. If someone can get there before I get the opportunity and quote what was said, that'd be great.
 
Aug 26, 2018
1,793
Has there been a discussion on Sony's cooling solution?

36 CUs at a 2.23 GHz baseline with very few drops, according to Cerny, will require a crazy good cooling solution and PS5 HW design.
 

19thCenturyFox

Prophet of Regret
Member
Oct 29, 2017
4,309
Basically you target and say I want full GPU, and the CPU underclocks so the Power Budget keeps the GPU clock high. The Power that would have been CPU Reserved goes over to the GPU to keep its clock more stable, and since the CPU is now lower clocked, the more intense utilisation or instructions will not tip the Power Balance - well, that is for a game that is also not Absolutely thrashing both.
indeed this Info comes from people who work on the Thing.

Basically, if the GPU is at 10.2 TF, the CPU is not at 3.5 GHz.


Cerny said all this on stage basically, just not in the most direct way. The only reason to mention SmartShift is if this happens, just as it does with SmartShift on laptops.

Well this certainly changes things. Cerny made it sound like both GPU and CPU can hit max clocks at the same time. I'm tempted to not give Sony the benefit of the doubt from here on out.
 

kostacurtas

Member
Oct 27, 2017
9,063
Has there been a discussion on Sony's cooling solution?

36 CUs at a 2.23 GHz baseline with very few drops, according to Cerny, will require a crazy good cooling solution and PS5 HW design.
Not yet.

Mark Cerny said that they will do a tear-down of the console at a later date. Hopefully soon. He said that the cooling solution should keep us happy.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Has there been a discussion on Sony's cooling solution?

36 CUs at a 2.23 GHz baseline with very few drops, according to Cerny, will require a crazy good cooling solution and PS5 HW design.
Gamers Nexus claimed to know about the cooling solution, but probably won't talk about it before Sony does (besides, they're in Taiwan right now)
 
Aug 26, 2018
1,793
I really, really hope they have some good solutions to heat generation and noise.
Not yet.

Mark Cerny said that they will do a tear-down of the console at a later date. Hopefully soon. He said that the cooling solution should keep us happy.
Gamers Nexus claimed to know about the cooling solution, but probably won't talk about it before Sony does (besides, they're in Taiwan right now)

I think Sony's cooling solution, depending on how sophisticated it is, could shoot up the BOM if they have to design around it. Maybe?
 
Apr 4, 2018
4,513
Vancouver, BC
Thanks for the detailed and constructive reply, always good.

So re the X's improvements over just resolution: yes, but much of this came from the fact it had a significant memory increase
(3.5GB, i.e. ~63% more memory, and ~85% greater bandwidth).
This gave developers much greater scope to play with, above just the Tflop increase - which was almost the same gap as the 1.8Tflops between PS5 and XSX.
Enabling us to enjoy greater texture quality, higher LoD, better performance and increased effects. I said all this before the console launched, as it was a significant boost over the Pro.

The SSD is not just for storage; it supplements the RAM and enables game choices to be open and higher density (helped by the Mesh Shaders, which improve detail and reduce bandwidth, etc.). Using this, it will enable (as can the XSX) a shift in game design and streaming to reduce pop-in, delayed mip-map chain loads, stutter and all those other things we see often in the current gen. The CPU will greatly empower teams to make bigger and bolder choices alongside performance, as you say.

The piece on scaling was to demonstrate that a ~18-20% gap in GPU performance can be mitigated (all other effects and throughput being equal) by turning on dynamic scaling for the PS5 at 10% per axis and leaving the XSX at native 4K. This is where teams can make the easiest choices to use the power without adding a great deal to development. 1st party will have the choice to make more use of that and the ray-tracing functions, which again the XSX should be slightly better at, though this will likely be an even smaller gap. BUT I need to stress this is just talk now and we will have to wait for actual games and more info, as this is getting into much speculation - as was my comment on the SSD RAM use, just thoughts.
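To spell out the per-axis arithmetic in that paragraph: a 10% reduction on each axis multiplies the pixel count by

```latex
0.9 \times 0.9 = 0.81
```

i.e. roughly 19% fewer pixels, which is how a ~18-20% GPU gap can be absorbed by dynamic scaling.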

Thanks

Not to drag this on, but I think it's an interesting conversation ;). Once again, I appreciate the well-considered replies, and I also apologise if I am nitpicking.

With regards to visual improvements on X1X, my experience and understanding is that when PC/crossplatform devs build a game, all of the settings they create exist within the general codebase as tweakable variables in a well-constructed game, and are tweaked constantly through development on all platforms. As we can see from the two-week port job of a game like Gears 5 to Series X, it was quick and easy for them to dial up any number of settings. As for the way developers use the extra GPU power: I am assuming here that when we see a 12TF vs 10.2TF difference and a wide CU difference, we are looking at basically the next-gen RDNA2 video cards in both, and MS quite literally went with the next step above - like a moderately overclocked RTX 2070 vs a standard RTX 2080.

I expect 3rd party devs to act like they always have: make decisions based on whether they want the versions consistent, or to individually look the best they can. There is no making up for the GPU power difference. If they have a dynamic resolution scaler, I imagine they will use it in both versions, and choose to dial up settings in whatever way makes the game look best.

Also, while I agree that extra VRAM can be a big visual differentiator, the greatest advantage for that is higher-resolution assets, which many games on the X1X still didn't take advantage of. Many games still dialed up other non-memory-related GPU tasks, such as:
- Character, world, foliage level-of-detail
- Shadow/lighting quality
- Post-processing quality
- Anti-aliasing quality
- Screen-space reflections
- Refractions
- Particles
- Texture filtering
- Or opted for improved framerates

Choosing to dial these up, if a resolution boost proved negligible, would be an incredibly smart thing to do. I also assume that lighting quality could make the biggest GPU difference, especially with ray tracing now a big focus.

I suppose my point is that, just as has always happened when consoles have a power difference, many devs will absolutely opt to use that difference; in the case of almost any PC dev, it shouldn't take a huge amount of extra work, and I am quite sure it will be noticeable, just as it always has been in the past. I will say again, though, that while I am skeptical that Sony will be able to create much of any asset-quality divide, if they can somehow manage to stream in textures and assets at higher quality than the Series X, that could certainly give the system a very nice visual quality boost. I really can't wait to see how (not if) the incredible NVMes in these systems help create some extraordinary things.

My hunch is that the GPU overclock for the PS5 was a bit of a last-minute change/hail-mary, and that the Series X has a better balance of hardware vs cost, but I could certainly be wrong if Sony releases at a significantly cheaper price. I think you were bang-on when you suggested doubting that the price difference would be any greater than $50 at most, if there is any difference at all.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,931
Berlin, 'SCHLAND
Well this certainly changes things. Cerny made it sound like both GPU and CPU can hit max clocks at the same time. I'm tempted to not give Sony the benefit of the doubt from here on out.
Nothing I typed changes anything about what was presented at the talk - which is just as valid as it was said there. One component uses more Power when the other one is not using it, same thing that was presented there. Both cannot Max the Power Budget, as was also said there.
 

LebGuns

Member
Oct 25, 2017
4,127
Basically you target and say I want full GPU, and the CPU underclocks so the Power Budget keeps the GPU clock high. The Power that would have been CPU Reserved goes over to the GPU to keep its clock more stable, and since the CPU is now lower clocked, the more intense utilisation or instructions will not tip the Power Balance - well, that is for a game that is also not Absolutely thrashing both.
indeed this Info comes from people who work on the Thing.

Basically, if the GPU is at 10.2 TF, the CPU is not at 3.5 GHz.


Cerny said all this on stage basically, just not in the most direct way. The only reason to mention SmartShift is if this happens, just as it does with SmartShift on laptops.
I'm confused; are you saying both the CPU and GPU cannot operate at peak at the same time? Because that's not how Cerny made it sound, or how the DF Direct discussed it. I thought Cerny specifically said that most of the time both CPU and GPU operate at peak? Sorry if you've already answered this - you can point me to the post. Thank you!
 

space_nut

Member
Oct 28, 2017
3,306
NJ
Nothing I typed changes anything about what was presented at the talk - which is just as valid as it was said there. One component uses more Power when the other one is not using it, same thing that was presented there. Both cannot Max the Power Budget, as was also said there.

Thanks for the info!

P.S. Can't wait to see your vids on RT in the future. That Minecraft was astonishing using path tracing.
 

kostacurtas

Member
Oct 27, 2017
9,063
How often does a game fully utilize the CPU and the GPU at the same time? Not very often.

I believe that the dynamic design with the variable frequency of the PS5 APU was created with this fact in mind.

Essentially it's the AMD SmartShift technology that gaming laptops will use this year, and it will probably favor the GPU on PS5. On PCs, depending on the workload, SmartShift can push either part of the APU (CPU or GPU).

I am assuming this is how the CPU and GPU will operate near their frequency cap most of the time, as Mark Cerny said. In the worst-case scenario where a game pushes both at the same time, they expect a minor downclock. Mark Cerny said 2%.
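A minimal sketch of that budget-shifting behaviour, with hypothetical wattages (not AMD's or Sony's actual algorithm):

```python
# Hypothetical SmartShift-style power reallocation (illustrative only):
# a fixed total budget is split between CPU and GPU, and headroom the
# CPU isn't using is handed to the GPU, the favored component here.

TOTAL_BUDGET_W = 200.0   # assumed shared CPU+GPU power budget
CPU_CAP_W = 60.0         # assumed CPU draw at its frequency cap

def split_budget(cpu_demand_w: float) -> tuple[float, float]:
    """Grant the CPU what it needs (up to its cap); the GPU gets the rest."""
    cpu_grant = min(cpu_demand_w, CPU_CAP_W)
    gpu_grant = TOTAL_BUDGET_W - cpu_grant
    return cpu_grant, gpu_grant

print(split_budget(40.0))  # (40.0, 160.0) -- light CPU load, GPU gets headroom
print(split_budget(60.0))  # (60.0, 140.0) -- both pushed, GPU gets base share
```

When both sides push at once, the shared budget no longer covers both frequency caps, which is where the minor downclock above comes in.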
 

LebGuns

Member
Oct 25, 2017
4,127
My understanding, from what Cerny was saying, is that the level of power consumed by the chip isn't merely a function of clockspeed, but also depends on what is actually being processed: certain instructions, certain types of work, consume more power, independent of the clockspeed the chip is running at.

Hence, you could have times where both the CPU and GPU are, effectively, running at their max clocks, and other times where they have to vary.

Usually clocks are varied with thermals. Or cooling is varied with thermals. The latter is what often happens in consoles - even though, in PS4, the clockspeeds were fixed, the power consumption varied with workload, and thus so did the thermals - which is why your fan kicks up more in some games, or at some times in some games, than in others.

PS5 is looking at the workload it's currently processing and what it will mean for power consumption, and if the workload's power consumption at max clocks comes in under the PS5's power and thermal dissipation budget, then max clocks will be maintained. Cerny, it seems, expects that 'most' workloads will fit that criterion.

Their alternative was to try to guess the worst-case workloads, and then always have the chip running at a lower fixed clockspeed to accommodate that. That's what they did with their consoles previously. Why do that, why fix around the assumed worst case, if the workload and its power demands will actually change, and in some, or perhaps many, cases you could get away with a higher clock? That was the question they appear to have started with.
This is an excellent explanation, and super helpful. Thank you for taking the time and posting it here.
 

Dizastah

Member
Oct 25, 2017
6,124
Nothing I typed changes anything about what was presented at the talk - which is just as valid as it was said there. One component uses more Power when the other one is not using it, same thing that was presented there. Both cannot Max the Power Budget, as was also said there.
Thanks for clearing that up.
 

JoeNut

Member
Oct 27, 2017
3,482
UK
Ultimately the numbers will only mean so much; I want a gameplay video comparing the two consoles running the same game.
 

amstradcpc

Member
Oct 27, 2017
1,768
I'm confused; are you saying both the CPU and GPU cannot operate at peak at the same time? Because that's not how Cerny made it sound, or how the DF Direct discussed it. I thought Cerny specifically said that most of the time both CPU and GPU operate at peak? Sorry if you've already answered this - you can point me to the post. Thank you!
Imagine the CPU is using one core and the GPU one CU - of course they both operate at max clock!
The clock decrease comes when the full APU hits the X occupancy target, where X is the max wattage the cooling solution and PSU are designed for.
At that moment, what Dictator describes kicks in, and the devs can choose whether to keep the clocks on the CPU or the GPU.